From golden at utdallas.edu Wed Mar 1 07:35:31 2000 From: golden at utdallas.edu (Richard M Golden) Date: Wed, 1 Mar 2000 06:35:31 -0600 (CST) Subject: March 3 deadline for submitting paper to "Neural Net Technical Session" Message-ID: Please send email correspondence to "bein at malachite.cs.unlv.edu". ===================================== From y.wilks at dcs.shef.ac.uk Wed Mar 1 11:22:25 2000 From: y.wilks at dcs.shef.ac.uk (Yorick Wilks) Date: Wed, 1 Mar 2000 16:22:25 GMT Subject: RESEARCH SCHOLARSHIPS AVAILABLE IN COMPUTER SCIENCE/NLP Message-ID: <200003011622.QAA10033@burbage.dcs.shef.ac.uk.> University of Sheffield Department of Computer Science RESEARCH DEGREES IN COMPUTER SCIENCE This department intends to recruit a number of postgraduate research students to begin studies in October 2000 or before. Successful applicants will read for an M.Phil or Ph.D. The department has research groups in: Natural Language Processing Verification and Testing Communications and Distributed Systems Speech and Hearing Computer Graphics Machine Learning Neurocomputing and Robotics UK, EU and overseas candidates with research interests in any relevant area are encouraged to apply. Candidates for all awards should have a good honours degree in a relevant discipline (not necessarily Computer Science), or should attain such a degree by September 1999. Part-time registration is a possibility. A number of EPSRC awards are available; these cover fees and support for UK students and, on a fees-only basis, fees for EU students. More details of our research, and application forms, are on our world-wide-web site: http://www.dcs.shef.ac.uk. Hard copy application forms and further particulars are available from the Research Admissions Secretary, Department of Computer Science, University of Sheffield, Regent Court, 211 Portobello St, Sheffield S1 4DP. All applicants should quote reference number ST051. Informal enquiries may be addressed to: Professor Yorick Wilks: +44 114 222 1804, yorick at dcs.shef.ac.uk From wolfskil at MIT.EDU Fri Mar 3 15:33:00 2000 From: wolfskil at MIT.EDU (Jud Wolfskill) Date: Fri, 3 Mar 2000 16:33:00 -0400 Subject: book announcement--Cloete Message-ID: A non-text attachment was scrubbed... Name: not available Type: text/enriched Size: 2189 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/09fb884c/attachment.bin From terry at salk.edu Sat Mar 4 00:39:50 2000 From: terry at salk.edu (terry@salk.edu) Date: Fri, 3 Mar 2000 21:39:50 -0800 (PST) Subject: NEURAL COMPUTATION 12:3 Message-ID: <200003040539.VAA12984@hebb.salk.edu> Neural Computation - Contents - Volume 12, Number 3 - March 1, 2000 ARTICLE Dynamics of Encoding In Neuron Populations: Some General Mathematical Features Bruce W. Knight NOTE Local And Global Gating Of Synaptic Plasticity Manuel A. Sanchez-Montanes, Paul F. M. J. Verschure, and Peter Konig Nonlinear Autoassociation Is Not Equivalent To PCA Nathalie Japkowicz, Stephen Jose Hanson and Mark A. Gluck LETTERS No Free Lunch For Noise Prediction Malik Magdon-Ismail Self-Organization Of Symmetry Networks: Transformation Invariance From The Spontaneous Symmetry-Breaking Mechanism Chris J. S. Webber Geometric Analysis Of Population Rhythms In Synaptically Coupled Neuronal Networks J. Rubin and D. Terman An Accurate Measure of The Instantaneous Discharge Probability, With Application To Unitary Joint-Event Analysis Quentin Pauluis and Stuart N.
Baker Impact Of Correlated Inputs On The Output Of The Integrate-And-Fire Model Jianfeng Feng and David Brown Weak, Stochastic Temporal Correlation of Large-Scale Synaptic Input Is A Major Determinant of Neuronal Bandwidth David M. Halliday Second-Order Learning Algorithm With Squared Penalty Term Kazumi Saito and Ryohei Nakano ----- ON-LINE - http://neco.mitpress.org/
SUBSCRIPTIONS - 2000 - VOLUME 12 - 12 ISSUES
                  USA      Canada*    Other Countries
Student/Retired   $60      $64.20     $108
Individual        $88      $94.16     $136
Institution       $430     $460.10    $478
* includes 7% GST
MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 258-6779 mitpress-orders at mit.edu ----- From sok at cs.york.ac.uk Mon Mar 6 07:23:49 2000 From: sok at cs.york.ac.uk (Simon E M O'Keefe) Date: Mon, 06 Mar 2000 12:23:49 +0000 Subject: Workshop on Optical, Neural and Computational Associative processing Message-ID: <38C3A355.264F139F@cs.york.ac.uk> WONCA '00 Workshop on Optical, Neural and Computational Associative processing University of York, York, United Kingdom 30-31 March 2000 Call for participation WONCA '00 is an international workshop aimed at identifying key research issues and strategy for Associative Processing, and will focus on aspects of associative computation in the biological, optical and electronics domains. There are approximately 6 (six) funded workshop places available for scientists wishing to participate in this workshop (total number of participants approx 30). Places include one night's accommodation and meals for the duration of the workshop. In addition, standard travel within the UK will be refundable. Research Students and Research Associates are particularly encouraged to apply. Some background information and the provisional timetable are given below. More details of the arrangements may be found on the workshop website, at http://www.cs.york.ac.uk/WONCA. Interested parties are invited to complete and return the registration details below and submit a personal statement outlining their reasons for wishing to participate in the workshop. Submissions may be sent by email to WONCA at cs.york.ac.uk, or by post to: WONCA Department of Computer Science University of York Heslington YORK YO10 5DD United Kingdom Submissions should be sent to arrive no later than 17th March 2000. The workshop organisers' decision regarding acceptance is final. No correspondence will be entered into. --------------------------------------------------------------------- Background EPSRC Networks The EPSRC has established a number of Emerging Computing Networks, to establish communities and stimulate research in long term, speculative, emerging areas of computing that have the potential for high rewards. The programme aims to encourage the transfer of ideas and insights, and to stimulate research to advance computing and IT beyond the foreseeable capabilities of existing technologies. Deliverables required of each Network are: i. A definition and description of the topic area. ii. A report on the state-of-the-art, both in the UK and overseas. iii. Identification of key research issues within the area. iv. Identification of enabling technologies and tools. v. Recommendations to move the area forward. Emergent Behaviour Computing Network The Emergent Behaviour Computing Network recognises the need for new computational paradigms, aimed at circumventing the limitations of conventional silicon and computing technologies, and at the comprehension and control of complex systems.
Complex systems exhibit behaviours that are a product of the system as a whole. Such behaviours have long been recognised in natural systems, but man-made systems can also exhibit such emergent behaviours. These behaviours are always unintended and often detrimental. This Network will focus on extending current distributed computing by exploring the potential for harnessing Emergent Behaviour Computing. WONCA '00 WONCA '00 will focus on all aspects of associative computation as related to biological, optical and electronics domains and will address the following questions: 1. What can we learn from nature about associative computation? 2. What can we build now, and what would we like to construct in the future? 3. What would we like associative systems to do? The key speakers will present the state-of-the-art in neurobiology, neurophysiology, optics, and neural networks research. There will be an emphasis on active discussion, and the result of the workshop will be contributions to the deliverables required by the EPSRC. --------------------------------------------------------------------- Provisional Timetable Day 1 - 30th March 2000 12:00 Registration 13:00 Buffet Lunch 13:45 Welcome from Chairman 14:00 Speaker 1 - D Willshaw Neurobiological aspects 14:35 Speaker 2 - G Palm Theoretical aspects 15:10 Speaker 3 - I Aleksander Emergent aspects 15:45 Tea 16:00 Regroup 16:15 Break-out Session - Discussion of research questions and strategy 17:15 Regroup 17:30 Round-up discussion 18:00 Day Close 19:30 Leave for Social 20:00 Dinner Day 2 - 31st March 2000 08:00 Breakfast 09:00 Day 1 Results & Reminder of Purpose 09:10 Speaker 4 - U Rueckert Implementation aspects 09:45 Speaker 5 - A Krikelis Implementation aspects 10:20 Coffee 10:30 Break-out Session - Discussion of research questions and strategy 11:30 Reports from Group Speakers 12:15 General Discussion 13:00 Lunch 14:30 Summary of Recommendations 14:45 Close --------------------------------------------------------------------- REGISTRATION DETAILS Name: Affiliation: Address: Telephone: Fax: Email: Emergency contact: Date of arrival: Date of departure: Vehicle registration: Disability/access requirements: Dietary requirements: Personal statement: (This should indicate your background and your particular interest in participation, given the aims of the workshop) From bengioy at IRO.UMontreal.CA Mon Mar 6 09:35:59 2000 From: bengioy at IRO.UMontreal.CA (Yoshua Bengio) Date: Mon, 6 Mar 2000 09:35:59 -0500 Subject: program of workshop on Selecting and Combining Models with Machine Learning Algorithms (corrected) Message-ID: <20000306093559.53019@IRO.UMontreal.CA> Hello, The preliminary program of the Workshop on Selecting and Combining Models with Machine Learning Algorithms is ready, and posted on www.iro.umontreal.ca/~bengioy/crmworkshop2000 The workshop will be held in Montreal, April 11-14, 2000. Registration is free but mandatory (and early registration is recommended because the number of seats may be limited). Organizers: Yoshua Bengio and Dale Schuurmans List of speakers: Grace Wahba Leo Breiman Yoav Freund Peter Bartlett Peter Sollich Tom Dietterich Michael Perrone Hugh Chipman Dale Schuurmans Christian Leger Robert Schapire Shai Ben-David Bill Armstrong Olivier Chapelle Ayan Demiriz Sorin Draghici Russ Greiner Jasvinder Kandola Ofer Melnik Ion Muslea In-Jae Myung Gunnar Raetsch -- Yoshua Bengio Associate Professor Departement d'Informatique et Recherche Operationnelle Universite de Montreal, mailing address: C.P. 6128 Succ.
Centre-Ville, Montreal, Quebec, Canada H3C 3J7 street address: 2920 Chemin de la Tour, Montreal, Quebec, Canada H3T 1J8, #2194 Tel: 514-343-6804. Fax: 514-343-5834. Bureau 3339. http://www.iro.umontreal.ca/~bengioy http://www.iro.umontreal.ca/~lisa From golden at utdallas.edu Sun Mar 5 22:04:48 2000 From: golden at utdallas.edu (Richard M Golden) Date: Sun, 5 Mar 2000 21:04:48 -0600 (CST) Subject: Request for Comments on "Mathematical Methods for Neural Network Analysis and Design" Message-ID: I'm in the process of preparing a second edition of my book "Mathematical Methods for Neural Network Analysis and Design" (MIT Press, 1996). If you have read the book and have either positive or negative comments regarding its contents or if you have suggestions regarding material which you feel should be in the book, please let me know. Thanks! Richard Golden Professor of Cognitive Science and Electrical Engineering ******************************************************************************* Cognition and Neuroscience Program, School of Human Development, GR41 The University of Texas at Dallas, Box 830688 Richardson, Texas 75083-0688, PHONE: (972) 883-2423 EMAIL: golden at utdallas.edu, WEB: http://www.utdallas.edu/~golden/index.html ******************************************************************************* From bengio at idiap.ch Tue Mar 7 03:35:10 2000 From: bengio at idiap.ch (Samy Bengio) Date: Tue, 7 Mar 2000 09:35:10 +0100 (MET) Subject: open positions in speech/vision Message-ID: IDIAP invites applications for the positions of Speech Processing Group Leader as well as Senior Research Positions in Speech and Computer Vision Qualified applicants are expected to have a Ph.D. in Computer Science, Electrical Engineering or a related field. They must have an outstanding research record and a proven excellence in leadership, project management, and supervision of Ph.D. students. They should master state-of-the-art techniques and theories related to speech or vision processing, hidden Markov models and neural networks, learning algorithms, and statistical learning theory. About IDIAP IDIAP is a semi-private non-profit research institute, affiliated with the Swiss Federal Institute of Technology at Lausanne (EPFL) and the University of Geneva. For the last 10 years, IDIAP has mainly been carrying out research and development in the fields of speech and speaker recognition, computer vision, and machine learning. Location IDIAP is located in the town of Martigny in Valais, a scenic region in the south of Switzerland, surrounded by some of the highest mountains in Europe which offer some of the best skiing, hiking, and climbing. It is within close proximity to Lausanne and Lake Geneva, and centrally located for travel to other parts of Europe. Prospective candidates should send their detailed CV to the address given below. Prof. Herve Bourlard Director of IDIAP P.O. Box 592 CH-1920 Martigny, Switzerland Email : secretariat at idiap.ch Phone : +41 27 721 77 20 Fax : +41 27 721 77 12 ----- Samy Bengio Research Director. Machine Learning Group Leader. IDIAP, CP 592, rue du Simplon 4, 1920 Martigny, Switzerland. tel: +41 27 721 77 39, fax: +41 27 721 77 12. mailto:bengio at idiap.ch, http://www.idiap.ch/~bengio From ijspeert at rana.usc.edu Tue Mar 7 21:26:34 2000 From: ijspeert at rana.usc.edu (Auke Ijspeert) Date: Tue, 7 Mar 2000 18:26:34 -0800 (PST) Subject: CFP: session on Biologically Inspired Robotics (Boston 3-8 Nov.) Message-ID: [Apologies for multiple postings.
Please note that this call for papers is sent at short notice: the deadline for the abstract is in 3 weeks already, but the abstract need only be 250 words long.] CFP: session on BIOLOGICALLY INSPIRED ROBOTICS 3-8 November 2000, Boston, Massachusetts (the exact days of the session are still to be defined) Special session of Sensor Fusion and Decentralized Control in Robotic Systems III (RB06) http://www.spie.org/web/meetings/calls/pe00/confs/RB06.html Chairs: Auke Jan Ijspeert (USC, Los Angeles), Nicolas Franceschini (CNRS, France) This session addresses biological inspiration in robotics at several levels: from sensory systems to biomimetic structures and muscle-like actuation, via biologically inspired control mechanisms. It will look in particular at 1) what the benefits of biological inspiration are in current robotics, and 2) what insights a robotic implementation can give, in return, into the functioning of biological systems. Although this session is primarily interested in robotics, work involving realistic physically-based simulations will also be considered. Participants are invited to submit a 250-word abstract before the ---27th of MARCH 2000---. Please submit both through the conference web page at http://www.spie.org/web/meetings/calls/pe00/confs/RB06.html AND by sending an electronic copy to ijspeert at rana.usc.edu and enfranceschini at LNB.cnrs-mrs.fr . Please specify at the beginning of the abstract that it is submitted to "Biologically-Inspired Robotics session, Sensor Fusion and Decentralized Control in Robotic Systems III (RB06)". Participants whose abstract has been accepted after review will also be invited to contribute to the conference proceedings by submitting a manuscript by the 14th of August 2000 (details to be given). GENERAL CALL FOR PAPERS: SPIE Photonics East 3-8 November 2000, Boston, Massachusetts Sensor Fusion and Decentralized Control in Robotic Systems III (RB06) http://www.spie.org/web/meetings/calls/pe00/confs/RB06.html On-site Proceedings. Abstracts for this conference are due by 27 March 2000. Manuscripts are due by 14 August 2000. Conference Chairs: Gerard T. McKee, Univ. of Reading (UK); Paul S. Schenker, Jet Propulsion Lab. Program Committee: Mongi A. Abidi, Univ. of Tennessee/Knoxville; Dimi Apostolopoulos, Carnegie Mellon Univ.; Eric T. Baumgartner, Jet Propulsion Lab.; George A. Bekey, Univ. of Southern California; Henrik I. Christensen, Royal Institute of Technology (Sweden); Steven Dubowsky, Massachusetts Institute of Technology; John T. Feddema, Sandia National Labs.; Nicolas H. Franceschini, Ctr. National de la Recherche Scientifique (France); Terrance L. Huntsberger, Jet Propulsion Lab.; Pradeep K. Khosla, Carnegie Mellon Univ.; Maja J. Mataric, Univ. of Southern California; Robin R. Murphy, Univ. of South Florida; Francois G. Pin, Oak Ridge National Lab.; Jürgen Rossmann, Univ. of Dortmund (Germany); Arthur C. Sanderson, Rensselaer Polytechnic Institute; Tzyh-Jong Tarn, Washington Univ. This conference addresses multi-sensor fusion and distributed control in robotic systems. The theme is intelligent automation through enriched perception and action skills, at all levels of robot tasking, including cooperative interactions of multiple robots. We welcome multi-disciplinary submissions based on both technological and biological models.
The primary focus of the meeting is algorithmic: submissions should clearly state a robotic sensing or control objective, outline a physical or behavioral model, reduce it to computational definition, and present experimental and/or analytical results. Describe what distinguishes your problem as a robotic sensor fusion or distributed control problem; contrast your approach with alternative methods. Papers that report working robotic systems and specific task applications are particularly desirable. In all submissions authors should relate their objectives to mainstream problems in robot navigation, manipulation, surveillance, planning, assembly, learning, etc. Topics of interest include, but are not limited to: *modeling, registration, and calibration of multiple sensors *3D object estimation from multiple features and views *visual integration of structural and motion information *robust fusion of active and passive sensors & databases *estimation, recognition and error models for data fusion *task driven planning and sequencing of robotic sensors *integration of vision and touch in dexterous robotic tasks *sensor based human-machine interaction (voice/gesture/etc.) *learning strategies for multi-sensor object recognition *decentralized control of multiple-armed and legged robots *task planning for reconfigurable & modular robotic systems *motion coordination and control in multiple robot tasks *cooperative, emergent behaviors in multiple agents *task based mapping and learning of sensor-based behaviors *biologically inspired sensing, controls, and behaviors --------------------------------------------------------------------------- Dr Auke Jan Ijspeert Brain Simulation Lab & Computational Learning and Motor Control Lab Dept. of Computer Science, Hedco Neurosciences bldg, 3614 Watt way U. of Southern California, Los Angeles, CA 90089-2520, USA Web: http://rana.usc.edu:8376/~ijspeert/ Tel: +1 213 7401922 or 7406995 (work) +1 310 8238087 (home) Fax: +1 213 7405687 Email: ijspeert at rana.usc.edu --------------------------------------------------------------------------- From margindr at CS.ORST.EDU Wed Mar 8 14:27:30 2000 From: margindr at CS.ORST.EDU (Dragos Margineantu) Date: Wed, 8 Mar 2000 11:27:30 -0800 (PST) Subject: ICML-2000 Workshop on Cost-Sensitive Learning Message-ID: <200003081927.LAA04805@ghost.CS.ORST.EDU> Invitation to Participate, Call for Contributions WORKSHOP ON COST-SENSITIVE LEARNING ----------------------------------- [In conjunction with the Seventeenth International Conference on Machine Learning - ICML-2000, Stanford University June 29-July 2, 2000] Workshop webpage is at: http://www.cs.orst.edu/~margindr/Workshops/Workshop-ICML2000.html Workshop Motivation and Description ----------------------------------- Recent years have seen supervised learning methods applied to a variety of challenging problems in industry, medicine, and science. In many of these problems, there are costs associated with measuring input features and there are costs associated with different possible outcomes. However, existing classification algorithms assume that the input features are already measured (at no cost) and that the goal is to minimize the number of misclassification errors (the 0/1 loss). For example, in medical diagnosis, different tests have different costs (and risks) and different outcomes (false positives and false negatives) have different costs. 
The cost of a false positive medical diagnosis is an unnecessary treatment, but the cost of a false negative diagnosis may be the death of the patient. Given a choice, a cost-sensitive learning algorithm should prefer to measure less costly features and to make less costly errors (in this example, false positives). Not surprisingly, when existing learning algorithms are applied to cost-sensitive problems, the results are often poor, because they have no way of making these tradeoffs. Another example concerns the timeliness of predictions in time-series applications. Consider a classifier that is applied to monitor a complex system (e.g., factory, power plant, medical device). It is supposed to signal an alarm if a problem is about to occur. The value of the alarm is not merely related to whether it is a false alarm or a missed alarm, but also to whether the alarm is raised soon enough to allow preventative measures to be taken. The goal of this workshop is to bring together researchers who are working on problems for which the standard 0/1-loss model with zero-cost input features is unsatisfactory. A good reference on different types of costs and cost-sensitive learning can be found at: http://www.iit.nrc.ca/bibliographies/cost-sensitive.html. The workshop will be structured around three main topics: * ALGORITHMS FOR COST-SENSITIVE LEARNING Algorithms that take cost information as input (along with the training data) and produce a cost-sensitive classifier as output. Algorithms that construct robust classifiers that accept cost information at classification time. Algorithms designed for other types of costs. * COSTS AND LOSS FUNCTIONS THAT ARISE IN REAL-WORLD APPLICATIONS What types of costs are involved in practical applications? What are the loss functions in current and future applications of machine learning? What are the right ways of formulating various cost-sensitive learning problems? * METHODS AND PROMISING DIRECTIONS FOR FUTURE RESEARCH What methods should be applied to evaluate cost-sensitive learning algorithms? What are promising new directions to pursue? What should be our ultimate research goals? Approximately one third of the day will be devoted to each of these three topics. On each of these topics, one or two people will give overview presentations. These will be followed by a mix of discussion and short position papers presented by the participants. Submissions ----------- To participate in the workshop, please send an email message to Tom Dietterich (tgd at cs.orst.edu) giving your name, address, email address, and a brief description of your reasons for wanting to attend. In addition, if you wish to present one or more position papers on the topics listed above, please send a one-page abstract of each position paper to Tom Dietterich at the same email address. You may submit a position paper on each of the three main topics (algorithms, loss functions, future research). If you have an issue or contribution that is not covered by these three categories, please contact Tom Dietterich by email to discuss your idea prior to submitting a position paper. Submissions are especially solicited that describe the loss functions arising in industrial applications of machine learning. The organizers will review the submissions with the goal of assembling a stimulating and exciting workshop. Attendance will be limited to 40 people, with preference given to people who are presenting position papers.
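To make the contrast with the standard 0/1 loss concrete, here is a minimal sketch of the expected-cost decision rule that cost-sensitive methods aim at; the cost matrix, class labels, and probabilities below are hypothetical, in the spirit of the medical-diagnosis example above, and are not taken from the workshop call.

import numpy as np

# cost[i, j] = cost of predicting class j when the true class is i
# (hypothetical numbers: class 0 = healthy, class 1 = sick)
cost = np.array([[0.0,   1.0],    # false positive: an unnecessary treatment
                 [100.0, 0.0]])   # false negative: a missed disease, far costlier

def cost_sensitive_predict(posteriors, cost):
    """Pick, for each example, the class with the lowest expected cost."""
    expected = posteriors @ cost          # shape (n_examples, n_classes)
    return expected.argmin(axis=1)

p = np.array([[0.9, 0.1],   # 0/1 loss (argmax) would predict class 0 here,
              [0.6, 0.4]])  # but the expected-cost rule predicts class 1 for both
print(cost_sensitive_predict(p, cost))   # -> [1 1]

With 0/1 loss the two decision rules coincide; the asymmetric costs are what pull the decision threshold away from the most probable class.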
Important dates: - Submission deadline: April 24, 2000 - Notification of acceptance: May 8, 2000 - Workshop will be held between June 29 and July 2, 2000 (exact day to be announced by mid-March) Organizers ---------- Tom Dietterich, Oregon State University (tgd at cs.orst.edu) Foster Provost, New York University, (provost at stern.nyu.edu) Peter Turney, Institute for Information Technology of the National Research Council of Canada (Peter.Turney at iit.nrc.ca) Dragos Margineantu, Oregon State University (margindr at cs.orst.edu) From ted.carnevale at yale.edu Wed Mar 8 23:33:42 2000 From: ted.carnevale at yale.edu (Ted Carnevale) Date: Wed, 08 Mar 2000 23:33:42 -0500 Subject: NEURON 2000 Summer Course Message-ID: <38C729A6.D5DB08D1@yale.edu> COURSE ANNOUNCEMENT What: "The NEURON Simulation Environment" (NEURON 2000 Summer Course) When: Saturday, August 5, through Wednesday, August 9, 2000 Where: San Diego Supercomputer Center University of California at San Diego, CA Organizers: N.T. Carnevale and M.L. Hines Faculty includes: N.T. Carnevale, M.L. Hines, W.W. Lytton, and T.J. Sejnowski Description: This intensive hands-on course covers the design, construction, and use of models in the NEURON simulation environment. It is intended primarily for those who are concerned with models of biological neurons and neural networks that are closely linked to empirical observations, e.g. experimentalists who wish to incorporate modeling in their research plans, and theoreticians who are interested in the principles of biological computation. The course is designed to be useful and informative for registrants at all levels of experience, from those who are just beginning to those who are already quite familiar with NEURON or other simulation tools. Registration is limited to 20, and the deadline for receipt of applications is Friday, July 7, 2000. For more information see http://www.neuron.yale.edu/sdsc2000/sdsc2000.htm or contact Ted Carnevale Psychology Dept. Box 208205 Yale University New Haven, CT 06520-8205 USA phone 203-432-7363 fax 203-432-7172 email ted.carnevale at yale.edu Supported in part by: National Science Foundation National Institutes of Health National Partnership for Advanced Computational Infrastructure and the San Diego Supercomputer Center Contractual terms require inclusion of the following statement: This course is not sponsored by the University of California. But thanks anyway. --Ted From d.mareschal at bbk.ac.uk Thu Mar 9 05:06:02 2000 From: d.mareschal at bbk.ac.uk (Denis Mareschal) Date: Thu, 9 Mar 2000 11:06:02 +0100 Subject: Postdoc and PhD positions available Message-ID: BIRKBECK COLLEGE UNIVERSITY OF LONDON Faculty of Science School of Psychology The School of Psychology has gained an EU Fifth Framework Human Potential Initiative award and invites applications for two positions to be held jointly in the School and the Centre for Brain and Cognitive Development to work on a project entitled "The Basic Mechanisms of Learning in Natural and Artificial Systems". POSTDOCTORAL RESEARCH ASSOCIATE (Two-Year Fixed Term) The postholder will mainly use connectionist/neural network techniques to model memory dissociations in infancy and help implement and design models of explicit and implicit memory in infancy in collaboration with Dr. D. Mareschal. Applicants should have a PhD, preferably in cognitive psychology, developmental psychology, AI, or Neural Computation. The post is tenable from 1st July, 2000 or a date to be arranged thereafter. Salary range: £18,420 to £20,319 pa inc.
For details send a large (A4) sae to the Personnel Department, Ref: APS343, Birkbeck, Malet Street, London WC1E 7HX. Closing date: 30th April 2000 DOCTORAL STUDENTSHIP (Three-Year Award) The award is available to work with Dr. D. Mareschal and Professor Mark H Johnson on a project exploring interference effects in infant object recognition using behavioural and possibly ERP methods to test infants. Applicants should have a good first degree in experimental psychology. An interest in connectionist modelling and the neurosciences would be desirable. The award is available from 1st October 2000. For further information and application forms send a large SAE to the School of Psychology Birkbeck, Malet Street, London, WC1E 7HX or telephone (+44) 171 631 6207 Closing date: 31st March 2000 College web site: http://www.bbk.ac.uk Informal enquiries for both positions can be addressed to d.mareschal at bbk.ac.uk These positions are only available to citizens and long-term residents of the EU and Associate Member states who are not citizens or long-term residents of the UK. ================================================= Dr. Denis Mareschal Centre for Brain and Cognitive Development Department of Psychology Birkbeck College University of London Malet St., London WC1E 7HX, UK tel +44 171 631-6582/6207 fax +44 171 631-6312 http://www.psyc.bbk.ac.uk/staff/dm.html ================================================= From Uta.Schwalm at neuroinformatik.ruhr-uni-bochum.de Thu Mar 9 10:52:20 2000 From: Uta.Schwalm at neuroinformatik.ruhr-uni-bochum.de (Uta Schwalm) Date: Thu, 9 Mar 2000 16:52:20 +0100 (MET) Subject: Full professorship in Neural Computation Message-ID: <200003091552.QAA02300@luda.neuroinformatik.ruhr-uni-bochum.de> The deadline for application for the following professorship is already next week, and some people on this list may be interested in applying. The Ruhr-Universität Bochum has a vacancy for a full professorship (C4) in Neural Computation ("Neuroinformatik") (succession Prof. Dr. W. von Seelen) The Ruhr-Universität Bochum offers a wide-ranging spectrum of subjects in the natural sciences, humanities, engineering and medicine. The Institut für Neuroinformatik is interdisciplinary in scope and is a central research institute of the University. By developing models for functional aspects of the central nervous system and experimentally realizing artificial systems, the institute contributes on the one hand to the elucidation of the function and development of the brain, and on the other to technical applications. Current research concentrates on computer vision and robot control in natural environments. In terms of methodology, emphasis is on computer experiments and the analytical and numerical treatment of learning dynamical systems. The appointee is expected to carry his or her share of administration, teaching and development of the Institute, to build and lead a large research team, and to attract research funding. There is the possibility to cooperate with ZN GmbH, a spin-off of the Institute. Applicants are expected to have a proven track record of scientific research and experience in leading a research team. It is expected that the appointee will join one of the departments of the University, depending on his or her specific field. The Ruhr-Universität Bochum aims at raising its percentage of females and encourages qualified women to apply. Applications by qualified disabled persons are encouraged.
Applications with the usual materials are to be sent by 15 March 2000 to: Rektor der Ruhr-Universität Bochum, Universitätsstraße 150, D-44780 Bochum. Homepage of the Institute: http://www.neuroinformatik.ruhr-uni-bochum.de/ From pli at richmond.edu Thu Mar 9 16:25:02 2000 From: pli at richmond.edu (Ping Li) Date: Thu, 9 Mar 2000 16:25:02 -0500 Subject: postdoc and faculty positions Message-ID: Dear Colleagues, Please distribute this information to your students or colleagues. Thank you. Sincerely, Ping Li *********************************************************************** Ping Li, Ph.D. Email: ping at cogsci.richmond.edu Associate Professor of Psychology http://www.richmond.edu/~pli/ Department of Psychology Phone: (804) 289-8125 (office) University of Richmond (804) 287-6039 (lab) Richmond, VA 23173, U.S.A. Fax: (804) 287-1905 *********************************************************************** Postdoc Position: Qualified individuals are invited to apply for a postdoctoral fellowship in connectionist models of language learning. The fellowship is supported by the National Science Foundation, and provides an annual stipend of $32,000 for two years. A qualified candidate should hold a Ph.D. degree in an area of cognitive sciences and have experience in connectionist modeling and natural language processing. Technical experience with C/C++ and Unix/Linux on SUN/Windows platforms is preferable. The successful candidate will join the PI's research team in collaboration with Brian MacWhinney of Carnegie Mellon University to develop a self-organizing neural network model of language acquisition (see the NSF homepage for a summary of the project: http://www.nsf.gov/cgi-bin/showaward?award=9975249). In addition, the fellow will have opportunities to collaborate on research and teaching with faculty at the Department of Psychology, Department of Modern Languages, and Department of Computer and Mathematical Sciences at the University of Richmond. U of R is a highly selective, small private school located on a beautiful campus 6 miles west of Richmond (capital of Virginia, 1 hour east of Charlottesville, 1 hour north of Williamsburg, and 2 hours south of Washington DC). With its nearly $1-billion endowment and progressive program enhancement efforts, the university offers a congenial research and teaching environment. The fellowship is expected to start some time between now and September 1. Consideration of applications will begin immediately and continue until the position is filled. Applicants should send a curriculum vitae, a cover letter, and two letters of recommendation to Ping Li, Department of Psychology, University of Richmond, Richmond, VA 23173, or via email to pli at richmond.edu. The University of Richmond is an equal opportunity, affirmative action employer. Women and minority candidates are encouraged to apply. Faculty Position: University of Richmond. The Department of Psychology invites applications for a one-year replacement position at the Assistant Professor level. Preference will be given to candidates who would be able to teach undergraduate courses in statistics and in memory and cognition. Candidates should have completed the Ph.D. degree by the August 2000 starting date. Scholars who show a promise of excellence in teaching and an active research program which stimulates student interest in research involvement are encouraged to apply. Send vita, statement of research and teaching interests, and three letters of recommendation to Andrew F.
Newcomb, Department of Psychology, University of Richmond, Richmond, VA 23173. Consideration of applications will begin on April 15, 2000. The University of Richmond is a highly selective, small private university located on a beautiful campus six miles west of the heart of Richmond. We are an Equal Opportunity, Affirmative Action Employer and encourage applications from women and minority candidates. From harnad at coglit.ecs.soton.ac.uk Fri Mar 10 04:32:07 2000 From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad) Date: Fri, 10 Mar 2000 09:32:07 +0000 (GMT) Subject: Minds, Machines and Turing Message-ID: The following paper is available at: http://www.cogsci.soton.ac.uk/~harnad/Papers/Harnad/harnad00.turing.html Comments welcome. Harnad, S. (2001) Minds, Machines and Turing: The Indistinguishability of Indistinguishables. Journal of Logic, Language, and Information (special issue on "Alan Turing and Artificial Intelligence") http://www.cogsci.soton.ac.uk/~harnad/Papers/Harnad/harnad00.turing.html MINDS, MACHINES AND TURING: THE INDISTINGUISHABILITY OF INDISTINGUISHABLES Stevan Harnad Department of Electronics and Computer Science University of Southampton Highfield, Southampton SO17 1BJ UNITED KINGDOM harnad at cogsci.soton.ac.uk http://www.cogsci.soton.ac.uk/~harnad/ ABSTRACT: Turing's celebrated 1950 paper proposes a very general methodological criterion for modelling mental function: total functional equivalence and indistinguishability. His criterion gives rise to a hierarchy of Turing Tests, from subtotal ("toy") fragments of our functions (t1), to total symbolic (pen-pal) function (T2 -- the standard Turing Test), to total external sensorimotor (robotic) function (T3), to total internal microfunction (T4), to total indistinguishability in every empirically discernible respect (T5). This is a "reverse-engineering" hierarchy of (decreasing) empirical underdetermination of the theory by the data. Level t1 is clearly too underdetermined, T2 is vulnerable to a counterexample (Searle's Chinese Room Argument), and T4 and T5 are arbitrarily overdetermined. Hence T3 is the appropriate target level for cognitive science. When it is reached, however, there will still remain more unanswerable questions than when Physics reaches its Grand Unified Theory of Everything (GUTE), because of the mind/body problem and the other-minds problem, both of which are inherent in this empirical domain, even though Turing hardly mentions them. KEYWORDS: cognitive neuroscience, cognitive science, computation, computationalism, consciousness, dynamical systems, epiphenomenalism, intelligence, machines, mental models, mind/body problem, other minds problem, philosophy of science, qualia, reverse engineering, robotics, Searle, symbol grounding, theory of mind, thinking,Turing, underdetermination, Zombies. From gzy at doc.ic.ac.uk Fri Mar 10 13:17:22 2000 From: gzy at doc.ic.ac.uk (gzy) Date: Fri, 10 Mar 2000 18:17:22 +0000 Subject: PhD and PostDoctoral Positions at Imperial College, London, England Message-ID: <38C93C32.6AA5AC81@doc.ic.ac.uk> Imperial College of Science, Technology and Medicine Department of Computing Visual Information Processing (VIP) Group Research Assistantship and PhD Studentship Positions We have two vacancies, one for a Postdoctoral Research Assistant, and the other for a PhD studentship, available within the Visual Information Processing Group. Both positions are for three years and are funded by the EPSRC under a project entitled "ViTAL: visual tracking for active learning." 
The aim of the project is to develop a novel framework for active learning and knowledge gathering for decision support systems in medical imaging. It is a collaborative venture between the Visual Information Processing (VIP) Group of the Department of Computing, Imperial College, and the Lung Imaging Research Team at the Royal Brompton Hospital. The Visual Information Processing group has been involved in biomedical imaging research for the last 10 years and has produced more than 200 publications in the areas of Computational Vision, Image Processing, Perceptual Intelligence, and Biomedical Imaging Systems. The group currently has a team of 18 members, including four full-time academic staff. Detailed information about the group can be found at http://vip.doc.ic.ac.uk, or you can call Dr Guang-Zhong Yang (gzy at doc.ic.ac.uk, 020-7594 8441), head of the research group, for an informal discussion about the technical details of the project. The appointment for the research assistant will be on the RA 1A scale (£18,420 - £26,613 inclusive of London Allowance), depending on qualifications and experience. Applicants should have a good degree in Computing. Applications should include a full CV plus names and addresses of three referees. They must be submitted by 15th March 2000 to: Dr. T. Sergot, Department of Computing, Imperial College, 180 Queen's Gate, London SW7 2BZ, UK. email: t.sergot at ic.ac.uk Fax: +44 20 7581 8024 Imperial College is striving towards equal opportunities. At the leading edge of research, innovation and learning -- Guang-Zhong Yang, PhD Visual Information Processing Department of Computing 180 Queen's Gate Imperial College London SW7 2BZ, UK Tel: 44-(0)20 7594 8441 Fax: 44-(0)20 7581 8024 Email: gzy at doc.ic.ac.uk http://vip.doc.ic.ac.uk From steve at cns.bu.edu Mon Mar 13 07:49:46 2000 From: steve at cns.bu.edu (Stephen Grossberg) Date: Mon, 13 Mar 2000 07:49:46 -0500 Subject: Neural dynamics of 3-D surface perception: Figure-ground separation and lightness perception Message-ID: The following article is available at http://www.cns.bu.edu/Profiles/Grossberg/ in HTML, PDF, and Gzipped postscript: Kelly, F. and Grossberg, S. (2000). Neural dynamics of 3-D surface perception: Figure-ground separation and lightness perception. Perception & Psychophysics, in press. Also available in the Tech Report version, as CAS/CNS TR-98-0226. Abstract: This article develops the FACADE theory of three-dimensional (3-D) vision to simulate data concerning how two-dimensional (2-D) pictures give rise to 3-D percepts of occluded and occluding surfaces. The theory suggests how geometrical and contrastive properties of an image can either cooperate or compete when forming the boundary and surface representations that subserve conscious visual percepts. Spatially long-range cooperation and short-range competition work together to separate boundaries of occluding figures from their occluded neighbors, thereby providing sensitivity to T-junctions without the need to assume that T-junction "detectors" exist. Both boundary and surface representations of occluded objects may be amodally completed, while the surface representations of unoccluded objects become visible through modal processes. Computer simulations include Bregman-Kanizsa figure-ground separation, Kanizsa stratification, and various lightness percepts, including the Munker-White, Benary cross, and checkerboard percepts.
Key words: Amodal Completion, Depth Perception, Figure-Ground Perception, Lightness, Visual Cortex, Neural Network From patrick at neuro.kuleuven.ac.be Mon Mar 13 07:56:23 2000 From: patrick at neuro.kuleuven.ac.be (Patrick De Maziere) Date: Mon, 13 Mar 2000 13:56:23 +0100 (MET) Subject: NEW BOOK ON SELF-ORGANIZATION AND TOPOGRAPHIC MAPS Message-ID: NEW BOOK ON SELF-ORGANIZATION AND TOPOGRAPHIC MAPS ================================================== Title: Faithful Representations and Topographic Maps From Distortion- to Information-based Self-organization Author: Marc M. Van Hulle Publisher: J. Wiley & Sons, Inc. Publication Date: February 2000 with forewords by Teuvo Kohonen and Helge Ritter ------------------------------------------------------------------------------- A new perspective on topographic map formation and the advantages of information-based learning The study of topographic map formation provides us with important tools for both biological modeling and statistical data modeling. Faithful Representations and Topographic Maps offers a unified, systematic survey of this rapidly evolving field, focusing on current knowledge and available techniques for topographic map formation. The author presents a cutting-edge, information-based learning strategy for developing equiprobabilistic topographic maps -- that is, maps in which all neurons have an equal probability to be active --, clearly demonstrating how this approach yields faithful representations and how it can be successfully applied in such areas as density estimation, regression, clustering, and feature extraction. The book begins with the standard approach of distortion-based learning, discussing the commonly used Self-Organizing Map (SOM) algorithm and other algorithms, and pointing out their inadequacy for developing equiprobabilistic maps. It then examines the advantages of information-based learning techniques, and finally introduces a new algorithm for equiprobabilistic topographic map formation using neurons with kernel-based response characteristics. The complete learning algorithms and simulation details are given throughout, along with comparative performance analysis tables and extensive references. Faithful Representations and Topographic Maps is an excellent, eye-opening guide for neural network researchers, industrial scientists involved in data mining, and anyone interested in self-organization and topographic maps. ------------------------------------------------------------------------------- "I am convinced that this book marks an important contribution to the field of topographic map representations and that it will become a major reference for many years." (Ritter) "This book will provide a significant contribution to our theoretical understanding of the brain." 
(Kohonen) ------------------------------------------------------------------------------- http://www.amazon.com/exec/obidos/ASIN/0471345075/qid=948382599/sr=1-1/002-0713799-7248240 http://www.barnesandnoble.com/ search for (Keyword): Faithful representations From d.lowe at aston.ac.uk Mon Mar 13 06:23:03 2000 From: d.lowe at aston.ac.uk (David LOWE) Date: Mon, 13 Mar 2000 11:23:03 +0000 Subject: Faculty Position in Information Engineering Message-ID: <0003131145142U.03816@nn-bernoulli.aston.ac.uk> Lecturer in Information Engineering Neural Computing Research Group Aston University, UK Aston University is seeking to appoint an inspired, research-led individual to the Information Engineering Group within the School of Engineering and Applied Science. Information Engineering encompasses the Neural Computing Research Group, one of the premier research groups in this area in Europe. The research areas of the Group are now very diverse, covering both theoretical and practical aspects of information analysis. These include biomedical signal analysis, theory and algorithms of novel neural network structures, such as support vector machines and Gaussian processes, inference and graphical models, nonlinear time series modelling and a range of application domains. Another aspect of the group's activities focuses on the relation between statistical physics and a variety of methods in information analysis (for instance, support vector machines, Gaussian processes and Bayesian networks) and in communication (error correcting codes and cryptography). Current research contracts are valued at around ukp 1 million. The NCRG also runs a research-based MSc course on Pattern Analysis and Neural Networks. We are seeking a strong research-active person who is also capable of contributing to the current and future-planned taught activities. More details on the taught activities can be obtained from www.maths.aston.ac.uk, and further details of the research activities can be found on www.ncrg.aston.ac.uk. Members of the research team are involved in the organisation of most of the major conferences in the respective areas, and the Group has links with several Government and industrial organisations through its research. The group runs its own computing facilities based mainly on Silicon Graphics, Sun and Alpha computers (including a multiprocessor Silicon Graphics Challenge machine and a powerful multiprocessor Alpha DS20). Although the position is intended to be permanent, Aston University policy is to appoint all staff for a fixed term (usually 5 years for lecturers) in the first instance, with a transfer to a continuing appointment usually made during the fifth year. Note that this position is equivalent to an assistant professor in other world locations. Informal enquiries can be made via email to Prof Lowe (d.lowe at aston.ac.uk). Further details may be obtained from the Personnel Office at Aston University. Interested individuals should send a comprehensive resume and a list of referees to either the Personnel Office or to the Group's administrator (v.j.bond at aston.ac.uk). Electronic submissions are welcome. The closing date for submissions is 1st May 2000. From reder at cmu.edu Mon Mar 13 09:30:05 2000 From: reder at cmu.edu (Lynne M. 
Reder) Date: Mon, 13 Mar 2000 09:30:05 -0500 Subject: postdoctoral opportunity Message-ID: The training grant on Combined Training in Computational and Behavioral approaches to the Study of Cognition is seeking two new postdoctoral fellows, one starting in June of 2000 (i.e., the degree is already in hand before the end of this June). The second position is a bit more flexible in start time. The successful applicants need not already have experience in computational modeling but must be interested in learning. The training grant is only open to US citizens or nationals (permanent residents). Starting salary for the stipend is $26,256 and goes up with number of years of postdoctoral experience (the stipend is set by NIMH). There is also a travel allowance of $800 and a budget for ancillary expenses of $2500. The computational approaches that we offer are symbolic, subsymbolic and hybrid. Please see the following web page for more information: http://www.psy.cmu.edu/~reder/ Applications will be accepted until both positions are filled. In addition, I am looking for a postdoctoral fellow for a position in my lab (independent of the training grant). My computational approach is localist so this newsgroup may not be the appropriate venue for such an advertisement. ========================================================== Lynne M. Reder, Professor Department of Psychology Carnegie Mellon University Pittsburgh, PA 15213 phone: (412)268-3792 fax: (412) 268-2844 email: reder at cmu.edu URL: http://www.andrew.cmu.edu/~reder/reder.html From bvr at stanford.edu Wed Mar 15 03:10:18 2000 From: bvr at stanford.edu (Benjamin Van Roy) Date: Wed, 15 Mar 2000 00:10:18 -0800 Subject: CALL FOR WORKSHOP PROPOSALS -- NIPS*2000 Message-ID: <4.2.0.58.20000315000959.00ca6100@bvr.pobox.stanford.edu> CALL FOR WORKSHOP PROPOSALS -- NIPS*2000 ===================================== Neural Information Processing Systems Natural and Synthetic NIPS*2000 Post-Conference Workshops December 1 and 2, 2000 Breckenridge, Colorado ===================================== Following the regular program of the Neural Information Processing Systems 2000 conference, workshops on various current topics in neural information processing will be held on December 1 and 2, 2000, in Breckenridge, Colorado. Proposals by qualified individuals interested in chairing one of these workshops are solicited. Example topics include: Active Learning, Architectural Issues, Attention, Audition, Bayesian Analysis, Bayesian Networks, Benchmarking, Brain Imaging, Computational Complexity, Computational Molecular Biology, Control, Genetic Algorithms, Graphical Models, Hippocampus and Memory, Hybrid Supervised/Unsupervised Learning Methods, Hybrid HMM/ANN Systems, Implementations, Independent Component Analysis, Mean-Field Methods, Markov Chain Monte-Carlo Methods, Music, Network Dynamics, Neural Coding, Neural Plasticity, On-Line Learning, Optimization, Recurrent Nets, Robot Learning, Rule Extraction, Self-Organization, Sensory Biophysics, Signal Processing, Spike Timing, Support Vectors, Speech, Time Series, Topological Maps, and Vision. The goal of the workshops is to provide an informal forum for researchers to discuss important issues of current interest. There will be six hours of workshop meetings per day, split into morning and afternoon sessions, with free time in between for ongoing individual exchange or outdoor activities. Controversial issues, open problems, and comparison of competing approaches are encouraged and preferred as workshop topics. 
Representation of alternative viewpoints and panel-style discussions are particularly encouraged. Descriptions of previous workshops may be found at http://www.cs.cmu.edu/Groups/NIPS/NIPS99/Workshops/ Select workshops may be invited to submit their workshop proceedings for publication as part of a new series of monographs for the post-NIPS workshops. Workshop organizers will have responsibilities including: ++ coordinating workshop participation and content, which includes arranging short informal presentations by experts, arranging for expert commentators to sit on a discussion panel, formulating a set of discussion topics, etc. ++ moderating the discussion, and reporting its findings and conclusions to the group during evening plenary sessions ++ writing a brief summary and/or coordinating submitted material for post-conference electronic dissemination. ======================= Submission Instructions ======================= Interested parties should submit a short proposal for a workshop of interest via email by May 26, 2000. Proposals should include title, description of what the workshop is to address and accomplish, proposed workshop length (1 or 2 days), planned format (mini-conference, panel discussion, combinations of the above, etc), and proposed speakers. Names of potential invitees should be given where possible. Preference will be given to workshops that reserve a significant portion of time for open discussion or panel discussion, as opposed to pure "mini-conference" format. An example format is: ++ Tutorial lecture providing background and introducing terminology relevant to the topic. ++ Two short lectures introducing different approaches, alternating with discussions after each lecture. ++ Discussion or panel presentation. ++ Short talks or panels alternating with discussion and question/answer sessions. ++ General discussion and wrap-up. We suggest that organizers allocate at least 50% of the workshop schedule to questions, discussion, and breaks. Past experience suggests that workshops otherwise degrade into mini-conferences as talks begin to run over. The proposal should motivate why the topic is of interest or controversial, why it should be discussed, and who the targeted group of participants is. It also should include a brief resume of the prospective workshop chair with a list of publications to establish scholarship in the field. Submissions should include contact name, address, email address, phone and fax numbers. Proposals should be emailed to caruana at cs.cmu.edu. Proposals must be RECEIVED by May 26, 2000. If email is unavailable, mail to: NIPS Workshops, Rich Caruana, SCS CMU, 5000 Forbes Avenue, Pittsburgh, PA 15213, USA. Questions may be addressed to either of the Workshop Co-Chairs: Rich Caruana (caruana at cs.cmu.edu) Virginia de Sa (desa at phy.ucsf.edu) PROPOSALS MUST BE RECEIVED BY MAY 26, 2000 From bvr at stanford.edu Wed Mar 15 03:09:56 2000 From: bvr at stanford.edu (Benjamin Van Roy) Date: Wed, 15 Mar 2000 00:09:56 -0800 Subject: CALL FOR PAPERS -- NIPS*2000 Message-ID: <4.2.0.58.20000315000937.00c72500@bvr.pobox.stanford.edu> CALL FOR PAPERS -- NIPS*2000 ========================================== Neural Information Processing Systems Natural and Synthetic Monday, Nov. 27 -- Saturday, Dec. 
2, 2000 Denver, Colorado ========================================== This is the fourteenth meeting of an interdisciplinary conference which brings together cognitive scientists, computer scientists, engineers, neuroscientists, physicists, statisticians, and mathematicians interested in all aspects of neural processing and computation. The conference will include invited talks as well as oral and poster presentations of refereed papers. The conference is single track and is highly selective. Preceding the main session, there will be one day of tutorial presentations (Nov. 27), and following it there will be two days of focused workshops on topical issues at a nearby ski area (Dec. 1-2). Major categories for paper submission, with example subcategories (by no means exhaustive), are listed below. A special area of emphasis this year is innovative applications of neural computation. Algorithms and Architectures: supervised and unsupervised learning algorithms, feedforward and recurrent network architectures, localized basis functions, mixture models, committee models, belief networks, graphical models, support vector machines, Gaussian processes, topographic maps, decision trees, factor analysis, principal component analysis and extensions, independent component analysis, model selection algorithms, combinatorial optimization, hybrid symbolic-subsymbolic systems. Applications: innovative applications of neural computation including data mining, information retrieval, web and network applications, intrusion detection, fraud detection, bio-informatics, medical diagnosis, image processing and analysis, handwriting recognition, industrial monitoring and control, financial analysis, time-series prediction, consumer products, music, video and artistic applications, animation, virtual environments, learning dynamical systems. Cognitive Science/Artificial Intelligence: perception and psychophysics, neuropsychology, cognitive neuroscience, development, conditioning, human learning and memory, attention, language, natural language, reasoning, spatial cognition, emotional cognition, conceptual representation, neurophilosophy, problem solving and planning. Implementations: analog and digital VLSI, optical neurocomputing systems, novel neurodevices, computational sensors and actuators, simulation tools. Neuroscience: neural encoding, spiking neurons, synchronicity, sensory processing, systems neurophysiology, neuronal development, synaptic plasticity, neuromodulation, dendritic computation, channel dynamics, experimental data relevant to computational issues. Reinforcement Learning and Control: exploration, planning, navigation, Q-learning, TD-learning, state estimation, dynamic programming, robotic motor control, process control, Markov decision processes. Speech and Signal Processing: speech recognition, speech coding, speech synthesis, speech signal enhancement, auditory scene analysis, source separation, applications of hidden Markov models to signal processing, models of human speech perception, auditory modeling and psychoacoustics. Theory: computational learning theory, statistical physics of learning, information theory, Bayesian methods, prediction and generalization, regularization, online learning (stochastic approximation), dynamics of learning, approximation and estimation theory, complexity theory, multi-agent learning. Visual Processing: image processing, image coding, object recognition, visual psychophysics, stereopsis, motion detection and tracking. 
---------------------------------------------------------------------- Review Criteria: All submitted papers will be thoroughly refereed on the basis of technical quality, significance, and clarity. Novelty of the work is also a strong consideration in paper selection, but to encourage interdisciplinary contributions, we will consider work which has been submitted or presented in part elsewhere, if it is unlikely to have been seen by the NIPS audience. Authors new to NIPS are strongly encouraged to submit their work, and will be given preference for oral presentations. Authors should not be dissuaded from submitting recent work, as there will be an opportunity after the meeting to revise accepted manuscripts before submitting a final camera-ready copy for the proceedings. Paper Format: Submitted papers may be up to seven pages in length, including figures and references, using a font no smaller than 10 point. Text is to be confined within a 8.25in by 5in rectangle. Submissions failing to follow these guidelines will not be considered. Authors are required to use the NIPS LaTeX style files obtainable by anonymous FTP at the site given below. THE STYLE FILES HAVE BEEN UPDATED; please make sure that you use the current ones and not previous versions. Submission Instructions: NIPS has migrated to electronic submissions. Full submission instructions will be available at the web site given below. You will be asked to enter paper title, names of all authors, category, oral/poster preference, and contact author data (name, full address, telephone, fax, and email). You will upload your manuscript from the same page. We are only accepting postscript manuscripts. No pdf files will be accepted this year. The electronic submission page will be available on April 28, 2000. Submission Deadline: SUBMISSIONS MUST BE LOGGED BY MIDNIGHT MAY 19, 2000 PACIFIC DAYLIGHT TIME (08:00 GMT May 20). The LaTeX style files for NIPS, the Electronic Submission Page, and other conference information are available on the World Wide Web at http://www.cs.cmu.edu/Web/Groups/NIPS Copies of the style files are also available via anonymous ftp at ftp.cs.cmu.edu (128.2.242.152) in /afs/cs/Web/Groups/NIPS/formatting. For general inquiries or requests for registration material, send e-mail to nipsinfo at salk.edu or fax to (619)587-0417. NIPS*2000 Organizing Committee: General Chair, Todd K. Leen, Oregon Graduate Institute; Program Chair, Tom Dietterich, Oregon State University; Publications Chair, Volker Tresp, Siemens AG; Tutorial Chair, Mike Mozer, University of Colorado; Workshops Co-Chairs, Rich Caruana, Carnegie Mellon University, Virginia de Sa, Sloan Center for Theoretical Neurobiology; Publicity Chair, Benjamin Van Roy, Stanford University; Treasurer, Bartlett Mel, University of Southern California; Web Masters, Doug Baker and Alex Gray, Carnegie Mellon University; Government Liaison, Gary Blasdel, Harvard Medical School; Contracts, Steve Hanson, Rutgers University, Scott Kirkpatrick, IBM, Gerry Tesauro, IBM. 
NIPS*2000 Program Committee: Leon Bottou, AT&T Labs - Research; Tom Dietterich, Oregon State University (chair); Bill Freeman, Mitsubishi Electric Research Lab; Zoubin Ghahramani, University College London; Dan Hammerstrom, Oregon Graduate Institute; Thomas Hofmann, Brown University; Tommi Jaakkola, MIT; Sridhar Mahadevan, Michigan State University; Klaus Obermeyer, TU Berlin; Manfred Opper, Aston University; Yoram Singer, Hebrew University of Jerusalem; Malcolm Slaney, Interval Research; Josh Tenenbaum, Stanford University; Sebastian Thrun, Carnegie Mellon University. PAPERS MUST BE SUBMITTED BY MAY 19, 2000 From yilin at stat.wisc.edu Wed Mar 15 13:31:17 2000 From: yilin at stat.wisc.edu (Yi Lin) Date: Wed, 15 Mar 2000 12:31:17 -0600 (CST) Subject: Paper announcement: SVM in nonstandard situations Message-ID: Dear Connectionists, A paper on the support vector machines for classification in nonstandard situations (with unequal misclassification cost, sampling bias present) is now available online: http://www.stat.wisc.edu/~yilin or http://www.stat.wisc.edu/~wahba Title and abstract are below: --------------------------------------------------------------------------- Support Vector Machines for Classification in Nonstandard Situations Yi Lin, Yoonkyung Lee, and Grace Wahba The majority of classification algorithms are developed for the standard situation in which it is assumed that the examples in the training set come from the same distribution as that of the target population, and that the cost of misclassification into different classes are the same. However, these assumptions are often violated in real world settings. For some classification methods, this can often be taken care of simply with a change of threshold; for others, additional effort is required. In this paper, we explain why the standard support vector machine is not suitable for the nonstandard situation, and introduce a simple procedure for adapting the support vector machine methodology to the nonstandard situation. Theoretical justification for the procedure is provided. Simulation study illustrates that the modified support vector machine significantly improves upon the standard support vector machine in the nonstandard situation. The computational load of the proposed procedure is the same as that of the standard support vector machine. The procedure reduces to the standard support vector machine in the standard situation. From pvdputten at smr.nl Fri Mar 17 09:54:11 2000 From: pvdputten at smr.nl (Peter van der Putten) Date: Fri, 17 Mar 2000 15:54:11 +0100 Subject: CoIL Competition Challenge 2000: Present your results in Greece Message-ID: Provide the best solution in the CoIL Challenge 2000 before May 4 and get free registration and travel support for the CoIL'2000 Symposium on June 19-23 in Chios, Greece! Direct mailings to a company's potential customers - "junk mail" to many - can be a very effective way for them to market a product or a service. However, as we all know, much of this junk mail is really of no interest to the majority of the people that receive it. Most of it ends up thrown away, not only wasting the money that the company spent on it, but also filling up landfill waste sites or needing to be recycled. If the company had a better understanding of who their potential customers were, they would know more accurately who to send it to, so some of this waste and expense could be reduced. 
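The kind of targeting problem described here, where one class is rare and the two kinds of error carry very different costs, is essentially the "nonstandard situation" addressed in the Lin, Lee and Wahba paper announced above. As a rough illustration of the general idea only (a toy Python sketch with invented data and cost values, not the authors' procedure), one can weight the hinge loss of a linear large-margin classifier by a per-example misclassification cost:

import numpy as np

rng = np.random.default_rng(0)

# Synthetic, imbalanced data: few "buyers" (+1), many "non-buyers" (-1).
n_pos, n_neg = 50, 950
X = np.vstack([rng.normal(+1.0, 1.0, size=(n_pos, 2)),
               rng.normal(-1.0, 1.0, size=(n_neg, 2))])
y = np.hstack([np.ones(n_pos), -np.ones(n_neg)])

# Assumed costs: missing a buyer is taken to be 10x worse than a false alarm.
cost = np.where(y > 0, 10.0, 1.0)

# Subgradient descent on  lam/2 * ||w||^2 + mean(cost * hinge loss).
w, b, lam, lr, n = np.zeros(2), 0.0, 1e-3, 0.1, len(y)
for epoch in range(200):
    margins = y * (X @ w + b)
    viol = margins < 1.0                                   # margin violators
    grad_w = lam * w - ((cost[viol] * y[viol])[:, None] * X[viol]).sum(axis=0) / n
    grad_b = -(cost[viol] * y[viol]).sum() / n
    w, b = w - lr * grad_w, b - lr * grad_b

pred = np.sign(X @ w + b)                                  # cost-weighted classifier

The modified support vector machine of the paper is derived more carefully, so that it targets the cost-adjusted optimal classification rule under sampling bias; the sketch above only conveys the flavour of folding unequal costs into training.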
Therefore, following a successful CoIL competition last year (See Synergy Issue 1, Winter 1999), CoIL has just announced a new competition challenge for 2000: "Can you predict who would be interested in buying a caravan insurance policy and give an explanation why?" This competition is organized by COiL, the Computational Intelligence and Learning Cluster. The goal of CoIL is to achieve scientific, technical and "social" integration of four european communities that perform research, development and application: Erudit (Fuzzy logic), EvoNet (Evolutionary computing), MLNet (Machine learning) and NEuroNet (Neural networks). The data was supplied by the Dutch datamining company Sentient Machine Research. We encourage any type of solutions to these problems, particularly those involving any of the CoIL technologies or any combinations of these. We are also interested in other solutions using other technologies, since CoIL is interested in being able to demonstrate how CoIL technologies relate to other techniques. The winners will be invited to present a short paper on their approach at the CoIL'2000 Symposium on Computational Intelligence and Learning (19-23 June 2000), in Chios, Greece. Your participation will be free of charge and we will pay you a travel support of 750 Euro. Important dates: 17 March 2000 Release of data 4 May 2000 Deadline for submissions 12 May 2000 Announcement of winners 22-23 June 2000 CoIL'2000 Symposium For more information, please visit CoILWeb: http://www.dcs.napier.ac.uk/coil/. Best wishes, Peter van der Putten Consultant Sentient Machine Research From fritsch at ira.uka.de Sun Mar 19 13:09:07 2000 From: fritsch at ira.uka.de (Juergen Fritsch) Date: Sun, 19 Mar 2000 19:09:07 +0100 Subject: PhD thesis available Message-ID: <38D517C3.E27D148E@ira.uka.de> Dear Connectionists, My PhD thesis on hierarchical connectionist acoustic modeling for large vocabulary speech recognition is now available on the WWW at http://isl.ira.uka.de/~fritsch For those interested, I have appended the abstract. Best regards, --Juergen Fritsch. ========================================================== Juergen Fritsch Research Scientist ---------------------------------------------------------- Interactive Systems Labs University of Karlsruhe & Carnegie Mellon University phone:++49-721-6086285 http://isl.ira.uka.de/~fritsch fax:++49-721-607721 email: fritsch at ira.uka.de ========================================================== Abstract: Hierarchical Connectionist Acoustic Modeling for Domain-Adaptive Large Vocabulary Speech Recognition Juergen Fritsch PhD Thesis, 238 pages Interactive Systems Labs Faculty of Computer Science University of Karlsruhe Germany ABSTRACT This thesis presents a new, hierarchical framework for connectionist acoustic modeling in large vocabulary statistical speech recognition systems. Based on the divide and conquer paradigm, the task of estimating HMM state posteriors is decomposed and distributed in the form of a tree-structured architecture consisting of thousands of small neural networks. In contrast to monolithic connectionist models, our approach scales to arbitrarily large state spaces. Phonetic context is represented simultaneously at multiple resolutions which allows for scalable acoustic modeling. 
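To make the tree-structured decomposition concrete: if every node of the tree holds a small network that outputs a distribution over its children given the acoustic feature vector, the posterior of an HMM state (a leaf) is the product of the conditional probabilities along the path from the root to that leaf. The following is a deliberately tiny, hypothetical Python sketch of that factorisation (random stand-in "networks", invented state names), not code from the thesis:

import numpy as np

rng = np.random.default_rng(1)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

class Node:
    def __init__(self, n_children, dim):
        self.W = rng.normal(scale=0.1, size=(n_children, dim))  # stand-in for a small network
        self.children = []                                      # Nodes or state names

    def p_children(self, x):
        return softmax(self.W @ x)

def leaf_posteriors(node, x, prob=1.0, out=None):
    # Accumulate P(state | x) by multiplying branch probabilities down the tree.
    if out is None:
        out = {}
    p = node.p_children(x)
    for branch, child in enumerate(node.children):
        if isinstance(child, Node):
            leaf_posteriors(child, x, prob * p[branch], out)
        else:                                                   # leaf: an HMM state
            out[child] = prob * p[branch]
    return out

dim = 8                                   # toy acoustic feature dimension
root, left, right = Node(2, dim), Node(2, dim), Node(2, dim)
left.children, right.children = ["s0", "s1"], ["s2", "s3"]
root.children = [left, right]

x = rng.normal(size=dim)
post = leaf_posteriors(root, x)
print(post, sum(post.values()))           # the leaf posteriors sum to 1 by construction

Because the probability mass accumulated along a branch can only shrink, branches whose running product falls below a threshold can be dropped during decoding, which is one way to picture the accelerated score computation mentioned next.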
We demonstrate that the hierarchical structure allows for (1) accelerated score computations through dynamic tree pruning, (2) effective speaker adaptation with limited amounts of adaptation data and (3) downsizing of the trained model for small memory footprints. The viability of the proposed hierarchical model is demonstrated in recognition experiments on the Switchboard large vocabulary conversational telephone speech corpus, currently considered the most difficult standardized speech recognition benchmark, where it achieves state-of-the-art performance with fewer parameters and faster recognition times than conventional mixture models. The second contribution of this thesis is an algorithm that allows for domain-adaptive speech recognition using the proposed hierarchical acoustic model. In contrast to humans, automatic speech recognition systems still suffer from a strong dependence on the application domain they have been trained on. Typically, a speech recognition system has to be tailored to a specific application domain to reduce semantic, syntactic and acoustic variability and thus increase recognition accuracy. Unfortunately, this approach results in a lack of portability as performance typically deteriorates unacceptably when moving to a new application domain. We present Structural Domain Adaptation (SDA), an algorithm for hierarchically organized acoustic models that exploits the scalable specificity of phonetic context modeling by modifying the tree structure for optimal performance on previously unseen application domains. We demonstrate the effectiveness of the SDA approach by adapting a large vocabulary conversational telephone speech recognition system to (1) a telephone dictation task and (2) spontaneous scheduling of meetings. SDA together with domain-specific dictionaries and language models makes it possible to match the performance of domain-specific models with only 45-60 minutes of acoustic adaptation data. From king at harvard.edu Mon Mar 20 17:09:28 2000 From: king at harvard.edu (Gary King) Date: Mon, 20 Mar 2000 17:09:28 -0500 (EST) Subject: positions at HU Message-ID: We'd like to fill these positions with lots of connectionists! --------------------------------------------------------------------- Visiting Faculty, Post-doctoral, and Pre-doctoral Positions at Harvard A major new initiative, "Military Conflict as a Public Health Problem," will commence at Harvard University beginning in the 2000-2001 academic year. This project will support political scientists, statistical methodologists, and public health scholars interested in pursuing their own work or joint work related to this project. We are offering research positions for faculty, post-docs, and graduate students (including salary, office space, and computer access; there are no teaching or administrative duties) for those interested in: - forecasting and explaining international conflict and civil wars - describing or explaining the direct and indirect public health consequences of military conflict - utilizing variables that are usually used to explain or forecast conflict to study the more ultimate dependent variable of human misery (e.g., political scientists have found that democracies do not fight each other as often as other types of countries, but they have not studied in this way the ultimate consequences of democracy for human well-being.)
- developing statistical methods for analyzing these data -- such as neural network models, CART, spatial statistics, data mining, hierarchical Bayesian models for numerous short time series or multiple cross-sections, models for complex dependence structures such as analyzing the presence of war or other variables in pairs of countries, forecasting models, statistical pattern recognition, visualization in large data sets, etc. - conducting research on human security, expanding the notion of military security to include other aspects of human well-being. - analyzing the best data in existence on global mortality and morbidity (by country, age, sex, and cause), military conflict (both international and civil), and thousands of explanatory variables corresponding to known or suspected predictors of each, at every level of aggregation available. Participants will have unrestricted access to these data. - other related topics. A key motivation for this project is to forge new alliances across disciplines, and so we do not expect applicants to be familiar with military conflict AND new statistical approaches AND public health research; expert knowledge in one of these fields and an interest in learning about or contributing to one of the others is sufficient. Please send a letter describing your research interests, a C.V., at least two letters of reference, and a sample of your scholarly work to Lara Birk, Center for Basic Research in the Social Sciences, 34 Kirkland Street, Harvard University, Cambridge, MA 02138; email: lbirk at latte.harvard.edu; fax 617-496-5149; phone 617-495-9271. Please get your application in by April 10th if possible. Please refer any questions you may have to Ms. Birk. The Principal Investigators for this project are Gary King (Professor of Government, Harvard; Director, Harvard-MIT Data Center; and Advisor to the World Health Organization (WHO)) and Christopher Murray (Director, Global Programme on Evidence for Health Policy, WHO; Professor of International Health Economics, Harvard School of Public Health). The project Advisory Committee includes James Alt (Professor of Government, Harvard, and Director of CBRSS), Bear Braumoeller (Assistant Professor of Government, Harvard), Paul E. Farmer, Jr. (Associate Professor, Department of Social Medicine, Harvard Medical School and Director, Institute for Health and Social Justice), Lisa Martin (Professor of Government, Harvard), Jasjeet Sekhon (Assistant Professor of Government, Harvard), Kenji Shibuya (Assistant Professor of Public Health, Teikyo University School of Medicine, Japan), and Langche Zeng (Associate Professor of Political Science, George Washington University and CBRSS fellow). The project is sponsored by the U.S. National Science Foundation, Harvard University's Weatherhead Center for International Affairs (WCFIA), Center for Basic Research in the Social Sciences (CBRSS), and Harvard-MIT Data Center, and it is in collaboration with the World Health Organization's Global Programme on Evidence for Health Policy. : Gary King, King at Harvard.Edu http://GKing.Harvard.Edu : : Center for Basic Research Internet Keyword: Gary King : : in the Social Sciences Direct (617) 495-2027 : : 34 Kirkland Street, Rm. 2 Assistant (617) 495-9271 : : Harvard U, Cambridge, MA 02138 eFax (520) 832-7022 : From bsc at microsoft.com Mon Mar 20 18:48:08 2000 From: bsc at microsoft.com (Bernhard Schoelkopf) Date: Mon, 20 Mar 2000 15:48:08 -0800 Subject: PhD studentship in SVM novelty detection, Univ. 
Oxford / Microsoft Cambridge Message-ID: Applications are invited for a PhD studentship supervised by Prof. L. Tarassenko (Neural Networks and Signal Processing Group, University of Oxford) and Dr. B. Schoelkopf (Microsoft Research, Cambridge, UK) in the field of Support Vector Machines (SVMs) for novelty detection. The detection of novelty is an important generic problem in health monitoring, and the SVM paradigm is an excellent theoretical framework for this. The student will be expected to develop state-of-the-art machine learning algorithms, with a view to applying them to real world problems such as epileptic seizure detection. Applicants with an excellent degree in Computer Science, Mathematics, Engineering, Physics are invited to contact Prof. Tarassenko (Lionel.Tarassenko at eng.ox.ac.uk) or Dr. Schoelkopf (bsc at microsoft.com) for further information. The student will be based at Oxford University, with the possibility of an internship at Microsoft Research during the summer vacations. The studentship, starting 1/10/2000, is generously funded by Microsoft. http://www.eng.ox.ac.uk/~wpcres/Summary/B-Neural.html http://www.research.microsoft.com/~bsc With kind regards Bernhard From steve at cns.bu.edu Mon Mar 20 21:05:17 2000 From: steve at cns.bu.edu (Stephen Grossberg) Date: Mon, 20 Mar 2000 21:05:17 -0500 Subject: The Complementary Brain: A Unifying View of Brain Specialization and Modularity Message-ID: The following article is available at http://www.cns.bu.edu/Profiles/Grossberg/ in HTML, PDF, and Gzipped postscript: Grossberg, S. (2000). The complementary brain: A unifying view of brain specialization and modularity. Trends in Cognitive Sciences, in press. Preliminary version appears as Boston University Technical Report CAS/CNS-TR-98-003. Abstract: How are our brains functionally organized to achieve adaptive behavior in a changing world? This article presents one alternative to the computer metaphor suggesting that brains are organized into independent modules. Evidence is reviewed that brains are organized into parallel processing streams with complementary properties. Hierarchical interactions within each stream and parallel interactions between streams create coherent behavioral representations that overcome the complementary deficiencies of each stream and support unitary conscious experiences. This perspective suggests how brain design reflects the organization of the physical world with which brains interact. Examples from perception, learning, cognition, and action are described, and theoretical concepts and mechanisms by which complementarity is accomplished are presented. Keywords: modulatory, What and Where processing, visual cortex, motor cortex, reinforcement, recognition, attention, learning, expectation, volition, speech, neural network From sunita at it.iitb.ernet.in Tue Mar 21 01:13:26 2000 From: sunita at it.iitb.ernet.in (Dr. Sunita Sarawagi) Date: Tue, 21 Mar 2000 11:43:26 +0530 (IST) Subject: SIGKDD Explorations: call for papers: volume 2, issue 1 Message-ID: We invite submissions to the first issue of the second volume of SIGKDD Explorations to be published by the middle of the year. SIGKDD explorations is the official newsletter of ACM's new Special Interest Group (SIG) on Knowledge Discovery and Data Mining. Two issues of the first volume issue are already out and available online at http://www.acm.org/sigkdd/explorations/index.htm. SIGKDD Explorations newsletter is sent to the ACM SIGKDD membership and to a world-wide network of libraries. 
Submissions can be made in any one of the following categories. - survey/tutorial articles (short) on important topics not exceeding 20 pages - topical articles on problems and challenges - well-articulated position papers - technical articles not exceeding 15 pages - news items on the order of 1-3 paragraphs - brief announcements not exceeding 5 lines in length - review articles of products and methodologies not exceeding 20 pages - reviews/summaries from conferences, panels and special meetings - reports on relevant meetings and committees related to the field *NEW*: We have also added a for-pay advertisement section to allow vendors, companies, consultants, and others to reach the rapidly growing SIGKDD community. Advertising rates start at $250 for a quarter page, $500 for a half page, and $1000 for a full page. Submissions should be made to fayyad at acm.org or sunita at cs.berkeley.edu. All submissions must arrive by May 21, 2000 for inclusion in the next issue. Some words about the SIGKDD newsletter: -------------------------------------- SIGKDD Explorations is a bi-annual newsletter dedicated to serving the SIGKDD membership and community. Our goal is to make the SIGKDD Newsletter an informative, rapidly published and interesting forum for communicating with the SIGKDD community. Submissions will be reviewed by the editor and/or associate editors as appropriate. The distribution will be very wide (on the web, probably without restricted access in the first year, to all members, and to ACM's world-wide network of libraries; members get e-mail notifications of new issues and receive hardcopies if they desire). For more information on SIGKDD visit http://www.acm.org/sigkdd and for more information on the newsletter visit http://www.acm.org/sigkdd/explorations/index.htm Usama Fayyad, Editor-in-Chief fayyad at acm.org Sunita Sarawagi, Associate Editor sunita at cs.berkeley.edu From sally at svl.co.uk Tue Mar 21 04:12:48 2000 From: sally at svl.co.uk (Tickner, Sally) Date: Tue, 21 Mar 2000 10:12:48 +0100 Subject: FW: PERSPECTIVES IN NEURAL COMPUTING BOOK SERIES Message-ID: March 21st 2000 BOOK ANNOUNCEMENT Perspectives in Neural Computing Series Editor: John Taylor Perspectives in Neural Computing is a series of books on both the theoretical and applied aspects of neural computation. Designed to reflect the multidisciplinary nature of the subject, it provides research and tutorial texts for students and professional researchers. JUST PUBLISHED! Artificial Neural Networks in Biomedicine Paulo J.G. Lisboa, Emmanuel C. Ifeachor and Piotr S. Szczepaniak (Eds) If you are: a neural network practitioner involved with biomedical applications; a computer scientist working in medicine or looking for medical applications of computational intelligence methods; a developer and manufacturer of clinical computer systems; a medical researcher looking for new methods and computational tools; or a (post)graduate student on a relevant computer science or engineering course, then this book is for you.
Among the contents are: * a set of tutorial papers covering established methods of best practice in neural network design * an extensive collection of case studies covering commercially available products, recently granted patents, and a wide range of applications which this new methodology is opening up for practical development in biomedicine £45.00 February 2000 272 pages softcover ISBN 1-85233-005-8 To request more information about the series, or any other Springer-Verlag books and journals, please contact Sally Tickner, Senior Marketing Manager, Springer-Verlag London Ltd., Sweetapple House, Catteshall Road, Godalming, Surrey GU7 3DJ Tel: 01483 414113 Fax: 01483 415151 Email: sally at svl.co.uk www.springer.co.uk ** End Sally Tickner, Sr. Marketing Manager, Springer-Verlag London Ltd, Sweetapple House, Catteshall Road, Godalming, Surrey, GU7 3DJ, UK Tel: 01483 414113 Fax: 01483 415151 Email: sally at svl.co.uk www.springer.co.uk www.springer.de From dwang at cis.ohio-state.edu Tue Mar 21 16:35:52 2000 From: dwang at cis.ohio-state.edu (DeLiang Wang) Date: Tue, 21 Mar 2000 16:35:52 -0500 Subject: Recent papers on perception and neurodynamics Message-ID: <38D7EAE7.3C498E5@cis.ohio-state.edu> The following papers, available at http://www.cis.ohio-state.edu/~dwang/announce.html, may be of interest to the list: 1. Liu X. and Wang D.L. (1999): "Perceptual organization based on temporal dynamics." Proceedings of NIPS-99, in press. A figure-ground segregation network is proposed based on a novel boundary pair representation. Nodes in the network are boundary segments obtained through local grouping. Each node is excitatorily coupled with the neighboring nodes that belong to the same region, and inhibitorily coupled with the corresponding paired node. Gestalt grouping rules are incorporated by modulating connections. The status of a node represents its probability of being figural and is updated according to a differential equation. The system solves the figure-ground segregation problem through temporal evolution. Different perceptual phenomena, such as modal and amodal completion, virtual contours, grouping and shape decomposition, are then explained through local diffusion. The system eliminates combinatorial optimization and accounts for many psychophysical results with a fixed set of parameters. 2. Wang D.L. (2000): "On connectedness: a solution based on oscillatory correlation." Neural Computation, vol. 12, pp. 131-139. A long-standing problem in neural computation has been the problem of connectedness, first identified by Minsky and Papert in 1969. This problem served as the cornerstone for them to analytically establish that perceptrons are fundamentally limited in computing geometrical (topological) properties. A solution to this problem is offered by a different class of neural networks - oscillator networks. To solve the problem, the representation of oscillatory correlation is employed whereby one pattern is represented as a synchronized block of oscillators, and different patterns are represented by distinct blocks that desynchronize from each other. Oscillatory correlation emerges from a LEGION network, whose architecture consists of local excitation and global inhibition among neural oscillators. It is further shown that these oscillator networks exhibit sensitivity to topological structure, which may lay a neurocomputational foundation for explaining the psychophysical phenomenon of topological perception. 3. Wang D.L. (1999): Relaxation oscillators and networks.
In Webster J. (ed.), Wiley Encyclopedia of Electrical and Electronics Engineering, Wiley & Sons, vol. 18, pp. 396-405. A tutorial article on oscillatory dynamics and its applications to auditory and visual scene analysis. -- ------------------------------------------------------------ Dr. DeLiang Wang Department of Computer and Information Science The Ohio State University 2015 Neil Ave. Columbus, OH 43210-1277, U.S.A. Email: dwang at cis.ohio-state.edu Phone: 614-292-6827 (OFFICE); 614-292-7402 (LAB) Fax: 614-292-2911 URL: http://www.cis.ohio-state.edu/~dwang From C.Campbell at bristol.ac.uk Wed Mar 22 08:23:21 2000 From: C.Campbell at bristol.ac.uk (Colin Campbell, Engineering Mathematics) Date: Wed, 22 Mar 2000 13:23:21 +0000 (GMT Standard Time) Subject: PhD studentship in kernel methods/SVMs Message-ID: ***PhD studentship: Kernel Methods for Bioinformatics*** Applications are invited for a PhD studentship in the field of kernel methods (for example, Support Vector Machines) and their application to biosequence data. This position is a project studentship funded by the EPSRC and will pay all fees and maintenance for an applicant who is a citizen of the European Union. The grant also includes ample funds for travel and conference attendence. The emphasis of the proposed research is on the development of new algorithms and theoretical work. Consequently applicants should have a good first degree with a substantial mathematical component. A background in computing would also be an advantage. The applications component will be the development and evaluation of kernel methods specifically designed for handling classification tasks arising in bioinformatics. Further details about the proposed research area may be obtained from our webpage http://lara.enm.bris.ac.uk/cig which has a downloadable review paper (An Introduction to Kernel Methods) describing the subject in more detail. Non-EU candidates may also apply for the studentship but because of the difference between EU and overseas fees only exceptionally able candidates can be considered. Further details can be obtained from: Dr. Colin Campbell, Dept. of Engineering Mathematics, Queen's Building, University of Bristol, Bristol BS8 1TR United Kingdom Email: C.Campbell at bris.ac.uk From nnsp00 at neuro.kuleuven.ac.be Thu Mar 23 08:26:06 2000 From: nnsp00 at neuro.kuleuven.ac.be (NNSP2000, Sydney) Date: Thu, 23 Mar 2000 14:26:06 +0100 Subject: IEEE workshop on Neural Networks for Signal Processing (NNSP), Sydney, Australia, December 2000. Message-ID: <38DA1B6E.B86F0D53@neuro.kuleuven.ac.be> In response to the many requests we received recently, regarding the extension of paper submission deadline for NNSP'2000, the organizing committee has decided to extend the due date for initial paper submission to 15 April, 2000. The new version of the Call for Paper which reflects the change is attached for your information and reference. In case you would like to be removed from our mailing list: reply to this mail with as subject "remove" and the e-mail address you received this message on. Marc M. 
Van Hulle Katholieke Universiteit Leuven Belgium *********************************************** **** CALL FOR PAPERS **** **** submission deadline: April 15, 2000 **** *********************************************** December 11-13, 2000, Sydney, Australia NNSP'2000 homepage: http://eivind.imm.dtu.dk/nnsp2000 Thanks to the sponsorship of the IEEE Signal Processing Society and the IEEE Neural Networks Council, the tenth of a series of IEEE workshops on Neural Networks for Signal Processing will be held at the University of Sydney Campus, Sydney, Australia. The workshop will feature keynote lectures, technical presentations, and panel discussions. Papers are solicited for, but not limited to, the following areas: Algorithms and Architectures: Artificial neural networks (ANN), adaptive signal processing, Bayesian modeling, MCMC, generalization, design algorithms, optimization, parameter estimation, nonlinear signal processing, Markov models, fuzzy systems (FS), evolutionary computation (EC), synergistic models of ANN/FS/EC, and wavelets. Applications: Speech processing, image processing, sonar and radar, data fusion, intelligent multimedia and web processing, OCR, robotics, adaptive filtering, blind source separation, communications, sensors, system identification, and other general signal processing and pattern recognition applications. Implementations: Parallel and distributed implementation, hardware design, and other general implementation technologies. PAPER SUBMISSION PROCEDURE Prospective authors are invited to submit a full paper of up to six pages using the electronic submission procedure described at the workshop homepage: http://eivind.imm.dtu.dk/nnsp2000 Accepted papers will be published in a hard-bound volume by IEEE and distributed at the workshop. Extended versions of the best workshop papers will be selected and published in a Special Issue of an international journal published by Kluwer Academic Publishers. SCHEDULE Submission of full paper: April 15, 2000 Notification of acceptance: May 31, 2000 Submission of photo-ready accepted paper: July 15, 2000 Super Early registration, before: July 15, 2000 Advanced registration, before: September 15, 2000 ORGANIZATION Honorary Chair Bernard WIDROW Stanford University General Chairs Ling GUAN University of Sydney email: ling at ee.usyd.edu.au Kuldip PALIWAL Griffith University email: kkp at shiva2.me.gu.edu.au Program Chairs Tülay ADALI University of Maryland, Baltimore County email: adali at umbc.edu Jan LARSEN Technical University of Denmark email: jl at imm.dtu.dk Finance Chair Raymond Hau-San WONG University of Sydney email: hswong at ee.usyd.edu.au Proceedings Chairs Elizabeth J. WILSON Raytheon Co. email: bwilson at ed.ray.com Scott C. DOUGLAS Southern Methodist University email: douglas at seas.smu.edu Publicity Chair Marc van HULLE Katholieke Universiteit, Leuven email: Marc.VanHulle at med.kuleuven.ac.be Registration and Local Arrangements Stuart PERRY Defence Science and Technology Organisation email: Stuart.Perry at dsto.defence.gov.au Europe Liaison Jean-Francois CARDOSO ENST email: cardoso at sig.enst.fr America Liaison Amir ASSADI University of Wisconsin at Madison email: ahassadi at facstaff.wisc.edu Asia Liaison Andrew BACK RIKEN email: andrew.back at usa.net Program Committee Amir Assadi Yianni Attikiouzel John Asenstorfer Andrew Back Geoff Barton Hervé
Bourlard Andy Chalmers Zheru Chi Andrzej Cichocki Tharam Dillon Tom Downs Hsin Chia Fu Suresh Hangenahally Marwan Jabri Haosong Kong Shigeru Katagiri Anthony Kuh Yi Liu Fa-Long Luo David Miller Christophe Molina M Mohammadian Erkki Oja Soo-Chang Pei Jose Principe Ponnuthurai Suganthan Ah Chung Tsoi Marc Van Hulle A.N. Venetsanopoulos Yue Wang Wilson Wen From austin at minster.cs.york.ac.uk Thu Mar 23 10:52:19 2000 From: austin at minster.cs.york.ac.uk (Jim Austin) Date: Thu, 23 Mar 2000 15:52:19 +0000 Subject: PhD Studentship in Neural Networks for Image Analysis, York, UK. Message-ID: <10003231552.ZM15574@minster.cs.york.ac.uk> PhD Studentship in Neural Networks for Image Analysis Advanced Computer Architectures Group Dept. of Computer Science University of York York UK. Oct 2000 - Sep 2003 A University funded PhD studentship is available from Oct 2000 in the application of neural networks to image analysis problems. The work will build on research in our group on the use of neural network based associative memories for the high performance analysis of images. In particular the work by Simon O'Keefe and Christos Orovas. For details see 133 and 102 on our publications list at http://www.cs.york.ac.uk/arch/neural/publications/papers.html The project will be supervised by Prof. Jim Austin and Dr. Simon O'Keefe within an active group working in neural networks research. Details of the groups research can be found at http://www.cs.york.ac.uk/arch/neural For further details please contact Prof. Jim Austin (austin at cs.york.ac.uk) or write, for an application form, to Filomena Ottaway, Graduate Secretary, Dept. of Computer Science, University of York, York, YO10 5DD, UK, or email filo at cs.york.ac.uk. -- Jim Austin, Professor of Neural Computation Advanced Computer Architecture Group, Department of Computer Science, University of York, York, YO10 5DD, UK. Tel : 01904 43 2734 Fax : 01904 43 2767 web pages: http://www.cs.york.ac.uk/arch/ From stefan.wermter at sunderland.ac.uk Thu Mar 23 14:40:54 2000 From: stefan.wermter at sunderland.ac.uk (Stefan.Wermter) Date: Thu, 23 Mar 2000 19:40:54 +0000 Subject: Book on Hybrid Neural Systems Message-ID: <38DA7346.85D39EAC@sunderland.ac.uk> NEW BOOK ON HYBRID NEURAL SYSTEMS ==================================== Title: Hybrid Neural Systems Stefan Wermter, University of Sunderland, UK Ron Sun, University of Missouri, Columbia, MO, USA (Eds.) More details on this book Hybrid Neural Systems can be gained from its web page at http://www.his.sunderland.ac.uk/ -> New Book and http://www.his.sunderland.ac.uk/newbook/hybrid.html (all abstracts and first chapter) Overview --------- Keywords: Artificial Neural Networks, Hybrid Neural Systems, Connectionism, Hybrid Symbolic Neural Architectures, Cognitive Neuroscience, Machine Learning, Language Processing The aim of this book is to present a broad spectrum of current research in hybrid neural systems, and advance the state of the art in neural networks and artificial intelligence. Hybrid neural systems are computational systems which are based mainly on artificial neural networks but which also allow a symbolic interpretation or interaction with symbolic components. This book focuses on the following issues related to different types of representation: How does neural representation contribute to the success of hybrid systems? How does symbolic representation supplement neural representation? How can these types of representation be combined? How can we utilize their interaction and synergy? 
How can we develop neural and hybrid systems for new domains? What are the strengths and weaknesses of hybrid neural techniques? Are current principles and methodologies in hybrid neural systems useful? How can they be extended? What will be the impact of hybrid and neural techniques in the future? Table of Contents ------------------ An Overview of Hybrid Neural Systems Stefan Wermter and Ron Sun Structured Connectionism and Rule Representation --------------------------------- Layered Hybrid Connectionist Models for Cognitive Science Jerome Feldman and David Bailey Types and Quantifiers in SHRUTI --- A Connectionist Model of Rapid Reasoning and Relational Processing Lokendra Shastri A Recursive Neural Network for Reflexive Reasoning Steffen Hölldobler, Yvonne Kalinke and Jörg Wunderlich A Novel Modular Neural Architecture for Rule-based and Similarity-based Reasoning Rafal Bogacz and Christophe Giraud-Carrier Addressing Knowledge-Representation Issues in Connectionist Symbolic Rule Encoding for General Inference Nam Seog Park Towards a Hybrid Model of First-Order Theory Refinement Nelson A. Hallack, Gerson Zaverucha and Valmir C. Barbosa Distributed Neural Architectures and Language Processing -------------------------------------- Dynamical Recurrent Networks for Sequential Data Processing Stefan Kremer and John Kolen Fuzzy Knowledge and Recurrent Neural Networks: A Dynamical Systems Perspective Christian W. Omlin, Lee Giles and Karvel K. Thornber Combining Maps and Distributed Representations for Shift-Reduce Parsing Marshall R. Mayberry and Risto Miikkulainen Towards Hybrid Neural Learning Internet Agents Stefan Wermter, Garen Arevian and Christo Panchev A Connectionist Simulation of the Empirical Acquisition of Grammatical Relations William C. Morris, Garrison W. Cottrell and Jeffrey L. Elman Large Patterns Make Great Symbols: An Example of Learning from Example Pentti Kanerva Context Vectors: A Step Toward a Grand Unified Representation Stephen I. Gallant Integration of Graphical Rules with Adaptive Learning of Structured Information Paolo Frasconi, Marco Gori and Alessandro Sperduti Transformation and Explanation --------------------- Lessons from Past, Current Issues and Future Research Directions in Extracting the Knowledge Embedded in Artificial Neural Networks Alan B. Tickle, Frederic Maire, Guido Bologna, Robert Andrews and Joachim Diederich Symbolic Rule Extraction from the DIMLP Neural Network Guido Bologna Understanding State Space Organization in Recurrent Neural Networks with Iterative Function Systems Dynamics Peter Tino, Georg Dorffner and Christian Schittenkopf Direct Explanations and Knowledge Extraction from a Multilayer Perceptron Network that Performs Low Back Pain Classification Marilyn L. Vaughn, Steven J. Cavill, Stewart J. Taylor, Michael A. Foy and Anthony J.B. Fogg High Order Eigentensors as Symbolic Rules in Competitive Learning Hod Lipson and Hava T. Siegelmann Holistic Symbol Processing and the Sequential RAAM: An Evaluation James A. Hammerton and Barry L.
Kalman Robotics, Vision and Cognitive Approaches -------------------------------------------- Life, Mind and Robots: The Ins and Outs of Embodied Cognition Noel Sharkey and Tom Ziemke Supplementing Neural Reinforcement Learning with Symbolic Methods Ron Sun Self-Organizing Maps in Symbol Processing Timo Honkela Evolution of Symbolisation: Signposts to a Bridge between Connectionist and Symbolic Systems Ronan Reilly A Cellular Neural Associative Array for Symbolic Vision Christos Orovas and James Austin Application of Neurosymbolic Integration for Environment Modelling in Mobile Robots Gerhard K. Kraetzschmar, Stefan Sablatnög, Stefan Enderle, Günther Palm ====================================================== Online order -------- http://www.springer.de/cgi-bin/search_book.pl?isbn=3-540-67305-9 Publisher: Springer Publication Date: 29 March 2000 Wermter, S., University of Sunderland, UK Sun, R., University of Missouri, Columbia, MO, USA (Eds.) Hybrid Neural Systems 2000. IX, 403 pp. 3-540-67305-9 DM 86,- Recommended List Price LNCS 1778 *************************************** Professor Stefan Wermter Research Chair in Intelligent Systems University of Sunderland Centre of Informatics, SCET St Peters Way Sunderland SR6 0DD United Kingdom phone: +44 191 515 3279 fax: +44 191 515 2781 email: stefan.wermter at sunderland.ac.uk http://www.his.sunderland.ac.uk/~cs0stw/ http://www.his.sunderland.ac.uk/ **************************************** From jf218 at hermes.cam.ac.uk Thu Mar 23 04:56:53 2000 From: jf218 at hermes.cam.ac.uk (Dr J. Feng) Date: Thu, 23 Mar 2000 09:56:53 +0000 (GMT) Subject: six papers available at my homepage Message-ID: Dear connectionists, You can find the following papers at http://www.cus.cam.ac.uk/~jf218 [46] Feng J., and Brown D. (2000) Integrate-and-fire models with nonlinear leakage Bulletin of Mathematical Biology (in press) ABSTRACT Can we express biophysical neuronal models as integrate-and-fire models with leakage coefficients which are no longer constant, as in the conventional leaky integrate-and-fire (IF) model, but functions of membrane potential and other biophysical variables? We illustrate the answer to this question using the FitzHugh-Nagumo (FHN) model as an example. A novel integrate-and-fire model, the IF-FHN model, which approximates the FHN model, is obtained. The leakage coefficients derived in the IF-FHN model have non-monotonic relationships with membrane potential, revealing at least in part the intrinsic mechanisms underlying the model. The model correspondingly exhibits more complex behaviour than the standard IF model. For example, in some parameter regions, the IF-FHN model has a coefficient of variation of the output interspike interval which is independent of the number of inhibitory inputs, being close to unity over the whole range, comparable to the FHN model as we noted previously. [45] Davison A., Feng J., Brown D. (2000) A reduced compartmental model of the mitral cell for use in network models of the olfactory bulb Brain Research Bulletin vol. 51, 393-399. ABSTRACT We have developed two-, three- and four-compartment models of a mammalian olfactory bulb mitral cell as a reduction of a complex 286-compartment model. A minimum of three compartments, representing soma, secondary dendrites and the glomerular tuft of the primary dendrite, is required to adequately reproduce the behaviour of the full model over a broad range of firing rates. Adding a fourth compartment to represent the shaft of the primary dendrite gives a substantial improvement.
The reduced models exhibit behaviours in common with the full model which were not used in fitting the model parameters. The reduced models run 75 or more times faster than the full model, making their use in large, realistic network models of the olfactory bulb practical. [44] Feng J., Brown D., and Li G. (2000) Synchronization due to common pulsed input in Stein's model Physical Review E vol. 61, 2987-2995. ABSTRACT It is known that stimulus-evoked oscillatory synchronisation among neurones occurs in widely separated cortical regions. In this paper we provide a possible mechanism to explain the phenomena. When a common, random input is presented, we find that a group of neurones -- of Stein's (integrate-and-fire) model type with or without reversal potentials -- are capable of quickly synchronising their firing. Interestingly, the optimal average synchronisation time occurs when the common input has a high CV (ISI) (greater than 0.5) for this model with or without reversal potentials. The model with reversal potentials synchronises more quickly than that without reversal potentials. [43] Feng, J., and Tirozzi B. (2000) Stochastic resonance tuned by correlations in neuronal models. Phys. Rev. E. (in press, April) ABSTRACT The idea that neurons might use stochastic resonance (SR) to take advantage of random signals has been extensively discussed in the literature. However, there are a few key issues which have not been clarified, and thus it is difficult to assess whether SR in neuronal models occurs inside plausible physiological parameter regions or not. We propose and show that neurons can adjust correlations between synaptic inputs, which can be measured in experiments and are dynamical variables, to exhibit SR. The benefit of such a mechanism over the conventional SR is also discussed. [42] Feng J., and Brown D. (2000). Impact of correlated inputs on the output of the integrate-and-fire model Neural Computation vol. 12, 711-732. ABSTRACT For the integrate-and-fire model with or without reversal potentials, we consider how correlated inputs affect the variability of cellular output. For both models the variability of efferent spike trains measured by the coefficient of variation of the interspike interval (abbreviated to CV in the remainder of the paper) is a nondecreasing function of input correlation. When the correlation coefficient is greater than 0.09, the CV of the integrate-and-fire model without reversal potentials is always above 0.5, no matter how strong the inhibitory inputs are. When the correlation coefficient is greater than 0.05, the CV for the integrate-and-fire model with reversal potentials is always above 0.5, independent of the strength of the inhibitory inputs. Under a given condition on correlation coefficients we find that correlated Poisson processes can be decomposed into independent Poisson processes. We also develop a novel method to estimate the distribution density of the first passage time of the integrate-and-fire model. [41] Feng J., Georgii H.O., and Brown D. (2000) Convergence to global minima for a class of diffusion processes Physica A vol. 276, 465-476. ABSTRACT We prove that there exists a gain function $(\eta(t),\beta(t))_{t\ge 0}$ such that the solution of the SDE $dx_t=\eta(t)(-\mbox{ grad } U(x_t)dt +\beta(t)dB_t)$ 'settles' down on the set of global minima of $U$.
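To see the mechanism behind this statement numerically, here is a toy Euler-Maruyama discretisation of the SDE above on a one-dimensional double-well potential; the potential and the gain and noise schedules below are invented for illustration and are not the schedules constructed in the paper:

import numpy as np

rng = np.random.default_rng(2)

def U(x):                         # double well: local minimum near +0.9, global near -1.1
    return 0.25 * x**4 - 0.5 * x**2 + 0.2 * x

def gradU(x):
    return x**3 - x + 0.2

T, dt = 200.0, 0.01
x = 1.0                           # start in the basin of the *local* minimum
for k in range(int(T / dt)):
    t = k * dt
    eta = 1.0                     # assumed gain schedule
    beta = 1.5 / np.log(2.0 + t)  # slowly decaying noise level ("annealing")
    x += eta * (-gradU(x) * dt + beta * np.sqrt(dt) * rng.normal())

print(x)                          # typically ends near the global minimum around x = -1.1

Early on the noise is large enough to carry the state over the barrier; as beta(t) decays, the dynamics increasingly follow the gradient and the state settles in the deeper well.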
In particular the existence of a gain function $(\eta(t))_{t\ge 0}$ so that $y_t$ satisfying $dy_t=\eta(t)(-\mbox{ grad } U(y_t)dt +dB_t)$ converges to the set of the global minima of $U$ is verified. Then we apply the results to the Robbins-Monro and the Kiefer-Wolfowitz procedures which are of particular interest in statistics and neural networks. with best regards Jianfeng Feng The Babraham Institute Cambridge CB2 4AT UK From cl at andrew.cmu.edu Mon Mar 27 10:27:00 2000 From: cl at andrew.cmu.edu (Christian Lebiere) Date: Mon, 27 Mar 2000 10:27:00 -0500 Subject: ACT-R Summer School and Workshop Message-ID: <50723163.3163141620@blubber.psy.cmu.edu> [** Final reminder: the summer school application deadline is April 1st. **] SEVENTH ANNUAL ACT-R SUMMER SCHOOL AND WORKSHOP =============================================== Carnegie Mellon University - July/August 2000 ============================================= ACT-R is a hybrid cognitive theory and simulation system for developing cognitive models for tasks that vary from simple reaction time to air traffic control. The most recent advances of the ACT-R theory were detailed in the recent book "The Atomic Components of Thought" by John R. Anderson and Christian Lebiere, published in 1998 by Lawrence Erlbaum Associates. Each year, a two-week summer school is held to train researchers in the use of the ACT-R system, followed by a three-day workshop to enable new and current users to exchange research results and ideas. The Seventh Annual ACT-R Summer School and Workshop will be held at Carnegie Mellon University in Pittsburgh in July/August 2000. SUMMER SCHOOL: The summer school will take place from Monday July 24 to Friday August 4, with the intervening Sunday free. This intensive 11-day course is designed to train researchers in the use of ACT-R for cognitive modeling. It is structured as a set of 8 units, with each unit lasting a day and involving a morning theory lecture, a web-based tutorial, an afternoon discussion session and a homework assignment which students are expected to complete during the day and evening. The final three days of the summer school will be devoted to individual research projects. Computing facilities for the tutorials, assignments and research projects will be provided. Successful student projects will be presented at the workshop, which all summer school students are expected to attend as part of their training. To provide an optimal learning environment, admission is limited to a dozen participants, who must submit by APRIL 1 an application consisting of a curriculum vitae, a statement of purpose and a one-page description of the data set that they intend to model as their research project. The data set can be the applicant's own or can be taken from the published literature. Applicants will be notified of admission by APRIL 15. Admission to the summer school is free. A stipend of up to $750 is available to graduate students for reimbursement of travel, housing and meal expenses. To qualify for the stipend, students must be US citizens and join to their application a letter of reference from a faculty member. WORKSHOP: The workshop will take place from the morning of Saturday August 5 to Monday August 7 at noon. Mornings will be devoted to research presentations, each lasting about 20 minutes plus questions. Participants are invited to present their ACT-R research by submitting a one-page abstract with their registration. 
Informal contributions of up to 8 pages can be submitted by August 1 for inclusion in the workshop proceedings. Afternoons will feature more research presentations as well as discussion sessions and instructional tutorials. Suggestions for the topics of the tutorials and discussion sessions are welcome. Evenings will be occupied by demonstration sessions, during which participants can gain a more detailed knowledge of the models presented and engage in unstructured discussions. Admission to the workshop is open to all. The early registration fee (before July 1) is $100 and the late registration fee (after July 1) is $125. A registration form is appended below. Additional information (detailed schedule, etc.) will appear on the ACT-R Web site (http://act.psy.cmu.edu/) when available or can be requested at: 2000 ACT-R Summer School and Workshop Psychology Department Attn: Helen Borek Baker Hall 345C Fax: +1 (412) 268-2844 Carnegie Mellon University Tel: +1 (412) 268-3438 Pittsburgh, PA 15213-3890 Email: helen+ at cmu.edu ________________________________________________________ Seventh Annual ACT-R Summer School and Workshop July 24 to August 7, 2000 at Carnegie Mellon University in Pittsburgh REGISTRATION ============ Name: .................................................................. Address: .................................................................. .................................................................. .................................................................. Tel/Fax: .................................................................. Email: .................................................................. Summer School (July 24 to August 4): ........ (check here to apply) ==================================== Applications are due APRIL 1. Acceptance will be notified by APRIL 15. Applicants MUST include a curriculum vitae, a short statement of purpose and a one-page description of the data set that they intend to model. A stipend of up to $750 is available for the reimbursement of travel, lodging and meal expenses (receipts needed). To qualify for the stipend, the applicant must be a graduate student with US citizenship and include with the application a letter of reference from a faculty member. Check here to apply for stipend: ........ Workshop (August 5 to 7): ........ (check here to register) ========================= Presentation topic (optional - include one-page abstract with registration): ......................................................................... Registration fee: Before July 1: $100 ... After July 1: $125 ... The fee is due upon registration. Please send checks or money orders only. We cannot accept credit cards. HOUSING ======= Housing is available in Resnick House, a CMU dormitory that offers suite-style accommodations. Rooms include air-conditioning, a semi-private bathroom and a common living room for suite-mates. Last year's rates were $180.75/week/person or $32.60/night/person for single rooms and $134.25/week/person or $24.25/night/person for double rooms. Housing reservations will be taken after acceptance to the summer school. Do not send money. See http://www.housing.cmu.edu/conferences/ for further housing information. To reserve a room in Resnick House, fill in the dates and select one of the three room options: I will stay from ................ to ................ 1. ... I want a single room 2. ... I want a double room and I will room with ................ 3. ... I want a double room. 
Please select a roommate of ....... gender ROOM PAYMENT IS DUE UPON CHECK-IN. DO NOT SEND MONEY. The recommended hotel is the Holiday Inn University Center, located on the campus of the University of Pittsburgh within easy walking distance of CMU. Contact the Holiday Inn directly at +1 (412) 682-6200. Send this form to: 2000 ACT-R Summer School and Workshop Psychology Department Attn: Helen Borek Baker Hall 345C Fax: +1 (412) 268-2844 Carnegie Mellon University Tel: +1 (412) 268-3438 Pittsburgh, PA 15213-3890 Email: helen+ at cmu.edu From vdavidsanchez at earthlink.net Sun Mar 26 12:18:54 2000 From: vdavidsanchez at earthlink.net (V. David Sanchez A.) Date: Sun, 26 Mar 2000 09:18:54 -0800 Subject: NEUROCOMPUTING - new address Message-ID: <38DE467E.D1B31C43@earthlink.net> Ladies and gentlemen, please send all submissions and inquiries to the following new address: Advanced Computational Intelligent Systems Attn.: V. David Sanchez A. NEUROCOMPUTING - Editor in Chief - P.O. Box 60130 Pasadena, CA 91116-6130 U.S.A. Fax: (626) 793-5120 Email: vdavidsanchez at earthlink.net URL: http://www.elsevier.nl/locate/neucom From john at dcs.rhbnc.ac.uk Tue Mar 28 02:43:45 2000 From: john at dcs.rhbnc.ac.uk (John Shawe-Taylor) Date: Tue, 28 Mar 2000 08:43:45 +0100 Subject: PhD Studentships available Message-ID: <200003280743.IAA28309@platon.cs.rhbnc.ac.uk> PhD Studentships available at Royal Holloway, University of London Royal Holloway, University of London Department of Computer Science currently has the following FUNDED PhD STUDENTSHIPS available: - THREE FULLY-FUNDED COLLEGE RESEARCH STUDENTSHIPS covering tuition fees at Home and EU rates and maintenance for 3 years - TWO FEES-ONLY COLLEGE RESEARCH STUDENTSHIPS covering tuition fees at Home and EU rates - EPSRC RESEARCH STUDENTSHIP covering tuition fees at Home and EU rates and maintenance for 3 years The above studentships are available in any area of the Department's research interests: Computational Learning, Kolmogorov Complexity, Bioinformatics, Formal methods, Languages and Architectures, and Constraint Satisfaction. The department currently has the following FUNDED MSc STUDENTSHIP available: - COLLEGE MASTERS STUDENTSHIP covering tuition fees at HEU rates, available to students taking MSc in Computer Science by Research Closing date for applications for studentships is 2 June 2000. Further information and application forms for these postgraduate degree programmes may be obtained from the Director of Graduate Studies Steve Schneider at S.Schneider at dcs.rhbnc.ac.uk From Marc.VanHulle at med.kuleuven.ac.be Wed Mar 29 02:16:16 2000 From: Marc.VanHulle at med.kuleuven.ac.be (Marc Van Hulle) Date: Wed, 29 Mar 2000 09:16:16 +0200 Subject: book announcement: Self-Organization and Topographic Maps Message-ID: <38E1ADBF.7E1765C@neuro.kuleuven.ac.be> NEW BOOK ON SELF-ORGANIZATION AND TOPOGRAPHIC MAPS ================================================== Title: Faithful Representations and Topographic Maps From Distortion- to Information-based Self-organization Author: Marc M. Van Hulle Publisher: J. Wiley & Sons, Inc. Publication Date: February 2000 with forewords by Teuvo Kohonen and Helge Ritter ------------------------------------------------------------------------------- A new perspective on topographic map formation and the advantages of information-based learning The study of topographic map formation provides us with important tools for both biological modeling and statistical data modeling. 
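As a point of reference for the distortion-based learning that the announcement below contrasts with information-based learning, here is a minimal, generic Python sketch of the classical SOM update rule (a textbook version with made-up parameters, not code from the book): each input pulls the best-matching unit and its map neighbours towards it.

import numpy as np

rng = np.random.default_rng(3)

n_units, dim = 20, 2
W = rng.uniform(size=(n_units, dim))              # codebook vectors
pos = np.arange(n_units)                          # unit positions along a 1-D map

for t in range(5000):
    x = rng.uniform(size=dim)                     # training input
    winner = np.argmin(((W - x) ** 2).sum(axis=1))
    lr = 0.5 * np.exp(-t / 2000.0)                # decaying learning rate
    sigma = 5.0 * np.exp(-t / 2000.0) + 0.5       # shrinking neighbourhood radius
    h = np.exp(-((pos - winner) ** 2) / (2.0 * sigma ** 2))
    W += lr * h[:, None] * (x - W)                # move winner and neighbours towards x

# Neighbouring units end up coding for neighbouring regions of input space.

As the announcement below explains, such distortion-based rules do not in general give every neuron an equal probability of being active, which is what motivates the book's information-based alternative.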
Faithful Representations and Topographic Maps offers a unified, systematic survey of this rapidly evolving field, focusing on current knowledge and available techniques for topographic map formation. The author presents a cutting-edge, information-based learning strategy for developing equiprobabilistic topographic maps -- that is, maps in which all neurons have an equal probability to be active --, clearly demonstrating how this approach yields faithful representations and how it can be successfully applied in such areas as density estimation, regression, clustering, and feature extraction. The book begins with the standard approach of distortion-based learning, discussing the commonly used Self-Organizing Map (SOM) algorithm and other algorithms, and pointing out their inadequacy for developing equiprobabilistic maps. It then examines the advantages of information-based learning techniques, and finally introduces a new algorithm for equiprobabilistic topographic map formation using neurons with kernel-based response characteristics. The complete learning algorithms and simulation details are given throughout, along with comparative performance analysis tables and extensive references. Faithful Representations and Topographic Maps is an excellent, eye-opening guide for neural network researchers, industrial scientists involved in data mining, and anyone interested in self-organization and topographic maps. ------------------------------------------------------------------------------- "I am convinced that this book marks an important contribution to the field of topographic map representations and that it will become a major reference for many years." (Ritter) "This book will provide a significant contribution to our theoretical understanding of the brain." (Kohonen) ------------------------------------------------------------------------------- http://www.amazon.com/exec/obidos/ASIN/0471345075/qid=948382599/sr=1-1/002-0713799-7248240 http://www.barnesandnoble.com/ search for (Keyword): Faithful representations From labbi at cui.unige.ch Thu Mar 30 07:09:57 2000 From: labbi at cui.unige.ch (A.R. Labbi) Date: Thu, 30 Mar 2000 14:09:57 +0200 Subject: Postdoc. in Machine Learning/ Statistical Modeling in Geneva, Switzerland References: <10003231552.ZM15574@minster.cs.york.ac.uk> Message-ID: <38E34415.EA3D8BD@cui.unige.ch> POSTDOC at the CSD of the University of Geneva - Switzerland ----------------------------------------------------- The Computer Science Department of the University of Geneva (Switzerland) is seeking a postdoctoral fellow to participate in two Swiss-NSF funded research projects about biomedical data analysis for visual object recognition, and textual data analysis for document categorization. The two projects have strong overlap on the computational side since they both address common machine learning and statistical modeling issues such as classification and clustering (for more details, please visit: http://cuiwww.unige.ch/~labbi ). Candidates for this position should have a strong background in machine learning or statistical modeling as well as a sound experience in statistical image processing and an interest in statistical text processing (or vice versa). The candidate who will fill the position will work in a multidisciplinary environment where computer scientists, mathematicians, biologists, and computational linguists collaborate in stimulating research projects. 
Therefore, the postdoc will have the opportunity to develop new interests and initiate new research directions in the department. The position is to be filled as soon as possible, and is initially for one year, extendable for another year or possibly more. If you have a Ph.D. in applied mathematics, computer science, or a related field, and have programming experience (e.g. Matlab and/or C/C++ and/or Java), and if you are interested in joining an attractive multi-cultural work environment with modern computing and teaching facilities, please mail, e-mail, or fax your résumé with the names and addresses (including e-mails) of two references to: Dr. Abderrahim Labbi or Prof. Christian Pellegrini Dept. of Computer Science University of Geneva 24, rue du General Dufour 1204 Geneva - Switzerland Fax: +41 22 705 77 80 E-mail: {Abderrahim.Labbi or Christian.Pellegrini}@cui.unige.ch From cweber at cs.tu-berlin.de Thu Mar 30 18:27:57 2000 From: cweber at cs.tu-berlin.de (Cornelius Weber) Date: Fri, 31 Mar 2000 01:27:57 +0200 (MET DST) Subject: Paper available Message-ID: The following IJCNN'00 paper has been accepted and is available on-line. I would be happy to receive any feedback. Structured models from structured data: emergence of modular information processing within one sheet of neurons Abstract: In our contribution we investigate how structured information processing within a neural net can emerge as a result of unsupervised learning from data. Our model consists of input neurons and hidden neurons which are recurrently connected and which represent the thalamus and the cortex, respectively. On the basis of a maximum likelihood framework, the task is to generate the given input data using the code of the hidden units. Hidden neurons are fully connected, allowing them to play different roles within the unfolding time-dynamics of this data generation process. One parameter, related to the sparsity of neuronal activation, varies across the hidden neurons. As a result of training, the net captures the structure of the data generation process. When trained on data generated by different mechanisms acting in parallel, the more active neurons will code for the more frequent input features. When trained on hierarchically generated data, the more active neurons will code on the higher level, where each feature integrates several lower-level features. The results imply that the division of the cortex into laterally and hierarchically organized areas can evolve to a certain degree as an adaptation to the environment. Retrieve from: http://www.cs.tu-berlin.de/~cweber/publications/ (6 pages, 230 KB) From tewon at salk.edu Thu Mar 30 21:29:18 2000 From: tewon at salk.edu (Te-Won Lee) Date: Thu, 30 Mar 2000 18:29:18 -0800 Subject: Postdoctoral Positions Available Message-ID: <38E40D7E.7C8391C2@salk.edu> University of California, San Diego Institute for Neural Computation Applications are invited for postdoctoral fellowships in the field of signal & image processing, neural networks, pattern recognition and machine learning. Position: Post-doctoral research associate Organization: Institute for Neural Computation, University of California, San Diego Faculty: Te-Won Lee Funding Period: Minimum 2 years available Location: San Diego, CA Deadline: Open until filled Title: Intelligent Sound and Image Processing Systems The goal of this project is to develop software that can process sounds and images in a more humanlike fashion so that a recognition system can work robustly in real-world environments.
Recent advances in data processing technologies have led to several human-computer interface applications such as automatic speech recognition systems and visual object recognition systems. Although some commercial products are currently available, the performance of those systems usually degrades substantially under real-world conditions. For example, a speech recognition system in an automobile may process voice commands spoken in a quiet situation, but the recognition performance may be unacceptable in the presence of interfering sounds such as car engine noise, music, and other voices in the background. In contrast, humans are able to recognize speech under very noisy conditions. The goal of the project is to develop software that can enhance the sound or image signal using a newly developed technique called ICA (independent component analysis). ICA is able to separate sounds or images when the environment has mixed them. This software is an important step in advancing computer systems closer to humanlike performance and hence will make human-machine interaction more natural. Furthermore, the software will enhance the communication between the sending and receiving units in a noisy environment, where the receiver can get clean audio-visual information. Qualifications: Background in signal processing, image processing, machine learning algorithms, neural networks and pattern recognition is desirable. Matlab and C programming skills are required. Our Lab: Successful candidates will join the Neuroengineering Laboratory at the Institute for Neural Computation. The Institute is an interdisciplinary organized research unit at UCSD directed by Terrence J. Sejnowski. The 45 faculty members in the Institute represent 14 research disciplines, including neuroscience, visual science, cognitive science, mathematics, economics and social science, and computer engineering, and address the twin scientific and engineering challenges of understanding how humans function at the neural and cognitive levels and of solving major technological problems related to neural network implementations (see http://inc.ucsd.edu/). The Neuroengineering Laboratory currently consists of three research faculty who work with postdoctoral fellows, graduate students and visiting scientists, in collaboration with other researchers in electrical and computer engineering, cognitive science, and the neurosciences, as well as with industry. Contact Info: If interested, please send your curriculum vitae, list of publications and the names, addresses, and phone numbers of three references to: Te-Won Lee, Ph.D. Institute for Neural Computation University of California, San Diego La Jolla, CA 92093-0523 Tel: 858-453-4100 x1527 Fax: 858-587-0417 http://www.cnl.salk.edu/~tewon
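An illustrative aside (not from the original posting): the project above centres on independent component analysis (ICA) for blind source separation of mixed sounds or images. The following minimal, self-contained numpy sketch uses invented signals, an invented mixing matrix, and a textbook FastICA-style fixed-point update; it only shows the kind of unmixing the posting refers to.

import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical source signals: a sine wave and a square wave.
t = np.linspace(0, 8, 2000)
S = np.c_[np.sin(2 * np.pi * t), np.sign(np.sin(3 * np.pi * t))]

# Mix them with an arbitrary matrix that the algorithm never sees.
A = np.array([[1.0, 0.5], [0.7, 1.0]])
X = S @ A.T                                     # observed mixtures, shape (2000, 2)

# Centre and whiten the observations.
X = X - X.mean(axis=0)
d, E = np.linalg.eigh(np.cov(X, rowvar=False))
Z = X @ (E @ np.diag(1.0 / np.sqrt(d)) @ E.T)   # whitened data, roughly identity covariance

# FastICA-style fixed-point iteration with deflation and a tanh nonlinearity.
W = np.zeros((2, 2))
for i in range(2):
    w = rng.normal(size=2)
    w /= np.linalg.norm(w)
    for _ in range(200):
        u = Z @ w
        g, dg = np.tanh(u), 1.0 - np.tanh(u) ** 2
        w_new = (Z * g[:, None]).mean(axis=0) - dg.mean() * w
        w_new -= W[:i].T @ (W[:i] @ w_new)      # stay orthogonal to earlier components
        w_new /= np.linalg.norm(w_new)
        done = abs(abs(w_new @ w) - 1.0) < 1e-9
        w = w_new
        if done:
            break
    W[i] = w

S_est = Z @ W.T                                 # recovered sources, up to permutation/sign/scale
print(np.round(np.corrcoef(S_est.T, S.T)[:2, 2:], 2))

Each recovered component correlates almost perfectly (up to sign) with one of the original sources, which is the separation behaviour described in the posting.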
From Nello.Cristianini at bristol.ac.uk Fri Mar 31 11:04:02 2000 From: Nello.Cristianini at bristol.ac.uk (N Cristianini) Date: Fri, 31 Mar 2000 17:04:02 +0100 (BST) Subject: Available Now: Support Vector Book Message-ID: The Support Vector Book is now distributed and available (see http://www.support-vector.net for details). AN INTRODUCTION TO SUPPORT VECTOR MACHINES (and other kernel-based learning methods) N. Cristianini and J. Shawe-Taylor Cambridge University Press, 2000 ISBN: 0 521 78019 5 http://www.support-vector.net Contents - Overview 1 The Learning Methodology 2 Linear Learning Machines 3 Kernel-Induced Feature Spaces 4 Generalisation Theory 5 Optimisation Theory 6 Support Vector Machines 7 Implementation Techniques 8 Applications of Support Vector Machines Pseudocode for the SMO Algorithm Background Mathematics References Index Description This book is the first comprehensive introduction to Support Vector Machines (SVMs), a new-generation learning system based on recent advances in statistical learning theory. The book also introduces Bayesian analysis of learning and relates SVMs to Gaussian Processes and other kernel-based learning methods. SVMs deliver state-of-the-art performance in real-world applications such as text categorisation, hand-written character recognition, image classification, biosequence analysis, etc. Their first introduction in the early 1990s led to an explosion of applications and deepening theoretical analysis that has now established Support Vector Machines, along with neural networks, as one of the standard tools for machine learning and data mining. Students will find the book both stimulating and accessible, while practitioners will be guided smoothly through the material required for a good grasp of the theory and application of these techniques. The concepts are introduced gradually in accessible and self-contained stages, though in each stage the presentation is rigorous and thorough. Pointers to relevant literature and web sites containing software ensure that it forms an ideal starting point for further study. These are also available on-line through an associated web site, www.support-vector.net, which will be kept updated with pointers to new literature, applications, and on-line software. From saadd at aston.ac.uk Fri Mar 31 10:48:37 2000 From: saadd at aston.ac.uk (David Saad) Date: Fri, 31 Mar 2000 15:48:37 +0000 Subject: Postdoctoral Research Fellowship Message-ID: <38E4C8D5.D4001F14@aston.ac.uk> Neural Computing Research Group ------------------------------- School of Engineering and Applied Sciences Aston University, Birmingham, UK POSTDOCTORAL RESEARCH FELLOWSHIP -------------------------------- Irregular Gallager-type error-correcting codes - a statistical mechanics perspective ----------------------------------------------- *** Full details at http://www.ncrg.aston.ac.uk/ *** The Neural Computing Research Group at Aston is looking for a highly motivated individual for a 2-year postdoctoral research position in the area of `Irregular Gallager-type error-correcting codes - a statistical mechanics perspective'. The emphasis of the research will be on applying theoretical and numerical methods to study the properties of Gallager-type error-correcting codes, with the aim of systematically identifying optimal constructions of this type. Potential candidates should have strong mathematical and computational skills, with a background in statistical physics and/or error-correcting codes. Conditions of Service --------------------- Salaries will be up to point 6 on the RA 1A scale, currently 18,185 UK pounds. The salary scale is subject to annual increments. How to Apply ------------ If you wish to be considered for this Fellowship, please send a full CV and publications list, including full details and grades of academic qualifications, together with the names of 3 referees, to: Prof.
David Saad Neural Computing Research Group School of Engineering and Applied Sciences Aston University Birmingham B4 7ET, U.K. Tel: 0121 333 4631 Fax: 0121 333 4586 e-mail: D.Saad at aston.ac.uk e-mail submission of postscript files is welcome. Closing date: 28.4.2000 ---------------------------------------------------------------------- From bengioy at IRO.UMontreal.CA Mon Mar 6 09:35:59 2000 From: bengioy at IRO.UMontreal.CA (Yoshua Bengio) Date: Mon, 6 Mar 2000 09:35:59 -0500 Subject: program of workshop on Selecting and Combining Models with Machine Learning Algorithms (corrected) Message-ID: <20000306093559.53019@IRO.UMontreal.CA> Hello, The preliminary program of the Workshop on Selecting and Combining Models with Machine Learning Algorithms is ready, and posted on www.iro.umontreal.ca/~bengioy/crmworkshop2000 The workshop will be held in Montreal, April 11-14, 2000. Registration is free but mandatory (and early registration is recommended because the number of seats may be limited).
Organizers: Yoshua Bengio and Dale Schuurmans List of speakers: Grace Wahba Leo Breiman Yoav Freund Peter Bartlett Peter Sollich Tom Dietterich Michael Perrone Hugh Chipman Dale Schuurmans Christian Leger Robert Schapire Shai Ben-David Bill Armstrong Olivier Chapelle Ayan Demiriz Sorin Draghici Russ Greiner Jasvinder Kandola Ofer Melnik Ion Muslea In-Jae Myung Gunnar Raetsch -- Yoshua Bengio Professeur aggrege Departement d'Informatique et Recherche Operationnelle Universite de Montreal, addresse postale: C.P. 6128 Succ. Centre-Ville, Montreal, Quebec, Canada H3C 3J7 addresse civique: 2920 Chemin de la Tour, Montreal, Quebec, Canada H3T 1J8, #2194 Tel: 514-343-6804. Fax: 514-343-5834. Bureau 3339. http://www.iro.umontreal.ca/~bengioy http://www.iro.umontreal.ca/~lisa From golden at utdallas.edu Sun Mar 5 22:04:48 2000 From: golden at utdallas.edu (Richard M Golden) Date: Sun, 5 Mar 2000 21:04:48 -0600 (CST) Subject: Request for Comments on "Mathematical Methods for Neural Network Analysis and Design" Message-ID: I'm in the process of preparing a second edition of my book "Mathematical Methods for Neural Network Analysis and Design" (MIT Press, 1996). If you have read the book and have either positive or negative comments regarding its contents, or if you have suggestions regarding material which you feel should be in the book, please let me know. Thanks! Richard Golden Professor of Cognitive Science and Electrical Engineering ******************************************************************************* Cognition and Neuroscience Program, School of Human Development, GR41 The University of Texas at Dallas, Box 830688 Richardson, Texas 75083-0688, PHONE: (972) 883-2423 EMAIL: golden at utdallas.edu, WEB: http://www.utdallas.edu/~golden/index.html ******************************************************************************* From bengio at idiap.ch Tue Mar 7 03:35:10 2000 From: bengio at idiap.ch (Samy Bengio) Date: Tue, 7 Mar 2000 09:35:10 +0100 (MET) Subject: open positions in speech/vision Message-ID: IDIAP invites applications for the positions of Speech Processing Group Leader as well as Senior Research Positions in Speech and Computer Vision Qualified applicants are expected to have a Ph.D. in Computer Science, Electrical Engineering or a related field. They must have an outstanding research record and a proven excellence in leadership, project management, and supervision of Ph.D. students. They should master state-of-the-art techniques and theories related to speech or vision processing, hidden Markov models and neural networks, learning algorithms, and statistical learning theory. About IDIAP IDIAP is a semi-private non-profit research institute, affiliated with the Swiss Federal Institute of Technology at Lausanne (EPFL) and the University of Geneva. For the last 10 years, IDIAP has mainly been carrying out research and development in the fields of speech and speaker recognition, computer vision, and machine learning. Location IDIAP is located in the town of Martigny in Valais, a scenic region in the south of Switzerland, surrounded by some of the highest mountains in Europe, which offer some of the best skiing, hiking, and climbing. It is in close proximity to Lausanne and Lake Geneva, and centrally located for travel to other parts of Europe. Prospective candidates should send their detailed CV to the address given below. Prof. Herve Bourlard Director of IDIAP P.O.
Box 592 CH-1920 Martigny, Switzerland Email : secretariat at idiap.ch Phone : +41 27 721 77 20 Fax : +41 27 721 77 12 ----- Samy Bengio Research Director. Machine Learning Group Leader. IDIAP, CP 592, rue du Simplon 4, 1920 Martigny, Suisse. tel: +41 27 721 77 39, fax: +41 27 721 77 12. mailto:bengio at idiap.ch, http://www.idiap.ch/~bengio From ijspeert at rana.usc.edu Tue Mar 7 21:26:34 2000 From: ijspeert at rana.usc.edu (Auke Ijspeert) Date: Tue, 7 Mar 2000 18:26:34 -0800 (PST) Subject: CFP: session on Biologically Inspired Robotics (Boston 3-8 Nov.) Message-ID: [Apologies for multiple postings. Please note that this call for papers is sent at short notice: the deadline for the abstract is only 3 weeks away, but the abstract need only be 250 words long.] CFP: session on BIOLOGICALLY INSPIRED ROBOTICS 3-8 November 2000, Boston, Massachusetts (the exact days of the session have still to be defined) Special session of Sensor Fusion and Decentralized Control in Robotic Systems III (RB06) http://www.spie.org/web/meetings/calls/pe00/confs/RB06.html Chairs: Auke Jan Ijspeert (USC, Los Angeles), Nicolas Franceschini (CNRS, France) This session addresses biological inspiration in robotics at several levels: from sensory systems to biomimetic structures and muscle-like actuation, via biologically inspired control mechanisms. It will look in particular at 1) what the benefits of biological inspiration are in current robotics, and 2) what insights a robotic implementation can give, in return, into the functioning of biological systems. Although this session is primarily interested in robotics, work involving realistic physically-based simulations will also be considered. Participants are invited to submit a 250-word abstract before the ---27th of MARCH 2000---. Please submit both through the conference web page at http://www.spie.org/web/meetings/calls/pe00/confs/RB06.html AND by sending an electronic copy to ijspeert at rana.usc.edu and enfranceschini at LNB.cnrs-mrs.fr. Please specify at the beginning of the abstract that it is submitted to "Biologically-Inspired Robotics session, Sensor Fusion and Decentralized Control in Robotic Systems III (RB06)". Participants whose abstract has been accepted after review will also be invited to contribute to the conference proceedings by submitting a manuscript by the 14th of August 2000 (details to be given). GENERAL CALL FOR PAPERS: SPIE Photonics East 3-8 November 2000, Boston, Massachusetts Sensor Fusion and Decentralized Control in Robotic Systems III (RB06) http://www.spie.org/web/meetings/calls/pe00/confs/RB06.html On-site Proceedings. Abstracts for this conference are due by 27 March 2000. Manuscripts are due by 14 August 2000. Conference Chairs: Gerard T. McKee, Univ. of Reading (UK); Paul S. Schenker, Jet Propulsion Lab. Program Committee: Mongi A. Abidi, Univ. of Tennessee/Knoxville; Dimi Apostolopoulos, Carnegie Mellon Univ.; Eric T. Baumgartner, Jet Propulsion Lab.; George A. Bekey, Univ. of Southern California; Henrik I. Christensen, Royal Institute of Technology (Sweden); Steven Dubowsky, Massachusetts Institute of Technology; John T. Feddema, Sandia National Labs.; Nicolas H. Franceschini, Ctr. National de la Recherche Scientifique (France); Terrance L. Huntsberger, Jet Propulsion Lab.; Pradeep K. Khosla, Carnegie Mellon Univ.; Maja J. Mataric, Univ. of Southern California; Robin R. Murphy, Univ. of South Florida; Francois G. Pin, Oak Ridge National Lab.; Jürgen Rossmann, Univ. of Dortmund (Germany); Arthur C.
Sanderson, Rensselaer Polytechnic Institute; Tzyh-Jong Tarn, Washington Univ. This conference addresses multi-sensor fusion and distributed control in robotic systems. The theme is intelligent automation through enriched perception and action skills, at all levels of robot tasking, including cooperative interactions of multiple robots. We welcome multi-disciplinary submissions based in both technological and biological models. The primary focus of the meeting is algorithmic: submissions should clearly state a robotic sensing or control objective, outline a physical or behavioral model, reduce it to computational definition, and present experimental and/or analytical results. Describe what distinguishes your problem as a robotic sensor fusion or distributed control problem; contrast your approach with alternative methods. Papers that report working robotic systems and specific task applications are particularly desirable. In all submissions authors should relate their objectives to mainstream problems in robot navigation, manipulation, surveillance, planning, assembly, learning, etc. Topics of interest include, but are not limited to: *modeling, registration, and calibration of multiple sensors *3D object estimation from multiple features and views *visual integration of structural and motion information *robust fusion of active and passive sensors & databases *estimation, recognition and error models for data fusion *task driven planning and sequencing of robotic sensors *integration of vision and touch in dexterous robotic tasks *sensor based human-machine interaction (voice/gesture/etc.) *learning strategies for multi-sensor object recognition *decentralized control of multiple-armed and legged robots *task planning for reconfigurable & modular robotic systems *motion coordination and control in multiple robot tasks *cooperative, emergent behaviors in multiple agents *task based mapping and learning of sensor-based behaviors *biologically inspired sensing, controls, and behaviors --------------------------------------------------------------------------- Dr Auke Jan Ijspeert Brain Simulation Lab & Computational Learning and Motor Control Lab Dept. of Computer Science, Hedco Neurosciences bldg, 3614 Watt way U. of Southern California, Los Angeles, CA 90089-2520, USA Web: http://rana.usc.edu:8376/~ijspeert/ Tel: +1 213 7401922 or 7406995 (work) +1 310 8238087 (home) Fax: +1 213 7405687 Email: ijspeert at rana.usc.edu --------------------------------------------------------------------------- From margindr at CS.ORST.EDU Wed Mar 8 14:27:30 2000 From: margindr at CS.ORST.EDU (Dragos Margineantu) Date: Wed, 8 Mar 2000 11:27:30 -0800 (PST) Subject: ICML-2000 Workshop on Cost-Sensitive Learning Message-ID: <200003081927.LAA04805@ghost.CS.ORST.EDU> Invitation to Participate, Call for Contributions WORKSHOP ON COST-SENSITIVE LEARNING ----------------------------------- [In conjunction with the Seventeenth International Conference on Machine Learning - ICML-2000, Stanford University June 29-July 2, 2000] Workshop webpage is at: http://www.cs.orst.edu/~margindr/Workshops/Workshop-ICML2000.html Workshop Motivation and Description ----------------------------------- Recent years have seen supervised learning methods applied to a variety of challenging problems in industry, medicine, and science. In many of these problems, there are costs associated with measuring input features and there are costs associated with different possible outcomes. 
However, existing classification algorithms assume that the input features are already measured (at no cost) and that the goal is to minimize the number of misclassification errors (the 0/1 loss). For example, in medical diagnosis, different tests have different costs (and risks) and different outcomes (false positives and false negatives) have different costs. The cost of a false positive medical diagnosis is an unnecessary treatment, but the cost of a false negative diagnosis may be the death of the patient. Given a choice, a cost-sensitive learning algorithm should prefer to measure less costly features and to make less costly errors (in this example, false positives). Not surprisingly, when existing learning algorithms are applied to cost-sensitive problems, the results are often poor, because they have no way of making these tradeoffs. Another example concerns the timeliness of predictions in time-series applications. Consider a classifier that is applied to monitor a complex system (e.g., a factory, power plant, or medical device). It is supposed to signal an alarm if a problem is about to occur. The value of the alarm is not merely related to whether it is a false alarm or a missed alarm, but also to whether the alarm is raised soon enough to allow preventative measures to be taken. The goal of this workshop is to bring together researchers who are working on problems for which the standard 0/1-loss model with zero-cost input features is unsatisfactory. A good reference on different types of costs and cost-sensitive learning can be found at: http://www.iit.nrc.ca/bibliographies/cost-sensitive.html. The workshop will be structured around three main topics: * ALGORITHMS FOR COST-SENSITIVE LEARNING Algorithms that take cost information as input (along with the training data) and produce a cost-sensitive classifier as output. Algorithms that construct robust classifiers that accept cost information at classification time. Algorithms designed for other types of costs. * COSTS AND LOSS FUNCTIONS THAT ARISE IN REAL-WORLD APPLICATIONS What types of costs are involved in practical applications? What are the loss functions in current and future applications of machine learning? What are the right ways of formulating various cost-sensitive learning problems? * METHODS AND PROMISING DIRECTIONS FOR FUTURE RESEARCH What methods should be applied to evaluate cost-sensitive learning algorithms? What are promising new directions to pursue? What should be our ultimate research goals? Approximately one third of the day will be devoted to each of these three topics. On each of these topics, one or two people will give overview presentations. These will be followed by a mix of discussion and short position papers presented by the participants.
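An illustrative aside (not from the workshop organizers): to make the contrast with the plain 0/1 loss concrete, the sketch below applies the standard minimum-expected-cost decision rule to a made-up cost matrix and made-up class probabilities; every number in it is invented for the example.

import numpy as np

# Hypothetical 3-class problem: cost[i, j] is the cost of predicting class j
# when the true class is i (zero on the diagonal, unequal costs elsewhere).
cost = np.array([[0.0,  1.0, 5.0],
                 [10.0, 0.0, 2.0],
                 [1.0,  1.0, 0.0]])

# Class-membership probabilities for three examples, e.g. from any
# probabilistic classifier (each row sums to one).
probs = np.array([[0.7, 0.2, 0.1],
                  [0.3, 0.4, 0.3],
                  [0.1, 0.1, 0.8]])

# Expected cost of predicting j on each example: sum_i probs[i] * cost[i, j].
expected_cost = probs @ cost

# Cost-sensitive rule: minimise expected cost. 0/1-loss rule: take the argmax.
print("min expected cost:", expected_cost.argmin(axis=1))  # -> [1 1 2]
print("most probable    :", probs.argmax(axis=1))          # -> [0 1 2]

On the first example the two rules disagree: class 0 is the most probable, but predicting class 1 has the lower expected cost because mistaking a true class 1 for class 0 is priced at 10. This is the kind of tradeoff the call means when it says standard algorithms "have no way of making these tradeoffs".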
Submissions ----------- To participate in the workshop, please send an email message to Tom Dietterich (tgd at cs.orst.edu) giving your name, address, email address, and a brief description of your reasons for wanting to attend. In addition, if you wish to present one or more position papers on the topics listed above, please send a one-page abstract of each position paper to Tom Dietterich at the same email address. You may submit a position paper on each of the three main topics (algorithms, loss functions, future research). If you have an issue or contribution that is not covered by these three categories, please contact Tom Dietterich by email to discuss your idea prior to submitting a position paper. Submissions that describe the loss functions arising in industrial applications of machine learning are especially solicited. The organizers will review the submissions with the goal of assembling a stimulating and exciting workshop. Attendance will be limited to 40 people, with preference given to people who are presenting position papers. Important dates: - Submission deadline: April 24, 2000 - Notification of acceptance: May 8, 2000 - Workshop will be held between June 29 and July 2, 2000 (exact day to be announced by mid-March) Organizers ---------- Tom Dietterich, Oregon State University (tgd at cs.orst.edu) Foster Provost, New York University (provost at stern.nyu.edu) Peter Turney, Institute for Information Technology of the National Research Council of Canada (Peter.Turney at iit.nrc.ca) Dragos Margineantu, Oregon State University (margindr at cs.orst.edu) From ted.carnevale at yale.edu Wed Mar 8 23:33:42 2000 From: ted.carnevale at yale.edu (Ted Carnevale) Date: Wed, 08 Mar 2000 23:33:42 -0500 Subject: NEURON 2000 Summer Course Message-ID: <38C729A6.D5DB08D1@yale.edu> COURSE ANNOUNCEMENT What: "The NEURON Simulation Environment" (NEURON 2000 Summer Course) When: Saturday, August 5, through Wednesday, August 9, 2000 Where: San Diego Supercomputer Center University of California at San Diego, CA Organizers: N.T. Carnevale and M.L. Hines Faculty includes: N.T. Carnevale, M.L. Hines, W.W. Lytton, and T.J. Sejnowski Description: This intensive hands-on course covers the design, construction, and use of models in the NEURON simulation environment. It is intended primarily for those who are concerned with models of biological neurons and neural networks that are closely linked to empirical observations, e.g. experimentalists who wish to incorporate modeling in their research plans, and theoreticians who are interested in the principles of biological computation. The course is designed to be useful and informative for registrants at all levels of experience, from those who are just beginning to those who are already quite familiar with NEURON or other simulation tools. Registration is limited to 20, and the deadline for receipt of applications is Friday, July 7, 2000. For more information see http://www.neuron.yale.edu/sdsc2000/sdsc2000.htm or contact Ted Carnevale Psychology Dept. Box 208205 Yale University New Haven, CT 06520-8205 USA phone 203-432-7363 fax 203-432-7172 email ted.carnevale at yale.edu Supported in part by: National Science Foundation National Institutes of Health National Partnership for Advanced Computational Infrastructure and the San Diego Supercomputer Center Contractual terms require inclusion of the following statement: This course is not sponsored by the University of California. But thanks anyway. --Ted From d.mareschal at bbk.ac.uk Thu Mar 9 05:06:02 2000 From: d.mareschal at bbk.ac.uk (Denis Mareschal) Date: Thu, 9 Mar 2000 11:06:02 +0100 Subject: Postdoc and Phd positions available Message-ID: BIRKBECK COLLEGE UNIVERSITY OF LONDON Faculty of Science School of Psychology The School of Psychology has gained an EU Fifth Framework Human Potential Initiative award and invites applications for two positions to be held jointly in the School and the Centre for Brain and Cognitive Development to work on a project entitled "The Basic Mechanisms of Learning in Natural and Artificial Systems".
POSTDOCTORAL RESEARCH ASSOCIATE (Two-Year Fixed Term) The postholder will mainly use connectionist/neural network techniques to model memory dissociations in infancy and help implement and design models of explicit and implicit memory in infancy in collaboration with Dr. D. Mareschal. Applicants should have a PhD, preferably in cognitive psychology, developmental psychology, AI, or Neural Computation. The post is tenable from 1st July, 2000 or a date to be arranged thereafter. Salary range: £18,420 to £20,319 pa inc. For details send a large (A4) sae to the Personnel Department, Ref: APS343, Birkbeck, Malet Street, London WC1E 7HX. Closing date: 30th April 2000 DOCTORAL STUDENTSHIP (Three-Year Award) The award is available to work with Dr. D. Mareschal and Professor Mark H Johnson on a project exploring the interference effects in infant object recognition using behavioural and possibly ERP methods to test infants. Applicants should have a good first degree in experimental psychology. An interest in connectionist modelling and the neurosciences would be desirable. The award is available from 1st October 2000. For further information and application forms send a large SAE to the School of Psychology Birkbeck, Malet Street, London, WC1E 7HX or telephone (+44) 171 631 6207 Closing date: 31st March 2000 College web site: http://www.bbk.ac.uk Informal enquiries for both positions can be addressed to d.mareschal at bbk.ac.uk These positions are only available to citizens and long-term residents of the EU and Associate Member states who are not citizens or long-term residents of the UK. ================================================= Dr. Denis Mareschal Centre for Brain and Cognitive Development Department of Psychology Birkbeck College University of London Malet St., London WC1E 7HX, UK tel +44 171 631-6582/6207 fax +44 171 631-6312 http://www.psyc.bbk.ac.uk/staff/dm.html ================================================= From Uta.Schwalm at neuroinformatik.ruhr-uni-bochum.de Thu Mar 9 10:52:20 2000 From: Uta.Schwalm at neuroinformatik.ruhr-uni-bochum.de (Uta Schwalm) Date: Thu, 9 Mar 2000 16:52:20 +0100 (MET) Subject: Full professorship in Neural Computation Message-ID: <200003091552.QAA02300@luda.neuroinformatik.ruhr-uni-bochum.de> The deadline for application for the following professorship is already next week, and some people on this list may be interested in applying. The Ruhr-Universität Bochum has a vacancy for a full professorship (C4) in Neural Computation ("Neuroinformatik") (succession Prof. Dr. W. von Seelen) The Ruhr-Universität Bochum offers a wide-ranging spectrum of subjects in the natural sciences, humanities, engineering and medicine. The Institut für Neuroinformatik is interdisciplinary in scope and is a central research institute of the University. By developing models for functional aspects of the central nervous system and experimentally realizing artificial systems the institute contributes on the one hand to the elucidation of the function and development of the brain, and on the other to technical applications. Current research concentrates on computer vision and robot control in natural environments. In terms of methodology, emphasis is on computer experiments and the analytical and numerical treatment of learning dynamical systems. The appointee is expected to carry his or her share in administration, teaching and development of the Institute, to build and lead a large research team, and to attract research funding.
There is the possibility of cooperating with ZN GmbH, a spin-off of the Institute. Applicants are expected to have a proven track record of scientific research and experience in leading a research team. It is expected that the appointee will join one of the departments of the University, depending on his or her specific field. The Ruhr-Universität Bochum aims to raise its proportion of female staff and encourages qualified women to apply. Applications by qualified disabled persons are encouraged. Applications with the usual materials are to be sent by 15 March 2000 to: Rektor der Ruhr-Universität Bochum Universitätsstraße 150 D-44780 Bochum. Homepage of the Institute: http://www.neuroinformatik.ruhr-uni-bochum.de/ From pli at richmond.edu Thu Mar 9 16:25:02 2000 From: pli at richmond.edu (Ping Li) Date: Thu, 9 Mar 2000 16:25:02 -0500 Subject: postdoc and faculty positions Message-ID: Dear Colleagues, Please distribute this information to your students or colleagues. Thank you. Sincerely Ping Li *********************************************************************** Ping Li, Ph.D. Email: ping at cogsci.richmond.edu Associate Professor of Psychology http://www.richmond.edu/~pli/ Department of Psychology Phone: (804) 289-8125 (office) University of Richmond (804) 287-6039 (lab) Richmond, VA 23173, U.S.A. Fax: (804) 287-1905 *********************************************************************** Postdoc Position: Qualified individuals are invited to apply for a postdoctoral fellowship in connectionist models of language learning. The fellowship is supported by the National Science Foundation, and provides an annual stipend of $32,000 for two years. A qualified candidate should hold a Ph.D. degree in an area of cognitive sciences and have experience in connectionist modeling and natural language processing. Technical experience with C/C++ and Unix/Linux on SUN/Windows platforms is preferable. The successful candidate will join the PI's research team in collaboration with Brian MacWhinney of Carnegie Mellon University to develop a self-organizing neural network model of language acquisition (see the NSF homepage for a summary of the project: http://www.nsf.gov/cgi-bin/showaward?award=9975249). In addition, the fellow will have opportunities to collaborate on research and teaching with faculty at the Department of Psychology, Department of Modern Languages, and Department of Computer and Mathematical Sciences at the University of Richmond. U of R is a highly selective, small private school located on a beautiful campus 6 miles west of Richmond (capital of Virginia, 1 hour east of Charlottesville, 1 hour north of Williamsburg, and 2 hours south of Washington DC). With its nearly 1-billion endowment and progressive program enhancement efforts, the university offers a congenial research and teaching environment. The fellowship is expected to start some time between now and September 1. Consideration of applications will begin immediately and continue until the position is filled. Applicants should send a curriculum vitae, a cover letter, and two letters of recommendation to Ping Li, Department of Psychology, University of Richmond, Richmond, VA 23173, or via email to pli at richmond.edu. The University of Richmond is an equal opportunity, affirmative action employer. Women and minority candidates are encouraged to apply. Faculty Position: University of Richmond. The Department of Psychology invites applications for a one-year replacement position at the Assistant Professor level.
Preference will be given to candidates who would be able to teach undergraduate courses in statistics and in memory and cognition. Candidates should have completed the Ph.D. degree by the August 2000 starting date. Scholars who show promise of excellence in teaching and an active research program that stimulates student involvement in research are encouraged to apply. Send vita, statement of research and teaching interests, and three letters of recommendation to Andrew F. Newcomb, Department of Psychology, University of Richmond, Richmond, VA 23173. Consideration of applications will begin on April 15, 2000. The University of Richmond is a highly selective, small private university located on a beautiful campus six miles west of the heart of Richmond. We are an Equal Opportunity, Affirmative Action Employer and encourage applications from women and minority candidates. From harnad at coglit.ecs.soton.ac.uk Fri Mar 10 04:32:07 2000 From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad) Date: Fri, 10 Mar 2000 09:32:07 +0000 (GMT) Subject: Minds, Machines and Turing Message-ID: The following paper is available at: http://www.cogsci.soton.ac.uk/~harnad/Papers/Harnad/harnad00.turing.html Comments welcome. Harnad, S. (2001) Minds, Machines and Turing: The Indistinguishability of Indistinguishables. Journal of Logic, Language, and Information (special issue on "Alan Turing and Artificial Intelligence") http://www.cogsci.soton.ac.uk/~harnad/Papers/Harnad/harnad00.turing.html MINDS, MACHINES AND TURING: THE INDISTINGUISHABILITY OF INDISTINGUISHABLES Stevan Harnad Department of Electronics and Computer Science University of Southampton Highfield, Southampton SO17 1BJ UNITED KINGDOM harnad at cogsci.soton.ac.uk http://www.cogsci.soton.ac.uk/~harnad/ ABSTRACT: Turing's celebrated 1950 paper proposes a very general methodological criterion for modelling mental function: total functional equivalence and indistinguishability. His criterion gives rise to a hierarchy of Turing Tests, from subtotal ("toy") fragments of our functions (t1), to total symbolic (pen-pal) function (T2 -- the standard Turing Test), to total external sensorimotor (robotic) function (T3), to total internal microfunction (T4), to total indistinguishability in every empirically discernible respect (T5). This is a "reverse-engineering" hierarchy of (decreasing) empirical underdetermination of the theory by the data. Level t1 is clearly too underdetermined, T2 is vulnerable to a counterexample (Searle's Chinese Room Argument), and T4 and T5 are arbitrarily overdetermined. Hence T3 is the appropriate target level for cognitive science. When it is reached, however, there will still remain more unanswerable questions than when Physics reaches its Grand Unified Theory of Everything (GUTE), because of the mind/body problem and the other-minds problem, both of which are inherent in this empirical domain, even though Turing hardly mentions them. KEYWORDS: cognitive neuroscience, cognitive science, computation, computationalism, consciousness, dynamical systems, epiphenomenalism, intelligence, machines, mental models, mind/body problem, other minds problem, philosophy of science, qualia, reverse engineering, robotics, Searle, symbol grounding, theory of mind, thinking, Turing, underdetermination, Zombies.
From gzy at doc.ic.ac.uk Fri Mar 10 13:17:22 2000 From: gzy at doc.ic.ac.uk (gzy) Date: Fri, 10 Mar 2000 18:17:22 +0000 Subject: PhD and PostDoctoral Positions at Imperial College, London, England Message-ID: <38C93C32.6AA5AC81@doc.ic.ac.uk> Imperial College of Science, Technology and Medicine Department of Computing Visual Information Processing (VIP) Group Research Assistantship and PhD Studentship Positions We have two vacancies, one for a Postdoctoral Research Assistant, and the other for a PhD studentship, available within the Visual Information Processing Group. Both positions are for three years and are funded by the EPSRC under a project entitled "ViTAL: visual tracking for active learning." The aim of the project is to develop a novel framework for active learning and knowledge gathering for decision support systems in medical imaging. It is a collaborative venture between the Visual Information Processing (VIP) Group of the Department of Computing, Imperial College, and the Lung Imaging Research Team at the Royal Brompton Hospital. The Visual Information Processing group has been involved in biomedical imaging research for the last 10 years and has produced more than 200 publications in the areas of Computational Vision, Image Processing, Perceptual Intelligence, and Biomedical Imaging Systems. The group currently has a team of 18 members, including four full-time academic staff. Detailed information about the group can be found at http://vip.doc.ic.ac.uk, or you can call Dr Guang-Zhong Yang (gzy at doc.ic.ac.uk, 020-7594 8441), head of the research group, for an informal discussion about the technical details of the project. The appointment for the research assistant will be on the RA 1A scale (£18,420 - £26,613), inclusive of London Allowance, depending on qualifications and experience. Applicants should have a good degree in Computing. Applications should include a full CV plus names and addresses of three referees. They must be submitted by 15th March 2000 to: Dr. T. Sergot, Department of Computing, Imperial College, 180 Queen's Gate, London SW7 2BZ, UK. email: t.sergot at ic.ac.uk Fax: +44 20 7581 8024 Imperial College is striving towards equal opportunities. At the leading edge of research, innovation and learning -- Guang-Zhong Yang, PhD Visual Information Processing Department of Computing 180 Queen's Gate Imperial College London SW7 2BZ, UK Tel: 44-(0)20 7594 8441 Fax: 44-(0)20 7581 8024 Email: gzy at doc.ic.ac.uk http://vip.doc.ic.ac.uk From steve at cns.bu.edu Mon Mar 13 07:49:46 2000 From: steve at cns.bu.edu (Stephen Grossberg) Date: Mon, 13 Mar 2000 07:49:46 -0500 Subject: Neural dynamics of 3-D surface perception: Figure-ground separation and lightness perception Message-ID: The following article is available at http://www.cns.bu.edu/Profiles/Grossberg/ in HTML, PDF, and Gzipped postscript: Kelly, F. and Grossberg, S. (2000). Neural dynamics of 3-D surface perception: Figure-ground separation and lightness perception. Perception & Psychophysics, in press. Also available in the Tech Report Version, as CAS/CNS TR-98-0226. Abstract: This article develops the FACADE theory of three-dimensional (3-D) vision to simulate data concerning how two-dimensional (2-D) pictures give rise to 3-D percepts of occluded and occluding surfaces. The theory suggests how geometrical and contrastive properties of an image can either cooperate or compete when forming the boundary and surface representations that subserve conscious visual percepts.
Spatially long-range cooperation and short-range competition work together to separate boundaries of occluding figures from their occluded neighbors, thereby providing sensitivity to T-junctions without the need to assume that T-junction "detectors" exist. Both boundary and surface representations of occluded objects may be amodally completed, while the surface representations of unoccluded objects become visible through modal processes. Computer simulations include Bregman-Kanizsa figure-ground separation, Kanizsa stratification, and various lightness percepts, including the Munker-White, Benary cross, and checkerboard percepts. Key words: Amodal Completion, Depth Perception, Figure-Ground Perception, Lightness, Visual Cortex, Neural Network From patrick at neuro.kuleuven.ac.be Mon Mar 13 07:56:23 2000 From: patrick at neuro.kuleuven.ac.be (Patrick De Maziere) Date: Mon, 13 Mar 2000 13:56:23 +0100 (MET) Subject: NEW BOOK ON SELF-ORGANIZATION AND TOPOGRAPHIC MAPS Message-ID: NEW BOOK ON SELF-ORGANIZATION AND TOPOGRAPHIC MAPS ================================================== Title: Faithful Representations and Topographic Maps From Distortion- to Information-based Self-organization Author: Marc M. Van Hulle Publisher: J. Wiley & Sons, Inc. Publication Date: February 2000 with forewords by Teuvo Kohonen and Helge Ritter ------------------------------------------------------------------------------- A new perspective on topographic map formation and the advantages of information-based learning The study of topographic map formation provides us with important tools for both biological modeling and statistical data modeling. Faithful Representations and Topographic Maps offers a unified, systematic survey of this rapidly evolving field, focusing on current knowledge and available techniques for topographic map formation. The author presents a cutting-edge, information-based learning strategy for developing equiprobabilistic topographic maps -- that is, maps in which all neurons have an equal probability to be active --, clearly demonstrating how this approach yields faithful representations and how it can be successfully applied in such areas as density estimation, regression, clustering, and feature extraction. The book begins with the standard approach of distortion-based learning, discussing the commonly used Self-Organizing Map (SOM) algorithm and other algorithms, and pointing out their inadequacy for developing equiprobabilistic maps. It then examines the advantages of information-based learning techniques, and finally introduces a new algorithm for equiprobabilistic topographic map formation using neurons with kernel-based response characteristics. The complete learning algorithms and simulation details are given throughout, along with comparative performance analysis tables and extensive references. Faithful Representations and Topographic Maps is an excellent, eye-opening guide for neural network researchers, industrial scientists involved in data mining, and anyone interested in self-organization and topographic maps. ------------------------------------------------------------------------------- "I am convinced that this book marks an important contribution to the field of topographic map representations and that it will become a major reference for many years." (Ritter) "This book will provide a significant contribution to our theoretical understanding of the brain." 
(Kohonen) ------------------------------------------------------------------------------- http://www.amazon.com/exec/obidos/ASIN/0471345075/qid=948382599/sr=1-1/002-0713799-7248240 http://www.barnesandnoble.com/ search for (Keyword): Faithful representations From d.lowe at aston.ac.uk Mon Mar 13 06:23:03 2000 From: d.lowe at aston.ac.uk (David LOWE) Date: Mon, 13 Mar 2000 11:23:03 +0000 Subject: Faculty Position in Information Engineering Message-ID: <0003131145142U.03816@nn-bernoulli.aston.ac.uk> Lecturer in Information Engineering Neural Computing Research Group Aston University, UK Aston University is seeking to appoint an inspired, research-led individual to the Information Engineering Group within the School of Engineering and Applied Science. Information Engineering encompasses the Neural Computing Research Group, one of the premier research groups in this area in Europe. The research areas of the Group are now very diverse, covering both theoretical and practical aspects of information analysis. These include biomedical signal analysis, theory and algorithms of novel neural network structures, such as support vector machines and Gaussian processes, inference and graphical models, nonlinear time series modelling and a range of application domains. Another aspect of the group's activities focuses on the relation between statistical physics and a variety of methods in information analysis (for instance, support vector machines, Gaussian processes and Bayesian networks) and in communication (error correcting codes and cryptography). Current research contracts are valued at around ukp 1 million. The NCRG also runs a research-based MSc course on Pattern Analysis and Neural Networks. We are seeking a strong research-active person who is also capable of contributing to the current and future-planned taught activities. More details on the taught activities can be obtained from www.maths.aston.ac.uk, and further details of the research activities can be found on www.ncrg.aston.ac.uk. Members of the research team are involved in the organisation of most of the major conferences in the respective areas, and the Group has links with several Government and industrial organisations through its research. The group runs its own computing facilities based mainly on Silicon Graphics, Sun and Alpha computers (including a multiprocessor Silicon Graphics Challenge machine and a powerful multiprocessor Alpha DS20). Although the position is intended to be permanent, Aston University policy is to appoint all staff for a fixed term (usually 5 years for lecturers) in the first instance, with a transfer to a continuing appointment usually made during the fifth year. Note that this position is equivalent to an assistant professor in other world locations. Informal enquiries can be made via email to Prof Lowe (d.lowe at aston.ac.uk). Further details may be obtained from the Personnel Office at Aston University. Interested individuals should send a comprehensive resume and a list of referees to either the Personnel Office or to the Group's administrator (v.j.bond at aston.ac.uk). Electronic submissions are welcome. The closing date for submissions is 1st May 2000. From reder at cmu.edu Mon Mar 13 09:30:05 2000 From: reder at cmu.edu (Lynne M. 
Reder) Date: Mon, 13 Mar 2000 09:30:05 -0500 Subject: postdoctoral opportunity Message-ID: The training grant on Combined Training in Computational and Behavioral approaches to the Study of Cognition is seeking two new postdoctoral fellows, one starting in June of 2000 (i.e., the degree is already in hand before the end of this June). The second position is a bit more flexible in start time. The successful applicants need not already have experience in computational modeling but must be interested in learning. The training grant is only open to US citizens or nationals (permanent residents). Starting salary for the stipend is $26,256 and goes up with number of years of postdoctoral experience (the stipend is set by NIMH). There is also a travel allowance of $800 and a budget for ancillary expenses of $2500. The computational approaches that we offer are symbolic, subsymbolic and hybrid. Please see the following web page for more information: http://www.psy.cmu.edu/~reder/ Applications will be accepted until both positions are filled. In addition, I am looking for a postdoctoral fellow for a position in my lab (independent of the training grant). My computational approach is localist so this newsgroup may not be the appropriate venue for such an advertisement. ========================================================== Lynne M. Reder, Professor Department of Psychology Carnegie Mellon University Pittsburgh, PA 15213 phone: (412)268-3792 fax: (412) 268-2844 email: reder at cmu.edu URL: http://www.andrew.cmu.edu/~reder/reder.html From bvr at stanford.edu Wed Mar 15 03:10:18 2000 From: bvr at stanford.edu (Benjamin Van Roy) Date: Wed, 15 Mar 2000 00:10:18 -0800 Subject: CALL FOR WORKSHOP PROPOSALS -- NIPS*2000 Message-ID: <4.2.0.58.20000315000959.00ca6100@bvr.pobox.stanford.edu> CALL FOR WORKSHOP PROPOSALS -- NIPS*2000 ===================================== Neural Information Processing Systems Natural and Synthetic NIPS*2000 Post-Conference Workshops December 1 and 2, 2000 Breckenridge, Colorado ===================================== Following the regular program of the Neural Information Processing Systems 2000 conference, workshops on various current topics in neural information processing will be held on December 1 and 2, 2000, in Breckenridge, Colorado. Proposals by qualified individuals interested in chairing one of these workshops are solicited. Example topics include: Active Learning, Architectural Issues, Attention, Audition, Bayesian Analysis, Bayesian Networks, Benchmarking, Brain Imaging, Computational Complexity, Computational Molecular Biology, Control, Genetic Algorithms, Graphical Models, Hippocampus and Memory, Hybrid Supervised/Unsupervised Learning Methods, Hybrid HMM/ANN Systems, Implementations, Independent Component Analysis, Mean-Field Methods, Markov Chain Monte-Carlo Methods, Music, Network Dynamics, Neural Coding, Neural Plasticity, On-Line Learning, Optimization, Recurrent Nets, Robot Learning, Rule Extraction, Self-Organization, Sensory Biophysics, Signal Processing, Spike Timing, Support Vectors, Speech, Time Series, Topological Maps, and Vision. The goal of the workshops is to provide an informal forum for researchers to discuss important issues of current interest. There will be six hours of workshop meetings per day, split into morning and afternoon sessions, with free time in between for ongoing individual exchange or outdoor activities. Controversial issues, open problems, and comparison of competing approaches are encouraged and preferred as workshop topics. 
Representation of alternative viewpoints and panel-style discussions are particularly encouraged. Descriptions of previous workshops may be found at http://www.cs.cmu.edu/Groups/NIPS/NIPS99/Workshops/ Select workshops may be invited to submit their workshop proceedings for publication as part of a new series of monographs for the post-NIPS workshops. Workshop organizers will have responsibilities including: ++ coordinating workshop participation and content, which includes arranging short informal presentations by experts, arranging for expert commentators to sit on a discussion panel, formulating a set of discussion topics, etc. ++ moderating the discussion, and reporting its findings and conclusions to the group during evening plenary sessions ++ writing a brief summary and/or coordinating submitted material for post-conference electronic dissemination. ======================= Submission Instructions ======================= Interested parties should submit a short proposal for a workshop of interest via email by May 26, 2000. Proposals should include title, description of what the workshop is to address and accomplish, proposed workshop length (1 or 2 days), planned format (mini-conference, panel discussion, combinations of the above, etc), and proposed speakers. Names of potential invitees should be given where possible. Preference will be given to workshops that reserve a significant portion of time for open discussion or panel discussion, as opposed to pure "mini-conference" format. An example format is: ++ Tutorial lecture providing background and introducing terminology relevant to the topic. ++ Two short lectures introducing different approaches, alternating with discussions after each lecture. ++ Discussion or panel presentation. ++ Short talks or panels alternating with discussion and question/answer sessions. ++ General discussion and wrap-up. We suggest that organizers allocate at least 50% of the workshop schedule to questions, discussion, and breaks. Past experience suggests that workshops otherwise degrade into mini-conferences as talks begin to run over. The proposal should motivate why the topic is of interest or controversial, why it should be discussed, and who the targeted group of participants is. It also should include a brief resume of the prospective workshop chair with a list of publications to establish scholarship in the field. Submissions should include contact name, address, email address, phone and fax numbers. Proposals should be emailed to caruana at cs.cmu.edu. Proposals must be RECEIVED by May 26, 2000. If email is unavailable, mail to: NIPS Workshops, Rich Caruana, SCS CMU, 5000 Forbes Avenue, Pittsburgh, PA 15213, USA. Questions may be addressed to either of the Workshop Co-Chairs: Rich Caruana (caruana at cs.cmu.edu) Virginia de Sa (desa at phy.ucsf.edu) PROPOSALS MUST BE RECEIVED BY MAY 26, 2000 From bvr at stanford.edu Wed Mar 15 03:09:56 2000 From: bvr at stanford.edu (Benjamin Van Roy) Date: Wed, 15 Mar 2000 00:09:56 -0800 Subject: CALL FOR PAPERS -- NIPS*2000 Message-ID: <4.2.0.58.20000315000937.00c72500@bvr.pobox.stanford.edu> CALL FOR PAPERS -- NIPS*2000 ========================================== Neural Information Processing Systems Natural and Synthetic Monday, Nov. 27 -- Saturday, Dec. 
2, 2000 Denver, Colorado ========================================== This is the fourteenth meeting of an interdisciplinary conference which brings together cognitive scientists, computer scientists, engineers, neuroscientists, physicists, statisticians, and mathematicians interested in all aspects of neural processing and computation. The conference will include invited talks as well as oral and poster presentations of refereed papers. The conference is single track and is highly selective. Preceding the main session, there will be one day of tutorial presentations (Nov. 27), and following it there will be two days of focused workshops on topical issues at a nearby ski area (Dec. 1-2). Major categories for paper submission, with example subcategories (by no means exhaustive), are listed below. A special area of emphasis this year is innovative applications of neural computation. Algorithms and Architectures: supervised and unsupervised learning algorithms, feedforward and recurrent network architectures, localized basis functions, mixture models, committee models, belief networks, graphical models, support vector machines, Gaussian processes, topographic maps, decision trees, factor analysis, principal component analysis and extensions, independent component analysis, model selection algorithms, combinatorial optimization, hybrid symbolic-subsymbolic systems. Applications: innovative applications of neural computation including data mining, information retrieval, web and network applications, intrusion detection, fraud detection, bio-informatics, medical diagnosis, image processing and analysis, handwriting recognition, industrial monitoring and control, financial analysis, time-series prediction, consumer products, music, video and artistic applications, animation, virtual environments, learning dynamical systems. Cognitive Science/Artificial Intelligence: perception and psychophysics, neuropsychology, cognitive neuroscience, development, conditioning, human learning and memory, attention, language, natural language, reasoning, spatial cognition, emotional cognition, conceptual representation, neurophilosophy, problem solving and planning. Implementations: analog and digital VLSI, optical neurocomputing systems, novel neurodevices, computational sensors and actuators, simulation tools. Neuroscience: neural encoding, spiking neurons, synchronicity, sensory processing, systems neurophysiology, neuronal development, synaptic plasticity, neuromodulation, dendritic computation, channel dynamics, experimental data relevant to computational issues. Reinforcement Learning and Control: exploration, planning, navigation, Q-learning, TD-learning, state estimation, dynamic programming, robotic motor control, process control, Markov decision processes. Speech and Signal Processing: speech recognition, speech coding, speech synthesis, speech signal enhancement, auditory scene analysis, source separation, applications of hidden Markov models to signal processing, models of human speech perception, auditory modeling and psychoacoustics. Theory: computational learning theory, statistical physics of learning, information theory, Bayesian methods, prediction and generalization, regularization, online learning (stochastic approximation), dynamics of learning, approximation and estimation theory, complexity theory, multi-agent learning. Visual Processing: image processing, image coding, object recognition, visual psychophysics, stereopsis, motion detection and tracking. 
---------------------------------------------------------------------- Review Criteria: All submitted papers will be thoroughly refereed on the basis of technical quality, significance, and clarity. Novelty of the work is also a strong consideration in paper selection, but to encourage interdisciplinary contributions, we will consider work which has been submitted or presented in part elsewhere, if it is unlikely to have been seen by the NIPS audience. Authors new to NIPS are strongly encouraged to submit their work, and will be given preference for oral presentations. Authors should not be dissuaded from submitting recent work, as there will be an opportunity after the meeting to revise accepted manuscripts before submitting a final camera-ready copy for the proceedings. Paper Format: Submitted papers may be up to seven pages in length, including figures and references, using a font no smaller than 10 point. Text is to be confined within an 8.25in by 5in rectangle. Submissions failing to follow these guidelines will not be considered. Authors are required to use the NIPS LaTeX style files obtainable by anonymous FTP at the site given below. THE STYLE FILES HAVE BEEN UPDATED; please make sure that you use the current ones and not previous versions. Submission Instructions: NIPS has migrated to electronic submissions. Full submission instructions will be available at the web site given below. You will be asked to enter paper title, names of all authors, category, oral/poster preference, and contact author data (name, full address, telephone, fax, and email). You will upload your manuscript from the same page. We are only accepting PostScript manuscripts. No PDF files will be accepted this year. The electronic submission page will be available on April 28, 2000. Submission Deadline: SUBMISSIONS MUST BE LOGGED BY MIDNIGHT MAY 19, 2000 PACIFIC DAYLIGHT TIME (08:00 GMT May 20). The LaTeX style files for NIPS, the Electronic Submission Page, and other conference information are available on the World Wide Web at http://www.cs.cmu.edu/Web/Groups/NIPS Copies of the style files are also available via anonymous ftp at ftp.cs.cmu.edu (128.2.242.152) in /afs/cs/Web/Groups/NIPS/formatting. For general inquiries or requests for registration material, send e-mail to nipsinfo at salk.edu or fax to (619)587-0417. NIPS*2000 Organizing Committee: General Chair, Todd K. Leen, Oregon Graduate Institute; Program Chair, Tom Dietterich, Oregon State University; Publications Chair, Volker Tresp, Siemens AG; Tutorial Chair, Mike Mozer, University of Colorado; Workshops Co-Chairs, Rich Caruana, Carnegie Mellon University, Virginia de Sa, Sloan Center for Theoretical Neurobiology; Publicity Chair, Benjamin Van Roy, Stanford University; Treasurer, Bartlett Mel, University of Southern California; Web Masters, Doug Baker and Alex Gray, Carnegie Mellon University; Government Liaison, Gary Blasdel, Harvard Medical School; Contracts, Steve Hanson, Rutgers University, Scott Kirkpatrick, IBM, Gerry Tesauro, IBM.
NIPS*2000 Program Committee: Leon Bottou, AT&T Labs - Research; Tom Dietterich, Oregon State University (chair); Bill Freeman, Mitsubishi Electric Research Lab; Zoubin Ghahramani, University College London; Dan Hammerstrom, Oregon Graduate Institute; Thomas Hofmann, Brown University; Tommi Jaakkola, MIT; Sridhar Mahadevan, Michigan State University; Klaus Obermeyer, TU Berlin; Manfred Opper, Aston University; Yoram Singer, Hebrew University of Jerusalem; Malcolm Slaney, Interval Research; Josh Tenenbaum, Stanford University; Sebastian Thrun, Carnegie Mellon University. PAPERS MUST BE SUBMITTED BY MAY 19, 2000 From yilin at stat.wisc.edu Wed Mar 15 13:31:17 2000 From: yilin at stat.wisc.edu (Yi Lin) Date: Wed, 15 Mar 2000 12:31:17 -0600 (CST) Subject: Paper announcement: SVM in nonstandard situations Message-ID: Dear Connectionists, A paper on support vector machines for classification in nonstandard situations (with unequal misclassification costs or sampling bias present) is now available online: http://www.stat.wisc.edu/~yilin or http://www.stat.wisc.edu/~wahba Title and abstract are below: --------------------------------------------------------------------------- Support Vector Machines for Classification in Nonstandard Situations Yi Lin, Yoonkyung Lee, and Grace Wahba The majority of classification algorithms are developed for the standard situation in which it is assumed that the examples in the training set come from the same distribution as that of the target population, and that the costs of misclassification into different classes are the same. However, these assumptions are often violated in real world settings. For some classification methods, this can often be taken care of simply with a change of threshold; for others, additional effort is required. In this paper, we explain why the standard support vector machine is not suitable for the nonstandard situation, and introduce a simple procedure for adapting the support vector machine methodology to the nonstandard situation. Theoretical justification for the procedure is provided. A simulation study illustrates that the modified support vector machine significantly improves upon the standard support vector machine in the nonstandard situation. The computational load of the proposed procedure is the same as that of the standard support vector machine. The procedure reduces to the standard support vector machine in the standard situation. From pvdputten at smr.nl Fri Mar 17 09:54:11 2000 From: pvdputten at smr.nl (Peter van der Putten) Date: Fri, 17 Mar 2000 15:54:11 +0100 Subject: CoIL Competition Challenge 2000: Present your results in Greece Message-ID: Provide the best solution in the CoIL Challenge 2000 before May 4 and get free registration and travel support for the CoIL'2000 Symposium on June 19-23 in Chios, Greece! Direct mailings to a company's potential customers - "junk mail" to many - can be a very effective way for them to market a product or a service. However, as we all know, much of this junk mail is really of no interest to the majority of the people that receive it. Most of it ends up thrown away, not only wasting the money that the company spent on it, but also filling up landfill waste sites or needing to be recycled. If the company had a better understanding of who their potential customers were, they would know more accurately who to send it to, so some of this waste and expense could be reduced.
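The cost-weighting idea in the Lin, Lee and Wahba abstract above, and the rare-target flavour of the direct-mailing problem described here, can be illustrated with a small sketch using present-day tools. This is not the authors' procedure: it simply weights the hinge loss per class through scikit-learn's class_weight option, and the synthetic data, the cost ratio and the parameters are all invented for the example.

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import confusion_matrix

# synthetic, imbalanced data: roughly 5% positives (a stand-in for a rare-buyer problem)
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.95, 0.05],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# standard SVM versus cost-weighted SVM (assume missing a positive costs ten times
# as much as a wasted mailing; the ratio is invented for illustration)
for w in (None, {0: 1.0, 1: 10.0}):
    clf = SVC(kernel="rbf", C=1.0, class_weight=w).fit(X_tr, y_tr)
    print(w, confusion_matrix(y_te, clf.predict(X_te)))

With the class weights in place the decision boundary shifts toward the rare class, trading extra false alarms for fewer missed positives, which is the behaviour one wants when the two misclassification costs differ.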
Therefore, following a successful CoIL competition last year (see Synergy Issue 1, Winter 1999), CoIL has just announced a new competition challenge for 2000: "Can you predict who would be interested in buying a caravan insurance policy and give an explanation why?" This competition is organized by CoIL, the Computational Intelligence and Learning Cluster. The goal of CoIL is to achieve scientific, technical and "social" integration of four European communities that perform research, development and application: Erudit (Fuzzy logic), EvoNet (Evolutionary computing), MLNet (Machine learning) and NEuroNet (Neural networks). The data was supplied by the Dutch data mining company Sentient Machine Research. We encourage all types of solutions to these problems, particularly those involving any of the CoIL technologies or any combination of these. We are also interested in other solutions using other technologies, since CoIL is interested in being able to demonstrate how CoIL technologies relate to other techniques. The winners will be invited to present a short paper on their approach at the CoIL'2000 Symposium on Computational Intelligence and Learning (19-23 June 2000), in Chios, Greece. Your participation will be free of charge and we will pay you a travel support of 750 Euro. Important dates: 17 March 2000 Release of data 4 May 2000 Deadline for submissions 12 May 2000 Announcement of winners 22-23 June 2000 CoIL'2000 Symposium For more information, please visit CoILWeb: http://www.dcs.napier.ac.uk/coil/. Best wishes, Peter van der Putten Consultant Sentient Machine Research From fritsch at ira.uka.de Sun Mar 19 13:09:07 2000 From: fritsch at ira.uka.de (Juergen Fritsch) Date: Sun, 19 Mar 2000 19:09:07 +0100 Subject: PhD thesis available Message-ID: <38D517C3.E27D148E@ira.uka.de> Dear Connectionists, My PhD thesis on hierarchical connectionist acoustic modeling for large vocabulary speech recognition is now available on the WWW at http://isl.ira.uka.de/~fritsch For those interested, I have appended the abstract. Best regards, --Juergen Fritsch. ========================================================== Juergen Fritsch Research Scientist ---------------------------------------------------------- Interactive Systems Labs University of Karlsruhe & Carnegie Mellon University phone:++49-721-6086285 http://isl.ira.uka.de/~fritsch fax:++49-721-607721 email: fritsch at ira.uka.de ========================================================== Abstract: Hierarchical Connectionist Acoustic Modeling for Domain-Adaptive Large Vocabulary Speech Recognition Juergen Fritsch PhD Thesis, 238 pages Interactive Systems Labs Faculty of Computer Science University of Karlsruhe Germany ABSTRACT This thesis presents a new, hierarchical framework for connectionist acoustic modeling in large vocabulary statistical speech recognition systems. Based on the divide and conquer paradigm, the task of estimating HMM state posteriors is decomposed and distributed in the form of a tree-structured architecture consisting of thousands of small neural networks. In contrast to monolithic connectionist models, our approach scales to arbitrarily large state spaces. Phonetic context is represented simultaneously at multiple resolutions, which allows for scalable acoustic modeling.
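The tree-structured decomposition described above can be sketched generically: the posterior over a large set of states is factored into a product of small conditional models along a root-to-leaf path, so only a handful of tiny models has to be evaluated per state. The sketch below is not Fritsch's architecture; the binary tree, the logistic node models and the random parameters are all invented for illustration.

import numpy as np

rng = np.random.default_rng(0)
n_features, depth = 8, 3                 # 2**depth = 8 leaf "states"

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# one tiny "expert" (here just a logistic weight vector) per internal tree node
weights = {node: rng.normal(size=n_features) for node in range(2**depth - 1)}

def leaf_posterior(x, leaf):
    """P(leaf | x) as the product of branch probabilities on the root-to-leaf path."""
    prob, node = 1.0, 0
    for d in reversed(range(depth)):
        go_right = (leaf >> d) & 1                  # d-th bit of the leaf index
        p_right = sigmoid(weights[node] @ x)        # this node's local decision
        prob *= p_right if go_right else 1.0 - p_right
        node = 2 * node + 1 + go_right              # descend to the chosen child
    return prob

x = rng.normal(size=n_features)
posteriors = np.array([leaf_posterior(x, leaf) for leaf in range(2**depth)])
print(posteriors.sum())                  # ~1.0: the factorisation is a valid distribution

Because a single state involves only the depth-many models on its path, branches whose partial product is already negligible can be pruned, which is what makes this kind of factorisation attractive for very large state inventories.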
We demonstrate that the hierarchical structure allows for (1) accelerated score computations through dynamic tree pruning, (2) effective speaker adaptation with limited amounts of adaptation data, and (3) downsizing of the trained model for small memory footprints. The viability of the proposed hierarchical model is demonstrated in recognition experiments on the Switchboard large vocabulary conversational telephone speech corpus, currently considered the most difficult standardized speech recognition benchmark, where it achieves state-of-the-art performance with fewer parameters and faster recognition times than conventional mixture models. The second contribution of this thesis is an algorithm that allows for domain-adaptive speech recognition using the proposed hierarchical acoustic model. In contrast to humans, automatic speech recognition systems still suffer from a strong dependence on the application domain they have been trained on. Typically, a speech recognition system has to be tailored to a specific application domain to reduce semantic, syntactic and acoustic variability and thus increase recognition accuracy. Unfortunately, this approach results in a lack of portability as performance typically deteriorates unacceptably when moving to a new application domain. We present Structural Domain Adaptation (SDA), an algorithm for hierarchically organized acoustic models that exploits the scalable specificity of phonetic context modeling by modifying the tree structure for optimal performance on previously unseen application domains. We demonstrate the effectiveness of the SDA approach by adapting a large vocabulary conversational telephone speech recognition system to (1) a telephone dictation task and (2) spontaneous scheduling of meetings. SDA, together with domain-specific dictionaries and language models, makes it possible to match the performance of domain-specific models with only 45-60 minutes of acoustic adaptation data. From king at harvard.edu Mon Mar 20 17:09:28 2000 From: king at harvard.edu (Gary King) Date: Mon, 20 Mar 2000 17:09:28 -0500 (EST) Subject: positions at HU Message-ID: We'd like to fill these positions with lots of connectionists! --------------------------------------------------------------------- Visiting Faculty, Post-doctoral, and Pre-doctoral Positions at Harvard A major new initiative, "Military Conflict as a Public Health Problem," will commence at Harvard University beginning in the 2000-2001 academic year. This project will support political scientists, statistical methodologists, and public health scholars interested in pursuing their own work or joint work related to this project. We are offering research positions for faculty, post-docs, and graduate students (including salary, office space, and computer access; there are no teaching or administrative duties) for those interested in: - forecasting and explaining international conflict and civil wars - describing or explaining the direct and indirect public health consequences of military conflict - utilizing variables that are usually used to explain or forecast conflict to study the more ultimate dependent variable of human misery (e.g., political scientists have found that democracies do not fight each other as often as other types of countries, but they have not studied in this way the ultimate consequences of democracy for human well-being.)
- developing statistical methods for analyzing these data -- such as neural network models, CART, spatial statistics, data mining, hierarchical Bayesian models for numerous short time series or multiple cross-sections, models for complex dependence structures such as analyzing the presence of war or other variables in pairs of countries, forecasting models, statistical pattern recognition, visualization in large data sets, etc. - conducting research on human security, expanding the notion of military security to include other aspects of human well-being. - analyzing the best data in existence on global mortality and morbidity (by country, age, sex, and cause), military conflict (both international and civil), and thousands of explanatory variables corresponding to known or suspected predictors of each, at every level of aggregation available. Participants will have unrestricted access to these data. - other related topics. A key motivation for this project is to forge new alliances across disciplines, and so we do not expect applicants to be familiar with military conflict AND new statistical approaches AND public health research; expert knowledge in one of these fields and an interest in learning about or contributing to one of the others is sufficient. Please send a letter describing your research interests, a C.V., at least two letters of reference, and a sample of your scholarly work to Lara Birk, Center for Basic Research in the Social Sciences, 34 Kirkland Street, Harvard University, Cambridge, MA 02138; email: lbirk at latte.harvard.edu; fax 617-496-5149; phone 617-495-9271. Please get your application in by April 10th if possible. Please refer any questions you may have to Ms. Birk. The Principal Investigators for this project are Gary King (Professor of Government, Harvard; Director, Harvard-MIT Data Center; and Advisor to the World Health Organization (WHO)) and Christopher Murray (Director, Global Programme on Evidence for Health Policy, WHO; Professor of International Health Economics, Harvard School of Public Health). The project Advisory Committee includes James Alt (Professor of Government, Harvard, and Director of CBRSS), Bear Braumoeller (Assistant Professor of Government, Harvard), Paul E. Farmer, Jr. (Associate Professor, Department of Social Medicine, Harvard Medical School and Director, Institute for Health and Social Justice), Lisa Martin (Professor of Government, Harvard), Jasjeet Sekhon (Assistant Professor of Government, Harvard), Kenji Shibuya (Assistant Professor of Public Health, Teikyo University School of Medicine, Japan), and Langche Zeng (Associate Professor of Political Science, George Washington University and CBRSS fellow). The project is sponsored by the U.S. National Science Foundation, Harvard University's Weatherhead Center for International Affairs (WCFIA), Center for Basic Research in the Social Sciences (CBRSS), and Harvard-MIT Data Center, and it is in collaboration with the World Health Organization's Global Programme on Evidence for Health Policy. : Gary King, King at Harvard.Edu http://GKing.Harvard.Edu : : Center for Basic Research Internet Keyword: Gary King : : in the Social Sciences Direct (617) 495-2027 : : 34 Kirkland Street, Rm. 2 Assistant (617) 495-9271 : : Harvard U, Cambridge, MA 02138 eFax (520) 832-7022 : From bsc at microsoft.com Mon Mar 20 18:48:08 2000 From: bsc at microsoft.com (Bernhard Schoelkopf) Date: Mon, 20 Mar 2000 15:48:08 -0800 Subject: PhD studentship in SVM novelty detection, Univ. 
Oxford / Microsoft Cambridge Message-ID: Applications are invited for a PhD studentship supervised by Prof. L. Tarassenko (Neural Networks and Signal Processing Group, University of Oxford) and Dr. B. Schoelkopf (Microsoft Research, Cambridge, UK) in the field of Support Vector Machines (SVMs) for novelty detection. The detection of novelty is an important generic problem in health monitoring, and the SVM paradigm is an excellent theoretical framework for this. The student will be expected to develop state-of-the-art machine learning algorithms, with a view to applying them to real world problems such as epileptic seizure detection. Applicants with an excellent degree in Computer Science, Mathematics, Engineering, or Physics are invited to contact Prof. Tarassenko (Lionel.Tarassenko at eng.ox.ac.uk) or Dr. Schoelkopf (bsc at microsoft.com) for further information. The student will be based at Oxford University, with the possibility of an internship at Microsoft Research during the summer vacations. The studentship, starting 1/10/2000, is generously funded by Microsoft. http://www.eng.ox.ac.uk/~wpcres/Summary/B-Neural.html http://www.research.microsoft.com/~bsc With kind regards Bernhard From steve at cns.bu.edu Mon Mar 20 21:05:17 2000 From: steve at cns.bu.edu (Stephen Grossberg) Date: Mon, 20 Mar 2000 21:05:17 -0500 Subject: The Complementary Brain: A Unifying View of Brain Specialization and Modularity Message-ID: The following article is available at http://www.cns.bu.edu/Profiles/Grossberg/ in HTML, PDF, and Gzipped postscript: Grossberg, S. (2000). The complementary brain: A unifying view of brain specialization and modularity. Trends in Cognitive Sciences, in press. Preliminary version appears as Boston University Technical Report CAS/CNS-TR-98-003. Abstract: How are our brains functionally organized to achieve adaptive behavior in a changing world? This article presents one alternative to the computer metaphor, which suggests that brains are organized into independent modules. Evidence is reviewed that brains are organized into parallel processing streams with complementary properties. Hierarchical interactions within each stream and parallel interactions between streams create coherent behavioral representations that overcome the complementary deficiencies of each stream and support unitary conscious experiences. This perspective suggests how brain design reflects the organization of the physical world with which brains interact. Examples from perception, learning, cognition, and action are described, and theoretical concepts and mechanisms by which complementarity is accomplished are presented. Keywords: modulatory, What and Where processing, visual cortex, motor cortex, reinforcement, recognition, attention, learning, expectation, volition, speech, neural network From sunita at it.iitb.ernet.in Tue Mar 21 01:13:26 2000 From: sunita at it.iitb.ernet.in (Dr. Sunita Sarawagi) Date: Tue, 21 Mar 2000 11:43:26 +0530 (IST) Subject: SIGKDD Explorations: call for papers: volume 2, issue 1 Message-ID: We invite submissions to the first issue of the second volume of SIGKDD Explorations to be published by the middle of the year. SIGKDD Explorations is the official newsletter of ACM's new Special Interest Group (SIG) on Knowledge Discovery and Data Mining. Two issues of the first volume are already out and available online at http://www.acm.org/sigkdd/explorations/index.htm. SIGKDD Explorations newsletter is sent to the ACM SIGKDD membership and to a world-wide network of libraries.
Submissions can be made in any one of the following categories. - survey/tutorial articles (short) on important topics not exceeding 20 pages - topical articles on problems and challenges - well-articulated position papers - technical articles not exceeding 15 pages. - news items on the order of 1-3 paragraphs - Brief announcements not exceeding 5 lines in length. - review articles of products and methodologies not exceeding 20 pages - reviews/summaries from conferences, panels and special meetings. - reports on relevant meetings and committees related to the field *NEW*: We have also added a for-pay advertisement section to allow vendors, companies, consultants, and others to reach the rapidly growing SIGKDD community. Advertising rates start at $250 for a quarter page, $500 for a half page, and $1000 for a full page. Submissions should be made to fayyad at acm.org or sunita at cs.berkeley.edu. All submissions must arrive by May 21, 2000 for inclusion in the next issue. Some words about the SIGKDD newsletter: -------------------------------------- SIGKDD Explorations is a bi-annual newsletter dedicated to serving the SIGKDD membership and community. Our goal is to make the SIGKDD newsletter an informative, rapidly published, and interesting forum for communicating with the SIGKDD community. Submissions will be reviewed by the editor and/or associate editors as appropriate. The distribution will be very wide (on the web, to all members but probably not restricted access the first year, and to ACM's world-wide network of libraries). Members get e-mail notifications of new issues and get hardcopies if they desire. For more information on SIGKDD visit http://www.acm.org/sigkdd and for more information on the newsletter visit http://www.acm.org/sigkdd/explorations/index.htm Usama Fayyad, Editor-in-Chief fayyad at acm.org Sunita Sarawagi, Associate Editor sunita at cs.berkeley.edu From sally at svl.co.uk Tue Mar 21 04:12:48 2000 From: sally at svl.co.uk (Tickner, Sally) Date: Tue, 21 Mar 2000 10:12:48 +0100 Subject: FW: PERSPECTIVES IN NEURAL COMPUTING BOOK SERIES Message-ID: March 21st 2000 BOOK ANNOUNCEMENT Perspectives in Neural Computing Series Editor: John Taylor Perspectives in Neural Computing is a series of books on both the theoretical and applied aspects of neural computation. Designed to reflect the multidisciplinary nature of the subject, it provides research and tutorial texts for students and professional researchers. JUST PUBLISHED! Artificial Neural Networks in Biomedicine Paulo J.G. Lisboa, Emmanuel C. Ifeachor and Piotr S. Szczepaniak (Eds) If you are: a neural network practitioner involved with biomedical applications; a computer scientist working in medicine or looking for medical applications of computational intelligence methods; a developer and manufacturer of clinical computer systems; a medical researcher looking for new methods and computational tools; or a (post)graduate student on a relevant computer science or engineering course, then this book is for you.
Among the contents are: * a set of tutorial papers covering established methods of best practice in neural network design * an extensive collection of case studies covering commercially available products, recently granted patents, and a wide range of applications which this new methodology is opening up for practical development in biomedicine £45.00 February 2000 272 pages softcover ISBN 1-85233-005-8 To request more information about the series, or any other Springer-Verlag books and journals, please contact Sally Tickner, Senior Marketing Manager, Springer-Verlag London Ltd., Sweetapple House, Catteshall Road, Godalming, Surrey GU7 3DJ Tel: 01483 414113 Fax: 01483 415151 Email: sally at svl.co.uk www.springer.co.uk ** End Sally Tickner, Sr. Marketing Manager, Springer-Verlag London Ltd, Sweetapple House, Catteshall Road, Godalming, Surrey, GU7 3DJ, UK Tel: 01483 414113 Fax: 01483 415151 Email: sally at svl.co.uk www.springer.co.uk www.springer.de From dwang at cis.ohio-state.edu Tue Mar 21 16:35:52 2000 From: dwang at cis.ohio-state.edu (DeLiang Wang) Date: Tue, 21 Mar 2000 16:35:52 -0500 Subject: Recent papers on perception and neurodynamics Message-ID: <38D7EAE7.3C498E5@cis.ohio-state.edu> The following papers, available at http://www.cis.ohio-state.edu/~dwang/announce.html, may be of interest to the list: 1. Liu X. and Wang D.L. (1999): "Perceptual organization based on temporal dynamics." Proceedings of NIPS-99, in press. A figure-ground segregation network is proposed based on a novel boundary pair representation. Nodes in the network are boundary segments obtained through local grouping. Each node is excitatorily coupled with the neighboring nodes that belong to the same region, and inhibitorily coupled with the corresponding paired node. Gestalt grouping rules are incorporated by modulating connections. The status of a node represents its probability of being figural and is updated according to a differential equation. The system solves the figure-ground segregation problem through temporal evolution. Different perceptual phenomena, such as modal and amodal completion, virtual contours, grouping and shape decomposition are then explained through local diffusion. The system eliminates combinatorial optimization and accounts for many psychophysical results with a fixed set of parameters. 2. Wang D.L. (2000): "On connectedness: a solution based on oscillatory correlation." Neural Computation, vol. 12, pp. 131-139. A long-standing problem in neural computation has been the problem of connectedness, first identified by Minsky and Papert in 1969. This problem served as the cornerstone for them to analytically establish that perceptrons are fundamentally limited in computing geometrical (topological) properties. A solution to this problem is offered by a different class of neural networks - oscillator networks. To solve the problem, the representation of oscillatory correlation is employed whereby one pattern is represented as a synchronized block of oscillators, and different patterns are represented by distinct blocks that desynchronize from each other. Oscillatory correlation emerges from a LEGION network, whose architecture consists of local excitation and global inhibition among neural oscillators. It is further shown that these oscillator networks exhibit sensitivity to topological structure, which may lay a neurocomputational foundation for explaining the psychophysical phenomenon of topological perception. 3. Wang D.L. (1999): Relaxation oscillators and networks.
In Webster J. (ed.), Wiley Encyclopedia of Electrical and Electronics Engineering, Wiley & Sons, vol. 18, pp. 396-405. A tutorial article on oscillatory dynamics and its applications to auditory and visual scene analysis. -- ------------------------------------------------------------ Dr. DeLiang Wang Department of Computer and Information Science The Ohio State University 2015 Neil Ave. Columbus, OH 43210-1277, U.S.A. Email: dwang at cis.ohio-state.edu Phone: 614-292-6827 (OFFICE); 614-292-7402 (LAB) Fax: 614-292-2911 URL: http://www.cis.ohio-state.edu/~dwang From C.Campbell at bristol.ac.uk Wed Mar 22 08:23:21 2000 From: C.Campbell at bristol.ac.uk (Colin Campbell, Engineering Mathematics) Date: Wed, 22 Mar 2000 13:23:21 +0000 (GMT Standard Time) Subject: PhD studentship in kernel methods/SVMs Message-ID: ***PhD studentship: Kernel Methods for Bioinformatics*** Applications are invited for a PhD studentship in the field of kernel methods (for example, Support Vector Machines) and their application to biosequence data. This position is a project studentship funded by the EPSRC and will pay all fees and maintenance for an applicant who is a citizen of the European Union. The grant also includes ample funds for travel and conference attendance. The emphasis of the proposed research is on the development of new algorithms and theoretical work. Consequently applicants should have a good first degree with a substantial mathematical component. A background in computing would also be an advantage. The applications component will be the development and evaluation of kernel methods specifically designed for handling classification tasks arising in bioinformatics. Further details about the proposed research area may be obtained from our webpage http://lara.enm.bris.ac.uk/cig which has a downloadable review paper (An Introduction to Kernel Methods) describing the subject in more detail. Non-EU candidates may also apply for the studentship but because of the difference between EU and overseas fees only exceptionally able candidates can be considered. Further details can be obtained from: Dr. Colin Campbell, Dept. of Engineering Mathematics, Queen's Building, University of Bristol, Bristol BS8 1TR United Kingdom Email: C.Campbell at bris.ac.uk From nnsp00 at neuro.kuleuven.ac.be Thu Mar 23 08:26:06 2000 From: nnsp00 at neuro.kuleuven.ac.be (NNSP2000, Sydney) Date: Thu, 23 Mar 2000 14:26:06 +0100 Subject: IEEE workshop on Neural Networks for Signal Processing (NNSP), Sydney, Australia, December 2000. Message-ID: <38DA1B6E.B86F0D53@neuro.kuleuven.ac.be> In response to the many requests we have received recently regarding an extension of the paper submission deadline for NNSP'2000, the organizing committee has decided to extend the due date for initial paper submission to 15 April, 2000. The new version of the Call for Papers which reflects the change is attached for your information and reference. In case you would like to be removed from our mailing list: reply to this mail with "remove" as the subject, including the e-mail address at which you received this message. Marc M.
Van Hulle Katholieke Universiteit Leuven Belgium *********************************************** **** CALL FOR PAPERS **** **** submission deadline: April 15, 2000 **** *********************************************** December 11-13, 2000, Sydney, Australia NNSP'2000 homepage: http://eivind.imm.dtu.dk/nnsp2000 Thanks to the sponsorship of the IEEE Signal Processing Society and the IEEE Neural Networks Council, the tenth of a series of IEEE workshops on Neural Networks for Signal Processing will be held at the University of Sydney Campus, Sydney, Australia. The workshop will feature keynote lectures, technical presentations, and panel discussions. Papers are solicited for, but not limited to, the following areas: Algorithms and Architectures: Artificial neural networks (ANN), adaptive signal processing, Bayesian modeling, MCMC, generalization, design algorithms, optimization, parameter estimation, nonlinear signal processing, Markov models, fuzzy systems (FS), evolutionary computation (EC), synergistic models of ANN/FS/EC, and wavelets. Applications: Speech processing, image processing, sonar and radar, data fusion, intelligent multimedia and web processing, OCR, robotics, adaptive filtering, blind source separation, communications, sensors, system identification, and other general signal processing and pattern recognition applications. Implementations: Parallel and distributed implementation, hardware design, and other general implementation technologies. PAPER SUBMISSION PROCEDURE Prospective authors are invited to submit a full paper of up to six pages using the electronic submission procedure described at the workshop homepage: http://eivind.imm.dtu.dk/nnsp2000 Accepted papers will be published in a hard-bound volume by IEEE and distributed at the workshop. Extended versions of the best workshop papers will be selected and published in a Special Issue of an international journal published by Kluwer Academic Publishers. SCHEDULE Submission of full paper: April 15, 2000 Notification of acceptance: May 31, 2000 Submission of photo-ready accepted paper: July 15, 2000 Super Early registration, before: July 15, 2000 Advanced registration, before: September 15, 2000 ORGANIZATION Honorary Chair Bernard WIDROW Stanford University General Chairs Ling GUAN University of Sydney email: ling at ee.usyd.edu.au Kuldip PALIWAL Griffith University email: kkp at shiva2.me.gu.edu.au Program Chairs Tülay ADALI University of Maryland, Baltimore County email: adali at umbc.edu Jan LARSEN Technical University of Denmark email: jl at imm.dtu.dk Finance Chair Raymond Hau-San WONG University of Sydney email: hswong at ee.usyd.edu.au Proceedings Chairs Elizabeth J. WILSON Raytheon Co. email: bwilson at ed.ray.com Scott C. DOUGLAS Southern Methodist University email: douglas at seas.smu.edu Publicity Chair Marc van HULLE Katholieke Universiteit, Leuven email: Marc.VanHulle at med.kuleuven.ac.be Registration and Local Arrangements Stuart PERRY Defence Science and Technology Organisation email: Stuart.Perry at dsto.defence.gov.au Europe Liaison Jean-Francois CARDOSO ENST email: cardoso at sig.enst.fr America Liaison Amir ASSADI University of Wisconsin at Madison email: ahassadi at facstaff.wisc.edu Asia Liaison Andrew BACK RIKEN email: andrew.back at usa.net Program Committee Amir Assadi Yianni Attikiouzel John Asenstorfer Andrew Back Geoff Barton Hervé
Bourlard Andy Chalmers Zheru Chi Andrzej Cichocki Tharam Dillon Tom Downs Hsin Chia Fu Suresh Hangenahally Marwan Jabri Haosong Kong Shigeru Katagiri Anthony Kuh Yi Liu Fa-Long Luo David Miller Christophe Molina M Mohammadian Erkki Oja Soo-Chang Pei Jose Principe Ponnuthurai Suganthan Ah Chung Tsoi Marc Van Hulle A.N. Venetsanopoulos Yue Wang Wilson Wen From austin at minster.cs.york.ac.uk Thu Mar 23 10:52:19 2000 From: austin at minster.cs.york.ac.uk (Jim Austin) Date: Thu, 23 Mar 2000 15:52:19 +0000 Subject: PhD Studentship in Neural Networks for Image Analysis, York, UK. Message-ID: <10003231552.ZM15574@minster.cs.york.ac.uk> PhD Studentship in Neural Networks for Image Analysis Advanced Computer Architectures Group Dept. of Computer Science University of York York UK. Oct 2000 - Sep 2003 A University funded PhD studentship is available from Oct 2000 in the application of neural networks to image analysis problems. The work will build on research in our group on the use of neural network based associative memories for the high performance analysis of images, in particular the work by Simon O'Keefe and Christos Orovas. For details, see items 133 and 102 on our publications list at http://www.cs.york.ac.uk/arch/neural/publications/papers.html The project will be supervised by Prof. Jim Austin and Dr. Simon O'Keefe within an active group working in neural networks research. Details of the group's research can be found at http://www.cs.york.ac.uk/arch/neural For further details please contact Prof. Jim Austin (austin at cs.york.ac.uk) or write, for an application form, to Filomena Ottaway, Graduate Secretary, Dept. of Computer Science, University of York, York, YO10 5DD, UK, or email filo at cs.york.ac.uk. -- Jim Austin, Professor of Neural Computation Advanced Computer Architecture Group, Department of Computer Science, University of York, York, YO10 5DD, UK. Tel : 01904 43 2734 Fax : 01904 43 2767 web pages: http://www.cs.york.ac.uk/arch/ From stefan.wermter at sunderland.ac.uk Thu Mar 23 14:40:54 2000 From: stefan.wermter at sunderland.ac.uk (Stefan.Wermter) Date: Thu, 23 Mar 2000 19:40:54 +0000 Subject: Book on Hybrid Neural Systems Message-ID: <38DA7346.85D39EAC@sunderland.ac.uk> NEW BOOK ON HYBRID NEURAL SYSTEMS ==================================== Title: Hybrid Neural Systems Stefan Wermter, University of Sunderland, UK Ron Sun, University of Missouri, Columbia, MO, USA (Eds.) More details on the book Hybrid Neural Systems can be found on its web page at http://www.his.sunderland.ac.uk/ -> New Book and http://www.his.sunderland.ac.uk/newbook/hybrid.html (all abstracts and first chapter) Overview --------- Keywords: Artificial Neural Networks, Hybrid Neural Systems, Connectionism, Hybrid Symbolic Neural Architectures, Cognitive Neuroscience, Machine Learning, Language Processing The aim of this book is to present a broad spectrum of current research in hybrid neural systems, and to advance the state of the art in neural networks and artificial intelligence. Hybrid neural systems are computational systems which are based mainly on artificial neural networks but which also allow a symbolic interpretation or interaction with symbolic components. This book focuses on the following issues related to different types of representation: How does neural representation contribute to the success of hybrid systems? How does symbolic representation supplement neural representation? How can these types of representation be combined? How can we utilize their interaction and synergy?
How can we develop neural and hybrid systems for new domains? What are the strengths and weaknesses of hybrid neural techniques? Are current principles and methodologies in hybrid neural systems useful? How can they be extended? What will be the impact of hybrid and neural techniques in the future? Table of Contents ------------------ An Overview of Hybrid Neural Systems Stefan Wermter and Ron Sun Structured Connectionism and Rule Representation --------------------------------- Layered Hybrid Connectionist Models for Cognitive Science Jerome Feldman and David Bailey Types and Quantifiers in SHRUTI --- A Connectionist Model of Rapid Reasoning and Relational Processing Lokendra Shastri A Recursive Neural Network for Reflexive Reasoning Steffen Hölldobler, Yvonne Kalinke and Jörg Wunderlich A Novel Modular Neural Architecture for Rule-based and Similarity-based Reasoning Rafal Bogacz and Christophe Giraud-Carrier Addressing Knowledge-Representation Issues in Connectionist Symbolic Rule Encoding for General Inference Nam Seog Park Towards a Hybrid Model of First-Order Theory Refinement Nelson A. Hallack, Gerson Zaverucha and Valmir C. Barbosa Distributed Neural Architectures and Language Processing -------------------------------------- Dynamical Recurrent Networks for Sequential Data Processing Stefan Kremer and John Kolen Fuzzy Knowledge and Recurrent Neural Networks: A Dynamical Systems Perspective Christian W. Omlin, Lee Giles and Karvel K. Thornber Combining Maps and Distributed Representations for Shift-Reduce Parsing Marshall R. Mayberry and Risto Miikkulainen Towards Hybrid Neural Learning Internet Agents Stefan Wermter, Garen Arevian and Christo Panchev A Connectionist Simulation of the Empirical Acquisition of Grammatical Relations William C. Morris, Garrison W. Cottrell and Jeffrey L. Elman Large Patterns Make Great Symbols: An Example of Learning from Example Pentti Kanerva Context Vectors: A Step Toward a Grand Unified Representation Stephen I. Gallant Integration of Graphical Rules with Adaptive Learning of Structured Information Paolo Frasconi, Marco Gori and Alessandro Sperduti Transformation and Explanation --------------------- Lessons from Past, Current Issues and Future Research Directions in Extracting the Knowledge Embedded in Artificial Neural Networks Alan B. Tickle, Frederic Maire, Guido Bologna, Robert Andrews and Joachim Diederich Symbolic Rule Extraction from the DIMLP Neural Network Guido Bologna Understanding State Space Organization in Recurrent Neural Networks with Iterative Function Systems Dynamics Peter Tino, Georg Dorffner and Christian Schittenkopf Direct Explanations and Knowledge Extraction from a Multilayer Perceptron Network that Performs Low Back Pain Classification Marilyn L. Vaughn, Steven J. Cavill, Stewart J. Taylor, Michael A. Foy and Anthony J.B. Fogg High Order Eigentensors as Symbolic Rules in Competitive Learning Hod Lipson and Hava T. Siegelmann Holistic Symbol Processing and the Sequential RAAM: An Evaluation James A. Hammerton and Barry L.
Kalman Robotics, Vision and Cognitive Approaches -------------------------------------------- Life, Mind and Robots: The Ins and Outs of Embodied Cognition Noel Sharkey and Tom Ziemke Supplementing Neural Reinforcement Learning with Symbolic Methods Ron Sun Self-Organizing Maps in Symbol Processing Timo Honkela Evolution of Symbolisation: Signposts to a Bridge between Connectionist and Symbolic Systems Ronan Reilly A Cellular Neural Associative Array for Symbolic Vision Christos Orovas and James Austin Application of Neurosymbolic Integration for Environment Modelling in Mobile Robots Gerhard K. Kraetzschmar, Stefan Sablatnög, Stefan Enderle, Günther Palm ====================================================== Online order -------- http://www.springer.de/cgi-bin/search_book.pl?isbn=3-540-67305-9 Publisher: Springer Publication Date: 29 March 2000 Wermter, S., University of Sunderland, UK Sun, R., University of Missouri, Columbia, MO, USA (Eds.) Hybrid Neural Systems 2000. IX, 403 pp. 3-540-67305-9 DM 86,- Recommended List Price LNCS 1778 *************************************** Professor Stefan Wermter Research Chair in Intelligent Systems University of Sunderland Centre of Informatics, SCET St Peters Way Sunderland SR6 0DD United Kingdom phone: +44 191 515 3279 fax: +44 191 515 2781 email: stefan.wermter at sunderland.ac.uk http://www.his.sunderland.ac.uk/~cs0stw/ http://www.his.sunderland.ac.uk/ **************************************** From jf218 at hermes.cam.ac.uk Thu Mar 23 04:56:53 2000 From: jf218 at hermes.cam.ac.uk (Dr J. Feng) Date: Thu, 23 Mar 2000 09:56:53 +0000 (GMT) Subject: six papers available at my homepage Message-ID: Dear connectionists, You can find the following papers at http://www.cus.cam.ac.uk/~jf218 [46] Feng J., and Brown D.(2000) Integrate-and-fire models with nonlinear leakage Bulletin of Mathematical Biology (in press) ABSTRACT Can we express biophysical neuronal models as integrate-and-fire models with leakage coefficients which are no longer constant, as in the conventional leaky integrate-and-fire (IF) model, but functions of membrane potential and other biophysical variables? We illustrate the answer to this question using the FitzHugh-Nagumo (FHN) model as an example. A novel integrate-and-fire model, the IF-FHN model, which approximates the FHN model, is obtained. The leakage coefficients derived in the IF-FHN model have non-monotonic relationships with membrane potential, revealing at least in part the intrinsic mechanisms underlying the model. The model correspondingly exhibits more complex behaviour than the standard IF model. For example, in some parameter regions, the IF-FHN model has a coefficient of variation of output interspike interval which is independent of the number of inhibitory inputs, being close to unity over the whole range, comparable to the FHN model as we noted previously. [45] Davison A., Feng J., Brown D.(2000) A reduced compartmental model of the mitral cell for use in network models of the olfactory bulb Brain Research Bulletin vol. 51, 393-399. ABSTRACT We have developed two-, three and four-compartment models of a mammalian olfactory bulb mitral cell as a reduction of a complex 286-compartment model. A minimum of three compartments, representing soma, secondary dendrites and the glomerular tuft of the primary dendrite, is required to adequately reproduce the behaviour of the full model over a broad range of firing rates. Adding a fourth compartment to represent the shaft of the primary dendrite gives a substantial improvement.
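The few-compartment reduction described in this abstract can be sketched in its simplest, passive form: each compartment obeys a current-balance equation and neighbouring compartments are coupled by conductances. The toy below is not the Davison, Feng and Brown model; it is a passive three-compartment chain (tuft, primary dendrite/soma, secondary dendrites) with invented parameters, integrated by forward Euler.

import numpy as np

C      = np.array([0.1, 0.2, 0.3])            # nF, membrane capacitance per compartment
g_leak = np.array([0.005, 0.010, 0.015])      # uS, leak conductances
E_leak = -65.0                                # mV, leak reversal potential
# symmetric coupling conductances (uS) along the chain 0-1-2
g_c = np.array([[0.00, 0.02, 0.00],
                [0.02, 0.00, 0.03],
                [0.00, 0.03, 0.00]])

def step(V, I_inj, dt=0.05):                  # dt in ms
    """One forward-Euler step of C_i dV_i/dt = -g_leak_i (V_i - E_leak) + sum_j g_c_ij (V_j - V_i) + I_inj_i."""
    coupling = g_c @ V - g_c.sum(axis=1) * V
    dV = (-g_leak * (V - E_leak) + coupling + I_inj) / C
    return V + dt * dV

V = np.full(3, E_leak)
I_inj = np.array([0.02, 0.0, 0.0])            # nA, injected into the tuft compartment
for _ in range(2000):                         # 100 ms of simulated time
    V = step(V, I_inj)
print(V)                                      # steady-state voltages of the three compartments

Reproducing active behaviour such as firing rates, as the abstract describes, would require adding the appropriate voltage-gated currents to each compartment; the passive skeleton above only shows how the coupled compartmental equations are organised.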
[45] Davison A., Feng J., and Brown D. (2000) A reduced compartmental model of the mitral cell for use in network models of the olfactory bulb. Brain Research Bulletin vol. 51, 393-399.

ABSTRACT
We have developed two-, three- and four-compartment models of a mammalian olfactory bulb mitral cell as a reduction of a complex 286-compartment model. A minimum of three compartments, representing the soma, the secondary dendrites and the glomerular tuft of the primary dendrite, is required to adequately reproduce the behaviour of the full model over a broad range of firing rates. Adding a fourth compartment to represent the shaft of the primary dendrite gives a substantial improvement. The reduced models exhibit behaviours in common with the full model which were not used in fitting the model parameters. The reduced models run 75 or more times faster than the full model, making their use in large, realistic network models of the olfactory bulb practical.

[44] Feng J., Brown D., and Li G. (2000) Synchronization due to common pulsed input in Stein's model. Physical Review E vol. 61, 2987-2995.

ABSTRACT
It is known that stimulus-evoked oscillatory synchronisation among neurones occurs in widely separated cortical regions. In this paper we provide a possible mechanism to explain the phenomenon. When a common, random input is presented, we find that a group of neurones (of Stein's integrate-and-fire model type, with or without reversal potentials) are capable of quickly synchronising their firing. Interestingly, the optimal average synchronisation time occurs when the common input has a high CV(ISI) (greater than 0.5), for this model with or without reversal potentials. The model with reversal potentials synchronises more quickly than the model without reversal potentials.

[43] Feng J., and Tirozzi B. (2000) Stochastic resonance tuned by correlations in neuronal models. Phys. Rev. E (in press, April)

ABSTRACT
The idea that neurons might use stochastic resonance (SR) to take advantage of random signals has been extensively discussed in the literature. However, a few key issues have not been clarified, and it is therefore difficult to assess whether SR in neuronal models occurs inside physiologically plausible parameter regions or not. We propose and show that neurons can adjust the correlations between synaptic inputs, which can be measured in experiments and are dynamical variables, to exhibit SR. The benefit of such a mechanism over conventional SR is also discussed.

[42] Feng J., and Brown D. (2000) Impact of correlated inputs on the output of the integrate-and-fire model. Neural Computation vol. 12, 711-732.

ABSTRACT
For the integrate-and-fire model with or without reversal potentials, we consider how correlated inputs affect the variability of cellular output. For both models the variability of efferent spike trains, measured by the coefficient of variation of the interspike interval (abbreviated to CV in the remainder of the paper), is a nondecreasing function of input correlation. When the correlation coefficient is greater than 0.09, the CV of the integrate-and-fire model without reversal potentials is always above 0.5, no matter how strong the inhibitory inputs. When the correlation coefficient is greater than 0.05, the CV of the integrate-and-fire model with reversal potentials is always above 0.5, independent of the strength of the inhibitory inputs. Under a given condition on correlation coefficients we find that correlated Poisson processes can be decomposed into independent Poisson processes. We also develop a novel method to estimate the distribution density of the first passage time of the integrate-and-fire model.

[41] Feng J., Georgii H.O., and Brown D. (2000) Convergence to global minima for a class of diffusion processes. Physica A vol. 276, 465-476.

ABSTRACT
We prove that there exists a gain function $(\eta(t),\beta(t))_{t\ge 0}$ such that the solution of the SDE $dx_t=\eta(t)(-\mbox{ grad } U(x_t)dt +\beta(t)dB_t)$ 'settles' down on the set of global minima of $U$. In particular, the existence of a gain function $(\eta(t))_{t\ge 0}$ such that $y_t$ satisfying $dy_t=\eta(t)(-\mbox{ grad } U(y_t)dt +dB_t)$ converges to the set of global minima of $U$ is verified. Then we apply the results to the Robbins-Monro and the Kiefer-Wolfowitz procedures, which are of particular interest in statistics and neural networks.

With best regards,

Jianfeng Feng
The Babraham Institute
Cambridge CB2 4AT
UK
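To make the result stated in [41] above concrete, here is a small Python/NumPy sketch of an annealed Langevin diffusion on a one-dimensional double-well potential: a gain eta(t) and a noise level beta(t) are slowly decayed, and the trajectory typically settles near the deeper well even when started in the shallower one. The potential U and the decay schedules below are arbitrary choices for the demo, not the gain functions constructed in the paper.

import numpy as np

rng = np.random.default_rng(0)

U     = lambda x: 0.25 * x**4 - x**2 + 0.5 * x    # double well: global minimum near x = -1.5
gradU = lambda x: x**3 - 2.0 * x + 0.5
eta   = lambda t: 1.0 / (1.0 + 0.01 * t)          # slowly decaying gain (demo schedule)
beta  = lambda t: 2.0 / np.log(2.0 + t)           # slowly annealed noise level (demo schedule)

dt, steps = 1e-3, 100_000
x = 1.2                                           # start in the shallower (local) well
for k in range(steps):
    t = k * dt
    x += eta(t) * (-gradU(x) * dt + beta(t) * np.sqrt(dt) * rng.standard_normal())

print(f"final x = {x:.2f}, U(x) = {U(x):.2f}")    # usually ends near the deeper well at x ~ -1.5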
From cl at andrew.cmu.edu Mon Mar 27 10:27:00 2000
From: cl at andrew.cmu.edu (Christian Lebiere)
Date: Mon, 27 Mar 2000 10:27:00 -0500
Subject: ACT-R Summer School and Workshop
Message-ID: <50723163.3163141620@blubber.psy.cmu.edu>

[** Final reminder: the summer school application deadline is April 1st. **]

SEVENTH ANNUAL ACT-R SUMMER SCHOOL AND WORKSHOP
===============================================
Carnegie Mellon University - July/August 2000
=============================================

ACT-R is a hybrid cognitive theory and simulation system for developing cognitive models for tasks that vary from simple reaction time to air traffic control. The most recent advances of the ACT-R theory were detailed in the book "The Atomic Components of Thought" by John R. Anderson and Christian Lebiere, published in 1998 by Lawrence Erlbaum Associates. Each year, a two-week summer school is held to train researchers in the use of the ACT-R system, followed by a three-day workshop to enable new and current users to exchange research results and ideas. The Seventh Annual ACT-R Summer School and Workshop will be held at Carnegie Mellon University in Pittsburgh in July/August 2000.

SUMMER SCHOOL: The summer school will take place from Monday July 24 to Friday August 4, with the intervening Sunday free. This intensive 11-day course is designed to train researchers in the use of ACT-R for cognitive modeling. It is structured as a set of 8 units, with each unit lasting a day and involving a morning theory lecture, a web-based tutorial, an afternoon discussion session and a homework assignment which students are expected to complete during the day and evening. The final three days of the summer school will be devoted to individual research projects. Computing facilities for the tutorials, assignments and research projects will be provided. Successful student projects will be presented at the workshop, which all summer school students are expected to attend as part of their training. To provide an optimal learning environment, admission is limited to a dozen participants, who must submit by APRIL 1 an application consisting of a curriculum vitae, a statement of purpose and a one-page description of the data set that they intend to model as their research project. The data set can be the applicant's own or can be taken from the published literature. Applicants will be notified of admission by APRIL 15. Admission to the summer school is free. A stipend of up to $750 is available to graduate students for reimbursement of travel, housing and meal expenses. To qualify for the stipend, students must be US citizens and include with their application a letter of reference from a faculty member.

WORKSHOP: The workshop will take place from the morning of Saturday August 5 to Monday August 7 at noon. Mornings will be devoted to research presentations, each lasting about 20 minutes plus questions. Participants are invited to present their ACT-R research by submitting a one-page abstract with their registration.
Informal contributions of up to 8 pages can be submitted by August 1 for inclusion in the workshop proceedings. Afternoons will feature more research presentations as well as discussion sessions and instructional tutorials. Suggestions for the topics of the tutorials and discussion sessions are welcome. Evenings will be occupied by demonstration sessions, during which participants can gain a more detailed knowledge of the models presented and engage in unstructured discussions. Admission to the workshop is open to all. The early registration fee (before July 1) is $100 and the late registration fee (after July 1) is $125. A registration form is appended below. Additional information (detailed schedule, etc.) will appear on the ACT-R Web site (http://act.psy.cmu.edu/) when available or can be requested at: 2000 ACT-R Summer School and Workshop Psychology Department Attn: Helen Borek Baker Hall 345C Fax: +1 (412) 268-2844 Carnegie Mellon University Tel: +1 (412) 268-3438 Pittsburgh, PA 15213-3890 Email: helen+ at cmu.edu ________________________________________________________ Seventh Annual ACT-R Summer School and Workshop July 24 to August 7, 2000 at Carnegie Mellon University in Pittsburgh REGISTRATION ============ Name: .................................................................. Address: .................................................................. .................................................................. .................................................................. Tel/Fax: .................................................................. Email: .................................................................. Summer School (July 24 to August 4): ........ (check here to apply) ==================================== Applications are due APRIL 1. Acceptance will be notified by APRIL 15. Applicants MUST include a curriculum vitae, a short statement of purpose and a one-page description of the data set that they intend to model. A stipend of up to $750 is available for the reimbursement of travel, lodging and meal expenses (receipts needed). To qualify for the stipend, the applicant must be a graduate student with US citizenship and include with the application a letter of reference from a faculty member. Check here to apply for stipend: ........ Workshop (August 5 to 7): ........ (check here to register) ========================= Presentation topic (optional - include one-page abstract with registration): ......................................................................... Registration fee: Before July 1: $100 ... After July 1: $125 ... The fee is due upon registration. Please send checks or money orders only. We cannot accept credit cards. HOUSING ======= Housing is available in Resnick House, a CMU dormitory that offers suite-style accommodations. Rooms include air-conditioning, a semi-private bathroom and a common living room for suite-mates. Last year's rates were $180.75/week/person or $32.60/night/person for single rooms and $134.25/week/person or $24.25/night/person for double rooms. Housing reservations will be taken after acceptance to the summer school. Do not send money. See http://www.housing.cmu.edu/conferences/ for further housing information. To reserve a room in Resnick House, fill in the dates and select one of the three room options: I will stay from ................ to ................ 1. ... I want a single room 2. ... I want a double room and I will room with ................ 3. ... I want a double room. 
Please select a roommate of ....... gender ROOM PAYMENT IS DUE UPON CHECK-IN. DO NOT SEND MONEY. The recommended hotel is the Holiday Inn University Center, located on the campus of the University of Pittsburgh within easy walking distance of CMU. Contact the Holiday Inn directly at +1 (412) 682-6200. Send this form to: 2000 ACT-R Summer School and Workshop Psychology Department Attn: Helen Borek Baker Hall 345C Fax: +1 (412) 268-2844 Carnegie Mellon University Tel: +1 (412) 268-3438 Pittsburgh, PA 15213-3890 Email: helen+ at cmu.edu From vdavidsanchez at earthlink.net Sun Mar 26 12:18:54 2000 From: vdavidsanchez at earthlink.net (V. David Sanchez A.) Date: Sun, 26 Mar 2000 09:18:54 -0800 Subject: NEUROCOMPUTING - new address Message-ID: <38DE467E.D1B31C43@earthlink.net> Ladies and gentlemen, please send all submissions and inquiries to the following new address: Advanced Computational Intelligent Systems Attn.: V. David Sanchez A. NEUROCOMPUTING - Editor in Chief - P.O. Box 60130 Pasadena, CA 91116-6130 U.S.A. Fax: (626) 793-5120 Email: vdavidsanchez at earthlink.net URL: http://www.elsevier.nl/locate/neucom From john at dcs.rhbnc.ac.uk Tue Mar 28 02:43:45 2000 From: john at dcs.rhbnc.ac.uk (John Shawe-Taylor) Date: Tue, 28 Mar 2000 08:43:45 +0100 Subject: PhD Studentships available Message-ID: <200003280743.IAA28309@platon.cs.rhbnc.ac.uk> PhD Studentships available at Royal Holloway, University of London Royal Holloway, University of London Department of Computer Science currently has the following FUNDED PhD STUDENTSHIPS available: - THREE FULLY-FUNDED COLLEGE RESEARCH STUDENTSHIPS covering tuition fees at Home and EU rates and maintenance for 3 years - TWO FEES-ONLY COLLEGE RESEARCH STUDENTSHIPS covering tuition fees at Home and EU rates - EPSRC RESEARCH STUDENTSHIP covering tuition fees at Home and EU rates and maintenance for 3 years The above studentships are available in any area of the Department's research interests: Computational Learning, Kolmogorov Complexity, Bioinformatics, Formal methods, Languages and Architectures, and Constraint Satisfaction. The department currently has the following FUNDED MSc STUDENTSHIP available: - COLLEGE MASTERS STUDENTSHIP covering tuition fees at HEU rates, available to students taking MSc in Computer Science by Research Closing date for applications for studentships is 2 June 2000. Further information and application forms for these postgraduate degree programmes may be obtained from the Director of Graduate Studies Steve Schneider at S.Schneider at dcs.rhbnc.ac.uk From Marc.VanHulle at med.kuleuven.ac.be Wed Mar 29 02:16:16 2000 From: Marc.VanHulle at med.kuleuven.ac.be (Marc Van Hulle) Date: Wed, 29 Mar 2000 09:16:16 +0200 Subject: book announcement: Self-Organization and Topographic Maps Message-ID: <38E1ADBF.7E1765C@neuro.kuleuven.ac.be> NEW BOOK ON SELF-ORGANIZATION AND TOPOGRAPHIC MAPS ================================================== Title: Faithful Representations and Topographic Maps From Distortion- to Information-based Self-organization Author: Marc M. Van Hulle Publisher: J. Wiley & Sons, Inc. Publication Date: February 2000 with forewords by Teuvo Kohonen and Helge Ritter ------------------------------------------------------------------------------- A new perspective on topographic map formation and the advantages of information-based learning The study of topographic map formation provides us with important tools for both biological modeling and statistical data modeling. 
Faithful Representations and Topographic Maps offers a unified, systematic survey of this rapidly evolving field, focusing on current knowledge and available techniques for topographic map formation. The author presents a cutting-edge, information-based learning strategy for developing equiprobabilistic topographic maps, that is, maps in which all neurons have an equal probability of being active, clearly demonstrating how this approach yields faithful representations and how it can be successfully applied in such areas as density estimation, regression, clustering, and feature extraction.

The book begins with the standard approach of distortion-based learning, discussing the commonly used Self-Organizing Map (SOM) algorithm and other algorithms, and pointing out their inadequacy for developing equiprobabilistic maps. It then examines the advantages of information-based learning techniques, and finally introduces a new algorithm for equiprobabilistic topographic map formation using neurons with kernel-based response characteristics. The complete learning algorithms and simulation details are given throughout, along with comparative performance analysis tables and extensive references.

Faithful Representations and Topographic Maps is an excellent, eye-opening guide for neural network researchers, industrial scientists involved in data mining, and anyone interested in self-organization and topographic maps.

-------------------------------------------------------------------------------
"I am convinced that this book marks an important contribution to the field of topographic map representations and that it will become a major reference for many years." (Ritter)

"This book will provide a significant contribution to our theoretical understanding of the brain." (Kohonen)
-------------------------------------------------------------------------------

http://www.amazon.com/exec/obidos/ASIN/0471345075/qid=948382599/sr=1-1/002-0713799-7248240
http://www.barnesandnoble.com/ search for (Keyword): Faithful representations

From labbi at cui.unige.ch Thu Mar 30 07:09:57 2000
From: labbi at cui.unige.ch (A.R. Labbi)
Date: Thu, 30 Mar 2000 14:09:57 +0200
Subject: Postdoc. in Machine Learning/ Statistical Modeling in Geneva, Switzerland
References: <10003231552.ZM15574@minster.cs.york.ac.uk>
Message-ID: <38E34415.EA3D8BD@cui.unige.ch>

POSTDOC at the CSD of the University of Geneva - Switzerland
-----------------------------------------------------

The Computer Science Department of the University of Geneva (Switzerland) is seeking a postdoctoral fellow to participate in two Swiss-NSF funded research projects on biomedical data analysis for visual object recognition and textual data analysis for document categorization. The two projects have strong overlap on the computational side since they both address common machine learning and statistical modeling issues such as classification and clustering (for more details, please visit: http://cuiwww.unige.ch/~labbi ).

Candidates for this position should have a strong background in machine learning or statistical modeling, as well as sound experience in statistical image processing and an interest in statistical text processing (or vice versa). The candidate who fills the position will work in a multidisciplinary environment where computer scientists, mathematicians, biologists, and computational linguists collaborate in stimulating research projects.
Therefore, the postdoc will have the opportunity to develop new interests and initiate new research directions in the department. The position is to be filled as soon as possible, and is initially for one year, extendable to another year or possibly more.

If you have a Ph.D. in applied mathematics, computer science, or a related field, have programming experience (e.g. Matlab, C/C++ and/or Java), and are interested in joining an attractive multi-cultural work environment with modern computing and teaching facilities, please mail, e-mail, or fax your résumé with the names and addresses (including e-mails) of two references to:

Dr. Abderrahim Labbi or Prof. Christian Pellegrini
Dept. of Computer Science
University of Geneva
24, rue du General Dufour
1204 Geneva - Switzerland
Fax: +41 22 705 77 80
E-mail: {Abderrahim.Labbi or Christian.Pellegrini}@cui.unige.ch

From cweber at cs.tu-berlin.de Thu Mar 30 18:27:57 2000
From: cweber at cs.tu-berlin.de (Cornelius Weber)
Date: Fri, 31 Mar 2000 01:27:57 +0200 (MET DST)
Subject: Paper available
Message-ID:

The following IJCNN'00 paper is accepted and available on-line. I would be happy to receive any feedback.

Structured models from structured data: emergence of modular information processing within one sheet of neurons

Abstract: In our contribution we investigate how structured information processing within a neural net can emerge as a result of unsupervised learning from data. Our model consists of input neurons and hidden neurons which are recurrently connected and which represent the thalamus and the cortex, respectively. On the basis of a maximum likelihood framework, the task is to generate given input data using the code of the hidden units. Hidden neurons are fully connected, allowing them to play different roles within the unfolding time-dynamics of this data generation process. One parameter, which is related to the sparsity of neuronal activation, varies across the hidden neurons. As a result of training, the net captures the structure of the data generation process. Trained on data which are generated by different mechanisms acting in parallel, the more active neurons will code for the more frequent input features. Trained on hierarchically generated data, the more active neurons will code on the higher level, where each feature integrates several lower-level features. The results imply that the division of the cortex into laterally and hierarchically organized areas can evolve to a certain degree as an adaptation to the environment.

Retrieve from: http://www.cs.tu-berlin.de/~cweber/publications/ (6 pages, 230 KB)

From tewon at salk.edu Thu Mar 30 21:29:18 2000
From: tewon at salk.edu (Te-Won Lee)
Date: Thu, 30 Mar 2000 18:29:18 -0800
Subject: Postdoctoral Positions Available
Message-ID: <38E40D7E.7C8391C2@salk.edu>

University of California, San Diego
Institute for Neural Computation

Applications are invited for postdoctoral fellowships in the fields of signal & image processing, neural networks, pattern recognition and machine learning.

Position: Post-doctoral research associate
Organization: Institute for Neural Computation, University of California, San Diego
Faculty: Te-Won Lee
Funding Period: Minimum 2 years available
Location: San Diego, CA
Deadline: Open until filled
Title: Intelligent Sound and Image Processing Systems

The goal of this project is to develop software that can process sounds and images in a more humanlike fashion so that a recognition system can work robustly in real-world environments.
Recent advances in data processing technologies have led to several human computer interface applications such as automatic speech recognition systems and visual object recognition systems. Although some commercial products are currently available, the performance of those systems usually degrades substantially under real-world conditions. For example, a speech recognition system in an automobile may process voice commands when spoken in a quiet situation, but the recognition performance may be unacceptable in the presence of interfering sounds such as car engine noise, music, and other voices in the background. In contrast, humans are able to recognize speech under very noisy conditions.

The goal of the project is to develop software that can enhance the sound or image signal using a newly developed technique called ICA (independent component analysis). ICA is able to separate sounds or images when the environment has mixed them. This software is an important step in advancing computer systems closer to humanlike performance and hence will make human-machine interaction more natural. Furthermore, the software will enhance the communication between the sending and receiving units in a noisy environment, where the receiver can get clean audio-visual information.

Qualifications: Background in signal processing, image processing, machine learning algorithms, neural networks and pattern recognition is desirable. Matlab and C programming skills are required.

Our Lab: Successful candidates will join the Neuroengineering Laboratory at the Institute for Neural Computation. The Institute is an interdisciplinary organized research unit at UCSD directed by Terrence J. Sejnowski. The 45 faculty members in the Institute represent 14 research disciplines, including neuroscience, visual science, cognitive science, mathematics, economics and social science, and computer engineering, and address the twin scientific and engineering challenges of understanding how humans function at the neural and cognitive levels and of solving major technological problems related to neural network implementations (see http://inc.ucsd.edu/). The Neuroengineering Laboratory currently consists of three research faculty who work with postdoctoral fellows, graduate students and visiting scientists, in collaboration with other researchers in electrical and computer engineering, cognitive science, neurosciences, as well as with industry.

Contact Info: If interested, please send your curriculum vitae, list of publications and the names, addresses, and phone numbers of three references to:

Te-Won Lee, Ph.D.
Institute for Neural Computation
University of California, San Diego
La Jolla, CA 92093-0523
Tel: 858-453-4100 x1527
Fax: 858-587-0417
http://www.cnl.salk.edu/~tewon
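As a toy illustration of the blind source separation idea mentioned in the posting above, the following Python/NumPy sketch mixes two synthetic "sources" and then unmixes them with a FastICA-style fixed-point iteration (tanh nonlinearity, symmetric decorrelation), one standard way of fitting ICA. The sources, the mixing matrix and the estimator are invented for the demo; they are not the project's actual methods or data.

import numpy as np

rng = np.random.default_rng(0)

# Two toy "sound" sources and an unknown mixing matrix (both invented for the demo).
t = np.linspace(0, 8, 4000)
S = np.vstack([np.sin(7.0 * t),                      # tonal source
               np.sign(np.sin(3.0 * t))])            # square-wave source
A = np.array([[0.60, 0.40],
              [0.45, 0.55]])                         # unknown mixing
X = A @ S                                            # observed mixtures

# Centre and whiten the mixtures.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Xw = E @ np.diag(d ** -0.5) @ E.T @ X

# FastICA-style fixed-point iteration with symmetric decorrelation.
W = rng.standard_normal((2, 2))
for _ in range(200):
    WX = W @ Xw
    G = np.tanh(WX)
    Gp = 1.0 - G ** 2
    W = (G @ Xw.T) / Xw.shape[1] - np.diag(Gp.mean(axis=1)) @ W
    U_, _, Vt = np.linalg.svd(W)
    W = U_ @ Vt                                      # re-orthogonalise the unmixing matrix

S_hat = W @ Xw                                       # recovered sources (up to order, scale and sign)
for i in range(2):
    corr = max(abs(np.corrcoef(S_hat[i], S[j])[0, 1]) for j in range(2))
    print(f"component {i}: best |correlation| with a true source = {corr:.2f}")

When the unmixing has succeeded, each recovered component correlates almost perfectly with one of the original sources, even though the algorithm only ever saw the mixtures.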
From Nello.Cristianini at bristol.ac.uk Fri Mar 31 11:04:02 2000
From: Nello.Cristianini at bristol.ac.uk (N Cristianini)
Date: Fri, 31 Mar 2000 17:04:02 +0100 (BST)
Subject: Available Now: Support Vector Book
Message-ID:

The Support Vector Book is now distributed and available (see http://www.support-vector.net for details).

AN INTRODUCTION TO SUPPORT VECTOR MACHINES
(and other kernel-based learning methods)
N. Cristianini and J. Shawe-Taylor
Cambridge University Press, 2000
ISBN: 0 521 78019 5
http://www.support-vector.net

Contents - Overview
1 The Learning Methodology
2 Linear Learning Machines
3 Kernel-Induced Feature Spaces
4 Generalisation Theory
5 Optimisation Theory
6 Support Vector Machines
7 Implementation Techniques
8 Applications of Support Vector Machines
Pseudocode for the SMO Algorithm
Background Mathematics
References
Index

Description

This book is the first comprehensive introduction to Support Vector Machines (SVMs), a new generation of learning systems based on recent advances in statistical learning theory. The book also introduces Bayesian analysis of learning and relates SVMs to Gaussian Processes and other kernel-based learning methods. SVMs deliver state-of-the-art performance in real-world applications such as text categorisation, hand-written character recognition, image classification, biosequence analysis, etc. Their introduction in the early 1990s led to an explosion of applications and deepening theoretical analysis that has now established Support Vector Machines, along with neural networks, as one of the standard tools for machine learning and data mining.

Students will find the book both stimulating and accessible, while practitioners will be guided smoothly through the material required for a good grasp of the theory and application of these techniques. The concepts are introduced gradually in accessible and self-contained stages, though in each stage the presentation is rigorous and thorough. Pointers to relevant literature and web sites containing software ensure that it forms an ideal starting point for further study. These are also available on-line through an associated web site, www.support-vector.net, which will be kept updated with pointers to new literature, applications, and on-line software.
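For readers who want a feel for what such a kernel-based learner looks like in code, here is a deliberately simple Python/NumPy sketch: the soft-margin SVM dual with the bias absorbed into the kernel, fitted by projected gradient ascent on a toy two-class problem. The data, kernel parameters and solver are all invented for the demo; the book itself covers proper training algorithms such as SMO.

import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data: two Gaussian blobs (invented for the demo).
n = 40
X = np.vstack([rng.normal(loc=[-1.5, -1.5], scale=0.7, size=(n, 2)),
               rng.normal(loc=[+1.5, +1.5], scale=0.7, size=(n, 2))])
y = np.hstack([-np.ones(n), np.ones(n)])

def rbf(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between the rows of A and the rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Maximise W(alpha) = sum(alpha) - 0.5 * alpha' Q alpha subject to 0 <= alpha <= C,
# where Q_ij = y_i y_j (K_ij + 1); adding 1 to the kernel absorbs the bias term.
C, lr, iters = 1.0, 1e-3, 5000
Q = (y[:, None] * y[None, :]) * (rbf(X, X) + 1.0)
alpha = np.zeros(len(y))
for _ in range(iters):
    alpha = np.clip(alpha + lr * (1.0 - Q @ alpha), 0.0, C)   # projected gradient step

def predict(X_new):
    return np.sign((alpha * y) @ (rbf(X, X_new) + 1.0))

print(f"training accuracy = {(predict(X) == y).mean():.2f}, "
      f"support vectors = {int((alpha > 1e-6).sum())}/{len(y)}")

Only the training points with non-zero alpha (the support vectors) contribute to the decision function, which is the property that keeps kernel machines of this kind sparse.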
From saadd at aston.ac.uk Fri Mar 31 10:48:37 2000
From: saadd at aston.ac.uk (David Saad)
Date: Fri, 31 Mar 2000 15:48:37 +0000
Subject: Postdoctoral Research Fellowship
Message-ID: <38E4C8D5.D4001F14@aston.ac.uk>

Neural Computing Research Group
-------------------------------
School of Engineering and Applied Sciences
Aston University, Birmingham, UK

POSTDOCTORAL RESEARCH FELLOWSHIP
--------------------------------
Irregular Gallager-type error-correcting codes - a statistical mechanics perspective
-------------------------------------------------------------------------------------

*** Full details at http://www.ncrg.aston.ac.uk/ ***

The Neural Computing Research Group at Aston is looking for a highly motivated individual for a 2-year postdoctoral research position in the area of `Irregular Gallager-type error-correcting codes - a statistical mechanics perspective'. The emphasis of the research will be on applying theoretical and numerical methods to study the properties of Gallager-type error-correcting codes, with the aim of systematically identifying optimal constructions of this type. Potential candidates should have strong mathematical and computational skills, with a background in statistical physics and/or error-correcting codes.

Conditions of Service
---------------------
Salaries will be up to point 6 on the RA 1A scale, currently 18,185 UK pounds. The salary scale is subject to annual increments.

How to Apply
------------
If you wish to be considered for this Fellowship, please send a full CV and publications list, including full details and grades of academic qualifications, together with the names of 3 referees, to:

Prof. David Saad
Neural Computing Research Group
School of Engineering and Applied Sciences
Aston University
Birmingham B4 7ET, U.K.
Tel: 0121 333 4631
Fax: 0121 333 4586
e-mail: D.Saad at aston.ac.uk

E-mail submission of postscript files is welcome.

Closing date: 28.4.2000

----------------------------------------------------------------------
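For readers unfamiliar with the codes mentioned in the advertisement above, the short Python/NumPy sketch below builds a tiny sparse parity-check matrix in the spirit of Gallager's original regular construction and decodes single-bit errors with simple hard-decision bit flipping. The sizes, the construction details and the decoder are toy choices for illustration only; they have nothing to do with the irregular constructions or the statistical mechanics analysis studied in the project.

import numpy as np

rng = np.random.default_rng(0)

# Gallager-style construction: wc stacked bands, each a random column permutation
# of a band whose rows contain wr consecutive ones (toy sizes for the demo).
n, wr, wc = 16, 4, 3
band = np.zeros((n // wr, n), dtype=int)
for i in range(n // wr):
    band[i, i * wr:(i + 1) * wr] = 1
H = np.vstack([band[:, rng.permutation(n)] for _ in range(wc)])   # 12 x 16 parity-check matrix

def bit_flip_decode(H, received, max_iter=30):
    """Hard-decision bit-flipping decoding: repeatedly flip the bits that sit in the most failing checks."""
    x = received.copy()
    for _ in range(max_iter):
        syndrome = H @ x % 2
        if not syndrome.any():
            break                           # all parity checks satisfied
        unsat = H.T @ syndrome              # for each bit, the number of failing checks it touches
        x[unsat == unsat.max()] ^= 1        # flip the most suspicious bit(s)
    return x

# The all-zeros word is a codeword of any linear code, so test recovery from single-bit errors.
codeword = np.zeros(n, dtype=int)
corrected = 0
for e in range(n):
    received = codeword.copy()
    received[e] ^= 1                        # one channel error
    corrected += int((bit_flip_decode(H, received) == codeword).all())
print(f"single-bit errors corrected: {corrected}/{n}")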