From steve at cns.bu.edu Sun Mar 3 13:14:04 2002 From: steve at cns.bu.edu (Stephen Grossberg) Date: Sun, 3 Mar 2002 13:14:04 -0500 Subject: postdoc in neural modeling of vision and recognition Message-ID: POSTDOCTORAL FELLOW DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS BOSTON UNIVERSITY A postdoctoral fellow is sought to join the Boston University Department of Cognitive and Neural Systems (CNS) and Center for Adaptive Systems (CAS), which model how the brain controls behavior, and how technology can emulate biological intelligence. The fellowship is available immediately for a minimum term of two years. The postdoc would help to develop models of biological vision and object recognition, as well as to adapt these discoveries to image processing applications. Recent research in CNS proposes how the laminar architecture of cerebral cortex leads to biological intelligence. These models have clarified how bottom-up, top-down, and horizontal interactions work together in the visual cortex to control development, learning, attention, perceptual grouping, stereopsis, and 3-D surface perception. New projects will further develop these topics with applications to problems in figure-ground perception, recognition learning and categorization, scene understanding, and cognitive information processing. The postdoc will collaborate with Professors Gail Carpenter and Stephen Grossberg on basic research and technology transfer efforts. Candidates should have substantial training in developing neural network models of biomimetic vision/image processing and/or adaptive pattern recognition. CNS and CAS offer excellent opportunities for broadening knowledge of biological and technological neural modeling and applications, as summarized at http://www.cns.bu.edu. Boston University is an Equal Opportunity/Affirmative Action Employer. 
Please send a curriculum vitae, 3 letters of recommendation, and illustrative research articles to: Postdoctoral Search, Room 203, Department of Cognitive and Neural Systems, Boston University, 677 Beacon Street, Boston, MA 02215. From sok at cs.york.ac.uk Mon Mar 4 08:21:56 2002 From: sok at cs.york.ac.uk (Simon O'Keefe) Date: Mon, 04 Mar 2002 13:21:56 +0000 Subject: PhD position - binary neural networks Message-ID: <3C8374F4.BD7F047A@cs.york.ac.uk> PhD research position available - Satellite Image Classification by Binary Neural Networks Dr Simon O'Keefe Advanced Computer Architectures Group Department of Computer Science University of York http://www.cs.york.ac.uk/arch The Advanced Computer Architectures Group invites applications for an EPSRC-funded studentship to study the application of the AURA neural network (http://www.cs.york.ac.uk/arch/nn/aura.html) to the analysis of satellite images. The studentship is available immediately. The principal area of research is pixel classification in satellite images, concentrating on the methodological issues surrounding representation of data to the network, and the characterisation of the network's generalisation behaviour. Demonstration of the classification abilities of the network is as important as the development of the theoretical framework, and test data will be available from a number of sources, including satellite images as well as other types of data. Applicants should have a good first degree in computer science, mathematics/statistics or a related discipline. Informal queries should be addressed to Dr Simon O'Keefe (email: sok at cs.york.ac.uk), but note that formal applications must be received by 18th March 2002. Applicants should contact the Graduate Secretary at the address below for an application form. 
Graduate Secretary Department of Computer Science University of York Heslington YORK YO10 5DD United Kingdom Tel: +44 1904 432721 E-mail: grad-secretary at cs.york.ac.uk This studentship covers maintenance, and tuition at the UK/EU rate. Applicants from outside the EU will need to pay the balance of the fees. Funding rules dictate that the start date for the studentship must be NO LATER THAN 1st MAY 2002. Because of the short timescales, applications should be returned DIRECTLY TO THE GRADUATE SECRETARY AND NOT VIA THE UNIVERSITY ADMISSIONS OFFICE. Applications MUST be received by 18th MARCH 2002. It is anticipated that interviews will be held in the middle of April 2002. -- _________________________________________________________________________ Dr Simon O'Keefe PHONE: +44 1904 432762 EMAIL: sok at cs.york.ac.uk Department of Computer Science, University of York, York, YO10 5DD (U.K.) From dr at tedlab.mit.edu Mon Mar 4 14:41:50 2002 From: dr at tedlab.mit.edu (Douglas Rohde) Date: Mon, 04 Mar 2002 14:41:50 -0500 Subject: Ph.D. Thesis Announcement Message-ID: <3C83CDFE.4000703@tedlab.mit.edu> You may be interested in the availability of my recently completed Ph.D. thesis, entitled, "A Connectionist Model of Sentence Comprehension and Production." I have included an abstract and a summary of the table of contents below. The thesis can be downloaded in .ps or .pdf format at this site: http://tedlab.mit.edu/~dr/Thesis/ Although the thesis is a bit long, you may wish to take a look at the Introduction and the Discussion to see if you might find anything of interest in between. Cheers, Doug Rohde ------------------------------------------------------ A Connectionist Model of Sentence Comprehension and Production Douglas L. T. Rohde Carnegie Mellon University, Dept. of Computer Science and the Center for the Neural Basis of Cognition Thesis Committee: Dr. David C. Plaut, Chair Dr. James L. McClelland Dr. David S. Touretzky Dr. Maryellen C. 
MacDonald Abstract: The predominant language processing theories have, for some time, been based largely on structured knowledge and relatively simple rules. These symbolic models intentionally segregate syntactic processing from statistical information as well as from semantic, pragmatic, and discourse influences, thereby minimizing the importance of these potential constraints in learning and processing language. While such models have the advantage of being relatively simple and explicit, they are inadequate to account for learning and for a range of well-documented ambiguity resolution phenomena. In recent years, interactive constraint-based theories of sentence processing have gained increasing support, as a growing body of empirical evidence demonstrates early influences of various factors on comprehension performance. Connectionist networks are one form of model that naturally reflects many properties of constraint-based theories, and thus provide a form in which those theories may be instantiated. Unfortunately, most of the connectionist language models implemented until now have involved severe limitations, restricting the phenomena they could address. Comprehension and production models have, by and large, been limited to simple sentences with small vocabularies (St. John & McClelland, 1990). Most models that have addressed the problem of complex, multi-clausal sentence processing have been prediction networks (Elman, 1991; Christiansen & Chater, 1999). Although a useful component of a language processing system, prediction does not get at the heart of language: the interface between syntax and semantics. The current thesis focuses on the design and testing of the Connectionist Sentence Comprehension and Production (CSCP) model, a recurrent neural network that has been trained to both comprehend and produce a relatively complex subset of English. 
This language includes such features as tense and number, adjectives and adverbs, prepositional phrases, relative clauses, subordinate clauses, and sentential complements, with a vocabulary of about 300 total words. It is broad enough to permit the model to address a wide range of sentence processing phenomena. The experiments reported here involve such issues as the relative comprehensibility of various sentence types, the resolution of lexical ambiguities, generalization to novel sentences, the comprehension of main verb/reduced relative, sentential complement, subordinate clause, and prepositional phrase attachment ambiguities, agreement attraction and other production errors, and structural priming. The model is able to replicate many key aspects of human sentence processing across these domains, including sensitivity to lexical and structural frequencies, semantic plausibility, inflectional morphology, and locality effects. A critical feature of the model is its suggestion of a tight coupling between comprehension and production and the idea that language production is primarily learned through the formulation and testing of covert predictions during comprehension. I believe this work represents a major advance in the attested ability of connectionist networks to process natural language and a significant step towards a more complete understanding of the human language faculty. ------------------------------------------------------
Contents
1 Introduction
1.1 Why implement models?
1.2 Properties of human language processing
1.3 Properties of symbolic models
1.4 Properties of connectionist models
1.5 The CSCP model
1.6 Chapter overview
2 An Overview of Connectionist Sentence Processing
2.1 Parsing
2.2 Comprehension
2.3 Word prediction
2.4 Production
2.5 Other language processing models
3 Empirical Studies of Sentence Processing
3.1 Introduction
3.2 Relative clauses
3.3 Main verb/reduced-relative ambiguities
3.4 Sentential complements
3.5 Subordinate clauses
3.6 Prepositional phrase attachment
3.7 Effects of discourse context
3.8 Production
3.9 Summary of empirical findings
4 Analysis of Syntax Statistics in Parsed Corpora
4.1 Extracting syntax statistics isn't easy
4.2 Verb phrases
4.3 Relative clauses
4.4 Sentential noun phrases
4.5 Determiners and adjectives
4.6 Prepositional phrases
4.7 Coordination and subordination
4.8 Conclusion
5 The Penglish Language
5.1 Language features
5.2 Penglish grammar
5.3 The lexicon
5.4 Phonology
5.5 Semantics
5.6 Statistics
6 The CSCP Model
6.1 Basic architecture
6.2 The semantic system
6.3 The comprehension, prediction, and production system
6.4 Training
6.5 Testing
6.6 Claims and limitations of the model
7 General Comprehension Results
7.1 Overall performance
7.2 Representation
7.3 Experiment 2: Comparison of sentence types
7.4 Lexical ambiguity
7.5 Experiment 4: Adverbial attachment
7.6 Experiment 5: Prepositional phrase attachment
7.7 Reading time
7.8 Individual differences
8 The Main Verb/Reduced Relative Ambiguity
8.1 Empirical results
8.2 Experiment 6
8.3 Verb frequency effects
8.4 Summary
9 The Sentential Complement Ambiguity
9.1 Empirical results
9.2 Experiment 7
9.3 Summary
10 The Subordinate Clause Ambiguity
10.1 Empirical results
10.2 Experiment 8
10.3 Experiment 9
10.4 Experiment 10: Incomplete reanalysis
10.5 Summary and discussion
11 Relative Clauses
11.1 Empirical results
11.2 Experiment 11
11.3 Discussion
12 Production
12.1 Word-by-word production
12.2 Free production
12.3 Agreement attraction
12.4 Structural priming
12.5 Summary
13 Discussion
13.1 Summary of results
13.2 Accomplishments of the model
13.3 Problems with the model
13.4 Model versus theory
13.5 Properties, principles, processes
13.6 Conclusion
Appendices
A Lens: The Light, Efficient Network Simulator
A.1 Performance benchmarks
A.2 Optimizations
A.3 Parallel training
A.4 Customization
A.5 Interface
A.6 Conclusion
B SLG: The Simple Language Generator
B.1 The grammar
B.2 Resolving the grammar
B.3 Minimizing the grammar
B.4 Parsing
B.5 Word prediction
B.6 Conclusion
C TGrep2: A Tool for Searching Parsed Corpora
C.1 Preparing corpora
C.2 Command-line arguments
C.3 Specifying patterns
C.4 Controlling the output
C.5 Differences from TGrep
D Details of the Penglish Language
D.1 The Penglish SLG grammar
D.2 The Penglish lexicon
------------------------------------------------------ From Johan.Suykens at esat.kuleuven.ac.be Tue Mar 5 04:32:45 2002 From: Johan.Suykens at esat.kuleuven.ac.be (Johan Suykens) Date: Tue, 5 Mar 2002 10:32:45 +0100 (MET) Subject: NATO-ASI on Learning Theory and Practice - Last call for participation Message-ID: <200203050932.KAA29925@euler.esat.kuleuven.ac.be> Last call for participation ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ NATO Advanced Study Institute on Learning Theory and Practice (NATO-ASI LTP 2002) July 8-19 2002 - K.U. Leuven, Belgium http://www.esat.kuleuven.ac.be/sista/natoasi/ltp2002.html ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ -General Objective- This NATO Advanced Study Institute on Learning Theory and Practice aims at creating a fascinating interplay between advanced fundamental theory and several application areas such as bioinformatics, multimedia/computer vision, e-commerce, finance, internet search, text mining and others. It offers an interdisciplinary forum for presenting recent progress and breakthroughs in learning theory with respect to areas such as neural networks, machine learning, mathematics and statistics. 
-Invited Lecturers- Peter Bartlett (Australian National University Canberra, AUS) Sankar Basu (IBM T.J. Watson, USA) Kristin Bennett (Rensselaer Polytechnic Institute New York, USA) Chris Bishop (Microsoft Research Cambridge, UK) Nello Cristianini (Royal Holloway London, UK) Luc Devroye (McGill University Montreal, CAN) Lazlo Gyorfi (T.U. Budapest, HUN) Gabor Horvath (T.U. Budapest, HUN) Rudolf Kulhavy (Honeywell Technology Center Prague, CZ) Vera Kurkova (Academy of Sciences of the Czech Republic, CZ) Joerg Lemm (University of Muenster, GER) Charles Micchelli (State U. of NY, USA) Tomaso Poggio (MIT, USA) Massimiliano Pontil (University of Siena, IT) Bernhard Schoelkopf (Max-Planck-Institute Tuebingen, GER) Yoram Singer (Hebrew University Jerusalem, IS) Steve Smale (U.C. Berkeley, USA) Johan Suykens (K.U. Leuven, BEL) Vladimir Vapnik (AT&T Labs Research, USA) Mathukumalli Vidyasagar (Tata Consultancy Services, IND) -Organizing committee- Sankar Basu (IBM T.J. Watson, USA) Gabor Horvath (T.U. Budapest, HUN), Co-director partner country Charles Micchelli (State U. of NY, USA) Johan Suykens (K.U. Leuven, BEL), Director Joos Vandewalle (K.U. Leuven, BEL) -Program and participation- According to the NATO rules http://www.nato.int/science the number of ASI students will be limited to 80. All participants will obtain a *free* registration (including welcome reception, lunches, banquets, refreshments and a NATO-ASI Science Series book to be published with IOS Press). Limited additional funding will be available to cover attendance costs. All interested participants should fill out an application form, taking into account the NATO restrictions. Application form and preliminary program are available from http://www.esat.kuleuven.ac.be/sista/natoasi/ltp2002.html -Venue- The Advanced Study Institute will take place in the Arenberg Castle of the K.U. Leuven Heverlee. The place is surrounded by several restaurants/cafes and parks where one may have a relaxing walk. 
The historical town of Leuven is within walking distance from the meeting site. Leuven is also well-known for its pleasant atmosphere, pubs and restaurants. -Housing- In addition to hotel rooms, blocks of low-cost student rooms are reserved for the ASI students. The student rooms and hotels are located within walking distance from the Arenberg castle meeting site. In order to stimulate the interaction among the participants the option of student rooms will be recommended to all ASI students. -Important Dates- Deadline for submission of application form: March 18, 2002 Notification of acceptance: April 30, 2002 NATO-ASI LTP 2002 meeting: July 8-19, 2002 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ From marina at cs.rhul.ac.uk Wed Mar 6 04:33:48 2002 From: marina at cs.rhul.ac.uk (marina d'Engelbronner) Date: Wed, 6 Mar 2002 09:33:48 -0000 Subject: NeuroCOLT Workshop 2002 Message-ID: <007301c1c4f2$055809b0$7ebcdb86@VENICE> Workshop on Generalisation Bounds Less than 0.5 The NeuroCOLT project coordinated by Royal Holloway, University of London is organising a workshop (which will be the final meeting of the project) from Monday April 29th until Thursday May 2nd 2002 at Cumberland Lodge, Windsor. The workshop draws together leading researchers in the drive to give non-trivial bounds on the generalisation of classifiers trained on real-world data. There will also be space for contributed presentations on related topics in the analysis of learning systems. Further information concerning the workshop (including the registration form) can be obtained from the workshop webpage: http://www.neurocolt.org/bounds2002.html If you are interested in attending the workshop, please fill out the registration form. Please note that there is very limited space at the workshop, so book early to avoid disappointment. The costs for the workshop (including accommodation, lunch, dinner, tea and coffee) will be £328. 
We will require the full amount of £328 to be paid on or before Wednesday April 15th 2002, either by cheque (in the name of RHBNC; sent to me, see address below) or by making a bank transfer (to RHBNC, NatWest Bank, Egham; sort-code: 600733; bank account number: 02325330; bank international code: NWB KGB2L; reference-code: 1090.3966). If you have any questions, please do not hesitate to contact me. Many thanks Marina d'Engelbronner Department of Computer Science Royal Holloway University of London Egham Hill Surrey TW20 0EX United Kingdom tel: **.(0)1784.443912 From wolfskil at MIT.EDU Wed Mar 6 14:42:35 2002 From: wolfskil at MIT.EDU (Jud Wolfskill) Date: Wed, 06 Mar 2002 14:42:35 -0500 Subject: CD-ROM announcement--collected NIPS proceedings (Jordan) Message-ID: <2002030614423530164@outgoing.mit.edu> I thought readers of the Connectionists List might be interested in this book. For more information, please visit http://mitpress.mit.edu/026256145X/ Thank you! Best, Jud Advances in Neural Information Processing Systems Proceedings of the First 12 Conferences edited by Michael I. Jordan, Yann LeCun, and Sara A. Solla The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. The conference is interdisciplinary, with contributions in algorithms, learning theory, cognitive science, neuroscience, vision, speech and signal processing, reinforcement learning and control, implementations, and diverse applications. Only about 30 percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. This CD-ROM contains the entire proceedings of the twelve Neural Information Processing Systems conferences from 1988 to 1999. The files are available in the DjVu image format developed by Yann LeCun and his group at AT&T Labs. The CD-ROM includes free browsers for all major platforms. Michael I. 
Jordan is Professor of Computer Science and of Statistics at the University of California, Berkeley. Yann LeCun is Head of the Image Processing Research Department at AT&T Labs-Research. Sara A. Solla is Professor of Physics and of Physiology at Northwestern University. CD-ROM, ISBN 0-262-56145-X Neural Information Processing series ______________________ Jud Wolfskill Associate Publicist The MIT Press 5 Cambridge Center, 4th Floor Cambridge, MA 02142 617 253 2079 617 253 1709 fax http://mitpress.mit.edu From becker at meitner.psychology.mcmaster.ca Wed Mar 6 21:11:26 2002 From: becker at meitner.psychology.mcmaster.ca (S. Becker) Date: Wed, 6 Mar 2002 21:11:26 -0500 (EST) Subject: neural computation-related textbooks Message-ID: Dear connectionists, This message is directed to those of you who teach a course in neural networks & related topics, or computational neuroscience or cognitive science, and would like to exchange opinions on the textbooks you are using. I'd like to know the name of your course, who takes it (undergrad vs grad, comp sci/eng vs psych/bio), what textbook you are using and what you consider to be the pros and cons of this book. I teach a course in neural computation to 3rd year undergrads, mostly CS majors and some psych/biopsych, and have used James Anderson's book An Introduction to Neural Networks a number of times. I like this book a lot -- it is the only one I know of that is truly interdisciplinary and suitable for undergraduates -- but it is in need of updating. I will post a summary of the replies I receive to connectionists. cheers, Sue -- Sue Becker, Associate Professor Department of Psychology, McMaster University becker at mcmaster.ca 1280 Main Street West, Hamilton, Ont. L8S 4K1 Fax: (905)529-6225 www.science.mcmaster.ca/Psychology/sb.html Tel: 525-9140 ext. 
23020 From giacomo at ini.phys.ethz.ch Thu Mar 7 08:43:18 2002 From: giacomo at ini.phys.ethz.ch (Giacomo Indiveri) Date: Thu, 07 Mar 2002 14:43:18 +0100 Subject: Neuromorphic Engineering Workshop - SECOND CALL Message-ID: <3C876E76.8060900@ini.phys.ethz.ch> SECOND CALL FOR APPLICATIONS for the 2002 Workshop on Neuromorphic Engineering http://www.ini.unizh.ch/telluride Please distribute this announcement: http://www.ini.unizh.ch/telluride/announcement.html ------------------------------------------------------------- Neuromorphic Engineering Workshop Sunday, JUNE 30 - Saturday, JULY 20, 2002 TELLURIDE, COLORADO ------------------------------------------------------------------ Avis COHEN (University of Maryland) Rodney DOUGLAS (Institute of Neuroinformatics, UNI/ETH Zurich, Switzerland) Timmer HORIUCHI (Johns Hopkins University) Giacomo INDIVERI (Institute of Neuroinformatics, UNI/ETH Zurich, Switzerland) Christof KOCH (California Institute of Technology) Terrence SEJNOWSKI (Salk Institute and UCSD) Shihab SHAMMA (University of Maryland) ----------------------------------------------------------------- We invite applications for a three week summer workshop that will be held in Telluride, Colorado from Sunday, June 30 to Sunday, July 21, 2002. The application deadline is Friday, March 15, and application instructions are described at the bottom of this document. The 2001 summer workshop on "Neuromorphic Engineering", sponsored by the National Science Foundation, the Gatsby Foundation, Whitaker Foundation, the Office of Naval Research, and by the Center for Neuromorphic Systems Engineering at the California Institute of Technology, was an exciting event and a great success. A detailed report on the workshop is available at the workshop's web-site. We strongly encourage interested parties to browse through the previous workshop web pages. 
GOALS: Carver Mead introduced the term "Neuromorphic Engineering" for a new field based on the design and fabrication of artificial neural systems, such as vision systems, head-eye systems, and roving robots, whose architecture and design principles are based on those of biological nervous systems. The goal of this workshop is to bring together young investigators and more established researchers from academia with their counterparts in industry and national laboratories, working on both neurobiological as well as engineering aspects of sensory systems and sensory-motor integration. The focus of the workshop will be on active participation, with demonstration systems and hands-on experience for all participants. Neuromorphic engineering has a wide range of applications, from nonlinear adaptive control of complex systems to the design of smart sensors. Many of the fundamental principles in this field, such as the use of learning methods and the design of parallel hardware (with an emphasis on analog and asynchronous digital VLSI), are inspired by biological systems. However, existing applications are modest and the challenge of scaling up from small artificial neural networks and designing completely autonomous systems at the levels achieved by biological systems lies ahead. The assumption underlying this three-week workshop is that the next generation of neuromorphic systems would benefit from closer attention to the principles found through experimental and theoretical studies of real biological nervous systems as whole systems. FORMAT: The three-week summer workshop will include background lectures on systems neuroscience (in particular learning, oculo-motor and other motor systems and attention), practical tutorials on analog VLSI design, small mobile robots (Koalas, Kheperas and LEGO), hands-on projects, and special interest groups. Participants are required to take part in, and if possible complete, at least one of the proposed projects. 
They are furthermore encouraged to become involved in as many of the other activities proposed as interest and time allow. There will be two lectures in the morning that cover issues that are important to the community in general. Because of the diverse range of backgrounds among the participants, the majority of these lectures will be tutorials, rather than detailed reports of current research. These lectures will be given by invited speakers. Participants will be free to explore and play with whatever they choose in the afternoon. Projects and interest groups meet in the late afternoons, and after dinner. In the early afternoon there will be tutorials on a wide spectrum of topics, including analog VLSI, mobile robotics, auditory systems, central pattern generators, selective attention mechanisms, etc. Projects that are carried out during the workshop will be centered in a number of working groups, including:
* active vision
* audition
* olfaction
* motor control
* central pattern generator
* robotics
* multichip communication
* analog VLSI
* learning
The active perception project group will emphasize vision and human sensory-motor coordination. Issues to be covered will include spatial localization and constancy, attention, motor planning, eye movements, and the use of visual motion information for motor control. Demonstrations will include an active vision system consisting of a three degree-of-freedom pan-tilt unit, and a silicon retina chip. The central pattern generator group will focus on small walking and undulating robots. It will look at characteristics and sources of parts for building robots, play with working examples of legged and segmented robots, and discuss CPGs and theories of nonlinear oscillators for locomotion. It will also explore the use of simple analog VLSI sensors for autonomous robots. 
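For readers unfamiliar with the nonlinear-oscillator view of CPGs mentioned for this group, here is a minimal toy sketch (the coupling function, gains, and initial phases are invented for illustration; this is not workshop material): two phase oscillators coupled so that they settle half a cycle apart, like the alternation of left and right limbs in locomotion.

```python
import math

def simulate_cpg(steps=20000, dt=0.001, k=2.0, omega=2 * math.pi):
    """Two coupled phase oscillators as a toy central pattern generator.

    Each oscillator advances at natural frequency omega plus a coupling
    term sin(theta_other - theta_self - pi) that favours a half-cycle
    phase lag, i.e. alternating "limbs".  Integrated with forward Euler.
    Returns the final phase difference (mod 2*pi).
    """
    th1, th2 = 0.0, 0.3  # arbitrary, nearly in-phase start
    for _ in range(steps):
        d1 = omega + k * math.sin(th2 - th1 - math.pi)
        d2 = omega + k * math.sin(th1 - th2 - math.pi)
        th1 += dt * d1
        th2 += dt * d2
    return (th2 - th1) % (2 * math.pi)

lag = simulate_cpg()  # settles close to pi: antiphase locking
```

With this coupling the phase difference phi = theta2 - theta1 obeys d(phi)/dt = 2k sin(phi), which has an unstable fixed point at phi = 0 and a stable one at phi = pi, so the pair locks into antiphase from almost any initial offset.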
The robotics group will use rovers and working digital vision boards as well as other possible sensors to investigate issues of sensorimotor integration, navigation and learning. The audition group aims to develop biologically plausible algorithms and aVLSI implementations of specific auditory tasks such as source localization and tracking, and sound pattern recognition. Projects will be integrated with visual and motor tasks in the context of a robot platform. The multichip communication project group will use existing interchip communication interfaces to program small networks of artificial neurons to exhibit particular behaviors such as amplification, oscillation, and associative memory. Issues in multichip communication will be discussed. LOCATION AND ARRANGEMENTS: The workshop will take place in the small town of Telluride, 9000 feet high in Southwest Colorado, about a six-hour drive from Denver (350 miles). United Airlines provides daily flights directly into Telluride. All facilities within the beautifully renovated public school building are fully accessible to participants with disabilities. Participants will be housed in ski condominiums, within walking distance of the school. Participants are expected to share condominiums. The workshop is intended to be very informal and hands-on. Participants are not required to have had previous experience in analog VLSI circuit design, computational or machine vision, systems-level neurophysiology or modeling the brain at the systems level. However, we strongly encourage active researchers with relevant backgrounds from academia, industry and national laboratories to apply, in particular if they are prepared to work on specific projects, talk about their own work or bring demonstrations to Telluride (e.g. robots, chips, software). Internet access will be provided. Technical staff present throughout the workshop will assist with software and hardware issues. 
We will have a network of PCs running LINUX and Microsoft Windows. No cars are required. Bring hiking boots, warm clothes and a backpack, since Telluride is surrounded by beautiful mountains. Unless otherwise arranged with one of the organizers, we expect participants to stay for the entire duration of this three-week workshop. FINANCIAL ARRANGEMENT: Notification of acceptances will be mailed out around March 21, 2002. Participants are expected to pay a $275.00 workshop fee at that time in order to reserve a place in the workshop. The cost of a shared condominium will be covered for all academic participants but upgrades to a private room will cost extra. Participants from National Laboratories and Industry are expected to pay for these condominiums. Travel reimbursement of up to $500 for US domestic travel and up to $800 for overseas travel will be possible if financial help is needed (please specify on the application). HOW TO APPLY: Applicants should be at the level of graduate students or above (i.e. postdoctoral fellows, faculty, research and engineering staff and the equivalent positions in industry and national laboratories). We actively encourage qualified women and minority candidates to apply. Applications should include:
* First name, Last name, valid email address.
* Curriculum Vitae.
* One page summary of background and interests relevant to the workshop.
* Description of special equipment needed for demonstrations that could be brought to the workshop.
* Two letters of recommendation.
Complete applications should be sent to: Terrence Sejnowski The Salk Institute 10010 North Torrey Pines Road San Diego, CA 92037 e-mail: telluride at salk.edu FAX: (858) 587 0417 THE APPLICATION DEADLINE IS March 15, 2002. Applicants will be notified by e-mail around the end of March. 
From ingber at ingber.com Thu Mar 7 16:15:12 2002 From: ingber at ingber.com (Lester Ingber) Date: Thu, 7 Mar 2002 16:15:12 -0500 Subject: Financial Engineer Message-ID: <20020307161512.A9459@ingber.com> If you have very strong credentials for the position described below, please email your resume to: Lester Ingber Director R&D Financial Engineer A disciplined, quantitative, analytic individual proficient in prototyping and coding (such as C/C++, Maple/Mathematica, or Visual Basic, etc.) is sought for a financial engineering/risk:reward optimization research position with an established Florida hedge fund (over two decades in the business and $1 billion in assets under management). A PhD in a mathematical science, such as physics, statistics, math, or computer science, is preferred. Hands-on experience in the financial industry is required. Emphasis is on applying state-of-the-art methods to financial time-series of various frequencies. Ability to work with a team to transform ideas/models into robust, intelligible code is key. Salary: commensurate with experience, with bonuses tied to the individual's and the firm's performance. Start date for this position may range anywhere from immediately to six months hence, depending on both the candidate's and the firm's needs. See http://www.ingber.com/open_positions.html for more information. -- Prof. Lester Ingber ingber at ingber.com ingber at alumni.caltech.edu www.ingber.com www.alumni.caltech.edu/~ingber From costa at dsi.unifi.it Fri Mar 8 06:39:19 2002 From: costa at dsi.unifi.it (Fabrizio Costa) Date: Fri, 8 Mar 2002 12:39:19 +0100 (MET) Subject: Ph.D. Thesis Announcement Message-ID: Dear Connectionists, I am pleased to announce that my PhD thesis entitled ------------------------------------------------------------------------ A Connectionist Approach to First-pass Attachment Prediction in Natural Language Parsing Fabrizio Costa Dept. 
of Computer Science University of Florence - Italy ------------------------------------------------------------------------ is now available at http://www.dsi.unifi.it/~costa/online/thesis.ps (and .pdf) Please find the summary of my thesis below. Fabrizio Costa Summary ======= The apparently effortless capability that humans show in understanding natural language is one of the main problems for modern cognitive science. Natural language is extremely ambiguous, and many theories of human parsing claim that ambiguity is resolved by resorting to linguistic experience, i.e. ambiguities are preferentially resolved in a way that has been successful most frequently in the past. Current research is focused on finding what kinds of features are relevant for conditioning choice preferences. In this thesis we employ a connectionist paradigm (Recursive Neural Networks) capable of processing acyclic graphs to perform supervised learning on syntactic trees extracted from a large corpus of parsed sentences. The architecture is capable of extracting the relevant features directly from inspection of the syntactic structure and adaptively encoding this information for the disambiguation task. Following a widely accepted hypothesis in psycholinguistics, we assume an incremental parsing process (one word at a time) that keeps a connected partial parse tree at all times. In this thesis we show how the model can be trained from a large parsed corpus (treebank), leading to good disambiguation performance both on unrestricted text and on a collection of examples from the psycholinguistic experimental literature. The goals of the thesis are twofold: (1) to demonstrate that a strongly incremental approach is computationally feasible and (2) to present an application of a connectionist architecture that directly processes complex information such as syntactic trees. In Chapter 1 we set out the background needed to understand the current work. 
We highlight a trend in the field of statistical parsers that shows how recently proposed models condition the statistics collected on a growing number of linguistic features. We will review how some recent incremental approaches to parsing show the viability of incrementality from a computational and psycholinguistic viewpoint. A review of the connectionist models applied to NLP follows. In Chapter 2 we formally introduce the recursive neural network architecture and go through the issues of training and the generalization capabilities of the model. In Chapter 3 we introduce the incremental grammar formalism used in the current work. We formalize the task of discriminating the correct successor tree, and we then describe how to use recursive neural networks to incrementally predict syntactic trees. In Chapter 4 we try to gain insight into the relations between the parameters of the proposed system and the learned preferences. In Chapter 5 we study the relationship between the generalization error and the features of the domain. Some transformations are then applied to the input and to the architecture to enhance the generalization capabilities of the model. The effectiveness of the introduced modifications is then validated through a range of experiments. In Chapter 6 we investigate how the modeling properties of the proposed architecture relate to the ambiguity resolution mechanisms employed by human beings. In Chapter 7 we employ the first-pass attachment prediction mechanism to build an initial version of a syntactic incremental probabilistic parser. 
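[Editor's illustration, not part of the original announcement.] The thesis itself is the reference for the architecture; purely to illustrate the core idea of a recursive neural network summarized above, here is a minimal sketch of bottom-up tree encoding, in which each node's vector is computed from its label embedding and its children's vectors. All names, dimensions, and weight shapes below are invented for the example and are not Costa's implementation.

```python
import numpy as np

# Minimal sketch of recursive (tree-structured) encoding -- NOT the thesis code.
# Each node's representation is h = tanh(W_label @ x_label + sum_i W_child[i] @ h_i + b),
# i.e. a function of its label embedding and its children's representations.

rng = np.random.default_rng(0)
DIM, LABELS, MAX_CHILDREN = 8, 4, 2   # invented sizes for the illustration

E = rng.normal(scale=0.1, size=(LABELS, DIM))             # one embedding per node label
W_label = rng.normal(scale=0.1, size=(DIM, DIM))
W_child = rng.normal(scale=0.1, size=(MAX_CHILDREN, DIM, DIM))  # one matrix per child position
b = np.zeros(DIM)

def encode(tree):
    """tree = (label_id, [subtrees]); returns a DIM-vector for the subtree root."""
    label, children = tree
    h = W_label @ E[label] + b
    for i, child in enumerate(children[:MAX_CHILDREN]):
        h = h + W_child[i] @ encode(child)                # recurse into each child
    return np.tanh(h)

# Tiny example tree: S -> (NP, VP), where NP and VP are leaves.
S, NP, VP = 0, 1, 2
root_vec = encode((S, [(NP, []), (VP, [])]))
print(root_vec.shape)  # (8,)
```

In a first-pass attachment setting, a root vector like this could then be fed to a scoring layer to rank alternative candidate incremental trees, which is the role the networks play in the task the summary describes.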
------------------------------------------------------
Contents

Introduction
1 Parsing, Incrementality and Connectionism
  1.1 Introduction
    1.1.1 The ambiguity issue
  1.2 Adding information to Context-Free Grammars
  1.3 Incremental Parsing
    1.3.1 Incremental parsers
    1.3.2 Top-down left-corner parsing
    1.3.3 Incremental Cascaded Markov Model
  1.4 Connectionist Models for Natural Language Processing
    1.4.1 Recurrent Networks
    1.4.2 Recursive Auto Associative Memory
    1.4.3 Simple Synchrony Networks
  1.5 Conclusions
2 Recursive Neural Networks
  2.1 Introduction
  2.2 Definitions
  2.3 Recursive representation
  2.4 Modeling assumptions
  2.5 Recursive network
  2.6 Encoding network
  2.7 Recursive processing with neural networks
  2.8 Generalization Capabilities
3 Learning first-pass attachment preferences
  3.1 Introduction
  3.2 Incremental trees and connection paths
    3.2.1 Definitions
    3.2.2 Connection paths extraction
    3.2.3 Is incrementality plausible?
    3.2.4 POS tag level
    3.2.5 Left recursion treatment
  3.3 Formal definition of the learning problem
    3.3.1 Universe of connection paths
    3.3.2 The forest of candidates
    3.3.3 The learning task
  3.4 Recursive networks for ranking alternatives
  3.5 Learning procedure
    3.5.1 Dataset compilation
    3.5.2 Network unfolding
    3.5.3 Parameter estimation
4 Network performance
  4.1 Experimental setting
    4.1.1 The data set
  4.2 Learning and generalizing capabilities of the system
    4.2.1 Target function variation
    4.2.2 Parameter variation
    4.2.3 Learning rate variation
    4.2.4 Training set variation
    4.2.5 Network generalization
5 What is the Network really choosing?
  5.1 Structural analyses
    5.1.1 Statistical information on incremental tree features
    5.1.2 Correlation between features and error
    5.1.3 Correct and incorrect predictions characteristics
    5.1.4 Influence of connection paths' frequency
    5.1.5 Filtering out the connection paths' frequency effect
    5.1.6 Filtering out the anchor attachment ambiguity
  5.2 Comparison with linguistic heuristics
  5.3 Enhancing the network performance
    5.3.1 Connection path set reduction
    5.3.2 Tree reduction
    5.3.3 Network specialization
    5.3.4 Anchor shortcut
6 Modeling human ambiguity resolution
  6.1 Introduction
  6.2 Experience-based models
  6.3 Experimental setting
  6.4 Psycholinguistic examples
    6.4.1 NP/S ambiguities
    6.4.2 Relative clause attachment
    6.4.3 Prepositional phrase attachment to noun phrases
    6.4.4 Adverb attachment
    6.4.5 Closure ambiguity
    6.4.6 PP attachment ambiguities
  6.5 Conclusion
7 Toward a connectionist incremental parser
  7.1 Probabilistic framework
    7.1.1 Defining the model structure
    7.1.2 Maximum likelihood estimation
  7.2 An example: PCFG
  7.3 Probabilistic Incremental Grammar
  7.4 Proposal for an incremental parser and modeling issues
  7.5 How to evaluate the parser output
    7.5.1 Definitions
    7.5.2 Evaluation metrics
  7.6 Parser evaluation
    7.6.1 Influence of tree reduction procedure
    7.6.2 Parsing performance issues
    7.6.3 Influence of the connection path coverage
    7.6.4 Influence of the beam size
  7.7 Final remarks
Conclusion
----------------------------------------------------------------- Fabrizio Costa Dept. of Systems and Computer Science, University of Florence Via di Santa Marta 3, I-50139 Firenze - Italy Phone: +39 055 4796 361 Fax: +39 055 4796 363 From steve at cns.bu.edu Sat Mar 9 06:20:34 2002 From: steve at cns.bu.edu (Stephen Grossberg) Date: Sat, 9 Mar 2002 06:20:34 -0500 Subject: Adaptive boundary-surface alignment and the McCollough effect Message-ID: The following article is now available at http://www.cns.bu.edu/Profiles/Grossberg in PDF and Gzipped Postscript. 
Grossberg, S., Hwang, S., and Mingolla, E. (2002). Thalamocortical dynamics of the McCollough effect: Boundary-surface alignment through perceptual learning. Vision Research, in press. Abstract: This article further develops the FACADE neural model of 3-D vision and figure-ground perception to quantitatively explain properties of the McCollough effect. The model proposes that many McCollough effect data result from visual system mechanisms whose primary function is to adaptively align, through learning, boundary and surface representations that are positionally shifted, due to the process of binocular fusion. For example, binocular boundary representations are shifted by binocular fusion relative to monocular surface representations, yet the boundaries must become positionally aligned with the surfaces to control binocular surface capture and filling-in. The model also includes perceptual reset mechanisms that use habituative transmitters in opponent processing circuits. Thus the model shows how McCollough effect data may arise from a combination of mechanisms that have a clear functional role in biological vision. Simulation results with a single set of parameters quantitatively fit data from thirteen experiments that probe the nature of achromatic/chromatic and monocular/binocular interactions during induction of the McCollough effect. The model proposes how perceptual learning, opponent processing, and habituation at both monocular and binocular surface representations are involved, including early thalamocortical sites. In particular, it explains the anomalous McCollough effect utilizing these multiple processing sites. Alternative models of the McCollough effect are also summarized and compared with the present model. From jaz at cs.rhul.ac.uk Mon Mar 11 05:59:20 2002 From: jaz at cs.rhul.ac.uk (J.S. 
Kandola) Date: Mon, 11 Mar 2002 10:59:20 +0000 Subject: REMINDER: Call for Papers: JMLR Special Issue on Machine Learning Methods for Text and Images Message-ID: <02031110592000.01774@trinity> *********** Apologies for Multiple Postings*************** ******************************************************* Call for Papers: JMLR Special Issue on Machine Learning Methods for Text and Images Guest Editors: Jaz Kandola (Royal Holloway College, University of London, UK) Thomas Hofmann (Brown University, USA) Tomaso Poggio (M.I.T, USA) John Shawe-Taylor (Royal Holloway College, University of London, UK) ***Submission Deadline: 29th March 2002*** ================================= Papers are invited reporting original research on Machine Learning Methods for Text and Images. This special issue follows the NIPS 2001 workshop on the same topic, but is also open to contributions that were not presented there. A special volume will be published for this issue. There has been much interest in information extraction from structured and semi-structured data in the machine learning community. This has in part been driven by the large amount of unstructured and semi-structured data available in the form of text documents, images, audio, and video files. In order to optimally utilize this data, one has to devise efficient methods and tools that extract relevant information. We invite original contributions that focus on exploring innovative and potentially groundbreaking machine learning technologies as well as on identifying key challenges in information access, such as multi-class classification, partially labeled examples and the combination of evidence from separate multimedia domains. The special issue seeks contributions applied to text and/or images. 
For a list of possible topics and information about the associated NIPS workshop please see http://www.cs.rhul.ac.uk/colt/JMLR.html Important Dates: Submission Deadline: 29th March 2002 Decision: 24th June 2002 Final Papers: 24th July 2002 Many thanks Jaz Kandola, Thomas Hofmann, Tommy Poggio and John Shawe-Taylor From laubach at kafka.med.yale.edu Mon Mar 11 13:51:41 2002 From: laubach at kafka.med.yale.edu (Mark Laubach) Date: Mon, 11 Mar 2002 13:51:41 -0500 Subject: positions available Message-ID: POSTDOCTORAL POSITIONS IN SYSTEMS NEUROPHYSIOLOGY AND NEUROENGINEERING Four postdoctoral positions are available immediately as part of a collaborative project involving the laboratories of Drs. Jon Kaas and Troy Hackett at Vanderbilt University, Dr. Mandayam Srinivasan at MIT, and Dr. Mark Laubach at Yale University. Two positions in systems neurophysiology are available. These positions, primarily at Vanderbilt, will involve using multi-electrode recording methods to study auditory-related areas of the macaque cerebral cortex. We are seeking one person with experience in the auditory system and another with experience in multi-electrode recording methods. Two positions that bridge the gap between physiology and engineering are also available, primarily at Yale. One of these positions requires expertise in neurophysiology, electronics, and computer-based instrumentation. This person will develop, test, and implement new procedures for multi-electrode microstimulation. The second position requires expertise in modern methods for data analysis (multispectral techniques and machine learning/pattern recognition) and/or real-time data acquisition and analysis. A major requirement for all of these positions is strong teamwork and communication skills (in English). We request that only qualified and serious candidates apply. 
If interested, please send a vita, a list of publications, a summary of research experience, and the names of three individuals who can be contacted as references to: Mark Laubach, Ph.D. The John B. Pierce Laboratory Yale University School of Medicine 290 Congress Ave New Haven CT 06519 E-mail: laubach at kafka.med.yale.edu http://www.jbpierce.org/staff/laubach.html From bbs at bbsonline.org Mon Mar 11 14:39:35 2002 From: bbs at bbsonline.org (Behavioral & Brain Sciences) Date: Mon, 11 Mar 2002 14:39:35 -0500 Subject: Northoff: CATATONIA: BBS Call for Commentators Message-ID: Below is a link to the forthcoming BBS target article WHAT CATATONIA CAN TELL US ABOUT "TOP-DOWN MODULATION": A NEUROPSYCHIATRIC HYPOTHESIS by George Northoff http://www.bbsonline.org/Preprints/Northoff This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please reply by EMAIL within two (2) weeks to: calls at bbsonline.org The Calls are sent to 10,000 BBS Associates, so there is no expectation (indeed, it would be calamitous) that each recipient should comment on every occasion! Hence there is no need to reply except if you wish to comment, or to nominate someone to comment. If you are not a BBS Associate, please approach a current BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work to nominate you. All past BBS authors, referees and commentators are eligible to become BBS Associates. 
A full electronic list of current BBS Associates is available at this location to help you select a name: http://www.bbsonline.org/Instructions/assoclist.html If no current BBS Associate knows your work, please send us your Curriculum Vitae and BBS will circulate it to appropriate Associates to ask whether they would be prepared to nominate you. (In the meantime, your name, address and email address will be entered into our database as an unaffiliated investigator.) To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the online BBSPrints Archive, at the URL that follows the abstract below. _____________________________________________________________ WHAT CATATONIA CAN TELL US ABOUT "TOP-DOWN MODULATION": A NEUROPSYCHIATRIC HYPOTHESIS George Northoff, MD PhD, PhD Harvard University Beth Israel Hospital Boston, Massachusetts KEYWORDS: Catatonia; Parkinson's disease; Top-down modulation; Bottom-up modulation; Horizontal modulation; Vertical modulation ABSTRACT: Differential diagnosis of motor symptoms, for example akinesia, may be difficult in clinical neuropsychiatry. Such symptoms may be either of neurologic origin, as in Parkinson's disease, or of psychiatric origin, as in catatonia, leading to a so-called "conflict of paradigms". Despite their different origins, symptoms may appear clinically more or less similar. This possibility of dissociation between origin and clinical appearance may reflect functional brain organisation in general and cortical-cortical/subcortical relations in particular. 
It is therefore hypothesized that similarities and differences between Parkinson's disease and catatonia may be accounted for by distinct kinds of modulation between cortico-cortical and cortico-subcortical relations. Comparison between Parkinson's disease and catatonia reveals a distinction between two kinds of modulation: "vertical and horizontal modulation". "Vertical modulation" concerns cortical-subcortical relations and apparently allows for bidirectional modulation. This is reflected in the possibility of both "top-down and bottom-up modulation" and the appearance of motor symptoms in both Parkinson's disease and catatonia. "Horizontal modulation" concerns cortical-cortical relations and apparently allows only for unidirectional modulation. This is reflected in one-way connections from prefrontal cortex to motor cortex and the absence of major affective and behavioural symptoms in Parkinson's disease. It is concluded that comparison between Parkinson's disease and catatonia may reveal the nature of modulation of cortico-cortical and cortico-subcortical relations in further detail. http://www.bbsonline.org/Preprints/Northoff ___________________________________________________________ Please do not prepare a commentary yet. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. We will then let you know whether it was possible to include your name on the final formal list of invitees. _______________________________________________________________________ *** SUPPLEMENTARY ANNOUNCEMENT *** (1) Call for Book Nominations for BBS Multiple Book Review In the past, Behavioral and Brain Sciences (BBS) had only been able to do 1-2 BBS multiple book treatments per year, because of our limited annual page quota. 
BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates and biobehavioral/cognitive scientists in general to nominate books you would like to see accorded BBS multiple book review. (Authors may self-nominate, but books can only be selected on the basis of multiple nominations.) It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!). *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-* Please note: Your email address has been added to our user database for Calls for Commentators, the reason you received this email. If you do not wish to receive further Calls, please feel free to change your mailshot status through your User Login link on the BBSPrints homepage, using your username and password above: http://www.bbsonline.org/ *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-* Ralph BBS ------------------------------------------------------------------- Ralph DeMarco Associate Editorial Coordinator Behavioral and Brain Sciences Journals Department Cambridge University Press 40 West 20th Street New York, NY 10011-4211 UNITED STATES bbs at bbsonline.org http://bbsonline.org Tel: +001 212 924 3900 ext.374 Fax: +001 212 645 5960 ------------------------------------------------------------------- From escorchado at ubu.es Tue Mar 12 13:35:48 2002 From: escorchado at ubu.es (escorchado@ubu.es) Date: Tue, 12 Mar 2002 18:35:48 GMT Subject: 2 POSTDOC. COMPUTING VISITING POSITIONS AVAILABLE Message-ID: <200203121835.g2CIZkC01459@estafeta.cid.ubu.es> 2 POSTDOC. COMPUTING VISITING POSITIONS Computer and Information Systems Department University of Burgos, BURGOS, SPAIN. 
We are looking for personnel to play a full role in the research, teaching and administrative activities of the Department. The Department has established main research areas in: Artificial Neural Networks; Data Mining; Web Technologies; Software Engineering, Object Oriented Methods; Object Oriented Languages and Databases. We are also interested in GIS and Computer Supported Cooperative Work. Salary will be within the Associate Professor scale (28250 euros). The positions are limited to five years maximum. You may informally contact Emilio Corchado (escorchado at ubu.es). We apologize if you receive this message several times. ---------------------------------- The city of Burgos has a population of 165.000 inhabitants. "Burgos" is the capital of the province of the same name, one of the nine provinces that form the Autonomous Community of Castile and Leon. The city is outstanding for its exceptional historic and artistic heritage and for its geographical location. As a result, Burgos is one of the most visited cities in Spain. Burgos is also very well known for its important food and car industries. Further information about the city of Burgos is available at http://www.ubu.es/relacinter/eng_guiaextranjero.htm. The University of Burgos is a young and dynamic institution. It first existed as a campus of the University of Valladolid from 1972, becoming an independent university in 1994. A three-year degree in Information Systems has been taught since 1995, and this academic year the studies have been extended to a five-year degree. 
From bbs at bbsonline.org Thu Mar 14 17:15:49 2002 From: bbs at bbsonline.org (Behavioral & Brain Sciences) Date: Thu, 14 Mar 2002 17:15:49 -0500 Subject: BBS Call for Commentators: Perruchet: THE SELF-ORGANIZING CONSCIOUSNESS Message-ID: Below is a link to the forthcoming BBS target article THE SELF-ORGANIZING CONSCIOUSNESS by Pierre Perruchet and Annie Vinter http://www.bbsonline.org/Preprints/Perruchet This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please reply by EMAIL within three (3) weeks to: calls at bbsonline.org The Calls are sent to 10,000 BBS Associates, so there is no expectation (indeed, it would be calamitous) that each recipient should comment on every occasion! Hence there is no need to reply except if you wish to comment, or to nominate someone to comment. If you are not a BBS Associate, please approach a current BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work to nominate you. All past BBS authors, referees and commentators are eligible to become BBS Associates. A full electronic list of current BBS Associates is available at this location to help you select a name: http://www.bbsonline.org/Instructions/assoclist.html If no current BBS Associate knows your work, please send us your Curriculum Vitae and BBS will circulate it to appropriate Associates to ask whether they would be prepared to nominate you. (In the meantime, your name, address and email address will be entered into our database as an unaffiliated investigator.) 
To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the online BBSPrints Archive, at the URL that follows the abstract below. ____________________________________________________________ THE SELF-ORGANIZING CONSCIOUSNESS Pierre Perruchet and Annie Vinter Université de Bourgogne LEAD/CNRS Dijon, France KEYWORDS: Associative learning; automatism, consciousness; development; implicit learning; incubation; language; mental representation; perception; phenomenal experience ABSTRACT: We propose that the isomorphism generally observed between the representations composing our momentary phenomenal experience and the structure of the world is the end-product of a progressive organization that emerges thanks to elementary associative processes that take our conscious representations themselves as the stuff on which they operate, a thesis that we summarize in the concept of Self-Organizing Consciousness (SOC). We show that the SOC framework accounts for the discovery of words and objects, and for word-object mapping. We then argue that isomorphic representations may underlie seemingly rule-governed behavior, as is observed in the areas of implicit learning of arbitrary structures, language, problem solving, and automatisms. This analysis provides support for the so-called "mentalistic" framework (e.g. Dulany, 1997), which avoids postulating the existence of unconscious representations and computations. http://www.bbsonline.org/Preprints/Perruchet ___________________________________________________________ Please do not prepare a commentary yet. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. 
We will then let you know whether it was possible to include your name on the final formal list of invitees. Ralph BBS ------------------------------------------------------------------- Ralph DeMarco Associate Editorial Coordinator Behavioral and Brain Sciences Journals Department Cambridge University Press 40 West 20th Street New York, NY 10011-4211 UNITED STATES bbs at bbsonline.org http://bbsonline.org Tel: +001 212 924 3900 ext.374 Fax: +001 212 645 5960 ------------------------------------------------------------------- From butz at illigal.ge.uiuc.edu Thu Mar 14 21:03:38 2002 From: butz at illigal.ge.uiuc.edu (Martin Butz) Date: Thu, 14 Mar 2002 20:03:38 -0600 (CST) Subject: 2nd CFP - ABiALS 2002 - Adaptive Behavior in Anticipatory Learning Systems Message-ID: (We apologize if you receive multiple copies) ######################################################################## 2nd C A L L F O R P A P E R S ABiALS Workshop 2002 Adaptive Behavior in Anticipatory Learning Systems ######################################################################## August 11, 2002 Edinburgh, Scotland http://www-illigal.ge.uiuc.edu/ABiALS to be held during the seventh international conference on Simulation of Adaptive Behavior (SAB'02) http://www.isab.org.uk/sab02/ This workshop aims for an interdisciplinary gathering of people interested in how anticipations can guide behavior, as well as how an anticipatory influence can be implemented in an adaptive behavior system. In particular, we are looking for adaptive behavior systems that incorporate online anticipation mechanisms. NEWS: - Submission deadline approaching - please submit your contribution by the 31st of March. - Review form online: We have put the review form up on the web page to give an idea of the most important acceptance criteria. 
___________________________________________________________________________ Aim and Objectives: Most of the research in recent years in artificial adaptive behavior with respect to model learning and anticipatory behavior has focused on the model learning side. Research is particularly engaged in online generalized model learning. Up to now, though, exploitation of the model has been done mainly to show that exploitation is possible or that an appropriate model exists in the first place. Only very few applications exist that show the utility of the model for the simulation of anticipatory processes and a consequent adaptive behavior. The aim of this workshop is to bring together researchers who are interested in anticipatory processes and, essentially, anticipatory adaptive behavior. The intention is an interdisciplinary gathering that brings together researchers from distinct areas to discuss the different guises that anticipation takes in these different perspectives. The workshop intends, however, to focus on anticipations in the form of low-level computational processes rather than high-level processes such as explicit planning. Essential questions: * How can anticipations influence the adaptive behavior of an artificial learning system? * How can anticipatory adaptive behavior be implemented in an artificial learning system? * How does an incomplete model influence anticipatory behavior? * How do anticipations guide further model learning? * How do anticipations control attention? * Can anticipations be used for the detection of special environmental properties? * What are the benefits of anticipations for adaptive behavior? * What is the trade-off between simple bottom-up stimulus-response driven behavior and more top-down anticipatory driven behavior? * In what respect does anticipation mediate between low-level environmental processing and more complex cognitive simulation? 
* What role do anticipations play for the implementation of motivations and emotions? ___________________________________________________________________________ Submission: Submissions for the workshop should address or at least be related to one of the questions listed above. However, other approaches to anticipatory adaptive behavior are encouraged as well. The workshop is not limited to one particular type of anticipatory learning system or a particular representation of anticipations. However, the learning system should learn its anticipatory representation online rather than being provided with a model of the world beforehand. Nonetheless, background knowledge of a typical environment can be incorporated (and is probably inevitably embodied in the provided sensors, actions, and the coding in any adaptive system). Since this is a full-day workshop, we hope to be able to provide more time for presentations and discussions. In that way, the advantages and disadvantages of the different learning systems should become clearer. Several discussion sessions are also planned in which anticipatory influences will be discussed in a broader sense. Papers will be reviewed for acceptance by the program committee and the organizers. Papers should be submitted electronically to one of the organizers via email in pdf or ps format. Electronic submission is strongly encouraged. If you cannot submit your contribution electronically, please contact one of the organizers. Submitted papers should be between 10 and 20 pages in 10pt, one-column format. The LNCS Springer-Verlag style is preferred (see http://www.springer.de/comp/lncs/authors.html). The submission deadline is the 31st of March 2002. Depending on the quality and number of contributions, we hope to be able to publish post-workshop proceedings as either a Springer LNAI volume or a special issue of a journal. 
For more information please refer to www-illigal.ge.uiuc.edu/ABiALS/
___________________________________________________________________________
Important Dates:
31 March 2002: Deadline for Submissions
15 May 2002: Notification of Acceptance
15 June 2002: Camera-Ready Version for SAB Workshop Proceedings
11 August 2002: Workshop ABiALS
___________________________________________________________________________
Program Committee:
Emmanuel Daucé, Faculté des sciences du sport, Université de la Méditerranée, Marseille, France
Ralf Moeller, Cognitive Robotics, Max Planck Institute for Psychological Research, Munich, Germany
Wolfgang Stolzmann, DaimlerChrysler AG, Berlin, Germany
Jun Tani, Lab. for Behavior and Dynamic Cognition, Brain Science Institute, RIKEN, 2-1 Hirosawa, Wako-shi, Saitama, 351-0198 Japan
Stewart W. Wilson, President, Prediction Dynamics, USA
___________________________________________________________________________
Organizers:
Martin V. Butz, Illinois Genetic Algorithms Laboratory (IlliGAL), University of Illinois at Urbana-Champaign, Illinois, USA; also: Department of Cognitive Psychology, University of Wuerzburg, Germany; butz at illigal.ge.uiuc.edu, http://www-illigal.ge.uiuc.edu/~butz
Pierre Gérard, AnimatLab, University Paris VI, Paris, France; pierre.gerard at lip6.fr, http://animatlab.lip6.fr/Gerard
Olivier Sigaud, AnimatLab, University Paris VI, Paris, France; olivier.sigaud at lip6.fr, http://animatlab.lip6.fr/Sigaud
From ckiw at dai.ed.ac.uk Fri Mar 15 14:21:58 2002 From: ckiw at dai.ed.ac.uk (Chris Williams) Date: Fri, 15 Mar 2002 19:21:58 +0000 (GMT) Subject: MSc study in Learning from Data at Edinburgh Message-ID: Dear colleagues, I would be grateful if you could forward this mail to any strong final year undergraduates (or equivalent) who are looking to do MSc study in the machine learning/probabilistic modelling areas.
Chris Williams ckiw at dai.ed.ac.uk Institute for Adaptive and Neural Computation Division of Informatics, University of Edinburgh 5 Forrest Hill, Edinburgh EH1 2QL, Scotland, UK fax: +44 131 650 6899 tel: (direct) +44 131 651 1212 (department switchboard) +44 131 650 3090 http://www.dai.ed.ac.uk/homes/ckiw/ --------------------------------------------------------------------------- MSc In Informatics: Learning from Data -------------------------------------- Division of Informatics, University of Edinburgh Edinburgh, UK The Division of Informatics, University of Edinburgh runs a wide-ranging taught Master of Science (MSc) programme. Within this, the Learning from Data specialism teaches students about methods from machine learning, probabilistic graphical modelling, pattern recognition and neural networks. These methods underpin the analysis of learning in natural and artificial systems, and are also of vital importance in the analysis, interpretation and exploitation of the increasing amounts of electronic data that are being collected in scientific, engineering and commercial environments. The Learning from Data specialism comprises eight taught modules, plus a five month long research project. Of the eight modules, there are three compulsory inner core modules (Learning from Data 1, Probabilistic Modelling and Reasoning, Data Mining and Exploration), and a further two should be chosen from five outer core modules (Learning from Data 2, Genetic Algorithms and Genetic Programming, Statistical Natural Language Processing, Database Systems, Visualisation). The remaining three modules can be chosen from the whole range of around 50 available MSc modules. The course is funded by a Masters Training Package from the Engineering and Physical Sciences Research Council (EPSRC).
We have a number of grants available to support UK and EC students who have a very good honours degree (or equivalent qualification) in a numerate area such as Computer Science, Physics etc. The Division of Informatics at Edinburgh enjoys an international reputation for its teaching and research. It was awarded the top 5*A rating in Computer Science in the 2001 UK Research Assessment Exercise. We also received a top excellent rating in the most recent SHEFC Teaching Quality Assessment. For further details and an application form contact MSc Admissions, Division of Informatics, University of Edinburgh, 5 Forrest Hill, Edinburgh, EH1 2QL, Scotland, UK or visit our web site at http://www.informatics.ed.ac.uk/prospectus/graduate/msc.html. Informal enquiries may be made to msc-admissions at informatics.ed.ac.uk. Preferential treatment for admission in October 2002 will be given to applications received by 31 March 2002. From hamilton at may.ie Fri Mar 15 06:42:35 2002 From: hamilton at may.ie (Hamilton Institute) Date: Fri, 15 Mar 2002 11:42:35 -0000 Subject: Research Positions (Statistical Machine Learning), Hamilton Institute, Ireland Message-ID: <00cc01c1cc16$8086a8a0$128a9f82@pwhm6> RESEARCH POSITIONS The Hamilton Institute is a new multi-disciplinary research centre currently undergoing substantial expansion. Funded by Science Foundation Ireland, we are committed to research excellence. Applications are invited from well qualified candidates for a number of research positions. The successful candidates will be outstanding researchers who can demonstrate an exceptional research track record or significant research potential at international level. We currently have an active programme of research in statistical machine learning and probabilistic reasoning methods in the context of nonlinear and switched/hybrid dynamics (particularly computer controlled systems and human-computer interfaces).
Experience in topics including nonparametric regression and classification, Bayesian statistics, MCMC and time series analysis would be beneficial. These posts offer a unique opportunity for tackling fundamental problems in a leading-edge multi-disciplinary research group with state-of-the-art facilities which is currently undergoing substantial expansion. All appointments are initially for 3 years, extendable to 5. We are committed to research excellence and appointments will reflect this. Remuneration will reflect qualifications and experience, but will be internationally competitive. Further information: Please visit our web site at: http://www.hamilton.may.ie Enquiries to Prof. Douglas Leith, doug.leith at may.ie From K.Branney at elsevier.nl Fri Mar 15 11:01:28 2002 From: K.Branney at elsevier.nl (Branney, Kate (ELS)) Date: Fri, 15 Mar 2002 17:01:28 +0100 Subject: Call for Submissions - Neurocomputing Letters Message-ID: <46414F09B351C64BAA875CE0B37BE07101446499@elsamsvexch02.elsevier.nl> Neurocomputing Letters NEUROCOMPUTING An International Journal published by Elsevier Science B.V., vol. 49-55, 28 issues, in 2003 ISSN 0925-2312, URL: http://www.elsevier.com/locate/neucom Neurocomputing is pleased to invite authors to submit letters, concise papers and short communications (jointly called "Letters") aimed at rapid publication. The review of this type of submission will be made subject to a special fast procedure and will only take place if the following conditions are met: - Size: no longer than five (5) A4 pages typed in 12-point double space, - Scope: the content of the submitted manuscript fits within the general Aims and Scope of the journal, - Content: about a new development which is served by rapid publication, relevant comments on articles published in the journal, or about outstanding preliminary results of current research.
Please send your submission in one of two electronic forms - pdf or word doc - by email directly to our Letters Editor with a copy (cc) to the Editor-in-Chief. The Letters Editor will determine whether the submitted manuscript meets all requirements for a letter submission and is in charge of the complete review procedure. If all requirements are met, the Letters Editor will make the submission subject to a special, rapid procedure: - the author can expect a review report on his/her letter within eight (8) weeks after submission, - the letter shall be either accepted or rejected; upon acceptance, at most minor corrections will be requested, - upon acceptance of the letter, publication will take place in the first available issue of the journal, within approximately the next 3 to 4 months, - revised versions of rejected manuscripts may only be resubmitted as a regular paper. The resubmission must include copies of the original referees' comments and separate pages with the authors' response to all these comments. Author's instructions for regular papers and further information on the Letters section can be found on the back inside cover or at http://www.elsevier.com/locate/neucom From helge at TechFak.Uni-Bielefeld.DE Sat Mar 16 14:10:29 2002 From: helge at TechFak.Uni-Bielefeld.DE (Helge Ritter) Date: Sat, 16 Mar 2002 20:10:29 +0100 Subject: open position Message-ID: <3C9398A5.1A7EA5EF@techfak.uni-bielefeld.de> Dear Colleagues: The research group Neuroinformatics (Prof. Helge Ritter) at the University of Bielefeld is offering a research project position for a Research Assistant with salary according to BAT-IIa. The position will be affiliated with the Special Collaborative Research Unit (Sonderforschungsbereich) SFB 360: Situated Artificial Communicators ( http://www.sfb360.uni-bielefeld.de/sfbengl.html ). The research topic of the project is visual instruction of robot actions based on teaching by showing. The project provides the opportunity for a dissertation.
Applicants should have a university degree (Masters or German diploma) in computer science, electrical engineering or physics, and should have a good knowledge of Unix/C/C++ programming. A good background in the fields of robotics/neural networks/computer vision is desirable. The University of Bielefeld follows a policy of increasing the proportion of female employees in fields where women are underrepresented and therefore particularly encourages women to apply. Applications from suitably qualified disabled persons are welcome. Further information can be obtained from our group homepage: http://www.TechFak.Uni-Bielefeld.DE/techfak/ags/ni/ the university homepage: http://www.Uni-Bielefeld.DE Applications should be sent to: Prof. Dr. Helge Ritter Neural Computation Group Faculty of Technology Bielefeld University 33 501 Bielefeld mailto:helge at techfak.uni-bielefeld.de From n.lawrence at dcs.shef.ac.uk Mon Mar 18 04:49:12 2002 From: n.lawrence at dcs.shef.ac.uk (Neil Lawrence) Date: Mon, 18 Mar 2002 09:49:12 -0000 Subject: PhD Studentship available Message-ID: <000101c1ce62$28a84dc0$4008a78f@leonardo> Applications are invited to fill a PhD studentship at the University of Sheffield, U.K. Funding is available under EPSRC grant number GR/R84801/01 - `Learning Classifiers from Sloppily Labelled Data' for a three year EPSRC studentship. This project is a collaborative effort between Neil Lawrence of the Machine Learning group at Sheffield (http://www.thelawrences.net/neil/) and Bernhard Schoelkopf (http://www.kyb.tuebingen.mpg.de/bu/people/bs) of the Max Planck Institute for Biological Cybernetics in Tuebingen, Germany. The studentship is scheduled to start in September 2002. The principal requirement for applicants is that they have a strong mathematical background developed through a first degree in Physics, Mathematics, Engineering or Computer Science. Additional knowledge of the fundamentals of machine learning would also be an advantage.
Informal queries may be made via e-mail to neil at dcs.shef.ac.uk From bernhard.schoelkopf at tuebingen.mpg.de Mon Mar 18 13:07:20 2002 From: bernhard.schoelkopf at tuebingen.mpg.de (Bernhard Schoelkopf) Date: Mon, 18 Mar 2002 19:07:20 +0100 Subject: Openings at the Max Planck Institute in Tuebingen Message-ID: <00c501c1cea7$bef21140$9865fea9@VAIO> Dear Colleagues: Several positions at various levels (MSc, PhD, Postdoc) are available in a new department at the Max Planck Institute in Tuebingen, Germany, studying learning theory and algorithms with applications in various domains (computer vision, bioinformatics, psychophysics modeling - for further information, cf. www.kyb.tuebingen.mpg.de/bs/index.html). We invite applications of candidates with an outstanding academic record including a strong mathematical background. Please forward this message to suitable candidates. Max Planck Institutes are publicly funded research labs with an emphasis on excellence in basic research. Tuebingen is a small university town in southern Germany, see http://www.tuebingen.de/kultur/english/index.html for some pictures. Please submit applications (hardcopy or e-mail, including names of potential referees) by the end of April to: Bernhard Schoelkopf Max-Planck-Institut fuer biologische Kybernetik Spemannstr.38 72076 Tuebingen Germany Tel. +49 7071 601 551 Fax +49 7071 601 577 http://www.kyb.tuebingen.mpg.de/~bs From David.Cohn at acm.org Mon Mar 18 15:04:47 2002 From: David.Cohn at acm.org (David 'Pablo' Cohn) Date: Mon, 18 Mar 2002 15:04:47 -0500 Subject: jmlr-announce: JMLR special issue on shallow parsing is now available Message-ID: The Journal of Machine Learning Research is pleased to announce the Special Issue on Machine Learning Approaches to Shallow Parsing, available online at http://www.jmlr.org. 
---------------------------------------- JMLR Special Issue on Shallow Parsing - contents: Introduction to Special Issue on Machine Learning Approaches to Shallow Parsing - James Hammerton, Miles Osborne, Susan Armstrong, Walter Daelemans pp. 551-558 Memory-Based Shallow Parsing - Erik F. Tjong Kim Sang pp. 559-594 Shallow Parsing using Specialized HMMs - Antonio Molina, Ferran Pla pp. 595-613 Text Chunking based on a Generalization of Winnow - Tong Zhang, Fred Damerau, David Johnson pp. 615-637 Shallow Parsing with PoS Taggers and Linguistic Features - Beata Megyesi pp. 639-668 Learning Rules and Their Exceptions - Herve Dejean pp. 669-693 Shallow Parsing using Noisy and Non-Stationary Training Material - Miles Osborne pp. 695-719 ---------------------------------------- All papers in the special issue, as well as all previous JMLR papers, are available electronically at http://www.jmlr.org/ in PostScript and PDF formats. Many are also available in HTML. The papers of Volume 1 are also available in hardcopy from the MIT Press; please see http://mitpress.mit.edu/JMLR for details. -David Cohn, Managing Editor, Journal of Machine Learning Research From bbs at bbsonline.org Mon Mar 18 16:28:55 2002 From: bbs at bbsonline.org (Behavioral & Brain Sciences) Date: Mon, 18 Mar 2002 16:28:55 -0500 Subject: BBS Call for Commentators: Williams: FACIAL EXPRESSION OF PAIN: AN EVOLUTIONARY ACCOUNT Message-ID: Below is a link to the forthcoming BBS target article FACIAL EXPRESSION OF PAIN: AN EVOLUTIONARY ACCOUNT by Amanda Williams http://www.bbsonline.org/Preprints/Williams This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. 
To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please reply by EMAIL within three (3) weeks to: calls at bbsonline.org The Calls are sent to 10,000 BBS Associates, so there is no expectation (indeed, it would be calamitous) that each recipient should comment on every occasion! Hence there is no need to reply except if you wish to comment, or to nominate someone to comment. If you are not a BBS Associate, please approach a current BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work to nominate you. All past BBS authors, referees and commentators are eligible to become BBS Associates. A full electronic list of current BBS Associates is available at this location to help you select a name: http://www.bbsonline.org/Instructions/assoclist.html If no current BBS Associate knows your work, please send us your Curriculum Vitae and BBS will circulate it to appropriate Associates to ask whether they would be prepared to nominate you. (In the meantime, your name, address and email address will be entered into our database as an unaffiliated investigator.) To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the online BBSPrints Archive, at the URL that follows the abstract below. 
_______________________________________________________ FACIAL EXPRESSION OF PAIN: AN EVOLUTIONARY ACCOUNT Amanda C de C Williams University of London St Thomas' Hospital London, UK KEYWORDS: pain; facial expression; adaptation; evolutionary psychology ABSTRACT: This paper proposes that human expression of pain in the presence or absence of caregivers, and the detection of pain by observers, arise from evolved propensities. The function of pain is to demand attention and prioritise escape, recovery and healing; where others can help achieve these goals, effective communication of pain is required. Evidence is reviewed of a distinct and specific facial expression of pain from infancy to old age, consistent across stimuli, and recognizable as pain by observers. Voluntary control over amplitude is incomplete, and observers are better at detecting pain which the individual attempts to suppress than pain which is amplified or simulated. In many clinical and experimental settings, facial expression of pain is incorporated with verbal and nonverbal-vocal activity, posture and movement in an overall category of pain behaviour. This is assumed by clinicians to be under operant control of social contingencies such as sympathy, caregiving, and practical help; thus strong facial expression is presumed to constitute an attempt to manipulate these contingencies by amplification of the normal expression. Operant formulations support skepticism about the presence or extent of pain, judgements of malingering, and sometimes the withholding of caregiving and help. However, to the extent that pain expression is influenced by environmental contingencies, "amplification" could equally plausibly constitute release of suppression according to evolved contingent propensities which guide behaviour. Pain has been largely neglected in the evolutionary literature and that on pain expression, but an evolutionary account can generate improved assessment of pain and reactions to it.
http://www.bbsonline.org/Preprints/Williams ====================================================================== IMPORTANT Please do not prepare a commentary yet. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. We will then let you know whether it was possible to include your name on the final formal list of invitees. ======================================================================= _______________________________________________________________________ *** SUPPLEMENTARY ANNOUNCEMENT *** (1) Call for Book Nominations for BBS Multiple Book Review In the past, Behavioral and Brain Sciences (BBS) had only been able to do 1-2 BBS multiple book treatments per year, because of our limited annual page quota. BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates and biobehavioral/cognitive scientists in general to nominate books you would like to see accorded BBS multiple book review. (Authors may self-nominate, but books can only be selected on the basis of multiple nominations.) It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!). *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-* Please note: Your email address has been added to our user database for Calls for Commentators, the reason you received this email. 
If you do not wish to receive further Calls, please feel free to change your mailshot status through your User Login link on the BBSPrints homepage, using your username and password above: http://www.bbsonline.org/ *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-* Ralph BBS ------------------------------------------------------------------- Ralph DeMarco Associate Editorial Coordinator Behavioral and Brain Sciences Journals Department Cambridge University Press 40 West 20th Street New York, NY 10011-4211 UNITED STATES bbs at bbsonline.org http://bbsonline.org Tel: +001 212 924 3900 ext.374 Fax: +001 212 645 5960 ------------------------------------------------------------------- From nnsp02 at neuro.kuleuven.ac.be Mon Mar 18 04:06:48 2002 From: nnsp02 at neuro.kuleuven.ac.be (Neural Networks for Signal Processing 2002) Date: Mon, 18 Mar 2002 10:06:48 +0100 Subject: NNSP'02 IEEE Workshop, Neural Networks for Signal Processing, 2nd Call for Papers Message-ID: <3C95AE28.A638CDFA@neuro.kuleuven.ac.be> 2002 IEEE Workshop Neural Networks for Signal Processing September 4-6, 2002 Martigny, Valais, Switzerland http://eivind.imm.dtu.dk/nnsp2002 Sponsored by the IEEE Signal Processing Society In cooperation with the IEEE Neural Networks Council Call for Papers Thanks to the sponsorship of IEEE Signal Processing Society and IEEE Neural Network Council, the twelfth of a series of IEEE workshops on Neural Networks for Signal Processing will be held in Martigny (http://www.martigny.ch ), Switzerland, at the ``Centre du Parc'' (http://www.hotelduparc.ch ). The workshop will feature keynote addresses, technical presentations and panel discussions. 
Papers are solicited for, but not limited to, the following areas: Algorithms and Architectures: Artificial neural networks, kernel methods, committee models, independent component analysis, adaptive and/or nonlinear signal processing, (hidden) Markov models, Bayesian modeling, parameter estimation, generalization, optimization, design algorithms. Applications: Speech processing, image processing (computer vision, OCR), multimodal interactions, multi-channel processing, intelligent multimedia and web processing, robotics, sonar and radar, bio-medical engineering, bio-informatics, financial analysis, time series prediction, blind source separation, data fusion, data mining, adaptive filtering, communications, sensors, system identification, and other signal processing and pattern recognition applications. Implementations: Parallel and distributed implementation, hardware design, and other general implementation technologies. Further Information NNSP'2002 webpage: http://eivind.imm.dtu.dk/nnsp2002 Paper Submission Procedure Prospective authors are invited to submit a full paper of up to ten pages using the electronic submission procedure described at the workshop homepage. Accepted papers will be published in a hard-bound volume by IEEE and distributed at the workshop. 
Schedule Submission of full paper: April 15, 2002 Notification of acceptance: May 1, 2002 Camera-ready paper and author registration: June 1, 2002 Advance registration, before: July 15, 2002 Preliminary Programme September 4, AM: Plenary talk by Mahesan Niranjan, Sheffield University "From Kalman to Particle Filtering for solving Signal Processing Problems with Neural Networks" Regular session, including oral and poster presentations September 4, PM: Special session on "Machine Learning and statistical approaches for Bio-Informatic Applications", Plenary talk by Anders Krogh, University of Denmark "Hidden Markov models of proteins and DNA", followed by special session, including oral and poster presentations September 5, AM: Plenary Talk: Yoshua Bengio, University of Montreal "Faking unlabeled data for geometric regularization" Regular session, including oral and poster presentations September 5, PM: Special session on "Multimodal/multi-channel processing" Plenary talk by Josef Kittler, University of Surrey "Fusion of multiple experts in multimodal biometric personal identity verification systems" September 6, AM: Plenary talk by Zoubin Ghahramani, Gatsby Institute, UK "Occam's Razor and Infinite Models" Regular oral and poster sessions.
Invited Speakers Yoshua Bengio, University of Montreal Title of the talk: Faking unlabeled data for geometric regularization Zoubin Ghahramani, Gatsby Institute, Title of the talk: Occam's Razor and Infinite Models Josef Kittler, University of Surrey Invited for a Special Session on Multimodal and multi-channel applications Title of the talk: Fusion of multiple experts in multimodal biometric personal identity verification systems Anders Krogh, University of Denmark Invited for a Special Session on Bio-Informatics Title of the talk: Hidden Markov models of proteins and DNA Mahesan Niranjan, Sheffield University Title of the talk: From Kalman to Particle Filtering for solving Signal Processing Problems with Neural Networks General Chairs Herve BOURLARD IDIAP and EPF, Lausanne Tulay ADALI University of Maryland Baltimore County Program Chairs Samy BENGIO IDIAP Jan LARSEN Technical University of Denmark Technical Committee Chair Jose PRINCIPE University of Florida at Gainesville Finance Chair Jean-Philippe THIRAN EPF, Lausanne Proceedings Chairs Jean-Cédric CHAPPELIER EPF, Lausanne Scott C. DOUGLAS Southern Methodist University Publicity Chair Marc VAN HULLE Katholieke Universiteit, Leuven American Liaison Jose PRINCIPE University of Florida at Gainesville Asia Liaison Shigeru KATAGIRI NTT Communication Science Laboratories Program Committee Amir Assadi Andrew Back Samy Bengio Yoshua Bengio M.
Carreira-Perpinan Jean-Cédric Chappelier Andrzej Cichocki Jesus Cid-Sueiro Bob Dony Scott Douglas Craig Fancourt Ling Guan Tzyy-Ping Jung Shigeru Katagiri Jens Kohlmorgen Shoji Makino Danilo Mandic Elias Manolakos Takashi Matsumoto David Miller Li Min Christophe Molina Mahesan Niranjan Kostas Plataniotis Tommy Poggio Jose Principe Phillip Regalia Steve Renals João-Marcos Romano Jonas Sjöberg Robert Snapp Kemal Sönmez Søren Riis Jean-Philippe Thiran Naonori Ueda Marc Van Hulle Fernando Von Zuben Christian Wellekens Lizhong Wu Lian Yan From ezequiel at cogs.susx.ac.uk Wed Mar 20 09:02:04 2002 From: ezequiel at cogs.susx.ac.uk (Ezequiel Di Paolo) Date: Wed, 20 Mar 2002 14:02:04 +0000 Subject: CFP Adaptive Behavior Special Issue Message-ID: [Apologies for multiple copies. Please, do not forward to any other lists] ********** Call For Papers: ********** Adaptive Behavior Special Issue No 10 ------------------------------------- Plastic mechanisms, multiple timescales and lifetime adaptation --------------------------------------------------------------- Submission Deadline: 15 July 2002 http://www.cogs.susx.ac.uk/users/ezequiel/ab-cfp.html Guest Editor Editor-in-Chief ------------ --------------- Ezequiel A. Di Paolo Peter M. Todd Evolutionary and Adaptive Center for Adaptive Behavior Systems, and Cognition, School of Cognitive Max Planck Institute And Computing Sciences, for Human Development, University of Sussex, Lentzeallee 94, Brighton, BN1 9QH, UK D-14195 Berlin, Germany ezequiel at cogs.susx.ac.uk editor at adaptive-behavior.org The last few years have seen an increased interest in the design of plastic robot controllers, or controllers with inherent dynamical properties such as the interplay of multiple timescales, for the generation of highly adaptive and robust behaviour. This research area in robotics draws important inspiration from neuroscience and may be applied to the testing and generation of hypotheses on the role of plasticity in brain function.
Synthetic methods, such as evolutionary robotics, have provided a glimpse of how plastic neural mechanisms, like activity-dependent neuromodulation, that are often studied locally in reduced systems, can give rise to integrated and coordinated performance in a whole situated robot. Recent studies have included the role of modulatory processes affecting neural activation, diffusing localized neuromodulation, the evolution of rules of synaptic change, the design of neural controllers acting on fast and slow timescales, and the evolution of stabilizing mechanisms of cellular activity. These studies have successfully revealed that such mechanisms are able to introduce highly desirable properties such as robustness, adaptation to bodily perturbations, and improved evolvability. But many questions remain open, such as what is the relation between plasticity and stability, how adequate is a given mechanism for the required task, how do alternative methods of obtaining plastic behaviour relate, and to what extent is environmental regularity responsible for successful tuning of neural controllers. Adaptive Behavior solicits high quality contributions on these topics for its 2002 special issue (vol 10:3/4). Papers should describe work integrating mechanisms and adaptation at the behavioural level. They may present work using simulations or real platforms. Appropriate contributions addressing other levels of plasticity (such as sensory morphology or bodily structure) will also be considered. Papers drawing inspiration from, and contributing back to, neuroscience will be particularly appropriate. 
Topics ------ Multi-timescale controllers Activity-dependent plastic neural controllers Change and stability in robot performance Adaptation to radical perturbations Neuromodulation Re-configurable neural controllers Plastic controllers and simulation/robot transfer Submissions ----------- Authors intending to submit are also encouraged to contact the Guest Editor as soon as possible to discuss paper ideas and suitability for this issue. Submission of manuscripts should be made to the Guest Editor at the address below. ***** Submissions due: 15 July 2002 ***** Submissions should be in English, with American spelling preferred, in the style described in the Fourth Edition of the Publication Manual of the American Psychological Association, be double-spaced throughout and not normally exceed 25 journal pages (40 manuscript pages including figures, tables and references). Electronic submission in PDF format is strongly preferred. Each submission should have a title page including: the submission's title; names, postal, and email addresses of the authors; the phone and FAX number of the corresponding author; and a short running title. The second page should contain an abstract of about 150 words and up to six suggested key words. The main text should start on page 3, with acknowledgements at the end. Detailed guidelines for submission layout can be found on the ISAB web site at http://www.isab.org.uk/journal/ by following the link there labelled "Instructions to Contributors". Submit manuscripts to the Special Issue Guest Editor in PDF format by email with "Special Issue 10" in the subject line. Dr Ezequiel A. Di Paolo School of Cognitive and Computing Sciences, University of Sussex, Brighton, BN1 9QH, UK ezequiel at cogs.susx.ac.uk Tel.: +44 1273 877763 Fax.: +44 1273 671320 Adaptive Behavior ----------------- Adaptive Behavior is the premier international journal for research on adaptive behaviour in animals and autonomous artificial systems.
For over 10 years it has offered ethologists, psychologists, behavioural ecologists, computer scientists, and robotics researchers a forum for discussing new findings and comparing insights and approaches across disciplines. Adaptive Behavior explores mechanisms, organizational principles, and architectures for generating action in environments, as expressed in computational, physical, or mathematical models. Adaptive Behavior is published by Sage Publications, a leading independent publisher of behavioural journals, spanning human psychology to robotics to simulation modelling. A new editorial board, headed by Peter M. Todd of the Center for Adaptive Behavior and Cognition, is shaping the journal in novel directions. New technological infrastructure will allow faster publication turnaround, better feedback from reviewers to authors (and back again), and greater access to research results. The journal publishes articles, reviews, and short communications addressing topics including perception and motor control, learning and evolution, action selection and behavioural sequences, motivation and emotion, characterization of environments, collective and social behaviour, navigation, foraging, mate choice, and communication and signalling. From bpg at cs.stir.ac.uk Thu Mar 21 06:13:50 2002 From: bpg at cs.stir.ac.uk (B.P. Graham) Date: Thu, 21 Mar 2002 11:13:50 +0000 Subject: PhD studentship available Message-ID: <3C99C06E.4A641D9C@cs.stir.ac.uk> Dear all, A 3-year EPSRC-funded PhD studentship is available in the Neural Computing group, Department of Computing Science and Mathematics, University of Stirling, Scotland (http://www.cs.stir.ac.uk). Applications are required by May 1st, 2002. Start date is September 1st, 2002. 
Title: Compartmental modelling of developing neurons Supervisor: Dr Bruce Graham (http://www.cs.stir.ac.uk/~bpg/) Project summary: A major tool in the study of biological nervous systems is computer simulation of neuronal function using the compartmental modelling framework. Neurons and the networks they form are the outcome of a developmental process. The aim of this project is to extend the compartmental modelling framework to enable the modelling of developing neurons that are undergoing changes in shape and membrane characteristics over time. Novel numerical techniques are required to handle the accurate simulation of dynamic intracellular and extracellular environments. Simulations must handle multiple time scales, ranging from the submillisecond for electrical activity to hours for morphological change. The techniques will be incorporated into user-friendly simulation software. The resultant computational tools will allow the investigation of many aspects of nervous system development. They will be applied here in a study of the growth of a neuron's dendritic tree. The dendritic tree is the major site of input to the neuron and its morphology plays a determining role in the signal integration and processing capabilities of the neuron. The aim is to produce models that elucidate the biophysical mechanisms underlying the formation of the characteristic tree morphologies of different neuronal types. Further details of the research can be found on the project's web page (http://www.cs.stir.ac.uk/~bpg/research/neurite.html). The student will work under the supervision of Dr Bruce Graham (http://www.cs.stir.ac.uk/~bpg/), with additional supervision from Dr Arjen van Ooyen (http://www.anc.ed.ac.uk/~arjen/) at the Netherlands Institute for Brain Research in Amsterdam. Travel money is available for yearly visits to Amsterdam. 
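The compartmental framework described in the project summary above can be illustrated with a minimal sketch: a passive dendrite discretized into iso-potential compartments coupled by axial conductances, stepped forward in time with forward Euler. All parameter names and values below are generic placeholders for illustration only, not taken from the project, which would additionally need the slow morphological (growth) dynamics layered on top.

```python
import numpy as np

# Illustrative passive compartmental model (generic parameters, not from
# the project): a dendrite discretized into N iso-potential compartments
# coupled to their neighbours by axial conductances.
N = 20          # number of compartments
dt = 0.01       # time step (ms), well below the membrane time constant
C_m = 1.0       # membrane capacitance per compartment (nF)
g_leak = 0.05   # leak conductance per compartment (uS)
E_leak = -65.0  # leak reversal potential (mV)
g_axial = 0.5   # axial conductance between neighbouring compartments (uS)

V = np.full(N, E_leak)   # start every compartment at rest
I_inj = np.zeros(N)
I_inj[0] = 0.2           # steady current injected at the proximal end (nA)

for _ in range(int(50.0 / dt)):   # simulate 50 ms with forward Euler
    I_axial = np.zeros(N)
    I_axial[1:] += g_axial * (V[:-1] - V[1:])    # current from left neighbour
    I_axial[:-1] += g_axial * (V[1:] - V[:-1])   # current from right neighbour
    V = V + dt * (-g_leak * (V - E_leak) + I_axial + I_inj) / C_m

# the depolarization attenuates with distance from the injection site
assert V[0] > V[-1] > E_leak
```

Extending such a model to a developing neuron, as the project proposes, would mean letting the number of compartments, their geometry, and the membrane conductances themselves change on a much slower time scale, which is precisely what makes the numerics challenging.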
The project is ideally suited to candidates with strong numerical and computing skills and an interest in modelling biological systems, or alternatively to candidates with a biological background and significant, demonstrated numerical and computing skills. Closing date for applications is 1st May, 2002. Starting date is 1st September, 2002 (or as soon as possible thereafter). For further information, in the first instance please contact Dr Bruce Graham (b.graham at cs.stir.ac.uk). For application details, please contact: Mrs Heather Brennan Department of Computing Science and Mathematics University of Stirling Stirling FK9 4LA, Scotland, UK Tel: 01786 467460 Fax: 01786 464551 Email: heather at cs.stir.ac.uk Applications should clearly indicate that you are applying for this studentship. The required summary of your proposed research project should emphasise your reasons for applying for this particular project and the skills and methodology that you will employ. -- Dr Bruce Graham, Lecturer (b.graham at cs.stir.ac.uk) Dept. of Computing Science and Mathematics, University of Stirling, Stirling FK9 4LA phone: +44 1786 467 432 fax: +44 1786 464 551 From becker at meitner.psychology.mcmaster.ca Mon Mar 25 15:14:12 2002 From: becker at meitner.psychology.mcmaster.ca (S. Becker) Date: Mon, 25 Mar 2002 15:14:12 -0500 (EST) Subject: responses to textbook survey Message-ID: Dear connectionists, thanks to all who replied to my request to exchange info on textbooks for neural network-related courses. If you are teaching such a course, I think you will find the comments quite useful. As well, there are several pointers to web notes etc. So rather than attempt to summarize, I've included the comments verbatim, except where the sender requested that they remain confidential. cheers, Sue _____________________________________________________________________________ Date: Wed, 6 Mar 2002 21:11:26 -0500 (EST) From: "S.
Becker" Dear connectionists, This message is directed to those of you who teach a course in neural networks & related topics, or computational neuroscience or cognitive science, and would like to exchange opinions on the textbooks you are using. I'd like to know the name of your course, who takes it (undergrad vs grad, comp sci/eng vs psych/bio), what textbook you are using and what you consider to be the pros and cons of this book. I teach a course in neural computation to 3rd year undergrads, mostly CS majors and some psych/biopsych, and have used James Anderson's book An Introduction to Neural Networks a number of times. I like this book a lot -- it is the only one I know of that is truly interdisciplinary and suitable for undergraduates -- but it is in need of updating. I will post a summary of the replies I receive to connectionists. _____________________________________________________________________________ From: "Olly Downs" The Hopfield Group (formerly at Caltech, and for the past 4 years at Princeton) has taught the course 'Computational Neurobiology and Computing Networks' for many years. The principal text for the course has always been 'Introduction to the Theory of Neural Computation' by Hertz, Krogh and Palmer. It is very much oriented towards the Physics/Applied Math people in the audience, and thus more recently we have also used Dana Ballard's "An Introduction to Natural Computation", which we found to be more pedagogical in its approach, and somewhat less mathematical - which has helped span the broad audience our course has enjoyed - including 3rd-year undergrads through faculty in Biology, Psychology, Applied Math, CS and Physics. _____________________________________________________________________________ Sender: Dave_Touretzky at ammon.boltz.cs.cmu.edu Hi Sue. I teach my neural nets course using two books: Hertz, Krogh and Palmer, 1991, Introduction to the Theory of Neural Computation. Bishop, 1995, Neural Networks for Pattern Recognition.
This started out as a graduate course (in fact, it was originally taught by Geoff Hinton) but is now dual-listed as an undergrad/grad class. Those who take the grad version have to do a project in addition to the homeworks and two exams. The syllabus for my course, along with a bunch of MATLAB demos, can be found here: http://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15782-s02/ _____________________________________________________________________________ I teach a course Principles of Neural Computation to 3rd year undergrads and I am using Simon Haykin's book Neural Networks: A Comprehensive Foundation. In my course only chapters 1-5 and 9 are used. The contents of those are: Introduction, Learning processes, Single layer perceptrons, Multilayer perceptrons, Radial basis function networks and Self-organizing maps. In general the book is quite good, but the first chapter and maybe something in the second are too loose or too general to start with for undergrads. It would be better to start with more concrete examples. But I am using the book because the previous lecturer of the course used the same book and gave me his slides, and because neither he nor I have found a better book. We also have an Advanced course on Neural computing for graduate students. In that course the material is selected from the chapters of the same book which were not used in the course for undergraduates. * Kimmo Raivio, Dr. of Science in Technology | email: Kimmo.Raivio at hut.fi * Lab. of Computer and Information Science | http://www.cis.hut.fi/kimmo/ * Helsinki University of Technology | phone +358 9 4515295 * P.O.BOX 5400, (Konemiehentie 2, Espoo) | gsm +358 50 3058427 * FIN-02015 HUT, FINLAND | fax +358 9 4513277 _____________________________________________________________________________ From: Gregor Wenning I am a teaching assistant in the lab of Klaus Obermayer.
We offer a one year course in artificial neural networks, supervised and unsupervised methods, mainly for computer scientists. The main books we use are: 1) Bishop, "Neural Networks for Pattern Recognition" - a good book for basic methods; unfortunately the students do not like it 2) Haykin, "Neural Networks" - good overview, many topics and aspects, includes problems, and one can get a solutions manual as well! _____________________________________________________________________________ From: Nicol Schraudolph I've just taught Machine Learning to upper-level, mostly CS undergrads (corresponds to U.S. master's students), using Cherkassky/Mulier's "Learning from Data". That book gives a very nice presentation of the basic concepts and problems in machine learning, but has two drawbacks: 1) it is too expensive for students, and 2) it is full of errors in the details, especially in the equations. Their two-page description of EM contains no less than 5 errors! I can recommend using this book to design a machine learning course, as long as you don't try to teach directly from it. Dr. Nicol N. Schraudolph http://www.inf.ethz.ch/~schraudo/ Chair of Computational Science mobile: +41-76-585-3877 ETH Zentrum, WET D3 office: -1-632-7942 CH-8092 Zuerich, Switzerland fax: -1374 _____________________________________________________________________________ From: Roseli Aparecida Francelin Romero To teach a course in neural networks for graduate students, I have used Haykin's book "Neural Networks - A Comprehensive Foundation". This book presents many exercises in each chapter and I think it is a good book for getting the basis necessary in nn & related topics. _____________________________________________________________________________ We used Bechtel and Abrahamsen's 'Connectionism and the mind' to teach 3-4th year psych students in psychology and cognitive science (Univ. of Amsterdam and earlier also Univ. of Utrecht).
The book needs updating and a 2nd edition was promised several years ago but has never come out. So, now we are back to using a reader and I have started to write a book myself. Prof.dr. Jaap Murre Department of Psychology University of Amsterdam Roetersstraat 15 1018 WB Amsterdam _____________________________________________________________________________ I've been mostly using Bishop's book on NNs for PR (1995), for graduate students in CS, some engineers and mathematicians (and few or no psych/bio people). It is very well written, and most students appreciate it very much, despite the arduous introductory (learning theory and non-parametric statistics) chapters. One of my colleagues (Kegl) has used for the same course this year the new edition of the Duda & Hart book, which is more up-to-date, though with many minor but annoying errors. Yoshua Bengio Associate professor / Professeur agrégé Canada Research Chair in Statistical Learning Algorithms / titulaire de la chaire de recherche du Canada en algorithmes d'apprentissage statistique Département d'Informatique et Recherche Opérationnelle Université de Montréal, _____________________________________________________________________________ From: "Tascillo, Anya (A.L.)" I'm not teaching large classes, just all my coworkers, and I still hand them Maureen Caudill's "Understanding Neural Networks" spiral bound books, volumes 1 and 2. People do the exercises, and one friend turned around and taught the neural network section of his summer class after I lent him these books. _____________________________________________________________________________ From: Peter Dayan > I'd like to know the name of your course, theoretical neuroscience > who takes it (undergrad vs grad, grad students > comp sci/eng vs psych/bio), a mix of those two, more the former > what textbook you are using you guess :-) > consider to be the pros and cons of this book. The only con is the prose....
More seriously, our students take this course, plus one on machine learning, which doesn't really have an ideal book yet. I expect that a mixture of David MacKay's and Mike Jordan's forthcoming books will ultimately be used. _____________________________________________________________________________ Your message was forwarded to me... I use Principles of Neurocomputing for Science and Engineering by Ham and Kostanic. The course, taught in the ECE dept. of UNH, is attended by mostly first year grad students, but I also have two undergrads. The book is well written and comes with a large set of solved Matlab assignments which I use extensively. Andrew L. Kun Assistant Professor University of New Hampshire ECE Department, Kingsbury Hall Durham, NH 03824 voice: 603 862 4175 fax: 603 862 1832 www.ece.unh.edu/bios/andrewkun.html _____________________________________________________________________________ From: "Randall C. O'Reilly" We have compiled a list of courses using our textbook: http://psych.colorado.edu/~oreilly/cecn_teaching.html We and several others use our book for undergrad and grad level courses, and it works pretty well. Hopefully some of these other people will email you with a more objective 3rd-party perspective on the pros and cons of the book. Here's our general sense: Pros: - covers neuroscience, computation & cognition in a truly integrated fashion: units are point neurons w/ ion channel & membrane potential eq's, implement powerful learning algorithms, and are applied to a wide range of cognitive phenomena. - includes many "research grade" cognitive models -- not just a bunch of toy models. - integrated software & use of one framework throughout makes it easy on the students Cons: - not encyclopedic: generally presents one coherent view instead of a bunch of alternatives (alternatives are only briefly discussed).
- not as computationally focused as other books: emphasizes biology and cognition, but does not cover many of the more recent machine learning advances. _____________________________________________________________________________ From: Andy Barto I teach a course entitled "Computing with Artificial Neural Networks" to advanced undergraduates and a few graduate students. Students come mostly from CS, but some from engineering, psychology, and neuroscience. I have used almost all the books out there. I have used Anderson's book, finding it unique, but a bit too low in technical detail for my students. Currently I am using Haykin, second edition. I have used Bishop's book for independent studies, but it is too difficult for my undergrads. I refer a lot to Reed and Marks' "Neural Smithing", but it is too narrow for the whole course. I have not found the perfect book. Still waiting for Geoffrey to write one. I would be most interested to hear what you are able to find out. On another matter, as we revise our overall curriculum, I am not sure that the NN course will survive in its present form. It may happen that we will cover the major NN algorithms as part of a broader machine learning course sequence. We like the Duda, Hart, and Stork book, and I really like the book "The Elements of Statistical Learning" by Hastie, Tibshirani, and Friedman, although it is not completely satisfactory for a course either, being too difficult and also lacking the algorithmic point of view that I think is important for CS students. ______________________________________________________________________ I teach a graduate course out of my own book (now in the 2nd edition, D. S. Levine, Intro. to Neural and Cognitive Modeling, Erlbaum, 2000). My current class is predominantly psychology students, but I have also taught a class out of the first edition that was predominantly computer science students with a substantial minority from EE.
(The second edition is not too changed from the first in Chs. 1-5 except for updating, but Chapters 6 and 7 have several new or expanded sections). In my biased author's view, I see some pros of my book as its discussions of general modeling principles, of the history of the field, of the relations of network architectures to psychological functions, and of the approaches of Grossberg and his colleagues. (I remember you noted some of these when reviewing the first edition for an AI journal.) But perhaps more important than any of these, it is the only NN book I know of that proceeds systematically from building blocks (such as lateral inhibition and associative learning) to modeling of complex cognitive/behavioral functions such as categorization, conditioning, and decision making. This is reflected in its chapters which are organized not primarily by "schools" or "paradigms" as are many other NN introductions, but primarily by functions and secondarily by paradigms. The flow chart which illustrates this organization has been moved to the front of the book to be more apparent. The book is multidisciplinary friendly because the equations are mostly clustered at the ends of chapters, and there are neuroscience and math appendices for those who lack background in those areas. Also it is relatively accessible monetarily (the last I checked, $36 in paperback). More description of the book's distinctive features can be obtained from my web site (www.uta.edu/psychology/faculty/levine), under Books, or the Erlbaum web site (www.erlbaum.com). When I teach out of the book I supplement it with copies of the original papers being discussed (by Grossberg, Sutton-Barto, Klopf, Anderson, Rumelhart et al., et cetera). That is probably a good idea for any textbook in the field, particularly for grad courses. The cons of my book are that it is probably not the best one for those interested in detailed engineering or computational implementations or mathematical algorithms. 
My experience is that many CS or EE students found a course based on the book a useful complement to another course based on a book, or professor's notes, that have those other strengths -- although I know one EE professor doing research on vision and robotics, Haluk Ogmen of the U of Houston, who has taught from my book extensively with his students. Also, while the book does not demand specific technical prerequisites in any one area, it may or may not be the best book for undergraduates. Some undergrads have found it stylistically difficult because the discussion cuts across traditional boundaries, though many undergrads already involved in beginning research have taken my course or a roughly equivalent independent study course and felt comfortable with the book. My book has a variety of exercises, some computational and some open-ended. I have planned to write a web-based instructor's manual but that has remained on the back burner: the idea is still alive and audience interest might help push it forward. But the following exercises have been particularly valuable for the students in my classes: Ch. 2, Exercises 2 (McCulloch-Pitts), 3 (Rosenblatt perceptron: very detailed!), 8 (ADALINE); Ch. 3, 2 (outstar), 4 (Grossberg gated dipole), 6 (BAM); Ch. 4, 1 and 2 (shunting lateral inhibition); Ch. 5, 2 (Klopf), 3 (Sutton-Barto); Ch. 6, 4 (backprop T vs. C), 6 (BSB). Best, Dan Levine levine at uta.edu ______________________________________________________________________ From: Alex Smola you're welcome to have a look at my slides (linked on my homepage) for the past two courses i gave. and there'll be slides on bayesian kernel methods online really soon, too (they're, in fact, at http://mlg.anu.edu.au/~smola/summer2002). most of that is taken from bernhard's and my book http://www.learning-with-kernels.org for a free sample of the first 1/3, just go to contents and download it. and, of course, i like that book a lot ;). 
but you won't find a word on neural networks in it (except for the perceptron). Alexander J. Smola http://mlg.anu.edu.au/~smola Australian National University Alex.Smola at anu.edu.au Research School for Information Sciences and Engineering Tel: (+61) 2 6125-8652 Fax: (+61) 2 6125-8651 Canberra, ACT 0200 (#00120C) Cel: (+61) 410 457 686 ______________________________________________________________________ I use Simon Haykin's 2nd edition of Neural Networks (it doesn't cover ART). For density estimation and the Bayesian framework I use Chris Bishop's Neural Networks for Pattern Recognition (which I see as an updated version of Duda and Hart). My students are final year CS or Math and Stats undergraduates. Wael El-Deredy, PhD Cognitive Neuroscience Group Visiting Lecturer Unilever Research Port Sunlight School of Comp. and Math Sci. Bebington CH63 3JW - UK Liverpool John Moores Univ. ______________________________________________________________________ I am using in my 3rd and 4th year courses: "Foundations of Neural Networks, Fuzzy Systems and Knowledge Engineering", N. Kasabov, MIT Press, 1996, with a WWW page being updated annually: http://divcom.otago.ac.nz/infosci/kel/courses/ Nik Kasabov Professor of Information Science University of Otago New Zealand ______________________________________________________________________ This is a brief response to your broadcast query for information about neural network classes. I guess I have three experiences to contribute: At Carnegie Mellon University I taught a class in the Department of Psychology on Parallel Distributed Processing. The focus of the course was on cognitive modeling. The students were a mix of undergraduates, graduate students, and even a few visiting postdocs and faculty members.
The undergraduates included computer science students, psychology majors, and participants in an interdisciplinary cognitive science program. I used the original PDP volumes, including the handbook of exercises, but assignments were completed using the PDP++ software. Selected papers were also used as assigned readings. In terms of course credit, this class counted as a class and a half, so the students were worked pretty hard. I found the PDP volumes to still be an excellent source of readings, but I think that most students depended heavily on lectures to clarify concepts. Materials from this course are collecting dust in a corner of the web at: "http://www.cnbc.cmu.edu/~noelle/classes/PDP/". I am currently teaching two relevant classes at Vanderbilt University. The first is a graduate level seminar on Computational Modeling Methods for Cognitive Neuroscience, offered through the Department of Electrical Engineering and Computer Science. Between a third and a half of the participants are engineering students, most of whom are interested in robotics. The remainder are psychology students, with a couple of neuroscientists among them. I am following the O'Reilly and Munakata text, _Computational Explorations in Cognitive Neuroscience_, fairly closely. I am really enjoying teaching from this book, though I sense that the seminar participants feel a bit like they're trying to drink from a fire hose -- the pages are densely packed with material presented at a fairly rapid clip. Most lecture time has been spent providing guidance through this wealth of material, but some class time has been used to provide balance and counterpoint to the biases expressed by the authors. The book does not provide the usual taxonomic review of techniques, opting instead to focus on a specific set of methods which may be fruitfully integrated into a unified modeling framework.
This means that I've had to put in a bit of extra work to make sure that the students are at least briefly exposed to the range of techniques that they will encounter in the literature. Despite these issues, the book has been a very useful tool. The text's integrated computer exercises using PDP++, packaged so as to require only minimal computer skills, have proven to be especially helpful. The materials that I have prepared for this seminar are at: "http://www.vuse.vanderbilt.edu/~noelledc/classes/S02/cs396/". I am also teaching an undergraduate computer science class entitled "Project in Artificial Intelligence". My version of this course is subtitled "Fabricating Intelligent Systems With Artificial Neural Networks". The class is intended to provide seniors in computer science with an opportunity to develop and evaluate a substantial software project. These students are being introduced to neural network technology with the help of _Elements of Artificial Neural Networks_ by Mehrotra, Mohan, and Ranka. I selected this text because it appeared to present the basics of neural networks in a concise manner, it focused on engineering issues in favor of cognitive modeling issues, and it included some discussions of applications. Unfortunately, my students have, for the most part, found the text to be too concise, and they have had difficulty with some of the mathematical notation. I have had to supplement this material extensively in order to communicate central concepts to the students. I have fabricated a web site for this course, which may be found at: "http://www.vuse.vanderbilt.edu/~noelledc/classes/S02/cs269/". -- David Noelle ---- Vanderbilt University, Computer Science ------ -- noelle at acm.org -- http://people.vanderbilt.edu/~david.noelle/ -- ________________________________________________________________________ From: Chris Williams I no longer teach a NN course. For machine learning I use Tom Mitchell's book. 
This includes some NN stuff (I also add stuff on unsupervised learning). For probabilistic modelling (belief nets) I use the upcoming Jordan & Bishop text. _____________________________________________________________________________ I am teaching at UQAM (Montreal) and have the same problem. Please let me know if you get some useful answers. Alex Friedmann Neural Networks Group LEIBNIZ-IMAG Tel: (33) 4 76 57 46 58 46, avenue Felix Viallet Fax: (33) 4 76 57 46 02 38000 Grenoble (France) Email: Alex.Friedmann at imag.fr _____________________________________________________________________________ I have taught a comp neuro course several times. I have used the Koch & Segev book supplemented with lots of guest lectures. Recently I used Dayan & Abbott and will probably do so again since it is a pretty comprehensive book. My syllabus can be found in various formats on my web page http://www.pitt.edu/~phase Bard Ermentrout _____________________________________________________________________________ With two colleagues I teach a course called "Connectionist Modeling in Psychology" at the University of Amsterdam. It is a course at undergraduate level, but Dutch degrees do not translate neatly to American ones. When in three years we switch to an Anglo-Saxon bachelor-master system, it will probably be part of the master's curriculum. As the name suggests, it is a course within the field of psychology. Besides psychology students, some biology and AI majors also take our course. We have used a book by Bechtel and Abrahamsen, called "Connectionism and the mind". This book, from the early nineties, is mainly targeted at cognitive science: cognitive psychology, philosophy. It is not a good book. Students complain that it is unorganized, and it misses out on many important topics while rambling on about symbolism vs. connectionism, a debate that was important in the eighties but is not a very lively discussion anymore.
Moreover, it has gone out of print (the second edition promised years ago still hasn't appeared), so we had to give students the opportunity to photocopy the book. For all these reasons, we decided not to use the book this year, but instead to make a reader. We are now collecting articles to include in it (the introductory chapters will be written by one of us, Jaap Murre). To give our students hands-on experience, we have also made exercises. These are available at www.neuromod.org/connectionism2002/opdrachten/, but alas most of the explanations are in Dutch. I hope that you get a lot of reactions. If you do get many reactions, would it perhaps be possible for you to share the information somehow? Perhaps by putting the reactions on the web or sending a mail to Connectionists? I would be very interested in what books others have used. Thank you in advance. Sincerely, Martijn Meeter _____________________________________________________________________________ I give a course simply called "Neural Networks" for undergraduate computer science students (3rd year). The book is S. Haykin, Neural Networks: A Comprehensive Foundation, Second edition, Prentice Hall, 1999. ISBN 0-13-273350-1. The book is indeed comprehensive, but in my opinion not an ideal book for computer science students. More so for engineering students, I think. The signal-theoretic viewpoint on neural networks is lost on computer science students, who would prefer a more computational view. But I still use the book, since it is so comprehensive. The book is both deeper and wider than the course content, which makes it possible for the students to use the same book as a reference book when they want to dig deeper or broaden their view. This is useful since the course ends with a project-like assignment, defined by the students themselves. Different students dig in different places, but most of them can use the same source. ----- If God is real, why did he create discontinuous functions? ----- Olle Gällmo, Dept.
of Computer Systems, Uppsala University Snail Mail: Box 325, SE-751 05 Uppsala, Sweden. Tel: +46 18 471 10 09 URL: http://www.docs.uu.se/~crwth Fax: +46 18 55 02 25 Email: crwth at DoCS.UU.SE _____________________________________________________________________________ Germán Mato and I are giving a course on biologically oriented neural networks for advanced undergraduate students in physics and engineering. This is the material, and the books we include:

1. An introduction to the nervous system, neurons and synapses: a snapshot from "Essentials of Neural Science and Behavior", Kandel, Schwartz and Jessell. 2 classes.

2. Single cell dynamics: 3 classes. We take the theory of dynamical systems and excitable media in "Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry, and Engineering", S. Strogatz, and then adapt it to several neuronal models (Hodgkin-Huxley, integrate and fire, FitzHugh-Nagumo). We study the stable states, their stability, phase transitions, and the emergence of cycles.

3. Dynamic behavior of a large ensemble of neurons: We use chapters 3 and 5 of Kuramoto's book "Chemical Oscillations, Waves and Turbulence" to introduce the phase models and study the conditions for synchronization. We exemplify with the integrate and fire and the Hodgkin-Huxley models. 3 classes.

4. Associative memories: "Introduction to the Theory of Neural Computation", Hertz, Krogh and Palmer. 8 classes.

5. Information theoretical analysis of neural coding and decoding: Cover and Thomas, to give the basic definitions. Two further books, "Neural Networks and Brain Function", Rolls and Treves, and "Spikes", Rieke et al., provide us with examples in the nervous system. We study the problem of decoding the neural signal, how to overcome limited sampling, rate coding, the role of correlations and time dependent codes, and how to calculate the information transmitted in very short time windows.
This unit is not actually one where we analyze how to build or model a neural network, but rather how to extract information from it. 10 classes.

6. Modeling cortical maps: Hertz's "Introduction to the Theory of Neural Computation" gives us the basics of unsupervised learning. In Haykin's book "Neural Networks" we find an information theoretical description of the Linsker effect and of self-organized feature maps. In "Neural Networks and Brain Function", Rolls and Treves give examples of this in the brain. 6 classes.

Ines Samengo samengo at cab.cnea.gov.ar http://www.cab.cnea.gov.ar/users/samengo/samengo.html tel: +54 2944 445100 - fax: +54 2944 445299 Centro Atomico Bariloche (8.400) San Carlos de Bariloche Rio Negro, Argentina _______________________________________________________________________ From: Geoff Goodhill What a great question! I have twice so far at Georgetown taught a "computational neuroscience" course, an elective course for graduate students in the Interdisciplinary Program in Neuroscience (backgrounds: bio/psych). The class has been around 4-5 students each time, including one physics undergrad in each case. I used mostly papers, plus a few chapters from Churchland and Sejnowski. Overall I found the experience very unsatisfying, since none of the students could deal with any math (not even what I would regard as high school math). This year I am instead organizing an informal (no credit) reading group to work through the new Dayan and Abbott book. I think this is excellent: broad, sophisticated, though certainly not for anyone without a very solid math background. I think next year I will offer a formal course again (fortunately it's entirely up to me if and when I teach it) based around this book, and basically tell prospective students that if they can't understand the math then don't bother coming.
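Several of the courses above, notably the single cell dynamics unit in the Mato/Samengo outline, center on the integrate-and-fire model's fixed points and cycles. As a purely illustrative sketch, with generic parameter values not taken from any of the courses, a leaky integrate-and-fire neuron can be simulated in a few lines:

```python
# Illustrative leaky integrate-and-fire neuron (generic parameters):
# tau * dV/dt = -(V - E_L) + R * I, with a reset whenever V crosses threshold.
tau, E_L, R = 10.0, -65.0, 10.0   # time constant (ms), rest (mV), resistance (MOhm)
V_th, V_reset = -50.0, -65.0      # spike threshold and reset voltage (mV)
dt, T = 0.1, 200.0                # time step and total duration (ms)
I = 2.0                           # constant injected current (nA)

V = E_L
spike_times = []
for step in range(int(T / dt)):   # forward Euler integration
    V += dt * (-(V - E_L) + R * I) / tau
    if V >= V_th:
        spike_times.append(step * dt)
        V = V_reset

# since R*I = 20 mV exceeds V_th - E_L = 15 mV, the neuron fires repetitively
assert len(spike_times) > 1
```

The fixed point E_L + R*I lies above threshold here, so the voltage cycles between reset and threshold; dropping I below 1.5 nA leaves the neuron silent, which is exactly the kind of stability-versus-cycle analysis such a course unit describes.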
_______________________________________________________________________
From: Stevo Bozinovski

I am teaching Artificial Intelligence at the Computer Science department, South Carolina State University. It is an undergraduate level (CS 323) course. I use Pfeifer's book "Understanding Intelligence", in which a whole chapter is dedicated to neural networks. The book covers the multilayer perceptron type networks sufficiently; the adaptive neural arrays are not covered, so I use my book "Consequence Driven Systems" to cover them.
_______________________________________________________________________
From: Lalit Gupta

At SIU-C, we offer a graduate level Neural Networks course in the Dept of Electrical & Computer Engineering. The text currently being used is: Neural Networks, Second Edition, Simon Haykin, Prentice Hall, 1999. It covers many aspects of neural networks. It is a relatively difficult text for students to read directly without instruction, and it is suitable for both engineers and scientists. Information about the text is available at: http://www.amazon.com/exec/obidos/tg/stores/detail/-/books/0132733501/glance/ref=pm_dp_ln_b_1/103-6340812-8312600
_______________________________________________________________________
From: Todd Troyer

I have been teaching a one semester graduate level survey course entitled Computational Neuroscience for the last two years. The course is part of an optional core sequence of courses in Maryland's Neuroscience and Cognitive Science interdisciplinary Ph.D. program. The course is designed to be taken by students spanning the range of neuroscience and cognitive science - everyone from engineers and computer scientists to biologists and psychologists who have only a rudimentary math background. I too was frustrated with the lack of available textbooks and so have spent a lot of time writing extensive "class notes" in semi-book form. At this point things are obviously incomplete.
I'm also spending some time developing computer simulations, and I hope to have a "lab" component of the course next fall. Both things take huge amounts of time. The topic areas I focus on have extensive overlap with the topics presented in the new Dayan and Abbott book, but I spend more time on the linear algebra/network material. Overall the course is at a more basic level than Dayan and Abbott; I feel that their book ramps up the math too fast for many of my students. Next fall I will also require Churchland and Sejnowski's Computational Brain. I found that in focusing on the presentation of the technical material, I was having a hard time organizing the presentation of the biological examples. The Computational Brain, while getting a bit dated, does a wonderful job at this (with the usual minor qualms). I also recommend the Rieke et al. Spikes book for my more advanced students. I've looked at some other books as well. I like the Anderson book, but it doesn't focus on the material that I'd like. I think Randy O'Reilly's new book makes an interesting attempt to present a connectionist perspective on cognitive neuroscience, although I haven't had a chance to look through it in detail. Overall, I've found that the key hinge point in thinking about such a class is computational skills. It is much easier either to assume a reasonably sophisticated math background or not to delve into the math much at all. In the first case you can use formal language to talk about how the equations relate to the underlying ideas, but the course excludes students with noncomputational backgrounds. Without assuming such a background you can talk about some of the underlying concepts, but such a presentation tends to be vague and often either overstates what computation adds or seems obvious and not interesting.
While I've had mixed success so far, I believe that you can have a unified class where you teach enough math to bring the students that need it up to speed, at least to the degree that all students can begin to explore the *relationship* between the math and the neuroscience. I've been trying to set things up so that the students with a stronger background can work on self-directed outside reading and/or projects during some of the basic math sessions. It's a stretch. Finally, I'm thinking about developing the lab component into a joint lab/lecture course that I could teach to advanced undergraduates, again from a range of backgrounds. Todd W. Troyer Ph: 301-405-9971 Dept. of Psychology FAX: 301-314-9566 Program in Neuroscience ttroyer at glue.umd.edu and Cognitive Science www.glue.umd.edu/~ttroyer University of Maryland College Park, MD 20742
____________________________________________________________________________
I teach a course, Computational Neurobiology, to graduate students in the school of computational sciences. Most students are computational neuroscience students, though some engineers, math majors, and psychologists have taken it. I use Johnston and Wu, Foundations of Cellular Neurophysiology. It is fabulous, because it has all the equations describing cell properties, i.e., the cable equation, the Hodgkin-Huxley equations, etc. It is probably a bit advanced for undergrads, and even the psych majors have trouble if they don't have a strong math background (which they don't at GMU). The only thing against the book is the lack of good figures, such as those found in From Neuron to Brain, by Nicholls et al. But I supplement my lectures from that book, since the students have not had a basic non-computational neuroscience course. I also teach a course, Computational Neuroscience Systems. However, I haven't found a good textbook.
So I use Kandel and Schwartz, or Fundamental Neuroscience, for the basic neuroscience systems, and supplement each lecture with one or two modeling papers on the system we discuss. I'm still hoping to find a book that presents different methods of modeling systems of neurons, e.g., a chapter or two on (1) using GENESIS and NEURON to develop large-scale detailed models, (2) a linear systems approach, (3) a neural network approach, (4) integrate-and-fire neuron networks, (5) spike train analysis, and (6) higher level abstract models, such as the Rescorla-Wagner model. If you know of any like this, please let me know. Avrama Blackwell
____________________________________________________________________________
I teach an undergrad course on "Information processing models." It's mainly psych and neuro students, and a handful of graduate students. Most have little or no math or programming background. I have not found any one book that's good for this level. I have written up some of my own lecture notes and mini-tutorials, which you can get at http://redwood.ucdavis.edu/bruno/psc128 I am also teaching a graduate course this spring on computational neuroscience in which I will use Peter and Larry's book for the first time. I offer this course only once every two years though. Bruno A. Olshausen Phone: (530) 757-8749 Center for Neuroscience Fax: (530) 757-8827 UC Davis Email: baolshausen at ucdavis.edu 1544 Newton Ct.
WWW: http://redwood.ucdavis.edu/bruno Davis, CA 95616
_____________________________________________________________________________
From stefan.wermter at sunderland.ac.uk Mon Mar 25 12:36:44 2002 From: stefan.wermter at sunderland.ac.uk (Stefan Wermter) Date: Mon, 25 Mar 2002 17:36:44 +0000 Subject: Positions Cognitive Neuro Robotics for Language Learning Message-ID: <3C9F602C.8C67F20C@sunderland.ac.uk>

Research Scientist, Cognitive Neuro-Robotics (MirrorBot Project), 16,905 - 25,793, Ref No: CETR45/01
Research Associate, Cognitive Neuro-Robotics (MirrorBot Project), 11,562 - 25,793, Ref No: CETR46/01

Biomimetic Multimodal Learning in a Mirror Neuron-based Robot. EU Fundamental Emerging Technology project, planned project start 1.6.2002, duration 3 years.

The Hybrid Intelligent Systems group ( http://www.his.sunderland.ac.uk ) within the area of Computing of the University of Sunderland is looking for two researchers in the new area of neuroscience-inspired computing on cognitive robots. This project will be in collaboration with research groups in cognitive neuroscience and neuroinformatics in Sunderland, Cambridge, Parma, Nancy and Ulm. The computational experiments are planned to be performed on a parallel neural high performance computer, which should act as the "computational brain" for the robots. The researchers will play a key role in the design, development, programming and testing of a neural system on a mirror-neuron-based robot integrating vision and language for actions, based on novel neural networks. It is expected that both researchers will also contribute to the coordination of the project at different levels. The Research Scientist position is for an experienced researcher with broad skills and expertise in computing, neural networks, computational neuroscience, and intelligent systems, and a higher degree in Computing (PhD desirable). Knowledge in areas like vision, language, robotics, neuroscience and project management is an advantage.
The Research Associate position is for a researcher with expertise in computing and neural networks, and a higher degree in Computing, at least a Masters. Experience in supporting research projects (website, project reports, etc.) will be an advantage for the Research Associate. Closing date: 30th April 2002. For more information and discussion regarding either of these posts, please contact Professor Stefan Wermter, email stefan.wermter at sunderland.ac.uk. To apply, please submit your CV along with a letter of application and details of current salary, quoting vacancy title and reference number, to the Personnel Department, University of Sunderland, Langham Tower, Ryhope Road, Sunderland, SR2 7EE or e-mail employee.recruitment at sunderland.ac.uk
***************************************
Professor Stefan Wermter Chair for Intelligent Systems University of Sunderland Centre of Informatics, SCET St Peters Way Sunderland SR6 0DD United Kingdom phone: +44 191 515 3279 fax: +44 191 515 3553 email: stefan.wermter at sunderland.ac.uk http://www.his.sunderland.ac.uk/~cs0stw/ http://www.his.sunderland.ac.uk/
****************************************
From marina at cs.rhul.ac.uk Tue Mar 26 03:21:23 2002 From: marina at cs.rhul.ac.uk (marina d'Engelbronner) Date: Tue, 26 Mar 2002 08:21:23 -0000 Subject: vacancies for research assistants Message-ID: <003901c1d49f$37bdd1c0$7ebcdb86@VENICE>

ROYAL HOLLOWAY University of London
THREE RESEARCH ASSISTANTS IN KERNEL-BASED METHODS
Department of Computer Science

Royal Holloway, University of London invites applications for three research assistant positions in Computer Science. The first post is funded for two years, while the other two are funded for three years. The starting date for the first project is flexible, but the other vacancies are from April 2002. The first vacancy is for an EPSRC funded project entitled 'Delimiting Kernel-based Learning Methods'.
It spans work on the learning-theoretic analysis of kernel-based methods, kernel-theoretic work, and algorithmic design. A background in the theoretical analysis of learning methods is very desirable for this position. Salary is up to £24,656 inclusive of London Allowance. Ref: KB/2053. The two remaining vacancies are for a project that involves developing kernel-based methods for the analysis of images, Learning for Adaptable Visual Assistants. The project is financed by the EU and also involves partners in Cambridge (Xerox), France (INRIA, Grenoble and University of Rennes), Austria (Technical University of Graz), Switzerland (IDIAP) and Sweden (University of Lund). We are seeking researchers with experience in applications of machine learning, with a preference for knowledge of image processing, and a strong programming background. Experience with kernel methods is desirable but not required. Salary is in the range £20,865 to £28,625 per annum inclusive of London Allowance. Ref: KB/2052. Please contact John Shawe-Taylor by email at jst at cs.rhul.ac.uk for more information. Information on the Department may be found at www.cs.rhbnc.ac.uk/. Further details and an application form can be obtained from The Personnel Office, Royal Holloway, University of London, Egham, Surrey TW20 0EX; fax: 01784 473527; tel: 01784 414241; email Sue.Clarke at rhul.ac.uk Please quote the appropriate reference. The closing date for receipt of applications is 17th April 2002. We positively welcome applications from all sections of the community.

ROYAL HOLLOWAY University of London Department of Computer Science
Research Assistant - Kernel-Based Methods: Information for Candidates

Royal Holloway, University of London invites applications for three research assistant positions in Computer Science. The first post is funded for two years, while the other two are funded for three years.
The starting date for the first project is flexible, but the other vacancies are from April 2002.

The College
Royal Holloway is dedicated to achieving the highest international standards in teaching and research in the Sciences, Social Sciences, Humanities and Creative Arts. The College exists to promote education and scholarship for the public benefit. By constantly reinterpreting the ideals and principles of a university through undergraduate teaching, advanced teaching and research, we carry forward the vision of our founders in the changing context of the society we serve. Royal Holloway is one of the eight large Colleges of the University of London. It has over 5,400 students in the Faculties of Arts, History and Social Sciences, and Science. There are 20 academic departments and almost 1,200 staff, including 370 academic teaching staff. As well as strong links within the University of London, the College has forged many successful national and international collaborations. There is a strong commitment to excellence in both teaching and research, and in the 1996 Research Assessment Exercise, 8 departments were rated 5* or 5 and none below 3. The College occupies a large attractive campus at Egham, Surrey, situated in the green belt near Runnymede and Windsor Great Park, with good communications to and from London. Egham is 35 minutes by train from Waterloo, and the College is one mile from the M25 and 15 minutes' drive from Heathrow Airport. For further information about the College see the website http://www.rhbnc.ac.uk.

The Department
The Department of Computer Science and Computer Learning Research Centre occupy the modern purpose-designed McCrea Building, which accommodates staff offices, computing laboratories and terminal rooms, providing the latest facilities for research and undergraduate teaching.
The Department has 15 established academic posts in Computer Science, expanding to 18 this year, and also employs 10 Research Assistants, 3 technical support staff and 4 administrative/secretarial staff. The Department was given a rating of 5 in the last (2001) Research Assessment Exercise. The Department's teaching load at present is around 170 Full Time Equivalent (FTE) undergraduate students taking part in both Single and Joint Honours degree programmes, and around 35 postgraduate FTEs, approximately half taught and half research. The main taught postgraduate programmes are the MSc in Computer Science by Research, which offers intensive training in any one of the Department's research specialties, and the MSc in Business Information Systems, taught jointly with the School of Management. The Department is well equipped for teaching and research. It provides networked computer facilities accessed from desktop X-terminals. The servers include powerful DEC Alpha computers running Digital Unix, a Sun Sparc running Solaris, and an NT server running various office applications such as Microsoft Office. There is fast Internet access (multi-megabit per second), and a wide range of specialist research software, as well as laser printers and photocopiers. All staff and research postgraduates who do not have equipment provided under personal research grants are provided with an X terminal. This gives them access to both Unix services and an NT server. The Department holds weekly research seminars and invited lectures throughout the academic year, as well as a Distinguished Lecture series - recent speakers include Professors Donald Davies, Roger Penrose, Frank Sumner, Maurice Wilkes and Tony Hoare. The department celebrated its 30th anniversary in 1998. The department's website is http://www.dcs.rhbnc.ac.uk

COMPUTER SCIENCE STAFF - RESEARCH INTERESTS

A. Gammerman, BSc, PhD St Petersburg (Head of Department): Algorithmic randomness, Kolmogorov complexity, induction and transduction, applications.
A. Chervonenkis, BSc, PhD Moscow: Mathematical statistics, pattern recognition, learning theory, ore deposit modelling.
D.A. Cohen, BA, DPhil Oxon: Mathematics within Computer Science, theory of neural networks, constraint satisfaction problems.
A.R. Davies, BSc, MSc Lond (Deputy Head of Department): Computer modelling of large scale scientific problems, laser optics, wave guides.
A. Fukshansky, Dipl Math, Dr rer nat Freiburg: Mathematics in computer science, bioinformatics/biomathematics.
Z.G. Gutin, BSc Gomel, PhD Tel Aviv: Graph theory and algorithms, combinatorial optimisation, linear and integer programming, bioinformatics.
J.M. Hancock, BSc London, PhD Edinburgh: Bioinformatics, molecular evolution, genome analysis, repetitive sequences.
E.I. Hogger, BSc, MSc Lond, BA Open, ARCS, DIC: Machine learning and its relationship to non-monotonic logic, intelligent knowledge-based systems, expert systems.
A.I.C. Johnstone, BSc, PhD Lond, CEng, MBCS, MIEE: Multiprocessor systems for real-time machine vision, language design for multiprocessor and array processor systems, VLSI implementation of image processing algorithms, advanced processors.
C. Saunders, BSc, PhD Lond: Machine learning algorithms, kernel methods, transductive inference, fault diagnosis, text analysis.
S.A. Schneider, BA, DPhil Oxon: Process algebra, concurrency theory, real-time systems, formal methods, computer security.
E.A. Scott, BSc, DPhil Oxon: Theoretical computer science, compiler theory, language analysis and design, termination theory, automated theorem proving, machine learning.
J.S. Shawe-Taylor, PhD, MSc Lond, DIC, CMath, FIMA: Computational learning theory, the mathematical analysis of kernel-based learning methods.
H.E. Treharne, BSc, MSc, PhD Lond: Safety critical software, formal methods, software metrics.
V.N. Vapnik, BSc, PhD, DSc Moscow: Pattern recognition, statistical analysis, support vector machines.
V. Vovk, BSc, PhD Moscow: Limits of machine learning: predictive complexity, randomness and information; inductive and transductive inference.
C. Watkins, MA, PhD Cantab: Reinforcement learning, computational learning theory, mathematical finance.

RESEARCH

COMPUTATIONAL LEARNING RESEARCH CENTRE
The Centre was established in January 1998 to provide a focus for fundamental research and commercial and industrial applications in the fields of computer learning, including inductive/transductive inference and universal prediction. The current research topics are universal prediction, the Support Vector method, probabilistic reasoning, the theory of Kolmogorov and predictive complexity, on-line prediction with expert advice, transductive inference and computational finance. Members of the Centre are Alex Gammerman (Director), Alexey Chervonenkis, Craig Saunders, Vladimir Vapnik, Volodya Vovk and Chris Watkins, with their Research Assistants and PhD students; Visiting Professors are Jorma Rissanen, Chris Wallace, Glenn Shafer, Leonid Levin, Ray Solomonoff and Vladimir V'yugin.

THEORETICAL COMPUTING GROUP
The group is concerned with the modelling of computing systems and application areas in order to derive principled and practically effective solution strategies. Such modelling ensures that the development of solutions is guided by a scientific and well-founded analysis, which can provide practical guidance on the scaling and extent of their applicability. The research strands include:

NEURAL AND COMPUTATIONAL LEARNING
Members of this area work closely with the CLRC, and their research is centred around the analysis of neural networks, including work on a novel digital neural chip, large margin algorithms and analysis, and relations between the Bayesian and probably approximately correct (pac) models.
The research group has collaborated in ESPRIT funded research in the mobile telecoms area and currently co-ordinates an ESPRIT funded Working Group, NeuroCOLT2. More recently they have received funding for an EU project, 'Kernel Methods for Images and Text' (KerMIT), involving Reuters, Xerox and three other university sites.

CONSTRAINTS
Constraint satisfaction problems have been studied in the Department since 1989, supported by the EPSRC, the DTI, the Royal Society, the British Council, and the Nuffield Foundation. The group is currently collaborating with the government Radiocommunications Agency, and with Vodafone Ltd, who support a PhD student. Theoretical research is currently focused on developing the mathematical methods needed to classify different types of constraints. A new approach to computational complexity theory, which makes use of tools from algebra and logic, is being investigated. On the applications side, radio frequency planning, particularly in regard to the growth of mobile telecommunications, and large scale scheduling problems, such as the planning of manufacturing processes and the design of airline timetables, are of particular interest.

FORMAL METHODS
The main interests of the Formal Methods group concern the theory and application of formal methods to security- and safety-critical systems, with particular focus on the areas of concurrency and combining methods. The group has an international reputation in this field and is substantially supported by external funding. The group maintains active links with industry, including Motorola, the Defence Evaluation and Research Agency, and SRI.

LANGUAGES AND ARCHITECTURES
Research in the area of language translation and compiler theory aims to provide sound theory linked to approachable toolsets for work in software/hardware co-design, language implementation and reverse engineering.
We have made contributions in the areas of general parsing; semi-automatic construction of intermediate forms and control flow graphs; control flow analysis for environments with poorly defined notions of procedure call; reverse compilation from assembler source to high level languages; mixed implementation of conventional processors, Digital Signal Processors and Field Programmable Gate Arrays; and the development of toolsets for non-specialist users.

BIOINFORMATICS
The volume and variety of data coming out of current research in molecular biology contain the answers to many scientific questions and the keys to many medical advances. To answer the new types of question that are being asked, new computational techniques are needed, and machine learning methods are now becoming standard in computational biology. Techniques developed in the Department are now being applied to the analysis of biological sequences. In particular, support vector machine kernels suitable for the comparison and classification of protein and DNA sequences are being developed with a view to providing predictions of the structure and function of proteins from their sequences.

The Post

Delimiting Kernel-based Learning Methods: Assisting Professor John Shawe-Taylor with the above research project, the aim of which is to analyse the limitations of kernel-based learning methods.

Learning for Adaptable Visual Assistants: Assisting Professor John Shawe-Taylor with the above research project, the aim of which is to develop kernel-based methods for the analysis of images in collaboration with the other partners on the project.

General: Any other duties or responsibilities as the department may reasonably require. Opportunity to assist with undergraduate lectures and tutorials, and assistance with postgraduate supervision.

The Person
Candidates for all three positions will preferably have a doctorate in Computer Science or Mathematics.
They will be expected to be able to demonstrate significant research accomplishment in a relevant field, including published work in high quality journals. Experience with kernel methods is desirable.

Enquiries
Please contact John Shawe-Taylor by e-mail at jst at cs.rhul.ac.uk for more information.

The Appointment
The first vacancy is for an EPSRC funded project entitled 'Delimiting Kernel-based Learning Methods'. It spans work on the learning-theoretic analysis of kernel-based methods, kernel-theoretic work, and algorithmic design. A background in the theoretical analysis of learning methods is very desirable for this position. Salary is up to £24,656 inclusive of London Allowance. Ref: KB/2053. The two remaining vacancies are for a project that involves developing kernel-based methods for the analysis of images, Learning for Adaptable Visual Assistants. The project is financed by the EU and also involves partners in Cambridge (Xerox), France (INRIA, Grenoble and University of Rennes), Austria (Technical University of Graz), Switzerland (IDIAP) and Sweden (University of Lund). We are seeking researchers with experience in applications of machine learning, with a preference for knowledge of image processing, and a strong programming background. Experience with kernel methods is desirable but not required. Salary is in the range £20,865 to £28,625 per annum inclusive of London Allowance. Ref: KB/2052. The appointments will be made subject to 12 months' probation. There is also an annual appraisal scheme. Salaries will be paid monthly, in arrears, by credit transfer into your bank account. Payment will usually be made on the 27th of the month, or before if the 27th falls on a weekend or a Public or College holiday. The normal date for the review of salaries is 1 April. Where a salary increment is due it will be payable on 1 August in each calendar year.
Where a member of staff is appointed between 1 February and 31 July inclusive, the first increment will be payable on 1 August in the following calendar year. This post is superannuable under the USS (Universities Superannuation Scheme). The College currently contributes 14% of salary and individuals contribute 6.35%. 27 days' annual leave are attached to the post. In addition you will receive the 6 discretionary days which are granted and shared between Easter and Christmas, when the College is closed, and public holidays.

Applications
Applications should be made using the standard application form obtainable from the Personnel Department (address below), to which should be appended: (i) a full curriculum vitae; (ii) the names and addresses of two or more academic referees; (iii) a statement of current research activities and areas of interest; (iv) a list of publications. These should be sent by 17th April 2002 to Personnel, Royal Holloway, University of London, Egham, Surrey TW20 0EX, UK (Tel: 01784 414241; Fax: 01784 473527; E-mail: s.watson at rhbnc.ac.uk).

Extract from equal opportunities policy
The only consideration in recruitment of employees will be how the genuine requirements of the post are met or likely to be met by the individual under consideration. These requirements, including retirement at the appropriate age, being met, no regard will be taken (except where legally required) of that person's race, sex, age, marital status, number of children, physical disability or beliefs or lawful preferences, privately held, on any matter including religion, politics and sex.

Information for Applicants with Special Needs
Royal Holloway encourages and welcomes applications from people with disabilities and special needs. However, it is important to note that the Campus is built on a hillside and that many buildings are of older construction and are not fully accessible.
If you have a disability or a special need and wish to discuss any practical aspects of your application, please contact the Personnel Office in confidence on +44 (0) 1784 414058. General In an effort to provide a healthy and comfortable working environment, smoking is prohibited in public areas and in shared occupancy rooms. Full details of the Smoking Policy are available from the College Safety Officer. Staff have full use of the dining facilities on campus; there is also a College Shop which offers a wide range of goods. A National Westminster bank is on site together with a Waterstones bookshop. There is an independently run Nursery (for which a charge is made) located on the College campus for children over the age of two years and up to the age of five years. Designated areas of car parking are allocated to staff. Permits are required to allow parking on campus. A bus service is in operation from Egham Railway Station and may be used by staff in addition to students. The service operates in conjunction with trains arriving from Waterloo and Reading. Tickets for this bus service must be bought in advance and are available from the College Shop, Athlone Hall and Kingswood Hall. This bus service operates during term-time only. Internal promotion is encouraged by the college and all vacancies are advertised on internal noticeboards and circulated to departments. From wahba at stat.wisc.edu Tue Mar 26 13:35:07 2002 From: wahba at stat.wisc.edu (Grace Wahba) Date: Tue, 26 Mar 2002 12:35:07 -0600 (CST) Subject: responses to textbook survey Message-ID: <200203261835.MAA11615@hera.stat.wisc.edu> I teach a graduate level course in the Statistics Department on Statistical Modeling based on solving optimization problems in a Reproducing Kernel Hilbert Space (now called `kernel methods'). It attracts occasional graduate students from CS and Engineering. I use my book `Spline Models for Observational Data' (1990). 
This fall I will probably include parts of Smola and Schoelkopf's new book on SVMs, Hastie, Tibshirani and Friedman on statistical learning, and Chong Gu's new book on smoothing spline ANOVA.

From cindy at bu.edu Tue Mar 26 14:22:56 2002 From: cindy at bu.edu (Cynthia Bradford) Date: Tue, 26 Mar 2002 14:22:56 -0500 Subject: Neural Networks 15(2) Message-ID: <200203261922.g2QJMui01212@cns-pc75.bu.edu>

NEURAL NETWORKS 15(2) Contents - Volume 15, Number 2 - 2002
------------------------------------------------------------------
INVITED ARTICLE:
Synapses as dynamic memory buffers - Wolfgang Maass and Henry Markram

CONTRIBUTED ARTICLES:

***** Psychology and Cognitive Science *****
Learning the parts of objects by auto-association - Xijin Ge and Shuichi Iwata

***** Neuroscience and Neuropsychology *****
Prospective control of manual interceptive actions: Comparative simulations of extant and new model constructs - Joost C. Dessing, Daniel Bullock, C. (Lieke) E. Peper, and Peter J. Beek
Temporal dynamics of binocular disparity processing with corticogeniculate interactions - Stephen Grossberg and Alexander Grunewald
A mathematical analysis of the development of oriented receptive fields in Linsker's model - Tadashi Yamazaki

***** Mathematical and Computational Analysis *****
A general framework for neural network models on censored survival data - Elia Biganzoli, Patrizia Boracchi, and Ettore Marubini
MCMAC-CVT: A novel on-line associative memory based CVT transmission control system - K.K. Ang, C. Quek, and A. Wahab
A methodology to explain neural network classification - Raphael Feraud and Fabrice Clerot

***** Engineering and Design *****
A neuro-fuzzy framework for inferencing - Sukumar Chakraborty, Kuhu Pal, and Nikhil R. Pal
Nonlinear Fisher discriminant analysis using a minimum squared error cost function and the orthogonal least squares algorithm - Steve A. Billings and Kian L. Lee

***** Technology and Applications *****
Solving large scale traveling salesman problems by chaotic neurodynamics - Mikio Hasegawa, Tohru Ikeguchi, and Kazuyuki Aihara

LETTERS TO THE EDITOR
Comments for Leung, Wong, Sum, and Chan (2001) - Pierre van de Laar
Response to comments for Leung, Wong, Sum, and Chan (2001) - C.-S. Leung

CURRENT EVENTS
------------------------------------------------------------------
Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp
------------------------------
INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment.
----------------------------------------------------------------------------
Membership Type     INNS               ENNS                JNNS
----------------------------------------------------------------------------
membership with     $80 (regular)      SEK 660 (regular)   Y 13,000 (regular)
Neural Networks                                            (plus 2,000 enrollment fee)
                    $20 (student)      SEK 460 (student)   Y 11,000 (student)
                                                           (plus 2,000 enrollment fee)
-----------------------------------------------------------------------------
membership without  $30                SEK 200             not available to
Neural Networks                                            non-students (subscribe
                                                           through another society)
                                                           Y 5,000 (student)
                                                           (plus 2,000 enrollment fee)
-----------------------------------------------------------------------------

Name:    _____________________________________
Title:   _____________________________________
Address: _____________________________________
         _____________________________________
         _____________________________________
Phone:   _____________________________________
Fax:     _____________________________________
Email:   _____________________________________

Payment: [ ] Check or money order enclosed, payable to INNS or ENNS
         OR
         [ ] Charge my VISA or MasterCard
             card number ____________________________
             expiration date ________________________

INNS Membership
19 Mantua Road
Mount Royal NJ 08061 USA
856 423 0162 (phone)
856 423 3420 (fax)
innshq at talley.com
http://www.inns.org

ENNS Membership
University of Skovde
P.O.
Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.his.se/ida/enns JNNS Membership c/o Professor Takashi Nagano Faculty of Engineering Hosei University 3-7-2, Kajinocho, Koganei-shi Tokyo 184-8584 Japan 81 42 387 6350 (phone and fax) jnns at k.hosei.ac.jp http://jnns.inf.eng.tamagawa.ac.jp/home-j.html ----------------------------------------------------------------- From T.J.Prescott at sheffield.ac.uk Wed Mar 27 08:04:29 2002 From: T.J.Prescott at sheffield.ac.uk (Tony Prescott) Date: Wed, 27 Mar 2002 13:04:29 +0000 Subject: Robotics as Theoretical Biology Workshop Message-ID: (apologies for repeat postings) FIRST CALL FOR PARTICIPATION WORKSHOP ON ROBOTICS AS THEORETICAL BIOLOGY AUGUST 10TH, 2002, EDINBURGH, SCOTLAND. http://www.shef.ac.uk/~abrg/sab02/index.shtml PART OF SAB '02: THE 7TH MEETING OF THE INTERNATIONAL SOCIETY FOR SIMULATION OF ADAPTIVE BEHAVIOR AUGUST 4TH-11TH, 2002, EDINBURGH, SCOTLAND. http://www.isab.org.uk/sab02/ WORKSHOP AIMS This workshop seeks to bring together researchers in robotics and biology who are interested in developing robot models of the biological systems underlying animal behavior. The scope of the workshop will include: * Evaluating current progress in applying robot models to biological questions * Identifying areas of biology where robots could make a future contribution * Exploring methodological issues * Considering how better ties can be forged between the robotics and biological research communities * Providing a forum for exhibiting current research and work in progress FORMAT OF THE WORKSHOP This will be a one-day workshop consisting of a mixture of invited talks and discussion sessions (see PROGRAM below). The workshop will include three sessions of invited talks: one on invertebrate behavior, one on vertebrate behavior, and one on collective behavior. 
For each session there will be two invited speakers, one a distinguished researcher from a (primarily) biological background, the other from a robotics background. Each invited speaker will be asked to address the question "How can robotics contribute to theoretical biology?" from the perspective of their own field and drawing on the experience of their own research. Each session will also conclude with a discussion involving the two speakers and a panel of other prominent researchers in the field. The workshop will end with a discussion about building stronger ties between the biological and robotics research communities. Representatives from funding bodies and from journal editorial boards will be asked to join the panel for this discussion. All attendees at the workshop are invited to BRING POSTERS DESCRIBING THEIR OWN RESEARCH for people to browse during scheduled coffee and lunch breaks (see CONTRIBUTING TO THE WORKSHOP below). WORKSHOP ORGANISERS The workshop is co-organised by: Tony Prescott, Adaptive Behaviour Research Group, University of Sheffield, UK, Barbara Webb, Department of Psychology, University of Stirling, UK. WORKSHOP PROGRAM Session I: Invertebrate behavior Chair: Barbara Webb Talk 1: Roy Ritzmann (Professor of Biology, Case Western Reserve University, US) Talk 2: Roger Quinn (Professor of Mechanical Engineering, Case Western Reserve University, US) Discussion (Panel to be confirmed) Session II: Vertebrate behavior Chair: Tony Prescott Talk 3: Peter Redgrave (Professor of Neuroscience, University of Sheffield, UK) Talk 4: Angelo Arleo (Research Fellow, College de France, Paris, France) Discussion (Panel to be confirmed) Session III: Collective behavior Chair: To be confirmed. Talk 5: Nigel Franks (Professor of Biological Sciences, University of Bristol). 
Talk 6: Chris Melhuish (Director of the Intelligent Autonomous Systems Laboratory, University of the West of England, UK) Discussion (Panel to be confirmed) Session IV: Building ties between robotics and biology Chair: To be confirmed Discussion (Panel to be confirmed but to include representatives from research funding bodies, research societies, and journal editorial boards) CONTRIBUTING TO THE WORKSHOP All attendees are invited (but not required) to bring posters and robot demos of relevant research to the workshop. Posters may have been previously exhibited at other recent workshops/conferences (including SAB '02), or may describe new research that has not been exhibited elsewhere. If you wish to show your work, you are requested to SUBMIT AN ABSTRACT OF 200-400 WORDS to the workshop organisers before the 1ST OF MAY 2002. Submissions will not be reviewed; however, they will be vetted for relevance, and work that falls outside the scope of the workshop may be refused. Potential contributors should note that the focus of the workshop is on the possible contribution of robotics to biology, rather than the converse (the contribution of biology to robotics). Posters describing biologically-inspired robots, computer simulation work, or straight biological research are acceptable where they can be shown to have relevance to the understanding or development of future robot models in biology. Please make the biological relevance clear in your abstract. Attendees who wish to exhibit robots should also submit an abstract and should contact the organizers with full details of the demonstration and any technical requirements. There may be a limit on the number of available poster spaces, and there may also be space restrictions on robot demonstrations. If these limits become an issue, contributions will be prioritized according to the order in which they were received.
Please note, we are NOT inviting submissions for oral presentations; however, the program is organized to include discussion sessions, which will certainly be open to input from all attendees. For enquiries and for ELECTRONIC SUBMISSION OF ABSTRACTS please email: t.j.prescott at shef.ac.uk including "SAB '02" in the subject line. PROCEEDINGS There will be a proceedings booklet for attendees that will include the abstracts of invited talks and accepted posters/demos, contact details for all attendees, and other pertinent information. Attendees with accepted abstracts will be asked to contribute 1 page of 'camera-ready' material to the proceedings (formatting instructions to be published here at a later date). BOOKING INFORMATION Please book through the SAB '02 web-site (http://www.isab.org.uk/sab02/). Booking information should be available there soon. Note, it should be possible to register for the workshop without registering for the full conference. FOR MORE INFORMATION Please visit the workshop home-page: http://www.shef.ac.uk/~abrg/sab02/index.shtml. From vato at dibe.unige.it Wed Mar 27 06:26:50 2002 From: vato at dibe.unige.it (Alessandro Vato) Date: Wed, 27 Mar 2002 12:26:50 +0100 Subject: NEUROENGINEERING WORKSHOP AND ADVANCED SCHOOL Message-ID: <008101c1d582$4a034e80$6859fb82@Nathannever> NE.W.S.: NEUROENGINEERING WORKSHOP AND ADVANCED SCHOOL June 10 - 13, 2002 University of Genova, Italy http://www.bio.dibe.unige.it Organized by: Prof. Sergio Martinoia Neuroengineering and Bio-nanoTechnologies Group, Department of Biophysical and Electronic Engineering (DIBE), University of Genova Prof. Pietro Morasso Department of Communications, Computer and System Sciences (DIST), University of Genova Funded by the University of Genova MAIN GOAL: to understand, modify, and use brain plasticity in order to advance Neuroscience at the network level and to inspire new computer architectures. Scientific background: Bioengineering, Electronics, Informatics, Neuroscience.
How to reach the main goal:
* By interfacing in vitro neurons to standard and microelectronic transducers capable of monitoring and modifying the neurons' electrophysiological activity
* By creating hybrid neuro-electronic systems
* By developing neuro-prostheses
* By computer simulation of plasticity at the network level
* By developing neuromorphic silicon neurons

INVITED SPEAKERS:
Fabio Babiloni, Department of Human Physiology and Pharmacology, University of Rome "La Sapienza", Rome, Italy
Paolo Dario, Sant'Anna School of University Studies and Doctoral Research, Pisa, Italy
Alain Destexhe, Unite de Neuroscience Integratives et Computationelles, CNRS, Gif-sur-Yvette, France
Stefano Fusi, Institute of Physiology, University of Bern, Bern, Switzerland
Michele Giugliano, Institute of Physiology, University of Bern, Bern, Switzerland
Giacomo Indiveri, Institute for Neuroinformatics, ETH / University of Zurich, Switzerland
Leandro Lorenzelli, Istituto Trentino di Cultura, Centre for Scientific and Technological Research, Microsystems Division, Trento, Italy
Giorgio Metta, Humanoid Robotics Group, MIT - Artificial Intelligence Lab, Cambridge (MA), USA
Sandro Mussa-Ivaldi, Department of Physiology, Northwestern University Medical School, Chicago (IL), USA
Miguel Nicolelis, Department of Neurobiology, Duke University, Durham (NC), USA
Rolf Pfeifer, Department of Information Technology, University of Zurich, Switzerland
Thomas Sinkjaer, Centre for Sensory-Motor Interaction, Aalborg University, Denmark
Stan Gielen, Department of Medical Physics and Biophysics, University of Nijmegen, The Netherlands
Stefano Vassanelli, Department of Anatomy and Human Physiology, University of Padova, Italy

REGISTRATION INFORMATION: Early registration is recommended. To register, please fill out the registration form below and send it by email to news2002 at bio_nt.dibe.unige.it. REGISTRATION FORM (Please send it by email to: news2002 at bio_nt.dibe.unige.it) NE.W.S.
: NEUROENGINEERING WORKSHOP AND ADVANCED SCHOOL June 10 - 13, 2002 University of Genova, Department of Biophysical and Electronic Engineering V. Opera Pia 11a, 16145 Genova Mr/Ms/Dr/Prof: _____________________________________________________ Name: ______________________________________________________________ Affiliation: _______________________________________________________ Address: ___________________________________________________________ City, State, Postal Code: __________________________________________ Phone and Fax: _____________________________________________________ Email: _____________________________________________________________ Registration fee: CHECK ONE: ( ) Euro 50 Registration Fee (Student) ( ) Euro 100 Registration Fee (Regular) PREFERRED METHOD OF PAYMENT : [ ] Bank transfer: Bank CARIGE, Agency 41, ABI:6175, CAB: 1472 c/c DIBE - University of Genova 5341/90. Each registrant is responsible for any and all bank charges. [ ] I wish to pay my fees by credit card (VISA), check or cash at the very beginning of the workshop. -------------------------------------------------------------------- Alessandro Vato Neuroengineering and Bio-nanoTechnology - NBT Department of Biophysical and Electronic Engineering - DIBE Via Opera Pia 11A, 16145, GENOVA, ITALY Tel. +39-10-3532765 Fax. +39-10-3532133 URL: http://www.bio.dibe.unige.it/ From levine at uta.edu Wed Mar 27 17:54:06 2002 From: levine at uta.edu (Daniel S Levine) Date: Wed, 27 Mar 2002 16:54:06 -0600 Subject: Textbook examination copies Message-ID: <6A467F62A0D2D51194FE0004AC4CA556E7EA52@exchange.uta.edu> Dear fellow Connectionists, A free examination copy of my textbook, Introduction to Neural and Cognitive Modeling (2nd ed., 2000), is available to anyone requesting it for teaching purposes from Bill Webber at Lawrence Erlbaum Associates. His e-mail is bwebber at erlbaum.com. 
Best wishes, Dan Levine levine at uta.edu From bower at uthscsa.edu Wed Mar 27 20:27:45 2002 From: bower at uthscsa.edu (James Bower) Date: Wed, 27 Mar 2002 19:27:45 -0600 Subject: GUM*2002 Message-ID: CALL FOR PAPERS First Annual GENESIS Users Meeting GUM*2002 November 8-10, San Antonio, Texas The first annual GENESIS Users Meeting will take place this fall in beautiful San Antonio, Texas. Unique in its structure, this meeting will combine introductory, intermediate, and advanced tutorials in the use of GENESIS with a full agenda of scientific presentations focused on the study of biological systems using realistic modeling techniques. The meeting's overall objective will be to promote communication and collaboration between GENESIS users and others involved in realistic biological modeling and to provide an introduction to other interested scientists. GENESIS Tutorials. The schedule on Friday, November 8th, will be devoted to tutorials organized at several different levels. The introductory tutorial will be led by Dr. David Beeman, who has been providing introductory instruction in GENESIS use since the first course in Computational Neuroscience at the Marine Biological Laboratory in the summer of 1989, and more recently with the European Course in Computational Neuroscience. Students, postdoctoral fellows, or faculty interested in realistic modeling techniques are encouraged to attend. Concurrent tutorials will also be offered on Friday for intermediate and advanced GENESIS users. Scientific Program Saturday and Sunday, November 9th and 10th, will be devoted to scientific and technical presentations related to realistic simulations of biological systems. Appropriate presentations include scientific results from realistic modeling efforts, presentations on technical aspects of simulator use and development, and presentations describing biological systems felt to be ripe for simulation and modeling.
While research using GENESIS is especially encouraged, presentations based on other simulation systems are also welcomed. The organization of the scientific program will also be somewhat unique. The morning sessions will be devoted to 15-minute oral presentations from each of the contributing authors. This will provide all attendees exposure to the full range of work presented at the meeting. The afternoon will then be devoted to more detailed discussion of results in a poster/demonstration format. Community Building In addition to the educational and scientific aspects of the meeting, opportunities will also be provided for more relaxed interactions between participants. San Antonio, Texas, is famous for its River Walk (http://www.thesanantonioriverwalk.com/index.asp), where the meeting banquet will take place on Saturday evening. The San Antonio/Austin area of South Texas is also famous for its indigenous music scene - recently added to by Ramon and the K-Halls, who, it is rumored, are already putting pressure on meeting organizers for an exclusive contract. In other words, a good time will be had by all! Society for Neuroscience Attendees Likely attendees of this year's Society for Neuroscience meeting in Orlando, Florida, should note that GUM*02 is scheduled the weekend after the neuroscience meeting, and Texas is not very far from Florida and much more interesting and real than Orlando. Meeting Registration Costs Registration for GUM*02 will be $99 for students and $159 for non-students. Extra banquet tickets will be available for $35. Pre-Registration Because this is the first GUM, we would ask that interested participants pre-register by sending email to the GENESIS users group at babel at genesis-sim.org. There is no obligation, but pre-registration will allow the organizers to gauge the likely number of participants. With that information, the organizers will be able to arrange special rates for housing.
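The "realistic modeling" at the heart of GENESIS is compartmental simulation of neuronal membranes. As a rough illustration of the basic building block that such simulators chain together, here is a single passive compartment integrated in plain Python rather than in GENESIS script; the parameter values are arbitrary toy choices, not GENESIS defaults.

```python
# A single passive membrane compartment with forward-Euler integration.
# Illustration only: parameter values are arbitrary toy choices.
Em = -0.070    # resting (leak) potential, volts
Rm = 1e8       # membrane resistance, ohms
Cm = 1e-10     # membrane capacitance, farads (tau = Rm*Cm = 10 ms)
I  = 1e-10     # constant injected current, amps
dt = 1e-5      # integration time step, seconds

V = Em
trace = []
for step in range(5000):           # 5000 steps of 10 us = 50 ms
    dV = ((Em - V) / Rm + I) / Cm  # leak current plus injected current
    V += dt * dV
    trace.append(V)

# V charges toward the steady state Em + I*Rm = -0.060 V with time
# constant tau = 10 ms; after 50 ms (5 tau) it is essentially there.
print(V)
```

A realistic model is simply many such compartments, each with active channel currents added to the right-hand side, coupled by axial resistances.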
Registration and Request for Presentations Meeting and tutorial registration will open on August 1st. At that time authors will be asked to provide an abstract for their presentations. We hope to see you in 'ol San Antone this Fall -- James M. Bower Ph.D. Research Imaging Center University of Texas Health Science Center at San Antonio 7703 Floyd Curl Drive San Antonio, TX 78284-6240 Cajal Neuroscience Center University of Texas San Antonio Phone: 210 567 8080 Fax: 210 567 8152 From hong at deakin.edu.au Thu Mar 28 00:42:05 2002 From: hong at deakin.edu.au (hong) Date: Thu, 28 Mar 2002 16:42:05 +1100 Subject: ICONIP'02-SEAL'02-FSKD'02 Call for Papers Message-ID: <3CA2AD2D.39566521@deakin.edu.au> [Apologies if you receive this announcement more than once.] ---------------------------------------------------------------------- 9th International Conference on Neural Information Processing (ICONIP'02) 4th Asia-Pacific Conference on Simulated Evolution And Learning (SEAL'02) International Conference on Fuzzy Systems and Knowledge Discovery (FSKD'02) ---------------------------------------------------------------------- November 18 - 22, 2002, Orchid Country Club, Singapore http://www.ntu.edu.sg/home/nef Organized by: School of Electrical and Electronic Engineering Nanyang Technological University, Singapore Sponsored by: Asia-Pacific Neural Network Assembly SEAL & FSKD Steering Committees Singapore Neuroscience Association In Co-Operation with: IEEE Neural Network Society International Neural Network Society European Neural Network Society SPIE Supported by: Lee Foundation Singapore Exhibition & Convention Bureau ~~~~~~~~ CALL FOR PAPERS, SPONSORSHIPS, AND SPECIAL SESSION PROPOSALS ~~~~~~~~ ICONIP'02, SEAL'02, and FSKD'02 will be jointly held in Orchid Country Club, Singapore from November 18 to 22, 2002. 
The conferences will not only feature the most up-to-date research results in natural and artificial neural systems, evolutionary computation, fuzzy systems, and knowledge discovery, but also promote cross-fertilization over these exciting and yet closely-related areas. Registration to any one of the conferences will entitle a participant to the technical sessions and the proceedings of all three conferences, as well as the conference banquet, buffet lunches, and tours to two of the major attractions in Singapore, i.e., Night Safari and Sentosa Resort Island. Many well-known researchers will present keynote speeches, panel discussions, invited lectures, and tutorials. About Singapore --------------- Located at one of the most important crossroads of the world, Singapore is truly a place where East and West come together. Here you will find Chinese, Indian, and Malay communities living together, their long-established cultures forming a unique backdrop to a clean and modern garden city. English is spoken everywhere and is the common business language of all. Few places on earth promise such a delight for the palate, with gourmet cuisine from over 30 countries. Exotic resorts in neighboring countries are only a short bus/ferry ride away. Orchid Country Club (OCC) ------------------------- The venue for this year's conferences is at one of Singapore's premier country clubs, a 25-minute bus ride from the city. Away from the hustle and bustle of downtown Singapore, the tranquil setting of the resort is ideal for serious technical discussions with an accommodating space and ambience for relaxation. Not to miss out on the splendor of downtown Singapore, the organizer has also secured good quality and affordable accommodation in the heart of the city with pre-arranged transport to/from the OCC.
For golf enthusiasts, OCC is equipped with the largest computerized driving range in South East Asia and boasts a 27-hole golf course with facilities for night golfing, ideal for relaxation after each day of technical discussions. Visit the OCC website at http://www.orchidclub.com Night Safari and Sentosa Resort Island -------------------------------------- It is said that a visit to Singapore is not complete without making a trip to two of the Republic's famous attractions. The only one of its kind in the world, the Night Safari provides a setting for visitors to experience what it is like to observe animals in their nocturnal habitat. The island of Sentosa offers some unique attractions, and a visit there will also provide a glimpse and imagery of Singapore's past and present. Visits to these two attractions will be included as recreation for the joint conference. (Websites: http://www.zoo.com.sg/safari/, http://www.sentosa.com.sg) Topics of Interest ------------------ The joint conferences welcome paper submissions from researchers, practitioners, and students worldwide in, but not limited to, the following areas. ICONIP'02: ~~ ARTIFICIAL NEURAL MODELS - Learning algorithms, Neural modeling and architectures, Neurodynamics NATURAL NEURAL SYSTEMS - Neuroscience, Neurobiology, Neurophysiology, Brain imaging, Learning and memory COGNITIVE SCIENCE - Perception, emotion, and cognition, Selective attention, Vision and auditory models HARDWARE IMPLEMENTATION - Artificial retina & cochlear chips HYBRID SYSTEMS - Neuro-fuzzy systems, Evolutionary neural nets, etc. APPLICATIONS - Bioinformatics, Finance, Manufacturing, etc.
SEAL'02: ~~ THEORY - Co-evolution, Coding methods, Collective behavior METHODOLOGY - Evolution strategies, Genetic algorithms, Genetic programming, Molecular and quantum computing, Evolvable hardware, Multi-objective optimization, Ant colony, Artificial ecology EVOLUTIONARY LEARNING - Artificial life, Bayesian evolutionary algorithms HYBRID SYSTEMS - Evolutionary neuro-fuzzy systems, Soft computing APPLICATIONS - Scheduling, Operations research, Design, etc FSKD'02: ~~ THEORY AND FOUNDATIONS - Fuzzy theory and models, Uncertainty management, Statistical & probabilistic data mining, Computing with words, Rough sets, Intelligent agents METHODS AND ALGORITHMS - Classification, Clustering, Information retrieval & fusion, Data warehousing & OLAP, Fuzzy hardware, Visualization, Decision trees, Data preprocessing HYBRID SYSTEMS - Evolutionary neuro-fuzzy systems, Soft computing APPLICATIONS - Control, Optimization, Natural language processing, Forecasting, Human-computer interaction, etc. Special Sessions ---------------- The conferences will feature special sessions on specialized topics to encourage in-depth discussions. To propose a special session, email the session title, name of the conference under which the special session will be organized, contact information of the organizer(s), and a short description on the theme and topics covered by the session to Xin Yao, Special Sessions Chair (x.yao at cs.bham.ac.uk), with a copy to Lipo Wang, General Chair (Cc: elpwang at ntu.edu.sg). Sponsorship ----------- The conferences will offer product vendors a sponsorship package and/or an opportunity to interact with conference participants. Product demonstration and exhibition can also be arranged. For more information, please visit the conference website or contact Tong Seng Quah, Sponsorship/Exhibition Chair (itsquah at ntu.edu.sg), with a copy to Lipo Wang, General Chair (Cc: elpwang at ntu.edu.sg). 
Keynote Speakers ---------------- Shun-ichi Amari, RIKEN Brain Science Institute, Japan David Fogel, Natural Selection, Inc., USA Mitsuo Kawato, ATR, Japan Xin Yao, The University of Birmingham, UK Lotfi A. Zadeh, University of California, USA Registration Fee ---------------- The registration fee for regular participants before August 15, 2002 is S$680 (approximately US$370 as at February 6, 2002), which includes the proceedings, lunches, banquet, and tours. Submission of Papers -------------------- Authors are invited to submit electronic files (postscript, pdf or Word format) through the conference home page. Papers should be double-column and use 10 pt Times Roman or similar fonts. The final version of a paper should not exceed 5 pages in length. A selected number of accepted papers will be expanded and revised for possible inclusion in edited books and peer-reviewed journals, such as "Soft Computing" and "Knowledge and Information Systems: An International Journal" by Springer-Verlag. Important Dates --------------- Paper/Summary Deadline : April 30, 2002 Notification of Acceptance : July 15, 2002 Final Paper/ Registration : August 15, 2002 Honorary Conference Chairs -------------------------- Shun-ichi Amari, Japan Hans-Paul Schwefel, Germany Lotfi A. Zadeh, USA International Advisory Board ---------------------------- Sung-Yang Bang, Korea Meng Hwa Er, Singapore David B. Fogel, USA Toshio Fukuda, Japan A. Galushkin, Russia Tom Gedeon, Australia Zhenya He, China Mo Jamshidi, USA Nikola Kasabov, New Zealand Sun-Yuan Kung, USA Tong Heng Lee, Singapore Erkki Oja, Finland Nikhil R. Pal, India Enrique H. Ruspini,USA Harcharan Singh, Singapore Ah Chung Tsoi, Australia Shiro Usui, Toyohashi, Japan Lei Xu, China Benjamin W. Wah, USA Donald C. Wunsch II, USA Xindong Wu, USA Youshou Wu, China Yixin Zhong, China Jacek M. Zurada, USA Advisor ------- Alex C. 
Kot, Singapore General Chair ------------- Lipo Wang, Singapore Program Co-Chairs ----------------- ICONIP'02: Kunihiko Fukushima, Japan Soo-Young Lee, Korea Jagath C. Rajapakse, Singapore SEAL'02: Takeshi Furuhashi, Japan Jong-Hwan Kim, Korea Kay Chen Tan, Singapore FSKD'02: Saman Halgamuge, Australia Special Sessions: Xin Yao, UK Finance Chair ------------- Charoensak Charayaphan, Singapore Local Arrangement Chair ----------------------- Meng Hiot Lim, Singapore Proceedings Chair ----------------- Farook Sattar, Singapore Publicity Co-Chairs ------------------- Hepu Deng, Australia Chunru Wan, Singapore Li Weigang, Brazil Zili Zhang, Australia Sponsorship/Exhibition Chair ---------------------------- Tong Seng Quah, Singapore Tutorial Chair -------------- P. N. Suganthan, Singapore For More Information -------------------- Please visit the conference home page or contact: Lipo Wang, ICONIP'02-SEAL'02-FSKD'02 General Chair School of Electrical and Electronic Engineering Nanyang Technological University Block S2, 50 Nanyang Avenue, Singapore 639798 Email: elpwang at ntu.edu.sg Phone: +65 6790 6372 Fax: +65 6792 0415 Conference Secretariat ---------------------- ICONIP'02-SEAL'02-FSKD'02 Secretariat Conference Management Center/CCE, NTU Administration Annex Building #04-06 42 Nanyang Avenue, Singapore 639815 Email: nef at ntu.edu.sg Fax: +65 6793 0997
From sok at cs.york.ac.uk Mon Mar 4 08:21:56 2002 From: sok at cs.york.ac.uk (Simon O'Keefe) Date: Mon, 04 Mar 2002 13:21:56 +0000 Subject: PhD position - binary neural networks Message-ID: <3C8374F4.BD7F047A@cs.york.ac.uk> PhD research position available - Satellite Image Classification by Binary Neural Networks Dr Simon O'Keefe Advanced Computer Architectures Group Department of Computer Science University of York http://www.cs.york.ac.uk/arch The Advanced Computer Architectures Group invites applications for an EPSRC-funded studentship to study the application of the AURA neural network (http://www.cs.york.ac.uk/arch/nn/aura.html) to the analysis of satellite images. The studentship is available immediately. The principal area of research is pixel classification in satellite images, concentrating on the methodological issues surrounding representation of data to the network, and the characterisation of the network's generalisation behaviour. Demonstration of the classification abilities of the network is as important as the development of the theoretical framework, and test data will be available from a number of sources, including satellite images as well as other types of data. Applicants should have a good first degree in computer science, mathematics/statistics, or a related discipline. Informal queries should be addressed to Dr Simon O'Keefe (email: sok at cs.york.ac.uk), but note that formal applications must be received by 18th March 2002. Applicants should contact the Graduate Secretary at the address below for an application form. Graduate Secretary Department of Computer Science University of York Heslington YORK YO10 5DD United Kingdom Tel: +44 1904 432721 E-mail: grad-secretary at cs.york.ac.uk This studentship covers maintenance and tuition at the UK/EU rate. Applicants from outside the EU will need to pay the balance of the fees. Funding rules dictate that the start date for the studentship must be NO LATER THAN 1st MAY 2002.
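For readers unfamiliar with the binary neural networks mentioned above: AURA builds on binary correlation matrix memories, which store pattern pairs by OR-ing binary outer products into a weight matrix and recall by a thresholded matrix-vector product. A minimal sketch of that storage/recall cycle follows; the patterns and sizes are invented toy choices, and this is an illustration of the memory family, not the AURA implementation.

```python
import numpy as np

# Minimal binary correlation matrix memory (Willshaw-style), the family
# of associative memories underlying AURA. Illustration only.
N = 12                          # width of input and output patterns

def bits(idx):
    """Binary vector of width N with ones at the given positions."""
    v = np.zeros(N, dtype=np.uint8)
    v[list(idx)] = 1
    return v

def store(W, x, y):
    """One-shot training: OR the outer product y x^T into the weights."""
    return W | np.outer(y, x)

def recall(W, x):
    """Sum matching inputs per output line; threshold at input weight."""
    return ((W @ x) >= x.sum()).astype(np.uint8)

W = np.zeros((N, N), dtype=np.uint8)
pairs = [(bits((0, 1, 2)), bits((3, 7, 11))),
         (bits((3, 4, 5)), bits((0, 4, 8))),
         (bits((6, 7, 8)), bits((1, 5, 9)))]
for x, y in pairs:
    W = store(W, x, y)

# The stored keys have disjoint support, so each one recalls its paired
# pattern exactly (no crosstalk in this tiny example).
x0, y0 = pairs[0]
print(np.array_equal(recall(W, x0), y0))   # True
```

Training is one-shot and purely binary, which is what makes this class of memory fast enough for large image-classification workloads.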
Because of the short timescales, applications should be returned DIRECTLY TO THE GRADUATE SECRETARY AND NOT VIA THE UNIVERSITY ADMISSIONS OFFICE. Applications MUST be received by 18th MARCH 2002. It is anticipated that interviews will be held in the middle of April 2002. -- _________________________________________________________________________ Dr Simon O'Keefe PHONE: +44 1904 432762 EMAIL: sok at cs.york.ac.uk Department of Computer Science, University of York, York, YO10 5DD (U.K.) From dr at tedlab.mit.edu Mon Mar 4 14:41:50 2002 From: dr at tedlab.mit.edu (Douglas Rohde) Date: Mon, 04 Mar 2002 14:41:50 -0500 Subject: Ph.D. Thesis Announcement Message-ID: <3C83CDFE.4000703@tedlab.mit.edu> You may be interested in the availability of my recently completed Ph.D. thesis, entitled, "A Connectionist Model of Sentence Comprehension and Production." I have included an abstract and a summary of the table of contents below. The thesis can be downloaded in .ps or .pdf format at this site: http://tedlab.mit.edu/~dr/Thesis/ Although the thesis is a bit long, you may wish to take a look at the Introduction and the Discussion to see if you might find anything of interest in between. Cheers, Doug Rohde ------------------------------------------------------ A Connectionist Model of Sentence Comprehension and Production Douglas L. T. Rohde Carnegie Mellon University, Dept. of Computer Science and the Center for the Neural Basis of Cognition Thesis Committee: Dr. David C. Plaut, Chair Dr. James L. McClelland Dr. David S. Touretzky Dr. Maryellen C. MacDonald Abstract: The most predominant language processing theories have, for some time, been based largely on structured knowledge and relatively simple rules. These symbolic models intentionally segregate syntactic information processing from statistical information as well as semantic, pragmatic, and discourse influences, thereby minimizing the importance of these potential constraints in learning and processing language. 
While such models have the advantage of being relatively simple and explicit, they are inadequate to account for learning and validated ambiguity resolution phenomena. In recent years, interactive constraint-based theories of sentence processing have gained increasing support, as a growing body of empirical evidence demonstrates early influences of various factors on comprehension performance. Connectionist networks are one form of model that naturally reflect many properties of constraint-based theories, and thus provide a form in which those theories may be instantiated. Unfortunately, most of the connectionist language models implemented until now have involved severe limitations, restricting the phenomena they could address. Comprehension and production models have, by and large, been limited to simple sentences with small vocabularies (St. John & McClelland, 1990). Most models that have addressed the problem of complex, multi-clausal sentence processing have been prediction networks (Elman, 1991; Christiansen & Chater, 1999). Although a useful component of a language processing system, prediction does not get at the heart of language: the interface between syntax and semantics. The current thesis focuses on the design and testing of the Connectionist Sentence Comprehension and Production (CSCP) model, a recurrent neural network that has been trained to both comprehend and produce a relatively complex subset of English. This language includes such features as tense and number, adjectives and adverbs, prepositional phrases, relative clauses, subordinate clauses, and sentential complements, with a vocabulary of about 300 total words. It is broad enough that it permits the model to address a wide range of sentence processing phenomena. 
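The prediction networks cited above (Elman, 1991) are simple recurrent networks trained to guess the next word. A minimal forward pass, with randomly initialised (untrained) weights and a toy vocabulary invented purely for illustration, looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)

def srn_step(x, h_prev, W_xh, W_hh, W_hy):
    """One step of a simple recurrent (Elman) network: the hidden layer
    sees the current input plus a copy of its own previous state, and the
    output is a softmax distribution over the next word."""
    h = np.tanh(W_xh @ x + W_hh @ h_prev)
    logits = W_hy @ h
    p = np.exp(logits - logits.max())
    return h, p / p.sum()

vocab, hidden = 5, 8
W_xh = rng.normal(scale=0.1, size=(hidden, vocab))
W_hh = rng.normal(scale=0.1, size=(hidden, hidden))
W_hy = rng.normal(scale=0.1, size=(vocab, hidden))

h = np.zeros(hidden)
for word_id in [0, 3, 1]:          # a toy three-word sentence
    x = np.eye(vocab)[word_id]     # one-hot input word
    h, p_next = srn_step(x, h, W_xh, W_hh, W_hy)
print(p_next.sum())                # a proper distribution over the vocabulary
```

Training such a network (by backpropagation through time on a next-word target) yields the prediction behaviour discussed in the text; the point of the sketch is only that the hidden state carries sentence context forward from word to word.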
The experiments reported here involve such issues as the relative comprehensibility of various sentence types, the resolution of lexical ambiguities, generalization to novel sentences, the comprehension of main verb/reduced relative, sentential complement, subordinate clause, and prepositional phrase attachment ambiguities, agreement attraction and other production errors, and structural priming. The model is able to replicate many key aspects of human sentence processing across these domains, including sensitivity to lexical and structural frequencies, semantic plausibility, inflectional morphology, and locality effects. A critical feature of the model is its suggestion of a tight coupling between comprehension and production and the idea that language production is primarily learned through the formulation and testing of covert predictions during comprehension. I believe this work represents a major advance in the attested ability of connectionist networks to process natural language and a significant step towards a more complete understanding of the human language faculty.

------------------------------------------------------

Contents

1 Introduction
  1.1 Why implement models?
  1.2 Properties of human language processing
  1.3 Properties of symbolic models
  1.4 Properties of connectionist models
  1.5 The CSCP model
  1.6 Chapter overview
2 An Overview of Connectionist Sentence Processing
  2.1 Parsing
  2.2 Comprehension
  2.3 Word prediction
  2.4 Production
  2.5 Other language processing models
3 Empirical Studies of Sentence Processing
  3.1 Introduction
  3.2 Relative clauses
  3.3 Main verb/reduced-relative ambiguities
  3.4 Sentential complements
  3.6 Prepositional phrase attachment
  3.7 Effects of discourse context
  3.8 Production
  3.9 Summary of empirical findings
4 Analysis of Syntax Statistics in Parsed Corpora
  4.1 Extracting syntax statistics isn't easy
  4.2 Verb phrases
  4.3 Relative clauses
  4.4 Sentential noun phrases
  4.5 Determiners and adjectives
  4.6 Prepositional phrases
  4.7 Coordination and subordination
  4.8 Conclusion
5 The Penglish Language
  5.1 Language features
  5.2 Penglish grammar
  5.3 The lexicon
  5.4 Phonology
  5.5 Semantics
  5.6 Statistics
6 The CSCP Model
  6.1 Basic architecture
  6.2 The semantic system
  6.3 The comprehension, prediction, and production system
  6.4 Training
  6.5 Testing
  6.6 Claims and limitations of the model
7 General Comprehension Results
  7.1 Overall performance
  7.2 Representation
  7.3 Experiment 2: Comparison of sentence types
  7.4 Lexical ambiguity
  7.5 Experiment 4: Adverbial attachment
  7.6 Experiment 5: Prepositional phrase attachment
  7.7 Reading time
  7.8 Individual differences
8 The Main Verb/Reduced Relative Ambiguity
  8.1 Empirical results
  8.2 Experiment 6
  8.3 Verb frequency effects
  8.4 Summary
9 The Sentential Complement Ambiguity
  9.1 Empirical results
  9.2 Experiment 7
  9.3 Summary
10 The Subordinate Clause Ambiguity
  10.1 Empirical results
  10.2 Experiment 8
  10.3 Experiment 9
  10.4 Experiment 10: Incomplete reanalysis
  10.5 Summary and discussion
11 Relative Clauses
  11.1 Empirical results
  11.2 Experiment 11
  11.3 Discussion
12 Production
  12.1 Word-by-word production
  12.2 Free production
  12.3 Agreement attraction
  12.4 Structural priming
  12.5 Summary
13 Discussion
  13.1 Summary of results
  13.2 Accomplishments of the model
  13.3 Problems with the model
  13.4 Model versus theory
  13.5 Properties, principles, processes
  13.6 Conclusion
Appendices
A Lens: The Light, Efficient Network Simulator
  A.1 Performance benchmarks
  A.2 Optimizations
  A.3 Parallel training
  A.4 Customization
  A.5 Interface
  A.6 Conclusion
B SLG: The Simple Language Generator
  B.1 The grammar
  B.2 Resolving the grammar
  B.3 Minimizing the grammar
  B.4 Parsing
  B.5 Word prediction
  B.6 Conclusion
C TGrep2: A Tool for Searching Parsed Corpora
  C.1 Preparing corpora
  C.2 Command-line arguments
  C.3 Specifying patterns
  C.4 Controlling the output
  C.5 Differences from TGrep
D Details of the Penglish Language
  D.1 The Penglish SLG grammar
  D.2 The Penglish lexicon

------------------------------------------------------

From Johan.Suykens at esat.kuleuven.ac.be Tue Mar 5 04:32:45 2002
From: Johan.Suykens at esat.kuleuven.ac.be (Johan Suykens)
Date: Tue, 5 Mar 2002 10:32:45 +0100 (MET)
Subject: NATO-ASI on Learning Theory and Practice - Last call for participation
Message-ID: <200203050932.KAA29925@euler.esat.kuleuven.ac.be>

Last call for participation

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
NATO Advanced Study Institute on Learning Theory and Practice (NATO-ASI LTP 2002)
July 8-19 2002 - K.U. Leuven Belgium
http://www.esat.kuleuven.ac.be/sista/natoasi/ltp2002.html
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

-General Objective-

This NATO Advanced Study Institute on Learning Theory and Practice aims at creating a fascinating interplay between advanced fundamental theory and several application areas such as bioinformatics, multimedia/computer vision, e-commerce, finance, internet search, text mining and others. It offers an interdisciplinary forum for presenting recent progress and breakthroughs in learning theory in areas such as neural networks, machine learning, mathematics and statistics.
-Invited Lecturers- Peter Bartlett (Australian National University Canberra, AUS) Sankar Basu (IBM T.J. Watson, USA) Kristin Bennett (Rensselaer Polytechnic Institute New York, USA) Chris Bishop (Microsoft Research Cambridge, UK) Nello Cristianini (Royal Holloway London, UK) Luc Devroye (McGill University Montreal, CAN) Lazlo Gyorfi (T.U. Budapest, HUN) Gabor Horvath (T.U. Budapest, HUN) Rudolf Kulhavy (Honeywell Technology Center Prague, CZ) Vera Kurkova (Academy of Sciences of the Czech Republic, CZ) Joerg Lemm (University of Muenster, GER) Charles Micchelli (State U. of NY, USA) Tomaso Poggio (MIT, USA) Massimiliano Pontil (University of Siena, IT) Bernhard Schoelkopf (Max-Planck-Institute Tuebingen, GER) Yoram Singer (Hebrew University Jerusalem, IS) Steve Smale (U.C. Berkeley, USA) Johan Suykens (K.U. Leuven, BEL) Vladimir Vapnik (AT&T Labs Research, USA) Mathukumalli Vidyasagar (Tata Consultancy Services, IND) -Organizing committee- Sankar Basu (IBM T.J. Watson, USA) Gabor Horvath (T.U. Budapest, HUN), Co-director partner country Charles Micchelli (State U. of NY, USA) Johan Suykens (K.U. Leuven, BEL), Director Joos Vandewalle (K.U. Leuven, BEL) -Program and participation- According to the NATO rules http://www.nato.int/science the number of ASI students will be limited to 80. All participants will obtain a *free* registration (including welcome reception, lunches, banquets, refreshments and a NATO-ASI Science Series book to be published with IOS Press). Limited additional funding will be available to cover attendance costs. All interested participants should fill out an application form, taking into account the NATO restrictions. Application form and preliminary program are available from http://www.esat.kuleuven.ac.be/sista/natoasi/ltp2002.html -Venue- The Advanced Study Institute will take place in the Arenberg Castle of the K.U. Leuven Heverlee. The place is surrounded by several restaurants/cafes and parks where one may have a relaxing walk. 
The historical town of Leuven is within walking distance from the meeting site. Leuven is also well-known for its pleasant atmosphere, pubs and restaurants.

-Housing-

In addition to hotel rooms, blocks of low cost student rooms are reserved for the ASI students. The student rooms and hotels are located within walking distance from the Arenberg castle meeting site. In order to stimulate interaction among the participants, the option of student rooms will be recommended to all ASI students.

-Important Dates-

Deadline for submission of application form: March 18, 2002
Notification of acceptance: April 30, 2002
NATO-ASI LTP 2002 meeting: July 8-19, 2002

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

From marina at cs.rhul.ac.uk Wed Mar 6 04:33:48 2002
From: marina at cs.rhul.ac.uk (marina d'Engelbronner)
Date: Wed, 6 Mar 2002 09:33:48 -0000
Subject: NeuroCOLT Workshop 2002
Message-ID: <007301c1c4f2$055809b0$7ebcdb86@VENICE>

Workshop on Generalisation Bounds Less than 0.5

The NeuroCOLT project coordinated by Royal Holloway, University of London is organising a workshop (which will be the final meeting of the project) from Monday April 29th until Thursday May 2nd 2002 at Cumberland Lodge, Windsor. The workshop draws together leading researchers in the drive to give non-trivial bounds on the generalisation of classifiers trained on real-world data. There will also be space for contributed presentations on related topics in the analysis of learning systems.

Further information concerning the workshop (including the registration form) can be obtained from the workshop webpage: http://www.neurocolt.org/bounds2002.html If you are interested in attending the workshop, please fill out the registration form. Please note that there is very limited space at the workshop, so book early to avoid disappointment. The costs for the workshop (including accommodation, lunch, dinner, tea and coffee) will be £328.
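The workshop title refers to pushing generalisation bounds below the trivial value of 0.5. As a hedged illustration of why that is hard (this is the textbook finite-hypothesis-class Hoeffding bound, not a NeuroCOLT result; the numbers are invented), one can compute when such a bound stops being vacuous:

```python
import math

def hoeffding_bound(emp_err, n, num_hyp, delta=0.05):
    """Uniform-convergence bound for a finite hypothesis class: with
    probability at least 1 - delta over a sample of size n,
    true error <= empirical error + sqrt((ln|H| + ln(2/delta)) / (2n))."""
    return emp_err + math.sqrt(
        (math.log(num_hyp) + math.log(2 / delta)) / (2 * n))

# The additive term shrinks as 1/sqrt(n): vacuous for small samples,
# informative for large ones (empirical error 0.1, |H| = 10^6 here)
for n in [100, 1000, 100000]:
    print(n, round(hoeffding_bound(0.1, n, num_hyp=10**6), 3))
```

Bounds for real-world classifiers (with effectively infinite hypothesis classes) replace the ln|H| term with capacity measures such as VC dimension or margin-based quantities, which is exactly the line of work the workshop addresses.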
We will require the full amount of £328 to be paid on or before Wednesday April 15th 2002, either by cheque (in the name of RHBNC; sent to me, see address below) or by making a bank transfer (to RHBNC, NatWest Bank, Egham; sort-code: 600733; bank account number: 02325330; bank international code: NWB KGB2L; reference-code: 1090.3966). If you have any questions, please do not hesitate to contact me.

Many thanks

Marina d'Engelbronner
Department of Computer Science
Royal Holloway University of London
Egham Hill
Surrey TW20 0EX
United Kingdom
tel: **.(0)1784.443912

From wolfskil at MIT.EDU Wed Mar 6 14:42:35 2002
From: wolfskil at MIT.EDU (Jud Wolfskill)
Date: Wed, 06 Mar 2002 14:42:35 -0500
Subject: CD-ROM announcement--collected NIPS proceedings (Jordan)
Message-ID: <2002030614423530164@outgoing.mit.edu>

I thought readers of the Connectionists List might be interested in this book. For more information, please visit http://mitpress.mit.edu/026256145X/ Thank you! Best, Jud

Advances in Neural Information Processing Systems
Proceedings of the First 12 Conferences
edited by Michael I. Jordan, Yann LeCun, and Sara A. Solla

The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. The conference is interdisciplinary, with contributions in algorithms, learning theory, cognitive science, neuroscience, vision, speech and signal processing, reinforcement learning and control, implementations, and diverse applications. Only about 30 percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. This CD-ROM contains the entire proceedings of the twelve Neural Information Processing Systems conferences from 1988 to 1999. The files are available in the DjVu image format developed by Yann LeCun and his group at AT&T Labs. The CD-ROM includes free browsers for all major platforms. Michael I.
Jordan is Professor of Computer Science and of Statistics at the University of California, Berkeley. Yann LeCun is Head of the Image Processing Research Department at AT&T Labs-Research. Sara A. Solla is Professor of Physics and of Physiology at Northwestern University. CD-ROM, ISBN 0-262-56145-X Neural Information Processing series ______________________ Jud Wolfskill Associate Publicist The MIT Press 5 Cambridge Center, 4th Floor Cambridge, MA 02142 617 253 2079 617 253 1709 fax http://mitpress.mit.edu From becker at meitner.psychology.mcmaster.ca Wed Mar 6 21:11:26 2002 From: becker at meitner.psychology.mcmaster.ca (S. Becker) Date: Wed, 6 Mar 2002 21:11:26 -0500 (EST) Subject: neural computation-related textbooks Message-ID: Dear connectionists, This message is directed to those of you who teach a course in neural networks & related topics, or computational neuroscience or cognitive science, and would like to exchange opinions on textbooks they are using. I'd like to know the name of your course, who takes it (undergrad vs grad, comp sci/eng vs psych/bio), what textbook you are using and what you consider to be the pros and cons of this book. I teach a course in neural computation to 3rd year undergrads, mostly CS majors and some psych/biopsych, and have used James Anderson's book An Introduction to Neural Networks a number of times. I like this book a lot -- it is the only one I know of that is truly interdisciplinary and suitable for undergraduates -- but it is in need of updating. I will post a summary of the replies I receive to connectionists. cheers, Sue -- Sue Becker, Associate Professor Department of Psychology, McMaster University becker at mcmaster.ca 1280 Main Street West, Hamilton, Ont. L8S 4K1 Fax: (905)529-6225 www.science.mcmaster.ca/Psychology/sb.html Tel: 525-9140 ext. 
23020 From giacomo at ini.phys.ethz.ch Thu Mar 7 08:43:18 2002 From: giacomo at ini.phys.ethz.ch (Giacomo Indiveri) Date: Thu, 07 Mar 2002 14:43:18 +0100 Subject: Neuromorphic Engineering Workshop - SECOND CALL Message-ID: <3C876E76.8060900@ini.phys.ethz.ch> SECOND CALL FOR APPLICATIONS for the 2002 Workshop on Neuromorphic Engineering http://www.ini.unizh.ch/telluride Please distribute this announcement: http://www.ini.unizh.ch/telluride/announcement.html ------------------------------------------------------------- Neuromorphic Engineering Workshop Sunday, JUNE 30 - Saturday, JULY 20, 2002 TELLURIDE, COLORADO ------------------------------------------------------------------ Avis COHEN (University of Maryland) Rodney DOUGLAS (Institute of Neuroinformatics, UNI/ETH Zurich, Switzerland) Timmer HORIUCHI (Johns Hopkins University) Giacomo INDIVERI (Institute of Neuroinformatics, UNI/ETH Zurich, Switzerland) Christof KOCH (California Institute of Technology) Terrence SEJNOWSKI (Salk Institute and UCSD) Shihab SHAMMA (University of Maryland) ----------------------------------------------------------------- We invite applications for a three week summer workshop that will be held in Telluride, Colorado from Sunday, June 30 to Sunday, July 21, 2002. The application deadline is Friday, March 15, and application instructions are described at the bottom of this document. The 2001 summer workshop on "Neuromorphic Engineering", sponsored by the National Science Foundation, the Gatsby Foundation, Whitaker Foundation, the Office of Naval Research, and by the Center for Neuromorphic Systems Engineering at the California Institute of Technology, was an exciting event and a great success. A detailed report on the workshop is available at the workshop's web-site. We strongly encourage interested parties to browse through the previous workshop web pages. 
GOALS: Carver Mead introduced the term "Neuromorphic Engineering" for a new field based on the design and fabrication of artificial neural systems, such as vision systems, head-eye systems, and roving robots, whose architecture and design principles are based on those of biological nervous systems. The goal of this workshop is to bring together young investigators and more established researchers from academia with their counterparts in industry and national laboratories, working on both neurobiological as well as engineering aspects of sensory systems and sensory-motor integration. The focus of the workshop will be on active participation, with demonstration systems and hands-on experience for all participants.

Neuromorphic engineering has a wide range of applications, from nonlinear adaptive control of complex systems to the design of smart sensors. Many of the fundamental principles in this field, such as the use of learning methods and the design of parallel hardware (with an emphasis on analog and asynchronous digital VLSI), are inspired by biological systems. However, existing applications are modest, and the challenge of scaling up from small artificial neural networks and designing completely autonomous systems at the levels achieved by biological systems lies ahead. The assumption underlying this three-week workshop is that the next generation of neuromorphic systems would benefit from closer attention to the principles found through experimental and theoretical studies of real biological nervous systems as whole systems.

FORMAT: The three-week summer workshop will include background lectures on systems neuroscience (in particular learning, oculo-motor and other motor systems and attention), practical tutorials on analog VLSI design, small mobile robots (Koalas, Kheperas and LEGO), hands-on projects, and special interest groups. Participants are required to take part in, and if possible complete, at least one of the proposed projects.
They are furthermore encouraged to become involved in as many of the other activities proposed as interest and time allow. There will be two lectures in the morning that cover issues that are important to the community in general. Because of the diverse range of backgrounds among the participants, the majority of these lectures will be tutorials, rather than detailed reports of current research. These lectures will be given by invited speakers. Participants will be free to explore and play with whatever they choose in the afternoon. Projects and interest groups meet in the late afternoons, and after dinner. In the early afternoon there will be tutorials on a wide spectrum of topics, including analog VLSI, mobile robotics, auditory systems, central-pattern-generators, selective attention mechanisms, etc.

Projects that are carried out during the workshop will be centered in a number of working groups, including:

* active vision
* audition
* olfaction
* motor control
* central pattern generator
* robotics
* multichip communication
* analog VLSI
* learning

The active perception project group will emphasize vision and human sensory-motor coordination. Issues to be covered will include spatial localization and constancy, attention, motor planning, eye movements, and the use of visual motion information for motor control. Demonstrations will include an active vision system consisting of a three degree-of-freedom pan-tilt unit, and a silicon retina chip.

The central pattern generator group will focus on small walking and undulating robots. It will look at characteristics and sources of parts for building robots, play with working examples of legged and segmented robots, and discuss CPG's and theories of nonlinear oscillators for locomotion. It will also explore the use of simple analog VLSI sensors for autonomous robots.
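The CPG group's interest in nonlinear oscillator theories of locomotion can be caricatured in a few lines of simulation (a toy sketch, not workshop material; the coupling form and constants are invented for illustration): two phase oscillators whose coupling favours antiphase locking settle into alternation, like the left and right legs of a walking gait.

```python
import math

def simulate_cpg(steps=20000, dt=0.001, omega=2 * math.pi, k=5.0):
    """Two phase oscillators integrated with Euler steps. The coupling
    term k*sin(other - self - pi) is zero when the oscillators are a
    half-cycle apart, so the antiphase state is the stable gait."""
    th1, th2 = 0.0, 0.5          # arbitrary initial phases
    for _ in range(steps):
        d1 = omega + k * math.sin(th2 - th1 - math.pi)
        d2 = omega + k * math.sin(th1 - th2 - math.pi)
        th1 += d1 * dt
        th2 += d2 * dt
    # the phase difference converges to pi (strict alternation)
    return (th2 - th1) % (2 * math.pi)

print(simulate_cpg())
```

Writing the phase difference as phi = th2 - th1 gives dphi/dt = 2k*sin(phi), whose stable fixed point is phi = pi; biological half-centre models reach the same alternating state through mutual inhibition between two neuron pools.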
The robotics group will use rovers and working digital vision boards as well as other possible sensors to investigate issues of sensorimotor integration, navigation and learning.

The audition group aims to develop biologically plausible algorithms and aVLSI implementations of specific auditory tasks such as source localization and tracking, and sound pattern recognition. Projects will be integrated with visual and motor tasks in the context of a robot platform.

The multichip communication project group will use existing interchip communication interfaces to program small networks of artificial neurons to exhibit particular behaviors such as amplification, oscillation, and associative memory. Issues in multichip communication will be discussed.

LOCATION AND ARRANGEMENTS: The workshop will take place in the small town of Telluride, 9000 feet high in Southwest Colorado, about 6 hours drive away from Denver (350 miles). United Airlines provide daily flights directly into Telluride. All facilities within the beautifully renovated public school building are fully accessible to participants with disabilities. Participants will be housed in ski condominiums, within walking distance of the school. Participants are expected to share condominiums.

The workshop is intended to be very informal and hands-on. Participants are not required to have had previous experience in analog VLSI circuit design, computational or machine vision, systems level neurophysiology or modeling the brain at the systems level. However, we strongly encourage active researchers with relevant backgrounds from academia, industry and national laboratories to apply, in particular if they are prepared to work on specific projects, talk about their own work or bring demonstrations to Telluride (e.g. robots, chips, software). Internet access will be provided. Technical staff present throughout the workshops will assist with software and hardware issues.
We will have a network of PCs running LINUX and Microsoft Windows. No cars are required. Bring hiking boots, warm clothes and a backpack, since Telluride is surrounded by beautiful mountains. Unless otherwise arranged with one of the organizers, we expect participants to stay for the entire duration of this three week workshop.

FINANCIAL ARRANGEMENT: Notification of acceptances will be mailed out around March 21, 2002. Participants are expected to pay a $275.00 workshop fee at that time in order to reserve a place in the workshop. The cost of a shared condominium will be covered for all academic participants but upgrades to a private room will cost extra. Participants from National Laboratories and Industry are expected to pay for these condominiums. Travel reimbursement of up to $500 for US domestic travel and up to $800 for overseas travel will be possible if financial help is needed (please specify on the application).

HOW TO APPLY: Applicants should be at the level of graduate students or above (i.e. postdoctoral fellows, faculty, research and engineering staff and the equivalent positions in industry and national laboratories). We actively encourage qualified women and minority candidates to apply.

Applications should include:

* First name, Last name, valid email address.
* Curriculum Vitae.
* One page summary of background and interests relevant to the workshop.
* Description of special equipment needed for demonstrations that could be brought to the workshop.
* Two letters of recommendation.

Complete applications should be sent to:

Terrence Sejnowski
The Salk Institute
10010 North Torrey Pines Road
San Diego, CA 92037
e-mail: telluride at salk.edu
FAX: (858) 587 0417

THE APPLICATION DEADLINE IS March 15, 2002. Applicants will be notified by e-mail around the end of March.
From ingber at ingber.com Thu Mar 7 16:15:12 2002 From: ingber at ingber.com (Lester Ingber) Date: Thu, 7 Mar 2002 16:15:12 -0500 Subject: Financial Engineer Message-ID: <20020307161512.A9459@ingber.com> If you have very strong credentials for the position described below, please email your resume to: Lester Ingber Director R&D Financial Engineer A disciplined, quantitative, analytic individual proficient in prototyping and coding (such as C/C++, Maple/Mathematica, or Visual Basic, etc.) is sought for financial engineering/risk:reward optimization research position with established Florida hedge fund (over two decades in the business and $1 billion in assets under management). A PhD in a mathematical science, such as physics, statistics, math, or computer-science, is preferred. Hands-on experience in the financial industry is required. Emphasis is on applying state-of-the-art methods to financial time-series of various frequencies. Ability to work with a team to transform ideas/models into robust, intelligible code is key. Salary: commensurate with experience, with bonuses tied to the individual's and the firm's performance. Start date for this position may range anywhere from immediately to six months hence, depending on both the candidate's and the firm's needs. See http://www.ingber.com/open_positions.html for more information. -- Prof. Lester Ingber ingber at ingber.com ingber at alumni.caltech.edu www.ingber.com www.alumni.caltech.edu/~ingber From costa at dsi.unifi.it Fri Mar 8 06:39:19 2002 From: costa at dsi.unifi.it (Fabrizio Costa) Date: Fri, 8 Mar 2002 12:39:19 +0100 (MET) Subject: Ph.D. Thesis Announcement Message-ID: Dear Connectionists, I am pleased to announce that my PhD thesis entitled ------------------------------------------------------------------------ A Connectionist Approach to First-pass Attachment Prediction in Natural Language Parsing Fabrizio Costa Dept. 
of Computer Science University of Florence - Italy ------------------------------------------------------------------------ is now available at http://www.dsi.unifi.it/~costa/online/thesis.ps (and .pdf) Please find the summary of my thesis below.

Fabrizio Costa

Summary
=======

The apparently effortless capability that humans show in understanding natural language is one of the main problems for modern cognitive science. Natural language is extremely ambiguous, and many theories of human parsing claim that ambiguity is resolved by resorting to linguistic experience, i.e. ambiguities are preferentially resolved in the way that has been successful most frequently in the past. Current research is focused on finding what kinds of features are relevant for conditioning choice preferences.

In this thesis we employ a connectionist paradigm (Recursive Neural Networks) capable of processing acyclic graphs to perform supervised learning on syntactic trees extracted from a large corpus of parsed sentences. The architecture is capable of extracting the relevant features directly from inspection of syntactic structure and adaptively encoding this information for the disambiguation task. Following a widely accepted hypothesis in psycholinguistics, we assume an incremental parsing process (one word at a time) that keeps a connected partial parse tree at all times. In this thesis we show how the model can be trained from a large parsed corpus (treebank), leading to good disambiguation performance both on unrestricted text and on a collection of examples from the psycholinguistic experimental literature.

The goals of the thesis are twofold: (1) to demonstrate that a strongly incremental approach is computationally feasible and (2) to present an application of a connectionist architecture that directly processes complex information such as syntactic trees.

In Chapter 1 we set out the background needed to understand the current work.
We highlight a trend in the field of statistical parsers that shows how recently proposed models condition the statistics collected on a growing number of linguistic features. We will review how some recent incremental approaches to parsing show the viability of incrementality from a computational and psycholinguistic viewpoint. A review of the connectionist models applied to NLP follows. In Chapter 2 we formally introduce the recursive neural network architecture and we go through the issues of training and generalization capabilities of the model. In Chapter 3 we introduce the incremental grammar formalism used in the current work. We formalize the task of discriminating the correct successor tree, and we then describe how to use recursive neural networks to incrementally predict syntactic trees. In Chapter 4 we try to gain insight into the relations between the parameters of the proposed system and the learned preferences. In Chapter 5 we study the relationship between the generalization error and the features of the domain. Some transformations are then applied to the input and to the architecture to enhance the generalization capabilities of the model. The effectiveness of the introduced modifications is then validated through a range of experiments. In Chapter 6 we investigate how the modeling properties of the proposed architecture relate to the ambiguity resolution mechanisms employed by human beings. In Chapter 7 we employ the first-pass attachment prediction mechanism to build an initial version of a syntactic incremental probabilistic parser.
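The core operation of a recursive neural network over syntactic trees can be sketched in a few lines: one shared layer is applied bottom-up at every internal node to the concatenation of its children's codes, producing a fixed-size vector for the whole tree. The weights, dimensions, and toy vocabulary below are invented for illustration and do not reproduce the thesis architecture.

```python
import numpy as np

rng = np.random.default_rng(1)
DIM = 4
W = rng.normal(scale=0.5, size=(DIM, 2 * DIM))   # shared composition weights
b = np.zeros(DIM)
EMBED = {w: rng.normal(size=DIM) for w in ["the", "dog", "barked"]}

def encode(tree):
    """Bottom-up encoding of a binary tree: a leaf maps to its word
    vector; an internal node applies the one shared layer to the
    concatenation of its children's codes."""
    if isinstance(tree, str):
        return EMBED[tree]
    left, right = tree
    child = np.concatenate([encode(left), encode(right)])
    return np.tanh(W @ child + b)

# (S (NP the dog) (VP barked)) written as nested pairs
code = encode((("the", "dog"), "barked"))
print(code.shape)   # a fixed-size code for the whole tree
```

In the thesis setting, such a code for each candidate incremental tree would be fed to a scoring layer, and training adjusts the shared weights so the correct attachment outranks the alternatives.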
------------------------------------------------------

Contents

Introduction
1 Parsing, Incrementality and Connectionism
  1.1 Introduction
    1.1.1 The ambiguity issue
  1.2 Adding information to Context-Free Grammars
  1.3 Incremental Parsing
    1.3.1 Incremental parsers
    1.3.2 Top-down left-corner parsing
    1.3.3 Incremental Cascaded Markov Model
  1.4 Connectionist Models for Natural Language Processing
    1.4.1 Recurrent Networks
    1.4.2 Recursive Auto Associative Memory
    1.4.3 Simple Synchrony Networks
  1.5 Conclusions
2 Recursive Neural Networks
  2.1 Introduction
  2.2 Definitions
  2.3 Recursive representation
  2.4 Modeling assumptions
  2.5 Recursive network
  2.6 Encoding network
  2.7 Recursive processing with neural networks
  2.8 Generalization Capabilities
3 Learning first-pass attachment preferences
  3.1 Introduction
  3.2 Incremental trees and connection paths
    3.2.1 Definitions
    3.2.2 Connection paths extraction
    3.2.3 Is incrementality plausible?
    3.2.4 POS tag level
    3.2.5 Left recursion treatment
  3.3 Formal definition of the learning problem
    3.3.1 Universe of connection paths
    3.3.2 The forest of candidates
    3.3.3 The learning task
  3.4 Recursive networks for ranking alternatives
  3.5 Learning procedure
    3.5.1 Dataset compilation
    3.5.2 Network unfolding
    3.5.3 Parameters estimation
4 Network performance
  4.1 Experimental setting
    4.1.1 The data set
  4.2 Learning and generalizing capabilities of the system
    4.2.1 Target function variation
    4.2.2 Parameters variation
    4.2.3 Learning rate variation
    4.2.4 Training set variation
    4.2.5 Network generalization
5 What is the Network really choosing?
  5.1 Structural analyses
    5.1.1 Statistical information on incremental tree features
    5.1.2 Correlation between features and error
    5.1.3 Correct and incorrect predictions characteristics
    5.1.4 Influence of connection paths' frequency
    5.1.5 Filtering out the connection paths' frequency effect
    5.1.6 Filtering out the anchor attachment ambiguity
  5.2 Comparison with linguistic heuristics
  5.3 Enhancing the network performance
    5.3.1 Connection path set reduction
    5.3.2 Tree reduction
    5.3.3 Network specialization
    5.3.4 Anchor shortcut
6 Modeling human ambiguity resolution
  6.1 Introduction
  6.2 Experienced based models
  6.3 Experimental setting
  6.4 Psycholinguistic examples
    6.4.1 NP/S ambiguities
    6.4.2 Relative clause attachment
    6.4.3 Prepositional phrase attachment to noun phrases
    6.4.4 Adverb attachment
    6.4.5 Closure ambiguity
    6.4.6 PP attachment ambiguities
  6.5 Conclusion
7 Toward a connectionist incremental parser
  7.1 Probabilistic framework
    7.1.1 Defining the model structure
    7.1.2 Maximum likelihood estimation
  7.2 An example: PCFG
  7.3 Probabilistic Incremental Grammar
  7.4 Proposal for an incremental parser and modeling issues
  7.5 How to evaluate the parser output
    7.5.1 Definitions
    7.5.2 Evaluation metrics
  7.6 Parser evaluation
    7.6.1 Influence of tree reduction procedure
    7.6.2 Parsing performance issues
    7.6.3 Influence of the connection path coverage
    7.6.4 Influence of the beam size
  7.7 Final remarks
Conclusion

-----------------------------------------------------------------

Fabrizio Costa
Dept. of Systems and Computer Science, University of Florence
Via di Santa Marta 3, I-50139 Firenze - Italy
Phone: +39 055 4796 361
Fax: +39 055 4796 363

From steve at cns.bu.edu Sat Mar 9 06:20:34 2002
From: steve at cns.bu.edu (Stephen Grossberg)
Date: Sat, 9 Mar 2002 06:20:34 -0500
Subject: Adaptive boundary-surface alignment and the McCollough effect
Message-ID:

The following article is now available at http://www.cns.bu.edu/Profiles/Grossberg in PDF and Gzipped Postscript.
Grossberg, S., Hwang, S., and Mingolla, E. (2002). Thalamocortical dynamics of the McCollough effect: Boundary-surface alignment through perceptual learning. Vision Research, in press. Abstract: This article further develops the FACADE neural model of 3-D vision and figure-ground perception to quantitatively explain properties of the McCollough effect. The model proposes that many McCollough effect data result from visual system mechanisms whose primary function is to adaptively align, through learning, boundary and surface representations that are positionally shifted, due to the process of binocular fusion. For example, binocular boundary representations are shifted by binocular fusion relative to monocular surface representations, yet the boundaries must become positionally aligned with the surfaces to control binocular surface capture and filling-in. The model also includes perceptual reset mechanisms that use habituative transmitters in opponent processing circuits. Thus the model shows how McCollough effect data may arise from a combination of mechanisms that have a clear functional role in biological vision. Simulation results with a single set of parameters quantitatively fit data from thirteen experiments that probe the nature of achromatic/chromatic and monocular/binocular interactions during induction of the McCollough effect. The model proposes how perceptual learning, opponent processing, and habituation at both monocular and binocular surface representations are involved, including early thalamocortical sites. In particular, it explains the anomalous McCollough effect utilizing these multiple processing sites. Alternative models of the McCollough effect are also summarized and compared with the present model. From jaz at cs.rhul.ac.uk Mon Mar 11 05:59:20 2002 From: jaz at cs.rhul.ac.uk (J.S. 
Kandola) Date: Mon, 11 Mar 2002 10:59:20 +0000 Subject: REMINDER: Call for Papers: JMLR Special Issue on Machine Learning Methods for Text and Images Message-ID: <02031110592000.01774@trinity> *********** Apologies for Multiple Postings*************** ******************************************************* Call for Papers: JMLR Special Issue on Machine Learning Methods for Text and Images Guest Editors: Jaz Kandola (Royal Holloway College, University of London, UK) Thomas Hofmann (Brown University, USA) Tomaso Poggio (M.I.T., USA) John Shawe-Taylor (Royal Holloway College, University of London, UK) ***Submission Deadline: 29th March 2002*** ================================= Papers are invited reporting original research on Machine Learning Methods for Text and Images. This special issue follows the NIPS 2001 workshop on the same topic, but is also open to contributions that were not presented there. A special volume will be published for this issue. There has been much interest in information extraction from structured and semi-structured data in the machine learning community. This has in part been driven by the large amount of unstructured and semi-structured data available in the form of text documents, images, audio, and video files. In order to optimally utilize this data, one has to devise efficient methods and tools that extract relevant information. We invite original contributions that focus on exploring innovative and potentially groundbreaking machine learning technologies as well as on identifying key challenges in information access, such as multi-class classification, partially labeled examples and the combination of evidence from separate multimedia domains. The special issue seeks contributions applied to text and/or images.
For a list of possible topics and information about the associated NIPS workshop please see http://www.cs.rhul.ac.uk/colt/JMLR.html Important Dates: Submission Deadline: 29th March 2002 Decision: 24th June 2002 Final Papers: 24th July 2002 Many thanks Jaz Kandola, Thomas Hofmann, Tommy Poggio and John Shawe-Taylor From laubach at kafka.med.yale.edu Mon Mar 11 13:51:41 2002 From: laubach at kafka.med.yale.edu (Mark Laubach) Date: Mon, 11 Mar 2002 13:51:41 -0500 Subject: positions available Message-ID: POSTDOCTORAL POSITIONS IN SYSTEMS NEUROPHYSIOLOGY AND NEUROENGINEERING Four postdoctoral positions are available immediately as part of a collaborative project involving the laboratories of Drs. Jon Kaas and Troy Hackett at Vanderbilt University, Dr. Mandayam Srinivasan at MIT, and Dr. Mark Laubach at Yale University. Two positions in systems neurophysiology are available. These positions, primarily at Vanderbilt, will involve using multi-electrode recording methods to study auditory-related areas of the macaque cerebral cortex. We are seeking one person with experience in the auditory system and another with experience in multi-electrode recording methods. Two positions that bridge the gap between physiology and engineering are also available, primarily at Yale. One of these positions requires expertise in neurophysiology, electronics, and computer-based instrumentation. This person will develop, test, and implement new procedures for multi-electrode microstimulation. The second position requires expertise in modern methods for data analysis (multispectral techniques and machine learning/pattern recognition) and/or real-time data acquisition and analysis. A major requirement for all these positions is strong skills in teamwork and communication (in English). We request that only qualified and serious candidates apply.
If interested, please send a vita, a list of publications, a summary of research experience, and the names of three individuals who can be contacted as references to: Mark Laubach, Ph.D. The John B. Pierce Laboratory Yale University School of Medicine 290 Congress Ave New Haven CT 06519 E-mail: laubach at kafka.med.yale.edu http://www.jbpierce.org/staff/laubach.html From bbs at bbsonline.org Mon Mar 11 14:39:35 2002 From: bbs at bbsonline.org (Behavioral & Brain Sciences) Date: Mon, 11 Mar 2002 14:39:35 -0500 Subject: Northoff: CATATONIA: BBS Call for Commentators Message-ID: Below is a link to the forthcoming BBS target article WHAT CATATONIA CAN TELL US ABOUT "TOP-DOWN MODULATION": A NEUROPSYCHIATRIC HYPOTHESIS by George Northoff http://www.bbsonline.org/Preprints/Northoff This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please reply by EMAIL within two (2) weeks to: calls at bbsonline.org The Calls are sent to 10,000 BBS Associates, so there is no expectation (indeed, it would be calamitous) that each recipient should comment on every occasion! Hence there is no need to reply except if you wish to comment, or to nominate someone to comment. If you are not a BBS Associate, please approach a current BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work to nominate you. All past BBS authors, referees and commentators are eligible to become BBS Associates. 
A full electronic list of current BBS Associates is available at this location to help you select a name: http://www.bbsonline.org/Instructions/assoclist.html If no current BBS Associate knows your work, please send us your Curriculum Vitae and BBS will circulate it to appropriate Associates to ask whether they would be prepared to nominate you. (In the meantime, your name, address and email address will be entered into our database as an unaffiliated investigator.) To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the online BBSPrints Archive, at the URL that follows the abstract below. _____________________________________________________________ WHAT CATATONIA CAN TELL US ABOUT "TOP-DOWN MODULATION": A NEUROPSYCHIATRIC HYPOTHESIS George Northoff, MD PhD, PhD Harvard University Beth Israel Hospital Boston, Massachusetts KEYWORDS: Catatonia; Parkinson's disease; Top-down modulation; Bottom-up modulation; Horizontal modulation; Vertical modulation ABSTRACT: Differential diagnosis of motor symptoms, as for example akinesia, may be difficult in clinical neuropsychiatry. They may be either of neurologic origin, as for example Parkinson's disease, or of psychiatric origin, as for example catatonia, leading to a so-called "conflict of paradigms". Despite their different origins, symptoms may appear clinically more or less similar. The possibility of dissociation between origin and clinical appearance may reflect functional brain organisation in general and cortical-cortical/subcortical relations in particular.
It is therefore hypothesized that similarities and differences between Parkinson's disease and catatonia may be accounted for by distinct kinds of modulation between cortico-cortical and cortico-subcortical relations. Comparison between Parkinson's disease and catatonia reveals a distinction between two kinds of modulation: "vertical and horizontal modulation". "Vertical modulation" concerns cortical-subcortical relations and apparently allows for bidirectional modulation. This is reflected in the possibility of both "top-down and bottom-up modulation" and the appearance of motor symptoms in both Parkinson's disease and catatonia. "Horizontal modulation" concerns cortical-cortical relations and apparently allows only for unidirectional modulation. This is reflected in one-way connections from prefrontal cortex to motor cortex and the absence of major affective and behavioural symptoms in Parkinson's disease. It is concluded that comparison between Parkinson's disease and catatonia may reveal the nature of modulation of cortico-cortical and cortico-subcortical relations in further detail. http://www.bbsonline.org/Preprints/Northoff ___________________________________________________________ Please do not prepare a commentary yet. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. We will then let you know whether it was possible to include your name on the final formal list of invitees. _______________________________________________________________________ *** SUPPLEMENTARY ANNOUNCEMENT *** (1) Call for Book Nominations for BBS Multiple Book Review In the past, Behavioral and Brain Sciences (BBS) had only been able to do 1-2 BBS multiple book treatments per year, because of our limited annual page quota.
BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates and biobehavioral/cognitive scientists in general to nominate books you would like to see accorded BBS multiple book review. (Authors may self-nominate, but books can only be selected on the basis of multiple nominations.) It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!). *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-* Please note: Your email address has been added to our user database for Calls for Commentators, the reason you received this email. If you do not wish to receive further Calls, please feel free to change your mailshot status through your User Login link on the BBSPrints homepage, using your username and password above: http://www.bbsonline.org/ *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-* Ralph BBS ------------------------------------------------------------------- Ralph DeMarco Associate Editorial Coordinator Behavioral and Brain Sciences Journals Department Cambridge University Press 40 West 20th Street New York, NY 10011-4211 UNITED STATES bbs at bbsonline.org http://bbsonline.org Tel: +001 212 924 3900 ext.374 Fax: +001 212 645 5960 ------------------------------------------------------------------- From escorchado at ubu.es Tue Mar 12 13:35:48 2002 From: escorchado at ubu.es (escorchado@ubu.es) Date: Tue, 12 Mar 2002 18:35:48 GMT Subject: 2 POSTDOC. COMPUTING VISITING POSITIONS AVAILABLE Message-ID: <200203121835.g2CIZkC01459@estafeta.cid.ubu.es> 2 POSTDOC. COMPUTING VISITING POSITIONS Computer and Information Systems Department University of Burgos, BURGOS, SPAIN. 
We are looking for personnel to play a full role in the research, teaching and administrative activities of the Department. The Department has established main research areas in: Artificial Neural Networks; Data Mining; Web Technologies; Software Engineering and Object Oriented Methods; Object Oriented Languages and Databases. We are also interested in GIS and Computer-Supported Cooperative Work. Salary will be within the Associate Professor scale (28,250 euros). The positions are limited to five years maximum. You may informally contact Emilio Corchado (escorchado at ubu.es). We apologize if you receive this message several times. ---------------------------------- The city of Burgos has a population of 165,000 inhabitants. Burgos is the capital of the province of the same name, one of the nine provinces that form the Autonomous Community of Castile and Leon. The city is outstanding for its exceptional historic and artistic heritage and for its geographical location. As a result, Burgos is one of the most visited cities in Spain. Burgos is also very well known for its important food and car industry. Further information about the city of Burgos is available at http://www.ubu.es/relacinter/eng_guiaextranjero.htm. The University of Burgos is a young and dynamic institution. It existed as a campus of the University of Valladolid from 1972, but in 1994 it became an independent university. A three-year degree in Information Systems has been taught since 1995, and this year the studies have been extended to a five-year degree.
From bbs at bbsonline.org Thu Mar 14 17:15:49 2002 From: bbs at bbsonline.org (Behavioral & Brain Sciences) Date: Thu, 14 Mar 2002 17:15:49 -0500 Subject: BBS Call for Commentators: Perruchet: THE SELF-ORGANIZING CONSCIOUSNESS Message-ID: Below is a link to the forthcoming BBS target article THE SELF-ORGANIZING CONSCIOUSNESS by Pierre Perruchet and Annie Vinter http://www.bbsonline.org/Preprints/Perruchet This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please reply by EMAIL within three (3) weeks to: calls at bbsonline.org The Calls are sent to 10,000 BBS Associates, so there is no expectation (indeed, it would be calamitous) that each recipient should comment on every occasion! Hence there is no need to reply except if you wish to comment, or to nominate someone to comment. If you are not a BBS Associate, please approach a current BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work to nominate you. All past BBS authors, referees and commentators are eligible to become BBS Associates. A full electronic list of current BBS Associates is available at this location to help you select a name: http://www.bbsonline.org/Instructions/assoclist.html If no current BBS Associate knows your work, please send us your Curriculum Vitae and BBS will circulate it to appropriate Associates to ask whether they would be prepared to nominate you. (In the meantime, your name, address and email address will be entered into our database as an unaffiliated investigator.) 
To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the online BBSPrints Archive, at the URL that follows the abstract below. ____________________________________________________________ THE SELF-ORGANIZING CONSCIOUSNESS Pierre Perruchet and Annie Vinter Université de Bourgogne LEAD/CNRS Dijon, France KEYWORDS: Associative learning; automatism; consciousness; development; implicit learning; incubation; language; mental representation; perception; phenomenal experience ABSTRACT: We propose that the isomorphism generally observed between the representations composing our momentary phenomenal experience and the structure of the world is the end-product of a progressive organization that emerges thanks to elementary associative processes that take our conscious representations themselves as the stuff on which they operate, a thesis that we summarize in the concept of Self-Organizing Consciousness (SOC). We show that the SOC framework accounts for the discovery of words and objects, and for word-object mapping. We then argue that isomorphic representations may underlie seemingly rule-governed behavior, as is observed in the areas of implicit learning of arbitrary structures, language, problem solving, and automatisms. This analysis provides support for the so-called "mentalistic" framework (e.g. Dulany, 1997), which avoids postulating the existence of unconscious representations and computations. http://www.bbsonline.org/Preprints/Perruchet ___________________________________________________________ Please do not prepare a commentary yet. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article.
We will then let you know whether it was possible to include your name on the final formal list of invitees. _______________________________________________________________________ *** SUPPLEMENTARY ANNOUNCEMENT *** (1) Call for Book Nominations for BBS Multiple Book Review In the past, Behavioral and Brain Sciences (BBS) had only been able to do 1-2 BBS multiple book treatments per year, because of our limited annual page quota. BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates and biobehavioral/cognitive scientists in general to nominate books you would like to see accorded BBS multiple book review. (Authors may self-nominate, but books can only be selected on the basis of multiple nominations.) It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!). *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-* Please note: Your email address has been added to our user database for Calls for Commentators, the reason you received this email. 
If you do not wish to receive further Calls, please feel free to change your mailshot status through your User Login link on the BBSPrints homepage, using your username and password above: http://www.bbsonline.org/ *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-* Ralph BBS ------------------------------------------------------------------- Ralph DeMarco Associate Editorial Coordinator Behavioral and Brain Sciences Journals Department Cambridge University Press 40 West 20th Street New York, NY 10011-4211 UNITED STATES bbs at bbsonline.org http://bbsonline.org Tel: +001 212 924 3900 ext.374 Fax: +001 212 645 5960 ------------------------------------------------------------------- From butz at illigal.ge.uiuc.edu Thu Mar 14 21:03:38 2002 From: butz at illigal.ge.uiuc.edu (Martin Butz) Date: Thu, 14 Mar 2002 20:03:38 -0600 (CST) Subject: 2nd CFP - ABiALS 2002 - Adaptive Behavior in Anticipatory Learning Systems Message-ID: (We apologize if you receive multiple copies) ######################################################################## 2nd C A L L F O R P A P E R S ABiALS Workshop 2002 Adaptive Behavior in Anticipatory Learning Systems ######################################################################## August 11, 2002 Edinburgh, Scotland http://www-illigal.ge.uiuc.edu/ABiALS to be held during the seventh international conference on Simulation of Adaptive Behavior (SAB'02) http://www.isab.org.uk/sab02/ This workshop aims for an interdisciplinary gathering of people interested in how anticipations can guide behavior as well as how an anticipatory influence can be implemented in an adaptive behavior system. In particular, we are looking for adaptive behavior systems that incorporate online anticipation mechanisms. NEWS: - Submission deadline approaching - please submit your contribution by the 31st of March. - Review form online: We put the review form up on the web-page to give an idea of the most important acceptance criteria.
___________________________________________________________________________ Aim and Objectives: Most of the research in recent years on artificial adaptive behavior with respect to model learning and anticipatory behavior has focused on the model learning side. Research is particularly engaged in online generalized model learning. Up to now, though, exploitation of the model has been done mainly to show that exploitation is possible or that an appropriate model exists in the first place. Only very few applications exist that show the utility of the model for the simulation of anticipatory processes and a consequent adaptive behavior. The aim of this workshop is to bring together researchers that are interested in anticipatory processes and essentially anticipatory adaptive behavior. We aim for an interdisciplinary gathering that brings together researchers from distinct areas, so as to discuss the different guises that anticipation takes in these different perspectives. The workshop intends, however, to focus on anticipations in the form of low-level computational processes rather than high-level processes such as explicit planning. Essential questions:
* How can anticipations influence the adaptive behavior of an artificial learning system?
* How can anticipatory adaptive behavior be implemented in an artificial learning system?
* How does an incomplete model influence anticipatory behavior?
* How do anticipations guide further model learning?
* How do anticipations control attention?
* Can anticipations be used for the detection of special environmental properties?
* What are the benefits of anticipations for adaptive behavior?
* What is the trade-off between simple bottom-up stimulus-response driven behavior and more top-down anticipatory driven behavior?
* In what respect does anticipation mediate between low-level environmental processing and more complex cognitive simulation?
* What role do anticipations play for the implementation of motivations and emotions?
___________________________________________________________________________ Submission: Submissions for the workshop should address or at least be related to one of the questions listed above. However, other approaches to anticipatory adaptive behavior are encouraged as well. The workshop is not limited to one particular type of anticipatory learning system or a particular representation of anticipations. However, the learning system should learn its anticipatory representation online rather than being provided with a model of the world beforehand. Nonetheless, background knowledge of a typical environment can be incorporated (and is probably inevitably embodied in the provided sensors, actions, and the coding in any adaptive system). Since this is a full day workshop, we hope to be able to provide more time for presentations and discussions. In that way, the advantages and disadvantages of the different learning systems should become clearer. Several discussion sessions are also planned, in which anticipatory influences will be discussed in a broader sense. Papers will be reviewed for acceptance by the program committee and the organizers. Papers should be submitted electronically to one of the organizers via email in pdf or ps format. Electronic submission is strongly encouraged. If you cannot submit your contribution electronically, please contact one of the organizers. Submitted papers should be between 10 and 20 pages in 10pt, one-column format. The LNCS Springer-Verlag style is preferred (see http://www.springer.de/comp/lncs/authors.html). Submission deadline is the 31st of March 2002. Depending on the quality and number of contributions, we hope to be able to publish Post-Workshop proceedings as either a Springer LNAI volume or a special issue of a journal.
For more information please refer to www-illigal.ge.uiuc.edu/ABiALS/
___________________________________________________________________________ Important Dates:
31 March 2002: Deadline for Submissions
15 May 2002: Notification of Acceptance
15 June 2002: Camera Ready Version for SAB Workshop Proceedings
11 August 2002: Workshop ABiALS
___________________________________________________________________________ Program Committee: Emmanuel Daucé, Faculté des sciences du sport, Université de la Méditerranée, Marseille, France; Ralf Moeller, Cognitive Robotics, Max Planck Institute for Psychological Research, Munich, Germany; Wolfgang Stolzmann, DaimlerChrysler AG, Berlin, Germany; Jun Tani, Lab. for Behavior and Dynamic Cognition, Brain Science Institute, RIKEN, 2-1 Hirosawa, Wako-shi, Saitama, 351-0198 Japan; Stewart W. Wilson, President, Prediction Dynamics, USA
___________________________________________________________________________ Organizers: Martin V. Butz, Illinois Genetic Algorithms Laboratory (IlliGAL), University of Illinois at Urbana-Champaign, Illinois, USA; also: Department of Cognitive Psychology, University of Wuerzburg, Germany; butz at illigal.ge.uiuc.edu http://www-illigal.ge.uiuc.edu/~butz Pierre Gérard, AnimatLab, University Paris VI, Paris, France; pierre.gerard at lip6.fr http://animatlab.lip6.fr/Gerard Olivier Sigaud, AnimatLab, University Paris VI, Paris, France; olivier.sigaud at lip6.fr http://animatlab.lip6.fr/Sigaud From ckiw at dai.ed.ac.uk Fri Mar 15 14:21:58 2002 From: ckiw at dai.ed.ac.uk (Chris Williams) Date: Fri, 15 Mar 2002 19:21:58 +0000 (GMT) Subject: MSc study in Learning from Data at Edinburgh Message-ID: Dear colleagues, I would be grateful if you could forward this mail to any strong final year undergraduates (or equivalent) who are looking to do MSc study in the machine learning/probabilistic modelling areas.
Chris Williams Chris Williams ckiw at dai.ed.ac.uk Institute for Adaptive and Neural Computation Division of Informatics, University of Edinburgh 5 Forrest Hill, Edinburgh EH1 2QL, Scotland, UK fax: +44 131 650 6899 tel: (direct) +44 131 651 1212 (department switchboard) +44 131 650 3090 http://www.dai.ed.ac.uk/homes/ckiw/ --------------------------------------------------------------------------- MSc In Informatics: Learning from Data -------------------------------------- Division of Informatics, University of Edinburgh Edinburgh, UK The Division of Informatics, University of Edinburgh runs a wide-ranging taught Master of Science (MSc) programme. Within this, the Learning from Data specialism teaches students about methods from machine learning, probabilistic graphical modelling, pattern recognition and neural networks. These methods underpin the analysis of learning in natural and artificial systems, and are also of vital importance in the analysis, interpretation and exploitation of the increasing amounts of electronic data that are being collected in scientific, engineering and commercial environments. The Learning from Data specialism comprises eight taught modules, plus a five month long research project. Of the eight modules, there are three compulsory inner core modules (Learning from Data 1, Probabilistic Modelling and Reasoning, Data Mining and Exploration), and a further two should be chosen from five outer core modules (Learning from Data 2, Genetic Algorithms and Genetic Programming, Statistical Natural Language Processing, Database Systems, Visualisation). The remaining three modules can be chosen from the whole range of around 50 available MSc modules. The course is funded by a Masters Training Package from the Engineering and Physical Sciences Research Council (EPSRC). 
We have a number of grants available to support UK and EC students who have a very good honours degree (or equivalent qualification) in a numerate area such as Computer Science, Physics, etc. The Division of Informatics at Edinburgh enjoys an international reputation for its teaching and research. It was awarded the top 5*A rating in Computer Science in the 2001 UK Research Assessment Exercise. We also received a top excellent rating in the most recent SHEFC Teaching Quality Assessment. For further details and an application form contact MSc Admissions, Division of Informatics, University of Edinburgh, 5 Forrest Hill, Edinburgh, EH1 2QL, Scotland, UK or visit our web site at http://www.informatics.ed.ac.uk/prospectus/graduate/msc.html. Informal enquiries may be made to msc-admissions at informatics.ed.ac.uk. Preferential treatment for admission in October 2002 will be given to applications received by 31 March 2002. From hamilton at may.ie Fri Mar 15 06:42:35 2002 From: hamilton at may.ie (Hamilton Institute) Date: Fri, 15 Mar 2002 11:42:35 -0000 Subject: Research Positions (Statistical Machine Learning), Hamilton Institute, Ireland Message-ID: <00cc01c1cc16$8086a8a0$128a9f82@pwhm6> RESEARCH POSITIONS The Hamilton Institute is a new multi-disciplinary research centre currently undergoing substantial expansion. Funded by Science Foundation Ireland, we are committed to research excellence. Applications are invited from well qualified candidates for a number of research positions. The successful candidates will be outstanding researchers who can demonstrate an exceptional research track record or significant research potential at international level. We currently have an active programme of research in statistical machine learning and probabilistic reasoning methods in the context of nonlinear and switched/hybrid dynamics (particularly computer controlled systems and human-computer interfaces).
Experience in topics including nonparametric regression and classification, Bayesian statistics, MCMC and time series analysis would be beneficial. These posts offer a unique opportunity for tackling fundamental problems in a leading edge multi-disciplinary research group with state of the art facilities which is currently undergoing substantial expansion. All appointments are initially for 3 years, extendable to 5. We are committed to research excellence and appointments will reflect this. Remuneration will reflect qualifications and experience, but will be internationally competitive. Further information: Please visit our web site at: http://www.hamilton.may.ie Enquiries to Prof. Douglas Leith, doug.leith at may.ie From K.Branney at elsevier.nl Fri Mar 15 11:01:28 2002 From: K.Branney at elsevier.nl (Branney, Kate (ELS)) Date: Fri, 15 Mar 2002 17:01:28 +0100 Subject: Call for Submissions - Neurocomputing Letters Message-ID: <46414F09B351C64BAA875CE0B37BE07101446499@elsamsvexch02.elsevier.nl> Neurocomputing Letters NEUROCOMPUTING An International Journal published by Elsevier Science B.V., vol. 49-55, 28 issues, in 2003 ISSN 0925-2312, URL: http://www.elsevier.com/locate/neucom Neurocomputing is pleased to invite authors to submit letters, concise papers and short communications (jointly called "Letters") aimed at rapid publication. The review of this type of submission will be made subject to a special fast procedure and will only take place if the following conditions are met:
- Size: no longer than five (5) A4 pages typed in 12-point double spacing,
- Scope: the content of the submitted manuscript fits within the general Aims and Scope of the journal,
- Content: about a new development which is served by rapid publication, relevant comments on articles published in the journal, or outstanding preliminary results of current research.
Please send your submission in one of two electronic forms - pdf or word doc - by email directly to our Letters Editor with a copy (cc) to the Editor-in-Chief. The Letters Editor will determine whether the submitted manuscript meets all requirements for a letter submission and is in charge of the complete review procedure. If all requirements are met, the Letters Editor will make the submission subject to a special, rapid procedure: - the author can expect a review report on his/her letter within eight (8) weeks after submission, - the letter shall be either accepted or rejected; if accepted, at most minor corrections will be requested, - upon acceptance of the letter, publication will take place in the first available issue of the journal, within approximately the next 3 to 4 months, - revised versions of rejected manuscripts may only be resubmitted as regular papers. The resubmission must include copies of the original referees' comments and separate pages with the authors' responses to all these comments. Instructions to authors for regular papers and further information on the Letters section can be found on the inside back cover or at http://www.elsevier.com/locate/neucom From helge at TechFak.Uni-Bielefeld.DE Sat Mar 16 14:10:29 2002 From: helge at TechFak.Uni-Bielefeld.DE (Helge Ritter) Date: Sat, 16 Mar 2002 20:10:29 +0100 Subject: open position Message-ID: <3C9398A5.1A7EA5EF@techfak.uni-bielefeld.de> Dear Colleagues: The research group Neuroinformatics (Prof. Helge Ritter) at the University of Bielefeld is offering a research project position for a Research Assistant with salary according to BAT-IIa. The position will be affiliated with the Special Collaborative Research Unit (Sonderforschungsbereich) SFB 360: Situated Artificial Communicators ( http://www.sfb360.uni-bielefeld.de/sfbengl.html ). The research topic of the project is visual instruction of robot actions based on teaching by showing. The project provides the opportunity for a dissertation.
Applicants should have a university degree (Masters or German diploma) in computer science, electrical engineering or physics and should have a good knowledge of Unix/C/C++ programming. A good background in the fields of robotics/neural networks/computer vision is desirable. The University of Bielefeld follows a policy of increasing the proportion of female employees in fields where women are underrepresented and therefore particularly encourages women to apply. Applications from suitably qualified disabled persons are welcome. Further information can be obtained from our group homepage: http://www.TechFak.Uni-Bielefeld.DE/techfak/ags/ni/ the university homepage: http://www.Uni-Bielefeld.DE Applications should be sent to: Prof. Dr. Helge Ritter Neural Computation Group Faculty of Technology Bielefeld University 33501 Bielefeld mailto:helge at techfak.uni-bielefeld.de From n.lawrence at dcs.shef.ac.uk Mon Mar 18 04:49:12 2002 From: n.lawrence at dcs.shef.ac.uk (Neil Lawrence) Date: Mon, 18 Mar 2002 09:49:12 -0000 Subject: PhD Studentship available Message-ID: <000101c1ce62$28a84dc0$4008a78f@leonardo> Applications are invited to fill a PhD studentship at the University of Sheffield, U.K. Funding is available under EPSRC grant number GR/R84801/01 - `Learning Classifiers from Sloppily Labelled Data' - for a three-year EPSRC studentship. This project is a collaborative effort between Neil Lawrence of the Machine Learning group at Sheffield (http://www.thelawrences.net/neil/) and Bernhard Schoelkopf (http://www.kyb.tuebingen.mpg.de/bu/people/bs) of the Max Planck Institute for Biological Cybernetics in Tuebingen, Germany. The studentship is scheduled to start in September 2002. The principal requirement for applicants is a strong mathematical background developed through a first degree in Physics, Mathematics, Engineering or Computer Science. Additional knowledge of the fundamentals of machine learning would also be an advantage.
Informal queries may be made via e-mail to neil at dcs.shef.ac.uk From bernhard.schoelkopf at tuebingen.mpg.de Mon Mar 18 13:07:20 2002 From: bernhard.schoelkopf at tuebingen.mpg.de (Bernhard Schoelkopf) Date: Mon, 18 Mar 2002 19:07:20 +0100 Subject: Openings at the Max Planck Institute in Tuebingen Message-ID: <00c501c1cea7$bef21140$9865fea9@VAIO> Dear Colleagues: Several positions at various levels (MSc, PhD, Postdoc) are available in a new department at the Max Planck Institute in Tuebingen, Germany, studying learning theory and algorithms with applications in various domains (computer vision, bioinformatics, psychophysics modeling - for further information, cf. www.kyb.tuebingen.mpg.de/bs/index.html). We invite applications of candidates with an outstanding academic record including a strong mathematical background. Please forward this message to suitable candidates. Max Planck Institutes are publicly funded research labs with an emphasis on excellence in basic research. Tuebingen is a small university town in southern Germany, see http://www.tuebingen.de/kultur/english/index.html for some pictures. Please submit applications (hardcopy or e-mail, including names of potential referees) by the end of April to: Bernhard Schoelkopf Max-Planck-Institut fuer biologische Kybernetik Spemannstr.38 72076 Tuebingen Germany Tel. +49 7071 601 551 Fax +49 7071 601 577 http://www.kyb.tuebingen.mpg.de/~bs From David.Cohn at acm.org Mon Mar 18 15:04:47 2002 From: David.Cohn at acm.org (David 'Pablo' Cohn) Date: Mon, 18 Mar 2002 15:04:47 -0500 Subject: jmlr-announce: JMLR special issue on shallow parsing is now available Message-ID: The Journal of Machine Learning Research is pleased to announce the Special Issue on Machine Learning Approaches to Shallow Parsing, available online at http://www.jmlr.org. 
----------------------------------------
JMLR Special Issue on Shallow Parsing - contents:

Introduction to Special Issue on Machine Learning Approaches to Shallow Parsing - James Hammerton, Miles Osborne, Susan Armstrong, Walter Daelemans, pp. 551-558
Memory-Based Shallow Parsing - Erik F. Tjong Kim Sang, pp. 559-594
Shallow Parsing using Specialized HMMs - Antonio Molina, Ferran Pla, pp. 595-613
Text Chunking based on a Generalization of Winnow - Tong Zhang, Fred Damerau, David Johnson, pp. 615-637
Shallow Parsing with PoS Taggers and Linguistic Features - Beata Megyesi, pp. 639-668
Learning Rules and Their Exceptions - Herve Dejean, pp. 669-693
Shallow Parsing using Noisy and Non-Stationary Training Material - Miles Osborne, pp. 695-719
----------------------------------------

All papers in the special issue, as well as all previous JMLR papers, are available electronically at http://www.jmlr.org/ in PostScript and PDF formats. Many are also available in HTML. The papers of Volume 1 are also available in hardcopy from the MIT Press; please see http://mitpress.mit.edu/JMLR for details. -David Cohn, Managing Editor, Journal of Machine Learning Research From bbs at bbsonline.org Mon Mar 18 16:28:55 2002 From: bbs at bbsonline.org (Behavioral & Brain Sciences) Date: Mon, 18 Mar 2002 16:28:55 -0500 Subject: BBS Call for Commentators: Williams: FACIAL EXPRESSION OF PAIN: AN EVOLUTIONARY ACCOUNT Message-ID: Below is a link to the forthcoming BBS target article FACIAL EXPRESSION OF PAIN: AN EVOLUTIONARY ACCOUNT by Amanda Williams http://www.bbsonline.org/Preprints/Williams This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate.
To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please reply by EMAIL within three (3) weeks to: calls at bbsonline.org The Calls are sent to 10,000 BBS Associates, so there is no expectation (indeed, it would be calamitous) that each recipient should comment on every occasion! Hence there is no need to reply except if you wish to comment, or to nominate someone to comment. If you are not a BBS Associate, please approach a current BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work to nominate you. All past BBS authors, referees and commentators are eligible to become BBS Associates. A full electronic list of current BBS Associates is available at this location to help you select a name: http://www.bbsonline.org/Instructions/assoclist.html If no current BBS Associate knows your work, please send us your Curriculum Vitae and BBS will circulate it to appropriate Associates to ask whether they would be prepared to nominate you. (In the meantime, your name, address and email address will be entered into our database as an unaffiliated investigator.) To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the online BBSPrints Archive, at the URL that follows the abstract below. 
_______________________________________________________ FACIAL EXPRESSION OF PAIN: AN EVOLUTIONARY ACCOUNT Amanda C de C Williams University of London St Thomas' Hospital London, UK KEYWORDS: pain; facial expression; adaptation; evolutionary psychology ABSTRACT: This paper proposes that human expression of pain in the presence or absence of caregivers, and the detection of pain by observers, arise from evolved propensities. The function of pain is to demand attention and prioritise escape, recovery and healing; where others can help achieve these goals, effective communication of pain is required. Evidence is reviewed of a distinct and specific facial expression of pain from infancy to old age, consistent across stimuli, and recognizable as pain by observers. Voluntary control over its amplitude is incomplete, and observers detect pain better when the individual attempts to suppress it than when the individual attempts to amplify or simulate it. In many clinical and experimental settings, facial expression of pain is incorporated with verbal and nonverbal-vocal activity, posture and movement in an overall category of pain behaviour. This is assumed by clinicians to be under the operant control of social contingencies such as sympathy, caregiving, and practical help; thus strong facial expression is presumed to constitute an attempt to manipulate these contingencies by amplification of the normal expression. Operant formulations support skepticism about the presence or extent of pain, judgements of malingering, and sometimes the withholding of caregiving and help. However, to the extent that pain expression is influenced by environmental contingencies, "amplification" could equally plausibly constitute release of suppression according to evolved contingent propensities which guide behaviour. Pain has been largely neglected both in the evolutionary literature and in the literature on pain expression, but an evolutionary account can generate improved assessment of pain and reactions to it.
http://www.bbsonline.org/Preprints/Williams ====================================================================== IMPORTANT Please do not prepare a commentary yet. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. We will then let you know whether it was possible to include your name on the final formal list of invitees. ======================================================================= _______________________________________________________________________ *** SUPPLEMENTARY ANNOUNCEMENT *** (1) Call for Book Nominations for BBS Multiple Book Review In the past, Behavioral and Brain Sciences (BBS) had only been able to do 1-2 BBS multiple book treatments per year, because of our limited annual page quota. BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates and biobehavioral/cognitive scientists in general to nominate books you would like to see accorded BBS multiple book review. (Authors may self-nominate, but books can only be selected on the basis of multiple nominations.) It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!). *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-* Please note: Your email address has been added to our user database for Calls for Commentators, the reason you received this email. 
If you do not wish to receive further Calls, please feel free to change your mailshot status through your User Login link on the BBSPrints homepage, using your username and password above: http://www.bbsonline.org/ *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-* Ralph BBS ------------------------------------------------------------------- Ralph DeMarco Associate Editorial Coordinator Behavioral and Brain Sciences Journals Department Cambridge University Press 40 West 20th Street New York, NY 10011-4211 UNITED STATES bbs at bbsonline.org http://bbsonline.org Tel: +001 212 924 3900 ext.374 Fax: +001 212 645 5960 ------------------------------------------------------------------- From nnsp02 at neuro.kuleuven.ac.be Mon Mar 18 04:06:48 2002 From: nnsp02 at neuro.kuleuven.ac.be (Neural Networks for Signal Processing 2002) Date: Mon, 18 Mar 2002 10:06:48 +0100 Subject: NNSP'02 IEEE Workshop, Neural Networks for Signal Processing, 2nd Call for Papers Message-ID: <3C95AE28.A638CDFA@neuro.kuleuven.ac.be> 2002 IEEE Workshop Neural Networks for Signal Processing September 4-6, 2002 Martigny, Valais, Switzerland http://eivind.imm.dtu.dk/nnsp2002 Sponsored by the IEEE Signal Processing Society In cooperation with the IEEE Neural Networks Council Call for Papers Thanks to the sponsorship of IEEE Signal Processing Society and IEEE Neural Network Council, the twelfth of a series of IEEE workshops on Neural Networks for Signal Processing will be held in Martigny (http://www.martigny.ch ), Switzerland, at the ``Centre du Parc'' (http://www.hotelduparc.ch ). The workshop will feature keynote addresses, technical presentations and panel discussions. 
Papers are solicited for, but not limited to, the following areas:

Algorithms and Architectures: Artificial neural networks, kernel methods, committee models, independent component analysis, adaptive and/or nonlinear signal processing, (hidden) Markov models, Bayesian modeling, parameter estimation, generalization, optimization, design algorithms.

Applications: Speech processing, image processing (computer vision, OCR), multimodal interactions, multi-channel processing, intelligent multimedia and web processing, robotics, sonar and radar, bio-medical engineering, bio-informatics, financial analysis, time series prediction, blind source separation, data fusion, data mining, adaptive filtering, communications, sensors, system identification, and other signal processing and pattern recognition applications.

Implementations: Parallel and distributed implementation, hardware design, and other general implementation technologies.

Further Information: NNSP'2002 webpage: http://eivind.imm.dtu.dk/nnsp2002

Paper Submission Procedure: Prospective authors are invited to submit a full paper of up to ten pages using the electronic submission procedure described at the workshop homepage. Accepted papers will be published in a hard-bound volume by IEEE and distributed at the workshop.
Schedule
Submission of full paper: April 15, 2002
Notification of acceptance: May 1, 2002
Camera-ready paper and author registration: June 1, 2002
Advance registration, before: July 15, 2002

Preliminary Programme

September 4, AM: Plenary talk by Mahesan Niranjan, Sheffield University, "From Kalman to Particle Filtering for solving Signal Processing Problems with Neural Networks"; regular session, including oral and poster presentations.

September 4, PM: Special session on "Machine Learning and statistical approaches for Bio-Informatic Applications"; plenary talk by Anders Krogh, University of Denmark, "Hidden Markov models of proteins and DNA", followed by the special session, including oral and poster presentations.

September 5, AM: Plenary talk by Yoshua Bengio, University of Montreal, "Faking unlabeled data for geometric regularization"; regular session, including oral and poster presentations.

September 5, PM: Special session on "Multimodal/multi-channel processing"; plenary talk by Josef Kittler, University of Surrey, "Fusion of multiple experts in multimodal biometric personal identity verification systems".

September 6, AM: Plenary talk by Zoubin Ghahramani, Gatsby Institute, UK, "Occam's Razor and Infinite Models"; regular oral and poster sessions.
Invited Speakers

Yoshua Bengio, University of Montreal. Title of the talk: Faking unlabeled data for geometric regularization
Zoubin Ghahramani, Gatsby Institute, UK. Title of the talk: Occam's Razor and Infinite Models
Josef Kittler, University of Surrey. Invited for a Special Session on Multimodal and multi-channel applications. Title of the talk: Fusion of multiple experts in multimodal biometric personal identity verification systems
Anders Krogh, University of Denmark. Invited for a Special Session on Bio-Informatics. Title of the talk: Hidden Markov models of proteins and DNA
Mahesan Niranjan, Sheffield University. Title of the talk: From Kalman to Particle Filtering for solving Signal Processing Problems with Neural Networks

General Chairs: Herve BOURLARD, IDIAP and EPF, Lausanne; Tulay ADALI, University of Maryland Baltimore County
Program Chairs: Samy BENGIO, IDIAP; Jan LARSEN, Technical University of Denmark
Technical Committee Chair: Jose PRINCIPE, University of Florida at Gainesville
Finance Chair: Jean-Philippe THIRAN, EPF, Lausanne
Proceedings Chairs: Jean-Cédric CHAPPELIER, EPF, Lausanne; Scott C. DOUGLAS, Southern Methodist University
Publicity Chair: Marc VAN HULLE, Katholieke Universiteit, Leuven
American Liaison: Jose PRINCIPE, University of Florida at Gainesville
Asia Liaison: Shigeru KATAGIRI, NTT Communication Science Laboratories

Program Committee: Amir Assadi, Andrew Back, Samy Bengio, Yoshua Bengio, M.
Carreira-Perpinan, Jean-Cédric Chappelier, Andrzej Cichocki, Jesus Cid-Sueiro, Bob Dony, Scott Douglas, Craig Fancourt, Ling Guan, Tzyy-Ping Jung, Shigeru Katagiri, Jens Kohlmorgen, Shoji Makino, Danilo Mandic, Elias Manolakos, Takashi Matsumoto, David Miller, Li Min, Christophe Molina, Mahesan Niranjan, Kostas Plataniotis, Tommy Poggio, Jose Principe, Phillip Regalia, Steve Renals, João-Marcos Romano, Jonas Sjöberg, Robert Snapp, Kemal Sönmez, Søren Riis, Jean-Philippe Thiran, Naonori Ueda, Marc Van Hulle, Fernando Von Zuben, Christian Wellekens, Lizhong Wu, Lian Yan From ezequiel at cogs.susx.ac.uk Wed Mar 20 09:02:04 2002 From: ezequiel at cogs.susx.ac.uk (Ezequiel Di Paolo) Date: Wed, 20 Mar 2002 14:02:04 +0000 Subject: CFP Adaptive Behavior Special Issue Message-ID: [Apologies for multiple copies. Please, do not forward to any other lists]

********** Call For Papers: **********
Adaptive Behavior Special Issue No 10
-------------------------------------
Plastic mechanisms, multiple timescales and lifetime adaptation
---------------------------------------------------------------
Submission Deadline: 15 July 2002
http://www.cogs.susx.ac.uk/users/ezequiel/ab-cfp.html

Guest Editor: Ezequiel A. Di Paolo, Evolutionary and Adaptive Systems, School of Cognitive and Computing Sciences, University of Sussex, Brighton, BN1 9QH, UK (ezequiel at cogs.susx.ac.uk)

Editor-in-Chief: Peter M. Todd, Center for Adaptive Behavior and Cognition, Max Planck Institute for Human Development, Lentzeallee 94, D-14195 Berlin, Germany (editor at adaptive-behavior.org)

The last few years have seen increased interest in the design of plastic robot controllers, or controllers with inherent dynamical properties such as the interplay of multiple timescales, for the generation of highly adaptive and robust behaviour. This research area in robotics draws important inspiration from neuroscience and may be applied to the testing and generation of hypotheses on the role of plasticity in brain function.
Synthetic methods, such as evolutionary robotics, have provided a glimpse of how plastic neural mechanisms, like activity-dependent neuromodulation, that are often studied locally in reduced systems, can give rise to integrated and coordinated performance in a whole situated robot. Recent studies have included the role of modulatory processes affecting neural activation, diffusing localized neuromodulation, the evolution of rules of synaptic change, the design of neural controllers acting on fast and slow timescales, and the evolution of stabilizing mechanisms of cellular activity. These studies have successfully revealed that such mechanisms are able to introduce highly desirable properties such as robustness, adaptation to bodily perturbations, and improved evolvability. But many questions remain open, such as what is the relation between plasticity and stability, how adequate is a given mechanism for the required task, how do alternative methods of obtaining plastic behaviour relate, and to what extent is environmental regularity responsible for successful tuning of neural controllers. Adaptive Behavior solicits high quality contributions on these topics for its 2002 special issue (vol 10:3/4). Papers should describe work integrating mechanisms and adaptation at the behavioural level. They may present work using simulations or real platforms. Appropriate contributions addressing other levels of plasticity (such as sensory morphology or bodily structure) will also be considered. Papers drawing inspiration from, and contributing back to, neuroscience will be particularly appropriate. 
Topics
------
Multi-timescale controllers
Activity-dependent plastic neural controllers
Change and stability in robot performance
Adaptation to radical perturbations
Neuromodulation
Re-configurable neural controllers
Plastic controllers and simulation/robot transfer

Submissions
-----------
Authors intending to submit are also encouraged to contact the Guest Editor as soon as possible to discuss paper ideas and suitability for this issue. Submission of manuscripts should be made to the Guest Editor at the address below. ***** Submissions due: 15 July 2002 ***** Submissions should be in English, with American spelling preferred, in the style described in the Fourth Edition of the Publication Manual of the American Psychological Association, be double-spaced throughout and not normally exceed 25 journal pages (40 manuscript pages including figures, tables and references). Electronic submission in PDF format is strongly preferred. Each submission should have a title page including: the submission's title; names, postal, and email addresses of the authors; the phone and FAX number of the corresponding author; and a short running title. The second page should contain an abstract of about 150 words and up to six suggested key words. The main text should start on page 3, with acknowledgements at the end. Detailed guidelines for submission layout can be found on the ISAB web site at http://www.isab.org.uk/journal/ by following the link there labelled "Instructions to Contributors". Submit manuscripts to the Special Issue Guest Editor in PDF format by email with "Special Issue 10" in the subject line. Dr Ezequiel A. Di Paolo School of Cognitive and Computing Sciences, University of Sussex, Brighton, BN1 9QH, UK ezequiel at cogs.susx.ac.uk Tel.: +44 1273 877763 Fax.: +44 1273 671320 Adaptive Behavior ----------------- Adaptive Behavior is the premier international journal for research on adaptive behaviour in animals and autonomous artificial systems.
For over 10 years it has offered ethologists, psychologists, behavioural ecologists, computer scientists, and robotics researchers a forum for discussing new findings and comparing insights and approaches across disciplines. Adaptive Behavior explores mechanisms, organizational principles, and architectures for generating action in environments, as expressed in computational, physical, or mathematical models. Adaptive Behavior is published by Sage Publications, a leading independent publisher of behavioural journals, spanning human psychology to robotics to simulation modelling. A new editorial board, headed by Peter M. Todd of the Center for Adaptive Behavior and Cognition, is shaping the journal in novel directions. New technological infrastructure will allow faster publication turnaround, better feedback from reviewers to authors (and back again), and greater access to research results. The journal publishes articles, reviews, and short communications addressing topics including perception and motor control, learning and evolution, action selection and behavioural sequences, motivation and emotion, characterization of environments, collective and social behaviour, navigation, foraging, mate choice, and communication and signalling. From bpg at cs.stir.ac.uk Thu Mar 21 06:13:50 2002 From: bpg at cs.stir.ac.uk (B.P. Graham) Date: Thu, 21 Mar 2002 11:13:50 +0000 Subject: PhD studentship available Message-ID: <3C99C06E.4A641D9C@cs.stir.ac.uk> Dear all, A 3-year EPSRC-funded PhD studentship is available in the Neural Computing group, Department of Computing Science and Mathematics, University of Stirling, Scotland (http://www.cs.stir.ac.uk). Applications are required by May 1st, 2002. Start date is September 1st, 2002. 
Title: Compartmental modelling of developing neurons Supervisor: Dr Bruce Graham (http://www.cs.stir.ac.uk/~bpg/) Project summary: A major tool in the study of biological nervous systems is computer simulation of neuronal function using the compartmental modelling framework. Neurons and the networks they form are the outcome of a developmental process. The aim of this project is to extend the compartmental modelling framework to enable the modelling of developing neurons that are undergoing changes in shape and membrane characteristics over time. Novel numerical techniques are required to handle the accurate simulation of dynamic intracellular and extracellular environments. Simulations must handle multiple time scales, ranging from the submillisecond for electrical activity to hours for morphological change. The techniques will be incorporated into user-friendly simulation software. The resultant computational tools will allow the investigation of many aspects of nervous system development. They will be applied here in a study of the growth of a neuron's dendritic tree. The dendritic tree is the major site of input to the neuron and its morphology plays a determining role in the signal integration and processing capabilities of the neuron. The aim is to produce models that elucidate the biophysical mechanisms underlying the formation of the characteristic tree morphologies of different neuronal types. Further details of the research can be found on the project's web page (http://www.cs.stir.ac.uk/~bpg/research/neurite.html). The student will work under the supervision of Dr Bruce Graham (http://www.cs.stir.ac.uk/~bpg/), with additional supervision from Dr Arjen van Ooyen (http://www.anc.ed.ac.uk/~arjen/) at the Netherlands Institute for Brain Research in Amsterdam. Travel money is available for yearly visits to Amsterdam. 
The project ideally suits candidates with strong numerical and computing skills, with an interest in modelling biological systems, or alternatively with a biological background with significant, demonstrated numerical and computing skills. Closing date for applications is 1st May, 2002. Starting date is 1st September, 2002 (or as soon as possible thereafter). For further information, in the first instance please contact Dr Bruce Graham (b.graham at cs.stir.ac.uk). For application details, please contact:- Mrs Heather Brennan Department of Computing Science and Mathematics University of Stirling Stirling FK9 4LA, Scotland, UK Tel: 01786 467460 Fax: 01786 464551 Email: heather at cs.stir.ac.uk Applications should clearly indicate that you are applying for this studentship. The required summary of your proposed research project should emphasise your reasons for applying for this particular project and the skills and methodology that you will employ. -- Dr Bruce Graham, Lecturer (b.graham at cs.stir.ac.uk) Dept. of Computing Science and Mathematics, University of Stirling, Stirling FK9 4LA phone: +44 1786 467 432 fax: +44 1786 464 551 From becker at meitner.psychology.mcmaster.ca Mon Mar 25 15:14:12 2002 From: becker at meitner.psychology.mcmaster.ca (S. Becker) Date: Mon, 25 Mar 2002 15:14:12 -0500 (EST) Subject: responses to textbook survey Message-ID: Dear connectionists, thanks to all who replied to my request to exchange info on textbooks for neural network-related courses. If you are teaching such a course, I think you will find the comments quite useful. As well there are several pointers to web notes etc. So rather than attempt to summarize I've included the comments verbatim, except where the sender requested that they remain confidential. cheers, Sue _____________________________________________________________________________ Date: Wed, 6 Mar 2002 21:11:26 -0500 (EST) From: "S. 
Becker" Dear connectionists, This message is directed to those of you who teach a course in neural networks & related topics, or computational neuroscience or cognitive science, and would like to exchange opinions on the textbooks you are using. I'd like to know the name of your course, who takes it (undergrad vs grad, comp sci/eng vs psych/bio), what textbook you are using and what you consider to be the pros and cons of this book. I teach a course in neural computation to 3rd year undergrads, mostly CS majors and some psych/biopsych, and have used James Anderson's book An Introduction to Neural Networks a number of times. I like this book a lot -- it is the only one I know of that is truly interdisciplinary and suitable for undergraduates -- but it is in need of updating. I will post a summary of the replies I receive to connectionists. _____________________________________________________________________________ From: "Olly Downs" The Hopfield Group (formerly at CalTech, and for the past 4 years at Princeton) has taught the course 'Computational Neurobiology and Computing Networks' for many years. The principal text for the course has always been 'Introduction to the Theory of Neural Computation' by Hertz, Krogh and Palmer. It is very much oriented towards the Physics/Applied Math people in the audience, and thus more recently we have also used Dana Ballard's "An Introduction to Natural Computation", which we found to be more pedagogical in its approach, and somewhat less mathematical - which has helped span the broad audience our course has enjoyed - including 3rd-year undergrads through faculty in Biology, Psychology, Applied Math, CS and Physics. _____________________________________________________________________________ Sender: Dave_Touretzky at ammon.boltz.cs.cmu.edu Hi Sue. I teach my neural nets course using two books: Hertz, Krogh and Palmer, 1991, Introduction to the Theory of Neural Computation. Bishop, 1995, Neural Networks for Pattern Recognition.
This started out as a graduate course (in fact, it was originally taught by Geoff Hinton) but is now dual-listed as an undergrad/grad class. Those who take the grad version have to do a project in addition to the homeworks and two exams. The syllabus for my course, along with a bunch of MATLAB demos, can be found here: http://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15782-s02/ _____________________________________________________________________________ I teach a course Principles of Neural Computation to 3rd year undergrads and I am using Simon Haykin's book Neural Networks: A Comprehensive Foundation. In my course only chapters 1-5 and 9 are used. The contents of those are: Introduction, Learning processes, Single layer perceptrons, Multilayer perceptrons, Radial basis function networks and Self-organizing maps. In general the book is quite good, but the first chapter and maybe something in the second are too loose or too general to start with undergrads. It would be better to start with more concrete examples. But I am using the book, because the previous lecturer of the course used the same book and he gave me his slides, and because he and I have not found a better book. We also have an Advanced course on Neural computing for graduate students. In that course the material is selected among the chapters of the same book which have not been used in the course for undergraduates. * Kimmo Raivio, Dr. of Science in Technology | email: Kimmo.Raivio at hut.fi * Lab. of Computer and Information Science | http://www.cis.hut.fi/kimmo/ * Helsinki University of Technology | phone +358 9 4515295 * P.O.BOX 5400, (Konemiehentie 2, Espoo) | gsm +358 50 3058427 * FIN-02015 HUT, FINLAND | fax +358 9 4513277 _____________________________________________________________________________ From: Gregor Wenning I am a teaching assistant in the lab of Klaus Obermayer.
We offer a one-year course in artificial neural networks, covering supervised and unsupervised methods, mainly for computer scientists. The main books we use are: 1) Bishop, "Neural Networks for Pattern Recognition" - a good book for basic methods; unfortunately the students do not like it. 2) Haykin, "Neural Networks" - a good overview covering many topics and aspects; it includes problems, and one can get a solutions manual as well! _____________________________________________________________________________ From: Nicol Schraudolph I've just taught Machine Learning to upper-level, mostly CS undergrads (corresponding to U.S. master's students), using Cherkassky/Mulier's "Learning from Data". That book gives a very nice presentation of the basic concepts and problems in machine learning, but has two drawbacks: 1) it is too expensive for students, and 2) it is full of errors in the details, especially in the equations. Their two-page description of EM contains no fewer than 5 errors! I can recommend using this book to design a machine learning course, as long as you don't try to teach directly from it. Dr. Nicol N. Schraudolph http://www.inf.ethz.ch/~schraudo/ Chair of Computational Science mobile: +41-76-585-3877 ETH Zentrum, WET D3 office: -1-632-7942 CH-8092 Zuerich, Switzerland fax: -1374 _____________________________________________________________________________ From: Roseli Aparecida Francelin Romero I have used Haykin's book "Neural Networks - A Comprehensive Foundation" to teach a course in neural networks for graduate students. This book presents many exercises in each chapter, and I think it is a good book for acquiring the necessary basis in NNs & related topics. _____________________________________________________________________________ We used Bechtel and Abrahamsen's 'Connectionism and the Mind' to teach 3rd-4th year psych students in psychology and cognitive science (Univ. of Amsterdam and earlier also Univ. of Utrecht). 
The book needs updating, and a 2nd edition was promised several years ago but has never come out. So now we are back to using a reader, and I have started to write a book myself. Prof.dr. Jaap Murre Department of Psychology University of Amsterdam Roetersstraat 15 1018 WB Amsterdam _____________________________________________________________________________ I've been mostly using Bishop's book on NNs for PR (1995), for graduate students in CS, some engineers and mathematicians (and few or no psych/bio people). It is very well written, and most students appreciate it very much, despite the arduous introductory chapters (learning theory and non-parametric statistics). One of my colleagues (Kegl) has used for the same course this year the new edition of the Duda & Hart book, which is more up to date, but has many minor yet annoying errors. Yoshua Bengio Associate professor / Professeur agrégé Canada Research Chair in Statistical Learning Algorithms / titulaire de la chaire de recherche du Canada en algorithmes d'apprentissage statistique Département d'Informatique et Recherche Opérationnelle Université de Montréal _____________________________________________________________________________ From: "Tascillo, Anya (A.L.)" I'm not teaching large classes, just all my coworkers, and I still hand them Maureen Caudill's "Understanding Neural Networks" spiral-bound books, volumes 1 and 2. People do the exercises, and one friend turned around and taught the neural network section of his summer class after I lent him these books. _____________________________________________________________________________ From: Peter Dayan > I'd like to know the name of your course, theoretical neuroscience > who takes it (undergrad vs grad, grad students > comp sci/eng vs psych/bio), a mix of those two, more the former > what textbook you are using you guess :-) > consider to be the pros and cons of this book. The only con is the prose.... 
More seriously, our students take this course, plus one on machine learning, which doesn't really have an ideal book yet. I expect that a mixture of David MacKay's and Mike Jordan's forthcoming books will ultimately be used. _____________________________________________________________________________ Your message was forwarded to me... I use Principles of Neurocomputing for Science and Engineering by Ham and Kostanic. The course, taught in the ECE dept. of UNH, is attended mostly by first-year grad students, but I also have two undergrads. The book is well written and comes with a large set of solved Matlab assignments which I use extensively. Andrew L. Kun Assistant Professor University of New Hampshire ECE Department, Kingsbury Hall Durham, NH 03824 voice: 603 862 4175 fax: 603 862 1832 www.ece.unh.edu/bios/andrewkun.html _____________________________________________________________________________ From: "Randall C. O'Reilly" We have compiled a list of courses using our textbook: http://psych.colorado.edu/~oreilly/cecn_teaching.html We and several others use our book for undergrad and grad level courses, and it works pretty well. Hopefully some of these other people will email you with a more objective 3rd-party perspective on the pros and cons of the book. Here's our general sense: Pros: - covers neuroscience, computation & cognition in a truly integrated fashion: units are point neurons w/ ion channel & membrane potential eq's, implement powerful learning algorithms, and are applied to a wide range of cognitive phenomena. - includes many "research grade" cognitive models -- not just a bunch of toy models. - integrated software & use of one framework throughout makes it easy on the students Cons: - not encyclopedic: generally presents one coherent view instead of a bunch of alternatives (alternatives are only briefly discussed). 
- not as computationally focused as other books: emphasizes biology and cognition, but does not cover many of the more recent machine learning advances. _____________________________________________________________________________ From: Andy Barto I teach a course entitled "Computing with Artificial Neural Networks" to advanced undergraduates and a few graduate students. Students come mostly from CS, but some from engineering, psychology, and neuroscience. I have used almost all the books out there. I have used Anderson's book, finding it unique, but a bit too low in technical detail for my students. Currently I am using Haykin, second edition. I have used Bishop's book for independent studies, but it is too difficult for my undergrads. I refer a lot to Reed and Marks' "Neural Smithing", but it is too narrow for the whole course. I have not found the perfect book. Still waiting for Geoffrey to write one. I would be most interested to hear what you are able to find out. On another matter, as we revise our overall curriculum, I am not sure that the NN course will survive in its present form. It may happen that we will cover major NN algorithms as part of a broader machine learning course sequence. We like the Duda, Hart, and Stork book, and I really like the book "The Elements of Statistical Learning" by Hastie, Tibshirani, and Friedman, although it is not completely satisfactory for a course either, being too difficult and also lacking the algorithmic point of view that I think is important for CS students. ______________________________________________________________________ I teach a graduate course out of my own book (now in the 2nd edition: D. S. Levine, Intro. to Neural and Cognitive Modeling, Erlbaum, 2000). My current class is predominantly psychology students, but I have also taught a class out of the first edition that was predominantly computer science students with a substantial minority from EE. 
(The second edition is not much changed from the first in Chs. 1-5 except for updating, but Chapters 6 and 7 have several new or expanded sections). In my biased author's view, I see some pros of my book as its discussions of general modeling principles, of the history of the field, of the relations of network architectures to psychological functions, and of the approaches of Grossberg and his colleagues. (I remember you noted some of these when reviewing the first edition for an AI journal.) But perhaps more important than any of these, it is the only NN book I know of that proceeds systematically from building blocks (such as lateral inhibition and associative learning) to modeling of complex cognitive/behavioral functions such as categorization, conditioning, and decision making. This is reflected in its chapters, which are organized not primarily by "schools" or "paradigms", as are many other NN introductions, but primarily by functions and secondarily by paradigms. The flow chart which illustrates this organization has been moved to the front of the book to be more apparent. The book is multidisciplinary-friendly because the equations are mostly clustered at the ends of chapters, and there are neuroscience and math appendices for those who lack background in those areas. Also it is relatively affordable (the last I checked, $36 in paperback). More description of the book's distinctive features can be obtained from my web site (www.uta.edu/psychology/faculty/levine), under Books, or the Erlbaum web site (www.erlbaum.com). When I teach out of the book I supplement it with copies of the original papers being discussed (by Grossberg, Sutton-Barto, Klopf, Anderson, Rumelhart et al., et cetera). That is probably a good idea for any textbook in the field, particularly for grad courses. The cons of my book are that it is probably not the best one for those interested in detailed engineering or computational implementations or mathematical algorithms. 
My experience is that many CS or EE students found a course based on the book a useful complement to another course based on a book, or professor's notes, that has those other strengths -- although I know one EE professor doing research on vision and robotics, Haluk Ogmen of the U of Houston, who has taught from my book extensively with his students. Also, while the book does not demand specific technical prerequisites in any one area, it may or may not be the best book for undergraduates. Some undergrads have found it stylistically difficult because the discussion cuts across traditional boundaries, though many undergrads already involved in beginning research have taken my course or a roughly equivalent independent study course and felt comfortable with the book. My book has a variety of exercises, some computational and some open-ended. I have planned to write a web-based instructor's manual, but that has remained on the back burner: the idea is still alive, and audience interest might help push it forward. The following exercises have been particularly valuable for the students in my classes: Ch. 2, Exercises 2 (McCulloch-Pitts), 3 (Rosenblatt perceptron: very detailed!), 8 (ADALINE); Ch. 3, 2 (outstar), 4 (Grossberg gated dipole), 6 (BAM); Ch. 4, 1 and 2 (shunting lateral inhibition); Ch. 5, 2 (Klopf), 3 (Sutton-Barto); Ch. 6, 4 (backprop T vs. C), 6 (BSB). Best, Dan Levine levine at uta.edu ______________________________________________________________________ From: Alex Smola You're welcome to have a look at my slides (linked on my homepage) for the past two courses I gave. There will be slides on Bayesian kernel methods online really soon, too (they're, in fact, at http://mlg.anu.edu.au/~smola/summer2002). Most of that is taken from Bernhard's and my book http://www.learning-with-kernels.org - for a free sample of the first 1/3, just go to Contents and download it. And, of course, I like that book a lot ;). 
but you won't find a word on neural networks in it (except for the perceptron). Alexander J. Smola http://mlg.anu.edu.au/~smola Australian National University Alex.Smola at anu.edu.au Research School for Information Sciences and Engineering Tel: (+61) 2 6125-8652 Fax: (+61) 2 6125-8651 Canberra, ACT 0200 (#00120C) Cel: (+61) 410 457 686 ______________________________________________________________________ I use Simon Haykin's 2nd edition of Neural Networks (it doesn't cover ART). For density estimation and the Bayesian framework I use Chris Bishop's Neural Networks for Pattern Recognition (which I see as an updated version of Duda and Hart). My students are final-year CS or Math and Stats undergraduates. Wael El-Deredy, PhD Cognitive Neuroscience Group Visiting Lecturer Unilever Research Port Sunlight School of Comp. and Math Sci. Bebington CH63 3JW - UK Liverpool John Moores Univ. ______________________________________________________________________ I am using in my 3rd and 4th year courses: "Foundations of Neural Networks, Fuzzy Systems and Knowledge Engineering", N. Kasabov, MIT Press, 1996, with a WWW page being updated annually: http://divcom.otago.ac.nz/infosci/kel/courses/ Nik Kasabov Professor of Information Science University of Otago New Zealand ______________________________________________________________________ This is a brief response to your broadcast query for information about neural network classes. I guess I have three experiences to contribute: At Carnegie Mellon University I taught a class in the Department of Psychology on Parallel Distributed Processing. The focus of the course was on cognitive modeling. The students were a mix of undergraduates, graduate students, and even a few visiting postdocs and faculty members. 
The undergraduates included computer science students, psychology majors, and participants in an interdisciplinary cognitive science program. I used the original PDP volumes, including the handbook of exercises, but assignments were completed using the PDP++ software. Selected papers were also used as assigned readings. In terms of course credit, this class counted as a class and a half, so the students were worked pretty hard. I found the PDP volumes to still be an excellent source of readings, but I think that most students depended heavily on lectures to clarify concepts. Materials from this course are collecting dust in a corner of the web at: "http://www.cnbc.cmu.edu/~noelle/classes/PDP/". I am currently teaching two relevant classes at Vanderbilt University. The first is a graduate-level seminar on Computational Modeling Methods for Cognitive Neuroscience, offered through the Department of Electrical Engineering and Computer Science. Between a third and a half of the participants are engineering students, most of whom are interested in robotics. The remainder are psychology students, with a couple of neuroscientists among them. I am following the O'Reilly and Munakata text, _Computational Explorations in Cognitive Neuroscience_, fairly closely. I am really enjoying teaching from this book, though I sense that the seminar participants feel a bit like they're trying to drink from a fire hose -- the pages are densely packed with material presented at a fairly rapid clip. Most lecture time has been spent providing guidance through this wealth of material, but some class time has been used to provide balance and counterpoint to the biases expressed by the authors. The book does not provide the usual taxonomic review of techniques, opting instead to focus on a specific set of methods which may be fruitfully integrated into a unified modeling framework. 
This means that I've had to put in a bit of extra work to make sure that the students are at least briefly exposed to the range of techniques that they will encounter in the literature. Despite these issues, the book has been a very useful tool. The text's integrated computer exercises using PDP++, packaged so as to require only minimal computer skills, have proven to be especially helpful. The materials that I have prepared for this seminar are at: "http://www.vuse.vanderbilt.edu/~noelledc/classes/S02/cs396/". I am also teaching an undergraduate computer science class entitled "Project in Artificial Intelligence". My version of this course is subtitled "Fabricating Intelligent Systems With Artificial Neural Networks". The class is intended to provide seniors in computer science with an opportunity to develop and evaluate a substantial software project. These students are being introduced to neural network technology with the help of _Elements of Artificial Neural Networks_ by Mehrotra, Mohan, and Ranka. I selected this text because it appeared to present the basics of neural networks in a concise manner, it focused on engineering issues rather than cognitive modeling issues, and it included some discussions of applications. Unfortunately, my students have, for the most part, found the text to be too concise, and they have had difficulty with some of the mathematical notation. I have had to supplement this material extensively in order to communicate central concepts to the students. I have fabricated a web site for this course, which may be found at: "http://www.vuse.vanderbilt.edu/~noelledc/classes/S02/cs269/". -- David Noelle ---- Vanderbilt University, Computer Science ------ -- noelle at acm.org -- http://people.vanderbilt.edu/~david.noelle/ -- ________________________________________________________________________ From: Chris Williams I no longer teach a NN course. For machine learning I use Tom Mitchell's book. 
This includes some NN material (I also add material on unsupervised learning). For probabilistic modelling (belief nets) I use the upcoming Jordan & Bishop text. _____________________________________________________________________________ I am teaching at UQAM (Montreal) and have the same problem. Please let me know if you get some useful answers. Alex Friedmann Neural Networks Group LEIBNIZ-IMAG Tel: (33) 4 76 57 46 58 46, avenue Felix Viallet Fax: (33) 4 76 57 46 02 38000 Grenoble (France) Email: Alex.Friedmann at imag.fr _____________________________________________________________________________ I have taught a comp neuro course several times. I have used the Koch & Segev book supplemented with lots of guest lectures. Recently I used Dayan & Abbott and will probably do so again, since it is a pretty comprehensive book. My syllabus can be found in various formats on my web page http://www.pitt.edu/~phase Bard Ermentrout _____________________________________________________________________________ With two colleagues I teach a course called "Connectionist Modeling in Psychology" at the University of Amsterdam. It is a course at undergraduate level, but Dutch degrees do not translate neatly to American ones. When in three years we switch to an Anglo-Saxon bachelor-master system, it will probably be part of the master's curriculum. As the name suggests, it is a course within the field of psychology. Next to psychology students, some biology and AI majors also take our course. We have used a book by Bechtel and Abrahamsen, called "Connectionism and the Mind". This book, from the early nineties, is mainly targeted at cognitive science: cognitive psychology, philosophy. It is not a good book. Students complain that it is unorganized, and it misses out on many important topics while rambling on about symbolism vs. connectionism -- important in the eighties, but now not a very lively discussion anymore. 
Moreover, it has gone out of print (the second edition promised years ago still isn't there), so we had to give students the opportunity to photocopy the book. For all these reasons, we decided not to use the book this year, but instead to make a reader. We are now collecting articles to include in it (the introductory chapters will be written by one of us, Jaap Murre). To give our students hands-on experience, we have also made exercises. These are available at www.neuromod.org/connectionism2002/opdrachten/, but alas most of the explanations are in Dutch. I hope that you get a lot of reactions. Would it perhaps be possible for you, if you do get many reactions, to somehow share the information? Perhaps by putting reactions on the web or sending a mail to Connectionists? I would be very interested in what books others have used. Thank you in advance. Sincerely, Martijn Meeter _____________________________________________________________________________ I give a course simply called "Neural Networks" for undergraduate computer science students (3rd year). The book is S. Haykin, Neural Networks: A Comprehensive Foundation, Second edition, Prentice Hall, 1999. ISBN 0-13-273350-1. The book is indeed comprehensive, but in my opinion not an ideal book for computer science students. More so for engineering students, I think. The signal-theoretic viewpoint on neural networks is lost on computer science students, who would prefer a more computational view. But I still use the book, since it is so comprehensive. The book is both deeper and wider than the course content, which makes it possible for the students to use the same book as a reference when they want to dig deeper or broaden their view. This is useful since the course ends with a project-like assignment, defined by the students themselves. Different students dig in different places, but most of them can use the same source. ----- If God is real, why did he create discontinuous functions? ----- Olle Gällmo, Dept. 
of Computer Systems, Uppsala University Snail Mail: Box 325, SE-751 05 Uppsala, Sweden. Tel: +46 18 471 10 09 URL: http://www.docs.uu.se/~crwth Fax: +46 18 55 02 25 Email: crwth at DoCS.UU.SE _____________________________________________________________________________ Germán Mato and I are giving a course on biologically oriented neural networks for advanced undergraduate students in physics and engineering. This is the material, and the books we use:

1. An introduction to the nervous system, neurons and synapses: a snapshot from the book "Essentials of Neural Science and Behavior", Kandel, Schwartz and Jessell. 2 classes.

2. Single cell dynamics: We take the theory of dynamical systems and excitable media in "Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry, and Engineering", S. Strogatz, and then adapt it to several neuronal models (Hodgkin-Huxley, integrate and fire, FitzHugh-Nagumo). We study the stable states, their stability, phase transitions, and the emergence of cycles. 3 classes.

3. Dynamic behavior of a large ensemble of neurons: We use chapters 3 and 5 of Kuramoto's book "Chemical Oscillations, Waves and Turbulence" to introduce the phase models and study the conditions for synchronization. We exemplify with the integrate-and-fire and Hodgkin-Huxley models. 3 classes.

4. Associative memories: "Introduction to the Theory of Neural Computation", Hertz, Krogh and Palmer. 8 classes.

5. Information-theoretical analysis of neural coding and decoding: Cover and Thomas, to give the basic definitions. Two books, "Neural Networks and Brain Function" (Rolls and Treves) and "Spikes" (Rieke et al.), provide us with examples in the nervous system. We study the problem of decoding the neural signal, how to overcome limited sampling, rate coding, the role of correlations and time-dependent codes, and how to calculate the information transmitted in very short time windows. This unit is not actually one where we analyze how to build or model a neural network, but rather how to extract information from it. 10 classes.

6. Modeling cortical maps: The Hertz book "Introduction to the Theory of Neural Computation" gives us the basics for unsupervised learning. In Haykin's book "Neural Networks" we find an information-theoretical description of the Linsker effect and of self-organized feature maps. In "Neural Networks and Brain Function", Rolls and Treves give examples of this in the brain. 6 classes.

Ines Samengo samengo at cab.cnea.gov.ar http://www.cab.cnea.gov.ar/users/samengo/samengo.html tel: +54 2944 445100 - fax: +54 2944 445299 Centro Atomico Bariloche (8.400) San Carlos de Bariloche Rio Negro, Argentina _______________________________________________________________________ From: Geoff Goodhill What a great question! I have twice so far at Georgetown taught a "computational neuroscience" course, as an elective course for graduate students in the Interdisciplinary Program in Neuroscience (backgrounds: bio/psych). The class has had around 4-5 students each time, including one physics undergrad in each case. I used mostly papers, plus a few chapters from Churchland and Sejnowski. Overall I found the experience very unsatisfying, since none of the students could deal with any math (not even what I would regard as high school math). This year I am instead organizing an informal (no credit) reading group to work through the new Dayan and Abbott book. I think this is excellent: broad, sophisticated, though certainly not for anyone without a very solid math background. I think next year I will offer a formal course again (fortunately it's entirely up to me if and when I teach it) based around this book, and basically tell prospective students that if they can't understand the math then don't bother coming. 
_______________________________________________________________________ From: Stevo Bozinovski I am teaching Artificial Intelligence at the Computer Science department, South Carolina State University. It is an undergraduate-level (CS 323) course. I use Pfeifer's book "Understanding Intelligence", in which a whole chapter is dedicated to neural networks. The book sufficiently covers multilayer perceptron type networks. Adaptive neural arrays are not covered, so I use my book "Consequence Driven Systems" to cover them. _______________________________________________________________________ From: Lalit Gupta At SIU-C, we offer a graduate-level Neural Networks course in the Dept of Electrical & Computer Engineering. The text currently being used is: Neural Networks, Second Edition, Simon Haykin, Prentice Hall, 1999. It covers many aspects of neural networks. It is a relatively difficult text for students to read directly without instruction. It is suitable for both engineers and scientists. Information about the text is available at: http://www.amazon.com/exec/obidos/tg/stores/detail/-/books/0132733501/glance/ref=pm_dp_ln_b_1/103-6340812-8312600 _______________________________________________________________________ From: Todd Troyer I have been teaching a one-semester graduate-level survey course entitled Computational Neuroscience for the last two years. The course is part of an optional core sequence of required courses in Maryland's Neuroscience and Cognitive Science interdisciplinary Ph.D. program. The course is designed to be taken by students spanning the range of neuroscience and cognitive science - everyone from engineers and computer scientists to biologists and psychologists who have only a rudimentary math background. I too was frustrated with the lack of available textbooks and so have spent a lot of time writing extensive "class notes" in semi-book form. At this point things are obviously incomplete. 
I'm also spending some time developing computer simulations and I hope to have a "lab" component of the course next fall. Both things take huge amounts of time. The topic areas I focus on have extensive overlap with the topics presented in the new Dayan and Abbott book, but I spend more time on the linear algebra/network stuff. Overall the course is at a more basic level than Dayan and Abbott. I feel that their book ramps up the math too fast for many of my students. Next fall I will also require Churchland and Sejnowski's Computational Brain. I found that in focusing on the presentation of the technical material, I was having a hard time organizing the presentation of the biological examples. The Computational Brain, while getting a bit dated, does a wonderful job at this (with the usual minor qualms). I also recommend the Rieke et al Spikes book for my more advanced students. I've looked at some other books as well. I like the Anderson book, but it doesn't focus on the material that I'd like. I think Randy O'Reilly's new book presents an interesting attempt to present a connectionist perspective on cognitive neuroscience, although I haven't had a chance to look through it in detail. Overall, I've found that the key hinge point in thinking about such a class is the computational skills. It is much easier either to assume a reasonably sophisticated math background or not to delve into the math much at all. In the first case you can use formal language to talk about how the equations relate to the underlying ideas, but the course excludes students with noncomputational backgrounds. Without assuming such a background you can talk about some of the underlying concepts, but such a presentation tends to be vague and often either overstates what computation adds or seems obvious and not interesting. 
While I've had mixed success so far, I believe that you can have a unified class where you teach enough math to bring the students that need it up to speed, at least to the degree that all students can begin to explore the *relationship* between the math and the neuroscience. I've been trying to set things up so that the students with a stronger background can work on self-directed outside reading and/or projects during some of the basic math sessions. It's a stretch. Finally, I'm thinking about developing the lab component into a joint lab/lecture course that I could teach to advanced undergraduates, again from a range of backgrounds. Todd W. Troyer Ph: 301-405-9971 Dept. of Psychology FAX: 301-314-9566 Program in Neuroscience ttroyer at glue.umd.edu and Cognitive Science www.glue.umd.edu/~ttroyer University of Maryland College Park, MD 20742 ____________________________________________________________________________ I teach a course, Computational Neurobiology, to graduate students in the school of computational sciences. Most students are computational neuroscience students, though some engineers, math majors, and psychologists have taken it. I use Johnston and Wu, Foundations of Cellular Neurophysiology. It is fabulous, because it has all the equations describing cell properties, i.e. the cable equation, the Hodgkin-Huxley equations, etc. It is probably a bit advanced for undergrads, and even the psych majors have trouble if they don't have a strong math background (which they don't at GMU). The only thing against the book is the lack of good figures, such as those found in From Neuron to Brain, by Nicholls et al. But I supplement my lectures from that book, since the students have not had a basic non-computational neuroscience course. I also teach a course, Computational Neuroscience Systems. However, I haven't found a good textbook. 
So I use Kandel and Schwartz, or Fundamental Neuroscience, for the basic neuroscience systems, and supplement each lecture with one or two modeling papers on the system we discuss. I'm still hoping to find a book that presents different methods of modeling systems of neurons, e.g. a chapter or two on (1) using GENESIS and NEURON to develop large-scale detailed models, (2) a linear systems approach, (3) a neural network approach, (4) integrate-and-fire neuron networks, (5) spike train analysis, and (6) higher-level abstract models, such as the Rescorla-Wagner model. If you know of any like this, please let me know. Avrama Blackwell ____________________________________________________________________________ I teach an undergrad course on "Information processing models." It's mainly psych and neuro students, and a handful of graduate students. Most have little or no math or programming background. I have not found any one book that's good for this level. I have written up some of my own lecture notes and mini-tutorials, which you can get at http://redwood.ucdavis.edu/bruno/psc128 I am also teaching a graduate course this spring on computational neuroscience in which I will use Peter and Larry's book for the first time. I offer this course only once every two years though. Bruno A. Olshausen Phone: (530) 757-8749 Center for Neuroscience Fax: (530) 757-8827 UC Davis Email: baolshausen at ucdavis.edu 1544 Newton Ct. 
WWW: http://redwood.ucdavis.edu/bruno Davis, CA 95616 _____________________________________________________________________________ From stefan.wermter at sunderland.ac.uk Mon Mar 25 12:36:44 2002 From: stefan.wermter at sunderland.ac.uk (Stefan Wermter) Date: Mon, 25 Mar 2002 17:36:44 +0000 Subject: Positions Cognitive Neuro Robotics for Language Learning Message-ID: <3C9F602C.8C67F20C@sunderland.ac.uk> Research Scientist Cognitive Neuro-Robotics (MirrorBot Project) 16,905 - 25,793 Ref No: CETR45/01 Research Associate Cognitive Neuro-Robotics (MirrorBot Project) 11,562 - 25,793 Ref No: CETR46/01 Biomimetic Multimodal Learning in a Mirror Neuron-based Robot EU Fundamental Emerging Technology project planned project start 1.6.2002, duration 3 years The Hybrid Intelligent Systems group ( http://www.his.sunderland.ac.uk ) within the area of Computing of the University of Sunderland is looking for two researchers in the new area of neuroscience-inspired computing on cognitive robots. This project will be in collaboration with research groups in cognitive neuroscience and neuroinformatics in Sunderland, Cambridge, Parma, Nancy and Ulm. The computational experiments are planned to be performed on a parallel neural high performance computer which should act as the "computational brain" for the robots. The researchers will play a key role in the design, development, programming and testing of a neural system on a mirror-neuron-based robot integrating vision and language for actions based on novel neural networks. It is expected that both researchers will also contribute to the coordination of the project at different levels. The Research Scientist position will be for an experienced researcher with broad skill and expertise in computing, neural networks, computational neuroscience, and intelligent systems, and a higher degree in Computing (PhD desirable). Knowledge in areas like vision, language, robotics, neuroscience and project management is an advantage.
The Research Associate position will be for a researcher with expertise in computing and neural networks, and a higher degree in Computing, at least a Masters. Experience in supporting research projects (website, project reports, etc.) will be an advantage for the research associate. Closing date: 30th April 2002 For more information and discussion regarding either of these posts, please contact Professor Stefan Wermter, email stefan.wermter at sunderland.ac.uk. To apply, please submit your CV along with a letter of application and details of current salary, quoting vacancy title and reference number, to the Personnel Department, University of Sunderland, Langham Tower, Ryhope Road, Sunderland, SR2 7EE or e-mail employee.recruitment at sunderland.ac.uk *************************************** Professor Stefan Wermter Chair for Intelligent Systems University of Sunderland Centre of Informatics, SCET St Peters Way Sunderland SR6 0DD United Kingdom phone: +44 191 515 3279 fax: +44 191 515 3553 email: stefan.wermter at sunderland.ac.uk http://www.his.sunderland.ac.uk/~cs0stw/ http://www.his.sunderland.ac.uk/ **************************************** From marina at cs.rhul.ac.uk Tue Mar 26 03:21:23 2002 From: marina at cs.rhul.ac.uk (marina d'Engelbronner) Date: Tue, 26 Mar 2002 08:21:23 -0000 Subject: vacancies for research assistants Message-ID: <003901c1d49f$37bdd1c0$7ebcdb86@VENICE> ROYAL HOLLOWAY University of London THREE RESEARCH ASSISTANTS IN KERNEL-BASED METHODS Department of Computer Science Royal Holloway, University of London invites applications for three research assistant positions in Computer Science. The first post is funded for a duration of two years, while the other two are funded for three years. The starting date for the first project is flexible, but the other vacancies are from April 2002. The first vacancy is for an EPSRC funded project entitled 'Delimiting Kernel-based Learning Methods'.
It spans work on the learning-theoretic analysis of kernel-based methods, kernel-theoretic work, and algorithmic design. A background in the theoretical analysis of learning methods is very desirable for this position. Salary is up to £24,656 inclusive of London Allowance. Ref: KB/2053. The 2 remaining vacancies are for a project that involves developing kernel-based methods for the analysis of images, Learning for Adaptable Visual Assistants. The project is financed by the EU and also involves partners in Cambridge (Xerox), France (INRIA, Grenoble and University of Rennes), Austria (Technical University of Graz), Switzerland (IDIAP) and Sweden (University of Lund). We are seeking researchers with experience in applications of machine learning, with a preference for knowledge of image processing, and a strong programming background. Experience with kernel methods is desirable but not required. Salary is in the range £20,865 to £28,625 per annum inclusive of London Allowance. Ref: KB/2052. Please contact John Shawe-Taylor by email at jst at cs.rhul.ac.uk for more information. Information on the Department may be found at www.cs.rhbnc.ac.uk/. Further details and an application form can be obtained from The Personnel Office, Royal Holloway, University of London, Egham, Surrey TW20 0EX; fax: 01784 473527; tel: 01784 414241; email Sue.Clarke at rhul.ac.uk Please quote the appropriate reference. The closing date for receipt of applications is 17th April 2002. We positively welcome applications from all sections of the community. ROYAL HOLLOWAY University of London Department of Computer Science Research Assistant - Kernel-Based Methods Information for Candidates Royal Holloway, University of London invites applications for three research assistant positions in Computer Science. The first post is funded for a duration of two years, while the other two are funded for three years.
The starting date for the first project is flexible, but the other vacancies are from April 2002. The College Royal Holloway is dedicated to achieving the highest international standards in teaching and research in the Sciences, Social Sciences, Humanities and Creative Arts. The College exists to promote education and scholarship for the public benefit. By constantly reinterpreting the ideals and principles of a university through undergraduate teaching, advanced teaching and research, we carry forward the vision of our founders in the changing context of the society we serve. Royal Holloway is one of the eight large Colleges of the University of London. It has over 5,400 students in the Faculties of Arts, History and Social Sciences and Science. There are 20 academic departments and almost 1,200 staff, including 370 academic teaching staff. As well as strong links within the University of London, the College has forged many successful national and international collaborations. There is a strong commitment to excellence in both teaching and research, and in the 1996 Research Assessment Exercise, 8 departments were rated 5* or 5 and none below 3. The College occupies a large attractive campus at Egham, Surrey, situated in the green belt near Runnymede and Windsor Great Park, with good communications to and from London. Egham is 35 minutes by train from Waterloo, and the College is one mile from the M25 and 15 minutes' drive from Heathrow Airport. For further information about the College see website http://www.rhbnc.ac.uk. The Department The Department of Computer Science and Computer Learning Research Centre occupy the modern purpose-designed McCrea Building, which accommodates staff offices, computing laboratories and terminal rooms, providing the latest facilities for research and undergraduate teaching.
The Department has 15 established academic posts in Computer Science, expanding to 18 this year, and also employs 10 Research Assistants, 3 Technical support staff and 4 Administrative/secretarial staff. The Department was given a rating of 5 in the last (2001) Research Assessment Exercise. The Department's teaching load at present is around 170 Full Time Equivalent (FTE) undergraduate students taking part in both Single and Joint Honours Degree programmes, and around 35 FTEs, approximately half taught and half research. The main taught postgraduate programmes are the MSc in Computer Science by Research, which offers intensive training in any one of the Department's research specialties, and the MSc in Business Information Systems, taught jointly with the School of Management. The Department is well equipped for teaching and research. It provides networked computer facilities accessed from desktop X-terminals. The servers include powerful DEC Alpha computers running Digital Unix, a Sun Sparc running Solaris, and an NT server running various office applications such as Microsoft Office. There is fast Internet access (multi megabit per second), and a wide range of specialist research software as well as laser printers and photocopiers. All staff and research postgraduates who do not have equipment provided under personal research grants are provided with an X terminal. This gives them access to both Unix services and an NT server. The Department holds weekly research seminars and invited lectures throughout the academic year as well as a Distinguished lecture series - recent speakers include Professors Donald Davies, Roger Penrose, Frank Sumner, Maurice Wilkes and Tony Hoare. The department celebrated its 30th anniversary in 1998. The department's website is http://www.dcs.rhbnc.ac.uk COMPUTER SCIENCE STAFF - RESEARCH INTERESTS A.
Gammerman, BSc, PhD St Petersburg (Head of Department) Algorithmic randomness, Kolmogorov complexity, induction and transduction, applications. A. Chervonenkis, BSc, PhD Moscow Mathematical statistics, pattern recognition, learning theory, ore deposit modelling. D.A. Cohen, BA, DPhil Oxon Mathematics within Computer Science, theory of neural networks, constraint satisfaction problems. A.R. Davies, BSc, MSc Lond (Deputy Head of Department) Computer modelling of large scale scientific problems, laser optics, wave guides. A. Fukshansky, Dipl Math, Dr rer nat Freiburg Mathematics in computer science, bioinformatics/biomathematics. Z.G. Gutin, BSc Gomel, PhD Tel Aviv Graph theory and algorithms, combinatorial optimisation, linear and integer programming, bioinformatics. J.M. Hancock, BSc London, PhD Edinburgh Bioinformatics, molecular evolution, genome analysis, repetitive sequences. E.I. Hogger, BSc, MSc Lond, BA Open, ARCS, DIC Machine learning and its relationship to non-monotonic logic, intelligent knowledge-based systems, expert systems. A.I.C. Johnstone, BSc, PhD Lond, CEng, MBCS, MIEE Multiprocessor systems for real-time machine vision, language design for multiprocessor and array processor systems, VLSI implementation of image processing algorithms, advanced processors. C. Saunders, BSc, PhD Lond Machine learning algorithms, kernel methods, transductive inference, fault diagnosis, text analysis. S.A. Schneider, BA, DPhil Oxon Process algebra, concurrency theory, real-time systems, formal methods, computer security. E.A. Scott, BSc, DPhil Oxon Theoretical computer science, compiler theory, language analysis and design, termination theory, automated theorem proving, machine learning. J.S. Shawe-Taylor, PhD, MSc Lond, DIC, CMath, FIMA Computational learning theory, the mathematical analysis of kernel-based learning methods. H.E. Treharne, BSc, MSc, PhD Lond Safety critical software, formal methods, software metrics. V.N.
Vapnik, BSc, PhD, DSc Moscow Pattern recognition, statistical analysis, support vector machine. V. Vovk, BSc, PhD Moscow Limits of machine learning: predictive complexity, randomness and information; inductive and transductive inference. C. Watkins, MA, PhD Cantab Reinforcement learning, computational learning theory, mathematical finance. RESEARCH COMPUTATIONAL LEARNING RESEARCH CENTRE The Centre was established in January 1998 to provide a focus for fundamental research and commercial and industrial applications in the fields of computer learning, including inductive/transductive inference and universal prediction. The current research topics are Universal Prediction, Support Vector method, Probabilistic Reasoning, the theory of Kolmogorov and predictive complexity, on-line prediction with expert advice, transductive inference and computational finance. Members of the Centre are Alex Gammerman (Director), Alexey Chervonenkis, Craig Saunders, Vladimir Vapnik, Volodya Vovk and Chris Watkins, with their Research Assistants and PhD students, and Visiting Professors are Jorma Rissanen, Chris Wallace, Glenn Shafer, Leonid Levin, Ray Solomonoff and Vladimir V'yugin. THEORETICAL COMPUTING GROUP The group is concerned with the modelling of computing systems and application areas in order to derive principled and practically effective solution strategies. Such modelling ensures that the development of solutions is guided by a scientific and well-founded analysis, which can provide practical guidance into the scaling and extent of their applicability. The research strands include: NEURAL AND COMPUTATIONAL LEARNING Members of this area work closely with the CLRC and their research is centred around analysis of neural networks including work on a novel digital neural chip, large margin algorithms and analysis, and relations between the Bayesian and probably approximately correct (pac) models. 
The research group has collaborated in ESPRIT funded research in the mobile telecoms area and currently co-ordinates an ESPRIT funded Working Group Neurocolt2. More recently they have received funding for an EU project 'Kernel Methods for Images and Text' (KerMIT) involving Reuters, Xerox and three other university sites. CONSTRAINTS Constraint satisfaction problems have been studied in the Department since 1989, supported by the EPSRC, the DTI, the Royal Society, the British Council, and the Nuffield Foundation. The group is currently collaborating with the government Radiocommunications Agency, and Vodafone Ltd, who support a PhD student. Theoretical research is currently focused on developing the mathematical methods that are needed to classify different types of constraints. A new approach to computational complexity theory, which makes use of tools from algebra and logic, is being investigated. On the applications side, radio frequency planning, particularly in regard to the growth of mobile telecommunications, and large scale scheduling problems, such as the planning of manufacturing processes and the design of airline timetables, are of particular interest. FORMAL METHODS The main interests of the Formal Methods group concern the theory and application of formal methods to security- and safety-critical systems, with particular focus in the areas of concurrency and combining methods. The group has an international reputation in this field and is substantially supported by external funding. The group maintains active links with industry, including Motorola, the Defence and Evaluation Research Agency, and SRI. LANGUAGES AND ARCHITECTURES Research in the area of language translation and compiler theory aims to provide sound theory linked to approachable toolsets for work in software/hardware co-design, language implementation and reverse engineering.
We have made contributions in the areas of general parsing; semi-automatic construction of intermediate forms and control flow graphs; control flow analysis for environments with poorly defined notions of procedure call; reverse compilation from assembler source to high level languages; mixed implementation of conventional processors, Digital Signal processors and Field Programmable Gate Arrays; and the development of toolsets for non-specialist users. BIOINFORMATICS The volume and variety of data coming out of current research in molecular biology contain the answers to many scientific questions and the keys to many medical advances. To answer the new types of question that are being asked, new computational techniques are needed, and machine learning methods are now becoming standard in computational biology. Techniques developed in the Department are now being applied to the analysis of biological sequences. In particular, support vector machine kernels suitable for the comparison and classification of protein and DNA sequences are being developed with a view to providing predictions of the structure and function of proteins from their sequences. The Post Delimiting Kernel-based Learning Methods: assisting Professor John Shawe-Taylor with the above research project, the aim of which is to analyse the limitations of kernel-based learning methods. Learning for Adaptable Visual Assistants: assisting Professor John Shawe-Taylor with the above research project, the aim of which is to develop kernel-based methods for the analysis of images in collaboration with the other partners on the project. General Any other duties or responsibilities as the department may reasonably require. Opportunity to assist with undergraduate lectures and tutorials, and assistance with postgraduate supervision. The Person Candidates for all three positions will preferably have a doctorate in Computer Science or Mathematics.
They will be expected to be able to demonstrate significant research accomplishment in a relevant field, including published work in high quality journals. Experience with kernel methods is desirable. Enquiries Please contact John Shawe-Taylor by e-mail at jst at cs.rhul.ac.uk for more information. The Appointment The first vacancy is for an EPSRC funded project entitled 'Delimiting Kernel-based Learning Methods'. It spans work on the learning-theoretic analysis of kernel-based methods, kernel-theoretic work, and algorithmic design. A background in the theoretical analysis of learning methods is very desirable for this position. Salary is up to £24,656 inclusive of London Allowance. Ref: KB/2053. The 2 remaining vacancies are for a project that involves developing kernel-based methods for the analysis of images, Learning for Adaptable Visual Assistants. The project is financed by the EU and also involves partners in Cambridge (Xerox), France (INRIA, Grenoble and University of Rennes), Austria (Technical University of Graz), Switzerland (IDIAP) and Sweden (University of Lund). We are seeking researchers with experience in applications of machine learning, with a preference for knowledge of image processing, and a strong programming background. Experience with kernel methods is desirable but not required. Salary is in the range £20,865 to £28,625 per annum inclusive of London Allowance. Ref: KB/2052. The appointments will be made with 12 months' probation. There is also an annual appraisal scheme. Payment for salaries will be monthly, in arrears, by credit transfer into your bank account. Payment will usually be made on the 27th of the month, or before if the 27th falls on a weekend or Public or College holiday. The normal date for the review of salaries is 1 April. Where a salary increment is due it will be payable on 1st August in each calendar year.
Where a member of staff is appointed between 1st February and 31 July inclusive, the first increment will be payable on 1 August in the following calendar year. This post is superannuable under the USS (Universities Superannuation Scheme). The College currently contributes 14% of salary and individuals contribute 6.35%. 27 days annual leave are attached to the post. In addition you will receive the 6 discretionary days which are granted and shared between Easter and Christmas when the College is closed, and public holidays. Applications Applications should be made using the standard application form obtainable from the Personnel Department (address below), to which should be appended: (i) a full curriculum vitae; (ii) the names and addresses of two or more academic referees; (iii) a statement of current research activities and areas of interest; (iv) a list of publications. These should be sent by 17th April 2002 to Personnel, Royal Holloway, University of London, Egham, Surrey TW20 0EX, UK (Tel: 01784 414241; Fax: 01784 473527; E-mail: s.watson at rhbnc.ac.uk). Extract from Equal Opportunities Policy The only consideration in recruitment of employees will be how the genuine requirements of the post are met or likely to be met by the individual under consideration. These requirements, including retirement at the appropriate age, being met, no regard will be taken (except where legally required) of that person's race, sex, age, marital status, number of children, physical disability or beliefs or lawful preferences, privately held, on any matter including religion, politics and sex. Information for Applicants with Special Needs Royal Holloway encourages and welcomes applications from people with disabilities and special needs. However, it is important to note that the Campus is built on a hill side and that many buildings are of older construction and are not fully accessible.
If you have a disability or a special need and wish to discuss any practical aspects of your application, please contact the Personnel Office in confidence on +44 (0) 1784 414058. General In an effort to provide a healthy and comfortable working environment, smoking is prohibited in public areas and in shared occupancy rooms. Full details of the Smoking Policy are available from the College Safety Officer. Staff have full use of the dining facilities on campus; there is also a College Shop which offers a wide range of goods. A National Westminster bank is on site together with a Waterstones bookshop. There is an independently run Nursery (for which a charge is made) located on the College campus for children over the age of two years and up to the age of five years. Designated areas of car parking are allocated to staff. Permits are required to allow parking on campus. A bus service is in operation from Egham Railway Station and may be used by staff in addition to students. The service operates in conjunction with trains arriving from Waterloo and Reading. Tickets for this bus service must be bought in advance and are available from the College Shop, Athlone Hall and Kingswood Hall. This bus service operates during term-time only. Internal promotion is encouraged by the college and all vacancies are advertised on internal noticeboards and circulated to departments. From wahba at stat.wisc.edu Tue Mar 26 13:35:07 2002 From: wahba at stat.wisc.edu (Grace Wahba) Date: Tue, 26 Mar 2002 12:35:07 -0600 (CST) Subject: responses to textbook survey Message-ID: <200203261835.MAA11615@hera.stat.wisc.edu> I teach a graduate level course in the Statistics Department on Statistical Modeling based on solving optimization problems in a Reproducing Kernel Hilbert Space (now called `kernel methods'). It attracts occasional graduate students from CS and Engineering. I use my book `Spline Models for Observational Data' (1990). 
This fall I will probably include parts of Smola and Schoelkopf's new book on SVM's, Hastie, Tibshirani and Friedman on Statistical Learning, and Chong Gu's new book on Smoothing Spline ANOVA. From cindy at bu.edu Tue Mar 26 14:22:56 2002 From: cindy at bu.edu (Cynthia Bradford) Date: Tue, 26 Mar 2002 14:22:56 -0500 Subject: Neural Networks 15(2) Message-ID: <200203261922.g2QJMui01212@cns-pc75.bu.edu> NEURAL NETWORKS 15(2) Contents - Volume 15, Number 2 - 2002 ------------------------------------------------------------------ INVITED ARTICLE: Synapses as dynamic memory buffers Wolfgang Maass and Henry Markram CONTRIBUTED ARTICLES: ***** Psychology and Cognitive Science ***** Learning the parts of objects by auto-association Xijin Ge and Shuichi Iwata ***** Neuroscience and Neuropsychology ***** Prospective control of manual interceptive actions: Comparative simulations of extant and new model constructs Joost C. Dessing, Daniel Bullock, C. (Lieke) E. Peper, and Peter J. Beek Temporal dynamics of binocular disparity processing with corticogeniculate interactions Stephen Grossberg and Alexander Grunewald A mathematical analysis of the development of oriented receptive fields in Linsker's model Tadashi Yamazaki ***** Mathematical and Computational Analysis ***** A general framework for neural network models on censored survival data Elia Biganzoli, Patrizia Boracchi, and Ettore Marubini MCMAC-CVT: A novel on-line associative memory based CVT transmission control system K.K. Ang, C. Quek, and A. Wahab A methodology to explain neural network classification Raphael Feraud and Fabrice Clerot ***** Engineering and Design ***** A neuro-fuzzy framework for inferencing Sukumar Chakraborty, Kuhu Pal, and Nikhil R. Pal Nonlinear Fisher discriminant analysis using a minimum squared error cost function and the orthogonal least squares algorithm Steve A. Billings and Kian L. 
Lee ***** Technology and Applications ***** Solving large scale traveling salesman problems by chaotic neurodynamics Mikio Hasegawa, Tohru Ikeguchi, and Kazuyuki Aihara LETTERS TO THE EDITOR Comments for Leung, Wong, Sum, and Chan (2001) Pierre van de Laar Response to comments for Leung, Wong, Sum, and Chan (2001) C.-S. Leung CURRENT EVENTS ------------------------------------------------------------------ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment. 
----------------------------------------------------------------------------
Membership Type      INNS               ENNS                 JNNS
----------------------------------------------------------------------------
membership with      $80 (regular)      SEK 660 (regular)    Y 13,000 (regular)
Neural Networks                                              (plus 2,000 enrollment fee)
                     $20 (student)      SEK 460 (student)    Y 11,000 (student)
                                                             (plus 2,000 enrollment fee)
----------------------------------------------------------------------------
membership without   $30                SEK 200              not available to non-students
Neural Networks                                              (subscribe through another society)
                                                             Y 5,000 (student)
                                                             (plus 2,000 enrollment fee)
-----------------------------------------------------------------------------
Name: _____________________________________
Title: _____________________________________
Address: _____________________________________
_____________________________________
_____________________________________
Phone: _____________________________________
Fax: _____________________________________
Email: _____________________________________
Payment: [ ] Check or money order enclosed, payable to INNS or ENNS
OR [ ] Charge my VISA or MasterCard
card number ____________________________
expiration date ________________________
INNS Membership 19 Mantua Road Mount Royal NJ 08061 USA 856 423 0162 (phone) 856 423 3420 (fax) innshq at talley.com http://www.inns.org ENNS Membership University of Skovde P.O.
Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.his.se/ida/enns JNNS Membership c/o Professor Takashi Nagano Faculty of Engineering Hosei University 3-7-2, Kajinocho, Koganei-shi Tokyo 184-8584 Japan 81 42 387 6350 (phone and fax) jnns at k.hosei.ac.jp http://jnns.inf.eng.tamagawa.ac.jp/home-j.html ----------------------------------------------------------------- From T.J.Prescott at sheffield.ac.uk Wed Mar 27 08:04:29 2002 From: T.J.Prescott at sheffield.ac.uk (Tony Prescott) Date: Wed, 27 Mar 2002 13:04:29 +0000 Subject: Robotics as Theoretical Biology Workshop Message-ID: (apologies for repeat postings) FIRST CALL FOR PARTICIPATION WORKSHOP ON ROBOTICS AS THEORETICAL BIOLOGY AUGUST 10TH, 2002, EDINBURGH, SCOTLAND. http://www.shef.ac.uk/~abrg/sab02/index.shtml PART OF SAB '02: THE 7TH MEETING OF THE INTERNATIONAL SOCIETY FOR SIMULATION OF ADAPTIVE BEHAVIOR AUGUST 4TH-11TH, 2002, EDINBURGH, SCOTLAND. http://www.isab.org.uk/sab02/ WORKSHOP AIMS This workshop seeks to bring together researchers in robotics and biology who are interested in developing robot models of the biological systems underlying animal behavior. The scope of the workshop will include: * Evaluating current progress in applying robot models to biological questions * Identifying areas of biology where robots could make a future contribution * Exploring methodological issues * Considering how better ties can be forged between the robotics and biological research communities * Providing a forum for exhibiting current research and work in progress FORMAT OF THE WORKSHOP This will be a one-day workshop consisting of a mixture of invited talks and discussion sessions (see PROGRAM below). The workshop will include three sessions of invited talks: one on invertebrate behavior, one on vertebrate behavior, and one on collective behavior. 
For each session there will be two invited speakers, one a distinguished researcher from a (primarily) biological background, the other from a robotics background. Each invited speaker will be asked to address the question "How can robotics contribute to theoretical biology?" from the perspective of their own field and drawing on the experience of their own research. Each session will also conclude with a discussion involving the two speakers and a panel of other prominent researchers in the field. The workshop will end with a discussion about building stronger ties between the biological and robotics research communities. Representatives from funding bodies and from journal editorial boards will be asked to join the panel for this discussion. All attendees at the workshop are invited to BRING POSTERS DESCRIBING THEIR OWN RESEARCH for people to browse during scheduled coffee and lunch breaks (see CONTRIBUTING TO THE WORKSHOP below). WORKSHOP ORGANISERS The workshop is co-organised by: Tony Prescott, Adaptive Behaviour Research Group, University of Sheffield, UK, Barbara Webb, Department of Psychology, University of Stirling, UK. WORKSHOP PROGRAM Session I: Invertebrate behavior Chair: Barbara Webb Talk 1: Roy Ritzmann (Professor of Biology, Case Western Reserve University, US) Talk 2: Roger Quinn (Professor of Mechanical Engineering, Case Western Reserve University, US) Discussion (Panel to be confirmed) Session II: Vertebrate behavior Chair: Tony Prescott Talk 3: Peter Redgrave (Professor of Neuroscience, University of Sheffield, UK) Talk 4: Angelo Arleo (Research Fellow, College de France, Paris, France) Discussion (Panel to be confirmed) Session III: Collective behavior Chair: To be confirmed. Talk 5: Nigel Franks (Professor of Biological Sciences, University of Bristol). 
Talk 6: Chris Melhuish (Director of the Intelligent Autonomous Systems Laboratory, University of the West of England, UK) Discussion (Panel to be confirmed) Session IV: Building ties between robotics and biology Chair: To be confirmed Discussion (Panel to be confirmed but to include representatives from research funding bodies, research societies, and journal editorial boards) CONTRIBUTING TO THE WORKSHOP All attendees are invited (but not required) to bring posters, and robot demos, of relevant research to the workshop. Posters may have been previously exhibited at other recent workshops/conferences (including SAB '02), or may describe new research that has not been exhibited elsewhere. If you wish to show your work you are requested to SUBMIT AN ABSTRACT OF 200-400 WORDS to the workshop organisers before the 1ST OF MAY 2002. Submissions will not be reviewed; however, they will be vetted for relevance, and work that falls outside the scope of the workshop may be refused. Potential contributors should note that the focus of the workshop is on the possible contribution of robotics to biology, rather than the converse (the contribution of biology to robotics). Posters describing biologically-inspired robots, computer simulation work, or straight biological research are acceptable where they can be shown to have relevance to the understanding or development of future robot models in biology. Please make the biological relevance clear in your abstract. Attendees who wish to exhibit robots should also submit an abstract and should contact the organizers with full details of the demonstration and any technical requirements. There may be a limit on the number of available poster spaces, and there may also be space restrictions on robot demonstrations. If these limits become an issue then contributions will be prioritized according to the order in which they were received.
Please note that we are NOT inviting submissions for oral presentations; however, the program is organized to include discussion sessions, which will certainly be open to input from all attendees. For enquiries and for ELECTRONIC SUBMISSION OF ABSTRACTS please email: t.j.prescott at shef.ac.uk including "SAB '02" in the subject line. PROCEEDINGS There will be a proceedings booklet for attendees that will include the abstracts of invited talks and accepted posters/demos, contact details for all attendees, and other pertinent information. Attendees with accepted abstracts will be asked to contribute 1 page of 'camera-ready' material to the proceedings (formatting instructions to be published here at a later date). BOOKING INFORMATION Please book through the SAB '02 web-site (http://www.isab.org.uk/sab02/). Booking information should be available there soon. Note that it should be possible to register for the workshop without registering for the full conference. FOR MORE INFORMATION Please visit the workshop home-page: http://www.shef.ac.uk/~abrg/sab02/index.shtml. From vato at dibe.unige.it Wed Mar 27 06:26:50 2002 From: vato at dibe.unige.it (Alessandro Vato) Date: Wed, 27 Mar 2002 12:26:50 +0100 Subject: NEUROENGINEERING WORKSHOP AND ADVANCED SCHOOL Message-ID: <008101c1d582$4a034e80$6859fb82@Nathannever> NE.W.S. : NEUROENGINEERING WORKSHOP AND ADVANCED SCHOOL June 10 - 13, 2002 University of Genova, Italy http://www.bio.dibe.unige.it Organized by: Prof. Sergio Martinoia Neuroengineering and Bio-nanoTechnologies Group, Department of Biophysical and Electronic Engineering (DIBE), University of Genova Prof. Pietro Morasso Department of Communications, Computer and System Sciences (DIST), University of Genova Funded by the University of Genova MAIN GOAL: to understand, modify, and use brain plasticity in order to advance Neuroscience at the network level and to inspire new computer architectures. Scientific background: Bioengineering, Electronics, Informatics, Neuroscience. 
How to reach the main goal: * By interfacing in vitro neurons to standard and microelectronic transducers capable of monitoring and modifying the neurons' electrophysiological activity * By creating hybrid neuro-electronic systems * By developing neuro-prostheses * By simulating plasticity at the network level on the computer * By developing neuromorphic silicon neurons INVITED SPEAKERS: Fabio Babiloni, Department of Human Physiology and Pharmacology, University of Rome "La Sapienza", Rome, Italy Paolo Dario, Sant'Anna School of University Studies and Doctoral Research, Pisa, Italy Alain Destexhe, Unite de Neuroscience Integratives et Computationelles, CNRS, Gif-sur-Yvette, France Stefano Fusi, Institute of Physiology, University of Bern, Bern, Switzerland Michele Giugliano, Institute of Physiology, University of Bern, Bern, Switzerland Giacomo Indiveri, Institute for Neuroinformatics, ETH / University of Zurich, Switzerland Leandro Lorenzelli, Istituto Trentino di Cultura, Centre for Scientific and Technological Research, Microsystems Division, Trento, Italy Giorgio Metta, Humanoid Robotics Group, MIT - Artificial Intelligence Lab, Cambridge (MA), USA Sandro Mussa-Ivaldi, Department of Physiology, Northwestern University Medical School, Chicago (IL), USA Miguel Nicolelis, Department of Neurobiology, Duke University, Durham (NC), USA Rolf Pfeifer, Department of Information Technology, University of Zurich, Switzerland Thomas Sinkjaer, Centre for Sensory-Motor Interaction, Aalborg University, Denmark Stan Gielen, Department of Medical Physics and Biophysics, University of Nijmegen, The Netherlands Stefano Vassanelli, Department of Anatomy and Human Physiology, University of Padova, Italy REGISTRATION INFORMATION: Early registration is recommended. To register, please fill out the registration form below and send it by email to news2002 at bio_nt.dibe.unige.it. REGISTRATION FORM (Please send it by email to: news2002 at bio_nt.dibe.unige.it) NE.W.S. 
: NEUROENGINEERING WORKSHOP AND ADVANCED SCHOOL June 10 - 13, 2002 University of Genova, Department of Biophysical and Electronic Engineering V. Opera Pia 11a, 16145 Genova Mr/Ms/Dr/Prof: _____________________________________________________ Name: ______________________________________________________________ Affiliation: _______________________________________________________ Address: ___________________________________________________________ City, State, Postal Code: __________________________________________ Phone and Fax: _____________________________________________________ Email: _____________________________________________________________ Registration fee: CHECK ONE: ( ) Euro 50 Registration Fee (Student) ( ) Euro 100 Registration Fee (Regular) PREFERRED METHOD OF PAYMENT : [ ] Bank transfer: Bank CARIGE, Agency 41, ABI:6175, CAB: 1472 c/c DIBE - University of Genova 5341/90. Each registrant is responsible for any and all bank charges. [ ] I wish to pay my fees by credit card (VISA), check or cash at the very beginning of the workshop. -------------------------------------------------------------------- Alessandro Vato Neuroengineering and Bio-nanoTechnology - NBT Department of Biophysical and Electronic Engineering - DIBE Via Opera Pia 11A, 16145, GENOVA, ITALY Tel. +39-10-3532765 Fax. +39-10-3532133 URL: http://www.bio.dibe.unige.it/ From levine at uta.edu Wed Mar 27 17:54:06 2002 From: levine at uta.edu (Daniel S Levine) Date: Wed, 27 Mar 2002 16:54:06 -0600 Subject: Textbook examination copies Message-ID: <6A467F62A0D2D51194FE0004AC4CA556E7EA52@exchange.uta.edu> Dear fellow Connectionists, A free examination copy of my textbook, Introduction to Neural and Cognitive Modeling (2nd ed., 2000), is available to anyone requesting it for teaching purposes from Bill Webber at Lawrence Erlbaum Associates. His e-mail is bwebber at erlbaum.com. 
Best wishes, Dan Levine levine at uta.edu From bower at uthscsa.edu Wed Mar 27 20:27:45 2002 From: bower at uthscsa.edu (James Bower) Date: Wed, 27 Mar 2002 19:27:45 -0600 Subject: GUM*2002 Message-ID: CALL FOR PAPERS First Annual GENESIS Users Meeting GUM*2002 November 8-10, San Antonio, Texas The first annual GENESIS Users Meeting will take place this fall in beautiful San Antonio, Texas. Unique in its structure, this meeting will combine introductory, intermediate, and advanced tutorials in the use of GENESIS with a full agenda of scientific presentations focused on the study of biological systems using realistic modeling techniques. The meeting's overall objective will be to promote communication and collaboration between GENESIS users and others involved in realistic biological modeling and to provide an introduction to other interested scientists. GENESIS Tutorials. The schedule on Friday, November 8th, will be devoted to tutorials organized at several different levels. The introductory tutorial will be led by Dr. David Beeman, who has been providing introductory instruction in GENESIS use since the first course in Computational Neuroscience at the Marine Biological Laboratory in the summer of 1989, and more recently with the European Course in Computational Neuroscience. Students, Postdoctoral Fellows, or Faculty interested in realistic modeling techniques are encouraged to attend. Concurrent tutorials will also be offered on Friday for intermediate and advanced GENESIS users. Scientific Program Saturday and Sunday, November 9th and 10th, will be devoted to scientific and technical presentations related to realistic simulations of biological systems. Appropriate presentations include scientific results from realistic modeling efforts, presentations on technical aspects of simulator use and development, and presentations describing biological systems felt to be ripe for simulation and modeling. 
While research using GENESIS is especially encouraged, presentations based on other simulation systems are also welcomed. The organization of the scientific program will also be somewhat unusual. The morning sessions will be devoted to 15-minute oral presentations from each of the contributing authors. This will provide all attendees with exposure to the full range of work presented at the meeting. The afternoon will then be devoted to more detailed discussion of results in a poster/demonstration format. Community Building In addition to the educational and scientific aspects of the meeting, opportunities will also be provided for more relaxed interactions between participants. San Antonio, Texas, is famous for its River Walk (http://www.thesanantonioriverwalk.com/index.asp), where the meeting banquet will take place on Saturday evening. The San Antonio/Austin area of South Texas is also famous for its indigenous music scene - recently added to by Ramon and the K-Halls, who, it is rumored, are already putting pressure on meeting organizers for an exclusive contract. In other words, a good time will be had by all! Society for Neuroscience Attendees Likely attendees of this year's Society for Neuroscience meeting in Orlando, Florida, should note that GUM*02 is scheduled for the weekend after the neuroscience meeting, and Texas is not very far from Florida and much more interesting and real than Orlando. Meeting Registration Costs Registration for GUM*02 will be $99 for students and $159 for non-students. Extra banquet tickets will be available for $35. Pre-Registration Because this is the first GUM, we ask that interested participants pre-register by sending email to the GENESIS users group at babel at genesis-sim.org. There is no obligation, but pre-registration will allow the organizers to gauge the likely number of participants. With that information, the organizers will be able to arrange special rates for housing. 
Registration and Request for Presentations Meeting and tutorial registration will open on August 1st. At that time authors will be asked to provide an abstract for their presentations. We hope to see you in ol' San Antone this Fall -- James M. Bower Ph.D. Research Imaging Center University of Texas Health Science Center at San Antonio 7703 Floyd Curl Drive San Antonio, TX 78284-6240 Cajal Neuroscience Center University of Texas San Antonio Phone: 210 567 8080 Fax: 210 567 8152 From hong at deakin.edu.au Thu Mar 28 00:42:05 2002 From: hong at deakin.edu.au (hong) Date: Thu, 28 Mar 2002 16:42:05 +1100 Subject: ICONIP'02-SEAL'02-FSKD'02 Call for Papers Message-ID: <3CA2AD2D.39566521@deakin.edu.au> [Apologies if you receive this announcement more than once.] ---------------------------------------------------------------------- 9th International Conference on Neural Information Processing (ICONIP'02) 4th Asia-Pacific Conference on Simulated Evolution And Learning (SEAL'02) International Conference on Fuzzy Systems and Knowledge Discovery (FSKD'02) ---------------------------------------------------------------------- November 18 - 22, 2002, Orchid Country Club, Singapore http://www.ntu.edu.sg/home/nef Organized by: School of Electrical and Electronic Engineering Nanyang Technological University, Singapore Sponsored by: Asia-Pacific Neural Network Assembly SEAL & FSKD Steering Committees Singapore Neuroscience Association In Co-Operation with: IEEE Neural Network Society International Neural Network Society European Neural Network Society SPIE Supported by: Lee Foundation Singapore Exhibition & Convention Bureau ~~~~~~~~ CALL FOR PAPERS, SPONSORSHIPS, AND SPECIAL SESSION PROPOSALS ~~~~~~~~ ICONIP'02, SEAL'02, and FSKD'02 will be jointly held at Orchid Country Club, Singapore, from November 18 to 22, 2002. 
The conferences will not only feature the most up-to-date research results in natural and artificial neural systems, evolutionary computation, fuzzy systems, and knowledge discovery, but also promote cross-fertilization over these exciting and yet closely-related areas. Registration to any one of the conferences will entitle a participant to the technical sessions and the proceedings of all three conferences, as well as the conference banquet, buffet lunches, and tours to two of the major attractions in Singapore, i.e., Night Safari and Sentosa Resort Island. Many well-known researchers will present keynote speeches, panel discussions, invited lectures, and tutorials. About Singapore --------------- Located at one of the most important crossroads of the world, Singapore is truly a place where East and West come together. Here you will find Chinese, Indian, and Malay communities living together, their long established cultures forming a unique backdrop to a clean and modern garden city. English is spoken everywhere and is the common business language of all. Few places on earth promise such a delight for the palate, with gourmet cuisine from over 30 countries. Exotic resorts in neighboring countries are only a short bus/ferry ride away. Orchid Country Club (OCC) ------------------------- The venue for this year's conferences is one of Singapore's premier country clubs, a 25-minute bus ride from the city. Away from the hustle and bustle of downtown Singapore, the tranquil setting of the resort is ideal for serious technical discussions with an accommodating space and ambience for relaxation. Not to miss out on the splendor of downtown Singapore, the organizer has also secured good quality and affordable accommodation in the heart of the city with pre-arranged transport to/from the OCC. 
For golf enthusiasts, OCC is equipped with the largest computerized driving range in South East Asia and boasts a 27-hole golf course with facilities for night golfing, ideal for relaxation after each day of technical discussions. Visit the OCC website at http://www.orchidclub.com Night Safari and Sentosa Resort Island -------------------------------------- It is said that a visit to Singapore is not complete without making a trip to two of the Republic's famous attractions. The only one of its kind in the world, the Night Safari provides a setting for visitors to experience what it is like to observe animals in their nocturnal habitat. The island of Sentosa offers some unique attractions and a visit there will also provide a glimpse and imagery of Singapore's past and present. Visits to these two attractions will be included as recreation for the joint conference. (Websites: http://www.zoo.com.sg/safari/, http://www.sentosa.com.sg) Topics of Interest ------------------ The joint conferences welcome paper submissions from researchers, practitioners, and students worldwide in, but not limited to, the following areas. ICONIP'02: ~~ ARTIFICIAL NEURAL MODELS - Learning algorithms, Neural modeling and architectures, Neurodynamics NATURAL NEURAL SYSTEMS - Neuroscience, Neurobiology, Neurophysiology, Brain imaging, Learning and memory COGNITIVE SCIENCE - Perception, emotion, and cognition, Selective attention, Vision and auditory models HARDWARE IMPLEMENTATION - Artificial retina & cochlear chips HYBRID SYSTEMS - Neuro-fuzzy systems, Evolutionary neural nets, etc APPLICATIONS - Bioinformatics, Finance, Manufacturing, etc. 
SEAL'02: ~~ THEORY - Co-evolution, Coding methods, Collective behavior METHODOLOGY - Evolution strategies, Genetic algorithms, Genetic programming, Molecular and quantum computing, Evolvable hardware, Multi-objective optimization, Ant colony, Artificial ecology EVOLUTIONARY LEARNING - Artificial life, Bayesian evolutionary algorithms HYBRID SYSTEMS - Evolutionary neuro-fuzzy systems, Soft computing APPLICATIONS - Scheduling, Operations research, Design, etc FSKD'02: ~~ THEORY AND FOUNDATIONS - Fuzzy theory and models, Uncertainty management, Statistical & probabilistic data mining, Computing with words, Rough sets, Intelligent agents METHODS AND ALGORITHMS - Classification, Clustering, Information retrieval & fusion, Data warehousing & OLAP, Fuzzy hardware, Visualization, Decision trees, Data preprocessing HYBRID SYSTEMS - Evolutionary neuro-fuzzy systems, Soft computing APPLICATIONS - Control, Optimization, Natural language processing, Forecasting, Human-computer interaction, etc. Special Sessions ---------------- The conferences will feature special sessions on specialized topics to encourage in-depth discussions. To propose a special session, email the session title, the name of the conference under which the special session will be organized, contact information of the organizer(s), and a short description of the theme and topics covered by the session to Xin Yao, Special Sessions Chair (x.yao at cs.bham.ac.uk), with a copy to Lipo Wang, General Chair (Cc: elpwang at ntu.edu.sg). Sponsorship ----------- The conferences will offer product vendors a sponsorship package and/or an opportunity to interact with conference participants. Product demonstration and exhibition can also be arranged. For more information, please visit the conference website or contact Tong Seng Quah, Sponsorship/Exhibition Chair (itsquah at ntu.edu.sg), with a copy to Lipo Wang, General Chair (Cc: elpwang at ntu.edu.sg). 
Keynote Speakers ---------------- Shun-ichi Amari, RIKEN Brain Science Institute, Japan David Fogel, Natural Selection, Inc., USA Mitsuo Kawato, ATR, Japan Xin Yao, The University of Birmingham, UK Lotfi A. Zadeh, University of California, USA Registration Fee ---------------- The registration fee for regular participants before August 15, 2002 is S$680 (approximately US$370 as of February 6, 2002), which includes the proceedings, lunches, banquet, and tours. Submission of Papers -------------------- Authors are invited to submit electronic files (postscript, pdf or Word format) through the conference home page. Papers should be double-column and use 10 pt Times Roman or similar fonts. The final version of a paper should not exceed 5 pages in length. A selected number of accepted papers will be expanded and revised for possible inclusion in edited books and peer-reviewed journals, such as "Soft Computing" and "Knowledge and Information Systems: An International Journal" by Springer-Verlag. Important Dates --------------- Paper/Summary Deadline : April 30, 2002 Notification of Acceptance : July 15, 2002 Final Paper/Registration : August 15, 2002 Honorary Conference Chairs -------------------------- Shun-ichi Amari, Japan Hans-Paul Schwefel, Germany Lotfi A. Zadeh, USA International Advisory Board ---------------------------- Sung-Yang Bang, Korea Meng Hwa Er, Singapore David B. Fogel, USA Toshio Fukuda, Japan A. Galushkin, Russia Tom Gedeon, Australia Zhenya He, China Mo Jamshidi, USA Nikola Kasabov, New Zealand Sun-Yuan Kung, USA Tong Heng Lee, Singapore Erkki Oja, Finland Nikhil R. Pal, India Enrique H. Ruspini, USA Harcharan Singh, Singapore Ah Chung Tsoi, Australia Shiro Usui, Toyohashi, Japan Lei Xu, China Benjamin W. Wah, USA Donald C. Wunsch II, USA Xindong Wu, USA Youshou Wu, China Yixin Zhong, China Jacek M. Zurada, USA Advisor ------- Alex C. 
Kot, Singapore General Chair ------------- Lipo Wang, Singapore Program Co-Chairs ----------------- ICONIP'02: Kunihiko Fukushima, Japan Soo-Young Lee, Korea Jagath C. Rajapakse, Singapore SEAL'02: Takeshi Furuhashi, Japan Jong-Hwan Kim, Korea Kay Chen Tan, Singapore FSKD'02: Saman Halgamuge, Australia Special Sessions: Xin Yao, UK Finance Chair ------------- Charoensak Charayaphan, Singapore Local Arrangement Chair ----------------------- Meng Hiot Lim, Singapore Proceedings Chair ----------------- Farook Sattar, Singapore Publicity Co-Chairs ------------------- Hepu Deng, Australia Chunru Wan, Singapore Li Weigang, Brazil Zili Zhang, Australia Sponsorship/Exhibition Chair ---------------------------- Tong Seng Quah, Singapore Tutorial Chair -------------- P. N. Suganthan, Singapore For More Information -------------------- Please visit the conference home page or contact: Lipo Wang, ICONIP'02-SEAL'02-FSKD'02 General Chair School of Electrical and Electronic Engineering Nanyang Technological University Block S2, 50 Nanyang Avenue, Singapore 639798 Email: elpwang at ntu.edu.sg Phone: +65 6790 6372 Fax: +65 6792 0415 Conference Secretariat ---------------------- ICONIP'02-SEAL'02-FSKD'02 Secretariat Conference Management Center/CCE, NTU Administration Annex Building #04-06 42 Nanyang Avenue, Singapore 639815 Email: nef at ntu.edu.sg Fax: +65 6793 0997