From bogus@does.not.exist.com Mon Oct 1 06:32:01 2001
From: bogus@does.not.exist.com ()
Date: Mon, 1 Oct 2001 12:32:01 +0200
Subject: special sessions at ESANN'2002
Message-ID:

----------------------------------------------------
|                                                  |
|                    ESANN'2002                    |
|                                                  |
|              10th European Symposium             |
|           on Artificial Neural Networks          |
|                                                  |
|      Bruges (Belgium) - April 24-25-26, 2002     |
|                                                  |
|                 Special sessions                 |
----------------------------------------------------

The following message contains a summary of all special sessions that will be organized during the ESANN'2002 conference. Authors are invited to submit their contributions to one of these sessions or to a regular session, according to the guidelines found on the web server of the conference (http://www.dice.ucl.ac.be/esann/).

List of special sessions that will be organized during the ESANN'2002 conference
============================================================================

1. Perspectives on Learning with Recurrent Networks (B. Hammer, J.J. Steil)
2. Representation of high-dimensional data (A. Guérin-Dugué, J. Hérault)
3. Neural Network Techniques in Fault Detection and Isolation (S. Simani)
4. Hardware and Parallel Computer Implementations of Neural Networks (U. Seiffert)
5. Exploratory Data Analysis in Medicine and Bioinformatics (A. Wismüller, T. Villmann)
6. Neural Networks and Cognitive Science (H. Paugam-Moisy, D. Puzenat)

Short description
=================

1. Perspectives on Learning with Recurrent Networks
---------------------------------------------------

Organised by: Barbara Hammer, University of Osnabrück
              Jochen J. Steil, University of Bielefeld

Description of the session:

Recurrent neural networks constitute a natural and widely applicable tool for processing spatio-temporal data such as time series, language, control signals, financial data, etc. According to the various areas of application, many different models have been proposed: recurrent networks may evolve in continuous or discrete time, they may be fully or partially connected, and they may be trained with supervised or unsupervised methods, to name just a few aspects. Given this variety, it is difficult to compare and analyse recurrent networks in a common framework. One approach is to focus on their learning ability, where it is well known that classical gradient descent techniques suffer from numerical problems and may require a huge amount of data for adequate generalization. A main question is then: can we find efficient learning algorithms for the specific tasks, and can we guarantee their success? To this end, different means such as normalization of the gradients, stochastic approaches, or genetic algorithms have been proposed. A key ingredient for efficient learning and valid generalization is some kind of regularisation. At a technical level this is mainly achieved by a proper choice of the fitness functions or the stochastic model, and/or constraints on the weights, often inspired by similar techniques for feedforward networks. Recently, more non-standard methods have also been proposed: for continuous networks, learning can be restricted to well-behaved, e.g. stable, regions of the network known from a dynamical analysis. In the case of symbolic, i.e. discrete, inputs, relations to symbolic systems such as finite automata can be exploited. Integration of automata rules or a restriction of the network to automata behavior often yields a good starting point for further training.
We can borrow ideas from biological systems for adequate training according to the specific area of application, e.g. in speech acquisition, or restrict the training to combining prototypic weight-matrix templates. The session will focus on methods and examples for efficient training of recurrent networks such that valid generalization can be achieved. This involves algorithms, successful applications, and theoretical investigations that put forward new insights and ideas for algorithm design. Authors are invited to submit contributions related to the following list of topics: training methods for recurrent networks; genetic, evolutionary, and hybrid approaches; regularization, e.g. via stability constraints; connection to automata or symbolic systems; investigation of the dynamical behavior; general learning theory of recurrent networks; applications, e.g. in speech recognition or forecasting; other learning-related topics.

2. Representation of high-dimensional data
------------------------------------------

Organised by: Anne Guérin-Dugué, CLIPS-IMAG, Grenoble (France)
              Jeanny Hérault, LIS Grenoble (France)

Description of the session:

A common problem in Artificial Neural Networks is the analysis of non-linearly related and high-dimensional data. For human beings living in a 3-D world, there is a strong need for the representation and visualisation of these data. The problem is not simple; because of the so-called "curse of dimensionality", our understanding of high dimensions is often similar to the primitive numeration: one, two, three... many. The problem implies the need for dimension and size reduction of data sets, but for this purpose, some questions remain widely open and are of major importance: evaluation of similarities, distance metrics, manifold unfolding, local projections, non-linear relations...

Topics to be addressed (this list is not exhaustive):
Exploratory data analysis
Similarity, distance metrics
Vector quantization
Multidimensional scaling, Self-Organized Maps
Visualization
High Order Statistics
Human-driven exploration
Applications (Vision, Genetics, Astronomy, Psychometrics...)

3. Neural Network Techniques in Fault Detection and Isolation
-------------------------------------------------------------

Organised by: Silvio Simani, Department of Engineering, University of Ferrara

Background:

In recent years neural networks have been exploited successfully in pattern recognition as well as in function approximation, and they have therefore been proposed as a possible technique for fault diagnosis, too. Neural networks can handle non-linear behaviour and partially known processes because they learn the diagnostic requirements by means of the information in the training data. They are noise tolerant, and their abilities to generalise knowledge and to adapt during use are extremely interesting properties.

Contribution topics:

Under the previous assumptions, session contributions should concern neural network methods for fault diagnosis and identification which can be applied to a broad spectrum of processes. In particular, the papers can address the following items: 1. Diagnosis problem (fault detection and isolation, FDI). Neural networks are exploited to estimate unknown relationships between symptoms and faults. In this way, residuals which depend only on system faults can be generated by means of model-based techniques.
Therefore, neural networks are able to evaluate patterns of residuals, which are uniquely related to particular fault conditions independently of the plant dynamics. 2. Identification problem for FDI. Neural networks can also be exploited for the identification of complex dynamic processes. Such structures can therefore be successfully used to describe the input-output behaviour of the monitored systems. Moreover, on the basis of the analytical redundancy principle, the identified non-linear models can hence be applied to the development of model-based FDI algorithms.

4. Hardware and Parallel Computer Implementations of Neural Networks
--------------------------------------------------------------------

Organised by: Udo Seiffert, University of Magdeburg (Germany)

Description of the session:

There are three major reasons to implement neural networks on specialised hardware. The first one comes from the internal topology and data flow of the network itself, which provides, depending on the considered network type, a more or less massive parallelism. While parallel implementations for that reason are often intended to adapt a technical model as closely as possible to its biological original, the second objective becomes evident when dealing with large-scale networks and increasing training times. In this case, parallel computer hardware can significantly accelerate the training of existing networks or make their realisation viable at all. Sometimes particular hardware, which is not necessarily parallel, is essential to meet some requirements of a practical application. This session covers all three topics and tries to reflect the great diversity of dedicated hardware implementations from the neural networks point of view.

5. Exploratory Data Analysis in Medicine and Bioinformatics
-----------------------------------------------------------

Organised by: Axel Wismüller, Institut für Klinische Radiologie, Ludwig-Maximilians-Univ. München (Germany)
              Thomas Villmann, Klinik für Psychotherapie, Universität Leipzig (Germany)

Description of the session:

Biomedical research is a challenge to neural network computation. As medical doctors and bioscientists are facing vast, rapidly growing amounts of data, the need for advanced exploratory data analysis techniques increasingly moves into the focus of attention. In this context, artificial neural networks, as a special kind of learning and self-adapting data processing systems, have considerable contributions to offer. Their abilities to handle noisy and high-dimensional data, nonlinear problems, large data sets etc. have led to a wide scope of successful applications in biomedicine. Beyond the classical subjects of neural computation in biomedicine, such as computer-aided diagnosis or biomedical image processing, new application domains are discovering the conceptual power of artificial neural networks for exploratory data analysis and visualization. As an important example, the subject of `bioinformatics' has emerged in recent years as a promising application domain with growing importance for both biomedical basic research and clinical application. Neural network computation in biomedicine can be driven by different motivations. We have to distinguish at least two main directions: the first one is the description of neural processes in brains by neural network models. The other one is to exploit neural computation techniques for biomedical data analysis. The announced special session should focus on the second item.
Although, at first glance, the growing number of applications in the field may seem encouraging, there are still considerable unsolved problems. In particular, there is a need for continuous research emphasizing quality assessment, including critical comparative evaluation of competing biosignal processing algorithms with respect to specific constraints of given application domains. In this context, it increasingly becomes clear that knowledge about neural network theory alone is not sufficient for designing successful applications aiming at the solution of relevant real-world problems in biomedicine. What is required as well is a sound knowledge of the data, i.e. the underlying application domain. Although there may be methodological similarities, each application requires specific careful consideration with regard to data preprocessing, postprocessing, interpretation, and quality assessment. This challenge can only be managed by close interdisciplinary cooperation of medical doctors, biologists, engineers, and computer scientists. Hence, this subject can serve as an example of lively cross-fertilization between neural network computing and related research.

In the proposed special session we want to focus on exploratory data analysis in medicine and bioinformatics based on neural networks as well as other advanced methods of computational intelligence. A special emphasis is put on real-world applications combining original ideas and new developments with a strong theoretical background. Authors are invited to submit contributions which can be in any area of medical research or bioinformatics applications. The following non-restrictive list can serve as an orientation; however, additional topics may be chosen as well:
time-series analysis (EEG, EKG analysis, sleep monitoring, ...)
pattern classification, clustering
functional and structural genomics
blind source separation and decorrelation
dimension and noise reduction
evaluation of non-metric data (e.g. categorical, ordinal data)
hybrid systems
decision support systems
data mining
quality assessment
knowledge-data fusion

6. Neural Networks and Cognitive Science
----------------------------------------

Organised by: Helene Paugam-Moisy, Universite Lyon 2 (France)
              Didier Puzenat, Universite Antilles-Guyane (France)

Description of the session:

First, neural networks have been inspired by cognitive processes [PDP,1986]. Second, they have proved to be very efficient computing tools for engineering, financial and medical applications, etc. The aim of this session is to point out that there is still a great interest, for both engineering and cognitive science, in exploring more deeply the links between natural and artificial neural systems, from a theoretical point of view. On the one hand: how to define more complex learning rules adapted to heterogeneous neural networks, and how to build modular multi-network systems for modeling cognitive processes. On the other hand: how to carry new and interesting learning paradigms back to artificial neural networks, and how to design better-performing systems than the classical basic connectionist models. In particular, the strong power of parallel distributed processing is far from being fully understood, and new ideas can be found in cognitive science both for boosting the efficiency of parallel computing and for designing more efficient learning rules.

[PDP,1986] Parallel Distributed Processing: Explorations in the Microstructure of Cognition, D. E. Rumelhart, J. L.
McClelland and the PDP Research Group, MIT Press, 1986 ======================================================== ESANN - European Symposium on Artificial Neural Networks http://www.dice.ucl.ac.be/esann * For submissions of papers, reviews,... Michel Verleysen Univ. Cath. de Louvain - Microelectronics Laboratory 3, pl. du Levant - B-1348 Louvain-la-Neuve - Belgium tel: +32 10 47 25 51 - fax: + 32 10 47 25 98 mailto:esann at dice.ucl.ac.be * Conference secretariat d-side conference services 24 av. L. Mommaerts - B-1140 Evere - Belgium tel: + 32 2 730 06 11 - fax: + 32 2 730 06 00 mailto:esann at dice.ucl.ac.be ======================================================== From bbs at bbsonline.org Tue Oct 2 16:13:39 2001 From: bbs at bbsonline.org (Behavioral & Brain Sciences) Date: Tue, 02 Oct 2001 16:13:39 -0400 Subject: BBS: ANNOUNCEMENT Message-ID: Dear Dr. Connectionists List User, BBS announcement Cambridge University Press regrets to announce that Dr Stevan Harnad (University of Southampton, UK) has resigned as Editor of Behavioral and Brain Sciences. We are deeply grateful for the energy and commitment that he has devoted to the journal since its foundation and launch in 1978. His ideas and insights have always been stimulating and provocative, and he is especially recognised for providing much of the impetus in the continuing development and evolution of the exciting new endeavours of electronic publication and dissemination of knowledge. Cambridge is now consulting with Stevan and the Associate Editors of BBS and will shortly be establishing a Search Committee to appoint a new Editor. In the meantime, Dr Gavin Swanson, Editorial Manager, Cambridge Journals, has assumed editorial responsibility for BBS in the interregnum. All submissions and production or progress enquiries for BBS should continue to be addressed to bbs at bbsonline.org Dr Conrad Guettler Director, Journals Cambridge University Press -------------------------------------- -------------------------------------------------------------------- Dr Gavin Swanson Tel: +44 (0)1223 326223 (direct) Editorial Manager, Journals Cambridge University Press Fax: +44 (0)1223 315052 Shaftesbury Road ttab E-mail: gswanson at cambridge.org Cambridge CB2 2RU UK ttab Web: http://uk.cambridge.org (outside North America) http://us.cambridge.org (North America) Cambridge Journals Online: http://journals.cambridge.org/ From istvan at louisiana.edu Tue Oct 2 17:18:34 2001 From: istvan at louisiana.edu (Istvan Berkeley) Date: Tue, 02 Oct 2001 16:18:34 -0500 Subject: 2 Position Announcements Message-ID: <3BBA2F2A.594BCC98@louisiana.edu> Hi there, The following two position announcements may be of interest to list members. All the best, Istvan FACULTY POSITION IN THE INSTITUTE OF COGNITIVE SCIENCE. The Institute of Cognitive Science at the University of Louisiana at Lafayette invites applications for a tenure-track faculty appointment for the Fall of 2002. The appointment will be made at the associate professor or senior assistant professor level. The Institute of Cognitive Science is a graduate unit offering a Ph.D. program in cognitive science. Focus areas of the program are in cognitive processes, comparative cognition, cognitive development, computational models of mind, cognitive neuroscience, and linguistic/psycholinguistic processes. Applicants should hold a Ph.D. in cognitive science, psychology, or a related discipline, and must exhibit evidence of a productive research program. 
Please send a curriculum vitae, selected reprints, and at least three letters of reference to Subrata Dasgupta, Institute of Cognitive Science, University of Louisiana at Lafayette, P.O. Drawer 43772, Lafayette, LA 70504-3772. Formal review of applications will commence December 1, 2001, but applications will be accepted until the position is filled. The University of Louisiana at Lafayette is an equal opportunity/affirmative action employer.

Assistant Professor, tenure-track, beginning Fall 2002. AOS: Philosophy of Mind/Cognitive Science. AOC: Open, but Ancient, Aesthetics, Metaphysics, and/or Philosophy of Language a plus. Ph.D. at time of hire and teaching experience required. The candidate should exhibit research promise, proven excellence in teaching and a commitment to the development and life of the philosophy program and the new Institute of Cognitive Science Ph.D. program at UL Lafayette. Thus, empirical experience in a cognitive-science-related discipline would be a considerable advantage. Send cover letter, C.V., at least 3 confidential letters of reference, recent teaching evaluations, sample(s) of scholarly written work, statement describing research program and statement of teaching philosophy to:

Dr. John Moore
Philosophy Program
P.O. Box 43770
The University of Louisiana at Lafayette
Lafayette, LA 70504, USA

Screening of applications will begin Nov. 25. For more information about the University of Louisiana at Lafayette visit our web page at http://www.louisiana.edu. UL Lafayette is an AA/EEO employer. Women and minorities are encouraged to apply.

--
Istvan S. N. Berkeley, Ph.D.
Philosophy & Cognitive Science
E-mail: istvan at louisiana.edu
The University of Louisiana at Lafayette
P.O. Box 43770
Lafayette, LA 70504-3770, USA
Tel: +1 337 482-6807
Fax: +1 337 482-5002
http://www.ucs.louisiana.edu/~isb9112

From michaelPichat at univ-paris8.fr Wed Oct 3 09:27:18 2001
From: michaelPichat at univ-paris8.fr (Michael PICHAT)
Date: Wed, 03 Oct 2001 15:27:18 +0200
Subject: Workshop on Multidisciplinary Aspects of Learning
Message-ID: <3BBB1236.4EC0999A@univ-paris8.fr>

EUROPEAN SOCIETY FOR THE STUDY OF COGNITIVE SYSTEMS
Special Workshop on Multidisciplinary Aspects of Learning
Clichy (Paris, France), 17-19 January 2002

TOPIC
The ESSCS attempts to promote the multidisciplinary study of all aspects of cognition. Learning is one of the nodal points of cognition and raises many integrative issues with regard to the study of cognitive systems. This is the first special workshop on this topic organised by the ESSCS. The spirit of the workshop is deliberately chosen to encourage researchers from various fields to discuss with each other the challenges and opportunities offered by a cross-disciplinary approach to learning.

SCOPE
Contributions are invited on all aspects of learning, in human, animal, and artificial systems. More specifically the following subdisciplines of cognitive sciences are involved:
- Psychology (cognitive, clinical, developmental, ergonomics)
- Artificial intelligence (general aspects)
- Neurosciences (associative memory, neural networks, etc.)
- Linguistics (also computational), language disorders
- Educational and Instructional sciences
- Philosophy, History of concepts.

INVITED SPEAKERS
G.J. Dalenoort (University of Groningen): Theoretical considerations on learning
G. Vergnaud (CNRS, Universite Paris 8): Learning and conceptual development
J. Rogalski (CNRS, Universite Paris 8): Epistemology and cognitive analysis of the task: towards a common frame for analysing competence acquisition from students to professionals

ORGANISATION
The scientific program includes both oral communications and poster presentations. Each oral communication will be allotted 20 minutes for presentation plus 10 minutes for discussion. The workshop schedule will include a poster session; presenters will stand by their posters for informal discussion with workshop participants. The working language of the workshop is English. There will be a maximum of 20 oral presentations. The total number of presentations will be restricted to about 40. Participation is also possible without a communication.

The workshop will take place at the "Lycee Rene Auffray", 23 rue Fernand Pelloutier, 92110 Clichy, Tel: 01-49-68-90-00 (from abroad: +33-1-49-68-90-00), Web site: www.lycee-rene-auffray.com, Metro (underground): line 13 (direction: "Gabriel Peri - Asnieres - Gennevilliers"; Stop: "Mairie de Clichy")

For more information see the ESSCS webpage: http://www.esscs.org
For scientific or local information please contact: michael.pichat at univ-paris8.fr

SUBMISSION OF CONTRIBUTIONS
Contributions will be peer-reviewed and considered on the basis of their relevance across a variety of subdisciplines of cognitive sciences. This implies that papers that exclusively report on experimental results, without a theoretical basis or interpretation, will not be accepted. In case of doubt, please contact the organisers. Papers that have been accepted can also be submitted to a special issue of Cognitive Systems, the international peer-reviewed journal of the ESSCS. Submissions (in English) should be sent in the form of a 300-word abstract. The desired presentation form is to be indicated (oral, poster, oral or poster). Submissions should be sent as an email attachment (please both in RTF and DOS formats) to both G. Dalenoort (G.J.DALENOORT at ppsw.rug.nl) and M. Pichat (michael.pichat at univ-paris8.fr) by the 30th of October 2001. Acceptance will be notified within a week, upon which registration payment is required.

COMMITTEES
Scientific committee
G.J. Dalenoort, University of Groningen
G. Ricco, Universite Paris 8
G. Vergnaud, CNRS, Universite Paris 8
K.B. Koster, University of Groningen
P.L.C. Van Geert, University of Groningen

Organising committee
M. Pichat, Universite Paris 8
L. Numa-Bocage, IUFM de Picardie
M. Merri, ENFA de Toulouse
D. Morange, Universite Lyon 2
M.-C. Jollivet, IUFM de Poitou-Charentes

REGISTRATION
For administrative reasons, it is not possible to separate workshop and catering charges. Therefore, the following amounts include both workshop fees and catering fees (3 breakfasts, 2 lunches, 5 breaks). Please note that all amounts are given in Euro.
Members of the ESSCS: 85
Non-members of the ESSCS: 95
Students: 75
ESSCS membership: 12
Full registration fee deposit deadline: 15 November 2001
Payment modalities will be notified with the notification of acceptance. Please note that communication abstracts will not be published in the workshop proceedings unless payment is received.

ACCOMMODATION
Some double rooms are available in the Lycee Rene Auffray itself (place of the meeting). Double rooms (with shower/bath and toilets) are about 40 Euro altogether per night. These rooms have to be booked in advance.
Full accommodation fee deposit deadline: 15 November 2001
Payment modalities will be notified with the notification of acceptance.

Alternative accommodation (hotels)
Each room is provided with internal bathroom (toilets and shower or bath), television and telephone. Prices are per night and in Euro. Reservations with the hotels can be made directly (if deposits are required: best via a letter with authorisation to charge a credit card, to avoid high costs of international money transfers).

Hotel des Chasses (**), Single room: 61, Breakfast: 6
Distance to workshop place: 5 minutes walk.
Tel: from abroad: 00-33-1-47-37-01-73, in France: 01-47-37-01-73
Address: 49 rue Pierre Beregovy, 92110 Clichy
Website: www.hoteldeschasses.fr

Hotel Sovereign (***), Single room: 60, Breakfast: 6
Distance to workshop place: 10 minutes walk.
Tel: from abroad: 00-33-1-47-37-54-24, in France: 01-47-37-54-24
Address: 14 rue Dagobert, 92110 Clichy

Hotel Savoy (**), Single room: 68, Breakfast: 7
Distance from workshop place: 8 minutes walk.
Tel: from abroad: 00-33-1-47-37-17-01, in France: 01-47-37-17-01
Address: 20 rue Villeneuve, 92110 Clichy
Website: www.123france.com/europe/france/paris/hotels/hoclichy.htm

IMPORTANT DATES
Workshop: 17-19 January 2002
Submission deadline: 30th October 2001
Notification of acceptance: 7th November 2001
Registration fee deposit deadline: 15th November 2001
Accommodation fee deposit deadline: 15th November 2001

From jose.dorronsoro at iic.uam.es Wed Oct 3 12:49:02 2001
From: jose.dorronsoro at iic.uam.es (Jose Dorronsoro)
Date: Wed, 03 Oct 2001 18:49:02 +0200
Subject: ICANN2002
Message-ID: <1.5.4.32.20011003164902.0117e670@iic.uam.es>

Note: efforts have been made to avoid duplicate postings of this message. Apologies if, nevertheless, you are getting them.

ICANN 2002
First Call for Papers

The 12th International Conference on Artificial Neural Networks, ICANN 2002, to be held from August 27 to August 31, 2002 at the ETS de Informática of the Universidad Autónoma de Madrid, welcomes contributions on Theory, Algorithms, Applications and Implementations in the following broad Areas:
Computational Neuroscience
Data Analysis and Pattern Recognition
Vision and Image Processing
Robotics and Control
Signal and Time Series Processing
Connectionist Cognitive Science
Self-organization
Dynamical Systems

Suggestions for Tutorials, Workshops and Special Sessions are also welcome. Submissions will be possible by surface mail, e-mail attachment or through an upload page to be available at a later time. Concrete submission procedures and other related details will soon appear at the conference's web site, www.ii.uam.es/icann2002.

The Proceedings will be published in the "Lecture Notes in Computer Science" series of Springer-Verlag. Paper length is restricted to a maximum of 6 pages, including figures. The final paper layout must adhere strictly to the Author Instructions set out in the page http://www.springer.de/comp/lncs/authors.html of the LNCS web site. In particular, a LaTeX style file is available to help authors format their contributions according to the LNCS standard. Authors are asked to use that file and, in general, to follow very carefully that page's instructions.

Important deadlines are:
End of submission reception: February 15, 2002.
Notification of acceptance/rejection: April 15, 2002.
Final papers due (in hardcopy and electronically): May 15, 2002.

Three independent referees will review each submitted paper.
Acting on those reviews, the Program Committee will accept papers and assign them to either oral or poster presentation. All accepted papers (either for oral or poster presentation) will be published in the Proceedings under the same length restrictions. Proceedings will be distributed to all registered participants at the beginning of the Conference.

For further information and/or contacts, send inquiries to the following address:
ICANN 2002 Conference Secretariat
Mrs. Juana Calle
Escuela Técnica Superior de Informática
Universidad Autónoma de Madrid
28049 Madrid, Spain
e-mail: icann2002 at ii.uam.es

From meesad at okstate.edu Tue Oct 2 23:45:38 2001
From: meesad at okstate.edu (Phayung Meesad)
Date: Tue, 02 Oct 2001 22:45:38 -0500
Subject: Call for paper IJCNN2002
Message-ID: <000b01c14bbd$de1e14a0$fa384e8b@okstate.edu>

CALL FOR PAPERS
2002 International Joint Conference on Neural Networks (IJCNN2002)
May 12-17, 2002
Hilton Hawaiian Village, Honolulu, HI
Held as part of the World Congress on Computational Intelligence (http://www.wcci2002.org)

The annual IEEE/INNS International Joint Conference on Neural Networks (IJCNN) is one of the premier international conferences in the field. It covers all topics in neural networks, including but not limited to:
- supervised, unsupervised and reinforcement learning,
- hardware implementation,
- time series analysis,
- neurooptimization,
- neurocontrol,
- hybrid architectures,
- bioinformatics,
- neurobiology and neural modeling,
- embedded neural systems,
- intelligent agents,
- image processing,
- rule extraction,
- statistics,
- chaos,
- learning theory,
- and a huge variety of applications.

The emphasis of the Congress will be on original theories and novel applications of neural networks. The Congress welcomes paper submissions from researchers, practitioners, and students worldwide. IJCNN 2002 will be held in conjunction with the Congress on Evolutionary Computation (CEC) and the IEEE International Conference on Fuzzy Systems (FUZZ-IEEE) as part of the World Congress on Computational Intelligence (WCCI). Cross-fertilization of the three fields will be strongly encouraged. The Congress will feature keynote speeches and tutorials by world-leading researchers. It will also include a number of special sessions and workshops on the latest hot topics. Your registration admits you to all events and includes the World Congress proceedings and banquet.

The deadline for submissions is December 1, 2001. Look for more details on paper submission and conference registration coming soon. Bookmark the webpage at http://www.wcci2002.org. For more information, contact David Fogel, d.fogel at ieee.org

General Chairman, WCCI2002: David B. Fogel, Natural Selection, Inc., USA
Vice-General Chairman, WCCI2002: Mohamed A. El-Sharkawi, University of Washington, USA
Program Chairman, IJCNN2002: C. Lee Giles, NEC Research, USA
Technical co-Chairmen, IJCNN2002:
Don Wunsch, University of Missouri, Rolla, USA
Marco Gori, Universita degli Studi di Siena, Italy
Nik Kasabov, University of Otago, New Zealand
Michael Hasselmo, Boston University, USA
Special Sessions co-Chairmen, IJCNN2002: C.
Lee Giles , NEC Research, USA Don Wunsch, University of Missouri Rolla, USA Local Arrangements Chairman, WCCI2002: Anthony Kuh , University of Hawaii at Manoa, USA +++++++++++++++++++++++++++++++++++++++++++++++++++++ Publicity Chair, IJCNN2002 Gary Yen, Oklahoma State University, USA gyen at okstate.edu Publicity Co-Chair, IJCNN2002 Phayung Meesad, Oklahoma State University, USA meesad at okstate.edu +++++++++++++++++++++++++++++++++++++++++++++++++++++ From ttroyer at glue.umd.edu Wed Oct 3 15:45:10 2001 From: ttroyer at glue.umd.edu (Todd Troyer) Date: Wed, 03 Oct 2001 15:45:10 -0400 Subject: tenure-track position at U. Maryland Message-ID: <3BBB6AC6.BE13C6FD@glue.umd.edu> THE DEPARTMENT OF PSYCHOLOGY AT THE UNIVERSITY OF MARYLAND AT COLLEGE PARK has a tenure-track position at the assistant professor level for a cognitive scientist with expertise in computational, mathematical or neural modeling in areas such as, but not limited to, decision processes, memory, judgment, categorization, motor control and/or perception. The successful candidate must provide evidence of research productivity, and have a clear program of research capable of attracting external support. The person hired will be expected to teach at both the graduate and undergraduate levels. Please send a CV, a statement of research and teaching interests, and arrange to have three letters of recommendation sent to Professor Thomas Wallsten, Computational/Mathematical Psychology Search Committee, Department of Psychology, University of Maryland, College Park, MD 20742. The University of Maryland is an Affirmative Action/Equal Employment Opportunity Employer. For best consideration, materials should be received by October 15, 2001. -- ------------------------------------------------------------------------- Todd Troyer Dept of Psychology Ph: 301-405-9971 Program in Neuroscience FAX: 301-314-9566 and Cognitive Science e-mail: ttroyer at glue.umd.edu University of Maryland http://www.wam.umd.edu/~ttroyer College Park, MD 20742 ------------------------------------------------------------------------- From wahba at stat.wisc.edu Thu Oct 4 17:19:42 2001 From: wahba at stat.wisc.edu (Grace Wahba) Date: Thu, 4 Oct 2001 16:19:42 -0500 (CDT) Subject: Variable Selection, Multicategory SVM's Message-ID: <200110042119.QAA19382@hera.stat.wisc.edu> The following papers are available via http://www.stat.wisc.edu/~wahba -> TRLIST TR 1042 Variable Selection via Basis Pursuit for Non-Gaussian Data Hao Zhang, Grace Wahba,Yi Lin, Meta Voelker, Michael Ferris Ronald Klein and Barbara Klein Abstract A simultaneous flexible variable selection procedure is proposed by applying a basis pursuit method to the likelihood function. The basis functions are chosen to be compatible with variable selection in the context of smoothing spline ANOVA models. Since it is a generalized LASSO-type method, it enjoys the favorable property of shrinking coefficients and gives interpretable results. We derive a Generalized Approximate Cross Validation function (GACV), which is an approximate leave-out-one cross validation function used to choose smoothing parameters. In order to apply the GACV function for a large data set situation, we propose a corresponding randomized GACV. A technique called `slice modeling' is used to develop an efficient code. Our simulation study shows the effectiveness of the proposed approach in the Bernoulli case. 
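For readers less familiar with LASSO-type shrinkage, here is a minimal, generic sketch of the idea behind TR 1042: an L1 penalty on a Bernoulli (logistic) likelihood drives many coefficients exactly to zero and thereby selects variables. This is not the report's smoothing spline ANOVA basis-pursuit algorithm, it omits the GACV/randomized-GACV tuning step, and the data, the helper names and the penalty value lam below are made up purely for illustration.

# Illustrative sketch only (not the TR 1042 algorithm): L1-penalized logistic
# regression fit by proximal gradient descent (ISTA), showing LASSO-type
# shrinkage of coefficients for Bernoulli (0/1) data.
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of the L1 norm: move each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l1_logistic(X, y, lam, step=0.1, iters=5000):
    # Minimize (1/n) * negative Bernoulli log-likelihood + lam * ||beta||_1.
    n, d = X.shape
    beta = np.zeros(d)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))   # fitted Bernoulli probabilities
        grad = X.T @ (p - y) / n              # gradient of the smooth part
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta

# Toy data: only the first two of ten inputs actually influence the response.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
logit = 2.0 * X[:, 0] - 3.0 * X[:, 1]
y = (rng.uniform(size=200) < 1.0 / (1.0 + np.exp(-logit))).astype(float)
print(np.round(l1_logistic(X, y, lam=0.05), 2))  # irrelevant coefficients end up at exactly 0

In the report itself the penalized coefficients multiply smoothing spline ANOVA basis functions and the smoothing parameters are chosen by (randomized) GACV rather than fixed by hand, but the shrink-to-zero mechanism is the same.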
TR 1043 Multicategory Support Vector Machines Yoonkyung Lee, Yi Lin and Grace Wahba Abstract The Support Vector Machine (SVM) has shown great performance in practice as a classification methodology. Oftentimes multicategory problems have been treated as a series of binary problems in the SVM paradigm. Even though the SVM implements the optimal classification rule asymptotically in the binary case, solutions to a series of binary problems may not be optimal for the original multicategory problem. We propose multicategory SVMs, which extend the binary SVM to the multicategory case, and encompass the binary SVM as a special case. The multicategory SVM implements the optimal classification rule as the sample size gets large, overcoming the suboptimality of the conventional one-versus-rest approach. The proposed method deals with the equal misclassification cost and the unequal cost case in unified way. (Long version of TR 1040) From meyoung at siu.edu Thu Oct 4 09:41:57 2001 From: meyoung at siu.edu (Michael Young) Date: Thu, 4 Oct 2001 08:41:57 -0500 Subject: faculty position: Southern Illinois University Message-ID: We're looking to hire in the cognitive area, broadly defined. We welcome applicants with a modeling background, although an active empirical research program is a must. Cheers, Mike Young Chair of the Search Committee ====== Job Opening in Brain and Cognitive Sciences (BCS) We are seeking candidates with a Ph.D. in Psychology or a related field, whose research interests include memory and cognitive psychology, cognitive neuroscience, cognitive development, or social cognition. Candidates for this position will be expected to develop a research program with potential for external funding and to share responsibility for teaching basic graduate and undergraduate courses in cognitive psychology, possibly undergraduate research methods or introductory statistics, as well as courses in their own specialty area. Interest and experience with an integrated multidisciplinary approach is highly desirable. Candidates will join a group whose research interests include cognitive psychology, neuroscience, cognitive development, behavior genetics, behavioral economics, decision making, and cognitive aging (see http:/www.siu.edu/~psycho/bcs for a detailed description of BCS faculty research interests). Applicants for the position must have either demonstrated potential for (Assistant Professor level) or an established record of (Associate Professor level) excellence in teaching, publication, and externally funded research. Applicants are expected to have completed all requirements for the Ph.D. by the date of employment. If all requirements have not been completed, a one-year term appointment at the rank of Instructor will be offered. Applicants should send a cover letter with an explicit statement of research and teaching interests, a current curriculum vita, reprints, teaching evaluations (if available), and have three letters of recommendation sent to Chair, BCS Search Committee Department of Psychology, Southern Illinois University, Carbondale, IL 62901-6502. Review of applications will begin November 15, but applications will be accepted until the position is filled. Southern Illinois University is an Equal Opportunity/Affirmative Action Employer. -- Dr. Michael E. Young http://www.siu.edu/~psycho/bcs/young.html Dept. 
of Psychology 618/453-3567 271F Life Sciences II Southern Illinois University Carbondale, IL 62901-6502 From hastie at stat.stanford.edu Thu Oct 4 13:39:21 2001 From: hastie at stat.stanford.edu (Trevor Hastie) Date: Thu, 4 Oct 2001 10:39:21 -0700 Subject: Book announcement: Elements of Statistical Learning Message-ID: <004c01c14cfb$81b74b20$d6bb42ab@GIBBERS> Book announcement: The Elements of Statistical Learning -data mining, inference and prediction 536p (in full color) Trevor Hastie, Robert Tibshirani, and Jerome Friedman Springer-Verlag, 2001 For more details visit our book homepage: http://www-stat.stanford.edu/ElemStatLearn To buy this book: Springer: http://www.springer-ny.com/detail.tpl?isbn=3D0387952845&cart=3D10022167731259632 Amazon:http://www.amazon.com/exec/obidos/ASIN/0387952845/o/qid%3D994019007/sr%3D2-2/ref%3Daps%5Fsr%5Fb%5F1%5F2/107-4101918-6486124 Barnes&Noble: http://shop.barnesandnoble.com/booksearch/isbnInquiry.asp?userid=3D6B0UGX3JWY&mscssid=3DKSW8Q7J9FHV78HC3E4UM2UF3KK9H4E33&isbn=3D0387952845 Here is a brief description: During the past decade there has been an explosion in computation and information technology. With it has come vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of Statistics, and spawned new areas such as data mining, machine learning and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It should be a valuable resource for statisticians and anyone interested in data-mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees and boosting --- the first comprehensive treatment of this topic in any book. Jerome Friedman, Trevor Hastie, and Robert Tibshirani are Professors of Statistics at Stanford University. They are prominent researchers in this area: Friedman is the (co-)inventor of many data-mining tools including CART, MARS, and projection pursuit. Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie wrote much of the statistical modelling software in S-PLUS, and invented principal curves and surfaces. Tibshirani proposed the Lasso and co-wrote the best selling book ``An Introduction to the Bootstrap''. 
-------------------------------------------------------------------- Trevor Hastie hastie at stat.stanford.edu Professor, Department of Statistics, Stanford University Phone: (650) 725-2231 (Statistics) Fax: (650) 725-8977 (650) 498-5233 (Biostatistics) Fax: (650) 725-6951 URL: http://www-stat.stanford.edu/~hastie address: room 104, Department of Statistics, Sequoia Hall 390 Serra Mall, Stanford University, CA 94305-4065 -------------------------------------------------------------------- From karen at research.nj.nec.com Thu Oct 4 09:18:34 2001 From: karen at research.nj.nec.com (Karen Hahn) Date: Thu, 4 Oct 2001 09:18:34 -0400 Subject: Research Scientist positions at NEC Message-ID: <453644D81568434F85CF1EC0E991E01C4A3848@exchange.nj.nec.com> The NEC Research Institute (NECI), founded twelve years ago, has as its mission basic research in Computer Science and Physical Sciences underlying future technologies relevant to NEC. The Institute has research programs in theory, machine learning, computer vision, computational linguistics, web characterization and applications, bioinformatics, as well as research activities in Physical Sciences. NECI is soliciting applications for full time Research Scientists in Computer Science. In addition to enhancing its current research efforts, NECI also plans to establish a group in distributed systems. Although these areas are preferred, NECI will consider exceptional applicants from other areas of computer science. NECI offers unique and unusual opportunities to its scientists including great freedom in deciding basic research directions and projects; budgets for research, travel, equipment, and support staff that are directly controlled by principal researchers; and publication of all research results in the open literature. The Institute's laboratories are state-of-the-art and include several high-end parallel compute servers. NECI has close ties with outstanding research universities in and outside the Princeton area and with NEC's Central Research Laboratory (CRL) in Japan. Collaborations with university and CRL research groups are encouraged. Full applications should include resumes, copies of selected publications, names of at least three references, and a two-page statement of proposed research directions. Applications will be reviewed beginning January 1, 2002. NECI is an equal opportunity employer. For more details about NECI, please see http://www.neci.nj.nec.com. Please send applications or inquiries to: CS Search Committee Chair NEC Research Institute 4 Independence Way Princeton, NJ 08540 Email: compsci-candidates at research.nj.nec.com From myosioka at brain.riken.go.jp Fri Oct 5 01:54:15 2001 From: myosioka at brain.riken.go.jp (Masahiko Yoshioka) Date: Fri, 05 Oct 2001 14:54:15 +0900 Subject: Paper: Spike-timing-dependent learning rule Message-ID: <20011005145415H.myosioka@brain.riken.go.jp> Dear Connectionists, I am pleased to announce the availability of my recent paper and of one potentially related paper. Recent paper: ------------- "The spike-timing-dependent learning rule to encode spatiotemporal patterns in a network of spiking neurons" M. Yoshioka, Phys. Rev. E (in press) Available at http://arXiv.org/abs/cond-mat/0110070 (preprint) Abstract: We study associative memory neural networks based on the Hodgkin-Huxley type of spiking neurons. 
We introduce the spike-timing-dependent learning rule, in which the time window with the negative part as well as the positive part is used to describe the biologically plausible synaptic plasticity. The learning rule is applied to encode a number of periodical spatiotemporal patterns, which are successfully reproduced in the periodical firing pattern of spiking neurons in the process of memory retrieval. The global inhibition is incorporated into the model so as to induce the gamma oscillation. The occurrence of gamma oscillation turns out to give appropriate spike timings for memory retrieval of discrete type of spatiotemporal pattern. The theoretical analysis to elucidate the stationary properties of perfect retrieval state is conducted in the limit of an infinite number of neurons and shows the good agreement with the result of numerical simulations. The result of this analysis indicates that the presence of the negative and positive parts in the form of the time window contributes to reduce the size of crosstalk term, implying that the time window with the negative and positive parts is suitable to encode a number of spatiotemporal patterns. We draw some phase diagrams, in which we find various types of phase transitions with change of the intensity of global inhibition. Related paper: -------------- "Associative memory storing an extensive number of patterns based on a network of oscillators with distributed natural frequencies in the presence of external white noise" M. Yoshioka and M. Shiino, Phys. Rev. E 61, 4732 (2000) Available at http://pre.aps.org/ (subscription is required) http://xxx.lanl.gov/abs/cond-mat/9903316 (preprint) Abstract: We study associative memory based on temporal coding in which successful retrieval is realized as an entrainment in a network of simple phase oscillators with distributed natural frequencies under the influence of white noise. The memory patterns are assumed to be given by uniformly distributed random numbers on $[0,2\pi)$ so that the patterns encode the phase differences of the oscillators. To derive the macroscopic order parameter equations for the network with an extensive number of stored patterns, we introduce the effective transfer function by assuming the fixed-point equation of the form of the TAP equation, which describes the time-averaged output as a function of the effective time-averaged local field. Properties of the networks associated with synchronization phenomena for a discrete symmetric natural frequency distribution with three frequency components are studied based on the order parameter equations, and are shown to be in good agreement with the results of numerical simulations. Two types of retrieval states are found to occur with respect to the degree of synchronization, when the size of the width of the natural frequency distribution is changed. Regards, Masahiko Yoshioka Brain Science Institute, RIKEN Hirosawa 2-1, Wako-shi, Saitama, 351-0198, Japan From ken at phy.ucsf.edu Fri Oct 5 20:42:02 2001 From: ken at phy.ucsf.edu (Ken Miller) Date: Fri, 5 Oct 2001 17:42:02 -0700 Subject: Paper available: Model of visual cortical responses Message-ID: <15294.21338.975363.928535@coltrane.ucsf.edu> A preprint of the following paper is now available from ftp://ftp.keck.ucsf.edu/pub/ken/lauritzen_etal.pdf or from http://www.keck.ucsf.edu/~ken (click on "Publications", then on "Models of Neuronal Integration and Circuitry") Lauritzen, T.Z., A.E. Krukowski and K.D. Miller (2001). 
"Local correlation-based circuitry can account for responses to multi-grating stimuli in a model of cat V1." In press, Journal of Neurophysiology. Abstract: In cortical simple cells of cat striate cortex, the response to a visual stimulus of the preferred orientation is partially suppressed by simultaneous presentation of a stimulus at the orthogonal orientation, an effect known as ``cross-orientation inhibition". It has been argued that this is due to the presence of inhibitory connections between cells tuned for different orientations, but intracellular studies suggest that simple cells receive inhibitory input primarily from cells with similar orientation tuning. Furthermore, response suppression can be elicited by a variety of non-preferred stimuli at all orientations. Here we study a model circuit that was presented previously to address many aspects of simple cell orientation tuning, which is based on local intracortical connectivity between cells of similar orientation tuning. We show that this model circuit can account for many aspects of cross-orientation inhibition and, more generally, of response suppression by non-preferred stimuli and of other non-linear properties of responses to stimulation with multiple gratings. Ken Kenneth D. Miller telephone: (415) 476-8217 Associate Professor fax: (415) 476-4929 Dept. of Physiology, UCSF internet: ken at phy.ucsf.edu 513 Parnassus www: http://www.keck.ucsf.edu/~ken San Francisco, CA 94143-0444 From dayan at gatsby.ucl.ac.uk Mon Oct 8 08:39:07 2001 From: dayan at gatsby.ucl.ac.uk (Peter Dayan) Date: Mon, 8 Oct 2001 13:39:07 +0100 (BST) Subject: Gatsby Unit research positions Message-ID: <200110081239.NAA06852@flies.gatsby.ucl.ac.uk> Gatsby Computational Neuroscience Unit http://www.gatsby.ucl.ac.uk/ Post-doctoral and PhD Research Positions Computational Neuroscience The Gatsby Computational Neuroscience Unit invites applications for PhD studentships and post-doctoral research positions. Members of the unit are interested in models of all aspects of brain function, especially unsupervised learning, reinforcement learning, neural dynamics, population coding and computational motor control. There is the opportunity to conduct psychophysical experiments in motor control. The Unit also has active interests in more general aspects of machine learning. For further details please see: http://www.gatsby.ucl.ac.uk/research.html The Gatsby Unit provides a unique opportunity for a critical mass of theoreticians to interact closely with each other and with University College's other world class research groups including Anatomy, Computer Science, Functional Imaging Laboratory, Physics, Physiology, Psychology, Neurology, Ophthalmology, and Statistics. The unit has excellent computational facilities, and laboratory facilities for theoretically motivated experimental studies. The unit's visitor and seminar programmes enable its staff and students to interact with leading researchers from across the world. Candidates should have a strong analytical background and a keen interest in neuroscience. Competitive salaries and studentships are available. 
Applicants should send in plain text format a CV (PhD applicants should include details of course work and grades), a statement of research interests, and names and addresses of three referees to admin at gatsby.ucl.ac.uk (email preferred) or to The Gatsby Computational Neuroscience Unit University College London Alexandra House 17 Queen Square LONDON WC1N 3AR UK ** Closing date for applications: 9th November 2001 ** From jordan at CS.Berkeley.EDU Mon Oct 8 17:33:25 2001 From: jordan at CS.Berkeley.EDU (Michael Jordan) Date: Mon, 8 Oct 2001 14:33:25 -0700 (PDT) Subject: letter of resignation from Machine Learning journal Message-ID: Dear colleagues in machine learning, The forty people whose names appear below have resigned from the Editorial Board of the Machine Learning Journal (MLJ). We would like to make our resignations public, to explain the rationale for our action, and to indicate some of the implications that we see for members of the machine learning community worldwide. The machine learning community has come of age during a period of enormous change in the way that research publications are circulated. Fifteen years ago research papers did not circulate easily, and as with other research communities we were fortunate that a viable commercial publishing model was in place so that the fledgling MLJ could begin to circulate. The needs of the community, principally those of seeing our published papers circulate as widely and rapidly as possible, and the business model of commercial publishers were in harmony. Times have changed. Articles now circulate easily via the Internet, but unfortunately MLJ publications are under restricted access. Universities and research centers can pay a yearly fee of $1050 US to obtain unrestricted access to MLJ articles (and individuals can pay $120 US). While these fees provide access for institutions and individuals who can afford them, we feel that they also have the effect of limiting contact between the current machine learning community and the potentially much larger community of researchers worldwide whose participation in our field should be the fruit of the modern Internet. None of the revenue stream from the journal makes its way back to authors, and in this context authors should expect a particularly favorable return on their intellectual contribution---they should expect a service that maximizes the distribution of their work. We see little benefit accruing to our community from a mechanism that ensures revenue for a third party by restricting the communication channel between authors and readers. In the spring of 2000, a new journal, the Journal of Machine Learning Research (JMLR), was created, based on a new vision of the journal publication process in which the editorial board and authors retain significant control over the journal's content and distribution. Articles published in JMLR are available freely, without limits and without conditions, at the journal's website, http://www.jmlr.org. The content and format of the website are entirely controlled by the editorial board, which also serves its traditional function of ensuring rigorous peer review of journal articles. Finally, the journal is also published in a hardcopy version by MIT Press. Authors retain the copyright for the articles that they publish in JMLR. 
The following paragraph is taken from the agreement that every author signs with JMLR (see www.jmlr.org/forms/agreement.pdf): You [the author] retain copyright to your article, subject only to the specific rights given to MIT Press and to the Sponsor [the editorial board] in the following paragraphs. By retaining your copyright, you are reserving for yourself among other things unlimited rights of electronic distribution, and the right to license your work to other publishers, once the article has been published in JMLR by MIT Press and the Sponsor [the editorial board]. After first publication, your only obligation is to ensure that appropriate first publication credit is given to JMLR and MIT Press. We think that many will agree that this is an agreement that is reflective of the modern Internet, and is appealing in its recognition of the rights of authors to distribute their work as widely as possible. In particular, authors can leave copies of their JMLR articles on their own homepage. Over the years the editorial board of MLJ has expanded to encompass all of the various perspectives on the machine learning field, and the editorial board's efforts in this regard have contributed greatly to the sense of intellectual unity and community that many of us feel. We believe, however, that there is much more to achieve, and that our further growth and further impact will be enormously enhanced if via our flagship journal we are able to communicate more freely, easily, and universally. Our action is not unprecedented. As documented at the Scholarly Publishing and Academic Resources Coalition (SPARC) website, http://www.arl.org/sparc, there are many areas in science where researchers are moving to low-cost publication alternatives. One salient example is the case of the journal "Logic Programming". In 1999, the editors and editorial advisors of this journal resigned to join "Theory and Practice of Logic Programming", a Cambridge University Press journal that encourages electronic dissemination of papers. In summary, our resignation from the editorial board of MLJ reflects our belief that journals should principally serve the needs of the intellectual community, in particular by providing the immediate and universal access to journal articles that modern technology supports, and doing so at a cost that excludes no one. We are excited about JMLR, which provides this access and does so unconditionally. We feel that JMLR provides an ideal vehicle to support the near-term and long-term evolution of the field of machine learning and to serve as the flagship journal for the field. We invite all of the members of the community to submit their articles to the journal and to contribute actively to its growth. 
Sincerely yours, Chris Atkeson Peter Bartlett Andrew Barto Jonathan Baxter Yoshua Bengio Kristin Bennett Chris Bishop Justin Boyan Carla Brodley Claire Cardie William Cohen Peter Dayan Tom Dietterich Jerome Friedman Nir Friedman Zoubin Ghahramani David Heckerman Geoffrey Hinton Haym Hirsh Tommi Jaakkola Michael Jordan Leslie Kaelbling Daphne Koller John Lafferty Sridhar Mahadevan Marina Meila Andrew McCallum Tom Mitchell Stuart Russell Lawrence Saul Bernhard Schoelkopf John Shawe-Taylor Yoram Singer Satinder Singh Padhraic Smyth Richard Sutton Sebastian Thrun Manfred Warmuth Chris Williams Robert Williamson From becker at meitner.psychology.mcmaster.ca Mon Oct 8 21:15:54 2001 From: becker at meitner.psychology.mcmaster.ca (Sue Becker) Date: Mon, 8 Oct 2001 21:15:54 -0400 (EDT) Subject: NIPS*2001 student travel awards Message-ID: Dear connectionists, This is to let you know that the deadline for applications for student travel awards to attend the NIPS*2001 meeting in Vancouver is Friday October 12, and the application form is now available on the NIPS web page, at www.cs.cmu.edu/Web/Groups/NIPS/ These awards cover travel only. Graduate students interested in doing volunteer work at the meeting in exchange for a registration fee waiver should send email to Sid Fels ssfels at ece.ubc.ca. Only a limited number of openings are available. cheers, Sue Becker NIPS*2001 Program Chair From giro-ci0 at wpmail.paisley.ac.uk Tue Oct 9 12:35:23 2001 From: giro-ci0 at wpmail.paisley.ac.uk (Mark Girolami) Date: Tue, 09 Oct 2001 17:35:23 +0100 Subject: Two Post-Doctoral Research Assistantships Available Message-ID: Two Post-Doctoral Research Assistantships Available Division of Computing and Information Systems University of Paisley A research project which is to be funded by the Engineering and Physical Sciences Research Council (EPSRC), the Department of Trade and Industry (DTI), and industrial partners, to a total value of over 530K, will be conducted at the University of Paisley in Scotland for a period of three years. The project aims to investigate the technologies required in software systems which will be able to provide effective detection and subsequent analysis of fraudulent activity within the general framework required of emerging fixed and mobile telecommunications applications such as electronic and mobile commerce. Two postdoctoral positions are now available to investigate the application of machine learning and advanced data mining methods in the detection and analysis of anomalous and possibly fraudulent usage of fixed and mobile telecommunications applications such as electronic and mobile commerce. The project will involve the design and implementation of novel algorithms and systems to both discover and analyse emerging patterns of anomalous telecommunication system user activity. Highly motivated candidates who have a publication record in, ideally, machine learning, data mining or artificial intelligence applications are encouraged to apply. Applicants should have, or shortly expect to obtain, a PhD in Computer Science. State-of-the-art computer hardware and software will be made available to the selected candidates, as will ample funding for travel to international conferences and meetings. Salaries will be on the R1A scale, starting at 20,066pa to 27,550pa. The Applied Computational Intelligence Research Unit (ACIRU) is a young, ambitious and growing interdisciplinary research group within the University of Paisley. 
Within Scotland ACIRU have active and funded research collaborations with the University of Edinburgh, University of Stirling (http://www.cn.stir.ac.uk/incite/), the University of Glasgow and the University of Strathclyde and it forms part of a rich network of research establishments within which to work. For further information and informal enquiries please contact Mark Girolami (mark.girolami at paisley.ac.uk, http://cis.paisley.ac.uk/giro-ci0) in the first instance. EPSRC & DTI Project Data mining Tools for Fraud Detection in M-Commerce * DETECTOR http://cis.paisley.ac.uk/giro-ci0/projects.html Abstract: The effective detection and subsequent analysis of the types of fraudulent activity which occur within telecommunications systems is varied and changes with the emergence of new technologies and new forms of commercial activity (e&m-commerce). The dynamic nature of fraudulent activity as well as the dynamic and changing nature of normal usage of a service has rendered the detection of fraudulent intent from observed behavioural patterns a research problem of some considerable difficulty. It is proposed that a common theoretical probabilistic framework be employed in the development of dynamic behavioural models which combine a number of prototypical behavioural aspects to define a model of acceptable behaviour (e.g. usage of a mobile phone, web-browsing patterns) from which inferences of the probability of abnormal behaviour can be made. In addition to these inferential models a means of visualising the observed behaviour and the intentions behind it (based on call records, web activity, or purchasing patterns) will significantly aid the pattern recognition abilities of human fraud analysts. Employing the common probabilistic modelling framework which defines the 'fraud detection' models visualisation tools will be developed to provide meaningful visual representations of dynamic activity which has been observed and visualisations of the evolution of the underlying states (or user intentions) generating the observed activity. The development of detection & analysis tools from the common theoretical framework will provide enhanced detection and analysis capability in the identification of fraud. M.A.Girolami School of Communication and Information Technologies University of Paisley High Street Paisley, PA1 2BE Scotland, UK Tel: +44 - 141 - 848 3317 Fax: +44 - 141 - 848 3542 Email: mark.girolami at paisley.ac.uk Legal disclaimer -------------------------- The information transmitted is the property of the University of Paisley and is intended only for the person or entity to which it is addressed and may contain confidential and/or privileged material. Statements and opinions expressed in this e-mail may not represent those of the company. Any review, retransmission, dissemination and other use of, or taking of any action in reliance upon, this information by persons or entities other than the intended recipient is prohibited. If you received this in error, please contact the sender immediately and delete the material from any computer. -------------------------- From dld at mail.csse.monash.edu.au Wed Oct 10 01:49:28 2001 From: dld at mail.csse.monash.edu.au (David L Dowe) Date: Wed, 10 Oct 2001 15:49:28 +1000 (EST) Subject: Research Fellows in Machine Learning at Monash University Message-ID: <200110100549.f9A5nSG27946@bruce.csse.monash.edu.au> Dear all, Apologies for any cross-posting. Below is an advertisement for Research Fellows in Information Technology at Monash University. 
If you are interested in machine learning and machine learning by Minimum Message Length (MML), then I warmly invite you to read the advertisement below for Research Fellows, for which the application deadline is Friday 19th October. David Dowe. http://www.csse.monash.edu.au/~dld/MML.html =========================================================================== Research Fellows in Information Technology ------------------------------------------ As part of its goal to cement its position as Australia's premier academic institution for information technology research, the Faculty of Information Technology of Monash University has established the IT Research Fellowship Scheme to attract outstanding early career researchers in any field of information technology to Monash University. The positions are for up to 3 years and the salary is in the Research Fellow Level B range $50,847 - $60,382 per annum depending on experience. In addition Fellowship holders are eligible for a Research Support Grants which can be up to $10,000 per annum depending on the needs of the research program. Return airfares may be available for successful interstate/overseas candidates and their dependants. The Faculty of Information Technology was created in 1990. It is Australia's largest faculty exclusively devoted to information technology with 190 academics. It has an enviable research reputation in virtually all aspects of information technology and produces more research postgraduates than any other Australian university. Research in the Faculty is centred around 8 research groups which cover: distributed systems, mobile computing and software engineering; artificial intelligence and operations research techniques for intelligent decision support; information systems and information management; digital communications and multimedia; and computer education. Location: Appointees may be based at Clayton, Caulfield, Peninsula, Gippsland or Berwick campuses. Contact: Further information and particulars of the application procedure may be obtained from A/Prof. Kim Marriott, Associate Dean of Research, Faculty of Information Technology telephone +61 3 9905 5525, facsimile +61 3 9905 5146, e-mail: adr at infotech.monash.edu.au Applications: Ms M Jones-Roberts, Manager, Research Services, Faculty of Information Technology, P.O. Box 197, Caulfield East, Vic 3145 by 19/10/01. Quote Ref No. A013055 and include a completed application form. The position description, selection criteria, background information and application form are available at http://www.infotech.monash.edu.au/sta_pv_research.html From bap at cs.unm.edu Wed Oct 10 18:07:34 2001 From: bap at cs.unm.edu (Barak Pearlmutter) Date: Wed, 10 Oct 2001 16:07:34 -0600 (MDT) Subject: NIPS*2001 Workshops Announcement Message-ID: * * * Post-NIPS*2001 Workshops * * * * * * Whistler, BC, CANADA * * * * * * December 7-8, 2001 * * * The NIPS*2001 Workshops will be on Friday and Saturday, December 7/8, in Whistler, BC, Canada, following the main NIPS conference in Vancouver Monday-Thursday, December 3-6. 
This year there are 19 workshops: Activity-Dependent Synaptic Plasticity Artificial Neural Networks in Safety-Related Areas Brain-Computer Interfaces Causal Learning and Inference in Humans & Machines Competition: Unlabeled Data for Supervised Learning Computational Neuropsychology Geometric Methods in Learning Information & Statistical Structure in Spike Trains Kernel-Based Learning Knowledge Representation in Meta-Learning Machine Learning in Bioinformatics Machine Learning Methods for Images and Text Minimum Description Length Multi-sensory Perception & Learning Neuroimaging: Tools, Methods & Modeling Occam's Razor & Parsimony in Learning Preference Elicitation Quantum Neural Computing Variable & Feature Selection Some workshops span both days, while others will be only one day long. One-day workshops will be assigned to friday or saturday by October 14. Please check the web page after this time for individual dates. All workshops are open to all registered attendees. Many workshops also invite submissions. Submissions, and questions about individual workshops, should be directed to the individual workshop organizers. Included below is a short description of most of the workshops. Additional information (including web pages for the individual workshops) is available at the NIPS*2001 Web page: http://www.cs.cmu.edu/Groups/NIPS/ Information about registration, travel, and accommodations for the main conference and the workshops is also available there. Whistler is a ski resort a few hours drive from Vancouver. The daily workshop schedule is designed to allow participants to ski half days, or enjoy other extra-curricular activities. Some may wish to extend their visit to take advantage of the relatively low pre-season rates. We look forward to seeing you in Whistler. Virginia de Sa and Barak Pearlmutter NIPS Workshops Co-chairs ------------------------------------------------------------------------- Activity-dependent Synaptic Plasticity Paul Munro, Larry Abbott http://www.pitt.edu/~pwm/plasticity While the mathematical and cognitive aspects of rate-based Hebb-like rules have been broadly explored, relatively little is known about the possible role of STDP at the computational level. Hebbian learning in neural networks requires both correlation-based synaptic plasticity and a mechanism that induces competition between different synapses. Spike-timing-dependent synaptic plasticity is especially interesting because it combines both of these elements in a single synaptic modification rule. Some recent work has examined the possibility that STDP may underlie older models, such as Hopfield networks or the BCM rule. Temporally dependent synaptic plasticity is attracting a rapidly growing amount of attention in the computational neuroscience community. The change in synaptic efficacy arising from this form of plasticity is highly sensitive to temporal correlations between different presynaptic spike trains. Furthermore, it can generate asymmetric and directionally selective receptive fields, a result supported by experiments on experience-dependent modifications of hippocampal place fields. Finally, spike-timing-dependent plasticity automatically balances excitation and inhibition producing a state in which neuronal responses are rapid but highly variable. The major goals of the workshop are: 1. To review current experimental results on spike-timing-dependent synaptic plasticity and related effects. 2. To discuss models and mechanisms for this form of synaptic plasticity. 3. 
To explore the relationship of STDP with other approaches. 4. To reconcile the rate-based and spike-based plasticity data with a unified theoretical framework (very optimistic!). ------------------------------------------------------------------------- Artificial Neural Networks in Safety-Related Areas: Applications and Methods for Validation and Certification J. Schumann, P. Lisboa, R. Knaus http://ase.arc.nasa.gov/people/schumann/workshops/NIPS2001 Over recent years, Artificial Neural Networks have found their way into various safety-related and safety-critical areas, for example, power generation and transmission, transportation, avionics, environmental monitoring and control, medical applications, and consumer products. Applications range from classification to monitoring and control. Quite often, these applications proved to be highly successful, leading from pure research prototypes into serious experimental systems (e.g., a neural-network-based flight-control system test-flown on a NASA F-15ACTIVE) or commercial products (e.g., Sharp's Logi-cook). However, the general question of how to make sure that the ANN-based system performs as expected in all cases has not yet been addressed satisfactorily. All safety-related software applications require careful verification and validation (V&V) of the software components, ranging from extended testing to full-fledged certification procedures. However, for neural-network-based systems, a number of specific issues have to be addressed. For example, the lack of a concise plant model, often a major reason to use an ANN in the first place, makes traditional approaches to V&V impossible. In this workshop, we will address such issues. In particular, we will discuss the following (non-exhaustive) list of topics: * theoretical methodologies to characterise the properties of ANN solutions, e.g., multiple realisations of a particular network and ways of managing this * fundamental software approaches to V&V and implications for ANNs, e.g., the application of FMEA * statistical (Bayesian) methods and symbolic techniques like rule extraction with subsequent V&V to assess and guarantee the performance of an ANN * dynamic monitoring of the ANN's behavior * stability proofs for control of dynamical systems with ANNs * principled approaches to design assurance, risk assessment, and performance evaluation of systems with ANNs * experience of application and certification of ANNs for safety-related applications * V&V techniques suitable for on-line trained and adaptive systems This workshop aims to bring together researchers who have applied ANNs in safety-related areas and actually addressed questions of demonstrating flawless operation of the ANN, researchers working on theoretical topics of convergence and performance assessment, researchers in the area of nonlinear adaptive control, and researchers from the area of formal methods in software design for safety-critical systems. Many prototypical/experimental applications of neural networks in safety-related areas have demonstrated their usefulness successfully. But ANN applicability in safety-critical areas is substantially limited because of a lack of methods and techniques for verification and validation. Currently, there is no silver bullet for V&V in traditional software, and with the more complicated situation for ANNs, none is expected here in the short run. However, any result can have substantial impact in this field.
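One of the topics listed above, dynamic monitoring of the ANN's behavior, can be made concrete with a minimal sketch in Python. Everything below (the monitor class, the threshold, the toy network) is a hypothetical illustration under assumed names, not part of any system mentioned in the workshop description: the idea is simply to flag inputs that fall far outside the envelope of the training data so that a supervisory layer can fall back to a conservative default.

# Minimal sketch of runtime input-envelope monitoring for a deployed network.
# All names and thresholds are illustrative assumptions, not from any cited system.
import numpy as np

class InputEnvelopeMonitor:
    def __init__(self, training_inputs, z_threshold=4.0):
        x = np.asarray(training_inputs, dtype=float)
        self.mean = x.mean(axis=0)
        self.std = x.std(axis=0) + 1e-12   # avoid division by zero
        self.z_threshold = z_threshold

    def is_novel(self, x):
        """True if any feature lies far outside the training envelope."""
        z = np.abs((np.asarray(x, dtype=float) - self.mean) / self.std)
        return bool(np.max(z) > self.z_threshold)

def safe_predict(network, monitor, x, fallback):
    """Use the network only when the input looks familiar; otherwise fall back."""
    if monitor.is_novel(x):
        return fallback                    # e.g. a conservative default action
    return network(x)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    train = rng.normal(size=(1000, 3))
    monitor = InputEnvelopeMonitor(train)
    net = lambda x: float(np.sum(x))       # stand-in for a trained ANN
    print(safe_predict(net, monitor, [0.1, -0.2, 0.3], fallback=0.0))   # in range
    print(safe_predict(net, monitor, [9.0, 0.0, 0.0], fallback=0.0))    # flagged

A per-feature z-score is of course the crudest possible envelope check; the statistical (Bayesian) and rule-extraction approaches listed above would replace it with a proper density or predictive-uncertainty estimate.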
------------------------------------------------------------------------- Brain-Computer Interfaces Lucas Parra, Paul Sajda, Klaus-Robert Mueller http://newton.bme.columbia.edu/bci ------------------------------------------------------------------------- Causal learning and inference in humans and machines T. Griffiths, J. Tenenbaum, T. Kushnir, K. Murphy, A. Gopnik http://www-psych.stanford.edu/~jbt/causal-workshop.html The topic of causality has recently leapt to the forefront of theorizing in the fields of cognitive science, statistics, and artificial intelligence. The main objective of this workshop is to explore the potential connections between research on causality in these three fields. There has already been much productive cross-fertilization: the development of causal Bayes nets in the AI community has often had a strong psychological motivation, and recent work by several groups in cognitive science has shown that some elementary but important aspects of how people learn and reason about causes may be best explained by theories based on causal Bayes nets. Yet the most important questions lie wide open. Some examples of the questions we hope to address in this workshop include: * Can we scale up Bayes-net models of human causal learning and inference from microdomains with one or two causes and effects to more realistic large-scale domains? * What would constitute strong empirical tests of large-scale Bayes net models of human causal reasoning? * Do approximation methods for inference and learning on large Bayes nets have anything to do with human cognitive processes? * What are the relative roles of passive observation and active manipulation in causal learning? * What is the relation between psychological and computational notions of causal independence? The workshop will last one day. Most of the talks will be invited, but we welcome contributions for short talks by researchers in AI, statistics or cognitive science who would like to make connections between these fields. Please contact one of the organizers if you are interested in participating. For more information contact Josh Tenenbaum (jbt at psych.stanford.edu) or Alison Gopnik (gopnik at socrates.berkeley.edu). ------------------------------------------------------------------------- Competition: Unlabeled Data for Supervised Learning Stefan C. Kremer, Deborah A. Stacey http://q.cis.uoguelph.ca/~skremer/NIPS2001/ Recently, there has been much interest in applying techniques that incorporate knowledge from unlabeled data into systems performing supervised learning. The potential advantages of such techniques are obvious in domains where labeled data is expensive and unlabeled data is cheap. Many such techniques have been proposed, but only recently has any effort been made to compare the effectiveness of different approaches on real world problems. This web-site presents a challenge to the proponents of methods to incorporate unlabeled data into supervised learning. Can you really use unlabeled data to help train a supervised classification (or regression) system? Do recent (and not so recent) theories stand up to the data test? On this web-site you can find challenge problems where you can try out your methods head-to-head against anyone brave enough to face you. Then, at the end of the contest we will release the results and find out who really knows something about using unlabeled data, and whether unlabeled data are really useful or we are all just wasting our time.
So ask yourself, are you (and your theory) up to the challenge?? Feeling lucky??? ------------------------------------------------------------------------- Computational Neuropsychology Sara Solla, Michael Mozer, Martha Farah http://www.cs.colorado.edu/~mozer/nips2001workshop.html The 1980s saw two important developments in the sciences of the mind: the development of neural network models in cognitive psychology, and the rise of cognitive neuroscience. In the 1990s, these two separate approaches converged, and one of the results was a new field that we call "Computational Neuropsychology." In contrast to traditional cognitive neuropsychology, computational neuropsychology uses the concepts and methods of computational modeling to infer the normal cognitive architecture from the behavior of brain-damaged patients. In contrast to traditional neural network modeling in psychology, computational neuropsychology derives constraints on network architectures and dynamics from functional neuroanatomy and neurophysiology. Unfortunately, work in computational neuropsychology has had relatively little contact with the Neural Information Processing Systems (NIPS) community. Our workshop aims to expose the NIPS community to the unusual patient cases in neuropsychology and the sorts of inferences that can be drawn from these patients based on computational models, and to expose researchers in computational neuropsychology to some of the more sophisticated modeling techniques and concepts that have emerged from the NIPS community in recent years. We are interested in speakers from all aspects of neuropsychology, including: * attention (neglect) * visual and auditory perception (agnosia) * reading (acquired dyslexia) * face recognition (prosopagnosia) * memory (Alzheimer's, amnesia, category-specific deficits) * language (aphasia) * executive function (schizophrenia, frontal deficits). Contact Sara Solla (solla at nwu.edu) or Mike Mozer (mozer at colorado.edu) if you are interested in speaking at the workshop. ------------------------------------------------------------------------- Geometric Methods in Learning workshop Amir Assadi http://www.lmcg.wisc.edu/bioCVG/events/NIPS2001/NIPS2001Wkshp.htm http://www.lmcg.wisc.edu/bioCVG The purpose of this workshop is to attract the attention of the learning community to geometric methods and to take on an endeavor: 1. To lay out a geometric paradigm for formulating profound ideas in learning; 2. To facilitate the development of geometric methods suitable for the investigation of new ideas in learning theory. Today's continuing advances in computation make it possible to infuse geometric ideas into learning that otherwise would have been computationally prohibitive. Because they overlap greatly, geometry and nonlinear dynamics together offer a complementary and more profound picture of the physical world and how it interacts with the brain, the ultimate learning system. Nonlinear dynamics in brain-like complex systems has created great excitement, offering a broad spectrum of new ideas for discovery of parallel-distributed algorithms, a hallmark of learning theory.
Among the discussion topics, we envision the following: information geometry, differential topological methods for turning local estimates into global quantities and invariants, Riemannian geometry and Feynman path integration as a framework to explore nonlinearity, advances in complex dynamical systems theory in the context of learning and dynamic information processing in the brain, and information theory of massive data sets. As before, in our discussion sessions we will also examine the potential impact of learning theory on the future development of geometry, and report on new examples of the impact of learning-theoretic parallel-distributed algorithms on research in mathematics. With three years of meetings behind us, we are in a position to plan a volume based on the materials for the workshops and other contributions, to be proposed to the NIPS Program Committee. ------------------------------------------------------------------------- Information and Statistical Structure in Spike Trains Jonathon D. Victor http://www-users.med.cornell.edu/~jdvicto/nips2001.html Understanding how neurons represent and manipulate information in their spike trains is one of the major fundamental problems in neuroscience. Moreover, advances towards its solution will rely on a combination of appropriate theoretical, computational, and experimental strategies. Meaningful and reliable statistical analyses, including calculation of information and related quantities, are at the basis of understanding neural information processing. The accuracy and precision of statistical analyses and empirical information estimates depend strongly on the amount and quality of the data available, and on the assumptions that are made in order to apply the formalisms to a laboratory data set. These assumptions typically relate to the neural transduction itself (e.g., linearity or stationarity) and to the statistics of the spike trains (e.g., correlation structure). There are numerous approaches to conducting statistical analyses and estimating information-theoretic quantities, and there are also some major differences in findings across preparations. It is unclear to what extent these differences represent fundamental biological differences, differences in what is being measured, or methodological biases. Specific areas of focus will include: Theoretical and experimental approaches to analyze multineuronal spiking activity; Bursting, rhythms, and other endogenous patterns; Is "Poisson-like" a reasonable approximation to spike train stochastic structure?; How do we formulate alternative models to Poisson?; How do we evaluate model goodness-of-fit? A limited number of slots are available for contributed presentations. Individuals interested in presenting a talk (approximately 20 minutes, with 10 to 20 minutes for discussion) should submit a title and abstract, 200-300 words, to the organizers, Jonathan D. Victor (jdvicto at med.cornell.edu) and Emery Brown (brown at neurostat.mgh.harvard.edu) by October 12, 2001. ------------------------------------------------------------------------- Workshop on New Directions in Kernel-Based Learning Methods Chris Williams, Craig Saunders, Matthias Seeger, John Shawe-Taylor http://www.cs.rhul.ac.uk/colt/nipskernel.html The aim of the workshop is to present new perspectives and new directions in kernel methods for machine learning. Recent theoretical advances and experimental results have drawn considerable attention to the use of kernel functions in learning systems.
Support Vector Machines, Gaussian Processes, kernel PCA, kernel Gram-Schmidt, Bayes Point Machines, and Relevance and Leverage Vector Machines are just some of the algorithms that make crucial use of kernels for problems of classification, regression, density estimation, novelty detection and clustering. At the same time as these algorithms have been under development, novel techniques specifically designed for kernel-based systems have resulted in methods for assessing generalisation, implementing model selection, and analysing performance. The choice of model may be simply determined by parameters of the kernel, as for example the width of a Gaussian kernel. More recently, however, methods for designing and combining kernels have created a toolkit of options for choosing a kernel in a particular application. These methods have extended the applicability of the techniques beyond the natural Euclidean spaces to more general discrete structures. The workshop will provide a forum for discussing results and problems in any of the above mentioned areas. But more importantly, by the structure of the workshop we hope to examine the future directions and new perspectives that will keep the field lively and growing. We seek two types of contributions: 1) Contributed 20-minute talks that offer new directions (serving as a focal point for the general discussions) 2) Posters of new ongoing work, with associated spotlight presentations (summarising current work and serving as a springboard for individual discussion). Important Dates: Submission of extended abstracts: 15th October 2001. Notification of acceptance: Early November. Submission Procedure: Extended abstracts in .ps or .pdf formats (only) should be e-mailed to nips-kernel-workshop at cs.rhul.ac.uk ------------------------------------------------------------------------- Knowledge Representation In Meta-Learning Ricardo Vilalta http://www.research.ibm.com/MetaLearning Learning across multiple related tasks, or improving learning performance over time, requires that knowledge be transferred across tasks. In many classification algorithms, successive applications of the algorithm over the same data always produce the same hypothesis; no knowledge is extracted across tasks. Knowledge across tasks can be used to construct meta-learners able to improve the quality of the inductive bias through experience. To attain this goal, different pieces of knowledge are needed. For example, how can we characterize those tasks that are most favorable to a particular classification algorithm? On the other hand, what forms of bias are most favorable for certain tasks? Are there invariant transformations inherent to a domain that can be captured when learning across tasks? The goal of the workshop is to discuss alternative ways of knowledge representation in meta-learning with the idea of achieving new forms of bias adaptation. Important Dates: Paper submission: Nov 1, 2001. Notification of acceptance: Nov 12, 2001. Camera-ready copy: Nov 26, 2001. ------------------------------------------------------------------------- Machine Learning Techniques for Bioinformatics Colin Campbell, Shayan Mukherjee http://lara.enm.bris.ac.uk/cig/nips01/nips01.htm There has been significant recent interest in the development of new methods for functional interpretation of gene expression data derived from cDNA microarrays and related technologies. Analysis frequently involves classification, regression, feature selection, outlier detection and cluster analysis, for example.
To provide a focus, this topic will be the main theme for this one-day Workshop, though contributions in related areas of bioinformatics are welcome. Contributed papers should ideally be in the area of new algorithmic or theoretical approaches to analysing such datasets as well as biologically interesting applications and validation of existing algorithms. To make sure the Workshop relates to issues of real importance to experimentalists there will be four invited tutorial talks to introduce microarray technology, illustrate particular case studies and discuss issues relevant to eventual clinical application. The invited speakers are Pablo Tamayo or Todd Golub (Whitehead Institute, MIT), Dan Notterman (Princeton University), Roger Bumgarner (University of Washington) and Richard Simon (National Cancer Institute). The invited speakers have been involved in the preparation of well-known datasets and studies of expression analysis for a variety of cancers. Authors wishing to contribute papers should submit a title and extended abstract to both organisers (C.Campbell at bris.ac.uk and sayan at mit.edu) before 14th October 2001. Further details about this workshop and the final schedule are available from the workshop webpage. ------------------------------------------------------------------------- Machine Learning Methods for Images and Text Thomas Hofmann, Jaz Kandola, Tomaso Poggio, John Shawe-Taylor http://www.cs.rhul.ac.uk/colt/nipstext.html The aim of the workshop is to present new perspectives and new directions in information extraction from structured and semi-structured data for machine learning. The goal of this workshop is to investigate extensions of modern statistical learning techniques for applications in the domains of categorization and retrieval of information, for example text, video and sound, as well as their combination -- multimedia. The focus will be on exploring innovative and potentially groundbreaking machine learning technologies as well as on identifying key challenges in information access, such as multi-class classification, partially labeled examples and the combination of evidence from separate multimedia domains. The workshop aims to bring together an interdisciplinary group of international researchers from machine learning, information retrieval, computational linguistics, human-computer interaction, and digital libraries for discussing results and dissemination of ideas, with the objective of highlighting new research directions. The workshop will provide a forum for discussing results and problems in any of the above mentioned areas. But more importantly, by the structure of the workshop we hope to examine the future directions and new perspectives that will keep the field lively and growing. We seek two types of contributions: 1) Contributed 20-minute talks that offer new directions (serving as a focal point for the general discussions) 2) Posters of new ongoing work, with associated spotlight presentations (summarising current work and serving as a springboard for individual discussion). Important Dates: Submission of extended abstracts: 15th October 2001. Notification of acceptance: 2nd November 2001. Submission Procedure: Extended abstracts in .ps or .pdf formats (only) should be e-mailed to nips-text-workshop at cs.rhul.ac.uk by 15th October 2001. Extended abstracts should be 2-4 sides of A4. Highlighting a conference-style group for the paper is not necessary; however, indicating a group and/or keywords would be helpful.
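As a small illustration of the "combination of evidence from separate multimedia domains" mentioned in the workshop description above, the sketch below combines per-class posteriors from a text classifier and an image classifier by a weighted average (late fusion). The classifiers, the three-class example, and the weights are hypothetical placeholders, not anything prescribed by the workshop.

# Minimal late-fusion sketch: combine class posteriors from two
# modality-specific classifiers (hypothetical placeholders) by a weighted sum.
import numpy as np

def late_fusion(posteriors_by_modality, weights=None):
    """Each element is a length-K probability vector from one modality."""
    p = np.asarray(posteriors_by_modality, dtype=float)
    if weights is None:
        weights = np.ones(len(p)) / len(p)   # equal trust in each modality
    fused = np.average(p, axis=0, weights=weights)
    return fused / fused.sum()               # renormalise to a distribution

# Example: text and image classifiers disagree over three classes.
text_posterior = [0.7, 0.2, 0.1]
image_posterior = [0.3, 0.5, 0.2]
fused = late_fusion([text_posterior, image_posterior], weights=[0.6, 0.4])
print(fused, "-> predicted class", int(np.argmax(fused)))

Weighted averaging is only the simplest combination rule; product rules, stacking, or a learned gating function over the modalities are equally natural starting points.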
------------------------------------------------------------------------- Minimum Description Length: Developments in Theory and New Applications Peter Grunwald, In-Jae Myung, Mark Pitt http://quantrm2.psy.ohio-state.edu/injae/workshop.htm Inductive inference, the process of inferring a general law from observed instances, is at the core of science. The Minimum Description Length (MDL) Principle, which was originally proposed by Jorma Rissanen in 1978 as a computable approximation of Kolmogorov complexity, is a powerful method for inductive inference. The MDL principle states that the best explanation (i.e., model) given a limited set of observed data is the one that permits the greatest compression of the data. That is, the more we are able to compress the data, the more we learn about the underlying regularities that generated the data. This conceptualization originated in algorithmic information theory from the notion that the existence of regularities underlying data necessarily implies redundancy in the information from successive observations. Since 1978, significant strides have been made in both the mathematics and application of MDL. For example, MDL is now being applied in machine learning, statistical inference, model selection, and psychological modeling. The purpose of this workshop is to bring together researchers, both theorists and practitioners, to discuss the latest developments and share new ideas. In doing so, our intent is to introduce to the broader NIPS community the current state of the art in the field. ------------------------------------------------------------------------- Multi-sensory Perception & Learning J. Fisher, L. Shams, V. de Sa, M. Slaney, T. Darrell http://www.ai.mit.edu/people/fisher/nips01/perceptwshop/description/ All perception is multi-sensory perception. Situations where animals are exposed to information from a single modality exist only in experimental settings in the laboratory. For a variety of reasons, research on perception has focused on processing within one sensory modality. Consequently, the state of knowledge about multi-sensory fusion in mammals is largely at the level of phenomenology, and the underlying mechanisms and principles are poorly understood. Recently, however, there has been a surge of interest in this topic, and this field is emerging as one of the fastest-growing areas of research in perception. Simultaneously, with the advent of low-cost, low-power multimedia sensors, there has been renewed interest in automated multi-modal data processing. Whether in an intelligent room environment, a heterogeneous sensor array or an autonomous robot, robust integrated processing of multiple modalities has the potential to solve perception problems more efficiently by leveraging complementary sensor information. The goals of this workshop are to further the understanding of both the cognitive mechanisms by which humans (and other animals) integrate multi-modal data and the means by which automated systems may similarly function. It is not our contention that one should follow the other. It is our contention that researchers in these different communities stand to gain much through interaction with each other. This workshop aims to bring these researchers together to compare methods and performance and to develop a common understanding of the underlying principles which might be used to analyze both human and machine perception of multi-modal data.
Discussions and presentations will span theory and application, as well as relevant aspects of animal/machine perception. The workshop will emphasize a moderated discussion format with short presentations prefacing each of the discussions. Please see the web page for some of the specific questions to be addressed. ------------------------------------------------------------------------- Neuroimaging: Tools, Methods & Modeling B. M. Bly, L. K. Hansen, S. J. Hanson, S. Makeig, S. Strother http://psychology.rutgers.edu/Users/ben/nips2001/nips2001workshop.html Advances in the mathematical description of neuroimaging data are currently a topic of great interest. Last June, at the 7th Annual Meeting of the Organization for Human Brain Mapping in Brighton, UK, the number of statistical modeling abstracts virtually exploded (30 abstracts were submitted on ICA alone). Because of its high relevance for researchers in statistical modeling, neuroimaging has been the topic of several NIPS workshops. Neuroinformatics is an emerging research field which, besides a rich modeling activity, is also concerned with database and data-mining issues as well as ongoing discussions of data and model sharing. Several groups now distribute statistical modeling tools, and advanced exploratory approaches are finding increasing use in neuroimaging labs. NIPS is a rich arena for multivariate and neural modeling; the intersection of neuroimaging and neural models is important for both fields. This workshop will discuss the underlying methods and software tools related to a variety of strategies for modeling and inference in neuroimaging data analysis (Morning, Day 1). Discussants will also present methods for comparison, evaluation, and meta-analysis in neuroimaging (Afternoon, Day 1). On the second day of the workshop, we will continue the discussion with a focus on multivariate strategies (Morning, Day 2). The workshop will include a discussion of hemodynamic and neural models and their role in mathematical modeling of neuroimaging data (Afternoon, Day 2). Each session of the two-day workshop will include discussion. Talks are intended to last roughly 20 minutes each, followed by 10 minutes of discussion. At the end of each day, there will be a discussion of themes by all participants, with the presenters acting as a panel. ------------------------------------------------------------------------- Foundations of Occam's razor and parsimony in learning David G. Stork http://www.rii.ricoh.com/~stork/OccamWorkshop.html "Entia non sunt multiplicanda praeter necessitatem" -- William of Occam (1285?-1349?) Occam's razor is generally interpreted as counselling the use of "simpler" models rather than complex ones, fewer parameters rather than more, and "smoother" generalizers rather than those that are less smooth. The mathematical descendants of this philosophical principle of parsimony appear in minimum-description-length, Akaike, Kolmogorov complexity and related principles, having numerous manifestations in learning, for instance regularization, pruning, and overfitting avoidance. For a given quality of fit to the training data, in the absence of other information should we favor "simpler" models, and if so, why? How do we measure simplicity, and which representation should we use when doing so? What assumptions are made -- explicitly or implicitly -- by these methods and when are such assumptions valid?
What are the minimum assumptions or conditions -- for instance that by increasing the amount of training data we will improve a classifier's performance -- that yield Occam's razor? Support Vector Machines and some neural networks contain a very large number of free parameters, more than might be permitted by the size of the training data and in seeming contradiction to Occam's razor; nevertheless, such classifiers can work exceedingly well. Why? Bayesian techniques such as ML-II reduce a classifier's complexity in a data-dependent way. Does this comport with Occam's razor? Can we characterize problems for which Occam's razor should or should not apply? Even if we abandon the search for the "true" model that generated the training data, can Occam's razor improve our chances of finding a "useful" model? It has been said that Occam's razor is either profound and true, or vacuous and false -- it just isn't clear which. Rather than address specific implementation techniques or applications, the goal of this workshop is to shed light on, and if possible resolve, the theoretical questions associated with Occam's razor, some of the deepest in the intellectual foundations of machine learning and pattern recognition. ------------------------------------------------------------------------- Quantum Neural Computing Elizabeth Behrman ------------------------------------------------------------------------- Variable and Feature Selection Isabelle Guyon, David Lewis http://www.clopinet.com/isabelle/Projects/NIPS2001/ Variable selection has recently received a lot of attention from the machine learning and neural network community because of its applications in genomics and text processing. Variable selection refers to the problem of selecting input variables that are most predictive of a given outcome. Variable selection problems are found in all machine learning tasks, supervised or unsupervised (clustering), classification, regression, time series prediction, two-class or multi-class, posing various levels of challenges. The objective of variable selection is two-fold: improving the prediction performance of the predictors and providing a better understanding of the underlying process that generated the data. This last problem is particularly important in biology, where the process may be a living organism and the variables gene expression coefficients. One of the goals of the workshop is to explore alternate statements of the problem, including: (i) discovering all the variables relevant to the concept (e.g. to identify all candidate drug targets) (ii) finding a minimum subset of variables that are useful to the predictor (e.g. to identify the best biomarkers for diagnosis or prognosis). The workshop will also be a forum to compare the best existing algorithms and to discuss the organization of a potential competition on variable selection for a future workshop. Prospective participants are invited to submit a one- or two-page summary. Theory, algorithm, and application contributions are welcome. After the workshop, the participants will be offered the possibility of submitting a full paper to a special issue of the Journal of Machine Learning Research on variable selection. Deadline for submission: October 15, 2001. Email submissions to: Isabelle Guyon at isabelle at clopinet.com.
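As one very simple instance of objective (ii) above, finding a small subset of variables useful to a predictor, the sketch below ranks input variables by the absolute correlation of each variable with the outcome and keeps the top k. The synthetic data and the plain correlation filter are illustrative assumptions only, not a method endorsed by the organizers.

# Minimal filter-style variable selection sketch: rank variables by the
# absolute Pearson correlation with the outcome and keep the top k.
# Synthetic data; a univariate correlation filter is only one of many criteria.
import numpy as np

def select_top_k(X, y, k):
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    denom = np.sqrt((Xc ** 2).sum(axis=0) * (yc ** 2).sum()) + 1e-12
    corr = np.abs(Xc.T @ yc) / denom       # |Pearson correlation| per variable
    ranked = np.argsort(corr)[::-1]
    return ranked[:k], corr

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 50))             # 50 candidate input variables
y = 3.0 * X[:, 4] - 2.0 * X[:, 17] + 0.5 * rng.normal(size=200)
top, scores = select_top_k(X, y, k=2)
print("selected variables:", sorted(top.tolist()))   # expected: [4, 17]

Such univariate filters ignore interactions between variables, which is exactly why wrapper and embedded methods, and careful comparisons among them, are part of the workshop's agenda.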
------------------------------------------------------------------------- New Methods for Preference Elicitation Craig Boutilier, Holger Hoos, David Poole (chair), Qiang Yang http://www.cs.ubc.ca/spider/poole/NIPS/Preferences2001.html As intelligent agents become more and more adept at making (or recommending) decisions for users in various domains, the need for effective methods for the representation, elicitation, and discovery of preference and utility functions becomes more pressing. Deciding on the best course of action for a user depends critically on that user's preferences. While there has been much work on representing and learning models of the world (e.g., system dynamics), there has been comparatively little similar research with respect to preferences. The need to reason about preferences arises in electronic commerce, collaborative filtering, user interface design, task-oriented mobile robotics, reinforcement learning, and many other areas. Many areas of research bring interesting tools to the table that can be used to tackle these issues: machine learning (classification, reinforcement learning), decision theory and control theory (Markov decision processes, filtering techniques), Bayesian networks and probabilistic inference, economics and game theory, among others. The aim of this workshop is to bring together a diverse group of researchers to discuss both the practical and theoretical problems associated with effective preference elicitation and to highlight avenues for future research. The deadline for extended abstracts and statements of interest is October 19. From wolfskil at MIT.EDU Wed Oct 10 13:26:17 2001 From: wolfskil at MIT.EDU (Jud Wolfskill) Date: Wed, 10 Oct 2001 13:26:17 -0400 Subject: book announcement--Marcus Message-ID: <5.0.2.1.2.20011010132201.0546ed08@po14.mit.edu> I thought readers of the Connectionist List might be interested in this book. For more information please visit http://mitpress.mit.edu/0262133792 Best, Jud The Algebraic Mind Integrating Connectionism and Cognitive Science Gary F. Marcus In The Algebraic Mind, Gary F. Marcus integrates two competing theories about how the mind works, one which says that the mind is a computer-like manipulator of symbols, and another which says that the mind is a large network of neurons working together in parallel. Refuting the conventional wisdom that says that if the mind is a large neural network it cannot simultaneously be a manipulator of symbols, Marcus shows how neural systems could be organized so as to manipulate symbols, why such systems explain language and cognition better than systems that eschew symbols, how such systems could evolve, and how they might unfold developmentally within the womb. The Algebraic Mind revamps our understanding of models in cognitive neuroscience and helps to set a new agenda for the field. Gary F. Marcus is Associate Professor of Psychology at New York University.
6 x 9, 225 pp., 48 illus., cloth ISBN 0-262-13379-2 Learning, Development, and Conceptual Change series A Bradford Book Jud Wolfskill Associate Publicist MIT Press 5 Cambridge Center, 4th Floor Cambridge, MA 02142 617.253.2079 617.253.1709 fax wolfskil at mit.edu From zemel at cs.toronto.edu Wed Oct 10 19:59:58 2001 From: zemel at cs.toronto.edu (Richard Zemel) Date: Wed, 10 Oct 2001 19:59:58 -0400 Subject: NIPS*2001 registration Message-ID: <01Oct10.200006edt.453166-29679@jane.cs.toronto.edu> You are invited to attend the 14th annual conference of NIPS*2001, Neural Information Processing Systems, at the Hyatt Regency in Vancouver, British Columbia, Canada and workshops at the Whistler ski resort near Vancouver. http://www-2.cs.cmu.edu/Groups/NIPS/ Tutorials: December 3, 2001 Conference: December 4-6, 2001 Workshops: December 6-8, 2001 The DEADLINE for reduced early registration fees is November 2, 2001. Registration can now be made online through a secure credit card link or through bank wire transfer, fax, and check: https://www.nips.salk.edu/regist.html Because the number of submissions this year increased to 650, we were able to accept 173 and maintain the same high standards: http://www-2.cs.cmu.edu/Groups/NIPS/NIPS2001/nips-program.html All registrants this year will receive a CD-ROM of the conference proceedings, which will also be available free online. The 2 volume soft-cover format, published by the MIT Press, can be purchased at a special conference rate. The last month has been a difficult time for everyone. The organizing committee for NIPS*2001 has been working hard to ensure that the program and facilities for the annual meeting are better than ever. Vancouver is a beautiful city with many excellent restaurants within a short walk of the conference. The base at Whistler is at 2,200 feet, substantially lower than ski resorts in Colorado. We hope you will join us in Vancouver for an exciting new NIPS*2001 Terry Sejnowski ----------------------------------------------- NIPS*2001 TUTORIALS - December 3, 2001 Luc Devroye, McGill University - Nonparametric Density Estimation: VC to the Rescue Daphne Koller, Stanford, and Nir Friedman, Hebrew University - Learning Bayesian Networks from Data Shawn Lockery, University of Oregon - Why the Worm Turns: How to Analyze the Behavior of an Animal and Model Its Neuronal Basis Christopher Manning, Stanford University - Probabilistic Linguistics and Probabilistic Models of Natural Language Processing Bernhard Scholkopf, Biowulf Technologies and Max-Planck Institute for Biological Cybernetics - SVM and Kernel Methods Sebastian Thrun, Carnegie Mellon University - Probabilistic Robotics INVITED SPEAKERS - December 4-6, 2001 Barbara Finlay, Cornell University - How Brains Evolve, and the Consequences for Computation Alison Gopnik, UC Berkeley - Babies and Bayes-nets: Causal Inference and Theory-formation in Children, Chimps, Scientists and Computers Jon M. Kleinberg, Cornell University - Decentralized Network Algorithms: Small-world Phenomena and the Dynamics of Information Tom Knight, MIT - Computing with Life Judea Pearl, UCLA - Causal Inference As an Exercise in Computational Learning Shihab Shamma, U. Maryland - Common Principles in Auditory and Visual Processing WORKSHOPS - December 6-8, 2001 Activity-Dependent Synaptic Plasticity - Paul Munro Artificial Neural Networks in Safety-Related Areas - Johann Schumann Brain-Computer Interfaces - Lucas Parra Causal Learning and Inference in Humans & Machines - Joshua B. 
Tenenbaum Competition: Unlabeled Data for Supervised Learning - Stefan C. Kremer Computational Neuropsychology - Mike Mozer Geometric Methods in Learning - Amir H. Assadi Information & Statistical Structure in Spike Trains - Jonathon D. Victor Kernel-Based Learning - John Shawe-Taylor and Craig Saunders Knowledge Representation in Meta-Learning - Ricardo Vilalta Machine Learning in Bioinformatics - Colin Campbell, Sayan Mukherjee Machine Learning Methods for Text and Images - Jaz Kandola Minimum Description Length - Peter Grunwald Multi-sensory Perception & Learning - Ladan Shams, John Fisher Neuroimaging: Tools, Methods & Modeling - Steve Hanson Occam's Razor & Parsimony in Learning - David Stork Preference Elicitation - David Poole Quantum Neural Computing - Elizabeth Behrman Variable & Feature Selection - Isabelle Guyon ----------------------------------------------- From dmedler at mcw.edu Wed Oct 10 13:24:35 2001 From: dmedler at mcw.edu (David A. Medler) Date: Wed, 10 Oct 2001 12:24:35 -0500 Subject: Postdoctoral Fellowship in Speech and Language Processing Message-ID: <3BC48453.62F314E0@mcw.edu> POSTDOCTORAL FELLOWSHIP COGNITIVE NEUROSCIENCE OF SPEECH AND LANGUAGE PROCESSING The Medical College of Wisconsin The Language Imaging Laboratory, Department of Neurology, Medical College of Wisconsin, announces an NIH-funded postdoctoral position in cognitive neuroscience of language processes. Applicants will join a research team studying word recognition, speech perception, semantics, and language development using fMRI, neural network modeling, and event-related potentials. Computer proficiency and exposure to computational models of language processing are desirable. Facilities include state-of-the-art 3T and 1.5T fMRI systems dedicated to research and supported by a large physics and engineering core. Ample scanner time and training in fMRI techniques will be provided. Applicants should have a PhD in experimental psychology, linguistics, cognitive neuroscience, computing science, or related field. Send curriculum vitae, statement of research interests, and two letters of recommendation to: Jeffrey Binder, Department of Neurology, Medical College of Wisconsin, 9200 W. Wisconsin Ave., Milwaukee, WI 53226. Email: jbinder at mcw.edu. Fax: 414-259-0469. Equal Opportunity Employer. -- David A. Medler, Ph.D. dmedler at mcw.edu Department of Neurology, The Medical College of Wisconsin 8701 Watertown Plank Rd, MEB 4550 Milwaukee, WI 53226 From jek at first.fraunhofer.de Thu Oct 11 12:44:59 2001 From: jek at first.fraunhofer.de (Jens Kohlmorgen) Date: Thu, 11 Oct 2001 18:44:59 +0200 (MET DST) Subject: Paper available: An On-line Method for Segmentation and Identification of Non-stationary Time Series Message-ID: Readers of the connectionists list might be interested in the following paper: Kohlmorgen, J., Lemm, S. (2001), "An On-line Method for Segmentation and Identification of Non-stationary Time Series" in: Neural Networks for Signal Processing XI, IEEE, NJ, pp. 113-122. It is available from http://www.first.gmd.de/~jek/Kohlmorgen.Jens/publications.html Abstract: We present a method for the analysis of non-stationary time series from dynamical systems that switch between multiple operating modes. In contrast to other approaches, our method processes the data incrementally and without any training of internal parameters. It straightaway performs an unsupervised segmentation and classification of the data on-the-fly. In many cases it even allows to process incoming data in real-time. 
The main idea of the approach is to track and segment changes of the probability density of the data in a sliding window on the incoming data stream. An application to a switching dynamical system demonstrates the potential usefulness of the algorithm in a broad range of applications. ========================================================================== Dr. Jens Kohlmorgen Tel.(office) : +49 30 6392-1875 Tel.(secret.): +49 30 6392-1800 Intelligent Data Analysis Group Fax : +49 30 6392-1805 Fraunhofer-FIRST (former GMD FIRST) Kekulestr. 7 e-mail: jek at first.fraunhofer.de 12489 Berlin, Germany http://www.first.fraunhofer.de/~jek ========================================================================== From rsun at cecs.missouri.edu Thu Oct 11 14:21:50 2001 From: rsun at cecs.missouri.edu (rsun@cecs.missouri.edu) Date: Thu, 11 Oct 2001 13:21:50 -0500 Subject: papers on hybrid reinforcement learning Message-ID: <200110111821.f9BILou23150@ari1.cecs.missouri.edu> Two papers on hybrid reinforcement learning: combining symbolic and neural methods for reinforcement learning accessible from http://www.cecs.missouri.edu/~rsun/hybrid-rl.html ---------------------------------------------------------------------------- Supplementing Neural Reinforcement Learning with Symbolic Methods Ron Sun Several different ways of using symbolic methods to enhance reinforcement learning are identified and discussed in some detail. Each demonstrates to some extent the potential advantages of combining RL and symbolic methods. Different from existing work, in combining RL and symbolic methods, we focus on autonomous learning from scratch without a priori domain-specific knowledge. Thus the role of symbolic methods lies truly in enhancing learning, not in providing a priori domain-specific knowledge. These discussed methods point to the possibilities and the challenges in this line of research. ---------------------------------------------------------------------------- Beyond Simple Rule Extraction: Acquiring Planning Knowledge from Neural Networks Ron Sun Todd Peterson Chad Sessions This paper discusses learning in hybrid models that goes beyond simple classification rule extraction from backpropagation networks. Although simple rule extraction has received a lot of research attention, we need to further develop hybrid learning models that learn autonomously and acquire both symbolic and subsymbolic knowledge. It is also necessary to study autonomous learning of both subsymbolic and symbolic knowledge in integrated architectures. This paper will describe planning knowledge extraction from neural reinforcement learning that goes beyond extracting simple rules. It includes two approaches towards extracting planning knowledge: the extraction of symbolic rules from neural reinforcement learning, and the extraction of complete plans. This work points to a general framework for achieving the subsymbolic to symbolic transition in an integrated autonomous learning framework. ------------------------------------------------------------------------- Both papers are accessible from http://www.cecs.missouri.edu/~rsun/hybrid-rl.html =========================================================================== Prof. 
Ron Sun http://www.cecs.missouri.edu/~rsun CECS Department phone: (573) 884-7662 University of Missouri-Columbia fax: (573) 882 8318 201 Engineering Building West Columbia, MO 65211-2060 email: rsun at cecs.missouri.edu http://www.cecs.missouri.edu/~rsun http://www.cecs.missouri.edu/~rsun/journal.html http://www.elsevier.com/locate/cogsys =========================================================================== From roli at diee.unica.it Thu Oct 11 10:59:03 2001 From: roli at diee.unica.it (Fabio Roli) Date: Thu, 11 Oct 2001 15:59:03 +0100 Subject: IF Special Issue on Fusion of Multiple Classifiers Message-ID: Call for papers for a special issue on Fusion of Multiple Classifiers Information Fusion An International Journal on Multi-Sensor, Multi-Source Information Fusion An Elsevier Science Publication Editor-in-Chief: Dr. Belur V. Dasarathy belur.d at dynetics.com; ifjournal at yahoo.com The Information Fusion Journal has planned for publication, in the latter half of 2002, a special issue devoted to the fusion of multiple classifiers. Classifier fusion has recently become an important tool for enhancing the performance of pattern recognition systems. A myriad of techniques have been developed for combining classifiers at the decision or soft decision output level. These techniques have been conceived by researchers in many diverse communities including Machine Learning, Pattern Recognition, Neural Networks, Statistics, and Artificial Intelligence. The aim of this special issue is to provide a focal point for recent advances in this methodological area of pattern recognition across different paradigms and disciplines. Submitted papers should report new theories underpinning classifier combination, novel methodologies, applications where classifier fusion significantly enhanced the recognition system performance, or extensive comparative studies of different combination rules. Topics appropriate for this special issue include, but are not limited to: * Decision level fusion * Strategies for multiple classifier fusion * Bagging and boosting * Neural network ensembles * Multiple classifier design * Fusion of one-class classifiers * Fusion of measurement and contextual information * Innovative applications Prospective authors should follow the regular paper preparation guidelines of the Journal. Submission can be done electronically in .pdf or .ps format, along with four hard copies sent to one of the Guest Editors listed below: Guest Editors Prof. Josef Kittler Center for Vision, Speech and Signal Proc. Univ. of Surrey, Guildford, Surrey GU2 5XH, UK e-mail j.kittler at eim.surrey.ac.uk Prof. Fabio Roli Dept. of Electrical and Electronic Eng. Univ. of Cagliari, 09123, Cagliari, Italy email roli at diee.unica.it Deadline for Submission: November 15, 2001 -- ==================================================================== Fabio Roli, Ph.D. Associate Professor of Computer Science Electrical and Electronic Engineering Dept.
- University of Cagliari Piazza d'Armi 09123 Cagliari Italy Phone +39 070 675 5874 Fax +39 070 6755900 e-mail roli at diee.unica.it Web Page at http://www.diee.unica.it/~roli/info.html From bogus@does.not.exist.com Thu Oct 11 15:55:02 2001 From: bogus@does.not.exist.com () Date: Thu, 11 Oct 2001 12:55:02 -0700 Subject: Research position at Microsoft Research Cambridge, with associated College Fellowship Message-ID: <1D4512C91A0E044FBF24DDB221C94C5303E41645@red-msg-29.redmond.corp.microsoft.com> From roli at diee.unica.it Thu Oct 11 10:53:55 2001 From: roli at diee.unica.it (Fabio Roli) Date: Thu, 11 Oct 2001 15:53:55 +0100 Subject: MCS 2002 - Third International Workshop on Multiple Classifier Systems Message-ID: **Apologies for multiple copies** ****************************************** *****MCS 2002 Call for Papers***** ****************************************** *****Paper Submission: 1 FEBRUARY 2002***** *********************************************************************** THIRD INTERNATIONAL WORKSHOP ON MULTIPLE CLASSIFIER SYSTEMS Grand Hotel Chia Laguna, Cagliari, Italy, June 24-26 2002 Updated information: http://www.diee.unica.it/mcs E-mail: mcs at diee.unica.it *********************************************************************** WORKSHOP OBJECTIVES MCS 2002 is the third workshop of a series aimed at creating a common international forum for researchers from the diverse communities working in the field of multiple classifier systems. Information on the previous editions of the MCS workshop can be found on www.diee.unica.it/mcs. Contributions from all the research communities working in the field are welcome in order to compare the different approaches and to define the common research priorities. Special attention is also devoted to assessing the applications of multiple classifier systems. The papers will be published in the workshop proceedings, and extended versions of selected papers will be considered for publication in a special issue of the International Journal of Pattern Recognition and Artificial Intelligence. WORKSHOP CHAIRS Josef Kittler (Univ. of Surrey, United Kingdom) Fabio Roli (Univ. of Cagliari, Italy) ORGANIZED BY Dept. of Electrical and Electronic Eng. of the University of Cagliari Center for Vision, Speech and Signal Proc. of the University of Surrey Sponsored by IAPR TC1 Statistical Pattern Recognition Techniques PAPER SUBMISSION Three hard copies of the full paper should be mailed to: MCS 2002 Prof. Fabio Roli Dept. of Electrical and Electronic Eng. University of Cagliari Piazza d'Armi 09123 Cagliari Italy In addition, participants should submit an electronic version of the manuscript (PostScript or PDF format) to mcs at diee.unica.it. The papers should not exceed 10 pages (LNCS format, see http://www.springer.de/comp/lncs/authors.html). A cover sheet with the authors' names and affiliations is also requested, with the complete address of the corresponding author, and an abstract (200 words). Two members of the Scientific Committee will referee the papers. IMPORTANT NOTICE: Submission implies the willingness of at least one author to register, attend the workshop, and present the paper. Accepted papers will be published in the proceedings only if the registration form and payment for one of the authors have been received.
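The combination rules solicited in the two calls above (decision-level fusion, voting, bagging and boosting, classifier ensembles) can be illustrated by the simplest case: majority voting over the crisp labels produced by several base classifiers. The short Python sketch below shows only that minimal scheme; the function name and the toy data are hypothetical and stand in for whatever base classifiers a submission might combine.

from collections import Counter

def majority_vote(predictions):
    # predictions: one list of crisp labels per base classifier,
    # all lists covering the same samples in the same order.
    fused = []
    for labels_for_sample in zip(*predictions):
        # ties are broken in favour of the label encountered first
        fused.append(Counter(labels_for_sample).most_common(1)[0][0])
    return fused

# Hypothetical example: three classifiers label four samples.
clf_a = ["cat", "dog", "dog", "cat"]
clf_b = ["cat", "cat", "dog", "dog"]
clf_c = ["dog", "dog", "dog", "cat"]
print(majority_vote([clf_a, clf_b, clf_c]))   # ['cat', 'dog', 'dog', 'cat']

Weighted voting, soft-output averaging, and trainable combiners follow the same pattern but replace the simple count with a weighted or learned aggregation.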
WORKSHOP TOPICS Papers describing original work in the following and related research topics are welcome: Foundations of multiple classifier systems Methods for classifier fusion Design of multiple classifier systems Neural network ensembles Bagging and boosting Mixtures of experts New and related approaches Applications INVITED SPEAKERS Joydeep Ghosh (University of Texas, USA) Trevor Hastie (Stanford University, USA) Sarunas Raudys (Vilnius University, Lithuania) SCIENTIFIC COMMITTEE J. A. Benediktsson (Iceland) H. Bunke (Switzerland) L. P. Cordella (Italy) B. V. Dasarathy (USA) R. P.W. Duin (The Netherlands) C. Furlanello (Italy) J. Ghosh (USA) T. K. Ho (USA) S. Impedovo (Italy) N. Intrator (Israel) A.K. Jain (USA) M. Kamel (Canada) L.I. Kuncheva (UK) L. Lam (Hong Kong) D. Landgrebe (USA) D-S. Lee (USA) D. Partridge (UK) A.J.C. Sharkey (UK) K. Tumer (USA) G. Vernazza (Italy) T. Windeatt (UK) IMPORTANT DATES February 1, 2002: Paper Submission March 15, 2002: Notification of Acceptance April 10, 2002: Camera-ready Manuscript April 10, 2002: Registration WORKSHOP VENUE The workshop will be held at Grand Hotel Chia Laguna, Cagliari, Italy. See http://www.crs4.it/~zip/EGVISC95/chia_laguna.html (in English) or http://web.tiscali.it/chialaguna (in Italian). WORKSHOP PROCEEDINGS Accepted papers will appear in the workshop proceedings that will be published in the series Lecture Notes in Computer Science by Springer-Verlag. Extended versions of selected papers will be considered for possible publication in a special issue of the International Journal of Pattern Recognition and Artificial Intelligence. -- ==================================================================== Fabio Roli, Ph.D. Associate Professor of Computer Science Electrical and Electronic Engineering Dept. - University of Cagliari Piazza d'Armi 09123 Cagliari Italy Phone +39 070 675 5874 Fax +39 070 6755900 e-mail roli at diee.unica.it Web Page at http://www.diee.unica.it/~roli/info.html From efiesler at intopsys.com Thu Oct 11 20:05:01 2001 From: efiesler at intopsys.com (Emile Fiesler) Date: Thu, 11 Oct 2001 17:05:01 -0700 Subject: Vacancy in Southern California for a research scientist. Message-ID: <000701c152b1$8a6d6a40$0615010a@efeisler> Intelligent Optical Systems (IOS), a world leader in the development of innovative optical sensors, is seeking a Research Scientist with expertise and hands-on experience in advanced signal and image processing, including neural computation. The ideal candidate will have a doctoral degree in computer science, electrical engineering, or equivalent, plus experience in securing funding through proposal writing. Expertise in spectroscopy and image enhancement is a definite plus. Functions would include image analysis and enhancement, biomedical diagnosis, chemical analysis, and object recognition, using AI and neural computation-based implementations in software and hardware. We offer a competitive salary and benefits package.
Please send your application, including CV, and 3 references, by e-mail to: OHuang at intopsys.com Emile Fiesler From holte at cs.ualberta.ca Sat Oct 13 14:31:44 2001 From: holte at cs.ualberta.ca (Robert Holte) Date: Sat, 13 Oct 2001 12:31:44 -0600 (MDT) Subject: Machine Learning journal Message-ID: Dear colleagues, In response to the widely circulated letter of resignation of some members of the Machine Learning journal (MLJ), I would like to make two points: - MLJ articles *are* universally electronically accessible - MLJ seeks your support and input to continue serving the community The accessibility of MLJ papers has been dramatically improved in the past 12 months. The main changes are these: - the copyright agreement gives the author the right to distribute individual copies of an MLJ paper to students and colleagues, physically and electronically, including making the paper available from the author's personal web site. - all MLJ papers are freely available online at Kluwer's web page http://www.wkap.nl/kaphtml.htm/MACHFCP from the time of acceptance until the paper appears in print. - the individual MLJ subscription price has been dramatically reduced. It is excellent value for money: for $120 Kluwer prints, binds, and mails to your door around 1350 pages. As a consequence of the first two points, MLJ articles are universally accessible -- from Kluwer's home page in the first six months or so, and at any time from the author's home page. The primary purpose of paid subscriptions, in this new distribution model, is to enable an individual or institution to obtain a bound archival copy of the journal printed on high-quality paper -- exactly the same role served by the printed version of JMLR sold by MIT Press. Turning to the second point, all members of both editorial boards have the interests of the machine learning community at heart. Our job is to serve you. The current members of the MLJ board, and the new members we are in the process of adding, believe it is in the best interests of the research community to keep MLJ alive and strong at this time. This is not to say we hope JMLR will fail. There is ample excellent research to support two high-quality journals, so it is not necessary for one of the journals to be destroyed in order for the other to succeed. If you agree that MLJ is useful to the community and has a role to play in the future, I would like to hear from you - feedback from the community is the very best way for me to know how to steer MLJ's course so it best serves the community. -- Robert Holte holte at cs.ualberta.ca Executive Editor Machine Learning From giro-ci0 at wpmail.paisley.ac.uk Mon Oct 15 03:35:24 2001 From: giro-ci0 at wpmail.paisley.ac.uk (Mark Girolami) Date: Mon, 15 Oct 2001 08:35:24 +0100 Subject: 24th BCS-IRSG European Colloquium on IR Research Message-ID: There is an increasing level of research interest within the connectionist and machine learning communities on a number of aspects of information retrieval * evidenced by the number of papers appearing in recent NIPS and ICML conferences as well as recently organised post-conference workshops at NIPS on document mining and retrieval. Therefore the following cfp will be of interest to the connectionist and ml mailing lists. 
Rgds Mark Girolami ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ The 24th BCS-IRSG European Colloquium on Information Retrieval (IR) Research - which was the precursor of the ACM SIGIR conference - is being held in the city of Glasgow, Scotland and submissions reporting recent research work in this area are welcomed. ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 24th BCS-IRSG European Colloquium on IR Research March 25-27, 2002, Glasgow, Scotland, UK http://www.cs.strath.ac.uk/ECIR02/ The colloquium on information retrieval research provides an opportunity for both new and established researchers to present papers describing work in progress or final results. These Colloquia were established by the BCS IRSG (British Computer Society Information Retrieval Specialist Group), and named the Annual Colloquium on Information Retrieval Research. Recently, the location of the colloquium has alternated between the United Kingdom and continental Europe. To reflect the growing European orientation of the event, the Colloquium was renamed "European Annual Colloquium on Information Retrieval Research" from 2001. The previous five colloquia have been held in Darmstadt (2001), Cambridge (2000), Glasgow (1999), Grenoble (1998), and Aberdeen (1997). Details The colloquium on information retrieval research provides an opportunity for both new and established researchers to present papers describing work in progress or final results. Relevant papers should address (at the theoretical, methodological, system or application level) the analysis, design or evaluation of functions like: Indexing Information Extraction Data Mining Browsing Retrieval and Filtering User Interaction for the following types of documents and databases: Monomedia documents (e.g. text, images, audio, voice, video) Composite documents Multimedia documents Hypermedia documents Active documents Distributed documents and databases Digital Libraries the Web Organising Committee Dr Fabio Crestani, Department of Computer & Information Sciences, University of Strathclyde, GLASGOW. Prof Mark Girolami, Deparment of Computer Science, University of Paisley, PAISLEY. Prof Keith van Rijsbergen, Department of Computing Science, University of Glasgow, GLASGOW. ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Legal disclaimer -------------------------- The information transmitted is the property of the University of Paisley and is intended only for the person or entity to which it is addressed and may contain confidential and/or privileged material. Statements and opinions expressed in this e-mail may not represent those of the company. Any review, retransmission, dissemination and other use of, or taking of any action in reliance upon, this information by persons or entities other than the intended recipient is prohibited. If you received this in error, please contact the sender immediately and delete the material from any computer. 
-------------------------- From bischof at icg.tu-graz.ac.at Mon Oct 15 05:27:19 2001 From: bischof at icg.tu-graz.ac.at (Horst Bischof) Date: Mon, 15 Oct 2001 11:27:19 +0200 Subject: Reminder CfP Special Issue PR Message-ID: <3BCAABF7.3050001@icg.tu-graz.ac.at> Pattern Recognition The Journal of the Pattern Recognition Society Special Issue on Kernel and Subspace Methods for Computer Vision http://www.prip.tuwien.ac.at/~bis/cfp-pr.html Guest Editors: Ales Leonardis, Faculty of Computer and Information Science, University of Ljubljana, Trzaska 25, 1001 Ljubljana, Slovenia, alesl at fri.uni-lj.si; Horst Bischof, Pattern Recognition and Image Processing Group, Vienna University of Technology, Favoritenstr. 9/1832, A-1040 Vienna, Austria, bis at prip.tuwien.ac.at This Pattern Recognition Special Issue will address new developments in the area of kernel and subspace methods related to computer vision. High-quality original journal paper submissions are invited. The topics of interest include (but are not limited to): Support Vector Machines, Independent Component Analysis, Principal Component Analysis, Mixture Modeling, Canonical Correlation Analysis, etc. applied to computer vision problems such as: Object Recognition, Navigation and Robotics, Medical Imaging, 3D Vision, etc. All submitted papers will be peer reviewed. Only high-quality, original submissions will be accepted for publication in the Special Issue---in accordance with the Pattern Recognition guidelines (http://www.elsevier.nl/inca/publications/store/3/2/8/index.htt). Submission Timetable Submission of full manuscript: November 30, 2001 Notification of Acceptance: March 29, 2002 Submission of revised manuscript: End of June 2002 Final Decision: August 2002 Final papers: September 2002 Submission Procedure All submissions should follow the Pattern Recognition Guidelines and should be submitted electronically via anonymous ftp in either postscript or pdf format (compressed with zip or gzip). Files should be named by the surname of the first author, i.e., surname.ps.gz; for multiple submissions surname1, surname2, ... should be used. Papers should be uploaded to the following ftp site by the deadline of 30th November 2001.
ftp ftp.prip.tuwien.ac.at [anonymous ftp, i.e.: Name: ftp Password: < your email address > ]
cd sipr
binary
put .ext
quit
After uploading the paper authors should email the guest editor Ales Leonardis giving full details of the paper title and authors. -------------------------------------------------------------------------- -- !!!!!!!!!!!!!!!!! ATTENTION NEW ADDRESS !!!!!!!!!!!! Horst Bischof Institute for Computer Graphics and Vision TU Graz Inffeldgasse 16 2. OG A-8010 Graz AUSTRIA email: bischof at icg.tu-graz.ac.at Tel.: +43-316-873-5014 Fax.: +43-316-873-5050 !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! From wolpert at hera.ucl.ac.uk Tue Oct 16 06:13:17 2001 From: wolpert at hera.ucl.ac.uk (Daniel Wolpert) Date: Tue, 16 Oct 2001 11:13:17 +0100 Subject: Two Postdoctoral Fellowships in Motor Control Message-ID: <015c01c1562b$2d277e70$40463ec1@APHRODITE> Two Postdoctoral Fellowships are available in 1. Computational models of imitation and 2. Human Sensorimotor Control ---------------------------------------------------------------------------- 1. Postdoctoral Fellowship Computational models of imitation Sobell Research Department of Motor Neuroscience & Movement Disorders, Institute of Neurology, University College London, London & ATR Human Information Science Laboratories, Japan Advisors: Dr.
Daniel Wolpert (IoN/UCL, London) www.hera.ucl.ac.uk & Mitsuo Kawato (ATR, Japan) www.his.atr.co.jp/~kawato/ We have an opening for a highly motivated Postdoctoral Fellow to work on an international collaborative project funded by the McDonnell Foundation entitled "Mechanisms of Forward Thinking and Behaviour". The Fellow will investigate the computational mechanisms by which the motor system can be used to decode the observed actions of others. Computational architectures will be developed for action observation, imitation and communication. The ideal candidate will have a PhD in a related area and strong mathematical and computational skills. Each year the postdoctoral fellow will spend 9 months in London with Dr Daniel Wolpert, to whom informal enquiries are welcome (wolpert at hera.ucl.ac.uk), and 3 months in Kyoto, Japan working with Dr Mitsuo Kawato. The position is available for two years and the Fellow could start immediately. Starting salary is up to £30,453 pa inclusive, depending on experience. Applications (2 copies of CV and names of 3 referees) to Miss E Bertram, Assistant Secretary (Personnel), Institute of Neurology, Queen Square, London WC1N 3BG (fax: +44 (0)20 7278 5069) by 6th November 2001. Working toward Equal Opportunity ---------------------------------------------------------------------------- 2. Postdoctoral Fellowship Human Sensorimotor Control Sobell Research Department of Motor Neuroscience & Movement Disorders Institute of Neurology, University College London Advisor: Dr. Daniel Wolpert The Sobell Department of Motor Neuroscience & Movement Disorders has an opening for a highly motivated Postdoctoral Fellow in the area of computational and experimental human motor control. The Fellow will join a team investigating planning, control and learning of skilled action. The ideal candidate will have a PhD, technical expertise and computational skills relevant to the study of human movement. The sensorimotor control laboratory houses state-of-the-art equipment for the collection of kinematic (Optotrak & multiple flock-of-birds), force (multiple six axis force transducers) and physiological data (EMG). In addition, equipment is available for the provision of online visual feedback (both 3D and 2D virtual reality systems), and for the perturbation of movements (two robotic Phantom haptic interfaces, muscle stimulation and TMS facilities). The project, funded by a Wellcome Programme Grant, is under the direction of Dr. Daniel Wolpert, to whom informal enquiries are welcome (wolpert at hera.ucl.ac.uk). The position is available for three years with a starting date from January 2002. Further details of the post and laboratory are available on www.hera.ucl.ac.uk. Starting salary is up to £30,453 pa inclusive, depending on experience. Applications (2 copies of CV and names of 3 referees) to Miss E Bertram, Assistant Secretary (Personnel), Institute of Neurology, Queen Square, London WC1N 3BG (fax: +44 (0)20 7278 5069) by 6th November 2001.
Working toward Equal Opportunity ---------------------------------------------------------------------------- From terry at salk.edu Fri Oct 19 19:15:01 2001 From: terry at salk.edu (Terry Sejnowski) Date: Fri, 19 Oct 2001 16:15:01 -0700 (PDT) Subject: NEURAL COMPUTATION 13:11 Message-ID: <200110192315.f9JNF1j49876@purkinje.salk.edu> Neural Computation - Contents - Volume 13, Number 11 - November 1, 2001 ARTICLE Predictability, Complexity and Learning William Bialek, Ilya Nemenman, and Naftali Tishby NOTE Dendritic Subunits Determined by Dendritic Morphology K. A. Lindsay, J. M. Ogden and J. R. Rosenberg LETTERS Computing the Optimally Fitted Spike Train for a Synapse Thomas Natschlager and Wolfgang Maass Period Focusing Induced by Network Feedback in Populations of Noisy Integrate-and-Fire Neurons Francisco B. Rodriguez, Alberto Suarez, Vicente Lopez A Variational Method for Learning Sparse and Overcomplete Representations Mark Girolami Random Embedding Machines for Pattern Recognition Yoram Baram Manifold Stochastic Dynamics for Bayesian Learning Mark Zlochin and Yoram Baram Resampling Method for Unsupervised Estimation of Cluster Validity Erel Levine and Eytan Domany The Whitney Reduction Network: A Method for Computing Autoassociative Graphs D. S. Broomhead and M. J. Kirby Enhanced 3D Shape Recovery Using the Neural-Based Hybrid Reflectance Model Siu-Yeung Cho and Tommy W. S. Chow ----- ON-LINE - http://neco.mitpress.org/ SUBSCRIPTIONS - 2001 - VOLUME 13 - 12 ISSUES
                   USA       Canada*     Other Countries
Student/Retired    $60       $64.20      $108
Individual         $88       $94.16      $136
Institution        $460      $492.20     $508
* includes 7% GST
MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 577-1545 journals-orders at mit.edu ----- From psyc at coglit.ecs.soton.ac.uk Fri Oct 19 04:12:40 2001 From: psyc at coglit.ecs.soton.ac.uk (PSYCOLOQUY (Electronic Journal)) Date: Fri, 19 Oct 2001 09:12:40 +0100 (BST) Subject: Psycoloquy: Call for Submissions. Message-ID: <200110190812.JAA07734@coglit.ecs.soton.ac.uk> PSYCOLOQUY CALL FOR ARTICLES PSYCOLOQUY is a refereed international, interdisciplinary electronic journal sponsored by the American Psychological Association (APA) and indexed by APA's PsycINFO and by Institute for Scientific Information. http://www.apa.org/psycinfo/about/covlist.html PSYCOLOQUY publishes target articles and peer commentary in all areas of psychology as well as cognitive science, neuroscience, behavioral biology, artificial intelligence, robotics/vision, linguistics and philosophy. DIRECT SUBMISSIONS TO: psyc at pucc.princeton.edu Further information is available on the Psycoloquy website: http://www.princeton.edu/~harnad/psyc.html http://www.cogsci.soton.ac.uk/psyc Instructions for authors may be found at: http://www.princeton.edu/~harnad/psyc.html#inst http://www.cogsci.soton.ac.uk/psycoloquy/#inst Below is a list of other recently published PSYCOLOQUY treatments that are currently undergoing Open Peer Commentary: Navon, D. (2001), The Puzzle of Mirror Reversal: A View From Clockland. Psycoloquy 12 (017) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.017 Kramer, D. & Moore, M. (2001), Gender Roles, Romantic Fiction and Family Therapy. Psycoloquy 12 (024) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.024 Sherman, J. A. (2001), Evolutionary Origin of Bipolar Disorder (EOBD). Psycoloquy 12 (028) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.028 Overgaard, M. (2001), The Role of Phenomenological Reports in Experiments on Consciousness.
Psycoloquy 12 (029) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.029 Crow, T. J. (2000) Did Homo Sapiens Speciate on the Y Chromosome? Psycoloquy 11 (001) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?11.001 Margolis, H. (2000) Wason's Selection Task with A Reduced Array Psycoloquy 11 (005) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?11.005 Place, U. T. (2000) The Role of the Hand in the Evolution of Language Psycoloquy 11 (007) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?11.007 Green, C. D. (2000) Is AI the Right Method for Cognitive Science? Psycoloquy 11 (061) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?11.061 Reifman, A. (2000) Revisiting the Bell Curve Psycoloquy 11 (099) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?11.099 SPECIAL SET OF 6 TARGET ARTICLES ON NICOTINE ADDICTION: Balfour, D. (2001), The Role of Mesolimbic Dopamine in Nicotine Dependence. Psycoloquy 12(001) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.001 Le Houezec, J. (2001), Non-Dopaminergic Pathways in Nicotine Dependence. Psycoloquy 12 (002) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.002 Oscarson, M. (2001), Nicotine Metabolism by the Polymorphic Cytochrome P450 2A6 (CYP2A6) Enzyme: Implications for Interindividual Differences in Smoking Behaviour. Psycoloquy 12 (003) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.003 Sivilotti, L. (2001), Nicotinic Receptors: Molecular Issues. Psycoloquy 12 (004) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.004 Smith, G. & Sachse, C. (2001), A Role for CYP2D6 in Nicotine Metabolism? Psycoloquy 12 (005) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.005 Wonnacott, S. (2001), Nicotinic Receptors in Relation to Nicotine Addiction. Psycoloquy 12 (006) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.006 MULTIPLE BOOK REVIEWS: Ben-Ze'ev, A. (2001), The Subtlety of Emotions. Psycoloquy 12 (007) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.007 Miller, G. F. (2001), The Mating Mind: How Sexual Choice Shaped the Evolution of Human Nature. Psycoloquy 12 (008) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.008 Bolton, D. & Hill, J. (2001), Mind, Meaning & Mental Disorder: The Nature of Causal Explanation in Psychology & Psychiatry. Psycoloquy 12 (018) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.018 Zachar, P. (2001), Psychological Concepts and Biological Psychiatry: A Philosophical Analysis. Psycoloquy 12 (023) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.023 Praetorius, N. (2001), Principles of Cognition, Language and Action: Essays on the Foundations of a Science of Psychology. Psycoloquy 12 (027) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.027 Carstairs-McCarthy, A. (2000) The Origins of Complex Language Psycoloquy 11 (082) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?11.082 Storfer, M. D. (2000) Myopia, Intelligence, and the Expanding Human Neocortex Psycoloquy 11 (083) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?11.083 Tenopir, C. & King, D. W. (2000) Towards Electronic Journals: Realities for Scientists, Librarians, and Publishers Psycoloquy 11 (084) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?11.084 Sheets-Johnston, M. 
(2000) The Primacy of Movement Psycoloquy 11 (098) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?11.098 From Machine.Learning at wkap.com Fri Oct 19 12:50:13 2001 From: Machine.Learning at wkap.com (Journal of Machine Learning) Date: Fri, 19 Oct 2001 12:50:13 -0400 Subject: Kluwer on Machine Learning Message-ID: In response to the public resignation and subsequent e-mail campaign of some MACHINE LEARNING (MLJ) editorial board members, please allow these facts to be clear. Kluwer Academic Publishers will continue its full support of the artificial intelligence community and MACHINE LEARNING. MLJ has been the premier publication venue for machine learning for 15 years. Its prestige is unequalled; even with a rejection rate of over 55% it publishes 12 full issues-over 1300 pages-per year. Its Science Citation Index ranking rises each year, and is consistently in the top 12 in all of Computer Science and Artificial Intelligence. MACHINE LEARNING is featured in over 30 abstracting and indexing services, including all of the relevant ISI indexes and those in fields as far-ranging as physics, psychology, and neuroscience. Print subscriptions continue to grow. Its electronic version reaches millions of users worldwide. Kluwer's commitment to the MACHINE LEARNING community includes: * Posting of accepted articles for free on the journal's web site immediately upon acceptance * Encouraging authors to post their papers on their own web site, prior to and after publication * Serving our editors, authors, and reviewers with an electronic reviewing system * Providing services from promotion to copyediting to distribution that ensure their work will reach the community, and reach it with a professional presentation * Representing the journal at conferences across all of computer science, exposing it to communities outside of machine learning * Maintaining the current individual subscription price of $120 * Including in the 2002 volumes over 15% more content with a less than 5% price increase to libraries Kluwer Academic Publishers has made, and continues to make, a very large investment in MLJ. Additionally, journals such as MACHINE LEARNING have paved the way for countless other new journals across all disciplines-its revenue provides a critical component for funding new projects that might not otherwise have been started. In fact, Kluwer publishes over 25 research journals in Artificial Intelligence. We are proud to sponsor targeted journals such as ARTIFICIAL INTELLIGENCE AND LAW and ANNALS OF MATHEMATICS AND ARTIFICIAL INTELLIGENCE as a service to the AI community. Publisher revenues also provide the very taxes that universities and other non-profit entities depend on to fund research. Kluwer Academic Publishers will continue our commitment to providing a healthy forum for many journals spanning all areas of academic research. The e-mail contact for MACHINE LEARNING is machine.learning at wkap.com. From charlotte.manly at louisville.edu Fri Oct 19 12:06:33 2001 From: charlotte.manly at louisville.edu (Charlotte Manly) Date: Fri, 19 Oct 2001 12:06:33 -0400 Subject: position in cognitive or computational neuroscience Message-ID: Dear Connectionists, Please circulate to interested parties. --------------- The Department of Psychological and Brain Sciences at the University of Louisville invites applications for a tenure-track position as Assistant Professor in cognitive neuroscience or computational neuroscience. 
The specific research area can come from a broad domain, including theoretical and applied areas of cognition, development, and vision. Applicants must show promise of building an outstanding record of externally funded research and publication. Postdoctoral experience is desirable. The position will begin August 1, 2002. Salary and start-up package are highly competitive. Applicants should have a curriculum vitae, description of research and teaching interests and experience, reprints and/or preprints, and three letters of recommendation forwarded to: John R. Pani, Ph.D., Chair, Experimental Search Committee, Department of Psychological and Brain Sciences, University of Louisville, Louisville, KY 40292. Review of applications will begin January 10, 2002 and will continue until the position is filled. Women and minorities are encouraged to apply. The University of Louisville is an Affirmative Action, Equal Opportunity Employer. The Department of Psychological and Brain Sciences (http://www.louisville.edu/a-s/psychology/) has been targeted for further enhancement by the University's Challenge for Excellence. Louisville is a dynamic city that is ranked highly for its quality of life, community and state commitment to education, and support for the arts. -- ====================================================== Charlotte F. Manly, Ph.D. | Psychological & Brain Sciences Assistant Professor | 317 Life Sciences Bldg ph: (502) 852-8162 | University of Louisville fax: (502) 852-8904 | Louisville, KY 40292 charlotte.manly at louisville.edu http://www.louisville.edu/a-s/psychology/ http://www.louisville.edu/~cfmanl01 From rothschild at cs.haifa.ac.il Sun Oct 21 04:10:32 2001 From: rothschild at cs.haifa.ac.il (Rothschild Institute) Date: Sun, 21 Oct 2001 10:10:32 +0200 (IST) Subject: Call-for-Papers: Haifa Winter Workshop on Computer Science and Statistics (Dec. 17-20, 2001) Message-ID: (Our apologies in advance for multiple copies due to posting on multiple mailing lists.) CALL FOR PAPERS Haifa Winter Workshop on Computer Science and Statistics 17-20 December 2001 The Caesarea Edmond Benjamin de Rothschild Foundation Institute for Interdisciplinary Applications of Computer Science at the University of Haifa, together with the Department of Statistics and the Department of Computer Science, is organizing an international workshop Dec. 17-20, 2001 on Computer Science and Statistics -- with an emphasis on Knowledge Discovery and other AI related topics. Additional funding is provided by co-sponsors the Ministry of Science and the US National Science Foundation. Purpose: The purpose of the workshop is to bring together experts from the fields of computer science and statistics and to explore potential areas of research in order to stimulate collaborative work. Particular areas of interest are (preliminary list): * Bayesian learning * Data mining * Simulation-based computation * Expert systems * Automated learning * Robotics Call-for-Papers: Contributed papers and posters are solicited for presentation. Submissions (extended abstract or full paper) should be sent electronically to libi at cs.haifa.ac.il no later than Nov. 15, 2001. Accepted abstracts will be posted on the workshop website http://www.rothschild.haifa.ac.il/csstat (Late submissions will be considered on a space-available basis.) Dates: 17-20 December 2001 Venue: University of Haifa Organizers: Martin C. Golumbic, Udi E.
Makov, Yoel Haitovski, Ya'acov Ritov Tentative list of Invited speakers: Matt Beal (Gatsby) Steve Fienberg (Carnegie Mellon) Nir Friedman (Hebrew Univ.) Dan Geiger (Technion) Michael Kearns (Syntekcapital) Yishay Mansour (Tel Aviv) David Madigan (Rutgers) Thomas Richardson (Washington) Dan Roth (Urbana) Steve Skiena (Stony Brook) Yehuda Vardi (Rutgers) Volodya Vovk (London) Registration: There will be no registration fee, but participants are asked to register in advance using the form on the website http://www.rothschild.haifa.ac.il/csstat Hotel subsidies for advanced graduate students will be available upon the recommendation of their thesis advisor. For further information please contact libi at cs.haifa.ac.il From mieko at atr.co.jp Mon Oct 22 00:56:17 2001 From: mieko at atr.co.jp (Mieko Namba) Date: Mon, 22 Oct 2001 13:56:17 +0900 Subject: Neural Networks 14(9) Message-ID: NEURAL NETWORKS 14(9) Contents - Volume 14, Number 9 - 2001 ------------------------------------------------------------------ NEURAL NETWORKS LETTER: Neuronal integration mechanisms have little effect on spike auto-correlations of cortical neurons Yutaka Sakai Best estimated inverse versus inverse of the best estimator Amir Karniel, Ron Meir and Gideon F. Inbar INVITED ARTICLE How to be a gray box: dynamic semi-physical modeling Yacine Oussar and Gérard Dreyfus CONTRIBUTED ARTICLES: ***** Mathematical and Computational Analysis ***** An Infomax-based learning rule that generates cells similar to visual cortical neurons K. Okajima On the stability analysis of delayed neural networks systems Chunhua Feng and Rejean Plamondon A two-level Hamming network for high performance associative memory Nobuhiko Ikeda, Paul Watta, Metin Artiklar and Mohamad H. Hassoun A closed-form neural network for discriminatory feature extraction from high-dimensional data Ashit Talukder and David Casasent The enhanced LBG algorithm Giuseppe Patane and Marco Russo ***** Engineering & Design ***** Reconstruction of chaotic dynamics by on-line EM algorithm S. Ishii and M.-A. Sato Novelty detection using products of simple experts - a potential architecture for embedded systems Alan F. Murray A new algorithm to design compact two-hidden-layer artificial neural networks Md. Monirul Islam and K. Murase Cross-validation in Fuzzy ARTMAP for large databases Anna Koufakou, Michael Georgiopoulos, George Anagnostopoulos and Takis Kasparis ***** Technology and Applications ***** Fingerprints classification using artificial neural networks: a combined structural and statistical approach Khaled Ahmed Nagaty Bi-directional computing architecture for time series prediction Hiroshi Wakuya and Jacek M. Zurada ***** Book Review ***** Book review: Advances in Synaptic Plasticity: A Compact Account of the New, the Important, and the Interesting Murat Okatan ------------------------------------------------------------------ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription.
Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment.
----------------------------------------------------------------------------
Membership Type                 INNS              ENNS                 JNNS
----------------------------------------------------------------------------
Membership with                 $80 (regular)     SEK 660 (regular)    Y 13,000 (regular, plus 2,000 enrollment fee)
Neural Networks                 $20 (student)     SEK 460 (student)    Y 11,000 (student, plus 2,000 enrollment fee)
-----------------------------------------------------------------------------
Membership without              $30               SEK 200              not available to non-students (subscribe
Neural Networks                                                        through another society); Y 5,000 (student,
                                                                       plus 2,000 enrollment fee)
-----------------------------------------------------------------------------
Name: _____________________________________
Title: _____________________________________
Address: _____________________________________ _____________________________________ _____________________________________
Phone: _____________________________________
Fax: _____________________________________
Email: _____________________________________
Payment: [ ] Check or money order enclosed, payable to INNS or ENNS OR [ ] Charge my VISA or MasterCard card number ____________________________ expiration date ________________________
INNS Membership 19 Mantua Road Mount Royal NJ 08061 USA 856 423 0162 (phone) 856 423 3420 (fax) innshq at talley.com http://www.inns.org ENNS Membership University of Skovde P.O. Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.his.se/ida/enns JNNS Membership c/o Professor Takashi Nagano Faculty of Engineering Hosei University 3-7-2, Kajinocho, Koganei-shi Tokyo 184-8584 Japan 81 42 387 6350 (phone and fax) jnns at k.hosei.ac.jp http://jnns.inf.eng.tamagawa.ac.jp/home-j.html ----------------------------------------------------------------- -- ========================================================= Mieko Namba Secretary to Dr.
Mitsuo Kawato Editorial Administrator of NEURAL NETWORKS ATR-I, Human Information Science Laboratories Department 3 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan TEL +81-774-95-1058 FAX +81-774-95-2647 E-MAIL mieko at atr.co.jp ========================================================= MY EMAIL ADDRESS HAS BEEN CHANGED FROM OCT.1, 2001 ========================================================= From bbs at bbsonline.org Mon Oct 22 14:41:14 2001 From: bbs at bbsonline.org (Stevan Harnad - Behavioral & Brain Sciences (Editor)) Date: Mon, 22 Oct 2001 14:41:14 -0400 Subject: Rachlin: ALTRUISM AND SELFISHNESS: BBS Call for Commentators Message-ID: Dear Dr. Connectionists List User, Below is the abstract of a forthcoming BBS target article ALTRUISM AND SELFISHNESS by Howard Rachlin http://www.bbsonline.org/Preprints/Rachlin/Referees/ This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please reply by EMAIL within three (3) weeks to: calls at bbsonline.org The Calls are sent to 10,000 BBS Associates, so there is no expectation (indeed, it would be calamitous) that each recipient should comment on every occasion! Hence there is no need to reply except if you wish to comment, or to nominate someone to comment. If you are not a BBS Associate, please approach a current BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work to nominate you. All past BBS authors, referees and commentators are eligible to become BBS Associates. A full electronic list of current BBS Associates is available at this location to help you select a name: http://www.bbsonline.org/Instructions/assoclist.html If no current BBS Associate knows your work, please send us your Curriculum Vitae and BBS will circulate it to appropriate Associates to ask whether they would be prepared to nominate you. (In the meantime, your name, address and email address will be entered into our database as an unaffiliated investigator.) To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the online BBSPrints Archive, at the URL that follows the abstract below. _____________________________________________________________ ALTRUISM AND SELFISHNESS Howard Rachlin Psychology Department State University of New York Stony Brook, New York, 11794-2500 KEYWORDS: addiction, altruism, commitment, cooperation, defection, egoism, impulsiveness, patterning, prisoners dilemma, reciprocation, reinforcement, selfishness, self-control ABSTRACT: Many situations in human life present choices between (a) narrowly preferred particular alternatives and (b) narrowly less preferred (or aversive) particular alternatives that nevertheless form part of highly preferred abstract behavioral patterns. Such alternatives characterize problems of self-control. 
For example, at any given moment, a person may accept alcoholic drinks yet also prefer being sober to being drunk over the next few days. Other situations present choices between (a) alternatives beneficial to an individual and (b) alternatives that are less beneficial (or harmful) to the individual that would nevertheless be beneficial if chosen by many individuals. Such alternatives characterize problems of social cooperation; choices of the latter alternative are generally considered to be altruistic. Altruism, like self-control, is a valuable temporally-extended pattern of behavior. Like self-control, altruism may be learned and maintained over an individual's lifetime. It needs no special inherited mechanism. Individual acts of altruism, each of which may be of no benefit (or of possible harm) to the actor, may nevertheless be beneficial when repeated over time. However, because each selfish decision is individually preferred to each altruistic decision, people can benefit from altruistic behavior only when they are committed to an altruistic pattern of acts and refuse to make decisions on a case-by-case basis. http://www.bbsonline.org/Preprints/Rachlin/Referees/ ___________________________________________________________ Please do not prepare a commentary yet. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. We will then let you know whether it was possible to include your name on the final formal list of invitees. _______________________________________________________________________ *** SUPPLEMENTARY ANNOUNCEMENTS *** (1) The authors of scientific articles are not paid money for their refereed research papers; they give them away. What they want is to reach all interested researchers worldwide, so as to maximize the potential research impact of their findings. Subscription/Site-License/Pay-Per-View costs are accordingly access-barriers, and hence impact-barriers for this give-away research literature. There is now a way to free the entire refereed journal literature, for everyone, everywhere, immediately, by mounting interoperable university eprint archives, and self-archiving all refereed research papers in them. Please see: http://www.eprints.org http://www.openarchives.org/ http://www.dlib.org/dlib/december99/12harnad.html --------------------------------------------------------------------- (2) All authors in the biobehavioral and cognitive sciences are strongly encouraged to self-archive all their papers in their own institution's Eprint Archives or in CogPrints, the Eprint Archive for the biobehavioral and cognitive sciences: http://cogprints.soton.ac.uk/ It is extremely simple to self-archive and will make all of our papers available to all of us everywhere, at no cost to anyone, forever. Authors of BBS papers wishing to archive their already published BBS Target Articles should submit them to the BBSPrints Archive. Information about the archiving of BBS' entire backcatalogue will be sent to you in the near future. Meantime please see: http://www.bbsonline.org/help/ and http://www.bbsonline.org/Instructions/ --------------------------------------------------------------------- (3) Call for Book Nominations for BBS Multiple Book Review In the past, Behavioral and Brain Sciences (BBS) had only been able to do 1-2 BBS multiple book treatments per year, because of our limited annual page quota.
BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates and biobehavioral/cognitive scientists in general to nominate books you would like to see accorded BBS multiple book review. (Authors may self-nominate, but books can only be selected on the basis of multiple nominations.) It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!). *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-* Please note: Your email address has been added to our user database for Calls for Commentators, which is the reason you received this email. If you do not wish to receive further Calls, please feel free to change your mailshot status through your User Login link on the BBSPrints homepage, using your username and password above: http://www.bbsonline.org/ For information about the mailshot, please see the help file at: http://www.bbsonline.org/help/node5.html#mailshot *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-* From malchiod at laren.usr.dsi.unimi.it Mon Oct 22 09:48:08 2001 From: malchiod at laren.usr.dsi.unimi.it (Dario Malchiodi) Date: Mon, 22 Oct 2001 15:48:08 +0200 (CEST) Subject: Course at the International School on Neural Nets "E. R. Caianiello" Message-ID: Enclosed you will find the programme of the course FROM SYNAPSES TO RULES: DISCOVERING SYMBOLIC RULES FROM NEURAL PROCESSED DATA. Dario Malchiodi malchiodi at dsi.unimi.it ----------------------------------------------------- GALILEO GALILEI FOUNDATION WORLD FEDERATION OF SCIENTISTS ETTORE MAJORANA CENTRE FOR SCIENTIFIC CULTURE GALILEO GALILEI CELEBRATIONS Four Centuries Since the Birth of MODERN SCIENCE INTERNATIONAL SCHOOL ON NEURAL NETS "E. R. CAIANIELLO" 5th Course: FROM SYNAPSES TO RULES: DISCOVERING SYMBOLIC RULES FROM NEURAL PROCESSED DATA ERICE-SICILY: 25 FEBRUARY - 7 MARCH 2002 Sponsored by the: International Institute for Advanced Scientific Studies (IIASS) Italian Ministry of Education, University, Scientific Research and Technology Sicilian Regional Government Italian Society for Neural Networks (SIREN) University of Salerno University of Milan PROGRAMME AND LECTURERS Inferential bases for learning Theoretical foundations for soft computing Integration of symbolic-subsymbolic reasoning methods Physics and metaphysics of learning Toward applications * B. Apolloni, University of Milan, I * D. Malchiodi, University of Milan, I * D. Mundici, University of Milan, I * M. Gori, University of Siena, I * F. Kurfess, California Polytechnic State Univ., San Luis Obispo, CA, USA * A. Roy, Arizona State University, Tempe, AZ, USA * R. Sun, University of Missouri-Columbia, MO, USA * L. Agnati, Karolinska Institutet, Stockholm, S * G. Basti, Pontificia Università Lateranense, Rome, I * G. Biella, C.N.R. LITA, Milan, I * J. G. Taylor, King's College, London, UK * A. Esposito, Istituto Italiano Alti Studi Scientifici, Vietri, I * A. Moise, Boise State University, ID, USA PURPOSE OF THE COURSE The school aims at establishing a theoretical and applicative framework for extracting formal rules from data.
To this end, the modern approaches will be expounded that collapse the two typical goals of conventional AI and connectionism - respectively, deducing within an axiomatic shell formal rules about a phenomenon and inferring its actual behavior from examples - into a challenging inferential framework where we learn from data and understand what we have learnt. The target reads as a translation of the subsymbolic structure of the data - stored in the synapses of a neural network - into formal properties described by rules. To capture this trip from synapses to rules and then render it manageable for tackling real-world learning tasks, the Course will deal in depth with the following aspects: i. theoretical foundations of learning algorithms and soft computing, ii. intimate relationships between symbolic and subsymbolic reasoning methods, iii. integration of the related hosting architectures in both physiological and artificial brains. APPLICATIONS Interested candidates should send a letter to: * Professor Bruno Apolloni - Dipartimento di Scienze dell'Informazione Università degli Studi di Milano Via Comelico 39/41 20135 Milano, Italy Tel: ++39.02.5835.6284, Fax: ++39.02.5835.6228 e-mail: apolloni at dsi.unimi.it specifying: i) date and place of birth and present activity; ii) nationality. Thanks to the generosity of the sponsoring Institutions, partial support can be granted to some deserving students who need financial aid. Requests to this effect must be specified and justified in the letter of application. Notification of acceptance will be sent by the end of January 2002. * PLEASE NOTE Participants must arrive in Erice on 25 February, not later than 5 p.m. Closing date for applications: December 15, 2001 No special application form is required. POETIC TOUCH According to legend, Erice, son of Venus and Neptune, founded a small town on top of a mountain (750 meters above sea level) more than three thousand years ago. The founder of modern history - i.e. the recording of events in a methodic and chronological sequence as they really happened without reference to mythical causes - the great Thucydides (~500 B.C.), writing about events connected with the conquest of Troy (1183 B.C.), says: "After the fall of Troy some Trojans on their escape from the Achaei arrived in Sicily on boats and as they settled near the border with the Sicanians all together they were named Elymi: their towns were Segesta and Erice". This inspired Virgil to describe the arrival of the Trojan royal family in Erice and the burial of Anchise, by his son Enea, on the coast below Erice. Homer (~1000 B.C.), Theocritus (~300 B.C.), Polybius (~200 B.C.), Virgil (~50 B.C.), Horace (~20 B.C.), and others have celebrated this magnificent spot in Sicily in their poems. During seven centuries (XIII-XIX) the town of Erice was under the leadership of a local oligarchy, whose wisdom assured a long period of cultural development and economic prosperity which in turn gave rise to the many churches, monasteries and private palaces which you see today. In Erice you can admire the Castle of Venus, the Cyclopean Walls (~800 B.C.) and the Gothic Cathedral (~1300 A.D.). Erice is at present a mixture of ancient and medieval architecture. Other masterpieces of ancient civilization are to be found in the neighbourhood: at Motya (Phoenician), Segesta (Elymian), and Selinunte (Greek). On the Aegadian Islands - theatre of the decisive naval battle of the first Punic War (264-241 B.C.)
- suggestive neolithic and paleolithic vestiges are still visible: the grottoes of Favignana, the carvings and murals of Levanzo. Splendid beaches are to be found at San Vito Lo Capo, Scopello, and Cornino, and a wild and rocky coast around Monte Cofano: all at less than one hour's drive from Erice. More information about this Course and the other activities of the Ettore Majorana Centre can be found on the WWW at the following address: http://www.ccsem.infn.it B. APOLLONI, A. MOISE DIRECTORS OF THE COURSE M. J. JORDAN, M. MARINARO DIRECTORS OF THE SCHOOL A. ZICHICHI DIRECTOR OF THE CENTRE From moeller at mpipf-muenchen.mpg.de Tue Oct 23 07:02:57 2001 From: moeller at mpipf-muenchen.mpg.de (Ralf Moeller) Date: Tue, 23 Oct 2001 13:02:57 +0200 Subject: postdoctoral position "AMOUSE" project Message-ID: <3BD54E61.6051CC3E@mpipf-muenchen.mpg.de> The Max Planck Institute for Psychological Research in Munich, Germany, invites applications for a Postdoctoral position (4-year appointment, salary BAT IIa/Ib, approx. DEM 71k p.a.) in the research group "Cognitive Robotics", starting at the earliest convenience. The position is funded by the European Community in the project "Artificial Mouse" as part of the initiative "Neuroinformatics for Living Artefacts". The project aims at an improved understanding of the somatosensory (whisker) system in rodents, from transduction to visuo-tactile and sensorimotor integration. As part of the modeling process, an artificial whisker system will be developed and tested on a mobile robot. Candidates should have a background in signal processing, electronics, mechanics, and computer science, as well as an interest in interdisciplinary research in the fields of neuroscience and cognitive science. Experience with image processing, neural networks, and robotics is beneficial. The Max Planck Society seeks to increase the number of female scientists and encourages them to apply. Handicapped persons with comparable qualifications receive preferential status. Please submit a CV, complete academic records, and the name and email address of two academic references to: Administration Max Planck Institute for Psychological Research Amalienstr. 33 D-80799 Munich Germany web site: http://www.mpipf-muenchen.mpg.de For further information, please contact: Dr. Ralf Moeller email: moeller at mpipf-muenchen.mpg.de From plaut at cmu.edu Tue Oct 23 14:39:50 2001 From: plaut at cmu.edu (David Plaut) Date: Tue, 23 Oct 2001 14:39:50 -0400 Subject: Post-Doctoral Positions in Connectionist Modeling of Reading and Language Message-ID: <23079.1003862390@pewee.cnbc.cmu.edu> Post-Doctoral Positions in Connectionist Modeling of Reading and Language Center for the Neural Basis of Cognition and the Department of Psychology, Carnegie Mellon University Two postdoctoral research positions are available in the area of connectionist/neural-network modeling of normal and impaired cognitive processes in reading and language. Topics of particular interest include phonological and lexical development, reading acquisition and developmental and acquired dyslexia, cross-linguistic differences in morphological processing, and neuropsychological impairments of lexical semantic knowledge. Applicants should have expertise in connectionist modeling or in empirical investigation of language-related processes combined with some experience in modeling. 
The positions are for 2-3 years, with salary commensurate with experience, and are affiliated with the Center for the Neural Basis of Cognition (http://www.cnbc.cmu.edu) and the Department of Psychology (http://www.psy.cmu.edu) at Carnegie Mellon. Please send CV, a description of research experience and interests, copies of representative publications, and three letters of reference by January 1, 2002 to Dr. David Plaut, Center for the Neural Basis of Cognition, Mellon Institute 115, 4400 Fifth Avenue, Pittsburgh PA 15213-2683, USA. Carnegie Mellon is an AA/EEO employer. From cindy at cns.bu.edu Wed Oct 24 13:59:46 2001 From: cindy at cns.bu.edu (Cynthia Bradford) Date: Wed, 24 Oct 2001 13:59:46 -0400 Subject: 6th ICCNS: Call for Abstracts Message-ID: <200110241759.NAA10020@retina.bu.edu> Apologies if you receive this more than once. ***** CALL FOR ABSTRACTS ***** SIXTH INTERNATIONAL CONFERENCE ON COGNITIVE AND NEURAL SYSTEMS Tutorials: May 29, 2002 Meeting: May 30 - June 1, 2002 Boston University 677 Beacon Street Boston, Massachusetts 02215 http://www.cns.bu.edu/meetings/ Sponsored by Boston University's Center for Adaptive Systems and Department of Cognitive and Neural Systems with financial support from the Office of Naval Research This interdisciplinary conference has drawn about 300 people from around the world each time that it has been offered. Last year's conference was attended by scientists from 31 countries. The conference is structured to facilitate intense communication between its participants, both in the formal sessions and during its other activities. As during previous years, the conference will focus on solutions to the fundamental questions: How Does the Brain Control Behavior? How Can Technology Emulate Biological Intelligence? The conference will include invited tutorials and lectures, and contributed lectures and posters by experts on the biology and technology of how the brain and other intelligent systems adapt to a changing world. The conference is aimed at researchers and students of computational neuroscience, connectionist cognitive science, artificial neural networks, neuromorphic engineering, and artificial intelligence. A single oral or poster session enables all presented work to be highly visible. Abstract submissions encourage submissions of the latest results. Costs are kept at a minimum without compromising the quality of meeting handouts and social events. Confirmed invited speakers include: Dana Ballard Jeff Bowers Daniel Bullock Edward M. Callaway Gail Carpenter Bart Ermentrout David Field Mark Gluck Stephen Grossberg Frank Guenther Daniel Johnston Philip J. Kellman Stephen G. Lisberger James McClelland Ferdinando Mussa-Ivaldi Lynn Nadel Erkki Oja Randall O'Reilly Michael Page John Rinzel Edmund Rolls Daniel Schacter Wolfram Schultz Rudiger von der Heydt CALL FOR ABSTRACTS Session Topics: * vision * spatial mapping and navigation * object recognition * neural circuit models * image understanding * neural system models * audition * mathematics of neural systems * speech and language * robotics * unsupervised learning * hybrid systems (fuzzy, evolutionary, digital) * supervised learning * neuromorphic VLSI * reinforcement and emotion * industrial applications * sensory-motor control * cognition, planning, and attention * other Contributed abstracts must be received, in English, by January 31, 2002. Notification of acceptance will be provided by email by February 28, 2002. A meeting registration fee must accompany each Abstract. 
See Registration Information below for details. The fee will be returned if the Abstract is not accepted for presentation and publication in the meeting proceedings. Registration fees of accepted Abstracts will be returned on request only until April 19, 2002. Each Abstract should fit on one 8.5" x 11" white page with 1" margins on all sides, single-column format, single-spaced, Times Roman or similar font of 10 points or larger, printed on one side of the page only. Fax submissions will not be accepted. Abstract title, author name(s), affiliation(s), and mailing and email address(es) should begin each Abstract. An accompanying cover letter should include: Full title of Abstract; corresponding author and presenting author name, address, telephone, fax, and email address; requested preference for oral or poster presentation; and a first and second choice from the topics above, including whether it is biological (B) or technological (T) work. Example: first choice: vision (T); second choice: neural system models (B). (Talks will be 15 minutes long. Posters will be up for a full day. Overhead, slide, VCR, and LCD projector facilities will be available for talks.) Abstracts which do not meet these requirements or which are submitted with insufficient funds will be returned. Accepted Abstracts will be printed in the conference proceedings volume. No paper longer than the abstract will be required. The original and 3 copies of each Abstract should be sent to: Cynthia Bradford, Boston University, Department of Cognitive and Neural Systems, 677 Beacon Street, Boston, MA 02215. REGISTRATION INFORMATION: Early registration is recommended. To register, please fill out the registration form below. Student registrations must be accompanied by a letter of verification from a department chairperson or faculty/research advisor. If accompanied by an Abstract or if paying by check, mail to the address above. If paying by credit card, mail as above, or fax to (617) 353-7755, or email to cindy at cns.bu.edu. The registration fee will help to pay for a reception, 6 coffee breaks, and the meeting proceedings. STUDENT FELLOWSHIPS: Fellowships for PhD candidates and postdoctoral fellows are available to help cover meeting travel and living costs. The deadline to apply for fellowship support is January 31, 2002. Applicants will be notified by email by February 28, 2002. Each application should include the applicant's CV, including name; mailing address; email address; current student status; faculty or PhD research advisor's name, address, and email address; relevant courses and other educational data; and a list of research articles. A letter from the listed faculty or PhD advisor on official institutional stationery should accompany the application and summarize how the candidate may benefit from the meeting. Fellowship applicants who also submit an Abstract need to include the registration fee with their Abstract submission. Those who are awarded fellowships are required to register for and attend both the conference and the day of tutorials. Fellowship checks will be distributed after the meeting. 
REGISTRATION FORM Sixth International Conference on Cognitive and Neural Systems Department of Cognitive and Neural Systems Boston University 677 Beacon Street Boston, Massachusetts 02215 Tutorials: May 29, 2002 Meeting: May 30 - June 1, 2002 FAX: (617) 353-7755 http://www.cns.bu.edu/meetings/ (Please Type or Print) Mr/Ms/Dr/Prof: _____________________________________________________ Name: ______________________________________________________________ Affiliation: _______________________________________________________ Address: ___________________________________________________________ City, State, Postal Code: __________________________________________ Phone and Fax: _____________________________________________________ Email: _____________________________________________________________ The conference registration fee includes the meeting program, reception, two coffee breaks each day, and meeting proceedings. The tutorial registration fee includes tutorial notes and two coffee breaks. CHECK ONE: ( ) $85 Conference plus Tutorial (Regular) ( ) $55 Conference plus Tutorial (Student) ( ) $60 Conference Only (Regular) ( ) $40 Conference Only (Student) ( ) $25 Tutorial Only (Regular) ( ) $15 Tutorial Only (Student) METHOD OF PAYMENT (please fax or mail): [ ] Enclosed is a check made payable to "Boston University". Checks must be made payable in US dollars and issued by a US correspondent bank. Each registrant is responsible for any and all bank charges. [ ] I wish to pay my fees by credit card (MasterCard, Visa, or Discover Card only). Name as it appears on the card: _____________________________________ Type of card: _______________________________________________________ Account number: _____________________________________________________ Expiration date: ____________________________________________________ Signature: __________________________________________________________ From robbie at bcs.rochester.edu Wed Oct 24 11:24:42 2001 From: robbie at bcs.rochester.edu (Robbie Jacobs) Date: Wed, 24 Oct 2001 11:24:42 -0400 (EDT) Subject: postdoc position available Message-ID: Below is an ad for a postdoctoral position in my laboratory. Although the readers of this list are primarily computational, you (or someone you know) may be interested in gaining expertise in visual psychophysics and virtual reality. Robert Jacobs Brain and Cognitive Sciences Center for Visual Science University of Rochester ============================================== POSTDOCTORAL FELLOW POSITION IN VISUAL PSYCHOPHYSICS PI: Robert Jacobs Center for Visual Science Department of Brain and Cognitive Sciences University of Rochester A postdoctoral position is available immediately in the lab of Robert Jacobs, Center for Visual Science and the Department of Brain and Cognitive Sciences, University of Rochester. The lab focuses on experimental and computational studies of visual learning with respect to mid-level and high-level visual functions, particularly on experience-dependent perception of visual depth. Some projects in our lab study observers' abilities to recalibrate their interpretations of individual visual cues, other projects study how observers adapt their visual cue combination strategies, and still other projects examine how information from other perceptual modalities (such as haptic or auditory percepts) influence how observers interpret and combine information from visual cues. 
We have a well-equipped lab that includes access to a wide variety of virtual reality equipment for creating visual, auditory, and haptic environments. You can learn more about our lab (and obtain several of our papers) from: http://www.bcs.rochester.edu/bcs/people/faculty/robbie/robbie.html You can learn more about the Department of Brain and Cognitive Sciences from: http://www.bcs.rochester.edu You can learn more about the Center for Visual Science from: http://www.cvs.rochester.edu Interested candidates should send a vita, a research statement, recent publications, and the names of three individuals who can write letters of recommendation to: Robert Jacobs Brain and Cognitive Sciences Meliora Hall, River Campus University of Rochester Rochester, NY 14627-0268 robbie at bcs.rochester.edu From kivinen at axiom.anu.edu.au Thu Oct 25 02:07:51 2001 From: kivinen at axiom.anu.edu.au (Jyrki Kivinen) Date: Thu, 25 Oct 2001 16:07:51 +1000 (EST) Subject: Call for papers: COLT 2002 Message-ID: Call for Papers: Fifteenth Annual Conference on Computational Learning Theory The Fifteenth Annual Conference on Computational Learning Theory (COLT 2002) will be held during the week July 8-12, 2002 in Sydney, Australia. The conference will be co-located with ICML-2002. We invite submission of papers about the theory of machine learning. Possible topics include: * analysis of learning algorithms for specific classes of hypotheses, including established classes (e.g. neural networks, graphical models, decision trees, logical formulae, automata, pattern languages, grammars) and new classes; * bounds on the generalization ability of learning algorithms; * learning algorithms based on large margin hypotheses (SVM, boosting); * worst-case relative loss bounds for sequential prediction algorithms; * analysis of adaptive algorithms for decision, planning and control; * bounds on the computational complexity of learning; * learning with queries and learning in the limit; * new learning models that either capture important details of specific applications or that address general issues in a new way. We also welcome theoretical papers about learning that do not fit into the above categories; we are particularly interested in papers that include ideas and viewpoints that are new to the COLT community. While the primary focus of the conference is theoretical, papers can be strengthened by the inclusion of relevant experimental results. Papers that have appeared in journals or other conferences, or that are being submitted to other conferences, are not appropriate for submission to COLT. Paper submissions: We will be setting up a server to receive electronic submissions. Although electronic submissions are preferred, hard-copy submissions will also be possible. Details of the submission procedure will be made available on the conference web page http://www.learningtheory.org/colt2002. Please check this page for updates on submission and conference details. If you have questions, send e-mail to the program co-chairs (Jyrki.Kivinen at faceng.anu.edu.au, rsloan at nsf.gov). Important dates: Submissions, electronic or hard-copy, must be received by 23:59 GMT on Monday, January 28, 2002. Authors will be notified of acceptance or rejection on or before Friday April 5, 2002. Final camera-ready versions must be received by Friday April 19. 
Submission format: Unlike previous COLT conferences, we are asking the authors to submit a full paper, which should be in the Springer LNAI format (see http://www.springer.de/comp/lncs/authors.html) and no longer than 15 pages. Authors not using LaTeX2e are asked to contact the program chairs well in advance of the submission deadline. The paper should include a clear definition of the theoretical model used and a clear description of the results, as well as a discussion of their significance, including comparison to other work. Proofs or proof sketches should be included. Conference chair: Arun Sharma (Univ. of New South Wales) Program co-chairs: Jyrki Kivinen (Australian National Univ.) and Bob Sloan (NSF and Univ. of Illinois, Chicago). Program committee: Dana Angluin (Yale), Javed Aslam (Dartmouth), Peter Bartlett (BIOwulf Technologies), Shai Ben-David (Technion), John Case (Univ. of Delaware), Peter Grunwald (CWI), Ralf Herbrich (Microsoft Research), Mark Herbster (University College London), Gabor Lugosi (Pompeu Fabra University), Ron Meir (Technion), Shahar Mendelson (Australian National Univ.), Michael Schmitt (Ruhr-Universitaet Bochum), Rocco Servedio (Harvard), and Santosh Vempala (MIT) Student travel: We anticipate that some funds will be available to partially support travel by student authors. Eligible authors who wish to apply for travel support should indicate this on their submission's title page. Mark Fulk Award: This award is for the best paper authored or coauthored by a student. Eligible authors who wish to be considered for this prize should indicate this on their submission's title page. From juergen at idsia.ch Thu Oct 25 06:13:21 2001 From: juergen at idsia.ch (juergen@idsia.ch) Date: Thu, 25 Oct 2001 12:13:21 +0200 Subject: Coulomb's law yields support vector machines and more Message-ID: <200110251013.MAA22785@ruebe.idsia.ch> Important recent results by Sepp Hochreiter: Using Coulomb energy as an objective function, he shows that support vector machines can be easily derived from Coulomb's law as taught in first semester courses on physics. His general electrostatic framework greatly simplifies the proofs of well-known SVM theorems, and yields solutions formally identical to well-known SVM types. In addition, it suggests novel kernels and SVMs for kernels that are not positive definite, and even subsumes other methods such as nearest neighbor classifiers, density estimators, clustering algorithms, and vector quantizers. Thus his Coulomb classifiers promise significant advances in several fields. http://www.cs.tu-berlin.de/~hochreit http://www.cs.colorado.edu/~hochreit ftp://ftp.cs.colorado.edu/users/hochreit/papers/cltr.ps.gz @techreport{Hochreiter:2001coulomb, author = {S. Hochreiter and M. C. Mozer}, title = {Coulomb Classifiers: {R}einterpreting {SVM}s as Electrostatic Systems}, institution = {University of Colorado, Boulder, Department of Computer Science}, number = {CU-CS-921-01}, year = {2001}} (BTW, this is the same person who analyzed in rigorous detail the vanishing error problem of standard recurrent nets (1991), and who recently built the first working gradient-based metalearner (ICANN 2001) using LSTM recurrent nets (Neural Comp 97) which by design do not suffer from this problem, and who also invented Flat Minimum Search (Neural Comp 97, 99), a highly competitive method for finding nets with low information-theoretic complexity and high generalization capability.) 
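For readers who want a concrete picture of the Coulomb/SVM analogy sketched above, here is a rough illustration in standard SVM notation; this is only a sketch based on the summary above, not the derivation given in the technical report itself. A kernel classifier has the form

  f(x) = sign( \sum_i \alpha_i y_i k(x, x_i) + b ),  with  \alpha_i >= 0,

and the usual dual objective maximized during training is

  W(\alpha) = \sum_i \alpha_i - (1/2) \sum_{i,j} \alpha_i \alpha_j y_i y_j k(x_i, x_j).

In the electrostatic reading, each training point x_i is treated as a particle carrying a charge of magnitude \alpha_i and sign y_i, the kernel k plays the role of the Coulomb interaction between two such particles, the quadratic term of W(\alpha) corresponds (up to sign and constants) to the electrostatic energy of the charge configuration, and the decision boundary is the zero set of the induced potential f. Kernels that are not positive definite then simply amount to other interaction laws; see the technical report above for the actual framework and proofs.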
------------------------------------------------- Juergen Schmidhuber director IDSIA, Galleria 2, 6928 Manno-Lugano, Switzerland juergen at idsia.ch www.idsia.ch/~juergen From cjlin at csie.ntu.edu.tw Thu Oct 25 06:19:32 2001 From: cjlin at csie.ntu.edu.tw (Chih-Jen Lin) Date: Thu, 25 Oct 2001 18:19:32 +0800 Subject: CFP: special issue on Support Vector Machine Message-ID: CALL FOR PAPERS: special issue on SVM NEUROCOMPUTING An International Journal published by Elsevier Science B.V., vol. 42-47, 24 issues, in 2002 ISSN 0925-2312, URL: http://www.elsevier.nl/locate/neucom Special Issue on Support Vector Machines Paper Submission Deadline: February 28, 2002 Further information: http://www.csie.ntu.edu.tw/~cjlin/svmcfp.html Support Vector Machines are an active area of research, closely related to statistical learning theory, with numerous applications. The basic form of the corresponding learning algorithm can be shown to correspond to a linear model used in a high-dimensional space non-linearly related to the input space. A key advantage is the low complexity of the operations performed, achieved by using kernels in the input space. The construction of the high-dimensional space is based on a subset of the original input data, the so-called support vectors. In practical implementations, the corresponding quadratic programming problem is solved using different optimization techniques. Their strong performance in applications has increased interest in this approach, with application areas including pattern recognition, computer vision, and biomedical analysis. The Neurocomputing journal invites original contributions for the forthcoming special issue on Support Vector Machines from a broad scope of areas. Some topics relevant to this special issue include, but are not restricted to: -- Theoretical foundations, algorithms, and implementations -- Model selection and hyperparameter tuning -- Choosing kernels for special situations -- Probabilistic treatment of SVMs -- SVM methods for large scale problems -- Benchmarking SVMs against other methods -- Feature selection for SVMs -- Key applications including, but not restricted to, data mining, bioinformatics, text categorization, machine vision, etc. Please send two hardcopies of the manuscript before February 28, 2002 to: V. David Sanchez A., Neurocomputing - Editor in Chief - Advanced Computational Intelligent Systems P.O. Box 60130, Pasadena, CA 91116-6130, U.S.A. Street address: 1149 Wotkyns Drive Pasadena, CA 91103, U.S.A. Fax: +1-626-793-5120 Email: vdavidsanchez at earthlink.net including abstract, keywords, a cover page containing the title and author names, the corresponding author's complete address including telephone, fax, and email address, and a clear indication that it is a submission to the Special Issue on Support Vector Machines. Guest Editors Colin Campbell Department of Engineering Mathematics Bristol University, Bristol BS8 1TR United Kingdom Phone: (+44) (0) 117 928 9858 Fax: (+44) (0)117-925-1154 Email: C.Campbell at bristol.ac.uk Chih-Jen Lin Department of Computer Science and Information Engineering National Taiwan University Taipei, Taiwan, 106 Phone: (+886) 2-2362-5336 x 413 Fax: (+886) 2-2362-8167 Email: cjlin at csie.ntu.edu.tw S. Sathiya Keerthi Department of Mechanical Engineering National University of Singapore 10 KentRidge Crescent Singapore 119260 Republic of Singapore Phone: (+65) 874-4684 Fax: (+65) 779-1459 Email: mpessk at guppy.mpe.nus.edu.sg V. 
David Sanchez A., Neurocomputing - Editor in Chief - Advanced Computational Intelligent Systems P.O. Box 60130 Pasadena, CA 91116-6130 U.S.A. Fax: +1-626-793-5120 Email: vdavidsanchez at earthlink.net From erik at bbf.uia.ac.be Fri Oct 26 14:13:15 2001 From: erik at bbf.uia.ac.be (Erik De Schutter) Date: Fri, 26 Oct 2001 20:13:15 +0200 Subject: European short-term fellowships in neuroinformatics and computational neuroscience Message-ID: The EU Thematic Network Computational Neuroscience and Neuroinformatics offers short-term fellowships to nationals of the EU and associated countries for visits to laboratories in the EU or associated countries. Fellowships cover travel and accommodation costs for a visit of 2 to 12 weeks. Applications are evaluated once every month on a competitive basis. Fellowships can be awarded within three months of the application. The award per fellowship varies between 600 and 2000 Euro. More information and electronic application is available at http://www.neuroinf.org Prof. E. De Schutter University of Antwerp, Belgium coordinator of the Thematic Network Computational Neuroscience and Neuroinformatics http://www.bbf.uia.ac.be From Roberto.Prevete at na.infn.it Mon Oct 29 11:56:51 2001 From: Roberto.Prevete at na.infn.it (Roberto Prevete) Date: Mon, 29 Oct 2001 11:56:51 -0500 Subject: book announcement Message-ID: <3BDD8A53.FFE58765@na.infn.it> Book announcement: AN ALGEBRAIC APPROACH TO THE AUTONOMOUSLY SELF-ADAPTABLE BOOLEAN NEURAL NETS 299p F.E.LAURIA, R.PREVETE Liguori Editore, 2001. ISBN 99-207-3266-1 free on request, lauria at na.infn.it For more details visit our homepage: http://www.na.infn.it/Gener/cyber/report.html into folder NeuralNetworks/News Here is a brief description: We present an algebraic approach to the autonomously self-adaptable boolean neural nets. Starting from Caianiello's nets we introduce a Boolean Neural Net (BNN) as a control structure and we set the data structure embedded in a BNN. Introducing the Hebbian rule we set some sufficient conditions in order to obtain an Adaptable Boolean Neural Network (ABNN) as a control structure. Starting from these architectures we present a multitasking architecture (GABNN) proving its training universality. From ascoli at gmu.edu Tue Oct 30 10:54:59 2001 From: ascoli at gmu.edu (Giorgio Ascoli) Date: Tue, 30 Oct 2001 10:54:59 -0500 Subject: postdoc opening - computational neuroscience Message-ID: <3BDECD53.41CE10BE@gmu.edu> Please post and distribute as you see fit (my apologies for cross-listing). Giorgio Ascoli COMPUTATIONAL NEUROSCIENCE POST-DOCTORAL POSITION AVAILABLE A post-doctoral position is available immediately for computational modeling of dendritic morphology, neuronal connectivity, and electrophysiology. All highly motivated candidates with a recent PhD in biology, computer science, physics, or other areas related to Neuroscience (including MD or engineering degree) are encouraged to apply. C programming skills and/or experience with NEURON, GENESIS or other modeling packages are desirable but not necessary. The post-doc will join a young and dynamic research group at the Krasnow Institute for Advanced Study, located in Fairfax, VA (20 miles west of Washington DC). The initial research project is focused on anatomically and biophysically detailed electrophysiological simulations of hippocampal neurons to study synaptic integration and the structure/activity/function relationship at the cellular and network level. 
The post-doc will be hired as a Research Assistant Professor (with VA state employee benefits) with a salary based on the NIH postdoctoral scale, and will have generous office space, a new computer, and full-time access to Silicon Graphics and Linux servers and consoles. Send CV, (p)reprints, a brief description of your motivation, and names, email addresses and phone/fax numbers of references to: ascoli at gmu.edu (or by fax at the number below) ASAP. There is no deadline but the position will be filled as soon as a suitable candidate is found. Non-resident aliens are welcome to apply. The Krasnow Institute is an equal opportunity employer. Computational Neuroanatomy Group: http://www.krasnow.gmu.edu/L-Neuron/ Krasnow Institute for Advanced Study: http://www.krasnow.org George Mason University: http://www.gmu.edu ------------------- Giorgio Ascoli, PhD Head, Computational Neuroanatomy Group Krasnow Institute for Advanced Study and Department of Psychology - MS2A1 George Mason University, Fairfax, VA 22030-4444 Web: www.krasnow.gmu.edu/ascoli Ph. (703)993-4383 Fax (703)993-4325 From N.Chater at warwick.ac.uk Tue Oct 30 13:43:39 2001 From: N.Chater at warwick.ac.uk (Nick Chater) Date: Tue, 30 Oct 2001 18:43:39 +0000 Subject: 8 Cog Science PhD and research positions at Warwick Message-ID: The Institute for Applied Cognitive Science at the University of Warwick is pleased to announce 5 research positions and 3 funded PhD studentships. The Institute has recently been set up with a $1.5M grant from the Wellcome Trust and Economic and Social Research Council, and has received a similar amount of funding from government, commercial and charitable sources. It has first class computational and experimental facilities, including 128 channel ERP, 3 eye-trackers, and virtual reality and movement monitoring. The Institute for Applied Cognitive Science has links with internationally known faculty in the Department of Psychology, Department of Computer Science, Warwick Business School, Mathematics Institute, and Institute of Education. The positions are: (a) 2 PhD studentships on corpus analysis and experimental language research (b) 1 PhD studentship on basic processes of learning and memory (c) 2 four-year research positions on early reading (d) 3 two-year research positions on financial decision making All projects are directed by Prof Nick Chater and colleagues at the Institute. Please contact Nick Chater (nick.chater at warwick.ac.uk) if you are interested in any of these positions. Further details appear below: (a) 2 PhD studentships on corpus analysis and experimental language research (b) 1 PhD studentship on basic processes of learning and memory (c) 2 four-year research positions on early reading (d) 3 two-year research positions on financial decision making (a) 2 PhD studentships on corpus analysis and experimental language research This project is funded by the Human Frontiers Science Program, and links Warwick with laboratories in the US, France and Japan, to study the interaction of multiple cues to syntactic category identity across languages. The Warwick research theme will focus on corpus analysis and some experimental research with adults. An ideal candidate would have a strong background (good undergraduate degree and preferably Masters level research experience) in corpus analysis, computation, linguistics, cognitive science or psycholinguistics. The successful applicants will work alongside a further 3 PhD students currently working on related topics. 
The project will be directed in Warwick by Nick Chater, and the whole research network is coordinated by Morten Christiansen, at Cornell. The studentships are open to people of any nationality, and are available with immediate effect. (b) 1 PhD studentship on basic processes of learning and memory This project is funded by a European Training Network on "Basic processes of learning and memory", and links Warwick with laboratories in London, France and Belgium. An ideal candidate would have a strong background (good undergraduate degree and preferably Masters level research experience) in cognitive psychology, cognitive science, computation, linguistics, or related discipline. The project will be directed in Warwick by Nick Chater, and the whole research network is coordinated by Robert French, at the University of Liege. There are various eligibility requirements attached to this funding, the most important of which is that applicants must be from an EU state (or affiliated state), but not from the UK. The studentship is available with immediate effect. (c) 2 four-year research positions on early reading This project represents a new stage in a long-term on-going research project, funded by the Leverhulme Trust (a charitable foundation) and Essex Local Education Authority. The goal of the research is to develop instructional principles based on cognitive science that substantially enhance children's ability to learn to read. The project has already shown some dramatic gains in large scale classroom studies. One post will be primarily concerned with laboratory experimental research on basic principles of learning, with adults and children, and would be based at Warwick University. The second post will be based in Essex, and will involve designing and implementing classroom based studies. The researchers will work as part of a large interdisciplinary team of researchers and educators. Ideal applicants would have graduate level or beyond experience in research in cognitive psychology, cognitive science or education, and preferably, though not necessarily, prior knowledge of reading research. The project will be directed by Nick Chater, Jonathan Solity and Gordon Brown. The positions are open to people of any nationality, and the project is likely to start around Jan 1, 2002 (although precise timing and funding details are still to be confirmed). (d) 3 two-year research positions on financial decision making Three two-year positions for two postdoctoral researchers and a research associate are opening up at the newly founded research group, based at the Institute for Applied Cognitive Science, University of Warwick. The group will pursue fundamental and applied research on human decision making with relevance to the finance industry. Ideal candidates would have strong backgrounds in cognitive psychology, experimental economics, or cognitive science; IT skills would also be advantageous. Successful applicants will work in a research group with several other post-doctoral and post-graduate researchers. The positions are open to people of any nationality, and the project is likely to start around Jan 1, 2002 (although precise timing and funding details are still to be confirmed). From CogSci at psyvax.psy.utexas.edu Tue Oct 30 15:09:28 2001 From: CogSci at psyvax.psy.utexas.edu (Cognitive Science Society) Date: Tue, 30 Oct 2001 14:09:28 -0600 Subject: Rumelhart Prize Message-ID: <5.0.0.25.2.20011030140638.02b567c0@psy.utexas.edu> ANNOUNCEMENT AND CALL FOR NOMINATIONS: THE THIRD ANNUAL DAVID E. 
RUMELHART PRIZE FOR CONTRIBUTIONS TO THE FORMAL ANALYSIS OF HUMAN COGNITION The recipient of the Third Annual David E. Rumelhart Prize will be chosen during the first part of 2002. The winner will be announced at the 2002 Meeting of the Cognitive Science Society, and will receive the prize and give the Prize Lecture at the 2003 Meeting. The prize is awarded annually to an individual or collaborative team making a significant contemporary contribution to the formal analysis of human cognition. Mathematical modeling of human cognitive processes, formal analysis of language and other products of human cognitive activity, and computational analyses of human cognition using symbolic or non-symbolic frameworks all fall within the scope of the award. The Prize itself will consist of a certificate, a citation of the awardee's contribution, and a monetary award of $100,000. Nomination, Selection and Award Presentation For the Third Annual Prize, the selection committee will continue to consider nominations previously submitted. The committee invites updates to existing nominations as well as new nominations. Materials should be sent to the Prize Administration address at the end of this announcement. To be considered in the committee's deliberations for the Third David E. Rumelhart Prize, materials must be received by Friday, January 11, 2002. Nominations should include six sets of the following materials: (1) A three-page statement of nomination, (2) a complete curriculum vitae and (3) copies of up to five of the nominee's relevant publications. Note that the nominee may be an individual or a team, and in the case of a team, vitae for all members should be provided. The prize selection committee considers both the scientific contributions and the scientific leadership and collegiality of the nominees, so these issues should be addressed in the statement of nomination. Previous Recipients and Prize-Related Activities Previous winners of the David E. Rumelhart Prize are Geoffrey E. Hinton and Richard M. Shiffrin. Hinton received the First David E. Rumelhart Prize and delivered the Prize Lecture at the 2001 Meeting of the Cognitive Science Society. Shiffrin, the winner of the Second David E. Rumelhart Prize, was announced at the 2001 Meeting of the Cognitive Science Society. He will receive the prize and deliver the Prize Lecture at the 2002 meeting. Funding of the Prize The David E. Rumelhart Prize is funded by the Robert J. Glushko and Pamela Samuelson Foundation, based in San Francisco. Robert J. Glushko is an entrepreneur in Silicon Valley who received a Ph. D. in Cognitive Psychology in 1979 under Rumelhart's supervision. Prize Administration The Rumelhart Prize is administered by the Chair of the Prize Selection Committee in consultation with the Glushko-Samuelson Foundation and the Distinguished Advisory Board. Screening of nominees and selection of the prize winner will be performed by the Prize Selection Committee. Scientific members (including the Chair) of the Prize Selection Committee will serve for up to two four-year terms, and members of this committee will be selected by the Glushko-Samuelson Foundation in consultation with the Distinguished Advisory Board. A representative of the Foundation will also serve on the Prize Selection Committee. Members of the Prize Selection Committee are listed at the end of this announcement. David E. Rumelhart: A Scientific Biography David E. 
Rumelhart has made many contributions to the formal analysis of human cognition, working primarily within the frameworks of mathematical psychology, symbolic artificial intelligence, and parallel distributed processing. He also admired formal linguistic approaches to cognition and explored the possibility of formulating a formal grammar to capture the structure of stories. Rumelhart obtained his undergraduate education at the University of South Dakota, receiving a B.A. in psychology and mathematics in 1963. He studied mathematical psychology at Stanford University, receiving his Ph. D. in 1967. From 1967 to 1987 he served on the faculty of the Department of Psychology at the University of California, San Diego. In 1987 he moved to Stanford University, serving as Professor there until 1998. He has become disabled by Pick's disease, a progressive neurodegenerative illness, and now lives with his brother in Ann Arbor, Michigan. Rumelhart developed models of a wide range of aspects of human cognition, ranging from motor control to story understanding to visual letter recognition to metaphor and analogy. He collaborated with Don Norman and the LNR Research Group to produce "Explorations in Cognition" in 1975 and with Jay McClelland and the PDP Research Group to produce "Parallel Distributed Processing: Explorations in the Microstructure of Cognition" in 1986. He mastered many formal approaches to human cognition, developing his own list processing language and formulating the powerful back-propagation learning algorithm for training networks of neuron-like processing units. Rumelhart was elected to the National Academy of Sciences in 1991 and received many prizes, including a MacArthur Fellowship, the Warren Medal of the Society of Experimental Psychologists, and the APA Distinguished Scientific Contribution Award. Rumelhart articulated a clear view of what cognitive science, the discipline, is or ought to be. He felt that for cognitive science to be a science, it would have to have formal theories, and he often pointed to linguistic theories, as well as to mathematical and computational models, as examples of what he had in mind. Prize Selection Committee Alan Collins Department of Learning Sciences School of Education and Social Policy Northwestern University Mark Liberman Departments of Computer and Information Sciences and Linguistics University of Pennsylvania Anthony J. Marley Department of Psychology McGill University James L. McClelland (Chair) Carnegie Mellon University and Center for the Neural Basis of Cognition Pittsburgh, Pennsylvania Inquiries and Nominations should be sent to David E. Rumelhart Prize Administration Center for the Neural Basis of Cognition 115 Mellon Institute 4400 Fifth Avenue Pittsburgh, PA 15213 412-268-4000 derprize at cnbc.cmu.edu Visit the prize web site at www.cnbc.cmu.edu/derprize ---------- Cognitive Science Society c/o Tanikqua Young Department of Psychology University of Texas Austin, TX 78712 Phone: (512) 471-2030 Fax: (512) 471-3053 ---------- From laubach at jbpierce.org Tue Oct 30 10:17:22 2001 From: laubach at jbpierce.org (Mark Laubach) Date: Tue, 30 Oct 2001 10:17:22 -0500 Subject: postdoc position Message-ID: <3BDEC482.80402@jbpierce.org> POSTDOCTORAL FELLOWSHIP IN SYSTEMS NEUROPHYSIOLOGY In the laboratory of Mark Laubach, Ph.D. Assistant Fellow, J.B. Pierce Laboratory Assistant Professor, Dept. of Neurobiology Yale School of Medicine A postdoctoral position is available immediately in the lab of Mark Laubach at the John B. Pierce Laboratory. 
The focus of the laboratory is to understand how ensembles of neurons work together to represent behaviorally relevant information and how representations of animal behavior are altered in relation to experience. There are three experimental themes of the lab. First, we are examining how tactile and olfactory stimuli are encoded at various levels of the nervous system. Second, we are investigating how representations of stimuli that control behavior are altered by sensorimotor and discrimination learning. Finally, we are beginning to study how sensorimotor capabilities are altered by aging. A major component of this research involves the use of multielectrode recording methods to record simultaneously from groups of neurons in multiple brain areas in awake, behaving animals. The lab is a state-of-the-art facility for carrying out neuronal ensemble recording experiments and for performing quantitative analysis of neuronal ensemble data. We are active in improving methods for neuronal ensemble recording through several collaborative projects, both locally at Yale and elsewhere. A major goal is to carry out spike sorting and spike train analyses on-line and in real experimental time using modern methods for signal processing, statistical pattern recognition, and parallel, real-time computing. Finally, we are engaged in some computational studies to better understand potential network properties that may give rise to the patterns of neuronal ensemble activity that can be used to predict an animal's behavior at a given instant in time. Interested candidates should send a vita, a summary of research experience, and the names of three individuals who can write letters of recommendation to: Mark Laubach, Ph.D. The John B. Pierce Laboratory Yale School of Medicine 290 Congress Ave New Haven CT 06519 laubach at jbpierce.org http://www.jbpierce.org/staff/laubach.html From R.M.Everson at exeter.ac.uk Wed Oct 31 16:21:21 2001 From: R.M.Everson at exeter.ac.uk (R.M.Everson) Date: 31 Oct 2001 21:21:21 +0000 Subject: Postdoctoral fellowship in Critical Systems and Data-Driven Technology Message-ID: RESEARCH FELLOW in Critical Systems and Data-Driven Technology Department of Computer Science and School of Mathematical Sciences Exeter University Highly motivated candidates are sought for a post-doctoral position to join an EPSRC funded project applying inductive technologies (neural networks, Bayes nets etc) to safety critical systems. This project is a collaboration with the National Air Traffic Service and the Royal London Hospital, who will be closely involved. We shall address the theoretical and practical issues of managing critical systems with inductive technologies. We are looking for post-doctoral workers with a strong background in machine learning, data analysis or statistical classification, together with an interest in applications. The Pattern Analysis and Statistics groups at the University of Exeter have a strong tradition in data analysis, statistical modelling, pattern recognition and critical systems. Successful applicants will join a team working with Professors Partridge (Computer Science) and Krzanowski (Statistics), and Dr Everson (Computer Science). Further information from Professor Derek Partridge or Dr. Richard Everson, School of Engineering and Computer Science, University of Exeter, Exeter, EX44PT UK. Tel: +44 1392 264061 Email: {D.Partridge,R.Everson}@exeter.ac.uk Closing date 30th November 2001. 
From bbs at bbsonline.org Tue Oct 2 16:13:39 2001 From: bbs at bbsonline.org (Behavioral & Brain Sciences) Date: Tue, 02 Oct 2001 16:13:39 -0400 Subject: BBS: ANNOUNCEMENT Message-ID: Dear Dr. Connectionists List User, BBS announcement Cambridge University Press regrets to announce that Dr Stevan Harnad (University of Southampton, UK) has resigned as Editor of Behavioral and Brain Sciences. We are deeply grateful for the energy and commitment that he has devoted to the journal since its foundation and launch in 1978. His ideas and insights have always been stimulating and provocative, and he is especially recognised for providing much of the impetus in the continuing development and evolution of the exciting new endeavours of electronic publication and dissemination of knowledge. Cambridge is now consulting with Stevan and the Associate Editors of BBS and will shortly be establishing a Search Committee to appoint a new Editor. In the meantime, Dr Gavin Swanson, Editorial Manager, Cambridge Journals, has assumed editorial responsibility for BBS in the interregnum. All submissions and production or progress enquiries for BBS should continue to be addressed to bbs at bbsonline.org Dr Conrad Guettler Director, Journals Cambridge University Press -------------------------------------- -------------------------------------------------------------------- Dr Gavin Swanson Tel: +44 (0)1223 326223 (direct) Editorial Manager, Journals Cambridge University Press Fax: +44 (0)1223 315052 Shaftesbury Road E-mail: gswanson at cambridge.org Cambridge CB2 2RU UK Web: http://uk.cambridge.org (outside North America) http://us.cambridge.org (North America) Cambridge Journals Online: http://journals.cambridge.org/ From istvan at louisiana.edu Tue Oct 2 17:18:34 2001 From: istvan at louisiana.edu (Istvan Berkeley) Date: Tue, 02 Oct 2001 16:18:34 -0500 Subject: 2 Position Announcements Message-ID: <3BBA2F2A.594BCC98@louisiana.edu> Hi there, The following two position announcements may be of interest to list members. All the best, Istvan FACULTY POSITION IN THE INSTITUTE OF COGNITIVE SCIENCE. The Institute of Cognitive Science at the University of Louisiana at Lafayette invites applications for a tenure-track faculty appointment for the Fall of 2002. The appointment will be made at the associate professor or senior assistant professor level. The Institute of Cognitive Science is a graduate unit offering a Ph.D. program in cognitive science. Focus areas of the program are in cognitive processes, comparative cognition, cognitive development, computational models of mind, cognitive neuroscience, and linguistic/psycholinguistic processes. Applicants should hold a Ph.D. in cognitive science, psychology, or a related discipline, and must exhibit evidence of a productive research program. 
Please send a curriculum vitae, selected reprints, and at least three letters of reference to Subrata Dasgupta, Institute of Cognitive Science, University of Louisiana at Lafayette, P.O. Drawer 43772, Lafayette, LA 70504-3772. Formal review of applications will commence December 1, 2001, but applications will be accepted until the position is filled. The University of Louisiana at Lafayette is an equal opportunity/affirmative action employer. Assistant Professor, tenure-track, beginning Fall 2002. AOS: Philosophy of Mind/Cognitive Science. AOC: Open, but Ancient, Aesthetics, Metaphysics, and/or Philosophy of Language a plus. Ph.D. at time of hire and teaching experience required. The candidate should exhibit research promise, proven excellence in teaching and a commitment to the development and life of the philosophy program and the new Institute of Cognitive Science Ph.D. program at UL Lafayette. Thus, empirical experience in a cognitive science related discipline would be a considerable advantage. Send cover letter, C.V., at least 3 confidential letters of reference, recent teaching evaluations, sample(s) of scholarly written work, statement describing research program and statement of teaching philosophy to: Dr. John Moore Philosophy Program P.O. Box 43770 The University of Louisiana at Lafayette Lafayette, LA 70504, USA Screening of applications will begin Nov. 25. For more information about the University of Louisiana at Lafayette visit our web page at http://www.louisiana.edu. UL Lafayette is an AA/EEO employer. Women and minorities are encouraged to apply. -- Istvan S. N. Berkeley, Ph.D. Philosophy & Cognitive Science E-mail: istvan at louisiana.edu The University of Louisiana at Lafayette P.O. Box 43770 Tel: +1 337 482-6807 Lafayette, LA 70504-3770 Fax: +1 337 482-5002 USA http://www.ucs.louisiana.edu/~isb9112 From michaelPichat at univ-paris8.fr Wed Oct 3 09:27:18 2001 From: michaelPichat at univ-paris8.fr (Michael PICHAT) Date: Wed, 03 Oct 2001 15:27:18 +0200 Subject: Workshop on Multidisciplinary Aspects of Learning Message-ID: <3BBB1236.4EC0999A@univ-paris8.fr> EUROPEAN SOCIETY FOR THE STUDY OF COGNITIVE SYSTEMS Special Workshop on Multidisciplinary Aspects of Learning Clichy (Paris, France), 17-19 January 2002 TOPIC The ESSCS attempts to promote the multidisciplinary study of all aspects of cognition. Learning is one of the nodal points of cognition and raises many integrative issues with regard to the study of cognitive systems. This is the first special workshop on this topic organised by the ESSCS. The spirit of the workshop is deliberately chosen to encourage researchers from various fields to discuss with each other the challenges and opportunities offered by a cross-disciplinary approach to learning. SCOPE Contributions are invited on all aspects of learning, in human, animal, and artificial systems. More specifically the following subdisciplines of cognitive sciences are involved: - Psychology (cognitive, clinical, developmental, ergonomics) - Artificial intelligence (general aspects) - Neurosciences (associative memory, neural networks, etc.) - Linguistics (also computational), language disorders - Educational and Instructional sciences - Philosophy, History of concepts. INVITED SPEAKERS G.J. Dalenoort (University of Groningen): Theoretical considerations on learning G. Vergnaud (CNRS, Universite Paris 8): Learning and conceptual development J. 
Rogalski (CNRS, Universite Paris 8): Epistemology and cognitive analysis of the task: towards a common frame for analysing competence acquisition from students to professionals ORGANISATION The scientific program includes both oral communications and poster presentations. Each oral communication will be allotted 20 minutes for presentation plus 10 minutes for discussion. The workshop schedule will include a poster session; presenters will stand by their posters for informal discussion with workshop participants. The working language of the workshop is English. There will be a maximum of 20 oral presentations. The total number of presentations will be restricted to about 40. Participation is also possible without a communication. The workshop will take place at the "Lycee Rene Auffray", 23 rue Fernand Pelloutier, 92110 Clichy, Tel: 01-49-68-90-00 (from abroad: +33-1-49-68-90-00), Web site: www.lycee-rene-auffray.com, Metro (underground): line 13 (direction: "Gabriel Peri - Asnieres - Gennevilliers"; Stop: "Mairie de Clichy") For more information see the ESSCS webpage: http://www.esscs.org For scientific or local information please contact: michael.pichat at univ-paris8.fr SUBMISSION OF CONTRIBUTIONS Contributions will be peer-reviewed and considered on the basis of their relevance across a variety of subdisciplines of cognitive sciences. This implies that papers that exclusively report on experimental results, without a theoretical basis or interpretation, will not be accepted. In case of doubt, please contact the organisers. Papers that have been accepted can also be submitted to a special issue of Cognitive Systems, the international peer-reviewed journal of the ESSCS. Submissions (in English) should be sent in the form of a 300-word abstract. The desired presentation form should be indicated (oral, poster, or either). Submissions should be sent as an email attachment (please send both RTF and DOS formats) to both G. Dalenoort (G.J.DALENOORT at ppsw.rug.nl) and M. Pichat (michael.pichat at univ-paris8.fr) by the 30th of October 2001. Acceptance will be notified within a week, upon which registration payment is required. COMMITTEES Scientific committee G.J. Dalenoort, University of Groningen G. Ricco, Universite Paris 8 G. Vergnaud, CNRS, Universite Paris 8 K.B. Koster, University of Groningen P.L.C. Van Geert, University of Groningen Organising committee M. Pichat, Universite Paris 8 L. Numa-Bocage, IUFM de Picardie M. Merri, ENFA de Toulouse D. Morange, Universite Lyon 2 M.-C. Jollivet, IUFM de Poitou-Charentes REGISTRATION For administrative reasons, it is not possible to separate workshop and catering charges. Therefore, the following amounts include both workshop fees and catering fees (3 breakfasts, 2 lunches, 5 breaks). Please note that all amounts are in Euro. Members of the ESSCS: 85 Non-members of the ESSCS: 95 Students: 75 ESSCS membership: 12 Full registration fee deposit deadline: 15 November 2001 Payment modalities will be communicated with the notification of acceptance. Please note that the communication abstract will not be published in the workshop proceedings unless payment is received. ACCOMMODATION Some double rooms are available in the Lycee Rene Auffray itself (place of the meeting). Double rooms (with shower/bath and toilets) are about 40 Euro altogether per night. These rooms have to be booked in advance.
Full accommodation fee deposit deadline: 15 November 2001 Payment modalities will be communicated with the notification of acceptance. Alternative accommodation (hotels) Each room is provided with internal bathroom (toilets and shower or bath), television and telephone. Prices are per night and in Euro. Reservations with the hotels can be made directly (if deposits are required: best via a letter with authorisation to charge a credit card, to avoid high costs of international money transfers). Hotel des Chasses (**), Single room: 61, Breakfast: 6, Distance to workshop place: 5 minutes' walk. Tel: from abroad: 00-33-1-47-37-01-73, in France: 01-47-37-01-73 Address: 49 rue Pierre Beregovy, 92110 Clichy Website: www.hoteldeschasses.fr Hotel Sovereign (***), Single room: 60, Breakfast: 6, Distance to workshop place: 10 minutes' walk. Tel: from abroad: 00-33-1-47-37-54-24, in France: 01-47-37-54-24 Address: 14 rue Dagobert, 92110 Clichy Hotel Savoy (**), Single room: 68, Breakfast: 7, Distance from workshop place: 8 minutes' walk. Tel: from abroad: 00-33-1-47-37-17-01, in France: 01-47-37-17-01 Address: 20 rue Villeneuve, 92110 Clichy Website: www.123france.com/europe/france/paris/hotels/hoclichy.htm IMPORTANT DATES Workshop: 17-19 January 2002 Submission deadline: 30th October 2001 Notification of acceptance: 7th November 2001 Registration fee deposit deadline: 15th November 2001 Accommodation fee deposit deadline: 15th November 2001 From jose.dorronsoro at iic.uam.es Wed Oct 3 12:49:02 2001 From: jose.dorronsoro at iic.uam.es (Jose Dorronsoro) Date: Wed, 03 Oct 2001 18:49:02 +0200 Subject: ICANN2002 Message-ID: <1.5.4.32.20011003164902.0117e670@iic.uam.es> Note: efforts have been made to avoid duplicate postings of this message. Apologies if, nevertheless, you are getting them. ICANN 2002 First Call for Papers The 12th International Conference on Artificial Neural Networks, ICANN 2002, to be held from August 27 to August 31, 2002 at the ETS de Informática of the Universidad Autónoma de Madrid, welcomes contributions on Theory, Algorithms, Applications and Implementations in the following broad Areas: Computational Neuroscience Data Analysis and Pattern Recognition Vision and Image Processing Robotics and Control Signal and Time Series Processing Connectionist Cognitive Science Self-Organization Dynamical Systems Suggestions for Tutorials, Workshops and Special Sessions are also welcome. Submissions will be possible by surface mail, e-mail attachment, or through an upload page to be available at a later time. Concrete submission procedures and other related details will soon appear at the conference's web site, www.ii.uam.es/icann2002. The Proceedings will be published in the "Lecture Notes in Computer Science" series of Springer-Verlag. Paper length is restricted to a maximum of 6 pages, including figures. The final paper layout must adhere strictly to the Author Instructions set out in the page http://www.springer.de/comp/lncs/authors.html of the LNCS web site. In particular, a LaTeX style file is available to help authors format their contributions according to the LNCS standard. Authors are asked to use that file and, in general, to follow very carefully that page's instructions. Important deadlines are: Submission deadline: February 15, 2002. Notification of acceptance/rejection: April 15, 2002. Final papers due (in hardcopy and electronically): May 15, 2002. Three independent referees will review each submitted paper.
Acting on those reviews, the Program Committee will accept papers and assign them to either oral or poster presentation. All accepted papers (either for oral or poster presentation) will be published in the Proceedings under the same length restrictions. Proceedings will be distributed to all registered participants at the beginning of the Conference. For further information and/or contacts, send inquiries to the following address: ICANN 2002 Conference Secretariat Mrs. Juana Calle Escuela Técnica Superior de Informática Universidad Autónoma de Madrid 28049 Madrid, Spain e-mail: icann2002 at ii.uam.es From meesad at okstate.edu Tue Oct 2 23:45:38 2001 From: meesad at okstate.edu (Phayung Meesad) Date: Tue, 02 Oct 2001 22:45:38 -0500 Subject: Call for paper IJCNN2002 Message-ID: <000b01c14bbd$de1e14a0$fa384e8b@okstate.edu> CALL FOR PAPERS 2002 International Joint Conference on Neural Networks (IJCNN2002) May 12-17, 2002 Hilton Hawaiian Village, Honolulu, HI Held as part of the World Congress on Computational Intelligence (http://www.wcci2002.org) The annual IEEE/INNS International Joint Conference on Neural Networks (IJCNN) is one of the premier international conferences in the field. It covers all topics in neural networks, including but not limited to: - supervised, unsupervised and reinforcement learning, - hardware implementation, - time series analysis, - neurooptimization, - neurocontrol, - hybrid architectures, - bioinformatics, - neurobiology and neural modeling, - embedded neural systems, - intelligent agents, - image processing, - rule extraction, - statistics, - chaos, - learning theory, - and a huge variety of applications. The emphasis of the Congress will be on original theories and novel applications of neural networks. The Congress welcomes paper submissions from researchers, practitioners, and students worldwide. IJCNN 2002 will be held in conjunction with the Congress on Evolutionary Computation (CEC) and the IEEE International Conference on Fuzzy Systems (FUZZ-IEEE) as part of the World Congress on Computational Intelligence (WCCI). Cross-fertilization of the three fields will be strongly encouraged. The Congress will feature keynote speeches and tutorials by world-leading researchers. It will also include a number of special sessions and workshops on the latest hot topics. Your registration admits you to all events and includes the World Congress proceedings and banquet. The deadline for submissions is December 1, 2001. Look for more details on paper submission and conference registration coming soon. Bookmark the conference webpage at http://www.wcci2002.org. For more information, contact David Fogel, d.fogel at ieee.org General Chairman, WCCI2002: David B. Fogel, Natural Selection, Inc., USA Vice-General Chairman, WCCI2002: Mohamed A. El-Sharkawi, University of Washington, USA Program Chairman, IJCNN2002: C. Lee Giles, NEC Research, USA Technical co-Chairmen, IJCNN2002: Don Wunsch, University of Missouri, Rolla, USA Marco Gori, Università degli Studi di Siena, Italy Nik Kasabov, University of Otago, New Zealand Michael Hasselmo, Boston University, USA Special Sessions co-Chairmen, IJCNN2002: C.
Lee Giles, NEC Research, USA Don Wunsch, University of Missouri, Rolla, USA Local Arrangements Chairman, WCCI2002: Anthony Kuh, University of Hawaii at Manoa, USA +++++++++++++++++++++++++++++++++++++++++++++++++++++ Publicity Chair, IJCNN2002 Gary Yen, Oklahoma State University, USA gyen at okstate.edu Publicity Co-Chair, IJCNN2002 Phayung Meesad, Oklahoma State University, USA meesad at okstate.edu +++++++++++++++++++++++++++++++++++++++++++++++++++++ From ttroyer at glue.umd.edu Wed Oct 3 15:45:10 2001 From: ttroyer at glue.umd.edu (Todd Troyer) Date: Wed, 03 Oct 2001 15:45:10 -0400 Subject: tenure-track position at U. Maryland Message-ID: <3BBB6AC6.BE13C6FD@glue.umd.edu> THE DEPARTMENT OF PSYCHOLOGY AT THE UNIVERSITY OF MARYLAND AT COLLEGE PARK has a tenure-track position at the assistant professor level for a cognitive scientist with expertise in computational, mathematical or neural modeling in areas such as, but not limited to, decision processes, memory, judgment, categorization, motor control and/or perception. The successful candidate must provide evidence of research productivity, and have a clear program of research capable of attracting external support. The person hired will be expected to teach at both the graduate and undergraduate levels. Please send a CV and a statement of research and teaching interests, and arrange to have three letters of recommendation sent to Professor Thomas Wallsten, Computational/Mathematical Psychology Search Committee, Department of Psychology, University of Maryland, College Park, MD 20742. The University of Maryland is an Affirmative Action/Equal Employment Opportunity Employer. For best consideration, materials should be received by October 15, 2001. -- ------------------------------------------------------------------------- Todd Troyer Dept of Psychology Ph: 301-405-9971 Program in Neuroscience FAX: 301-314-9566 and Cognitive Science e-mail: ttroyer at glue.umd.edu University of Maryland http://www.wam.umd.edu/~ttroyer College Park, MD 20742 ------------------------------------------------------------------------- From wahba at stat.wisc.edu Thu Oct 4 17:19:42 2001 From: wahba at stat.wisc.edu (Grace Wahba) Date: Thu, 4 Oct 2001 16:19:42 -0500 (CDT) Subject: Variable Selection, Multicategory SVM's Message-ID: <200110042119.QAA19382@hera.stat.wisc.edu> The following papers are available via http://www.stat.wisc.edu/~wahba -> TRLIST TR 1042 Variable Selection via Basis Pursuit for Non-Gaussian Data Hao Zhang, Grace Wahba, Yi Lin, Meta Voelker, Michael Ferris, Ronald Klein and Barbara Klein Abstract A simultaneous flexible variable selection procedure is proposed by applying a basis pursuit method to the likelihood function. The basis functions are chosen to be compatible with variable selection in the context of smoothing spline ANOVA models. Since it is a generalized LASSO-type method, it enjoys the favorable property of shrinking coefficients and gives interpretable results. We derive a Generalized Approximate Cross Validation function (GACV), which is an approximate leave-one-out cross validation function used to choose smoothing parameters. In order to apply the GACV function in a large data set situation, we propose a corresponding randomized GACV. A technique called `slice modeling' is used to develop efficient code. Our simulation study shows the effectiveness of the proposed approach in the Bernoulli case.
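[Editor's illustration] To make the LASSO-type idea in TR 1042 concrete, the sketch below fits an L1-penalized Bernoulli (logistic) negative log-likelihood by proximal gradient descent, so that coefficients of irrelevant inputs are shrunk exactly to zero. It is only a minimal sketch under stated assumptions: the function name and toy data are hypothetical, plain input coordinates stand in for the smoothing spline ANOVA basis functions of the report, and the (randomized) GACV choice of the smoothing parameter is not shown; the penalty weight is simply fixed by hand.

    import numpy as np

    def l1_penalized_logistic(X, y, lam, n_iter=2000):
        # Proximal-gradient (ISTA) fit of an L1-penalized Bernoulli negative
        # log-likelihood.  A generic LASSO-type sketch only -- NOT the authors'
        # basis-pursuit / smoothing-spline-ANOVA procedure from TR 1042.
        n, d = X.shape
        step = 4.0 * n / (np.linalg.norm(X, 2) ** 2)   # safe step size for the logistic loss
        w = np.zeros(d)
        for _ in range(n_iter):
            p = 1.0 / (1.0 + np.exp(-X @ w))           # Bernoulli means
            grad = X.T @ (p - y) / n                   # gradient of the average neg. log-likelihood
            w = w - step * grad
            w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)  # soft-thresholding step
        return w

    # Hypothetical toy data: only the first two of ten inputs carry signal.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    y = (rng.random(200) < 1.0 / (1.0 + np.exp(-(2.0 * X[:, 0] - 3.0 * X[:, 1])))).astype(float)
    print(np.round(l1_penalized_logistic(X, y, lam=0.05), 2))  # most irrelevant coefficients are exactly zero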
TR 1043 Multicategory Support Vector Machines Yoonkyung Lee, Yi Lin and Grace Wahba Abstract The Support Vector Machine (SVM) has shown great performance in practice as a classification methodology. Oftentimes multicategory problems have been treated as a series of binary problems in the SVM paradigm. Even though the SVM implements the optimal classification rule asymptotically in the binary case, solutions to a series of binary problems may not be optimal for the original multicategory problem. We propose multicategory SVMs, which extend the binary SVM to the multicategory case, and encompass the binary SVM as a special case. The multicategory SVM implements the optimal classification rule as the sample size gets large, overcoming the suboptimality of the conventional one-versus-rest approach. The proposed method deals with the equal misclassification cost case and the unequal cost case in a unified way. (Long version of TR 1040) From meyoung at siu.edu Thu Oct 4 09:41:57 2001 From: meyoung at siu.edu (Michael Young) Date: Thu, 4 Oct 2001 08:41:57 -0500 Subject: faculty position: Southern Illinois University Message-ID: We're looking to hire in the cognitive area, broadly defined. We welcome applicants with a modeling background, although an active empirical research program is a must. Cheers, Mike Young Chair of the Search Committee ====== Job Opening in Brain and Cognitive Sciences (BCS) We are seeking candidates with a Ph.D. in Psychology or a related field, whose research interests include memory and cognitive psychology, cognitive neuroscience, cognitive development, or social cognition. Candidates for this position will be expected to develop a research program with potential for external funding and to share responsibility for teaching basic graduate and undergraduate courses in cognitive psychology, possibly undergraduate research methods or introductory statistics, as well as courses in their own specialty area. Interest and experience with an integrated multidisciplinary approach are highly desirable. Candidates will join a group whose research interests include cognitive psychology, neuroscience, cognitive development, behavior genetics, behavioral economics, decision making, and cognitive aging (see http://www.siu.edu/~psycho/bcs for a detailed description of BCS faculty research interests). Applicants for the position must have either demonstrated potential for (Assistant Professor level) or an established record of (Associate Professor level) excellence in teaching, publication, and externally funded research. Applicants are expected to have completed all requirements for the Ph.D. by the date of employment. If all requirements have not been completed, a one-year term appointment at the rank of Instructor will be offered. Applicants should send a cover letter with an explicit statement of research and teaching interests, a current curriculum vitae, reprints, and teaching evaluations (if available), and have three letters of recommendation sent to: Chair, BCS Search Committee, Department of Psychology, Southern Illinois University, Carbondale, IL 62901-6502. Review of applications will begin November 15, but applications will be accepted until the position is filled. Southern Illinois University is an Equal Opportunity/Affirmative Action Employer. -- Dr. Michael E. Young http://www.siu.edu/~psycho/bcs/young.html Dept.
of Psychology 618/453-3567 271F Life Sciences II Southern Illinois University Carbondale, IL 62901-6502 From hastie at stat.stanford.edu Thu Oct 4 13:39:21 2001 From: hastie at stat.stanford.edu (Trevor Hastie) Date: Thu, 4 Oct 2001 10:39:21 -0700 Subject: Book announcement: Elements of Statistical Learning Message-ID: <004c01c14cfb$81b74b20$d6bb42ab@GIBBERS> Book announcement: The Elements of Statistical Learning - data mining, inference and prediction 536p (in full color) Trevor Hastie, Robert Tibshirani, and Jerome Friedman Springer-Verlag, 2001 For more details visit our book homepage: http://www-stat.stanford.edu/ElemStatLearn To buy this book: Springer: http://www.springer-ny.com/detail.tpl?isbn=0387952845&cart=10022167731259632 Amazon: http://www.amazon.com/exec/obidos/ASIN/0387952845/o/qid%3D994019007/sr%3D2-2/ref%3Daps%5Fsr%5Fb%5F1%5F2/107-4101918-6486124 Barnes&Noble: http://shop.barnesandnoble.com/booksearch/isbnInquiry.asp?userid=6B0UGX3JWY&mscssid=KSW8Q7J9FHV78HC3E4UM2UF3KK9H4E33&isbn=0387952845 Here is a brief description: During the past decade there has been an explosion in computation and information technology. With it have come vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of Statistics, and spawned new areas such as data mining, machine learning and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It should be a valuable resource for statisticians and anyone interested in data-mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees and boosting --- the first comprehensive treatment of this topic in any book. Jerome Friedman, Trevor Hastie, and Robert Tibshirani are Professors of Statistics at Stanford University. They are prominent researchers in this area: Friedman is the (co-)inventor of many data-mining tools including CART, MARS, and projection pursuit. Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie wrote much of the statistical modelling software in S-PLUS, and invented principal curves and surfaces. Tibshirani proposed the Lasso and co-wrote the best-selling book ``An Introduction to the Bootstrap''.
-------------------------------------------------------------------- Trevor Hastie hastie at stat.stanford.edu Professor, Department of Statistics, Stanford University Phone: (650) 725-2231 (Statistics) Fax: (650) 725-8977 (650) 498-5233 (Biostatistics) Fax: (650) 725-6951 URL: http://www-stat.stanford.edu/~hastie address: room 104, Department of Statistics, Sequoia Hall 390 Serra Mall, Stanford University, CA 94305-4065 -------------------------------------------------------------------- From karen at research.nj.nec.com Thu Oct 4 09:18:34 2001 From: karen at research.nj.nec.com (Karen Hahn) Date: Thu, 4 Oct 2001 09:18:34 -0400 Subject: Research Scientist positions at NEC Message-ID: <453644D81568434F85CF1EC0E991E01C4A3848@exchange.nj.nec.com> The NEC Research Institute (NECI), founded twelve years ago, has as its mission basic research in Computer Science and Physical Sciences underlying future technologies relevant to NEC. The Institute has research programs in theory, machine learning, computer vision, computational linguistics, web characterization and applications, bioinformatics, as well as research activities in Physical Sciences. NECI is soliciting applications for full time Research Scientists in Computer Science. In addition to enhancing its current research efforts, NECI also plans to establish a group in distributed systems. Although these areas are preferred, NECI will consider exceptional applicants from other areas of computer science. NECI offers unique and unusual opportunities to its scientists including great freedom in deciding basic research directions and projects; budgets for research, travel, equipment, and support staff that are directly controlled by principal researchers; and publication of all research results in the open literature. The Institute's laboratories are state-of-the-art and include several high-end parallel compute servers. NECI has close ties with outstanding research universities in and outside the Princeton area and with NEC's Central Research Laboratory (CRL) in Japan. Collaborations with university and CRL research groups are encouraged. Full applications should include resumes, copies of selected publications, names of at least three references, and a two-page statement of proposed research directions. Applications will be reviewed beginning January 1, 2002. NECI is an equal opportunity employer. For more details about NECI, please see http://www.neci.nj.nec.com. Please send applications or inquiries to: CS Search Committee Chair NEC Research Institute 4 Independence Way Princeton, NJ 08540 Email: compsci-candidates at research.nj.nec.com From myosioka at brain.riken.go.jp Fri Oct 5 01:54:15 2001 From: myosioka at brain.riken.go.jp (Masahiko Yoshioka) Date: Fri, 05 Oct 2001 14:54:15 +0900 Subject: Paper: Spike-timing-dependent learning rule Message-ID: <20011005145415H.myosioka@brain.riken.go.jp> Dear Connectionists, I am pleased to announce the availability of my recent paper and of one potentially related paper. Recent paper: ------------- "The spike-timing-dependent learning rule to encode spatiotemporal patterns in a network of spiking neurons" M. Yoshioka, Phys. Rev. E (in press) Available at http://arXiv.org/abs/cond-mat/0110070 (preprint) Abstract: We study associative memory neural networks based on the Hodgkin-Huxley type of spiking neurons. 
We introduce the spike-timing-dependent learning rule, in which the time window with the negative part as well as the positive part is used to describe the biologically plausible synaptic plasticity. The learning rule is applied to encode a number of periodical spatiotemporal patterns, which are successfully reproduced in the periodical firing pattern of spiking neurons in the process of memory retrieval. The global inhibition is incorporated into the model so as to induce the gamma oscillation. The occurrence of gamma oscillation turns out to give appropriate spike timings for memory retrieval of discrete type of spatiotemporal pattern. The theoretical analysis to elucidate the stationary properties of perfect retrieval state is conducted in the limit of an infinite number of neurons and shows the good agreement with the result of numerical simulations. The result of this analysis indicates that the presence of the negative and positive parts in the form of the time window contributes to reduce the size of crosstalk term, implying that the time window with the negative and positive parts is suitable to encode a number of spatiotemporal patterns. We draw some phase diagrams, in which we find various types of phase transitions with change of the intensity of global inhibition. Related paper: -------------- "Associative memory storing an extensive number of patterns based on a network of oscillators with distributed natural frequencies in the presence of external white noise" M. Yoshioka and M. Shiino, Phys. Rev. E 61, 4732 (2000) Available at http://pre.aps.org/ (subscription is required) http://xxx.lanl.gov/abs/cond-mat/9903316 (preprint) Abstract: We study associative memory based on temporal coding in which successful retrieval is realized as an entrainment in a network of simple phase oscillators with distributed natural frequencies under the influence of white noise. The memory patterns are assumed to be given by uniformly distributed random numbers on $[0,2\pi)$ so that the patterns encode the phase differences of the oscillators. To derive the macroscopic order parameter equations for the network with an extensive number of stored patterns, we introduce the effective transfer function by assuming the fixed-point equation of the form of the TAP equation, which describes the time-averaged output as a function of the effective time-averaged local field. Properties of the networks associated with synchronization phenomena for a discrete symmetric natural frequency distribution with three frequency components are studied based on the order parameter equations, and are shown to be in good agreement with the results of numerical simulations. Two types of retrieval states are found to occur with respect to the degree of synchronization, when the size of the width of the natural frequency distribution is changed. Regards, Masahiko Yoshioka Brain Science Institute, RIKEN Hirosawa 2-1, Wako-shi, Saitama, 351-0198, Japan From ken at phy.ucsf.edu Fri Oct 5 20:42:02 2001 From: ken at phy.ucsf.edu (Ken Miller) Date: Fri, 5 Oct 2001 17:42:02 -0700 Subject: Paper available: Model of visual cortical responses Message-ID: <15294.21338.975363.928535@coltrane.ucsf.edu> A preprint of the following paper is now available from ftp://ftp.keck.ucsf.edu/pub/ken/lauritzen_etal.pdf or from http://www.keck.ucsf.edu/~ken (click on "Publications", then on "Models of Neuronal Integration and Circuitry") Lauritzen, T.Z., A.E. Krukowski and K.D. Miller (2001). 
"Local correlation-based circuitry can account for responses to multi-grating stimuli in a model of cat V1." In press, Journal of Neurophysiology. Abstract: In cortical simple cells of cat striate cortex, the response to a visual stimulus of the preferred orientation is partially suppressed by simultaneous presentation of a stimulus at the orthogonal orientation, an effect known as ``cross-orientation inhibition". It has been argued that this is due to the presence of inhibitory connections between cells tuned for different orientations, but intracellular studies suggest that simple cells receive inhibitory input primarily from cells with similar orientation tuning. Furthermore, response suppression can be elicited by a variety of non-preferred stimuli at all orientations. Here we study a model circuit that was presented previously to address many aspects of simple cell orientation tuning, which is based on local intracortical connectivity between cells of similar orientation tuning. We show that this model circuit can account for many aspects of cross-orientation inhibition and, more generally, of response suppression by non-preferred stimuli and of other non-linear properties of responses to stimulation with multiple gratings. Ken Kenneth D. Miller telephone: (415) 476-8217 Associate Professor fax: (415) 476-4929 Dept. of Physiology, UCSF internet: ken at phy.ucsf.edu 513 Parnassus www: http://www.keck.ucsf.edu/~ken San Francisco, CA 94143-0444 From dayan at gatsby.ucl.ac.uk Mon Oct 8 08:39:07 2001 From: dayan at gatsby.ucl.ac.uk (Peter Dayan) Date: Mon, 8 Oct 2001 13:39:07 +0100 (BST) Subject: Gatsby Unit research positions Message-ID: <200110081239.NAA06852@flies.gatsby.ucl.ac.uk> Gatsby Computational Neuroscience Unit http://www.gatsby.ucl.ac.uk/ Post-doctoral and PhD Research Positions Computational Neuroscience The Gatsby Computational Neuroscience Unit invites applications for PhD studentships and post-doctoral research positions. Members of the unit are interested in models of all aspects of brain function, especially unsupervised learning, reinforcement learning, neural dynamics, population coding and computational motor control. There is the opportunity to conduct psychophysical experiments in motor control. The Unit also has active interests in more general aspects of machine learning. For further details please see: http://www.gatsby.ucl.ac.uk/research.html The Gatsby Unit provides a unique opportunity for a critical mass of theoreticians to interact closely with each other and with University College's other world class research groups including Anatomy, Computer Science, Functional Imaging Laboratory, Physics, Physiology, Psychology, Neurology, Ophthalmology, and Statistics. The unit has excellent computational facilities, and laboratory facilities for theoretically motivated experimental studies. The unit's visitor and seminar programmes enable its staff and students to interact with leading researchers from across the world. Candidates should have a strong analytical background and a keen interest in neuroscience. Competitive salaries and studentships are available. 
Applicants should send in plain text format a CV (PhD applicants should include details of course work and grades), a statement of research interests, and names and addresses of three referees to admin at gatsby.ucl.ac.uk (email preferred) or to The Gatsby Computational Neuroscience Unit University College London Alexandra House 17 Queen Square LONDON WC1N 3AR UK ** Closing date for applications: 9th November 2001 ** From jordan at CS.Berkeley.EDU Mon Oct 8 17:33:25 2001 From: jordan at CS.Berkeley.EDU (Michael Jordan) Date: Mon, 8 Oct 2001 14:33:25 -0700 (PDT) Subject: letter of resignation from Machine Learning journal Message-ID: Dear colleagues in machine learning, The forty people whose names appear below have resigned from the Editorial Board of the Machine Learning Journal (MLJ). We would like to make our resignations public, to explain the rationale for our action, and to indicate some of the implications that we see for members of the machine learning community worldwide. The machine learning community has come of age during a period of enormous change in the way that research publications are circulated. Fifteen years ago research papers did not circulate easily, and as with other research communities we were fortunate that a viable commercial publishing model was in place so that the fledgling MLJ could begin to circulate. The needs of the community, principally those of seeing our published papers circulate as widely and rapidly as possible, and the business model of commercial publishers were in harmony. Times have changed. Articles now circulate easily via the Internet, but unfortunately MLJ publications are under restricted access. Universities and research centers can pay a yearly fee of $1050 US to obtain unrestricted access to MLJ articles (and individuals can pay $120 US). While these fees provide access for institutions and individuals who can afford them, we feel that they also have the effect of limiting contact between the current machine learning community and the potentially much larger community of researchers worldwide whose participation in our field should be the fruit of the modern Internet. None of the revenue stream from the journal makes its way back to authors, and in this context authors should expect a particularly favorable return on their intellectual contribution---they should expect a service that maximizes the distribution of their work. We see little benefit accruing to our community from a mechanism that ensures revenue for a third party by restricting the communication channel between authors and readers. In the spring of 2000, a new journal, the Journal of Machine Learning Research (JMLR), was created, based on a new vision of the journal publication process in which the editorial board and authors retain significant control over the journal's content and distribution. Articles published in JMLR are available freely, without limits and without conditions, at the journal's website, http://www.jmlr.org. The content and format of the website are entirely controlled by the editorial board, which also serves its traditional function of ensuring rigorous peer review of journal articles. Finally, the journal is also published in a hardcopy version by MIT Press. Authors retain the copyright for the articles that they publish in JMLR. 
The following paragraph is taken from the agreement that every author signs with JMLR (see www.jmlr.org/forms/agreement.pdf): You [the author] retain copyright to your article, subject only to the specific rights given to MIT Press and to the Sponsor [the editorial board] in the following paragraphs. By retaining your copyright, you are reserving for yourself among other things unlimited rights of electronic distribution, and the right to license your work to other publishers, once the article has been published in JMLR by MIT Press and the Sponsor [the editorial board]. After first publication, your only obligation is to ensure that appropriate first publication credit is given to JMLR and MIT Press. We think that many will agree that this is an agreement that is reflective of the modern Internet, and is appealing in its recognition of the rights of authors to distribute their work as widely as possible. In particular, authors can leave copies of their JMLR articles on their own homepage. Over the years the editorial board of MLJ has expanded to encompass all of the various perspectives on the machine learning field, and the editorial board's efforts in this regard have contributed greatly to the sense of intellectual unity and community that many of us feel. We believe, however, that there is much more to achieve, and that our further growth and further impact will be enormously enhanced if via our flagship journal we are able to communicate more freely, easily, and universally. Our action is not unprecedented. As documented at the Scholarly Publishing and Academic Resources Coalition (SPARC) website, http://www.arl.org/sparc, there are many areas in science where researchers are moving to low-cost publication alternatives. One salient example is the case of the journal "Logic Programming". In 1999, the editors and editorial advisors of this journal resigned to join "Theory and Practice of Logic Programming", a Cambridge University Press journal that encourages electronic dissemination of papers. In summary, our resignation from the editorial board of MLJ reflects our belief that journals should principally serve the needs of the intellectual community, in particular by providing the immediate and universal access to journal articles that modern technology supports, and doing so at a cost that excludes no one. We are excited about JMLR, which provides this access and does so unconditionally. We feel that JMLR provides an ideal vehicle to support the near-term and long-term evolution of the field of machine learning and to serve as the flagship journal for the field. We invite all of the members of the community to submit their articles to the journal and to contribute actively to its growth. 
Sincerely yours, Chris Atkeson Peter Bartlett Andrew Barto Jonathan Baxter Yoshua Bengio Kristin Bennett Chris Bishop Justin Boyan Carla Brodley Claire Cardie William Cohen Peter Dayan Tom Dietterich Jerome Friedman Nir Friedman Zoubin Ghahramani David Heckerman Geoffrey Hinton Haym Hirsh Tommi Jaakkola Michael Jordan Leslie Kaelbling Daphne Koller John Lafferty Sridhar Mahadevan Marina Meila Andrew McCallum Tom Mitchell Stuart Russell Lawrence Saul Bernhard Schoelkopf John Shawe-Taylor Yoram Singer Satinder Singh Padhraic Smyth Richard Sutton Sebastian Thrun Manfred Warmuth Chris Williams Robert Williamson From becker at meitner.psychology.mcmaster.ca Mon Oct 8 21:15:54 2001 From: becker at meitner.psychology.mcmaster.ca (Sue Becker) Date: Mon, 8 Oct 2001 21:15:54 -0400 (EDT) Subject: NIPS*2001 student travel awards Message-ID: Dear connectionists, This is to let you know that the deadline for applications for student travel awards to attend the NIPS*2001 meeting in Vancouver is Friday October 12, and the application form is now available on the NIPS web page, at www.cs.cmu.edu/Web/Groups/NIPS/ These awards cover travel only. Graduate students interested in doing volunteer work at the meeting in exchange for a registration fee waiver should send email to Sid Fels ssfels at ece.ubc.ca. Only a limited number of openings are available. cheers, Sue Becker NIPS*2001 Program Chair From giro-ci0 at wpmail.paisley.ac.uk Tue Oct 9 12:35:23 2001 From: giro-ci0 at wpmail.paisley.ac.uk (Mark Girolami) Date: Tue, 09 Oct 2001 17:35:23 +0100 Subject: Two Post-Doctoral Research Assistantships Available Message-ID: Two Post-Doctoral Research Assistantships Available Division of Computing and Information Systems University of Paisley A research project which is to be funded by the Engineering and Physical Sciences Research Council (EPSRC), the Department of Trade and Industry (DTI), and industrial partners, to a total value of over 530K, will be conducted at the University of Paisley in Scotland for a period of three years. The project aims to investigate the technologies required in software systems which will be able to provide effective detection and subsequent analysis of fraudulent activity within the general framework required of emerging fixed and mobile telecommunications applications such as electronic and mobile commerce. Two postdoctoral positions are now available to investigate the application of machine learning and advanced data mining methods in the detection and analysis of anomalous and possibly fraudulent usage of fixed and mobile telecommunications applications such as electronic and mobile commerce. The project will involve the design and implementation of novel algorithms and systems to both discover and analyse emerging patterns of anomalous telecommunication system user activity. Highly motivated candidates who have a publication record in, ideally, machine learning, data mining or artificial intelligence applications are encouraged to apply. Applicants should have, or shortly expect to obtain, a PhD in Computer Science. State-of-the-art computer hardware and software will be made available to the selected candidates, as will ample funding for travel to international conferences and meetings. Salaries will be on the R1A scale, starting at 20,066pa to 27,550pa. The Applied Computational Intelligence Research Unit (ACIRU) is a young, ambitious and growing interdisciplinary research group within the University of Paisley. 
Within Scotland ACIRU have active and funded research collaborations with the University of Edinburgh, University of Stirling (http://www.cn.stir.ac.uk/incite/), the University of Glasgow and the University of Strathclyde and it forms part of a rich network of research establishments within which to work. For further information and informal enquiries please contact Mark Girolami (mark.girolami at paisley.ac.uk, http://cis.paisley.ac.uk/giro-ci0) in the first instance. EPSRC & DTI Project Data mining Tools for Fraud Detection in M-Commerce * DETECTOR http://cis.paisley.ac.uk/giro-ci0/projects.html Abstract: The effective detection and subsequent analysis of the types of fraudulent activity which occur within telecommunications systems is varied and changes with the emergence of new technologies and new forms of commercial activity (e&m-commerce). The dynamic nature of fraudulent activity as well as the dynamic and changing nature of normal usage of a service has rendered the detection of fraudulent intent from observed behavioural patterns a research problem of some considerable difficulty. It is proposed that a common theoretical probabilistic framework be employed in the development of dynamic behavioural models which combine a number of prototypical behavioural aspects to define a model of acceptable behaviour (e.g. usage of a mobile phone, web-browsing patterns) from which inferences of the probability of abnormal behaviour can be made. In addition to these inferential models a means of visualising the observed behaviour and the intentions behind it (based on call records, web activity, or purchasing patterns) will significantly aid the pattern recognition abilities of human fraud analysts. Employing the common probabilistic modelling framework which defines the 'fraud detection' models visualisation tools will be developed to provide meaningful visual representations of dynamic activity which has been observed and visualisations of the evolution of the underlying states (or user intentions) generating the observed activity. The development of detection & analysis tools from the common theoretical framework will provide enhanced detection and analysis capability in the identification of fraud. M.A.Girolami School of Communication and Information Technologies University of Paisley High Street Paisley, PA1 2BE Scotland, UK Tel: +44 - 141 - 848 3317 Fax: +44 - 141 - 848 3542 Email: mark.girolami at paisley.ac.uk Legal disclaimer -------------------------- The information transmitted is the property of the University of Paisley and is intended only for the person or entity to which it is addressed and may contain confidential and/or privileged material. Statements and opinions expressed in this e-mail may not represent those of the company. Any review, retransmission, dissemination and other use of, or taking of any action in reliance upon, this information by persons or entities other than the intended recipient is prohibited. If you received this in error, please contact the sender immediately and delete the material from any computer. -------------------------- From dld at mail.csse.monash.edu.au Wed Oct 10 01:49:28 2001 From: dld at mail.csse.monash.edu.au (David L Dowe) Date: Wed, 10 Oct 2001 15:49:28 +1000 (EST) Subject: Research Fellows in Machine Learning at Monash University Message-ID: <200110100549.f9A5nSG27946@bruce.csse.monash.edu.au> Dear all, Apologies for any cross-posting. Below is an advertisement for Research Fellows in Information Technology at Monash University. 
If you are interested in machine learning and machine learning by Minimum Message Length (MML), then I warmly invite you to read the advertisement below for Research Fellows, for which the application deadline is Friday 19th October. David Dowe. http://www.csse.monash.edu.au/~dld/MML.html =========================================================================== Research Fellows in Information Technology ------------------------------------------ As part of its goal to cement its position as Australia's premier academic institution for information technology research, the Faculty of Information Technology of Monash University has established the IT Research Fellowship Scheme to attract outstanding early career researchers in any field of information technology to Monash University. The positions are for up to 3 years and the salary is in the Research Fellow Level B range $50,847 - $60,382 per annum depending on experience. In addition Fellowship holders are eligible for a Research Support Grants which can be up to $10,000 per annum depending on the needs of the research program. Return airfares may be available for successful interstate/overseas candidates and their dependants. The Faculty of Information Technology was created in 1990. It is Australia's largest faculty exclusively devoted to information technology with 190 academics. It has an enviable research reputation in virtually all aspects of information technology and produces more research postgraduates than any other Australian university. Research in the Faculty is centred around 8 research groups which cover: distributed systems, mobile computing and software engineering; artificial intelligence and operations research techniques for intelligent decision support; information systems and information management; digital communications and multimedia; and computer education. Location: Appointees may be based at Clayton, Caulfield, Peninsula, Gippsland or Berwick campuses. Contact: Further information and particulars of the application procedure may be obtained from A/Prof. Kim Marriott, Associate Dean of Research, Faculty of Information Technology telephone +61 3 9905 5525, facsimile +61 3 9905 5146, e-mail: adr at infotech.monash.edu.au Applications: Ms M Jones-Roberts, Manager, Research Services, Faculty of Information Technology, P.O. Box 197, Caulfield East, Vic 3145 by 19/10/01. Quote Ref No. A013055 and include a completed application form. The position description, selection criteria, background information and application form are available at http://www.infotech.monash.edu.au/sta_pv_research.html From bap at cs.unm.edu Wed Oct 10 18:07:34 2001 From: bap at cs.unm.edu (Barak Pearlmutter) Date: Wed, 10 Oct 2001 16:07:34 -0600 (MDT) Subject: NIPS*2001 Workshops Announcement Message-ID: * * * Post-NIPS*2001 Workshops * * * * * * Whistler, BC, CANADA * * * * * * December 7-8, 2001 * * * The NIPS*2001 Workshops will be on Friday and Saturday, December 7/8, in Whistler, BC, Canada, following the main NIPS conference in Vancouver Monday-Thursday, December 3-6. 
This year there are 19 workshops: Activity-Dependent Synaptic Plasticity Artificial Neural Networks in Safety-Related Areas Brain-Computer Interfaces Causal Learning and Inference in Humans & Machines Competition: Unlabeled Data for Supervised Learning Computational Neuropsychology Geometric Methods in Learning Information & Statistical Structure in Spike Trains Kernel-Based Learning Knowledge Representation in Meta-Learning Machine Learning in Bioinformatics Machine Learning Methods for Images and Text Minimum Description Length Multi-sensory Perception & Learning Neuroimaging: Tools, Methods & Modeling Occam's Razor & Parsimony in Learning Preference Elicitation Quantum Neural Computing Variable & Feature Selection Some workshops span both days, while others will be only one day long. One-day workshops will be assigned to friday or saturday by October 14. Please check the web page after this time for individual dates. All workshops are open to all registered attendees. Many workshops also invite submissions. Submissions, and questions about individual workshops, should be directed to the individual workshop organizers. Included below is a short description of most of the workshops. Additional information (including web pages for the individual workshops) is available at the NIPS*2001 Web page: http://www.cs.cmu.edu/Groups/NIPS/ Information about registration, travel, and accommodations for the main conference and the workshops is also available there. Whistler is a ski resort a few hours drive from Vancouver. The daily workshop schedule is designed to allow participants to ski half days, or enjoy other extra-curricular activities. Some may wish to extend their visit to take advantage of the relatively low pre-season rates. We look forward to seeing you in Whistler. Virginia de Sa and Barak Pearlmutter NIPS Workshops Co-chairs ------------------------------------------------------------------------- Activity-dependent Synaptic Plasticity Paul Munro, Larry Abbott http://www.pitt.edu/~pwm/plasticity While the mathematical and cognitive aspects of rate-based Hebb-like rules have been broadly explored, relatively little is known about the possible role of STDP at the computational level. Hebbian learning in neural networks requires both correlation-based synaptic plasticity and a mechanism that induces competition between different synapses. Spike-timing-dependent synaptic plasticity is especially interesting because it combines both of these elements in a single synaptic modification rule. Some recent work has examined the possibility that STDP may underlie older models, such as Hopfield networks or the BCM rule. Temporally dependent synaptic plasticity is attracting a rapidly growing amount of attention in the computational neuroscience community. The change in synaptic efficacy arising from this form of plasticity is highly sensitive to temporal correlations between different presynaptic spike trains. Furthermore, it can generate asymmetric and directionally selective receptive fields, a result supported by experiments on experience-dependent modifications of hippocampal place fields. Finally, spike-timing-dependent plasticity automatically balances excitation and inhibition producing a state in which neuronal responses are rapid but highly variable. The major goals of the workshop are: 1. To review current experimental results on spike-timing-dependent synaptic plasticity and related effects. 2. To discuss models and mechanisms for this form of synaptic plasticity. 3. 
To explore the relationship of STDP with other approaches. 4. To reconcile the rate-based and spike-based plasticity data with a unified theoretical framework (very optimistic!). ------------------------------------------------------------------------- Artificial Neural Networks in Safety-Related Areas: Applications and Methods for Validation and Certification J. Schumann, P. Lisboa, R. Knaus http://ase.arc.nasa.gov/people/schumann/workshops/NIPS2001 Over recent years, Artificial Neural Networks have found their way into various safety-related and safety-critical areas, for example, power generation and transmission, transportation, avionics, environmental monitoring and control, medical applications, and consumer products. Applications range from classification to monitoring and control. Quite often, these applications have proved to be highly successful, leading from pure research prototypes into serious experimental systems (e.g., a neural-network-based flight-control system test-flown on a NASA F-15 ACTIVE) or commercial products (e.g., Sharp's Logi-cook). However, the general question of how to make sure that the ANN-based system performs as expected in all cases has not yet been addressed satisfactorily. All safety-related software applications require careful verification and validation (V&V) of the software components, ranging from extended testing to full-fledged certification procedures. However, for neural-network-based systems, a number of specific issues have to be addressed. For example, a lack of a concise plant model, often a major reason to use an ANN in the first place, makes traditional approaches to V&V impossible. In this workshop, we will address such issues. In particular, we will discuss the following (non-exhaustive list of) topics: * theoretical methodologies to characterise the properties of ANN solutions, e.g., multiple realisations of a particular network and ways of managing this * fundamental software approaches to V&V and implications for ANNs, e.g., the application of FMEA * statistical (Bayesian) methods and symbolic techniques like rule extraction with subsequent V&V to assess and guarantee the performance of an ANN * dynamic monitoring of the ANN's behavior * stability proofs for control of dynamical systems with ANNs * principled approaches to design assurance, risk assessment, and performance evaluation of systems with ANNs * experience of application and certification of ANNs for safety-related applications * V&V techniques suitable for on-line trained and adaptive systems This workshop aims to bring together researchers who have applied ANNs in safety-related areas and actually addressed questions of demonstrating flawless operation of the ANN, researchers working on theoretical topics of convergence and performance assessment, researchers in the area of nonlinear adaptive control, and researchers from the area of formal methods in software design for safety-critical systems. Many prototypical/experimental applications of neural networks in safety-related areas have demonstrated their usefulness. But ANN applicability in safety-critical areas is substantially limited because of a lack of methods and techniques for verification and validation. Currently, there is no silver bullet for V&V in traditional software, and given the more complicated situation for ANNs, none is expected here in the short run. However, any result can have substantial impact in this field.
------------------------------------------------------------------------- Brain-Computer Interfaces Lucas Parra, Paul Sajda, Klaus-Robert Mueller http://newton.bme.columbia.edu/bci ------------------------------------------------------------------------- Causal learning and inference in humans and machines T. Griffiths, J. Tenenbaum, T. Kushnir, K. Murphy, A. Gopnik http://www-psych.stanford.edu/~jbt/causal-workshop.html The topic of causality has recently leapt to the forefront of theorizing in the fields of cognitive science, statistics, and artificial intelligence. The main objective of this workshop is to explore the potential connections between research on causality in the these three fields. There has already been much productive cross-fertilization: the development of causal Bayes nets in the AI community has often had a strong psychological motivation, and recent work by several groups in cognitive science has shown that some elementary but important aspects of how people learn and reason about causes may be best explained by theories based on causal Bayes nets.? Yet the most important questions lay wide open. Some examples of the questions we hope to address in this workshop include: * Can we scale up Bayes-net models of human causal learning and inference from microdomains with one or two causes and effects to more realistic large-scale domains? * What would constitute strong empirical tests of large-scale Bayes net models of human causal reasoning? * Do approximation methods for inference and learning on large Bayes nets have anything to do with human cognitive processes? * What are the relative roles of passive observation and active manipulation in causal learning? * What is the relation between psychological and computational notions of causal independence? The workshop will last one day. Most of the talks will be invited, but we welcome contributions for short talks by researchers in AI, statistics or cognitive science would like to make connections between these fields. Please contact one of the organizers if you are interested in participating. For more information contact Josh Tenenbaum (jbt at psych.stanford.edu) or Alison Gopnik (gopnik at socrates.berkeley.edu). ------------------------------------------------------------------------- Competition: Unlabeled Data for Supervised Learning Stefan C. Kremer, Deborah A. Stacey http://q.cis.uoguelph.ca/~skremer/NIPS2001/ Recently, there has been much interest in applying techniques that incorporate knowledge from unlabeled data into systems performing supervised learning. The potential advantages of such techniques are obvious in domains where labeled data is expensive and unlabeled data is cheap. Many such techniques have been proposed, but only recently has any effort been made to compare the effectiveness of different approaches on real world problems. This web-site presents a challenge to the proponents of methods to incorporate unlabeled data into supervised learning. Can you really use unlabeled data to help train a supervised classification (or regression) system? Do recent (and not so recent) theories stand up to the data test? On this web-site you can find challenge problems where you can try out your methods head-to-head against anyone brave enough to face you. Then, at the end of the contest we will release the results and find out who really knows something about using unlabeled data, and if unlabeled data are really useful or we are all just wasting our time. 
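[Editor's illustration] As a concrete picture of what "incorporating unlabeled data into supervised learning" can mean, the sketch below implements self-training, one generic semi-supervised baseline: a simple nearest-centroid classifier is fit on the labelled points, then repeatedly pseudo-labels the unlabelled points it is most confident about and refits. This is only a hedged illustration; the function name, toy data, and confidence threshold are hypothetical, and nothing here is prescribed by the challenge or its organizers.

    import numpy as np

    def self_train(X_lab, y_lab, X_unlab, n_rounds=5, conf=0.9):
        # Self-training with a Gaussian nearest-centroid classifier: purely
        # illustrative, not a method endorsed by the competition.
        X, y, pool = X_lab.copy(), y_lab.copy(), X_unlab.copy()
        classes = np.unique(y_lab)
        for _ in range(n_rounds):
            centroids = np.stack([X[y == c].mean(axis=0) for c in classes])
            if len(pool) == 0:
                break
            d = np.linalg.norm(pool[:, None, :] - centroids[None, :, :], axis=2)
            p = np.exp(-d) / np.exp(-d).sum(axis=1, keepdims=True)   # soft class assignments
            take = p.max(axis=1) >= conf                             # keep only confident pseudo-labels
            if not take.any():
                break
            X = np.vstack([X, pool[take]])
            y = np.concatenate([y, classes[p[take].argmax(axis=1)]])
            pool = pool[~take]
        return np.stack([X[y == c].mean(axis=0) for c in classes]), classes

    # Hypothetical toy problem: two Gaussian clusters, only four labelled points.
    rng = np.random.default_rng(1)
    X0 = rng.normal([-2.0, 0.0], 1.0, size=(100, 2))
    X1 = rng.normal([2.0, 0.0], 1.0, size=(100, 2))
    X_lab = np.vstack([X0[:2], X1[:2]]); y_lab = np.array([0, 0, 1, 1])
    centroids, classes = self_train(X_lab, y_lab, np.vstack([X0[2:], X1[2:]]))
    print(np.round(centroids, 2))   # centroids move toward the true cluster means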
So ask yourself, are you (and your theory) up to the challenge?? Feeling lucky??? ------------------------------------------------------------------------- Computational Neuropsychology Sara Solla, Michael Mozer, Martha Farah http://www.cs.colorado.edu/~mozer/nips2001workshop.html The 1980's saw two important developments in the sciences of the mind: The development of neural network models in cognitive psychology, and the rise of cognitive neuroscience. In the 1990's, these two separate approaches converged, and one of the results was a new field that we call "Computational Neuropsychology." In contrast to traditional cognitive neuropsychology, computational neuropsychology uses the concepts and methods of computational modeling to infer the normal cognitive architecture from the behavior of brain-damaged patients. In contrast to traditional neural network modeling in psychology, computational neuropsychology derives constraints on network architectures and dynamics from functional neuroanatomy and neurophysiology. Unfortunately, work in computational neuropsychology has had relatively little contact with the Neural Information Processing Systems (NIPS) community. Our workshop aims to expose the NIPS community to the unusual patient cases in neuropsychology and the sorts of inferences that can be drawn from these patients based on computational models, and to expose researchers in computational neuropsychology to some of the more sophisticated modeling techniques and concepts that have emerged from the NIPS community in recent years. We are interested in speakers from all aspects of neuropsychology, including: * attention (neglect) * visual and auditory perception (agnosia) * reading (acquired dyslexia) * face recognition (prosopagnosia) * memory (Alzheimer's, amnesia, category-specific deficits) * language (aphasia) * executive function (schizophrenia, frontal deficits). Contact Sara Solla (solla at nwu.edu) or Mike Mozer (mozer at colorado.edu) if you are interested in speaking at the workshop. ------------------------------------------------------------------------- Geometric Methods in Learning workshop Amir Assadi http://www.lmcg.wisc.edu/bioCVG/events/NIPS2001/NIPS2001Wkshp.htm http://www.lmcg.wisc.edu/bioCVG The purpose of this workshop is to attract the attention of the learning community to geometric methods and to take on an endeavor: 1. To lay out a geometric paradigm for formulating profound ideas in learning; 2. To facilitate the development of geometric methods suitable of investigation of new ideas in learning theory. Today's continuing advances in computation make it possible to infuse geometric ideas into learning that otherwise would have been computationally prohibitive. Nonlinear dynamics in brain-like complex systems has created great excitement, offering a broad spectrum of new ideas for discovery of parallel-distributed algorithms, a hallmark of learning theory. By having great overlap, geometry and nonlinear dynamics together offer a complementary and more profound picture of the physical world and how it interacts with the brain, the ultimate learning system. 
Among the discussion topics, we envision the following: information geometry, differential topological methods for turning local estimates into global quantities and invariants, Riemannian geometry and Feynman path integration as a framework to explore nonlinearity, advances in complex dynamical systems theory in the context of learning and dynamic information processing in the brain, and information theory of massive data sets. As before, in our discussion sessions we will also examine the potential impact of learning theory on the future development of geometry, and report on new examples of the impact of learning-theoretic parallel-distributed algorithms on research in mathematics. With three years of meetings, we are in a position to plan a volume based on the materials from the workshops and other contributions, to be proposed to the NIPS Program Committee. ------------------------------------------------------------------------- Information and Statistical Structure in Spike Trains Jonathon D. Victor http://www-users.med.cornell.edu/~jdvicto/nips2001.html Understanding how neurons represent and manipulate information in their spike trains is one of the major fundamental problems in neuroscience. Moreover, advances towards its solution will rely on a combination of appropriate theoretical, computational, and experimental strategies. Meaningful and reliable statistical analyses, including calculation of information and related quantities, are at the basis of understanding neural information processing. The accuracy and precision of statistical analyses and empirical information estimates depend strongly on the amount and quality of the data available, and on the assumptions that are made in order to apply the formalisms to a laboratory data set. These assumptions typically relate to the neural transduction itself (e.g., linearity or stationarity) and to the statistics of the spike trains (e.g., correlation structure). There are numerous approaches to conducting statistical analyses and estimating information-theoretic quantities, and there are also some major differences in findings across preparations. It is unclear to what extent these differences represent fundamental biological differences, differences in what is being measured, or methodological biases. Specific areas of focus will include: Theoretical and experimental approaches to analyze multineuronal spiking activity; Bursting, rhythms, and other endogenous patterns; Is "Poisson-like" a reasonable approximation to spike train stochastic structure?; How do we formulate alternative models to Poisson?; How do we evaluate model goodness-of-fit? A limited number of slots are available for contributed presentations. Individuals interested in presenting a talk (approximately 20 minutes, with 10 to 20 minutes for discussion) should submit a title and abstract, 200-300 words, to the organizers, Jonathan D. Victor (jdvicto at med.cornell.edu) and Emery Brown (brown at neurostat.mgh.harvard.edu) by October 12, 2001. ------------------------------------------------------------------------- Workshop on New Directions in Kernel-Based Learning Methods Chris Williams, Craig Saunders, Matthias Seeger, John Shawe-Taylor http://www.cs.rhul.ac.uk/colt/nipskernel.html The aim of the workshop is to present new perspectives and new directions in kernel methods for machine learning. Recent theoretical advances and experimental results have drawn considerable attention to the use of kernel functions in learning systems.
Support Vector Machines, Gaussian Processes, kernel PCA, kernel Gram-Schmidt, Bayes Point Machines, Relevance and Leverage Vector Machines are just some of the algorithms that make crucial use of kernels for problems of classification, regression, density estimation, novelty detection and clustering. At the same time as these algorithms have been under development, novel techniques specifically designed for kernel-based systems have resulted in methods for assessing generalisation, implementing model selection, and analysing performance. The choice of model may be simply determined by parameters of the kernel, as for example the width of a Gaussian kernel. More recently, however, methods for designing and combining kernels have created a toolkit of options for choosing a kernel in a particular application. These methods have extended the applicability of the techniques beyond the natural Euclidean spaces to more general discrete structures. The workshop will provide a forum for discussing results and problems in any of the above-mentioned areas. But more importantly, by the structure of the workshop we hope to examine the future directions and new perspectives that will keep the field lively and growing. We seek two types of contributions: 1) Contributed 20-minute talks that offer new directions (serving as a focal point for the general discussions) 2) Posters of new ongoing work, with associated spotlight presentations (summarising current work and serving as a springboard for individual discussion). Important Dates: Submission of extended abstracts: 15th October 2001. Notification of acceptance: Early November. Submission Procedure: Extended abstracts in .ps or .pdf formats (only) should be e-mailed to nips-kernel-workshop at cs.rhul.ac.uk ------------------------------------------------------------------------- Knowledge Representation In Meta-Learning Ricardo Vilalta http://www.research.ibm.com/MetaLearning Learning across multiple related tasks, or improving learning performance over time, requires that knowledge be transferred across tasks. In many classification algorithms, successive applications of the algorithm over the same data always produce the same hypothesis; no knowledge is extracted across tasks. Knowledge across tasks can be used to construct meta-learners able to improve the quality of the inductive bias through experience. To attain this goal, different pieces of knowledge are needed. For example, how can we characterize those tasks that are most favorable to a particular classification algorithm? On the other hand, what forms of bias are most favorable for certain tasks? Are there invariant transformations inherent to a domain that can be captured when learning across tasks? The goal of the workshop is to discuss alternative ways of representing knowledge in meta-learning, with the idea of achieving new forms of bias adaptation. Important Dates: Paper submission: Nov 1, 2001. Notification of acceptance: Nov 12, 2001. Camera-ready copy: Nov 26, 2001. ------------------------------------------------------------------------- Machine Learning Techniques for Bioinformatics Colin Campbell, Sayan Mukherjee http://lara.enm.bris.ac.uk/cig/nips01/nips01.htm There has been significant recent interest in the development of new methods for functional interpretation of gene expression data derived from cDNA microarrays and related technologies. Analysis frequently involves classification, regression, feature selection, outlier detection and cluster analysis, for example.
To provide a focus, this topic will be the main theme for this one-day Workshop, though contributions in related areas of bioinformatics are welcome. Contributed papers should ideally be in the area of new algorithmic or theoretical approaches to analysing such datasets as well as biologically interesting applications and validation of existing algorithms. To make sure the Workshop relates to issues of real importance to experimentalists, there will be four invited tutorial talks to introduce microarray technology, illustrate particular case studies and discuss issues relevant to eventual clinical application. The invited speakers are Pablo Tamayo or Todd Golub (Whitehead Institute, MIT), Dan Notterman (Princeton University), Roger Bumgarner (University of Washington) and Richard Simon (National Cancer Institute). The invited speakers have been involved in the preparation of well-known datasets and studies of expression analysis for a variety of cancers. Authors wishing to contribute papers should submit a title and extended abstract to both organisers (C.Campbell at bris.ac.uk and sayan at mit.edu) before 14th October 2001. Further details about this workshop and the final schedule are available from the workshop webpage. ------------------------------------------------------------------------- Machine Learning Methods for Images and Text Thomas Hofmann, Jaz Kandola, Tomaso Poggio, John Shawe-Taylor http://www.cs.rhul.ac.uk/colt/nipstext.html The aim of the workshop is to present new perspectives and new directions in information extraction from structured and semi-structured data for machine learning. The goal of this workshop is to investigate extensions of modern statistical learning techniques for applications in the domains of categorization and retrieval of information, for example text, video and sound, as well as their combination -- multimedia. The focus will be on exploring innovative and potentially groundbreaking machine learning technologies as well as on identifying key challenges in information access, such as multi-class classification, partially labeled examples and the combination of evidence from separate multimedia domains. The workshop aims to bring together an interdisciplinary group of international researchers from machine learning, information retrieval, computational linguistics, human-computer interaction, and digital libraries for discussing results and dissemination of ideas, with the objective of highlighting new research directions. The workshop will provide a forum for discussing results and problems in any of the above-mentioned areas. But more importantly, by the structure of the workshop we hope to examine the future directions and new perspectives that will keep the field lively and growing. We seek two types of contributions: 1) Contributed 20-minute talks that offer new directions (serving as a focal point for the general discussions) 2) Posters of new ongoing work, with associated spotlight presentations (summarising current work and serving as a springboard for individual discussion). Important Dates: Submission of extended abstracts: 15th October 2001. Notification of acceptance: 2nd November 2001. Submission Procedure: Extended abstracts in .ps or .pdf formats (only) should be e-mailed to nips-text-workshop at cs.rhul.ac.uk by 15th October 2001. Extended abstracts should be 2-4 sides of A4. The highlighting of a conference-style group for the paper is not necessary; however, the indication of a group and/or keywords would be helpful.
------------------------------------------------------------------------- Minimum Description Length: Developments in Theory and New Applications Peter Grunwald, In-Jae Myung, Mark Pitt http://quantrm2.psy.ohio-state.edu/injae/workshop.htm Inductive inference, the process of inferring a general law from observed instances, is at the core of science. The Minimum Description Length (MDL) Principle, which was originally proposed by Jorma Rissanen in 1978 as a computable approximation of Kolmogorov complexity, is a powerful method for inductive inference. The MDL principle states that the best explanation (i.e., model) given a limited set of observed data is the one that permits the greatest compression of the data. That is, the more we are able to compress the data, the more we learn about the underlying regularities that generated the data. This conceptualization originated in algorithmic information theory from the notion that the existence of regularities underlying data necessarily implies redundancy in the information from successive observations. Since 1978, significant strides have been made in both the mathematics and application of MDL. For example, MDL is now being applied in machine learning, statistical inference, model selection, and psychological modeling. The purpose of this workshop is to bring together researchers, both theorists and practitioners, to discuss the latest developments and share new ideas. In doing so, our intent is to introduce to the broader NIPS community the current state of the art in the field. ------------------------------------------------------------------------- Multi-sensory Perception & Learning J. Fisher, L. Shams, V. de Sa, M. Slaney, T. Darrell http://www.ai.mit.edu/people/fisher/nips01/perceptwshop/description/ All perception is multi-sensory perception. Situations where animals are exposed to information from a single modality exist only in experimental settings in the laboratory. For a variety of reasons, research on perception has focused on processing within one sensory modality. Consequently, the state of knowledge about multi-sensory fusion in mammals is largely at the level of phenomenology, and the underlying mechanisms and principles are poorly understood. Recently, however, there has been a surge of interest in this topic, and this field is emerging as one of the fastest growing areas of research in perception. Simultaneously, with the advent of low-cost, low-power multi-media sensors, there has been renewed interest in automated multi-modal data processing. Whether it be in an intelligent room environment, a heterogeneous sensor array or an autonomous robot, robust integrated processing of multiple modalities has the potential to solve perception problems more efficiently by leveraging complementary sensor information. The goals of this workshop are to further the understanding of both the cognitive mechanisms by which humans (and other animals) integrate multi-modal data and the means by which automated systems may similarly function. It is not our contention that one should follow the other. It is our contention that researchers in these different communities stand to gain much through interaction with each other. This workshop aims to bring these researchers together to compare methods and performance and to develop a common understanding of the underlying principles which might be used to analyze both human and machine perception of multi-modal data.
Discussions and presentations will span theory and application, as well as relevant aspects of animal/machine perception. The workshop will emphasize a moderated discussion format with short presentations prefacing each of the discussions. Please see the web page for some of the specific questions to be addressed. ------------------------------------------------------------------------- Neuroimaging: Tools, Methods & Modeling B. M. Bly, L. K. Hansen, S. J. Hanson, S. Makeig, S. Strother http://psychology.rutgers.edu/Users/ben/nips2001/nips2001workshop.html Advances in the mathematical description of neuroimaging data are currently a topic of great interest. Last June, at the 7th Annual Meeting of the Organization for Human Brain Mapping in Brighton, UK, the number of statistical modeling abstracts virtually exploded (30 abstracts were submitted on ICA alone). Because of its high relevance for researchers in statistical modeling, neuroimaging has been the topic of several NIPS workshops. Neuroinformatics is an emerging research field which, besides a rich modeling activity, is also concerned with database and data-mining issues as well as ongoing discussions of data and model sharing. Several groups now distribute statistical modeling tools, and advanced exploratory approaches are finding increasing use in neuroimaging labs. NIPS is a rich arena for multivariate and neural modeling; the intersection of neuroimaging and neural models is important for both fields. This workshop will discuss the underlying methods and software tools related to a variety of strategies for modeling and inference in neuroimaging data analysis (Morning, Day 1). Discussants will also present methods for comparison, evaluation, and meta-analysis in neuroimaging (Afternoon, Day 1). On the second day of the workshop, we will continue the discussion with a focus on multivariate strategies (Morning, Day 2). The workshop will include a discussion of hemodynamic and neural models and their role in mathematical modeling of neuroimaging data (Afternoon, Day 2). Each session of the two-day workshop will include discussion. Talks are intended to last roughly 20 minutes each, followed by 10 minutes of discussion. At the end of each day, there will be a discussion of themes by all participants, with the presenters acting as a panel. ------------------------------------------------------------------------- Foundations of Occam's razor and parsimony in learning David G. Stork http://www.rii.ricoh.com/~stork/OccamWorkshop.html "Entia non sunt multiplicanda praeter necessitatem" -- William of Occam (1285?-1349?) Occam's razor is generally interpreted as counselling the use of "simpler" models rather than complex ones, fewer parameters rather than more, and "smoother" generalizers rather than those that are less smooth. The mathematical descendants of this philosophical principle of parsimony appear in minimum-description-length, Akaike, Kolmogorov complexity and related principles, which have numerous manifestations in learning, for instance regularization, pruning, and overfitting avoidance. For a given quality of fit to the training data, in the absence of other information, should we favor "simpler" models, and if so, why? How do we measure simplicity, and which representation should we use when doing so? What assumptions are made -- explicitly or implicitly -- by these methods and when are such assumptions valid?
What are the minimum assumptions or conditions -- for instance that by increasing the amount of training data we will improve a classifier's performance -- that yield Occam's razor? Support Vector Machines and some neural networks contain a very large number of free parameters, more than might be permitted by the size of the training data and in seeming contradiction to Occam's razor; nevertheless, such classifiers can work exceedingly well. Why? Bayesian techniques such as ML-II reduce a classifier's complexity in a data-dependent way. Does this comport with Occam's razor? Can we characterize problems for which Occam's razor should or should not apply? Even if we abandon the search for the "true" model that generated the training data, can Occam's razor improve our chances of finding a "useful" model? It has been said that Occam's razor is either profound and true, or vacuous and false -- it just isn't clear which. Rather than address specific implementation techniques or applications, the goal of this workshop is to shed light on, and if possible resolve, the theoretical questions associated with Occam's razor, some of the deepest in the intellectual foundations of machine learning and pattern recognition. ------------------------------------------------------------------------- Quantum Neural Computing Elizabeth Behrman ------------------------------------------------------------------------- Variable and Feature Selection Isabelle Guyon, David Lewis http://www.clopinet.com/isabelle/Projects/NIPS2001/ Variable selection has recently received a lot of attention from the machine learning and neural network community because of its applications in genomics and text processing. Variable selection refers to the problem of selecting input variables that are most predictive of a given outcome. Variable selection problems are found in all machine learning tasks, supervised or unsupervised (clustering), classification, regression, time series prediction, two-class or multi-class, posing various levels of challenge. The objective of variable selection is two-fold: improving the prediction performance of the predictors and providing a better understanding of the underlying process that generated the data. This last problem is particularly important in biology, where the process may be a living organism and the variables gene expression coefficients. One of the goals of the workshop is to explore alternate statements of the problem, including: (i) discovering all the variables relevant to the concept (e.g. to identify all candidate drug targets) (ii) finding a minimum subset of variables that are useful to the predictor (e.g. to identify the best biomarkers for diagnosis or prognosis). The workshop will also be a forum to compare the best existing algorithms and to discuss the organization of a potential competition on variable selection for a future workshop. Prospective participants are invited to submit a one- or two-page summary. Theory, algorithm, and application contributions are welcome. After the workshop, the participants will be offered the possibility of submitting a full paper to a special issue of the Journal of Machine Learning Research on variable selection. Deadline for submission: October 15, 2001. Email submissions to: Isabelle Guyon at isabelle at clopinet.com.
------------------------------------------------------------------------- New Methods for Preference Elicitation Craig Boutilier, Holger Hoos, David Poole (chair), Qiang Yang http://www.cs.ubc.ca/spider/poole/NIPS/Preferences2001.html As intelligent agents become more and more adept at making (or recommending) decisions for users in various domains, the need for effective methods for the representation, elicitation, and discovery of preference and utility functions becomes more pressing. Deciding on the best course of action for a user depends critically on that user's preferences. While there has been much work on representing and learning models of the world (e.g., system dynamics), there has been comparatively little similar research with respect to preferences. The need to reason about preferences arises in electronic commerce, collaborative filtering, user interface design, task-oriented mobile robotics, reinforcement learning, and many other areas. Many areas of research bring interesting tools to the table that can be used to tackle these issues: machine learning (classification, reinforcement learning), decision theory and control theory (Markov decision processes, filtering techniques), Bayesian networks and probabilistic inference, economics and game theory, among others. The aim of this workshop is to bring together a diverse group of researchers to discuss both the practical and theoretical problems associated with effective preference elicitation and to highlight avenues for future research. The deadline for extended abstracts and statements of interest is October 19. From wolfskil at MIT.EDU Wed Oct 10 13:26:17 2001 From: wolfskil at MIT.EDU (Jud Wolfskill) Date: Wed, 10 Oct 2001 13:26:17 -0400 Subject: book announcement--Marcus Message-ID: <5.0.2.1.2.20011010132201.0546ed08@po14.mit.edu> I thought readers of the Connectionist List might be interested in this book. For more information please visit http://mitpress.mit.edu/0262133792 Best, Jud The Algebraic Mind Integrating Connectionism and Cognitive Science Gary F. Marcus In The Algebraic Mind, Gary F. Marcus integrates two competing theories about how the mind works, one which says that the mind is a computer-like manipulator of symbols, and another which says that the mind is a large network of neurons working together in parallel. Refuting the conventional wisdom that says that if the mind is a large neural network it cannot simultaneously be a manipulator of symbols, Marcus shows how neural systems could be organized so as to manipulate symbols, why such systems explain language and cognition better than systems that eschew symbols, how such systems could evolve, and how they might unfold developmentally within the womb. The Algebraic Mind revamps our understanding of models in cognitive neuroscience and helps to set a new agenda for the field. Gary F. Marcus is Associate Professor of Psychology at New York University.
6 x 9, 225 pp., 48 illus., cloth ISBN 0-262-13379-2 Learning, Development, and Conceptual Change series A Bradford Book Jud Wolfskill Associate Publicist MIT Press 5 Cambridge Center, 4th Floor Cambridge, MA 02142 617.253.2079 617.253.1709 fax wolfskil at mit.edu From zemel at cs.toronto.edu Wed Oct 10 19:59:58 2001 From: zemel at cs.toronto.edu (Richard Zemel) Date: Wed, 10 Oct 2001 19:59:58 -0400 Subject: NIPS*2001 registration Message-ID: <01Oct10.200006edt.453166-29679@jane.cs.toronto.edu> You are invited to attend the 14th annual conference of NIPS*2001, Neural Information Processing Systems, at the Hyatt Regency in Vancouver, British Columbia, Canada and workshops at the Whistler ski resort near Vancouver. http://www-2.cs.cmu.edu/Groups/NIPS/ Tutorials: December 3, 2001 Conference: December 4-6, 2001 Workshops: December 6-8, 2001 The DEADLINE for reduced early registration fees is November 2, 2001. Registration can now be made online through a secure credit card link or through bank wire transfer, fax, and check: https://www.nips.salk.edu/regist.html Because the number of submissions this year increased to 650, we were able to accept 173 and maintain the same high standards: http://www-2.cs.cmu.edu/Groups/NIPS/NIPS2001/nips-program.html All registrants this year will receive a CD-ROM of the conference proceedings, which will also be available free online. The 2 volume soft-cover format, published by the MIT Press, can be purchased at a special conference rate. The last month has been a difficult time for everyone. The organizing committee for NIPS*2001 has been working hard to ensure that the program and facilities for the annual meeting are better than ever. Vancouver is a beautiful city with many excellent restaurants within a short walk of the conference. The base at Whistler is at 2,200 feet, substantially lower than ski resorts in Colorado. We hope you will join us in Vancouver for an exciting new NIPS*2001 Terry Sejnowski ----------------------------------------------- NIPS*2001 TUTORIALS - December 3, 2001 Luc Devroye, McGill University - Nonparametric Density Estimation: VC to the Rescue Daphne Koller, Stanford, and Nir Friedman, Hebrew University - Learning Bayesian Networks from Data Shawn Lockery, University of Oregon - Why the Worm Turns: How to Analyze the Behavior of an Animal and Model Its Neuronal Basis Christopher Manning, Stanford University - Probabilistic Linguistics and Probabilistic Models of Natural Language Processing Bernhard Scholkopf, Biowulf Technologies and Max-Planck Institute for Biological Cybernetics - SVM and Kernel Methods Sebastian Thrun, Carnegie Mellon University - Probabilistic Robotics INVITED SPEAKERS - December 4-6, 2001 Barbara Finlay, Cornell University - How Brains Evolve, and the Consequences for Computation Alison Gopnik, UC Berkeley - Babies and Bayes-nets: Causal Inference and Theory-formation in Children, Chimps, Scientists and Computers Jon M. Kleinberg, Cornell University - Decentralized Network Algorithms: Small-world Phenomena and the Dynamics of Information Tom Knight, MIT - Computing with Life Judea Pearl, UCLA - Causal Inference As an Exercise in Computational Learning Shihab Shamma, U. Maryland - Common Principles in Auditory and Visual Processing WORKSHOPS - December 6-8, 2001 Activity-Dependent Synaptic Plasticity - Paul Munro Artificial Neural Networks in Safety-Related Areas - Johann Schumann Brain-Computer Interfaces - Lucas Parra Causal Learning and Inference in Humans & Machines - Joshua B. 
Tenenbaum Competition: Unlabeled Data for Supervised Learning - Stefan C. Kremer Computational Neuropsychology - Mike Mozer Geometric Methods in Learning - Amir H. Assadi Information & Statistical Structure in Spike Trains - Jonathon D. Victor Kernel-Based Learning - John Shawe-Taylor and Craig Saunders Knowledge Representation in Meta-Learning - Ricardo Vilalta Machine Learning in Bioinformatics - Colin Campbell, Sayan Mukherjee Machine Learning Methods for Text and Images - Jaz Kandola Minimum Description Length - Peter Grunwald Multi-sensory Perception & Learning - Ladan Shams, John Fisher Neuroimaging: Tools, Methods & Modeling - Steve Hanson Occam's Razor & Parsimony in Learning - David Stork Preference Elicitation - David Poole Quantum Neural Computing - Elizabeth Behrman Variable & Feature Selection - Isabelle Guyon ----------------------------------------------- From dmedler at mcw.edu Wed Oct 10 13:24:35 2001 From: dmedler at mcw.edu (David A. Medler) Date: Wed, 10 Oct 2001 12:24:35 -0500 Subject: Postdoctoral Fellowship in Speech and Language Processing Message-ID: <3BC48453.62F314E0@mcw.edu> POSTDOCTORAL FELLOWSHIP COGNITIVE NEUROSCIENCE OF SPEECH AND LANGUAGE PROCESSING The Medical College of Wisconsin The Language Imaging Laboratory, Department of Neurology, Medical College of Wisconsin, announces an NIH-funded postdoctoral position in cognitive neuroscience of language processes. Applicants will join a research team studying word recognition, speech perception, semantics, and language development using fMRI, neural network modeling, and event-related potentials. Computer proficiency and exposure to computational models of language processing are desirable. Facilities include state-of-the-art 3T and 1.5T fMRI systems dedicated to research and supported by a large physics and engineering core. Ample scanner time and training in fMRI techniques will be provided. Applicants should have a PhD in experimental psychology, linguistics, cognitive neuroscience, computing science, or related field. Send curriculum vitae, statement of research interests, and two letters of recommendation to: Jeffrey Binder, Department of Neurology, Medical College of Wisconsin, 9200 W. Wisconsin Ave., Milwaukee, WI 53226. Email: jbinder at mcw.edu. Fax: 414-259-0469. Equal Opportunity Employer. -- David A. Medler, Ph.D. dmedler at mcw.edu Department of Neurology, The Medical College of Wisconsin 8701 Watertown Plank Rd, MEB 4550 Milwaukee, WI 53226 From jek at first.fraunhofer.de Thu Oct 11 12:44:59 2001 From: jek at first.fraunhofer.de (Jens Kohlmorgen) Date: Thu, 11 Oct 2001 18:44:59 +0200 (MET DST) Subject: Paper available: An On-line Method for Segmentation and Identification of Non-stationary Time Series Message-ID: Readers of the connectionists list might be interested in the following paper: Kohlmorgen, J., Lemm, S. (2001), "An On-line Method for Segmentation and Identification of Non-stationary Time Series" in: Neural Networks for Signal Processing XI, IEEE, NJ, pp. 113-122. It is available from http://www.first.gmd.de/~jek/Kohlmorgen.Jens/publications.html Abstract: We present a method for the analysis of non-stationary time series from dynamical systems that switch between multiple operating modes. In contrast to other approaches, our method processes the data incrementally and without any training of internal parameters. It straightaway performs an unsupervised segmentation and classification of the data on-the-fly. In many cases it even allows to process incoming data in real-time. 
The main idea of the approach is to track and segment changes of the probability density of the data in a sliding window on the incoming data stream. An application to a switching dynamical system demonstrates the potential usefulness of the algorithm in a broad range of applications. ========================================================================== Dr. Jens Kohlmorgen Tel.(office) : +49 30 6392-1875 Tel.(secret.): +49 30 6392-1800 Intelligent Data Analysis Group Fax : +49 30 6392-1805 Fraunhofer-FIRST (former GMD FIRST) Kekulestr. 7 e-mail: jek at first.fraunhofer.de 12489 Berlin, Germany http://www.first.fraunhofer.de/~jek ========================================================================== From rsun at cecs.missouri.edu Thu Oct 11 14:21:50 2001 From: rsun at cecs.missouri.edu (rsun@cecs.missouri.edu) Date: Thu, 11 Oct 2001 13:21:50 -0500 Subject: papers on hybrid reinforcement learning Message-ID: <200110111821.f9BILou23150@ari1.cecs.missouri.edu> Two papers on hybrid reinforcement learning: combining symbolic and neural methods for reinforcement learning accessible from http://www.cecs.missouri.edu/~rsun/hybrid-rl.html ---------------------------------------------------------------------------- Supplementing Neural Reinforcement Learning with Symbolic Methods Ron Sun Several different ways of using symbolic methods to enhance reinforcement learning are identified and discussed in some detail. Each demonstrates to some extent the potential advantages of combining RL and symbolic methods. Different from existing work, in combining RL and symbolic methods, we focus on autonomous learning from scratch without a priori domain-specific knowledge. Thus the role of symbolic methods lies truly in enhancing learning, not in providing a priori domain-specific knowledge. These discussed methods point to the possibilities and the challenges in this line of research. ---------------------------------------------------------------------------- Beyond Simple Rule Extraction: Acquiring Planning Knowledge from Neural Networks Ron Sun Todd Peterson Chad Sessions This paper discusses learning in hybrid models that goes beyond simple classification rule extraction from backpropagation networks. Although simple rule extraction has received a lot of research attention, we need to further develop hybrid learning models that learn autonomously and acquire both symbolic and subsymbolic knowledge. It is also necessary to study autonomous learning of both subsymbolic and symbolic knowledge in integrated architectures. This paper will describe planning knowledge extraction from neural reinforcement learning that goes beyond extracting simple rules. It includes two approaches towards extracting planning knowledge: the extraction of symbolic rules from neural reinforcement learning, and the extraction of complete plans. This work points to a general framework for achieving the subsymbolic to symbolic transition in an integrated autonomous learning framework. ------------------------------------------------------------------------- Both papers are accessible from http://www.cecs.missouri.edu/~rsun/hybrid-rl.html =========================================================================== Prof. 
Ron Sun http://www.cecs.missouri.edu/~rsun CECS Department phone: (573) 884-7662 University of Missouri-Columbia fax: (573) 882 8318 201 Engineering Building West Columbia, MO 65211-2060 email: rsun at cecs.missouri.edu http://www.cecs.missouri.edu/~rsun http://www.cecs.missouri.edu/~rsun/journal.html http://www.elsevier.com/locate/cogsys =========================================================================== From roli at diee.unica.it Thu Oct 11 10:59:03 2001 From: roli at diee.unica.it (Fabio Roli) Date: Thu, 11 Oct 2001 15:59:03 +0100 Subject: IF Special Issue on Fusion of Multiple Classifiers Message-ID: Call for papers for a special issue on Fusion of Multiple Classifiers Information Fusion An International Journal on Multi-Sensor, Multi-Source Information Fusion An Elsevier Science Publication Editor-in-Chief: Dr. Belur V. Dasarathy belur.d at dynetics.com; ifjournal at yahoo.com The Information Fusion Journal has planned for publication, in the latter half of 2002, a special issue devoted to the fusion of multiple classifiers. Classifier fusion has recently become an important tool for enhancing the performance of pattern recognition systems. A myriad of techniques have been developed for combining classifiers at the decision or soft decision output level. These techniques have been conceived by researchers in many diverse communities including Machine Learning, Pattern Recognition, Neural Networks, Statistics, and Artificial Intelligence. The aim of this special issue is to provide a focal point for recent advances in this methodological area of pattern recognition across different paradigms and disciplines. Submitted papers should report new theories underpinning classifier combination, novel methodologies, applications where classifier fusion significantly enhanced the recognition system performance, or extensive comparative studies of different combination rules. Topics appropriate for this special issue include, but are not limited to: - Decision level fusion - Strategies for multiple classifier fusion - Bagging and boosting - Neural network ensembles - Multiple classifier design - Fusion of one-class classifiers - Fusion of measurement and contextual information - Innovative applications Prospective authors should follow the regular paper preparation guidelines of the Journal. Submission can be done electronically in .pdf or .ps format, along with four hard copies sent to one of the Guest Editors listed below: Guest Editors Prof. Josef Kittler Center for Vision, Speech and Signal Proc. Univ. of Surrey, Guildford, Surrey GU2 5XH, UK e-mail j.kittler at eim.surrey.ac.uk Prof. Fabio Roli Dept. of Electrical and Electronic Eng. Univ. of Cagliari, 09123, Cagliari, Italy email roli at diee.unica.it Deadline for Submission: November 15, 2001 -- ==================================================================== Fabio Roli, Ph.D. Associate Professor of Computer Science Electrical and Electronic Engineering Dept.
- University of Cagliari Piazza d'Armi 09123 Cagliari Italy Phone +39 070 675 5874 Fax +39 070 6755900 e-mail roli at diee.unica.it Web Page at http://www.diee.unica.it/~roli/info.html From bogus@does.not.exist.com Thu Oct 11 15:55:02 2001 From: bogus@does.not.exist.com () Date: Thu, 11 Oct 2001 12:55:02 -0700 Subject: Research position at Microsoft Research Cambridge, with associated College Fellowship Message-ID: <1D4512C91A0E044FBF24DDB221C94C5303E41645@red-msg-29.redmond.corp.microsoft.com> From roli at diee.unica.it Thu Oct 11 10:53:55 2001 From: roli at diee.unica.it (Fabio Roli) Date: Thu, 11 Oct 2001 15:53:55 +0100 Subject: MCS 2002 - Third International Workshop on Multiple Classifier Systems Message-ID: **Apologies for multiple copies** ****************************************** *****MCS 2002 Call for Papers***** ****************************************** *****Paper Submission: 1 FEBRUARY 2002***** *********************************************************************** THIRD INTERNATIONAL WORKSHOP ON MULTIPLE CLASSIFIER SYSTEMS Grand Hotel Chia Laguna, Cagliari, Italy, June 24-26 2002 Updated information: http://www.diee.unica.it/mcs E-mail: mcs at diee.unica.it *********************************************************************** WORKSHOP OBJECTIVES MCS 2002 is the third workshop of a series aimed at creating a common international forum for researchers of the diverse communities working in the field of multiple classifier systems. Information on the previous editions of the MCS workshop can be found at www.diee.unica.it/mcs. Contributions from all the research communities working in the field are welcome in order to compare the different approaches and to define the common research priorities. Special attention is also devoted to assessing the applications of multiple classifier systems. The papers will be published in the workshop proceedings, and extended versions of selected papers will be considered for publication in a special issue of the International Journal of Pattern Recognition and Artificial Intelligence. WORKSHOP CHAIRS Josef Kittler (Univ. of Surrey, United Kingdom) Fabio Roli (Univ. of Cagliari, Italy) ORGANIZED BY Dept. of Electrical and Electronic Eng. of the University of Cagliari Center for Vision, Speech and Signal Proc. of the University of Surrey Sponsored by IAPR TC1 Statistical Pattern Recognition Techniques PAPER SUBMISSION Three hard copies of the full paper should be mailed to: MCS 2002 Prof. Fabio Roli Dept. of Electrical and Electronic Eng. University of Cagliari Piazza d'armi 09123 Cagliari Italy In addition, participants should submit an electronic version of the manuscript (PostScript or PDF format) to mcs at diee.unica.it. The papers should not exceed 10 pages (LNCS format, see http://www.springer.de/comp/lncs/authors.html). A cover sheet with the authors' names and affiliations is also requested, with the complete address of the corresponding author, and an abstract (200 words). Two members of the Scientific Committee will referee the papers. IMPORTANT NOTICE: Submission implies the willingness of at least one author to register, attend the workshop, and present the paper. Accepted papers will be published in the proceedings only if the registration form and payment for one of the authors have been received.
WORKSHOP TOPICS Papers describing original work in the following and related research topics are welcome: Foundations of multiple classifier systems Methods for classifier fusion Design of multiple classifier systems Neural network ensembles Bagging and boosting Mixtures of experts New and related approaches Applications INVITED SPEAKERS Joydeep Ghosh (University of Texas, USA) Trevor Hastie (Stanford University, USA) Sarunas Raudys (Vilnius University, Lithuania) SCIENTIFIC COMMITTEE J. A. Benediktsson (Iceland) H. Bunke (Switzerland) L. P. Cordella (Italy) B. V. Dasarathy (USA) R. P.W. Duin (The Netherlands) C. Furlanello (Italy) J. Ghosh (USA) T. K. Ho (USA) S. Impedovo (Italy) N. Intrator (Israel) A.K. Jain (USA) M. Kamel (Canada) L.I. Kuncheva (UK) L. Lam (Hong Kong) D. Landgrebe (USA) D-S. Lee (USA) D. Partridge (UK) A.J.C. Sharkey (UK) K. Tumer (USA) G. Vernazza (Italy) T. Windeatt (UK) IMPORTANT DATES February 1, 2002: Paper Submission March 15, 2002: Notification of Acceptance April 10, 2002: Camera-ready Manuscript April 10, 2002: Registration WORKSHOP VENUE The workshop will be held at Grand Hotel Chia Laguna, Cagliari, Italy. See http://www.crs4.it/~zip/EGVISC95/chia_laguna.html (in English) or http://web.tiscali.it/chialaguna (in Italian). WORKSHOP PROCEEDINGS Accepted papers will appear in the workshop proceedings that will be published in the series Lecture Notes in Computer Science by Springer-Verlag. Extended versions of selected papers will be considered for possible publication in a special issue of the International Journal of Pattern Recognition and Artificial Intelligence. -- ==================================================================== Fabio Roli, Ph.D. Associate Professor of Computer Science Electrical and Electronic Engineering Dept. - University of Cagliari Piazza d'Armi 09123 Cagliari Italy Phone +39 070 675 5874 Fax +39 070 6755900 e-mail roli at diee.unica.it Web Page at http://www.diee.unica.it/~roli/info.html From efiesler at intopsys.com Thu Oct 11 20:05:01 2001 From: efiesler at intopsys.com (Emile Fiesler) Date: Thu, 11 Oct 2001 17:05:01 -0700 Subject: Vacancy in Southern California for a research scientist. Message-ID: <000701c152b1$8a6d6a40$0615010a@efeisler> Intelligent Optical Systems (IOS), a world leader in the development of innovative optical sensors, is seeking a Research Scientist with expertise and hands-on experience in advanced signal and image processing, including neural computation. The ideal candidate will have a doctoral degree in computer science, electrical engineering, or equivalent, plus experience in securing funding through proposal writing. Expertise in spectroscopy and image enhancement is a definite plus. Functions would include image analysis and enhancement, biomedical diagnosis, chemical analysis, and object recognition, using AI and neural computation-based implementations in software and hardware. We offer a competitive salary and benefits package.
Please send your application, including CV, and 3 references, by e-mail to: OHuang at intopsys.com Emile Fiesler From holte at cs.ualberta.ca Sat Oct 13 14:31:44 2001 From: holte at cs.ualberta.ca (Robert Holte) Date: Sat, 13 Oct 2001 12:31:44 -0600 (MDT) Subject: Machine Learning journal Message-ID: Dear colleagues, In response to the widely circulated letter of resignation of some members of the Machine Learning journal (MLJ), I would like to make two points: - MLJ articles *are* universally electronically accessible - MLJ seeks your support and input to continue serving the community The accessibility of MLJ papers has been dramatically improved in the past 12 months. The main changes are these: - the copyright agreement gives the author the right to distribute individual copies of an MLJ paper to students and colleagues, physically and electronically, including making the paper available from the author's personal web site. - all MLJ papers are freely available online at Kluwer's web page http://www.wkap.nl/kaphtml.htm/MACHFCP from the time of acceptance until the paper appears in print. - the individual MLJ subscription price has been dramatically reduced. It is excellent value for money: for $120 Kluwer prints, binds, and mails to your door around 1350 pages. As a consequence of the first two points, MLJ articles are universally accessible -- from Kluwer's home page in the first six months or so, and at any time from the author's home page. The primary purpose of paid subscriptions, in this new distribution model, is to enable an individual or institution to obtain a bound archival copy of the journal printed on high-quality paper -- exactly the same role served by the printed version of JMLR sold by MIT Press. Turning to the second point, all members of both editorial boards have the interests of the machine learning community at heart. Our job is to serve you. The current members of the MLJ board, and the new members we are in the process of adding, believe it is in the best interests of the research community to keep MLJ alive and strong at this time. This is not to say we hope JMLR will fail. There is ample excellent research to support two high-quality journals, so it is not necessary for one of the journals to be destroyed in order for the other to succeed. If you agree that MLJ is useful to the community and has a role to play in the future, I would like to hear from you - feedback from the community is the very best way for me to know how to steer MLJ's course so it best serves the community. -- Robert Holte holte at cs.ualberta.ca Executive Editor Machine Learning From giro-ci0 at wpmail.paisley.ac.uk Mon Oct 15 03:35:24 2001 From: giro-ci0 at wpmail.paisley.ac.uk (Mark Girolami) Date: Mon, 15 Oct 2001 08:35:24 +0100 Subject: 24th BCS-IRSG European Colloquium on IR Research Message-ID: There is an increasing level of research interest within the connectionist and machine learning communities on a number of aspects of information retrieval * evidenced by the number of papers appearing in recent NIPS and ICML conferences as well as recently organised post-conference workshops at NIPS on document mining and retrieval. Therefore the following cfp will be of interest to the connectionist and ml mailing lists. 
Rgds Mark Girolami ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ The 24th BCS-IRSG European Colloquium on Information Retrieval (IR) Research - which was the precursor of the ACM SIGIR conference - is being held in the city of Glasgow, Scotland and submissions reporting recent research work in this area are welcomed. ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 24th BCS-IRSG European Colloquium on IR Research March 25-27, 2002, Glasgow, Scotland, UK http://www.cs.strath.ac.uk/ECIR02/ The colloquium on information retrieval research provides an opportunity for both new and established researchers to present papers describing work in progress or final results. These Colloquia were established by the BCS IRSG (British Computer Society Information Retrieval Specialist Group), and named the Annual Colloquium on Information Retrieval Research. Recently, the location of the colloquium has alternated between the United Kingdom and continental Europe. To reflect the growing European orientation of the event, the Colloquium was renamed "European Annual Colloquium on Information Retrieval Research" from 2001. The previous five colloquia have been held in Darmstadt (2001), Cambridge (2000), Glasgow (1999), Grenoble (1998), and Aberdeen (1997). Details The colloquium on information retrieval research provides an opportunity for both new and established researchers to present papers describing work in progress or final results. Relevant papers should address (at the theoretical, methodological, system or application level) the analysis, design or evaluation of functions like: Indexing Information Extraction Data Mining Browsing Retrieval and Filtering User Interaction for the following types of documents and databases: Monomedia documents (e.g. text, images, audio, voice, video) Composite documents Multimedia documents Hypermedia documents Active documents Distributed documents and databases Digital Libraries the Web Organising Committee Dr Fabio Crestani, Department of Computer & Information Sciences, University of Strathclyde, GLASGOW. Prof Mark Girolami, Deparment of Computer Science, University of Paisley, PAISLEY. Prof Keith van Rijsbergen, Department of Computing Science, University of Glasgow, GLASGOW. ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Legal disclaimer -------------------------- The information transmitted is the property of the University of Paisley and is intended only for the person or entity to which it is addressed and may contain confidential and/or privileged material. Statements and opinions expressed in this e-mail may not represent those of the company. Any review, retransmission, dissemination and other use of, or taking of any action in reliance upon, this information by persons or entities other than the intended recipient is prohibited. If you received this in error, please contact the sender immediately and delete the material from any computer. 
-------------------------- From bischof at icg.tu-graz.ac.at Mon Oct 15 05:27:19 2001 From: bischof at icg.tu-graz.ac.at (Horst Bischof) Date: Mon, 15 Oct 2001 11:27:19 +0200 Subject: Reminder CfP Special Issue PR Message-ID: <3BCAABF7.3050001@icg.tu-graz.ac.at> Pattern Recognition The Journal of the Pattern Recognition Society Special Issue on Kernel and Subspace Methods for Computer Vision http://www.prip.tuwien.ac.at/~bis/cfp-pr.html Guest Editors: Ales Leonardis, Faculty of Computer and Information Science, University of Ljubljana, Trzaska 25, 1001 Ljubljana, Slovenia, alesl at fri.uni-lj.si; Horst Bischof, Pattern Recognition and Image Processing Group, Vienna University of Technology, Favoritenstr. 9/1832, A-1040 Vienna, Austria, bis at prip.tuwien.ac.at This Pattern Recognition Special Issue will address new developments in the area of kernel and subspace methods related to computer vision. High-quality original journal paper submissions are invited. The topics of interest include (but are not limited to): Support Vector Machines, Independent Component Analysis, Principal Component Analysis, Mixture Modeling, Canonical Correlation Analysis, etc. applied to computer vision problems such as: Object Recognition, Navigation and Robotics, Medical Imaging, 3D Vision, etc. All submitted papers will be peer reviewed. Only high-quality, original submissions will be accepted for publication in the Special Issue, in accordance with the Pattern Recognition guidelines (http://www.elsevier.nl/inca/publications/store/3/2/8/index.htt). Submission Timetable Submission of full manuscript: November 30, 2001 Notification of Acceptance: March 29, 2002 Submission of revised manuscript: End of June 2002 Final Decision: August 2002 Final papers: September 2002 Submission Procedure All submissions should follow the Pattern Recognition Guidelines and should be submitted electronically via anonymous ftp in either postscript or pdf format (compressed with zip or gzip). Files should be named by the surname of the first author i.e., surname.ps.gz, for multiple submissions surname1, surname2, ... should be used. Papers should be uploaded to the following ftp site by the deadline of 30th November 2001. ftp ftp.prip.tuwien.ac.at [anonymous ftp, i.e.: Name: ftp Password: < your email address > ] cd sipr binary put <surname>.ext quit After uploading the paper, authors should email the guest editor Ales Leonardis giving full details of the paper title and authors. -------------------------------------------------------------------------- -- !!!!!!!!!!!!!!!!! ATTENTION NEW ADDRESS !!!!!!!!!!!! Horst Bischof Institute for Computer Graphics and Vision TU Graz Inffeldgasse 16 2. OG A-8010 Graz AUSTRIA email: bischof at icg.tu-graz.ac.at Tel.: +43-316-873-5014 Fax.: +43-316-873-5050 !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! From wolpert at hera.ucl.ac.uk Tue Oct 16 06:13:17 2001 From: wolpert at hera.ucl.ac.uk (Daniel Wolpert) Date: Tue, 16 Oct 2001 11:13:17 +0100 Subject: Two Postdoctoral Fellowships in Motor Control Message-ID: <015c01c1562b$2d277e70$40463ec1@APHRODITE> Two Postdoctoral Fellowships are available in 1. Computational models of imitation and 2. Human Sensorimotor Control ---------------------------------------------------------------------------- 1. Postdoctoral Fellowship Computational models of imitation Sobell Research Department of Motor Neuroscience & Movement Disorders, Institute of Neurology, University College London, London & ATR Human Information Science Laboratories, Japan Advisors: Dr.
Daniel Wolpert (IoN/UCL, London) www.hera.ucl.ac.uk & Mitsuo Kawato (ATR, Japan) www.his.atr.co.jp/~kawato/ We have an opening for a highly motivated Postdoctoral Fellow to work on an international collaborative project funded by the McDonnell Foundation entitled "Mechanisms of Forward Thinking and Behaviour". The Fellow will investigate the computational mechanisms by which the motor system can be used to decode the observed actions of others. Computational architectures will be developed for action observation, imitation and communication. The ideal candidate will have a PhD in a related area and strong mathematical and computational skills. Each year the postdoctoral fellow will spend 9 months in London with Dr Daniel Wolpert, to whom informal enquiries are welcome (wolpert at hera.ucl.ac.uk), and 3 months in Kyoto, Japan, working with Dr Mitsuo Kawato. The position is available for two years and the Fellow could start immediately. Starting salary is up to £30,453 pa inclusive, depending on experience. Applications (2 copies of CV and names of 3 referees) to Miss E Bertram, Assistant Secretary (Personnel), Institute of Neurology, Queen Square, London WC1N 3BG (fax: +44 (0)20 7278 5069) by 6th November 2001. Working toward Equal Opportunity ---------------------------------------------------------------------------- 2. Postdoctoral Fellowship Human Sensorimotor Control Sobell Research Department of Motor Neuroscience & Movement Disorders Institute of Neurology, University College London Advisor: Dr. Daniel Wolpert The Sobell Department of Motor Neuroscience & Movement Disorders has an opening for a highly motivated Postdoctoral Fellow in the area of computational and experimental human motor control. The Fellow will join a team investigating planning, control and learning of skilled action. The ideal candidate will have a PhD, technical expertise and computational skills relevant to the study of human movement. The sensorimotor control laboratory houses state-of-the-art equipment for the collection of kinematic (Optotrak & multiple flock-of-birds), force (multiple six axis force transducers) and physiological data (EMG). In addition, equipment is available for the provision of online visual feedback (both 3D and 2D virtual reality systems), and for the perturbation of movements (two robotic Phantom haptic interfaces, muscle stimulation and TMS facilities). The project, funded by a Wellcome Programme Grant, is under the direction of Dr. Daniel Wolpert, to whom informal enquiries are welcome (wolpert at hera.ucl.ac.uk). The position is available for three years with a starting date from January 2002. Further details of the post and laboratory are available on www.hera.ucl.ac.uk. Starting salary is up to £30,453 pa inclusive, depending on experience. Applications (2 copies of CV and names of 3 referees) to Miss E Bertram, Assistant Secretary (Personnel), Institute of Neurology, Queen Square, London WC1N 3BG (fax: +44 (0)20 7278 5069) by 6th November 2001.
Working toward Equal Opportunity ---------------------------------------------------------------------------- From terry at salk.edu Fri Oct 19 19:15:01 2001 From: terry at salk.edu (Terry Sejnowski) Date: Fri, 19 Oct 2001 16:15:01 -0700 (PDT) Subject: NEURAL COMPUTATION 13:11 Message-ID: <200110192315.f9JNF1j49876@purkinje.salk.edu> Neural Computation - Contents - Volume 13, Number 11 - November 1, 2001 ARTICLE Predictability, Complexity and Learning William Bialek, Ilya Nemenman, and Naftali Tishby NOTE Dendritic Subunits Determined by Dendritic Morphology K. A. Lindsay, J. M. Ogden and J. R. Rosenberg LETTERS Computing the Optimally Fitted Spike Train for a Synapse Thomas Natschlager and Wolfgang Maass Period Focusing Induced by Network Feedback in Populations of Noisy Integrate-and-Fire Neurons Francisco B. Rodriguez, Alberto Suarez, Vicente Lopez A Variational Method for Learning Sparse and Overcomplete Representations Mark Girolami Random Embedding Machines for Pattern Recognition Yoram Baram Manifold Stochastic Dynamics for Bayesian Learning Mark Zlochin and Yoram Baram Resampling Method for Unsupervised Estimation of Cluster Validity Erel Levine and Eytan Domany The Whitney Reduction Network: A Method for Computing Autoassociative Graphs D. S. Broomhead and M. J. Kirby Enhanced 3D Shape Recovery Using the Neural-Based Hybrid Reflectance Model Siu-Yeung Cho and Tommy W. S. Chow ----- ON-LINE - http://neco.mitpress.org/ SUBSCRIPTIONS - 2001 - VOLUME 13 - 12 ISSUES
                  USA    Canada*   Other Countries
Student/Retired   $60    $64.20    $108
Individual        $88    $94.16    $136
Institution       $460   $492.20   $508
* includes 7% GST
MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 577-1545 journals-orders at mit.edu ----- From psyc at coglit.ecs.soton.ac.uk Fri Oct 19 04:12:40 2001 From: psyc at coglit.ecs.soton.ac.uk (PSYCOLOQUY (Electronic Journal)) Date: Fri, 19 Oct 2001 09:12:40 +0100 (BST) Subject: Psycoloquy: Call for Submissions. Message-ID: <200110190812.JAA07734@coglit.ecs.soton.ac.uk> PSYCOLOQUY CALL FOR ARTICLES PSYCOLOQUY is a refereed international, interdisciplinary electronic journal sponsored by the American Psychological Association (APA) and indexed by APA's PsycINFO and by the Institute for Scientific Information. http://www.apa.org/psycinfo/about/covlist.html PSYCOLOQUY publishes target articles and peer commentary in all areas of psychology as well as cognitive science, neuroscience, behavioral biology, artificial intelligence, robotics/vision, linguistics and philosophy. DIRECT SUBMISSIONS TO: psyc at pucc.princeton.edu Further information is available on the Psycoloquy website: http://www.princeton.edu/~harnad/psyc.html http://www.cogsci.soton.ac.uk/psyc Instructions for authors may be found at: http://www.princeton.edu/~harnad/psyc.html#inst http://www.cogsci.soton.ac.uk/psycoloquy/#inst Below is a list of other recently published PSYCOLOQUY treatments that are currently undergoing Open Peer Commentary: Navon, D. (2001), The Puzzle of Mirror Reversal: A View From Clockland. Psycoloquy 12 (017) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.017 Kramer, D. & Moore, M. (2001), Gender Roles, Romantic Fiction and Family Therapy. Psycoloquy 12 (024) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.024 Sherman, J. A. (2001), Evolutionary Origin of Bipolar Disorder (EOBD). Psycoloquy 12 (028) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.028 Overgaard, M. (2001), The Role of Phenomenological Reports in Experiments on Consciousness.
Psycoloquy 12 (029) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.029 Crow, T. J. (2000) Did Homo Sapiens Speciate on the Y Chromosome? Psycoloquy 11 (001) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?11.001 Margolis, H. (2000) Wason's Selection Task with A Reduced Array Psycoloquy 11 (005) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?11.005 Place, U. T. (2000) The Role of the Hand in the Evolution of Language Psycoloquy 11 (007) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?11.007 Green, C. D. (2000) Is AI the Right Method for Cognitive Science? Psycoloquy 11 (061) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?11.061 Reifman, A. (2000) Revisiting the Bell Curve Psycoloquy 11 (099) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?11.099 SPECIAL SET OF 6 TARGET ARTICLES ON NICOTINE ADDICTION: Balfour, D. (2001), The Role of Mesolimbic Dopamine in Nicotine Dependence. Psycoloquy 12(001) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.001 Le Houezec, J. (2001), Non-Dopaminergic Pathways in Nicotine Dependence. Psycoloquy 12 (002) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.002 Oscarson, M. (2001), Nicotine Metabolism by the Polymorphic Cytochrome P450 2A6 (CYP2A6) Enzyme: Implications for Interindividual Differences in Smoking Behaviour. Psycoloquy 12 (003) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.003 Sivilotti, L. (2001), Nicotinic Receptors: Molecular Issues. Psycoloquy 12 (004) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.004 Smith, G. & Sachse, C. (2001), A Role for CYP2D6 in Nicotine Metabolism? Psycoloquy 12 (005) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.005 Wonnacott, S. (2001), Nicotinic Receptors in Relation to Nicotine Addiction. Psycoloquy 12 (006) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.006 MULTIPLE BOOK REVIEWS: Ben-Ze'ev, A. (2001), The Subtlety of Emotions. Psycoloquy 12 (007) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.007 Miller, G. F. (2001), The Mating Mind: How Sexual Choice Shaped the Evolution of Human Nature. Psycoloquy 12 (008) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.008 Bolton, D. & Hill, J. (2001), Mind, Meaning & Mental Disorder: The Nature of Causal Explanation in Psychology & Psychiatry. Psycoloquy 12 (018) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.018 Zachar, P. (2001), Psychological Concepts and Biological Psychiatry: A Philosophical Analysis. Psycoloquy 12 (023) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.023 Praetorius, N. (2001), Principles of Cognition, Language and Action: Essays on the Foundations of a Science of Psychology. Psycoloquy 12 (027) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.027 Carstairs-McCarthy, A. (2000) The Origins of Complex Language Psycoloquy 11 (082) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?11.082 Storfer, M. D. (2000) Myopia, Intelligence, and the Expanding Human Neocortex Psycoloquy 11 (083) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?11.083 Tenopir, C. & King, D. W. (2000) Towards Electronic Journals: Realities for Scientists, Librarians, and Publishers Psycoloquy 11 (084) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?11.084 Sheets-Johnston, M. 
(2000) The Primacy of Movement Psycoloquy 11 (098) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?11.098 From Machine.Learning at wkap.com Fri Oct 19 12:50:13 2001 From: Machine.Learning at wkap.com (Journal of Machine Learning) Date: Fri, 19 Oct 2001 12:50:13 -0400 Subject: Kluwer on Machine Learning Message-ID: In response to the public resignation and subsequent e-mail campaign of some MACHINE LEARNING (MLJ) editorial board members, please allow these facts to be clear. Kluwer Academic Publishers will continue its full support of the artificial intelligence community and MACHINE LEARNING. MLJ has been the premier publication venue for machine learning for 15 years. Its prestige is unequalled; even with a rejection rate of over 55% it publishes 12 full issues (over 1300 pages) per year. Its Science Citation Index ranking rises each year, and is consistently in the top 12 in all of Computer Science and Artificial Intelligence. MACHINE LEARNING is featured in over 30 abstracting and indexing services, including all of the relevant ISI indexes and those in fields as far-ranging as physics, psychology, and neuroscience. Print subscriptions continue to grow. Its electronic version reaches millions of users worldwide. Kluwer's commitment to the MACHINE LEARNING community includes: * Posting of accepted articles for free on the journal's web site immediately upon acceptance * Encouraging authors to post their papers on their own web site, prior to and after publication * Serving our editors, authors, and reviewers with an electronic reviewing system * Providing services from promotion to copyediting to distribution that ensure their work will reach the community, and reach it with a professional presentation * Representing the journal at conferences across all of computer science, exposing it to communities outside of machine learning * Maintaining the current individual subscription price of $120 * Including in the 2002 volumes over 15% more content with a less than 5% price increase to libraries Kluwer Academic Publishers has made, and continues to make, a very large investment in MLJ. Additionally, journals such as MACHINE LEARNING have paved the way for countless other new journals across all disciplines; its revenue provides a critical component for funding new projects that might not otherwise have been started. In fact, Kluwer publishes over 25 research journals in Artificial Intelligence. We are proud to sponsor targeted journals such as ARTIFICIAL INTELLIGENCE AND LAW and ANNALS OF MATHEMATICS AND ARTIFICIAL INTELLIGENCE as a service to the AI community. Publisher revenues also provide the very taxes that universities and other non-profit entities depend on to fund research. Kluwer Academic Publishers will continue our commitment to providing a healthy forum for many journals spanning all areas of academic research. The e-mail contact for MACHINE LEARNING is machine.learning at wkap.com. From charlotte.manly at louisville.edu Fri Oct 19 12:06:33 2001 From: charlotte.manly at louisville.edu (Charlotte Manly) Date: Fri, 19 Oct 2001 12:06:33 -0400 Subject: position in cognitive or computational neuroscience Message-ID: Dear Connectionists, Please circulate to interested parties. --------------- The Department of Psychological and Brain Sciences at the University of Louisville invites applications for a tenure-track position as Assistant Professor in cognitive neuroscience or computational neuroscience.
The specific research area can come from a broad domain, including theoretical and applied areas of cognition, development, and vision. Applicants must show promise of building an outstanding record of externally funded research and publication. Postdoctoral experience is desirable. The position will begin August 1, 2002. Salary and start-up package are highly competitive. Applicants should have a curriculum vitae, description of research and teaching interests and experience, reprints and/or preprints, and three letters of recommendation forwarded to: John R. Pani, Ph.D., Chair, Experimental Search Committee, Department of Psychological and Brain Sciences, University of Louisville, Louisville, KY 40292. Review of applications will begin January 10, 2002 and will continue until the position is filled. Women and minorities are encouraged to apply. The University of Louisville is an Affirmative Action, Equal Opportunity Employer. The Department of Psychological and Brain Sciences (http://www.louisville.edu/a-s/psychology/) has been targeted for further enhancement by the University's Challenge for Excellence. Louisville is a dynamic city that is ranked highly for its quality of life, community and state commitment to education, and support for the arts. -- ====================================================== Charlotte F. Manly, Ph.D. | Psychological & Brain Sciences Assistant Professor | 317 Life Sciences Bldg ph: (502) 852-8162 | University of Louisville fax: (502) 852-8904 | Louisville, KY 40292 charlotte.manly at louisville.edu http://www.louisville.edu/a-s/psychology/ http://www.louisville.edu/~cfmanl01 From rothschild at cs.haifa.ac.il Sun Oct 21 04:10:32 2001 From: rothschild at cs.haifa.ac.il (Rothschild Institute) Date: Sun, 21 Oct 2001 10:10:32 +0200 (IST) Subject: Call-for-Papers: Haifa Winter Workshop on Computer Science and Statistics (Dec. 17-20, 2001) Message-ID: (Our apologies in advance for multiple copies due to posting on multiple mailing lists.) CALL FOR PAPERS Haifa Winter Workshop on Computer Science and Statistics 17-20 December 2001 The Caesarea Edmond Benjamin de Rothschild Foundation Institute for Interdisciplinary Applications of Computer Science at the University of Haifa, together with the Department of Statistics and the Department of Computer Science, is organizing an international workshop Dec. 17-20, 2001 on Computer Science and Statistics -- with an emphasis on Knowledge Discovery and other AI related topics. Additional funding is provided by co-sponsors, the Ministry of Science and the US National Science Foundation. Purpose: The purpose of the workshop is to bring together experts from the fields of computer science and statistics and to explore potential areas of research in order to stimulate collaborative work. Particular areas of interest are (preliminary list): * Bayesian learning * Data mining * Simulation-based computation * Expert systems * Automated learning * Robotics Call-for-Papers: Contributed papers and posters are solicited for presentation. Submissions (extended abstract or full paper) should be sent electronically to libi at cs.haifa.ac.il no later than Nov. 15, 2001. Accepted abstracts will be posted on the workshop website http://www.rothschild.haifa.ac.il/csstat (Late submissions will be considered on a space-available basis.) Dates: 17-20 December 2001 Venue: University of Haifa Organizers: Martin C. Golumbic, Udi E.
Makov, Yoel Haitovski, Ya'acov Ritov Tentative list of Invited speakers: Matt Beal (Gatsby) Steve Fienberg (Carnegie Mellon) Nir Friedman (Hebrew Univ.) Dan Geiger (Technion) Michael Kearns (Syntekcapital) Yishay Mansour (Tel Aviv) David Madigan (Rutgers) Thomas Richardson (Washington) Dan Roth (Urbana) Steve Skiena (Stony Brook) Yehuda Vardi (Rutgers) Volodya Vovk (London) Registration: There will be no registration fee, but participants are asked to register in advance using the form on the website http://www.rothschild.haifa.ac.il/csstat Hotel subsidies for advanced graduate students will be available upon the recommendation of their thesis advisor. For further information please contact libi at cs.haifa.ac.il From mieko at atr.co.jp Mon Oct 22 00:56:17 2001 From: mieko at atr.co.jp (Mieko Namba) Date: Mon, 22 Oct 2001 13:56:17 +0900 Subject: Neural Networks 14(9) Message-ID: NEURAL NETWORKS 14(9) Contents - Volume 14, Number 9 - 2001 ------------------------------------------------------------------ NEURAL NETWORKS LETTER: Neuronal integration mechanisms have little effect on spike auto-correlations of cortical neurons Yutaka Sakai Best estimated inverse versus inverse of the best estimator Amir Karniel, Ron Meir and Gideon F. Inbar INVITED ARTICLE How to be a gray box: dynamic semi-physical modeling Yacine Oussar and Gérard Dreyfus CONTRIBUTED ARTICLES: ***** Mathematical and Computational Analysis ***** An Infomax-based learning rule that generates cells similar to visual cortical neurons K. Okajima On the stability analysis of delayed neural networks systems Chunhua Feng and Rejean Plamondon A two-level Hamming network for high performance associative memory Nobuhiko Ikeda, Paul Watta, Metin Artiklar and Mohamad H. Hassoun A closed-form neural network for discriminatory feature extraction from high-dimensional data Ashit Talukder and David Casasent The enhanced LBG algorithm Giuseppe Patane and Marco Russo ***** Engineering & Design ***** Reconstruction of chaotic dynamics by on-line EM algorithm S. Ishii and M.-A. Sato Novelty detection using products of simple experts - a potential architecture for embedded systems Alan F. Murray A new algorithm to design compact two-hidden-layer artificial neural networks Md. Monirul Islam and K. Murase Cross-validation in Fuzzy ARTMAP for large databases Anna Koufakou, Michael Georgiopoulos, George Anagnostopoulos and Takis Kasparis ***** Technology and Applications ***** Fingerprints classification using artificial neural networks: a combined structural and statistical approach Khaled Ahmed Nagaty Bi-directional computing architecture for time series prediction Hiroshi Wakuya and Jacek M. Zurada ***** Book Review ***** Book review: Advances in Synaptic Plasticity: A Compact Account of the New, the Important, and the Interesting Murat Okatan ------------------------------------------------------------------ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription.
Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment.
----------------------------------------------------------------------------
Membership with Neural Networks:
  INNS: $80 (regular), $20 (student)
  ENNS: SEK 660 (regular), SEK 460 (student)
  JNNS: Y 13,000 (regular) or Y 11,000 (student), plus Y 2,000 enrollment fee
----------------------------------------------------------------------------
Membership without Neural Networks:
  INNS: $30
  ENNS: SEK 200
  JNNS: not available to non-students (subscribe through another society); Y 5,000 (student), plus Y 2,000 enrollment fee
----------------------------------------------------------------------------
Name: _____________________________________ Title: _____________________________________ Address: _____________________________________ _____________________________________ _____________________________________ Phone: _____________________________________ Fax: _____________________________________ Email: _____________________________________ Payment: [ ] Check or money order enclosed, payable to INNS or ENNS OR [ ] Charge my VISA or MasterCard card number ____________________________ expiration date ________________________ INNS Membership 19 Mantua Road Mount Royal NJ 08061 USA 856 423 0162 (phone) 856 423 3420 (fax) innshq at talley.com http://www.inns.org ENNS Membership University of Skovde P.O. Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.his.se/ida/enns JNNS Membership c/o Professor Takashi Nagano Faculty of Engineering Hosei University 3-7-2, Kajinocho, Koganei-shi Tokyo 184-8584 Japan 81 42 387 6350 (phone and fax) jnns at k.hosei.ac.jp http://jnns.inf.eng.tamagawa.ac.jp/home-j.html ----------------------------------------------------------------- -- ========================================================= Mieko Namba Secretary to Dr.
Mitsuo Kawato Editorial Administrator of NEURAL NETWORKS ATR-I, Human Information Science Laboratories Department 3 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan TEL +81-774-95-1058 FAX +81-774-95-2647 E-MAIL mieko at atr.co.jp ========================================================= MY EMAIL ADDRESS HAS BEEN CHANGED FROM OCT.1, 2001 ========================================================= From bbs at bbsonline.org Mon Oct 22 14:41:14 2001 From: bbs at bbsonline.org (Stevan Harnad - Behavioral & Brain Sciences (Editor)) Date: Mon, 22 Oct 2001 14:41:14 -0400 Subject: Rachlin: ALTRUISM AND SELFISHNESS: BBS Call for Commentators Message-ID: Dear Dr. Connectionists List User, Below is the abstract of a forthcoming BBS target article ALTRUISM AND SELFISHNESS by Howard Rachlin http://www.bbsonline.org/Preprints/Rachlin/Referees/ This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please reply by EMAIL within three (3) weeks to: calls at bbsonline.org The Calls are sent to 10,000 BBS Associates, so there is no expectation (indeed, it would be calamitous) that each recipient should comment on every occasion! Hence there is no need to reply except if you wish to comment, or to nominate someone to comment. If you are not a BBS Associate, please approach a current BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work to nominate you. All past BBS authors, referees and commentators are eligible to become BBS Associates. A full electronic list of current BBS Associates is available at this location to help you select a name: http://www.bbsonline.org/Instructions/assoclist.html If no current BBS Associate knows your work, please send us your Curriculum Vitae and BBS will circulate it to appropriate Associates to ask whether they would be prepared to nominate you. (In the meantime, your name, address and email address will be entered into our database as an unaffiliated investigator.) To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the online BBSPrints Archive, at the URL that follows the abstract below. _____________________________________________________________ ALTRUISM AND SELFISHNESS Howard Rachlin Psychology Department State University of New York Stony Brook, New York, 11794-2500 KEYWORDS: addiction, altruism, commitment, cooperation, defection, egoism, impulsiveness, patterning, prisoners dilemma, reciprocation, reinforcement, selfishness, self-control ABSTRACT: Many situations in human life present choices between (a) narrowly preferred particular alternatives and (b) narrowly less preferred (or aversive) particular alternatives that nevertheless form part of highly preferred abstract behavioral patterns. Such alternatives characterize problems of self-control. 
For example, at any given moment, a person may accept alcoholic drinks yet also prefer being sober to being drunk over the next few days. Other situations present choices between (a) alternatives beneficial to an individual and (b) alternatives that are less beneficial (or harmful) to the individual that would nevertheless be beneficial if chosen by many individuals. Such alternatives characterize problems of social cooperation; choices of the latter alternative are generally considered to be altruistic. Altruism, like self-control, is a valuable temporally-extended pattern of behavior. Like self-control, altruism may be learned and maintained over an individual's lifetime. It needs no special inherited mechanism. Individual acts of altruism, each of which may be of no benefit (or of possible harm) to the actor, may nevertheless be beneficial when repeated over time. However, because each selfish decision is individually preferred to each altruistic decision, people can benefit from altruistic behavior only when they are committed to an altruistic pattern of acts and refuse to make decisions on a case-by-case basis. http://www.bbsonline.org/Preprints/Rachlin/Referees/ ___________________________________________________________ Please do not prepare a commentary yet. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. We will then let you know whether it was possible to include your name on the final formal list of invitees. _______________________________________________________________________ *** SUPPLEMENTARY ANNOUNCEMENTS *** (1) The authors of scientific articles are not paid money for their refereed research papers; they give them away. What they want is to reach all interested researchers worldwide, so as to maximize the potential research impact of their findings. Subscription/Site-License/Pay-Per-View costs are accordingly access-barriers, and hence impact-barriers for this give-away research literature. There is now a way to free the entire refereed journal literature, for everyone, everywhere, immediately, by mounting interoperable university eprint archives, and self-archiving all refereed research papers in them. Please see: http://www.eprints.org http://www.openarchives.org/ http://www.dlib.org/dlib/december99/12harnad.html --------------------------------------------------------------------- (2) All authors in the biobehavioral and cognitive sciences are strongly encouraged to self-archive all their papers in their own institution's Eprint Archives or in CogPrints, the Eprint Archive for the biobehavioral and cognitive sciences: http://cogprints.soton.ac.uk/ It is extremely simple to self-archive and will make all of our papers available to all of us everywhere, at no cost to anyone, forever. Authors of BBS papers wishing to archive their already published BBS Target Articles should submit them to the BBSPrints Archive. Information about the archiving of BBS' entire backcatalogue will be sent to you in the near future. Meantime please see: http://www.bbsonline.org/help/ and http://www.bbsonline.org/Instructions/ --------------------------------------------------------------------- (3) Call for Book Nominations for BBS Multiple Book Review In the past, Behavioral and Brain Sciences (BBS) had only been able to do 1-2 BBS multiple book treatments per year, because of our limited annual page quota.
BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates and biobehavioral/cognitive scientists in general to nominate books you would like to see accorded BBS multiple book review. (Authors may self-nominate, but books can only be selected on the basis of multiple nominations.) It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!). *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-* Please note: Your email address has been added to our user database for Calls for Commentators, which is the reason you received this email. If you do not wish to receive further Calls, please feel free to change your mailshot status through your User Login link on the BBSPrints homepage, using your username and password above: http://www.bbsonline.org/ For information about the mailshot, please see the help file at: http://www.bbsonline.org/help/node5.html#mailshot *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-* From malchiod at laren.usr.dsi.unimi.it Mon Oct 22 09:48:08 2001 From: malchiod at laren.usr.dsi.unimi.it (Dario Malchiodi) Date: Mon, 22 Oct 2001 15:48:08 +0200 (CEST) Subject: Course at the International School on Neural Nets "E. R. Caianiello" Message-ID: Enclosed you will find the programme of the course FROM SYNAPSES TO RULES: DISCOVERING SYMBOLIC RULES FROM NEURAL PROCESSED DATA. Dario Malchiodi malchiodi at dsi.unimi.it ----------------------------------------------------- GALILEO GALILEI FOUNDATION WORLD FEDERATION OF SCIENTISTS ETTORE MAJORANA CENTRE FOR SCIENTIFIC CULTURE GALILEO GALILEI CELEBRATIONS Four Centuries Since the Birth of MODERN SCIENCE INTERNATIONAL SCHOOL ON NEURAL NETS "E. R. CAIANIELLO" 5th Course: FROM SYNAPSES TO RULES: DISCOVERING SYMBOLIC RULES FROM NEURAL PROCESSED DATA ERICE-SICILY: 25 FEBRUARY - 7 MARCH 2002 Sponsored by the: International Institute for Advanced Scientific Studies (IIASS) Italian Ministry of Education, University, Scientific Research and Technology Sicilian Regional Government Italian Society for Neural Networks (SIREN) University of Salerno University of Milan PROGRAMME AND LECTURERS Inferential bases for learning Theoretical foundations for soft computing Integration of symbolic-subsymbolic reasoning methods Physics and metaphysics of learning Toward applications * B. Apolloni, University of Milan, I * D. Malchiodi, University of Milan, I * D. Mundici, University of Milan, I * M. Gori, University of Siena, I * F. Kurfess, California Polytechnic State Univ., San Luis Obispo, CA, USA * A. Roy, Arizona State University, Tempe, AZ, USA * R. Sun, University of Missouri-Columbia, MO, USA * L. Agnati, Karolinska Institutet, Stockholm, S * G. Basti, Pontificia Università Lateranense, Rome, I * G. Biella, C.N.R. LITA, Milan, I * J. G. Taylor, King's College, London, UK * A. Esposito, Istituto Italiano Alti Studi Scientifici, Vietri, I * A. Moise, Boise State University, ID, USA PURPOSE OF THE COURSE The school aims at fixing a theoretical and applied framework for extracting formal rules from data.
To this end the modern approaches will be expounded that collapse the two typical goals of the conventional AI and connectionism - respectively, deducing within an axiomatic shell formal rules about a phenomenon and inferring the actual behavior of it from examples - into a challenging inferential framework where we learn from data and understand what we have learnt. The target reads as a translation of the subsymbolic structure of the data - stored in the synapses of a neural network - into formal properties described by rules. To capture this trip from synapses to rules and then render it manageable for affording real world learning tasks, the Course will deal in depth with the following aspects: i. theoretical foundations of learning algorithms and soft computing, ii. intimate relationships between symbolic and subsymbolic reasoning methods, iii. integration of the related hosting architectures in both physiological and artificial brains. APPLICATIONS Interested candidates should send a letter to: * Professor Bruno Apolloni - Dipartimento di Scienze dell'Informazione Università degli Studi di Milano Via Comelico 39/41 20135 Milano, Italy Tel: ++39.02.5835.6284, Fax: ++39.02.5835.6228 e-mail: apolloni at dsi.unimi.it specifying: i) date and place of birth and present activity; ii) nationality. Thanks to the generosity of the sponsoring Institutions, partial support can be granted to some deserving students who need financial aid. Requests to this effect must be specified and justified in the letter of application. Notification of acceptance will be sent by the end of January 2002. * PLEASE NOTE Participants must arrive in Erice on 25 February, not later than 5 p.m. Closing date for applications: December 15, 2001 No special application form is required POETIC TOUCH According to legend, Erice, son of Venus and Neptune, founded a small town on top of a mountain (750 meters above sea level) more than three thousand years ago. The founder of modern history - i.e. the recording of events in a methodic and chronological sequence as they really happened without reference to mythical causes - the great Thucydides (~500 B.C.), writing about events connected with the conquest of Troy (1183 B.C.), says: "After the fall of Troy some Trojans on their escape from the Achaei arrived in Sicily on boats and as they settled near the border with the Sicanians all together they were named Elymi: their towns were Segesta and Erice". This inspired Virgil to describe the arrival of the Trojan royal family in Erice and the burial of Anchise, by his son Enea, on the coast below Erice. Homer (~1000 B.C.), Theocritus (~300 B.C.), Polybius (~200 B.C.), Virgil (~50 B.C.), Horace (~20 B.C.), and others have celebrated this magnificent spot in Sicily in their poems. During seven centuries (XIII-XIX) the town of Erice was under the leadership of a local oligarchy, whose wisdom assured a long period of cultural development and economic prosperity which in turn gave rise to the many churches, monasteries and private palaces which you see today. In Erice you can admire the Castle of Venus, the Cyclopean Walls (~800 B.C.) and the Gothic Cathedral (~1300 A.D.). Erice is at present a mixture of ancient and medieval architecture. Other masterpieces of ancient civilization are to be found in the neighbourhood: at Motya (Phoenician), Segesta (Elymian), and Selinunte (Greek). On the Aegadian Islands - theatre of the decisive naval battle of the first Punic War (264-241 B.C.)
- suggestive neolithic and paleolithic vestiges are still visible: the grottoes of Favignana, the carvings and murals of Levanzo. Splendid beaches are to be found at San Vito Lo Capo, Scopello, and Cornino, and a wild and rocky coast around Monte Cofano: all at less than one hour's drive from Erice. More information about this Course and the other activities of the Ettore Majorana Centre can be found on the WWW at the following address: http://www.ccsem.infn.it B. APOLLONI, A. MOISE DIRECTORS OF THE COURSE M. J. JORDAN, M. MARINARO DIRECTORS OF THE SCHOOL A. ZICHICHI DIRECTOR OF THE CENTRE From moeller at mpipf-muenchen.mpg.de Tue Oct 23 07:02:57 2001 From: moeller at mpipf-muenchen.mpg.de (Ralf Moeller) Date: Tue, 23 Oct 2001 13:02:57 +0200 Subject: postdoctoral position "AMOUSE" project Message-ID: <3BD54E61.6051CC3E@mpipf-muenchen.mpg.de> The Max Planck Institute for Psychological Research in Munich, Germany, invites applications for a Postdoctoral position (4-year appointment, salary BAT IIa/Ib, approx. DEM 71k p.a.) in the research group "Cognitive Robotics", starting at the earliest convenience. The position is funded by the European Community in the project "Artificial Mouse" as part of the initiative "Neuroinformatics for Living Artefacts". The project aims at an improved understanding of the somatosensory (whisker) system in rodents, from transduction to visuo-tactile and sensorimotor integration. As part of the modeling process, an artificial whisker system will be developed and tested on a mobile robot. Candidates should have a background in signal processing, electronics, mechanics, and computer science, as well as an interest in interdisciplinary research in the fields of neuroscience and cognitive science. Experience with image processing, neural networks, and robotics is beneficial. The Max Planck Society seeks to increase the number of female scientists and encourages them to apply. Handicapped persons with comparable qualifications receive preferential status. Please submit a CV, complete academic records, and the name and email address of two academic references to: Administration Max Planck Institute for Psychological Research Amalienstr. 33 D-80799 Munich Germany web site: http://www.mpipf-muenchen.mpg.de For further information, please contact: Dr. Ralf Moeller email: moeller at mpipf-muenchen.mpg.de From plaut at cmu.edu Tue Oct 23 14:39:50 2001 From: plaut at cmu.edu (David Plaut) Date: Tue, 23 Oct 2001 14:39:50 -0400 Subject: Post-Doctoral Positions in Connectionist Modeling of Reading and Language Message-ID: <23079.1003862390@pewee.cnbc.cmu.edu> Post-Doctoral Positions in Connectionist Modeling of Reading and Language Center for the Neural Basis of Cognition and the Department of Psychology, Carnegie Mellon University Two postdoctoral research positions are available in the area of connectionist/neural-network modeling of normal and impaired cognitive processes in reading and language. Topics of particular interest include phonological and lexical development, reading acquisition and developmental and acquired dyslexia, cross-linguistic differences in morphological processing, and neuropsychological impairments of lexical semantic knowledge. Applicants should have expertise in connectionist modeling or in empirical investigation of language-related processes combined with some experience in modeling. 
The positions are for 2-3 years, with salary commensurate with experience, and are affiliated with the Center for the Neural Basis of Cognition (http://www.cnbc.cmu.edu) and the Department of Psychology (http://www.psy.cmu.edu) at Carnegie Mellon. Please send CV, a description of research experience and interests, copies of representative publications, and three letters of reference by January 1, 2002 to Dr. David Plaut, Center for the Neural Basis of Cognition, Mellon Institute 115, 4400 Fifth Avenue, Pittsburgh PA 15213-2683, USA. Carnegie Mellon is an AA/EEO employer. From cindy at cns.bu.edu Wed Oct 24 13:59:46 2001 From: cindy at cns.bu.edu (Cynthia Bradford) Date: Wed, 24 Oct 2001 13:59:46 -0400 Subject: 6th ICCNS: Call for Abstracts Message-ID: <200110241759.NAA10020@retina.bu.edu> Apologies if you receive this more than once. ***** CALL FOR ABSTRACTS ***** SIXTH INTERNATIONAL CONFERENCE ON COGNITIVE AND NEURAL SYSTEMS Tutorials: May 29, 2002 Meeting: May 30 - June 1, 2002 Boston University 677 Beacon Street Boston, Massachusetts 02215 http://www.cns.bu.edu/meetings/ Sponsored by Boston University's Center for Adaptive Systems and Department of Cognitive and Neural Systems with financial support from the Office of Naval Research This interdisciplinary conference has drawn about 300 people from around the world each time that it has been offered. Last year's conference was attended by scientists from 31 countries. The conference is structured to facilitate intense communication between its participants, both in the formal sessions and during its other activities. As during previous years, the conference will focus on solutions to the fundamental questions: How Does the Brain Control Behavior? How Can Technology Emulate Biological Intelligence? The conference will include invited tutorials and lectures, and contributed lectures and posters by experts on the biology and technology of how the brain and other intelligent systems adapt to a changing world. The conference is aimed at researchers and students of computational neuroscience, connectionist cognitive science, artificial neural networks, neuromorphic engineering, and artificial intelligence. A single oral or poster session enables all presented work to be highly visible. Abstract submissions encourage submissions of the latest results. Costs are kept at a minimum without compromising the quality of meeting handouts and social events. Confirmed invited speakers include: Dana Ballard Jeff Bowers Daniel Bullock Edward M. Callaway Gail Carpenter Bart Ermentrout David Field Mark Gluck Stephen Grossberg Frank Guenther Daniel Johnston Philip J. Kellman Stephen G. Lisberger James McClelland Ferdinando Mussa-Ivaldi Lynn Nadel Erkki Oja Randall O'Reilly Michael Page John Rinzel Edmund Rolls Daniel Schacter Wolfram Schultz Rudiger von der Heydt CALL FOR ABSTRACTS Session Topics: * vision * spatial mapping and navigation * object recognition * neural circuit models * image understanding * neural system models * audition * mathematics of neural systems * speech and language * robotics * unsupervised learning * hybrid systems (fuzzy, evolutionary, digital) * supervised learning * neuromorphic VLSI * reinforcement and emotion * industrial applications * sensory-motor control * cognition, planning, and attention * other Contributed abstracts must be received, in English, by January 31, 2002. Notification of acceptance will be provided by email by February 28, 2002. A meeting registration fee must accompany each Abstract. 
See Registration Information below for details. The fee will be returned if the Abstract is not accepted for presentation and publication in the meeting proceedings. Registration fees of accepted Abstracts will be returned on request only until April 19, 2002. Each Abstract should fit on one 8.5" x 11" white page with 1" margins on all sides, single-column format, single-spaced, Times Roman or similar font of 10 points or larger, printed on one side of the page only. Fax submissions will not be accepted. Abstract title, author name(s), affiliation(s), mailing, and email address(es) should begin each Abstract. An accompanying cover letter should include: Full title of Abstract; corresponding author and presenting author name, address, telephone, fax, and email address; requested preference for oral or poster presentation; and a first and second choice from the topics above, including whether it is biological (B) or technological (T) work. Example: first choice: vision (T); second choice: neural system models (B). (Talks will be 15 minutes long. Posters will be up for a full day. Overhead, slide, VCR, and LCD projector facilities will be available for talks.) Abstracts which do not meet these requirements or which are submitted with insufficient funds will be returned. Accepted Abstracts will be printed in the conference proceedings volume. No longer paper will be required. The original and 3 copies of each Abstract should be sent to: Cynthia Bradford, Boston University, Department of Cognitive and Neural Systems, 677 Beacon Street, Boston, MA 02215. REGISTRATION INFORMATION: Early registration is recommended. To register, please fill out the registration form below. Student registrations must be accompanied by a letter of verification from a department chairperson or faculty/research advisor. If accompanied by an Abstract or if paying by check, mail to the address above. If paying by credit card, mail as above, or fax to (617) 353-7755, or email to cindy at cns.bu.edu. The registration fee will help to pay for a reception, 6 coffee breaks, and the meeting proceedings. STUDENT FELLOWSHIPS: Fellowships for PhD candidates and postdoctoral fellows are available to help cover meeting travel and living costs. The deadline to apply for fellowship support is January 31, 2002. Applicants will be notified by email by February 28, 2002. Each application should include the applicant's CV, including name; mailing address; email address; current student status; faculty or PhD research advisor's name, address, and email address; relevant courses and other educational data; and a list of research articles. A letter from the listed faculty or PhD advisor on official institutional stationery should accompany the application and summarize how the candidate may benefit from the meeting. Fellowship applicants who also submit an Abstract need to include the registration fee with their Abstract submission. Those who are awarded fellowships are required to register for and attend both the conference and the day of tutorials. Fellowship checks will be distributed after the meeting. 
REGISTRATION FORM Sixth International Conference on Cognitive and Neural Systems Department of Cognitive and Neural Systems Boston University 677 Beacon Street Boston, Massachusetts 02215 Tutorials: May 29, 2002 Meeting: May 30 - June 1, 2002 FAX: (617) 353-7755 http://www.cns.bu.edu/meetings/ (Please Type or Print) Mr/Ms/Dr/Prof: _____________________________________________________ Name: ______________________________________________________________ Affiliation: _______________________________________________________ Address: ___________________________________________________________ City, State, Postal Code: __________________________________________ Phone and Fax: _____________________________________________________ Email: _____________________________________________________________ The conference registration fee includes the meeting program, reception, two coffee breaks each day, and meeting proceedings. The tutorial registration fee includes tutorial notes and two coffee breaks. CHECK ONE: ( ) $85 Conference plus Tutorial (Regular) ( ) $55 Conference plus Tutorial (Student) ( ) $60 Conference Only (Regular) ( ) $40 Conference Only (Student) ( ) $25 Tutorial Only (Regular) ( ) $15 Tutorial Only (Student) METHOD OF PAYMENT (please fax or mail): [ ] Enclosed is a check made payable to "Boston University". Checks must be made payable in US dollars and issued by a US correspondent bank. Each registrant is responsible for any and all bank charges. [ ] I wish to pay my fees by credit card (MasterCard, Visa, or Discover Card only). Name as it appears on the card: _____________________________________ Type of card: _______________________________________________________ Account number: _____________________________________________________ Expiration date: ____________________________________________________ Signature: __________________________________________________________ From robbie at bcs.rochester.edu Wed Oct 24 11:24:42 2001 From: robbie at bcs.rochester.edu (Robbie Jacobs) Date: Wed, 24 Oct 2001 11:24:42 -0400 (EDT) Subject: postdoc position available Message-ID: Below is an ad for a postdoctoral position in my laboratory. Although the readers of this list are primarily computational, you (or someone you know) may be interested in gaining expertise in visual psychophysics and virtual reality. Robert Jacobs Brain and Cognitive Sciences Center for Visual Science University of Rochester ============================================== POSTDOCTORAL FELLOW POSITION IN VISUAL PSYCHOPHYSICS PI: Robert Jacobs Center for Visual Science Department of Brain and Cognitive Sciences University of Rochester A postdoctoral position is available immediately in the lab of Robert Jacobs, Center for Visual Science and the Department of Brain and Cognitive Sciences, University of Rochester. The lab focuses on experimental and computational studies of visual learning with respect to mid-level and high-level visual functions, particularly on experience-dependent perception of visual depth. Some projects in our lab study observers' abilities to recalibrate their interpretations of individual visual cues, other projects study how observers adapt their visual cue combination strategies, and still other projects examine how information from other perceptual modalities (such as haptic or auditory percepts) influence how observers interpret and combine information from visual cues. 
We have a well-equipped lab that includes access to a wide variety of virtual reality equipment for creating visual, auditory, and haptic environments. You can learn more about our lab (and obtain several of our papers) from: http://www.bcs.rochester.edu/bcs/people/faculty/robbie/robbie.html You can learn more about the Department of Brain and Cognitive Sciences from: http://www.bcs.rochester.edu You can learn more about the Center for Visual Science from: http://www.cvs.rochester.edu Interested candidates should send a vita, a research statement, recent publications, and the names of three individuals who can write letters of recommendation to: Robert Jacobs Brain and Cognitive Sciences Meliora Hall, River Campus University of Rochester Rochester, NY 14627-0268 robbie at bcs.rochester.edu From kivinen at axiom.anu.edu.au Thu Oct 25 02:07:51 2001 From: kivinen at axiom.anu.edu.au (Jyrki Kivinen) Date: Thu, 25 Oct 2001 16:07:51 +1000 (EST) Subject: Call for papers: COLT 2002 Message-ID: Call for Papers: Fifteenth Annual Conference on Computational Learning Theory The Fifteenth Annual Conference on Computational Learning Theory (COLT 2002) will be held during the week July 8-12, 2002 in Sydney, Australia. The conference will be co-located with ICML-2002. We invite submission of papers about the theory of machine learning. Possible topics include: * analysis of learning algorithms for specific classes of hypotheses, including established classes (e.g. neural networks, graphical models, decision trees, logical formulae, automata, pattern languages, grammars) and new classes; * bounds on the generalization ability of learning algorithms; * learning algorithms based on large margin hypotheses (SVM, boosting); * worst-case relative loss bounds for sequential prediction algorithms; * analysis of adaptive algorithms for decision, planning and control; * bounds on the computational complexity of learning; * learning with queries and learning in the limit; * new learning models that either capture important details of specific applications or that address general issues in a new way. We also welcome theoretical papers about learning that do not fit into the above categories; we are particularly interested in papers that include ideas and viewpoints that are new to the COLT community. While the primary focus of the conference is theoretical, papers can be strengthened by the inclusion of relevant experimental results. Papers that have appeared in journals or other conferences, or that are being submitted to other conferences, are not appropriate for submission to COLT. Paper submissions: We will be setting up a server to receive electronic submissions. Although electronic submissions are preferred, hard-copy submissions will also be possible. Details of the submission procedure will be made available on the conference web page http://www.learningtheory.org/colt2002. Please check this page for updates on submission and conference details. If you have questions, send e-mail to the program co-chairs (Jyrki.Kivinen at faceng.anu.edu.au, rsloan at nsf.gov). Important dates: Submissions, electronic or hard-copy, must be received by 23:59 GMT on Monday, January 28, 2002. Authors will be notified of acceptance or rejection on or before Friday April 5, 2002. Final camera-ready versions must be received by Friday April 19. 
Submission format: Unlike previous COLT conferences, we are asking the authors to submit a full paper, which should be in the Springer LNAI format (see http://www.springer.de/comp/lncs/authors.html) and no longer than 15 pages. Authors not using LaTeX2e are asked to contact the program chairs well in advance of the submission deadline. The paper should include a clear definition of the theoretical model used and a clear description of the results, as well as a discussion of their significance, including comparison to other work. Proofs or proof sketches should be included. Conference chair: Arun Sharma (Univ. of New South Wales) Program co-chairs: Jyrki Kivinen (Australian National Univ.) and Bob Sloan (NSF and Univ. of Illinois, Chicago). Program committee: Dana Angluin (Yale), Javed Aslam (Dartmouth), Peter Bartlett (BIOwulf Technologies), Shai Ben-David (Technion), John Case (Univ. of Delaware), Peter Grunwald (CWI), Ralf Herbrich (Microsoft Research), Mark Herbster (University College London), Gabor Lugosi (Pompeu Fabra University), Ron Meir (Technion), Shahar Mendelson (Australian National Univ.), Michael Schmitt (Ruhr-Universitaet Bochum), Rocco Servedio (Harvard), and Santosh Vempala (MIT) Student travel: We anticipate that some funds will be available to partially support travel by student authors. Eligible authors who wish to apply for travel support should indicate this on their submission's title page. Mark Fulk Award: This award is for the best paper authored or coauthored by a student. Eligible authors who wish to be considered for this prize should indicate this on their submission's title page. From juergen at idsia.ch Thu Oct 25 06:13:21 2001 From: juergen at idsia.ch (juergen@idsia.ch) Date: Thu, 25 Oct 2001 12:13:21 +0200 Subject: Coulomb's law yields support vector machines and more Message-ID: <200110251013.MAA22785@ruebe.idsia.ch> Important recent results by Sepp Hochreiter: Using Coulomb energy as an objective function, he shows that support vector machines can be easily derived from Coulomb's law as taught in first semester courses on physics. His general electrostatic framework greatly simplifies the proofs of well-known SVM theorems, and yields solutions formally identical to well-known SVM types. In addition, it suggests novel kernels and SVMs for kernels that are not positive definite, and even subsumes other methods such as nearest neighbor classifiers, density estimators, clustering algorithms, and vector quantizers. Thus his Coulomb classifiers promise significant advances in several fields. http://www.cs.tu-berlin.de/~hochreit http://www.cs.colorado.edu/~hochreit ftp://ftp.cs.colorado.edu/users/hochreit/papers/cltr.ps.gz @techreport{Hochreiter:2001coulomb, author = {S. Hochreiter and M. C. Mozer}, title = {Coulomb Classifiers: {R}einterpreting {SVM}s as Electrostatic Systems}, institution = {University of Colorado, Boulder, Department of Computer Science}, number = {CU-CS-921-01}, year = {2001}} (BTW, this is the same person who analyzed in rigorous detail the vanishing error problem of standard recurrent nets (1991), and who recently built the first working gradient-based metalearner (ICANN 2001) using LSTM recurrent nets (Neural Comp 97) which by design do not suffer from this problem, and who also invented Flat Minimum Search (Neural Comp 97, 99), a highly competitive method for finding nets with low information-theoretic complexity and high generalization capability.) 
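A rough, purely illustrative sketch of the electrostatic picture described above (this is not Hochreiter and Mozer's actual formulation; the names coulomb_kernel, classify, train_X and train_y are invented for this example): each training point is treated as a unit point charge whose sign is its class label, and a query point is classified by the sign of the total potential it feels. With fixed unit charges this corresponds to the nearest-neighbour/density-estimation end of the family mentioned in the announcement; the Coulomb classifiers of the report additionally optimize the charge magnitudes.

import numpy as np

def coulomb_kernel(x, z, eps=1e-6):
    """Potential-style kernel that decays with distance like 1/r (softened at r = 0)."""
    return 1.0 / (np.linalg.norm(x - z) + eps)

def classify(query, train_X, train_y):
    """Sign of the total 'potential' induced at `query` by +/-1 labelled unit charges."""
    potential = sum(y * coulomb_kernel(query, x) for x, y in zip(train_X, train_y))
    return 1 if potential >= 0 else -1

# Tiny usage example with made-up 2-D data.
train_X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
train_y = np.array([-1, -1, +1, +1])
print(classify(np.array([0.2, 0.1]), train_X, train_y))   # -1: dominated by the nearby negative charges
print(classify(np.array([0.95, 0.9]), train_X, train_y))  # +1: dominated by the nearby positive charges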
------------------------------------------------- Juergen Schmidhuber director IDSIA, Galleria 2, 6928 Manno-Lugano, Switzerland juergen at idsia.ch www.idsia.ch/~juergen From cjlin at csie.ntu.edu.tw Thu Oct 25 06:19:32 2001 From: cjlin at csie.ntu.edu.tw (Chih-Jen Lin) Date: Thu, 25 Oct 2001 18:19:32 +0800 Subject: CFP: special issue on Support Vector Machine Message-ID: CALL FOR PAPERS: special issue on SVM NEUROCOMPUTING An International Journal published by Elsevier Science B.V., vol. 42-47, 24 issues, in 2002 ISSN 0925-2312, URL: http://www.elsevier.nl/locate/neucom Special Issue on Support Vector Machines Paper Submission Deadline: February 28, 2002 Further information: http://www.csie.ntu.edu.tw/~cjlin/svmcfp.html Support Vector Machines constitute an intensive area of research closely related to statistical learning theory, with numerous applications. The basic form of the corresponding learning algorithm can be shown to correspond to a linear model used in a high-dimensional space non-linearly related to input space. A key advantage is the low complexity of the operations performed by using kernels in the input space. The construction of the high-dimensional space is based on a subset of the original input data, the so-called support vectors. The corresponding quadratic programming problem in simulative environments is solved using different optimization techniques. Their strong performance in applications has increased interest in this approach, with application areas including pattern recognition, computer vision, and biomedical analysis. The Neurocomputing journal invites original contributions for the forthcoming special issue on Support Vector Machines from a broad scope of areas. Some topics relevant to this special issue include, but are not restricted to: -- Theoretical foundations, algorithms, and implementations -- Model selection and hyperparameter tuning -- Choosing kernels for special situations -- Probabilistic treatment of SVMs -- SVM methods for large scale problems -- Benchmarking SVMs against other methods -- Feature selection for SVMs -- Key applications including, but not restricted to, data mining, bioinformatics, text categorization, machine vision, etc. Please send two hardcopies of the manuscript before February 28, 2002 to: V. David Sanchez A., Neurocomputing - Editor in Chief - Advanced Computational Intelligent Systems P.O. Box 60130, Pasadena, CA 91116-6130, U.S.A. Street address: 1149 Wotkyns Drive Pasadena, CA 91103, U.S.A. Fax: +1-626-793-5120 Email: vdavidsanchez at earthlink.net including abstract, keywords, a cover page containing the title and author names, corresponding author name's complete address including telephone, fax, and email address, and a clear indication that it is a submission to the Special Issue on Support Vector Machines. Guest Editors Colin Campbell Department of Engineering Mathematics Bristol University, Bristol BS8 1TR United Kingdom Phone: (+44) (0) 117 928 9858 Fax: (+44) (0)117-925-1154 Email: C.Campbell at bristol.ac.uk Chih-Jen Lin Department of Computer Science and Information Engineering National Taiwan University Taipei, Taiwan, 106 Phone: (+886) 2-2362-5336 x 413 Fax: (+886) 2-2362-8167 Email: cjlin at csie.ntu.edu.tw S. Sathiya Keerthi Department of Mechanical Engineering National University of Singapore 10 KentRidge Crescent Singapore 119260 Republic of Singapore Phone: (+65) 874-4684 Fax: (+65) 779-1459 Email: mpessk at guppy.mpe.nus.edu.sg V. David Sanchez A., Neurocomputing - Editor in Chief - Advanced Computational Intelligent Systems P.O. Box 60130 Pasadena, CA 91116-6130 U.S.A. Fax: +1-626-793-5120 Email: vdavidsanchez at earthlink.net
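A minimal, purely illustrative sketch of the kernel idea described in the call above (the names poly_kernel and phi are invented for this example): a degree-2 polynomial kernel evaluated directly in input space returns the same number as an ordinary dot product in an explicitly constructed higher-dimensional feature space, which is why that space never has to be built explicitly. A support vector expansion then only ever needs kernel evaluations between the query and the support vectors, i.e. the subset of the training data the call refers to.

import numpy as np

def poly_kernel(x, z):
    """Degree-2 polynomial kernel evaluated directly in input space."""
    return float(np.dot(x, z)) ** 2

def phi(x):
    """Explicit feature map for the same kernel (2-D input -> 3-D feature space)."""
    return np.array([x[0] ** 2, np.sqrt(2.0) * x[0] * x[1], x[1] ** 2])

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])

# Both numbers agree: the kernel computes the feature-space inner product
# without ever constructing phi explicitly.
print(poly_kernel(x, z))          # (1*3 + 2*0.5)^2 = 16.0
print(np.dot(phi(x), phi(z)))     # same value, up to floating-point rounding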
From erik at bbf.uia.ac.be Fri Oct 26 14:13:15 2001 From: erik at bbf.uia.ac.be (Erik De Schutter) Date: Fri, 26 Oct 2001 20:13:15 +0200 Subject: European short-term fellowships in neuroinformatics and computational neuroscience Message-ID: The EU Thematic Network Computational Neuroscience and Neuroinformatics offers short-term fellowships to nationals of the EU and associated countries for visits to laboratories in the EU or associated countries. Fellowships cover travel and accommodation costs for a visit of 2 to 12 weeks. Applications are evaluated once every month on a competitive basis. Fellowships can be awarded within three months of the application. The award per fellowship varies between 600 and 2000 euro. More information and electronic application are available at http://www.neuroinf.org Prof. E. De Schutter University of Antwerp, Belgium coordinator of the Thematic Network Computational Neuroscience and Neuroinformatics http://www.bbf.uia.ac.be From Roberto.Prevete at na.infn.it Mon Oct 29 11:56:51 2001 From: Roberto.Prevete at na.infn.it (Roberto Prevete) Date: Mon, 29 Oct 2001 11:56:51 -0500 Subject: book announcement Message-ID: <3BDD8A53.FFE58765@na.infn.it> Book announcement: AN ALGEBRAIC APPROACH TO THE AUTONOMOUSLY SELF-ADAPTABLE BOOLEAN NEURAL NETS 299p F.E.LAURIA, R.PREVETE Liguori Editore, 2001. ISBN 99-207-3266-1 Available free on request: lauria at na.infn.it For more details visit our homepage: http://www.na.infn.it/Gener/cyber/report.html in the folder NeuralNetworks/News Here is a brief description: We present an algebraic approach to the autonomously self-adaptable boolean neural nets. Starting from Caianiello's nets we introduce a Boolean Neural Net (BNN) as a control structure and we set the data structure embedded in a BNN. Introducing the Hebbian rule we set some sufficient conditions in order to obtain an Adaptable Boolean Neural Network (ABNN) as a control structure. Starting from these architectures we present a multitasking architecture (GABNN), proving its training universality. From ascoli at gmu.edu Tue Oct 30 10:54:59 2001 From: ascoli at gmu.edu (Giorgio Ascoli) Date: Tue, 30 Oct 2001 10:54:59 -0500 Subject: postdoc opening - computational neuroscience Message-ID: <3BDECD53.41CE10BE@gmu.edu> Please post and distribute as you see fit (my apologies for cross-listing). Giorgio Ascoli COMPUTATIONAL NEUROSCIENCE POST-DOCTORAL POSITION AVAILABLE A post-doctoral position is available immediately for computational modeling of dendritic morphology, neuronal connectivity, and electrophysiology. All highly motivated candidates with a recent PhD in biology, computer science, physics, or other areas related to Neuroscience (including MD or engineering degree) are encouraged to apply. C programming skills and/or experience with NEURON, GENESIS or other modeling packages are desirable but not necessary. The post-doc will join a young and dynamic research group at the Krasnow Institute for Advanced Study, located in Fairfax, VA (20 miles west of Washington DC). The initial research project is focused on anatomically and biophysically detailed electrophysiological simulations of hippocampal neurons to study synaptic integration and the structure/activity/function relationship at the cellular and network level.
The post-doc will be hired as a Research Assistant Professor (with VA state employee benefits) with a salary based on the NIH postdoctoral scale, and will have generous office space, a new computer, and full-time access to Silicon Graphics and Linux servers and consoles. Send a CV, (p)reprints, a brief description of your motivation, and the names, email addresses and phone/fax numbers of references to ascoli at gmu.edu (or by fax at the number below) ASAP. There is no deadline, but the position will be filled as soon as a suitable candidate is found. Non-resident aliens are welcome to apply. The Krasnow Institute is an equal opportunity employer.

Computational Neuroanatomy Group: http://www.krasnow.gmu.edu/L-Neuron/
Krasnow Institute for Advanced Study: http://www.krasnow.org
George Mason University: http://www.gmu.edu

-------------------
Giorgio Ascoli, PhD
Head, Computational Neuroanatomy Group
Krasnow Institute for Advanced Study and Department of Psychology - MS2A1
George Mason University, Fairfax, VA 22030-4444
Web: www.krasnow.gmu.edu/ascoli
Ph. (703)993-4383 Fax (703)993-4325

From N.Chater at warwick.ac.uk Tue Oct 30 13:43:39 2001
From: N.Chater at warwick.ac.uk (Nick Chater)
Date: Tue, 30 Oct 2001 18:43:39 +0000
Subject: 8 Cog Science PhD and research positions at Warwick
Message-ID:

The Institute for Applied Cognitive Science at the University of Warwick is pleased to announce 5 research positions and 3 funded PhD studentships. The Institute has recently been set up with a $1.5M grant from the Wellcome Trust and Economic and Social Research Council, and has received a similar amount of funding from government, commercial and charitable sources. It has first-class computational and experimental facilities, including 128-channel ERP, 3 eye-trackers, and virtual reality and movement monitoring. The Institute for Applied Cognitive Science has links with internationally known faculty in the Department of Psychology, Department of Computer Science, Warwick Business School, Mathematics Institute, and Institute of Education.

The positions are:
(a) 2 PhD studentships on corpus analysis and experimental language research
(b) 1 PhD studentship on basic processes of learning and memory
(c) 2 four-year research positions on early reading
(d) 3 two-year research positions on financial decision making

All projects are directed by Prof Nick Chater and colleagues at the Institute. Please contact Nick Chater (nick.chater at warwick.ac.uk) if you are interested in any of these positions. Further details on each of (a)-(d) appear below.

(a) 2 PhD studentships on corpus analysis and experimental language research
This project is funded by the Human Frontiers Science Program, and links Warwick with laboratories in the US, France and Japan, to study the interaction of multiple cues to syntactic category identity across languages. The Warwick research theme will focus on corpus analysis and some experimental research with adults. An ideal candidate would have a strong background (good undergraduate degree and preferably Masters-level research experience) in corpus analysis, computation, linguistics, cognitive science or psycholinguistics. The successful applicants will work alongside a further 3 PhD students currently working on related topics.
The project will be directed in Warwick by Nick Chater, and the whole research network is coordinated by Morten Christiansen, at Cornell. The studentships are open to people of any nationality, and are available with immediate effect.

(b) 1 PhD studentship on basic processes of learning and memory
This project is funded by a European Training Network on "Basic processes of learning and memory", and links Warwick with laboratories in London, France and Belgium. An ideal candidate would have a strong background (good undergraduate degree and preferably Masters-level research experience) in cognitive psychology, cognitive science, computation, linguistics, or a related discipline. The project will be directed in Warwick by Nick Chater, and the whole research network is coordinated by Robert French, at the University of Liege. There are various eligibility requirements attached to this funding, the most important of which is that applicants must be from an EU state (or affiliated state), but not from the UK. The studentship is available with immediate effect.

(c) 2 four-year research positions on early reading
This project represents a new stage in a long-term, ongoing research project, funded by the Leverhulme Trust (a charitable foundation) and Essex Local Educational Authority. The goal of the research is to develop instructional principles based on cognitive science that substantially enhance children's ability to learn to read. The project has already shown some dramatic gains in large-scale classroom studies. One post will be primarily concerned with laboratory experimental research on basic principles of learning, with adults and children, and would be based at Warwick University. The second post will be based in Essex, and will involve designing and implementing classroom-based studies. The researchers will work as part of a large interdisciplinary team of researchers and educators. Ideal applicants would have graduate-level or beyond research experience in cognitive psychology, cognitive science or education, and preferably, though not necessarily, prior knowledge of reading research. The project will be directed by Nick Chater, Jonathan Solity and Gordon Brown. The positions are open to people of any nationality, and the project is likely to start around Jan 1, 2002 (although precise timing and funding details are still to be confirmed).

(d) 3 two-year research positions on financial decision making
Three two-year positions, for two postdoctoral researchers and a research associate, are opening up at the newly founded research group based at the Institute for Applied Cognitive Science, University of Warwick. The group will pursue fundamental and applied research on human decision making with relevance to the finance industry. Ideal candidates would have strong backgrounds in cognitive psychology, experimental economics, or cognitive science; IT skills would also be advantageous. Successful applicants will work in a research group with several other post-doctoral and post-graduate researchers. The positions are open to people of any nationality, and the project is likely to start around Jan 1, 2002 (although precise timing and funding details are still to be confirmed).

From CogSci at psyvax.psy.utexas.edu Tue Oct 30 15:09:28 2001
From: CogSci at psyvax.psy.utexas.edu (Cognitive Science Society)
Date: Tue, 30 Oct 2001 14:09:28 -0600
Subject: Rumelhart Prize
Message-ID: <5.0.0.25.2.20011030140638.02b567c0@psy.utexas.edu>

ANNOUNCEMENT AND CALL FOR NOMINATIONS: THE THIRD ANNUAL DAVID E.
RUMELHART PRIZE FOR CONTRIBUTIONS TO THE FORMAL ANALYSIS OF HUMAN COGNITION

The recipient of the Third Annual David E. Rumelhart Prize will be chosen during the first part of 2002. The winner will be announced at the 2002 Meeting of the Cognitive Science Society, and will receive the prize and give the Prize Lecture at the 2003 Meeting. The prize is awarded annually to an individual or collaborative team making a significant contemporary contribution to the formal analysis of human cognition. Mathematical modeling of human cognitive processes, formal analysis of language and other products of human cognitive activity, and computational analyses of human cognition using symbolic or non-symbolic frameworks all fall within the scope of the award. The Prize itself will consist of a certificate, a citation of the awardee's contribution, and a monetary award of $100,000.

Nomination, Selection and Award Presentation

For the Third Annual Prize, the selection committee will continue to consider nominations previously submitted. The committee invites updates to existing nominations as well as new nominations. Materials should be sent to the Prize Administration address at the end of this announcement. To be considered in the committee's deliberations for the Third David E. Rumelhart Prize, materials must be received by Friday, January 11, 2002. Nominations should include six sets of the following materials: (1) a three-page statement of nomination, (2) a complete curriculum vitae, and (3) copies of up to five of the nominee's relevant publications. Note that the nominee may be an individual or a team, and in the case of a team, vitae for all members should be provided. The prize selection committee considers both the scientific contributions and the scientific leadership and collegiality of the nominees, so these issues should be addressed in the statement of nomination.

Previous Recipients and Prize-Related Activities

Previous winners of the David E. Rumelhart Prize are Geoffrey E. Hinton and Richard M. Shiffrin. Hinton received the First David E. Rumelhart Prize and delivered the Prize Lecture at the 2001 Meeting of the Cognitive Science Society. Shiffrin, the winner of the Second David E. Rumelhart Prize, was announced at the 2001 Meeting of the Cognitive Science Society. He will receive the prize and deliver the Prize Lecture at the 2002 meeting.

Funding of the Prize

The David E. Rumelhart Prize is funded by the Robert J. Glushko and Pamela Samuelson Foundation, based in San Francisco. Robert J. Glushko is an entrepreneur in Silicon Valley who received a Ph. D. in Cognitive Psychology in 1979 under Rumelhart's supervision.

Prize Administration

The Rumelhart Prize is administered by the Chair of the Prize Selection Committee in consultation with the Glushko-Samuelson Foundation and the Distinguished Advisory Board. Screening of nominees and selection of the prize winner will be performed by the Prize Selection Committee. Scientific members (including the Chair) of the Prize Selection Committee will serve for up to two four-year terms, and members of this committee will be selected by the Glushko-Samuelson Foundation in consultation with the Distinguished Advisory Board. A representative of the Foundation will also serve on the Prize Selection Committee. Members of the Prize Selection Committee are listed at the end of this announcement.

David E. Rumelhart: A Scientific Biography

David E.
Rumelhart has made many contributions to the formal analysis of human cognition, working primarily within the frameworks of mathematical psychology, symbolic artificial intelligence, and parallel distributed processing. He also admired formal linguistic approaches to cognition and explored the possibility of formulating a formal grammar to capture the structure of stories. Rumelhart obtained his undergraduate education at the University of South Dakota, receiving a B.A. in psychology and mathematics in 1963. He studied mathematical psychology at Stanford University, receiving his Ph. D. in 1967. From 1967 to 1987 he served on the faculty of the Department of Psychology at the University of California, San Diego. In 1987 he moved to Stanford University, serving as Professor there until 1998. He has become disabled by Pick's disease, a progressive neurodegenerative illness, and now lives with his brother in Ann Arbor, Michigan. Rumelhart developed models of a wide range of aspects of human cognition, ranging from motor control to story understanding to visual letter recognition to metaphor and analogy. He collaborated with Don Norman and the LNR Research Group to produce "Explorations in Cognition" in 1975 and with Jay McClelland and the PDP Research Group to produce "Parallel Distributed Processing: Explorations in the Microstructure of Cognition" in 1986. He mastered many formal approaches to human cognition, developing his own list processing language and formulating the powerful back-propagation learning algorithm for training networks of neuron-like processing units. Rumelhart was elected to the National Academy of Sciences in 1991 and received many prizes, including a MacArthur Fellowship, the Warren Medal of the Society of Experimental Psychologists, and the APA Distinguished Scientific Contribution Award. Rumelhart articulated a clear view of what cognitive science, the discipline, is or ought to be. He felt that for cognitive science to be a science, it would have to have formal theories, and he often pointed to linguistic theories, as well as to mathematical and computational models, as examples of what he had in mind. Prize Selection Committee Alan Collins Department of Learning Sciences School of Education and Social Policy Northwestern University Mark Liberman Departments of Computer and Information Sciences and Linguistics University of Pennsylvania Anthony J. Marley Department of Psychology McGill University James L. McClelland (Chair) Carnegie Mellon University and Center for the Neural Basis of Cognition Pittsburgh, Pennsylvania Inquiries and Nominations should be sent to David E. Rumelhart Prize Administration Center for the Neural Basis of Cognition 115 Mellon Institute 4400 Fifth Avenue Pittsburgh, PA 15213 412-268-4000 derprize at cnbc.cmu.edu Visit the prize web site at www.cnbc.cmu.edu/derprize ---------- Cognitive Science Society c/o Tanikqua Young Department of Psychology University of Texas Austin, TX 78712 Phone: (512) 471-2030 Fax: (512) 471-3053 ---------- From laubach at jbpierce.org Tue Oct 30 10:17:22 2001 From: laubach at jbpierce.org (Mark Laubach) Date: Tue, 30 Oct 2001 10:17:22 -0500 Subject: postdoc position Message-ID: <3BDEC482.80402@jbpierce.org> POSTDOCTORAL FELLOWSHIP IN SYSTEMS NEUROPHYSIOLOGY In the laboratory of Mark Laubach, Ph.D. Assistant Fellow, J.B. Pierce Laboratory Assistant Professor, Dept. of Neurobiology Yale School of Medicine A postdoctoral position is available immediately in the lab of Mark Laubach at the John B. Pierce Laboratory. 
The focus of the laboratory is to understand how ensembles of neurons work together to represent behaviorally relevant information and how representations of animal behavior are altered in relation to experience. The lab has three experimental themes. First, we are examining how tactile and olfactory stimuli are encoded at various levels of the nervous system. Second, we are investigating how representations of stimuli that control behavior are altered by sensorimotor and discrimination learning. Finally, we are beginning to study how sensorimotor capabilities are altered by aging. A major component of this research involves the use of multielectrode recording methods to record simultaneously from groups of neurons in multiple brain areas in awake, behaving animals. The lab is a state-of-the-art facility for carrying out neuronal ensemble recording experiments and for performing quantitative analysis of neuronal ensemble data. We are active in improving methods for neuronal ensemble recording through several collaborative projects, both locally at Yale and elsewhere. A major goal is to carry out spike sorting and spike train analyses on-line and in real experimental time using modern methods for signal processing, statistical pattern recognition, and parallel, real-time computing. In addition, we are engaged in some computational studies to better understand potential network properties that may give rise to the patterns of neuronal ensemble activity that can be used to predict an animal's behavior at a given instant in time.

Interested candidates should send a vita, a summary of research experience, and the names of three individuals who can write letters of recommendation to:
Mark Laubach, Ph.D.
The John B. Pierce Laboratory
Yale School of Medicine
290 Congress Ave
New Haven CT 06519
laubach at jbpierce.org
http://www.jbpierce.org/staff/laubach.html

From R.M.Everson at exeter.ac.uk Wed Oct 31 16:21:21 2001
From: R.M.Everson at exeter.ac.uk (R.M.Everson)
Date: 31 Oct 2001 21:21:21 +0000
Subject: Postdoctoral fellowship in Critical Systems and Data-Driven Technology
Message-ID:

RESEARCH FELLOW in Critical Systems and Data-Driven Technology
Department of Computer Science and School of Mathematical Sciences
Exeter University

Highly motivated candidates are sought for a post-doctoral position to join an EPSRC-funded project applying inductive technologies (neural networks, Bayes nets, etc.) to safety-critical systems. This project is a collaboration with the National Air Traffic Service and the Royal London Hospital, who will be closely involved. We shall address the theoretical and practical issues of managing critical systems with inductive technologies. We are looking for post-doctoral workers with a strong background in machine learning, data analysis or statistical classification, together with an interest in applications. The Pattern Analysis and Statistics groups at the University of Exeter have a strong tradition in data analysis, statistical modelling, pattern recognition and critical systems. Successful applicants will join a team working with Professors Partridge (Computer Science) and Krzanowski (Statistics), and Dr Everson (Computer Science).

Further information is available from Professor Derek Partridge or Dr. Richard Everson, School of Engineering and Computer Science, University of Exeter, Exeter, EX4 4PT, UK. Tel: +44 1392 264061 Email: {D.Partridge,R.Everson}@exeter.ac.uk

Closing date: 30th November 2001.