From iehava at ie.technion.ac.il Mon Jan 4 11:18:25 1999 From: iehava at ie.technion.ac.il (Hava Siegelmann) Date: Mon, 4 Jan 1999 18:18:25 +0200 (EET) Subject: Announcing a New Book Message-ID: Neural Networks and Analog Computation: Beyond the Turing Limit Author: Hava T. Siegelmann The theoretical foundations of Neural Networks and Analog Computation conceptualize neural networks as a particular type of computer consisting of multiple assemblies of basic processors interconnected in an intricate structure. Examining these networks under various resource constraints reveals a continuum of computational devices, several of which coincide with well-known classical models. What emerges is a Church-Turing-like thesis, applied to the field of analog computation, which features the neural network model in place of the digital Turing machine. This new concept can serve as a point of departure for the development of alternative, supra-Turing computational theories. On a mathematical level, the treatment of neural computations not only enriches the theory of computation but also explicates the computational complexity associated with biological networks, adaptive engineering tools, and related models from the fields of control theory and nonlinear dynamics. The material in this book will be of interest to researchers in a variety of engineering and applied sciences disciplines. In addition, the work may provide the basis for a graduate-level seminar in neural networks for computer science students. Special care has been taken to explain the theory clearly and concisely. The first chapter reviews the fundamental terms of modern computational theory from the point of view of neural networks and serves as a reference for the remainder of the book. Each of the subsequent chapters opens with introductory material and proceeds to explain the chapter's connection to the development of the theory. Thereafter, the concept is defined in mathematical terms. Birkhäuser Boston * Basel * Berlin ISBN 0-8176-3949-7 e-mail: orders at birkhauser.ch Web: http://www.birkhauser.ch From harnad at coglit.soton.ac.uk Mon Jan 4 07:14:37 1999 From: harnad at coglit.soton.ac.uk (Stevan Harnad) Date: Mon, 4 Jan 1999 12:14:37 +0000 (GMT) Subject: Origin of Culture: PSYCOLOQUY Call for Commentators Message-ID: Gabora: ORIGIN OF CULTURE The target article whose abstract appears below has just appeared in PSYCOLOQUY, a refereed journal of Open Peer Commentary sponsored by the American Psychological Association. Qualified professional biobehavioral, neural or cognitive scientists are hereby invited to submit Open Peer Commentary on it. Please email for Instructions if you are not familiar with the format or acceptance criteria for PSYCOLOQUY commentaries (all submissions are refereed). To submit articles and commentaries or to seek information: EMAIL: psyc at pucc.princeton.edu URL: http://www.princeton.edu/~harnad/psyc.html http://www.cogsci.soton.ac.uk/psyc RATIONALE FOR SOLICITING COMMENTARY: This target article presents a model of cognitive origins to explain the transition from episodic to mimetic/memetic culture (as outlined by Merlin Donald in "Origins of the Modern Mind," 1991) using Stuart Kauffman's ideas about how an information-evolving system can emerge through autocatalysis ("Origins of Order," 1993).
I would like to invite commentary from cognitive anthropologists and archeologists on the plausibility of the proposal, from neuroscientists on the neurobiological aspects of this model, and from cognitive psychologists on its compatibility with other dynamic models memory (i.e. models of how one thought evokes another in a train of associations.) I also invite discussion of the memetic perspective of culture as an information-evolving system. Full text of article available at: http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?9.67 or ftp://ftp.princeton.edu/pub/harnad/Psycoloquy/1998.volume.9/ psyc.98.9.67.origin-culture.1.gabora ----------------------------------------------------------------------- AUTOCATALYTIC CLOSURE IN A COGNITIVE SYSTEM: A TENTATIVE SCENARIO FOR THE ORIGIN OF CULTURE Liane Gabora Center Leo Apostel Brussels Free University Krijgskundestraat 33 1160 Brussels Belgium lgabora at vub.ac.be http://www.vub.ac.be/CLEA/liane/ ABSTRACT: This target article presents a speculative model of the cognitive mechanisms underlying the transition from episodic to mimetic (or memetic) culture with the arrival of Homo Erectus, which Donald (1991) claims paved the way for the unique features of human culture. The model draws on Kauffman's (1993) theory of how an information-evolving system emerges through the formation of an autocatalytic network. Though originally formulated to explain the origin of life, Kauffman's theory also provides a plausible account of how discrete episodic memories become woven into an internal model of the world, or world-view, that both structures, and is structured by, self-triggered streams of thought. Social interaction plays a role in (and may be critical to) this process. Implications for cognitive development are explored. KEYWORDS: abstraction, animal cognition, autocatalysis, cognitive development, cognitive origins, consciousness, cultural evolution, memory, meme, mimetic culture, representational redescription, world-view. ----------------------------------------------------------------------- INSTRUCTIONS FOR PSYCOLOQUY COMMENTATORS PSYCOLOQUY is a refereed electronic journal (ISSN 1055-0143) sponsored on an experimental basis by the American Psychological Association and currently estimated to reach a readership of 50,000. PSYCOLOQUY publishes brief reports of new ideas and findings on which the author wishes to solicit rapid peer feedback, international and interdisciplinary ("Scholarly Skywriting"), in all areas of psychology and its related fields (biobehavioral science, cognitive science, neuroscience, social science, etc.). All contributions are refereed. Accepted PSYCOLOQUY target articles have been judged by 5-8 referees to be appropriate for Open Peer Commentary, the special service provided by PSYCOLOQUY to investigators in psychology, neuroscience, behavioral biology, cognitive sciences and philosophy who wish to solicit multiple responses from an international group of fellow specialists within and across these disciplines to a particularly significant and controversial piece of work. If you feel that you can contribute substantive criticism, interpretation, elaboration or pertinent complementary or supplementary material on a PSYCOLOQUY target article, you are invited to submit a formal electronic commentary. 1. Before preparing your commentary, please examine recent numbers of PSYCOLOQUY if not familiar with the journal. 2. Commentaries should preferably be up to ~200 lines (~1800 words) 3. Please provide a title for your commentary. 
As many commentators will address the same general topic, your title should be a distinctive one that reflects the gist of your specific contribution and is suitable for the kind of keyword indexing used in modern bibliographic retrieval systems. Each commentary should also have a brief (~100 word) abstract 4. All paragraphs should be numbered consecutively. Line length should not exceed 72 characters. The commentary should begin with the title, your name and full institutional address (including zip code) and email address. References must be prepared in accordance with the examples given in the Instructions. Please read the sections of the Instruction for Authors concerning style, preparation and editing. Please include URL wherever available. Target article length should preferably be up to 1200 lines [c. 10,000 words]. All target articles, commentaries and responses must have (1) a short abstract (up to 200 words for target articles, shorter for commentaries and responses), (2) an indexable title, (3) the authors' full name(s) and institutional address(es), (4) email addresses, (5) Home-page URLs. In addition, for target articles only: (4) 6-8 indexable keywords, (5) a separate statement of the authors' rationale for soliciting commentary (e.g., why would commentary be useful and of interest to the field? what kind of commentary do you expect to elicit?) and (6) a list of potential commentators (with their email addresses). All paragraphs should be numbered in articles, commentaries and responses (see format of already published articles in the PSYCOLOQUY archive; line length should be < 80 characters, no hyphenation). Figures should be Web-ready gifs, jpegs or equivalent. Captions should be in separate html file that links to the gifs. PSYCOLOQUY also publishes multiple reviews of books in any of the above fields; these should normally be the same length as commentaries, but longer reviews will be considered as well. Book authors should submit a 500-line self-contained Precis of their book, in the format of a target article; if accepted, this will be published in PSYCOLOQUY together with a formal Call for Reviews (of the book, not the Precis). The author's publisher must agree in advance to furnish review copies to the reviewers selected. Authors of accepted manuscripts assign to PSYCOLOQUY the right to publish and distribute their text electronically and to archive and make it permanently retrievable electronically, but they retain the copyright, and after it has appeared in PSYCOLOQUY authors may republish their text in any way they wish -- electronic or print -- as long as they clearly acknowledge PSYCOLOQUY as its original locus of publication. However, except in very special cases, agreed upon in advance, contributions that have already been published or are being considered for publication elsewhere are not eligible to be considered for publication in PSYCOLOQUY, Please submit all material to psyc at pucc.bitnet or psyc at pucc.princeton.edu URLs for retrieving full texts of target articles: http://www.princeton.edu/~harnad/psyc.html http://cogsci.soton.ac.uk/psyc ftp://ftp.princeton.edu/pub/harnad/Psycoloquy ftp://cogsci.soton.ac.uk/pub/harnad/Psycoloquy news:sci.psychology.journals.psycoloquy From jagota at cse.ucsc.edu Tue Jan 5 18:52:41 1999 From: jagota at cse.ucsc.edu (Arun Jagota) Date: Tue, 5 Jan 1999 15:52:41 -0800 (PST) Subject: connectionist symbol processing collective survey Message-ID: <199901052352.PAA07507@arapaho.cse.ucsc.edu> 1. 
New NCS e-publication: collective survey 2. Forward references to NCS articles solicited ----- 1 ----- A. Jagota, T. Plate, L. Shastri, R. Sun (eds), Connectionist Symbol Processing: Dead or Alive?, Neural Computing Surveys 2, 1--40, 1999, 220 references. http://www.icsi.berkeley.edu/~jagota/NCS PREFACE: In August 1998 Dave Touretzky asked on the connectionists e-mailing list, ``Is connectionist symbol processing dead?'' This query led to an interesting discussion and exchange of ideas. We thought it might be useful to capture this exchange in an article. We solicited contributions, and this collective article is the result. Contributions were solicited by a public call on the connectionists e-mailing list. All contributions received were subjected to two to three informal reviews. Almost all were accepted with varying degrees of revision. Given the number and variety of contributions, the articles cover a wide, though by no means complete, range of the work in the field. The pieces in this article are of varying nature: position summaries, individual research summaries, historical accounts, discussions of controversial issues, etc. We have not attempted to connect the various pieces together, or to organize them within a coherent framework. Despite this, we think the reader will find this collection useful. CONTRIBUTORS: D. S. Blank, M. Coltheart, J. Diederich, B.M. Garner, R.W. Gayler, C.L. Giles, L. Goldfarb, M. Hadeishi, B. Hazlehurst, M. J. Healy, J. Henderson, N. G. Jani, D. S. Levine, S. Lucas, T. Plate, G. Reeke, D. Roth, L. Shastri, J. Sougne, R. Sun, W. Tabor, B. B. Thompson, S. Wermter ----- 2 ----- NCS would like to begin to maintain FORWARD references to published articles. If you are aware of any such, e-mail to jagota at cse.ucsc.edu the following information for each such reference CITING ARTICLE: FULL REFERENCE NCS CITED ARTICLE -------------- From baluja at grr.ius.cs.cmu.edu Tue Jan 5 22:26:37 1999 From: baluja at grr.ius.cs.cmu.edu (baluja@grr.ius.cs.cmu.edu) Date: Tue, 5 Jan 99 22:26:37 EST Subject: Probabilistic Modeling for Face Orientation Discrimination Message-ID: The following paper is available from: http://www.cs.cmu.edu/~baluja Probabilistic Modeling for Face Orientation Discrimination: Learning from Labeled and Unlabeled Data Shumeet Baluja Abstract: This paper presents probabilistic modeling methods to solve the problem of discriminating between five facial orientations with very little labeled data. Three models are explored. The first model maintains no inter-pixel dependencies, the second model is capable of modeling a set of arbitrary pair-wise dependencies, and the last model allows dependencies only between neighboring pixels. We show that for all three of these models, the accuracy of the learned models can be greatly improved by augmenting a small number of labeled training images with a large set of unlabeled images using Expectation-Maximization. This is important because it is often difficult to obtain image labels, while many unlabeled images are readily available. Through a large set of empirical tests, we examine the benefits of unlabeled data for each of the models. By using only two randomly selected labeled examples per class, we can discriminate between the five facial orientations with an accuracy of 94%; with six labeled examples, we achieve an accuracy of 98%. This work was completed while the author was at: Justsystem Pittsburgh Research Center & School of Computer Science, Carnegie Mellon University Comments and Questions welcome. Please send all feedback to sbaluja at lycos.com.
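To make the labeled-plus-unlabeled EM idea in the abstract above concrete, here is a minimal sketch of a generic Naive-Bayes-style model over binary pixel vectors whose parameters are re-estimated with Expectation-Maximization from a few labeled and many unlabeled examples. It only illustrates the general technique: the model, the smoothing, and all names are invented for this sketch and are not Baluja's models, code, or data.

```python
# Illustrative only: EM for a Naive-Bayes-like model over {0,1} pixel vectors,
# combining a few labeled examples with many unlabeled ones.
import numpy as np

def train_em(X_lab, y_lab, X_unl, n_classes, n_iter=20, eps=1e-6):
    """X_* are {0,1} arrays (n_examples, n_pixels); y_lab holds integer class labels."""
    n_pix = X_lab.shape[1]
    # Initialize class priors and per-pixel Bernoulli parameters from labeled data only.
    prior = np.full(n_classes, 1.0 / n_classes)
    theta = np.full((n_classes, n_pix), 0.5)
    for c in range(n_classes):
        Xc = X_lab[y_lab == c]
        if len(Xc):
            theta[c] = (Xc.sum(0) + 1.0) / (len(Xc) + 2.0)   # Laplace smoothing
    for _ in range(n_iter):
        # E-step: soft class posteriors for the unlabeled images.
        log_p = np.log(prior + eps) + X_unl @ np.log(theta + eps).T \
                + (1 - X_unl) @ np.log(1 - theta + eps).T
        log_p -= log_p.max(1, keepdims=True)
        post = np.exp(log_p); post /= post.sum(1, keepdims=True)
        # M-step: re-estimate parameters from labeled counts plus expected unlabeled counts.
        hard = np.eye(n_classes)[y_lab]          # labeled examples keep their labels
        resp = np.vstack([hard, post])
        X_all = np.vstack([X_lab, X_unl])
        counts = resp.sum(0)
        prior = counts / counts.sum()
        theta = (resp.T @ X_all + 1.0) / (counts[:, None] + 2.0)
    return prior, theta

def classify(X, prior, theta, eps=1e-6):
    log_p = np.log(prior + eps) + X @ np.log(theta + eps).T \
            + (1 - X) @ np.log(1 - theta + eps).T
    return log_p.argmax(1)
```

The E-step assigns soft class posteriors to the unlabeled images and the M-step re-estimates priors and pixel probabilities from labeled counts plus those expected counts, which is how a handful of labels can be leveraged by a large unlabeled pool.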
From shimone at cogs.susx.ac.uk Wed Jan 6 09:04:01 1999 From: shimone at cogs.susx.ac.uk (Shimon Edelman) Date: Wed, 6 Jan 1999 14:04:01 +0000 Subject: research fellowship (vision / neural computation; UK) Message-ID: Research Fellowship -- Visual Object Representation and Categorization School of Cognitive and Computing Sciences University of Sussex Applications are invited for a 2-year EPSRC-funded research fellowship (salary scale 15,735 to 23,651 GBP p/a), starting as soon as possible, and in any case not later than April 1st, 1999. The successful applicant will join an interdisciplinary research group within the School of Cognitive and Computing Sciences, and will work on the development of computer algorithms for representation, recognition and categorization of visual objects and scenes. Candidates should have a PhD degree in Computer Science or equivalent, and should be familiar with issues and modeling techniques in neural computation and computational neuroscience. Relevant research experience and proficiency with rapid prototyping programming environments such as Matlab will be advantageous. Send full CV and letter of application to Professor Shimon Edelman, COGS, University of Sussex, Brighton, BN1 9QH, England, by 30th January (informal enquiries may be directed to shimone at cogs.susx.ac.uk). From Jakub.Zavrel at kub.nl Wed Jan 6 09:31:44 1999 From: Jakub.Zavrel at kub.nl (Jakub.Zavrel@kub.nl) Date: Wed, 6 Jan 1999 15:31:44 +0100 (MET) Subject: Software release: Timbl 2.0 Message-ID: <199901061431.PAA20575@kubsuw.kub.nl> ---------------------------------------------------------------------- Software release: TiMBL 2.0 Tilburg Memory Based Learner ILK Research Group, http://ilk.kub.nl/ ---------------------------------------------------------------------- (sorry if you get this more than once) The ILK (Induction of Linguistic Knowledge) Research Group at Tilburg University, The Netherlands, announces the release of a new version of TiMBL, the Tilburg Memory Based Learner (version 2.0). TiMBL is a machine learning program implementing a family of Memory-Based Learning techniques. TiMBL stores a representation of the training set explicitly in memory (hence `Memory Based'), and classifies new cases by extrapolating from the most similar stored cases. TiMBL features the following (optional) metrics and speed-up optimizations that enhance the underlying k-nearest neighbor classifier engine: - Information Gain weighting for dealing with features of differing importance (the IB1-IG learning algorithm; see the illustrative sketch below). - Stanfill & Waltz's / Cost & Salzberg's (Modified) Value Difference metric for making graded guesses of the match between two different symbolic values. - Conversion of the flat instance memory into a decision tree, and inverted indexing of the instance memory, both yielding faster classification. - Further compression and pruning of the decision tree, guided by feature information gain differences, for an even larger speed-up (the IGTREE learning algorithm). The current version is a complete rewrite of the software, and offers a number of new features: - Support for numeric features. - The TRIBL algorithm, a hybrid between decision tree and nearest neighbor search. - An API to access the functionality of TiMBL from your own C++ programs. - Increased ability to monitor the process of extrapolation from nearest neighbors. - Many bug-fixes and small improvements.
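As referenced in the IB1-IG item above, here is a toy sketch of an information-gain-weighted overlap distance with a 1-nearest-neighbour decision. It illustrates only the general idea behind memory-based learning with IG feature weights; it is not TiMBL code, and the tiny dataset and all names are made up.

```python
# Toy illustration of information-gain-weighted overlap distance (the idea behind
# IB1-IG), not TiMBL's actual implementation.  Instances are tuples of symbolic
# feature values; labels are class symbols.
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(instances, labels, feature):
    """Reduction in class entropy obtained by splitting on one feature."""
    by_value = {}
    for inst, lab in zip(instances, labels):
        by_value.setdefault(inst[feature], []).append(lab)
    rest = sum(len(ls) / len(labels) * entropy(ls) for ls in by_value.values())
    return entropy(labels) - rest

def classify_ib1_ig(instances, labels, query):
    weights = [information_gain(instances, labels, f) for f in range(len(query))]
    def distance(inst):
        # Weighted overlap: a mismatching feature costs its information gain.
        return sum(w for w, a, b in zip(weights, inst, query) if a != b)
    best = min(range(len(instances)), key=lambda i: distance(instances[i]))
    return labels[best]

# Made-up example data: three symbolic features, two classes.
train = [("k", "a", "t"), ("h", "o", "nd"), ("k", "o", "e"), ("h", "a", "s")]
train_y = ["je", "je", "tje", "je"]
print(classify_ib1_ig(train, train_y, ("k", "o", "t")))
```

In the weighted-overlap view, a mismatch on an informative feature costs more than a mismatch on an uninformative one, which is what lets this kind of classifier cope with features of very different importance.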
TiMBL accepts command-line arguments by which these metrics and optimizations can be selected and combined. TiMBL can read the C4.5 and WEKA's ARFF data file formats as well as column files and compact (fixed-width, delimiter-less) data. -[download]----------------------------------------------------------- You are invited to download the TiMBL package for educational or non-commercial research purposes. When downloading the package you are asked to register, and express your agreement with the license terms. TiMBL is *not* shareware or public domain software. If you have registered for version 1.0, please be so kind as to re-register for the current version. The TiMBL software package can be downloaded from http://ilk.kub.nl/software.html or by following the `Software' link under the ILK home page at http://ilk.kub.nl/ . The TiMBL package contains the following: - Source code (C++) with a Makefile. - A reference guide containing descriptions of the incorporated algorithms, detailed descriptions of the command-line options, and a brief hands-on tutorial. - Some example datasets. - The text of the licence agreement. - A postscript version of the paper that describes IGTREE. The package should be easy to install on most UNIX systems. -[background]--------------------------------------------------------- Memory-based learning (MBL) has proven to be quite successful in a large number of tasks in Natural Language Processing (NLP) -- MBL of NLP tasks (text-to-speech, part-of-speech tagging, chunking, light parsing) is the main theme of research of the ILK group. At one point it was decided to build a well-coded and generic tool that would combine the group's algorithms, favorite optimization tricks, and interface desiderata. The current incarnation of this is now version 2.0 of TiMBL. We think TiMBL can be a useful tool for NLP research and, for that matter, for any other domain in machine learning. For information on the ILK Research Group, visit our site at http://ilk.kub.nl/ On this site you can find links to (postscript versions of) publications relating to the algorithms incorporated in TiMBL and to their application to NLP tasks. The reference guide ("TiMBL: Tilburg Memory-Based Learner, version 2.0, Reference Guide.", Walter Daelemans, Jakub Zavrel, Ko van der Sloot, and Antal van den Bosch. ILK Technical Report 99-01) can be downloaded separately and directly from http://ilk.kub.nl/~ilk/papers/ilk9901.ps.gz For comments and bug reports relating to TiMBL, please send mail to Timbl at kub.nl Please also send a mail to this address if you do not wish to receive further mail about TiMBL. ---------------------------------------------------------------------- From mpp at us.ibm.com Wed Jan 6 15:07:04 1999 From: mpp at us.ibm.com (mpp@us.ibm.com) Date: Wed, 6 Jan 1999 15:07:04 -0500 Subject: Job announcement: IBM Pen Computing Group Message-ID: <852566F1.00700E84.00@D51MTA03.pok.ibm.com> ======> R&D Positions at IBM <====== The Pen Computing Group at the IBM T.J. Watson Research Center is looking for highly motivated individuals to fill R&D positions in the area of large-vocabulary, unconstrained handwriting recognition.
Candidates should have the following qualifications: - MS/PhD in EE, CS, Math, Physics or similar field - Strong mathematics/probability background - Excellent programming skills (in C and C++) - Creativity - Ability to work well as part of a team - Ability to direct their own research - Interest in creating a marketable product ======> Background <====== Our current projects include: - HMM-based, unconstrained, handwriting recognition - Language and grammar modeling - Accurate, high-speed, search methods - Document understanding and processing - Pen computing We have licensed our group's technology for the recently introduced CrossPad (http://www.cross-pcg.com) which enables users to efficiently index, search and manipulate their hand-written notes on their PC using a normal pad of paper. The IBM T.J. Watson Research Center is one of the top industrial laboratories in the world. We offer an exciting research environment with the opportunity to become involved in all aspects of cutting edge technology in the computer industry. ======> Contact Information <====== Please send CV's to: Michael P. Perrone mpp at us.ibm.com -or- Michael P. Perrone IBM T.J. Watson Research Center - 36-207 Route 134 Yorktown Heights, NY 10598 914-945-1779 From haim at fiz.huji.ac.il Wed Jan 6 07:21:16 1999 From: haim at fiz.huji.ac.il (Haim Sompolinsky) Date: Wed, 6 Jan 1999 14:21:16 +0200 Subject: Postdoc position at Hebrew University Message-ID: <00e201be396f$0f516c20$85504084@yesod.huji.ac.il> POST-DOCTORAL POSITION AVAILABLE The Neural Computational Theory Group in the Racah Institute of Physics at the Hebrew University of Jerusalem has a post-doctoral position open, beginning in the fall of 1999. The position will involve working on theoretical problems in two areas: 1.. Supervised and unsupervised learning 2.. Modeling visual and motor processing in cortical and subcortical neuronal structures The work to be done in the area of modeling is expected to involve interaction with experimental groups, which are part of the Interdisciplinary Center for Neural Computation at the Hebrew University. Candidates with a strong background in statistical mechanics, in neural network theory, or in computational neuroscience are encouraged to apply. Those interested should send an application letter along with CV, publications list, brief research statement, and three letters of recommendation to Professor Haim Sompolinsky. The position will be for one year, with the possibility for an extension of up to three years. Airmail submissions: Professor Haim Sompolinsky Racah Institute of Physics Hebrew University of Jerusalem Givat Ram Jerusalem 91904 ISRAEL Fax submissions: 972-2-6584437 Email submissions: haim at fiz.huji.ac.il From espaa at exeter.ac.uk Thu Jan 7 07:36:09 1999 From: espaa at exeter.ac.uk (ESPAA) Date: Thu, 7 Jan 1999 12:36:09 +0000 (GMT Standard Time) Subject: CFP Special Issue PAA journal Message-ID: PATTERN ANALYSIS AND APPLICATIONS JOURNAL http://www.dcs.exeter.ac.uk/paa Springer-Verlag Ltd. CALL FOR PAPERS FOR SPECIAL ISSUE ON 'DOCUMENT IMAGE ANALYSIS AND RECOGNITION' (Guest Editor- Adnan Amin, University of New South Wales, Australia) (DEADLINE FOR PAPER SUBMISSION: 1 April, 1999). Optical Character Recognition and document image analysis have become very important areas of research. Advanced computer and communication technologies now offer better ways to store, retrieve, and distribute this information. 
Document Image Analysis and Recognition (DIAR) research provides the technology for automated systems for extracting and organising information from paper-based documents. Generally, these applications apply image processing and pattern recognition techniques. A document image may contain text, graphics, pictures, or a combination of these. Some commercial products are already available, such as OCR systems for reading pages of machine-printed text, but research is still required to improve their performance and to handle the full range of real-world variability in typography, image quality, and context. Performance evaluation of DIAR systems requires experimental design, a large training and test database, and sophisticated analysis of the results. The aim of this special issue is to showcase the state-of-the-art achievements in DIAR. Submitted papers should report the solution of a significant open problem: theoretical, algorithmic, and systems-architectural studies are welcome, as are papers describing practical applications supported by performance evaluation on a large scale. Topics appropriate for this special issue include, but are not limited to: - Document analysis algorithms and tools - Physical and logical page/image segmentation - Character and symbol recognition methods - Pre- and post-processing algorithms - Graphical object recognition (e.g. maps and engineering drawings) - System performance measures - Recognition/classification methodologies for DIAR systems - Innovative and industrial applications Send four copies of your manuscript (marked "DIAR SPECIAL ISSUE") by April 1, 1999 to the following address: Sameer Singh, Editor-in-Chief, Pattern Analysis and Applications, Department of Computer Science, University of Exeter, Exeter EX4 4PT, UK __________________________________ Oliver Jenkin Editorial Secretary Pattern Analysis and Applications Department of Computer Science University of Exeter Exeter EX4 4PT tel: +44-1392-264066 fax: +44-1392-264067 email: espaa at exeter.ac.uk ____________________________ From smola at first.gmd.de Thu Jan 7 12:14:30 1999 From: smola at first.gmd.de (Alex Smola) Date: Thu, 07 Jan 1999 18:14:30 +0100 Subject: PhD Thesis available: Learning with Kernels Message-ID: <3694EB76.5520A7D9@first.gmd.de> Dear Connectionists, I am pleased to announce that my PhD thesis "Learning with Kernels" is now available at http://svm.first.gmd.de/papers/Smola98.ps.gz Abstract Support Vector (SV) Machines combine several techniques from statistics, machine learning and neural networks. One of the most important ingredients is kernels, i.e. the concept of transforming linear algorithms into nonlinear ones via a map into feature spaces. The present work focuses on the following issues: - Extensions of Support Vector Machines. - Extensions of kernel methods to other algorithms such as unsupervised learning. - Capacity bounds which are particularly well suited for kernel methods. After a brief introduction to SV regression it is shown how the classical \epsilon-insensitive loss function can be replaced by other cost functions while keeping the original advantages or adding other features such as automatic parameter adaptation.
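For readers who have not met the term, the \epsilon-insensitive loss mentioned above has the standard textbook form below, and SV regression trades it off against a regularizer. The notation is generic and is not quoted from the thesis.

```latex
% Standard reference definitions (generic notation, not the thesis's):
% the epsilon-insensitive loss, and the SV regression objective it enters.
\[
  |y - f(x)|_{\epsilon} \;=\; \max\bigl(0,\; |y - f(x)| - \epsilon\bigr),
  \qquad
  \min_{w,\,b}\;\; \tfrac{1}{2}\,\|w\|^{2}
  \;+\; C \sum_{i=1}^{m} \bigl|\,y_i - (\langle w, \phi(x_i)\rangle + b)\,\bigr|_{\epsilon}.
\]
```

Errors smaller than \epsilon are ignored entirely, which is what produces the sparse set of support vectors.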
Moreover the connection between kernels and regularization is pointed out. A theoretical analysis of several common kernels follows, and criteria to check Mercer's condition more easily are presented. Further modifications lead to semiparametric models and greedy approximation schemes. Next, three different types of optimization algorithms, namely interior point codes, subset selection algorithms, and sequential minimal optimization (including pseudocode), are presented. The primal--dual framework is used as an analytic tool in this context. Unsupervised learning is an extension of kernel methods to new problems. Besides Kernel PCA, one can use the regularization to obtain more general feature extractors. A second approach leads to regularized quantization functionals which allow a smooth transition between the Generative Topographic Map and Principal Curves. The second part of the thesis deals with uniform convergence bounds for the algorithms and concepts presented so far. It starts with a brief self-contained overview of existing techniques and an introduction to functional analytic tools which play a crucial role in this problem. By viewing the class of kernel expansions as an image of a linear operator it is possible to give bounds on the generalization ability of kernel expansions even when standard concepts like the VC dimension fail or give way too conservative estimates. In particular it is shown that it is possible to compute the covering numbers of the given hypothesis classes directly instead of taking the detour via the VC dimension. Applications of the new tools to SV machines, convex combinations of hypotheses (i.e. boosting and sparse coding), greedy approximation schemes, and principal curves conclude the presentation. -- / Alex J. Smola GMD FIRST / / Room 214 Rudower Chaussee 5 / / Tel: (+49)30-6392-1833 12489 Berlin, Germany / / Fax: (+49)30-6392-1805 smola at first.gmd.de / / URL: http://www.first.gmd.de/~smola / From risc at lps.ens.fr Fri Jan 8 04:45:45 1999 From: risc at lps.ens.fr (risc@lps.ens.fr) Date: Fri, 8 Jan 1999 10:45:45 +0100 (CET) Subject: Workshop: Neurophysics and Physiology of the Motor System Message-ID: Neurophysics and Physiology of the Motor System Les Houches (France), February 7-12, 1999 The Les Houches school is located in a well-known ski resort near Chamonix (in the French Alps). It has been for more than forty years a highly prestigious institution for physicists all over the world. The present session will be the first ever to be dedicated, at Les Houches, to integrated Neurosciences. It will bring together physiologists and physicists, who will discuss recent experimental and theoretical advances on the structure, dynamics and functions of the motor system. It will provide a unique opportunity for the participants to become familiar with many of the fundamental issues related to the elaboration, the execution and the control of movement in the central nervous system. It will also bring to light how Neurophysics can contribute to understanding the motor system. Among the topics discussed will be: emerging functional properties, the role of the nonlinearities of neural dynamics, and the synchrony of neural activity. These questions will be introduced in the framework of the different nervous structures involved (cortex, cerebellum, basal ganglia, spinal cord), and further discussed in the wider context of the integrated physiology of the motor system. Mornings (8:30 AM to 11:30 AM) and late afternoons (5:00 PM to 7:15 PM) will be devoted to lectures. Organization: D. Hansel and C. Meunier, C.Ph.T. UMR 7644 CNRS Ecole Polytechnique 91128 Palaiseau, France. D. Golomb, Dept. of Physiology, Faculty of Health Sciences Ben Gurion University of the Negev, Beersheva, 84105, Israel Scientific committee: H. Bergman (Jerusalem), P.
Collet (Paris), C. Masson (Dijon), A. Schmied (Marseille), I. Segev (Jerusalem) Speakers: M. Abeles (Jerusalem), H. Bergman (Jerusalem), E. Fetz (Seattle), C. Feuerstein (Grenoble), D. Golomb (Beersheva), D. Hansel (Paris), L. Jami (Paris), R. Lemon (London), Y. Manor (Beersheva), C. Meunier (Paris), A. Riehle (Marseille), A. Schmied (Marseille), I. Segev (Jerusalem), H. Sompolinsky (Jerusalem), E. Vaadia (Jerusalem), C. van Vreeswijk (London), Y. Yarom (Jerusalem), J. Yelnik (Paris), D. Zytnicki (Paris). Les Houches is a resort village in the Chamonix valley of the French Alps. Established in 1951, the School is located in a group of mountain chalets surrounded by meadows and woods at an altitude of 1150 m. It is above the village, facing the Mont-Blanc range. Registration fees are 2200FF including accommodation and meals during the whole session. Number of participants is limited to 40, speakers included. The participation of students and young scientists is encouraged, and they may benefit from reduced fees (limited number). Applications (short curriculum vitae and publications list) should be sent before January, 15, 1999 to Mrs. Martine Escoute, URA 1448, UFR biom\'edicale, 45 rue des Saints-P\`eres, 75270 Paris cedex 06; telephone: 0142862138; fax: 0149279062; e-mail: Martine.Escoute at biomedicale.univ-paris5.fr. Les Houches Physics school is affiliated to Universit\'e Joseph Fourier (Grenoble) and Institut National Polytechnique de Grenoble. It is subsidized by Minist\`ere de l'Education Nationale et de l'Enseignement Sup\'erieur, Centre National de la Recherche Scientifique and Commissariat \`a l'Energie Atomique . ---------------------- David Hansel hansel at cpht.polytechnique.fr From baluja at vie.ius.cs.cmu.edu Sat Jan 9 23:16:29 1999 From: baluja at vie.ius.cs.cmu.edu (baluja@vie.ius.cs.cmu.edu) Date: Sat, 9 Jan 99 23:16:29 EST Subject: Making Templates Rotationally Invariant: An Application to Rotated Digit Recognition Message-ID: The following paper is available from: http://www.cs.cmu.edu/~baluja Making Templates Rotationally Invariant: An Application to Rotated Digit Recognition Shumeet Baluja Abstract: This paper describes a simple and efficient method to make template-based object classification invariant to in-plane rotations. The task is divided into two parts: orientation discrimination and classification. The key idea is to perform the orientation discrimination before the classification. This can be accomplished by hypothesizing, in turn, that the input image belongs to each class of interest. The image can then be rotated to maximize its similarity to the training images in each class (these contain the prototype object in an upright orientation). This process yields a set of images, at least one of which will have the object in an upright position. The resulting images can then be classified by models which have been trained with only upright examples. This approach has been successfully applied to two real-world vision-based tasks: rotated handwritten digit recognition and rotated face detection in cluttered scenes. This work was completed while the author was at: Justsystem Pittsburgh Research Center & School of Computer Science, Carnegie Mellon University Comments and Questions welcome. Please send all feedback to sbaluja at lycos.com. 
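A schematic sketch of the two-stage procedure described in the abstract above: hypothesize each class in turn, derotate the input against that class's upright prototype, then score the derotated candidates with a classifier trained only on upright examples. The correlation-based matcher, the coarse angle grid, and all function names are illustrative stand-ins, not the models used in the paper.

```python
# Schematic sketch of "derotate, then classify with upright-only models".
# The matching and scoring here are deliberately simple placeholders.
import numpy as np
from scipy.ndimage import rotate

def best_upright_version(image, prototype, angles=range(0, 360, 10)):
    """Rotate `image` to maximize correlation with an upright class prototype."""
    scored = []
    for a in angles:
        cand = rotate(image, a, reshape=False, mode="nearest")
        scored.append((np.corrcoef(cand.ravel(), prototype.ravel())[0, 1], cand))
    return max(scored, key=lambda t: t[0])[1]

def classify_rotated(image, prototypes, upright_classifier):
    """For each hypothesized class k, derotate against that class's prototype,
    then let a classifier trained only on upright examples score candidate k."""
    candidates = [best_upright_version(image, p) for p in prototypes]
    scores = [upright_classifier(c)[k] for k, c in enumerate(candidates)]
    return int(np.argmax(scores))
```

The point of hypothesizing the class before derotation is that at least one candidate ends up in an upright pose, so the upright-only classifier never has to cope with rotated inputs.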
From Jon.Baxter at syseng.anu.edu.au Sun Jan 10 20:55:09 1999 From: Jon.Baxter at syseng.anu.edu.au (Jonathan Baxter) Date: Mon, 11 Jan 1999 12:55:09 +1100 (EST) Subject: Trieste Workshop Message-ID: <199901110155.MAA15672@reid.anu.edu.au> Apologies if you receive this announcement more than once. --------------------------------------------------------------------------- SCHOOL ON NEURAL INFORMATION PROCESSING ( 3 - 28 May 1999 ) A School on Neural Information Processing will be held at the Abdus Salam International Centre for Theoretical Physics, Trieste, from 3 to 28 May 1999. This Bulletin contains the preliminary programme of the course, request for participation form and miscellaneous information. DIRECTORS: J.A. Hertz (NORDITA, Copenhagen) S.A. Solla (Northwestern University, Evanston) R. Zecchina (The Abdus Salam ICTP, Trieste) I. PURPOSE AND NATURE The goal of the school will be to present a systematic description of the theoretical approaches that provide tools to investigate the processing, transmission, and storage of information in the brain. Techniques based on the principles of statistical physics and information theory will be presented and applied to the analysis of a variety of problems in computational neuroscience, especially neural encoding and sensory processing. The lectures will be concentrated on biological neural networks, but will also cover some important developments in artificial networks and optimization theory. The lectures presented during the school intend to offer a broad and comprehensive training in order to provide a complete perspective of the field. In addition to the scheduled lectures (an average of 4 lectures per day, 5 days per week) there will be formal and informal seminars on a variety of research topics by lecturers, participants and visiting experts. Computer experimentations will also be organized. II. PRELIMINARY LIST OF LECTURERS AND TOPICS W. Bialek(NEC Research Institute, Princeton)Neural coding M. Biehl (University of Wuerzburg) The dynamics of learning P. Dayan (University College, London) Generative models A. Engel (University of Magdeburg) Statistical physics theory of learning D. Hansel (Ecole Polytechnique, Paris) Circuitry of the visual cortex J. Hertz (Nordita, Copenhagen ) Neural computation and encoding Li Zhaoping (University College, London) Sensory processing R. Monasson (Ecole Normale Superieure), Paris) Optimization problems J. Rinzel (New York University, New York) Modelling of cell/network dynamics T. Sejnowski (The Salk Institute, San Diego) Neural computation and encoding S. Solla(Northwestern University, Chicago)Neural networks for Bayesian inference M. Tsodyks (The Weizmann Institute, Rehovot) Synaptic dynamics L. van Hemmen (Technical University, Munich) Modelling neural circuitry R. Zecchina (ICTP, Trieste) Optimization problems III. PARTICIPATION Scientists and students from all countries that are members of the United Nations, UNESCO or IAEA can attend the School. The main purpose of the Centre is to help research workers from developing countries through a programme of training activities within a framework of international cooperation. However, students and post-doctoral scientists from developed countries are most welcome to attend. As the School will be conducted in English, participants should have an adequate working knowledge of that language. Participants should preferably have completed several years of study and research after a first degree. 
Applications from graduate students about to finish their PhD, fresh post-docs and young, active faculty members are encouraged. As a rule, travel and subsistence expenses of the participants are borne by the home institutions. However, some funds are available which permit the Centre to grant a subsistence allowance to a limited number of people from developing countries who will be selected by the Organizers. As scarcity of funds allows travel to be granted only in few exceptional cases, every effort should be made by candidates to secure support for their fares (or at least partial fare) by their home country. Such financial support is available only to those attending the entire School. Scientists from developed countries are welcome to join on their own funds. There is no registration fee for attending the School. Deadline for the RECEIPT of request for participation form: 20 January 1999 Candidates should complete and sign the attached "Request for Participation" form (also obtainable via e-mail: smr1157 at ictp.trieste.it, using as subject "get bulletin", or via WWW Server: http://www.ictp.trieste.it/), and send it to: International Centre for Theoretical Physics School on Neural Information Processing P.O. Box 586 (Strada Costiera 11: for courier delivery) I-34100 Trieste, Italy Please note that no LATEX/TEX files are permitted. Any attachments to the request for participation, relevant to extra information for selection purposes, should not exceed 6 pages. The decision of the Organizing Committee will be communicated to all candidates as soon as possible. UNITED NATIONS EDUCATIONAL SCIENTIFIC AND CULTURAL ORGANIZATION and INTERNATIONAL ATOMIC ENERGY AGENCY ABDUS SALAM INTERNATIONAL CENTRE FOR THEORETICAL PHYSICS (ICTP) P.O. Box 586 Telephone: +39 040 2240111 I-34100 Trieste Telex: +39 460392 ICTP I Italy Telefax: +39 040 224163 REQUEST FOR PARTICIPATION *) School on Neural Information Processing ( 3 - 28 May 1999 ) ____________________________________________________________________________ INSTRUCTIONS Each question must be answered clearly and A recent photo of the completely. Type or print in ink. If more candidate should be space is required, attach additional pages. attached here, signed The request for participation form should legibly on the reverse. be forwarded to the ICTP, School on Neural Information Processing, P.O. Box 586, I-34100 Trieste, Italy, to arrive before 20th January 1999 ________________________________________________________________________________ PERSONAL DATA PLEASE NOTE THAT UNLESS ALL REQUESTED PERSONAL DATA ARE PROVIDED, THE ICTP CANNOT PROCESS ANY VISA REQUESTS ________________________________________________________________________________ For women (if applicable) SURNAME: MAIDEN NAME: First name: Middle name(s): Sex: Please also indicate SURNAME, NAME, on passport if different from above: ________________________________________________________________________________ Place of birth (City and Country): Present nationality: Date of birth Year - Month - Day: ________________________________________________________________________________ Full name & address of permanent Institution: Tel. No. : Cable : Telex : Telefax : E-mail : ________________________________________________________________________________ Full name & address of present Institution (if different from permanent) Tel. No. : Cable : Telex : Telefax : E-mail : until: ________________________________________________________________________________ Home address: Tel. No. 
: ________________________________________________________________________________ Mailing address - please indicate whether: Permanent Institute __ Present Institute __ Home address __ ________________________________________________________________________________ Name and address of person to notify in case of emergency - Relationship: ________________________________________________________________________________ *) PLEASE NOTE that no request will be processed unless the permanent address (and present address, if different) is clearly indicated. EDUCATION (higher degrees) University or equivalent Years attended Degrees Name and place From to ________________________________________________________________________________ Seminars, summer schools, conferences or research Name and place Year ________________________________________________________________________________ SCIENTIFIC EMPLOYMENT AND ACADEMIC RESPONSIBILITY Research Institution or University Period of duty Academic Name and place From to responsibilities Present employment and duties, and foreseen employment upon return to home country after the activity: ________________________________________________________________________________ Have you participated in past ICTP activities: Yes ____ No If yes, which? Mention briefly your previous research experience, and explain your reasons for wishing to participate in this activity: NB: Our Scientific Information System keeps track of all applications made by the candidate to earlier ICTP activities. As a consequence, when the subject of the present activity is far from his/her previous applications, an explanation (not more than 200 words) of his/her change of interest should be included. ________________________________________________________________________________ Kindly supply (strictly within indicated lengths) a keyword description of your current scientific activities as follows: 1) Area of research (e.g. statistical physics, information theory): _______________________ (no more than 15 characters) 2) Specific topic of interest (e.g. learning theory, neural coding): ________________________________________________ (no more than 30 characters) - 2 Present field of interest (please indicate on the list below - up to 5 fields - underlining primary field) 10. PHYSICS OF CONDENSED MATTER 60. PHYSICS TEACHING 11. Solid State Physics 61. English 12. Atomic and Molecular Physics 62. French 13. Materials Science 63. Spanish 14. Surfaces and Interfaces 64. Arab 15. Statistical Physics 16. Computational Physics in Condensed Matter 80. MISCELLANEOUS 20. PHYSICS OF HIGH AND INTERMEDIATE ENERGIES 81. Others 82. Digital Communications and Computer Networking 21. High Energy and Particle Physics 22. Relativity, Cosmology and Astrophysics 23 Plasma Physics 90. PHYSICS OF THE LIVING STATE 24. Nuclear Physics 91. Neurophysics 92. Biophysics 30. MATHEMATICS 93. Medical Physics 31. Applicable Mathematics including: - Mathematical Ecology, - Systems Analysis, - Mathematical Economy - Mathematics in Industry AO. APPLIED PHYSICS 33. Algebra 34. Geometry A1. Physics in Industry 35. Topology A2. Microelectronics 36. Differential Equations A3. Fibre Optics for Communications 37. Analysis A4. Instrumentation 38. Mathematical Physics A5. Synchrotron Radiation A6. Non-destructive Evaluation A7. Lasers 40. PHYSICS AND ENERGY AA. Applied Superconductivity 41. Physics of Nuclear Reactors 42. Physics of Controlled Fusion 43. Non-Conventional Energy (Solar, Wind and others) 50. PHYSICS AND ENVIRONMENT B1. 
SPACE PHYSICS 51. Solid Earth Geophysics 52. Soil Physics 53. Climatology and Meteorology 54. Physics of the Oceans 55. Physics of Desertification 56. Physics of the Atmosphere, Troposphere Magnetosphere, Aeronomy 57. Environmental Monitoring and Remote Sensing ________________________________________________________________________________ List your scientific publications including books and articles (authors, title, Journal) in the period 1992-1999 - 3 - Please respond to the following questions regarding your expertise with computers. 1) Which is the operating system you use most often? _____ DOS - Windows _____ Unix _____ VMS _____ MacOS _____ Other (specify) 2) How experienced are you with the Unix environment? _____ Very experienced _____ Experienced _____ Somewhat familiar _____ Never used 3) Which programming languages are you familiar with? Very good Good Average Poor Fortran 77 _____ _____ _____ _____ Fortran 90 _____ _____ _____ _____ C _____ _____ _____ _____ C++ _____ _____ _____ _____ Pascal _____ _____ _____ _____ Other (specify _____ _____ _____ _____ 4) Have you written programmes for your research? ____ Yes _____ No If yes, briefly describe for what type of research applications: Approximately how many lines had the longest programme (or piece of programme) you ever wrote?_________ 5) Do you use computer programmes written by others (including commercial ones) for your research? _____ Yes _____ No If yes, briefly describe for what type of research applications: - 4 - Kindly state any positions you hold in the scientific administration of your Institution or any of the national scientific Institutions. If appropriate, especially for junior physicists, it would be of assistance to the Selection Committee if this request for participation were accompanied by a letter of recommendation. ________________________________________________________________________________ Indicate below your proficiency in the English language Reading: Good ___ Writing: Good ___ Speaking: Good ___ Average ___ Average ___ Average ___ Poor ___ Poor ___ Poor ___ ________________________________________________________________________________ APPLICABLE ONLY FOR CANDIDATES FROM DEVELOPING COUNTRIES (Important: Please note that very few travel grants are available, and preference in selecting participants will be given to eligible candidates who can guarantee travel coverage by own local sources). Please tick as appropriate: - I can definitely find complete travel funds from local sources ____ or - I can definitely find half my travel funds from local sources ____ I am requesting financial support from the ICTP for: - Half travel ____ - Full travel ____ - Living allowance ____ I am NOT requesting financial support from the ICTP ___ I certify that if granted funds for my travel I will attend the whole activity ................................................... Signature ____________________________________________________________________________ ________________________________ I certify that the statements made by me above are true and complete. If accepted, I undertake to refrain from engaging in any political or other activities which would reflect unfavourably on the international status of the Centre. I understand that any breach of this undertaking may result in the termination of the arrangements relating to my visit at the Centre. 
_________________________________________ ______________________ Signature of candidate Date - 5 - From mccallum at sandbox.jprc.com Mon Jan 11 19:15:50 1999 From: mccallum at sandbox.jprc.com (Andrew McCallum) Date: Mon, 11 Jan 1999 19:15:50 -0500 Subject: ML Papers & Cora: Two search engines for postscript papers Message-ID: <199901120015.TAA12092@sandbox.jprc.com> We are pleased to announce the availability of two search engines for Postscript papers on the Web. "ML Papers" provides access to Machine Learning papers. "Cora" provides access to papers on computer science as a whole. Both allow keyword searches over partial text of postscript-formatted papers they have found by spidering the Web. * For ML Papers: http://gubbio.cs.berkeley.edu/mlpapers/ * For Cora: http://www.cora.justresearch.com About ML Papers: "ML Papers", first released in 1997, is a search engine that automatically extract titles, authors and abstracts from postscript papers found on the Web; it was (to our knowledge) the first of such a form. Its index currently consists about 12,000 postscript papers, mostly related to Machine Learning, Datamining, Statistics, etc, and a Web interface provides search functionality over them. Links to poscript files and their referring pages are returned in response to queries. Titles/authors/abstracts of these papers are also displayed. You can see ML Papers at http://gubbio.cs.berkeley.edu/mlpapers/ "ML Papers," which was recently moved from MIT to UC Berkeley, was created by Andrew Ng. Its companion "Vision Papers" search engine can also be accessed at http://www.ai.mit.edu/people/ayn/cgi/vpapers. About Cora: "Cora" provides access to over 50,000 research papers on all computer science subjects. Search queries can include special operators such as +, -, "", title:, author:, reference:, and url:, (all with their typical meanings). Citation references have been processed to provide forward and backward crosslinks---showing both (1) papers referenced by the current paper, and (2) papers that reference the current paper. References have also been parsed in order to provide automatically-generated BibTeX entries. The papers are categorized into a "Yahoo-like" topic hierarchy with 75 leaves. In the near future, the citation structure will be analyzed in order to automatically identify seminal and survey articles in each category. Cora is at http://www.cora.justresearch.com "Cora" is the result of a continuing research project at Just Research, led by Andrew McCallum with interns Kamal Nigam, Jason Rennie and Kristie Seymore. Just Research is the U.S. research organization of Justsystem Corporation, the leading independent software company in Japan, and is located near the Carnegie Mellon campus. A paper describing Cora will be presented at the AAAI Spring Symposium, and can be found at http://www.cs.cmu.edu/~mccallum/papers/cora-aaaiss98.ps. Feel free to share this announcement with others. Enjoy and please send feedback. Andrew Ng ang at cs.berkeley.edu "ML Papers" Andrew McCallum mccallum at justresearch.com, mccallum at cs.cmu.edu "Cora" From jordan at CS.Berkeley.EDU Tue Jan 12 12:13:32 1999 From: jordan at CS.Berkeley.EDU (Michael Jordan) Date: Tue, 12 Jan 1999 09:13:32 -0800 (PST) Subject: Learning in Graphical Models Message-ID: <199901121713.JAA09319@orvieto.CS.Berkeley.EDU> The following book is available from MIT Press; see http://mitpress.mit.edu/promotions/books/JORLPS99 LEARNING IN GRAPHICAL MODELS Michael I. Jordan, Ed. 
Graphical models, a marriage between probability theory and graph theory, provide a natural tool for dealing with two problems that occur throughout applied mathematics and engineering--uncertainty and complexity. In particular, they play an increasingly important role in the design and analysis of machine learning algorithms. Fundamental to the idea of a graphical model is the notion of modularity: a complex system is built by combining simpler parts. Probability theory serves as the glue whereby the parts are combined, ensuring that the system as a whole is consistent and providing ways to interface models to data. Graph theory provides both an intuitively appealing interface by which humans can model highly interacting sets of variables and a data structure that lends itself naturally to the design of efficient general-purpose algorithms. PART I: INFERENCE Robert G. Cowell Uffe Kjaerulff Rina Dechter Michael I. Jordan, Zoubin Ghahramani, Tommi S. Jaakkola, and Lawrence K. Saul Tommi S. Jaakkola and Michael I. Jordan David J. C. MacKay Radford M. Neal PART II: INDEPENDENCE Thomas S. Richardson Milan Studeny and Jirina Vejnarova PART III: FOUNDATIONS FOR LEARNING David Heckerman Radford M. Neal and Geoffrey E. Hinton PART IV: LEARNING FROM DATA Christopher M. Bishop Joachim M. Buhmann Nir Friedman and Moises Goldszmidt Dan Geiger, David Heckerman, and Christopher Meek Geoffrey E. Hinton, Brian Sallans, and Zoubin Ghahramani Michael J. Kearns, Yishay Mansour, and Andrew Y. Ng Stefano Monti and Gregory F. Cooper Lawrence K. Saul and Michael I. Jordan Peter W. F. Smith and Joe Whittaker David J. Spiegelhalter, Nicky G. Best, Wally R. Gilks, and Hazel Inskip Christopher K. I. Williams Adaptive Computation and Machine Learning series 7 x 10, 648 pp. paper ISBN 0-262-60032-3 From berthouz at aidan.etl.go.jp Tue Jan 12 19:40:37 1999 From: berthouz at aidan.etl.go.jp (Luc Berthouze) Date: Wed, 13 Jan 1999 09:40:37 +0900 Subject: Emergence and Development of Embodied Cognition: - Symposium Announcement Message-ID: <199901130040.JAA03476@aidan.etl.go.jp> First International Symposium on Emergence and Development of Embodied Cognition (EDEC99) February 9, 1999. At AIST Tsukuba Research Center, Auditorium. Sponsored by Electrotechnical Laboratory (ETL), AIST, MITI and COE program by STA, Japan. Co-organized by: Dr. Yasuo Kuniyoshi (ETL) and Prof. Rolf Pfeifer (Univ. Zurich) Language: English. Participation: Open to public, limited capacity (140seats). Pre-registration strongly recommended (see our web page). Content: This is a one-day open symposium consisting of invited talks by world leading researchers converging onto the issue of interaction dynamics of embodied cognition from the fields of complex systems, biology, neuroscience, psychology, cognitive science, autonomous agents and robotics. The symposium intends to provide an overview and to demonstrate the importance of interdisciplinary collaborative efforts into this common research issue. Conference home page: All important information is located at http://www.etl.go.jp/etl/robotics/EDEC99/ Pre-Registration: Through our web page, as soon as possible, no later than Feb. 1. Fees: Please pay at the on-site registration desk in cash. Symposium: 1,000yen. (Including a handout and coffee break.) Lunchbox: Fee TBA (Recommended as the cafeteria may be crowded.) Reception: 3,000yen. (Need prior registration.) Symposium secretariats: Registration handling & web page manager: For registration, see our web page. Dr. 
Luc Berthouze, Email: berthouz at etl.go.jp Humanoid Interaction Lab., Intelligent Systems Division, Electrotechnical Laboratory. Local arrangements: Ms. Yoko Sato, Email: yosato at etl.go.jp Tel.:+81-298-54-5180 Fax.:+81-298-54-5971 Humanoid Interaction Lab., Intelligent Systems Division, Electrotechnical Laboratory. Program 9:00 Registration 9:30 Yasuo Kuniyoshi Opening Address - Interdisciplinary EDEC initiative. 9:40 Rolf Pfeifer Dynamics, Morphology, and Materials in The Emergence of Cognition 10:15 Esther Thelen Developmental Foundations of Embodied Cognition 10:50 Linda Smith The Task Dynamics of the A not-B Error 11:25 Kazuo Hiraki Prediction, Habituation and Attention in the Development of Spatial Cognition: Eye-Tracking Data of Infants. 12:00 Lunch 13:00 Shoji Itakura Comparative Cognitive Approach -- Ontogeny, Phylogeny, and 'Robogeny': In the Case of Primate Social Cognition 13:35 Gentaro Taga Complex Systems Approach to Development of Action and Perception of Infants 14:10 Yasuo Kuniyoshi Towards Emergence and Development of Meaningful Interaction Structures through Complex Embodiment - A Humanoid Robot 14:35 Luc Berthouze Emergence of Embodied Interaction: The Internal Dynamics Perspective 15:00 Break 15:15 Olaf Sporns Synthetic Neural Modeling: An Approach to Study the Interaction of Neural Dynamics and Behavior 15:50 Philippe Gaussier From dynamical behaviors to dynamical perception 16:25 Gregor Schoener The Dynamic Field and Its Preshaping: Concepts toward a General Theoretical Framework of Embodied Cognition. 17:00 Takashi Ikegami Simulating a "Theory of Mind" in Coupled Dynamical Recognizers --- Embodiment as Dynamic Interfaces --- 17:35 Closing Discussions 18:00 Symposium Closes 18:30 Reception. List of Presenters 1. Prof. Rolf PFEIFER Director of AI Lab, Computer Science Department, University of Zurich 2. Prof. Esther THELEN Professor, Psychology & Cognitive Science, Indiana University 3. Prof. Linda SMITH Professor, Psychology & Cognitive Science, Indiana University 4. Dr. Kazuo HIRAKI Senior Research Scientist, Information Science Division, Electrotechnical Laboratory 5. Prof. Shoji ITAKURA Associate Professor, Department of Health Sciences, Oita University of Nursing and Health Sciences 6. Prof. Gentaro TAGA Research Assistant Professor, Department of Pure and Applied Sciences, The University of Tokyo 7. Dr. Yasuo KUNIYOSHI Senior Research Scientist, Intelligent Systems Division, Electrotechnical Laboratory 8. Dr. Luc BERTHOUZE Research Scientist, Intelligent Systems Division, Electrotechnical Laboratory 9. Dr. Olaf SPORNS Senior Fellow in Theoretical and Experimental Neurobiology, The Neurosciences Institute 10. Prof. Philippe GAUSSIER Professor, The Image and Signal Processing Lab, ENSEA - The Cergy Pontoise University 11. Prof. Gregor SCHOENER Director, National Research Center of Cognitive Neuroscience, CNRS 12. Prof. Takashi IKEGAMI Associate Professor, Institute of Physics, College of Arts and Sciences, The University of Tokyo --------- Dr. Luc Berthouze, Research Scientist, Intelligent Systems Division Electrotechnical Laboratory (ETL), AIST, MITI, Japan. Tel.+81-298-54-5369 Fax.+81-298-54-5971 1-1-4 Umezono, Tsukuba 305-8568, Japan. Email: berthouz at etl.go.jp http://www.etl.go.jp/~berthouz From aslin at cvs.rochester.edu Wed Jan 13 11:19:10 1999 From: aslin at cvs.rochester.edu (Richard N. Aslin) Date: Wed, 13 Jan 1999 11:19:10 -0500 Subject: postdoc positions at University of Rochester Message-ID: POSTDOCTORAL FELLOWSHIPS, UNIVERSITY OF ROCHESTER. 
The Department of Brain and Cognitive Sciences seeks two outstanding postdoctoral fellows with research interests in learning and/or developmental cognitive science. One fellowship is affiliated with a NIH training grant in Learning, Development, and Behavior, and the other fellowship is affiliated with a NSF grant on Learning and Intelligent Systems. Supervising faculty for both fellowships work on problems of learning and development using behavioral, computational, and neurobiological approaches. Candidates should have prior background in at least one of these approaches and an interest in working collaboratively in a highly interdisciplinary setting. Several faculty have special interest in statistical learning in the domains of language and perception, although a commitment to this interest is associated only with the NSF fellowship. The NIH fellowship is open only to US citizens or permanent residents. Applicants should send a letter describing their graduate training and research interests, a curriculum vitae, and arrange to have three letters of recommendation sent to: Professor Richard N. Aslin, Department of Brain and Cognitive Sciences, Meliora Hall, University of Rochester, Rochester, NY 14627-0268. Review of applications will begin on February 15, 1999 and continue until one or both positions are filled, with an expected start date of June/August, 1999. Applicants can learn about the department, its faculty, and the opportunities for training by referring to our Web page (http://www.bcs.rochester.edu). Applications from women and members of underrepresented minority groups are especially welcome. The University of Rochester is an Equal Opportunity Employer. -------------------------------------------------------- Richard N. Aslin Department of Brain and Cognitive Sciences and the Center for Visual Science Meliora Hall University of Rochester Rochester, NY 14627 email: aslin at cvs.rochester.edu FAX: (716) 442-9216 Office: (716) 275-8687 http://www.cvs.rochester.edu/people/r_aslin/r_aslin.html From jagota at cse.ucsc.edu Thu Jan 14 14:49:58 1999 From: jagota at cse.ucsc.edu (Arun Jagota) Date: Thu, 14 Jan 1999 11:49:58 -0800 (PST) Subject: new survey e-publication Message-ID: <199901141949.LAA29667@arapaho.cse.ucsc.edu> New refereed e-publication action editor: John Shawe-Taylor T. B. Ludermir, A. de Carvalho, A. P. Braga, M. C. P. de Souto, Weightless neural models: a review of current and past works Neural Computing Surveys 2, 41--61, 1999. 108 references. http://www.icsi.berkeley.edu/~jagota/NCS Abstract: This paper presents a survey of a class of neural models known as Weightless Neural Networks (WNNs). As the name suggests, these models do not use weighted connections between nodes. Instead, a different kind of neuron model, usually based on RAM memory devices, is used. In the literature, the terms ``RAM-based'' and ``n-tuple based'' systems are also commonly used to refer to WNNs. WNNs are being widely investigated, motivating relevant applications and two international workshops in the last few years. The paper describes the most important works in WNNs found in the literature, pointing out the challenges and future directions in the area. A comparative study between weightless and weighted models is also presented. 
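To give a flavor of the class of models being surveyed, a minimal sketch of one common weightless scheme (a WiSARD-style n-tuple discriminator) follows, written in Python. It illustrates only the general RAM-based idea, not the specific models reviewed in the paper, and the class names, parameters, and toy data are arbitrary choices for the example.

import random

class NTupleDiscriminator:
    """One discriminator of a WiSARD-style weightless (RAM-based) network.

    Each RAM node watches a fixed n-tuple of binary input positions; the
    observed bits form an address into a small lookup table. Training marks
    the addressed location; recall counts how many RAM nodes have already
    seen the presented address.
    """

    def __init__(self, input_size, n, rng):
        positions = list(range(input_size))
        rng.shuffle(positions)
        # Partition the (shuffled) input positions into n-tuples, one per RAM node.
        self.tuples = [positions[i:i + n] for i in range(0, input_size, n)]
        # Sparse RAM: store only the addresses that have been written with a 1.
        self.rams = [set() for _ in self.tuples]

    def _address(self, bits, positions):
        return tuple(bits[p] for p in positions)

    def train(self, bits):
        for ram, positions in zip(self.rams, self.tuples):
            ram.add(self._address(bits, positions))

    def response(self, bits):
        return sum(self._address(bits, positions) in ram
                   for ram, positions in zip(self.rams, self.tuples))


class WeightlessClassifier:
    """One discriminator per class; the strongest response wins."""

    def __init__(self, input_size, n=4, seed=0):
        self.input_size, self.n = input_size, n
        self.rng = random.Random(seed)
        self.discriminators = {}

    def train(self, bits, label):
        if label not in self.discriminators:
            self.discriminators[label] = NTupleDiscriminator(
                self.input_size, self.n, self.rng)
        self.discriminators[label].train(bits)

    def classify(self, bits):
        return max(self.discriminators,
                   key=lambda label: self.discriminators[label].response(bits))


# Toy usage: two 8-bit patterns, one per class.
clf = WeightlessClassifier(input_size=8, n=2)
clf.train([1, 1, 1, 1, 0, 0, 0, 0], "A")
clf.train([0, 0, 0, 0, 1, 1, 1, 1], "B")
print(clf.classify([1, 1, 1, 0, 0, 0, 0, 0]))  # prints "A": the test pattern is closest to the first class

The design choice that makes such a model "weightless" is visible directly: training writes addresses into lookup tables rather than adjusting connection weights.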
From petridis at eng.auth.gr Mon Jan 18 05:41:38 1999 From: petridis at eng.auth.gr (Vassilis Petridis) Date: Mon, 18 Jan 1999 12:41:38 +0200 Subject: A new paper on clustering disparate data Message-ID: <36A30FE2.388413BD@vergina.eng.auth.gr> Dear Connectionists, We would like to inform you that the paper V. Petridis and V.G. Kaburlasos, "Fuzzy Lattice Neural Network (FLNN): A Hybrid Model for Learning", IEEE Transactions on Neural Networks, vol. 9, no. 5, September 1998, pp. 877-890, can be accessed at http://skiron.control.ee.auth.gr/post1990.html As stated in the abstract of the above paper, the FLNN draws on Carpenter-Grossberg's Adaptive Resonance Theory (ART) as well as on Simpson's Min-Max neurocomputing principles. Nevertheless, we would like to point out here that the FLNN is not merely a modified version of these well-known neural paradigms. The important difference is that the FLNN is applicable to data types with the structure of a mathematical lattice. As a consequence, the FLNN is applicable not only to the conventional Euclidean space but to other domains as well. For instance, in the above paper we demonstrate a learning example in the domain of fuzzy sets over a universe of discourse. We treat other domains in forthcoming publications. Learning and decision-making by the FLNN can both make common sense and be subject to rigorous mathematical analysis. The FLNN is a specific scheme within the framework of fuzzy lattices (or FL-framework), which is presented briefly in the above paper. Apart from (possible) interest in the FLNN's "off the mainstream" mathematics, there is significant practical potential in employing the FLNN, namely the capacity to treat disparate data jointly and rigorously. For example, the FLNN can treat simultaneously - and with rigour - such disparate data as real numbers, fuzzy sets, propositional statements, symbols, etc. An example of processing disparate data is information filtering and retrieval from the web. We point out that, to the best of our knowledge, this capacity is unique to the FLNN. In this sense the FLNN can emulate the human capacity for jointly processing disparate data. Comments are welcome. --------------------------------------------------------------------- Professor Vassilios Petridis Dept. of Electrical and Computer Eng. Faculty of Engineering Aristotle University of Thessaloniki GR54006 Thessaloniki, GREECE --------------------------------------------------------------------- email: petridis at vergina.eng.auth.gr phone: +3031 996331 fax : +3031 996367 web : http://control.ee.auth.gr/

From maass at igi.tu-graz.ac.at Mon Jan 18 13:24:10 1999 From: maass at igi.tu-graz.ac.at (Wolfgang Maass) Date: Mon, 18 Jan 1999 19:24:10 +0100 Subject: Book on Pulsed Neural Networks Message-ID: <36A37C4A.D1BC3AAF@igi.tu-graz.ac.at> The following book has just appeared at MIT-Press: PULSED NEURAL NETWORKS edited by Wolfgang Maass and Christopher M. Bishop Contributors: Peter S. Burge, Stephen R. Deiss, Rodney J. Douglas, John G. Elias, Wulfram Gerstner, Alister Hamilton, David Horn, Axel Jahnke, Richard Kempter, Wolfgang Maass, Alessandro Mortara, Alan F. Murray, David P. M. Northmore, Irit Opher, Kostas A. Papathanasiou, Michael Recce, Barry J. P. Rising, Ulrich Roth, Tim Schönauer, Terrence J. Sejnowski, John Shawe-Taylor, Max R. van Daalen, J. Leo van Hemmen, Philippe Venier, Hermann Wagner, Adrian M. Whatley, Anthony M. Zador.
Most artificial neural network models are inspired by models for biological neural systems where the output of a neuron is encoded exclusively in its firing rate: the output of a computational unit in an artificial neural network is a (static) binary or continuous variable that may be viewed as a representation (or abstraction) of the current firing rate of a biological neuron. In recent years, however, data from neurobiological experiments have made it increasingly clear that biological neural networks, which communicate through pulses (called action potentials or spikes), also use the timing of these pulses to transmit information and to perform computation. This realization has stimulated a significant growth of research activity in the area of pulsed neural networks, ranging from neurobiological modeling and theoretical analyses to algorithm development and hardware implementations. Obviously, quite different theoretical tools and models have to be developed for this purpose, since almost all traditional computational models (including most artificial neural network models) are based on the assumption that the timing of atomic computational events does not depend in an essential way on the input to the computation (one example is the common assumption that parallel computation steps are synchronized; another is the assumption that their timing is largely stochastic). For implementations in novel electronic hardware, artificial pulsed neural networks offer the possibility of creating intriguing combinations of ideas from analog and digital circuits: a pulse has a stereotyped form, hence it may be viewed as a digital signal; on the other hand, the timing of a pulse may encode an analog variable. The research reported in this book is motivated both by the desire to enhance our understanding of information processing in biological networks and by the goal of developing new information processing technologies. Our aim in producing this book has been to provide a first comprehensive treatment of the field of pulsed neural networks, which will be accessible to researchers from diverse disciplines such as electrical engineering, signal processing, computer science, physics, and computational neuroscience. By virtue of its pedagogical emphasis, it will also find a place in many of the advanced undergraduate and graduate courses in neural networks now taught in many universities. Terrence J. Sejnowski's foreword, "Neural Pulse Coding," presents an overview of the topic. The first half of the book consists of longer tutorial articles spanning neurobiology, theory, algorithms, and hardware. The second half contains a larger number of shorter research chapters that present more advanced concepts. The contributors use consistent notation and terminology throughout the book. 408 pp., 195 illus., cloth ISBN 0-262-13350-4 MIT-Press; A Bradford Book For further information on this book visit http://www.cis.tu-graz.ac.at/igi/maass/PNN.html MIT-Press catalogue: http://mitpress.mit.edu/promotions/books/MAAPHS99 Amazon bookstore ordering information: http://www.amazon.com/exec/obidos/ASIN/0262133504/qid%3D916171995/002-1728549-3620249 -------------------------------------------------------------------------- Christopher M. Bishop is Senior Researcher at Microsoft Research, Cambridge, and Professor of Computer Science at the University of Edinburgh. Wolfgang Maass is Professor at the Institute for Theoretical Computer Science, Technische Universität Graz, Austria.
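The point that the timing of a pulse can itself carry an analog value can be illustrated with a minimal sketch (not taken from the book) of a leaky integrate-and-fire unit whose first-spike latency shrinks as its input grows; the function name and parameter values below are arbitrary choices for the example, written in Python.

def time_to_first_spike(input_current, tau=20.0, threshold=1.0, dt=0.1, t_max=100.0):
    """Leaky integrate-and-fire unit driven by a constant input current.

    Euler-integrates dv/dt = (-v + input_current) / tau from v = 0 and
    returns the time at which v first crosses threshold, or None if the
    input is too weak to reach threshold within t_max.
    """
    v, t = 0.0, 0.0
    while t < t_max:
        v += dt * (-v + input_current) / tau
        t += dt
        if v >= threshold:
            return t
    return None


# Stronger inputs fire earlier, so the spike latency itself carries an
# analog value (time-to-first-spike coding).
for current in (1.1, 1.5, 2.0, 4.0):
    print(current, time_to_first_spike(current))

Reading an analog quantity off the latency of the first spike in this way is one simple instance of the general idea described above.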
From wsenn at iam.unibe.ch Tue Jan 19 05:17:47 1999 From: wsenn at iam.unibe.ch (Walter Senn) Date: Tue, 19 Jan 1999 10:17:47 +0000 Subject: Position Announcements Message-ID: <36A45BCB.6E351A6E@iam.unibe.ch> POSTDOC AND PhD POSITIONS IN COMPUTATIONAL NEUROSCIENCE The University of Bern, Switzerland, is building an interdisciplinary research group in Computational Neuroscience supported by the Department of Physiology and the Institute of Mathematics. As part of this effort, several positions will become available on 1/4/1999: 1 Post-doctoral position in Computational Neuroscience (up to 3 years) 1 Post-doctoral position in Experimental Neuroscience (up to 3 years) 1 Pre-doctoral (Ph.D.) position in Computational Neuroscience (4 years) The theoretical work of the existing group focuses on short-term synaptic adaptation, single neuron computation and dynamics of networks of spiking neurons. The recent experimental studies at the Physiological Institute concern LTP/LTD experiments on activity-dependent cortical synapses, dendritic signal processing and the culturing of networks on multi-electrode arrays. Candidates with research activities in one of these fields are encouraged to apply, but strong candidates with research in a related topic will also be considered. For additional information see http://pylp76.unibe.ch/~fniwww/jobs/NCjobsMore.html. Please send a letter of application along with CV, publication list, brief statement of current research, and two letters of recommendation before February 20, 1999, to Dr. Walter Senn Department of Physiology University of Bern Bühlplatz 5 CH-3012 Bern, Switzerland e-mail: wsenn at iam.unibe.ch FAX ++41 31 631 46 11

From bricolo at sissa.it Tue Jan 19 08:15:22 1999 From: bricolo at sissa.it (Emanuela Bricolo) Date: Tue, 19 Jan 1999 14:15:22 +0100 (NFT) Subject: pre and postdoctoral positions Message-ID: <199901191315.OAA89766@shannon.sissa.it> A non-text attachment was scrubbed... Name: not available Type: text Size: 3015 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/63c8d4c7/attachment.ksh

From gert at cogsci.ed.ac.uk Tue Jan 19 14:17:06 1999 From: gert at cogsci.ed.ac.uk (gert@cogsci.ed.ac.uk) Date: Tue, 19 Jan 1999 19:17:06 +0000 (GMT) Subject: CFP: Workshop Biologically Inspired Machine Learning Message-ID: <8505.199901191917@finlay.cogsci.ed.ac.uk> A non-text attachment was scrubbed... Name: not available Type: text Size: 4004 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/4c09234c/attachment.ksh

From oreilly at grey.colorado.edu Tue Jan 19 20:00:43 1999 From: oreilly at grey.colorado.edu (Randall C. O'Reilly) Date: Tue, 19 Jan 1999 18:00:43 -0700 Subject: TR on Hippocampus and Neocortex Message-ID: <199901200100.SAA21605@grey.colorado.edu> The following technical report is now available for downloading as: ftp://grey.colorado.edu/pub/oreilly/papers/oreillyrudy99_hipconj_tr.ps - Randy Conjunctive Representations in Learning and Memory: Principles of Cortical and Hippocampal Function Randall C. O'Reilly and Jerry W. Rudy Department of Psychology University of Colorado Boulder, CO 80309 ICS Technical Report 99-01 Abstract: We present a theoretical framework for understanding the roles of the hippocampus and neocortex in learning and memory.
This framework incorporates a theme found in many theories of hippocampal function, that the hippocampus is responsible for developing conjunctive representations binding together stimulus elements into a unitary representation that can later be recalled from partial input cues. This idea appears problematic, however, because it is contradicted by the fact that hippocampally lesioned rats can learn nonlinear discrimination problems that require conjunctive representations. Our framework accommodates this finding by establishing a principled division of labor between the cortex and hippocampus, where the cortex is responsible for slow learning that integrates over multiple experiences to extract generalities, while the hippocampus performs rapid learning of the arbitrary contents of individual experiences. This framework shows that nonlinear discrimination problems are not good tests of hippocampal function, and suggests that tasks involving rapid, incidental conjunctive learning are better. We implement this framework in a computational neural network model, and show that it can account for a wide range of data in animal learning, thus validating our theoretical ideas, and providing a number of insights and predictions about these learning phenomena. +-----------------------------------------------------------------------------+ | Dr. Randall C. O'Reilly | | | Assistant Professor | | | Department of Psychology | Phone: (303) 492-0054 | | University of Colorado Boulder | Fax: (303) 492-2967 | | Muenzinger D251C | Home: (303) 448-1810 | | Campus Box 345 | email: oreilly at psych.colorado.edu | | Boulder, CO 80309-0345 | www: http://psych.colorado.edu/~oreilly | +-----------------------------------------------------------------------------+

From sahami at Robotics.Stanford.EDU Tue Jan 19 22:01:39 1999 From: sahami at Robotics.Stanford.EDU (Mehran Sahami) Date: Tue, 19 Jan 1999 19:01:39 -0800 (PST) Subject: PhD Thesis on Machine Learning/Information Access Message-ID: <199901200301.TAA28996@luminous.Stanford.EDU> [Apologies if you receive this more than once.] Dear colleagues, I am very pleased to announce the availability of my PhD thesis, entitled "Using Machine Learning to Improve Information Access" at the following URL: http://robotics.stanford.edu/users/sahami/papers-dir/thesis.ps The dissertation examines the use of novel clustering, feature selection and classification algorithms applied to text data (as well as some non-text domains). It also presents a working system, SONIA, that makes use of these technologies to enable the automatic topical organization of retrieval results. The table of contents and a more detailed abstract are appended below. Best, Mehran ------------------+---------------------------------- Mehran Sahami | http://xenon.stanford.edu/~sahami Systems Scientist | phone: (650) 496-2399 Epiphany, Inc.
| http://www.epiphany.com ------------------+---------------------------------- ---------------------------------------------------------------------- Using Machine Learning to Improve Information Access Part I: Preliminaries Chapter 1: Introduction 1.1 Challenges of Information Access 1.2 System Overview 1.3 Reader's Guide Chapter 2: Document Representation 2.1 Defining a Vector Space 2.2 Controlling Dimensionality Chapter 3: Probabilistic Framework 3.1 Bayesian Networks 3.2 Machine Learning Overview Chapter 4: Related Work in Information Access 4.1 Probabilistic Retrieval 4.2 Feature Selection for Text 4.3 Document Clustering 4.4 Document Classification Part II: Clustering Chapter 5: Feature Selection for Clustering 5.1 Introduction 5.2 Mixture Modeling Revisited 5.3 Theoretical Underpinnings 5.4 Feature Selection Algorithms 5.5 Empirical Results 5.6 Conclusions Chapter 6: A New Model for Document Clustering 6.1 Introduction 6.2 Probabilistic Document Overlap 6.3 Clustering Algorithms 6.4 Results 6.5 Comparison With Mixture Modeling 6.6 Conclusion Part III: Classification Chapter 7: Feature Selection for Classification 7.1 Introduction 7.2 Theoretical Framework 7.3 An Approximate Algorithm 7.4 Initial Results on Non-Text Domains 7.5 Results on Text Domains 7.6 Conclusions Chapter 8: Limited Dependence Bayesian Classifiers 8.1 Introduction 8.2 Probabilistic Classification Models 8.3 The KDB Algorithm 8.4 Initial Results on Non-Text Domains 8.5 Results on Text Domains 8.6 Conclusions and Related Work Chapter 9: Hierarchical Classification 9.1 Introduction 9.2 Hierarchical Classification Scheme 9.3 Results 9.4 Extensions to Directed Acyclic Graphs 9.5 Conclusions Part IV: Putting It All Together Chapter 10: SONIA -- A Complete System 10.1 Introduction 10.2 SONIA on the InfoBus 10.3 A Component View SONIA 10.4 Examples of System Usage 10.5 Conclusions Chapter 11: Conclusions and Future Work 11.1 Where Have We Been? 11.2 Where Are We Going? ABSTRACT The explosion of on-line information has given rise to many query-based search engines (such as Alta Vista) and manually constructed topic hierarchies (such as Yahoo!). But with the current growth rate in the amount of information, query results grow incomprehensibly large and manual classification in topic hierarchies creates an immense information bottleneck. Therefore, these tools are rapidly becoming inadequate for addressing users' information needs. In this dissertation, we address these problems with a system for topical information space navigation that combines the query-based and taxonomic approaches. Our system, named SONIA (Service for Organizing Networked Information Autonomously), is implemented as part of the Stanford Digital Libraries testbed. It enables the creation of dynamic hierarchical document categorizations based on the full-text of articles. Using probability theory as a formal foundation, we develop several Machine Learning methods to allow document collections to be automatically organized at a topical level. First, to generate such topical hierarchies, we employ a novel probabilistic clustering scheme that outperforms traditional methods used in both Information Retrieval and Probabilistic Reasoning. Furthermore, we develop methods for classifying new articles into such automatically generated, or existing manually generated, hierarchies. 
In contrast to standard classification approaches which do not make use of the taxonomic relations in a topic hierarchy, our method explicitly uses the existing hierarchical relationships between topics, leading to improvements in classification accuracy. Much of this improvement is derived from the fact that the classification decisions in such a hierarchy can be made by considering only the presence (or absence) of a small number of features (words) in each document. The choice of relevant words is made using a novel information-theoretic algorithm for feature selection. Many of the components developed as part of SONIA are also general enough that they have been successfully applied to data mining problems in domains other than text. The integration of hierarchical clustering and classification will allow large amounts of information to be organized and presented to users in an individualized and comprehensible way. By alleviating the information bottleneck, we hope to help users with the problems of information access on the Internet.

From giacomo at ini.phys.ethz.ch Wed Jan 20 12:23:27 1999 From: giacomo at ini.phys.ethz.ch (Giacomo Indiveri) Date: Wed, 20 Jan 1999 17:23:27 +0000 Subject: TELLURIDE NEUROMORPHIC ENGINEERING WORKSHOP Message-ID: <36A6110F.A2AAC191@ini.phys.ethz.ch> We invite applications for an exciting three-week summer workshop on Neuromorphic Engineering that will be held in Telluride, Colorado from Sunday, June 27 to Saturday, July 17, 1999. Details of the workshop and application instructions are at the URL: http://www.ini.unizh.ch/telluride99 -- Avis COHEN (University of Maryland) Rodney DOUGLAS (Institute of Neuroinformatics, UNI/ETH Zurich, Switzerland) Christof KOCH (California Institute of Technology) Terrence SEJNOWSKI (Salk Institute and UCSD) Shihab SHAMMA (University of Maryland) and Timmer HORIUCHI (Johns Hopkins University) Giacomo INDIVERI (Institute of Neuroinformatics, UNI/ETH Zurich, Switzerland)

From a.sharkey at dcs.shef.ac.uk Wed Jan 20 10:34:13 1999 From: a.sharkey at dcs.shef.ac.uk (Amanda Sharkey) Date: Wed, 20 Jan 1999 15:34:13 +0000 (GMT) Subject: IEE Colloquium on Condition Monitoring Message-ID: IEE Colloquium/NCAF Meeting: Birmingham, UK, 22nd-23rd APRIL 1999 Condition Monitoring: machinery, external structures and health. --------------------------------------------------------------- Neural computing and other computational intelligence techniques can be usefully employed in the areas of Condition Monitoring and Fault Diagnosis of machines and external structures (e.g. bridges), and of Health Monitoring in medicine. Although these areas are usually treated quite separately, they share a number of common issues and solutions, and should benefit from a cross-fertilisation of ideas. The speakers at this two-day colloquium will provide comprehensive reviews of recent research and techniques employed in the domains of condition monitoring, health monitoring, and fault diagnosis, with the underlying aim of facilitating an exchange of ideas and solutions. Organisers: Dr Peter Cowley (Rolls-Royce), Dr Amanda Sharkey (University of Sheffield), Dr Keith Worden (University of Sheffield).
Provisional Programme: 22nd April 09.00-09.30 Registration and Coffee 09.30-10.00 Welcome and Introduction Dr Peter Cowley Rolls-Royce 10.00-11.00 VIBRO-ACOUSTIC CONDITION MONITORING Professor Czeslaw Cempel Poznan University of Technology 11.00-11.30 Coffee and POSTER Session 11.30-12.30 THE LOS ALAMOS HEALTH MONITORING SURVEY Dr Chuck Farrar Los Alamos National Laboratories 12.30-13.30 Lunch 13.30-14.30 CONDITION MONITORING OF ELECTROMECHANICAL PLANT AND CIVIL STRUCTURES Professor James Penman University of Aberdeen 14.30-15.30 NOVELTY DETECTION IN JET ENGINES Professor Lionel Tarassenko University of Oxford. 15.30-16.00 TEA and POSTER Session 16.00-17.00 FAULT DIAGNOSIS FROM A PROCESS CONTROL PERSPECTIVE Professor Ron Patton University of Hull. Provisional Programme: 23rd April 09.30-10.30 ACOUSTIC EMISSION LOCATION USING FOUR SENSORS Dr. Paul Wells British Aerospace 10.30-11.00 Coffee and POSTER Session 11.00-12.00 FAULT DIAGNOSIS FOR CLOSED-LOOP DRUG INFUSION Professor Derek Linkens and Dr M.F. Abbod University of Sheffield 12.00-13.30 Lunch and POSTER Session 13.30-14.30 TECHNICAL AND MEDICAL CONSULTING SYSTEMS USING BAYES NETS Dr Volker Tresp Siemens 14.30-15.30 WHY I AM NOT A NON-BAYESIAN Professor Mahesan Niranjan University of Sheffield 15.30-16.00 Tea 16.00-17.00 Panel Discussion 17.00 Close To register for the above event (both or either of the days) please contact: Events Office, IEE Savoy Place, London WC2R OBL, tel: +44(0)171 240 1871 ext 2205/6 fax +44 (0)171 497 3633 or email: events at iee.org.uk Those interested in presenting a poster summarising research related to the themes of condition monitoring are invited to submit an abstract. Abstracts will be reviewed, and authors will be notified of acceptance. Deadlines for poster abstracts: March 1st 1999 Method for submitting poster abstracts: email to A.Sharkey at dcs.shef.ac.uk or send a hard copy to Dr Amanda Sharkey, Department of Computer Science, Regent Court, Portobello Rd, University of Sheffield, S1 4DP Notification of acceptance of poster abstract: 2nd April 1999 From Alexander.Riegler at univie.ac.at Wed Jan 20 09:19:50 1999 From: Alexander.Riegler at univie.ac.at (Alexander Riegler) Date: Wed, 20 Jan 1999 15:19:50 +0100 (MEZ) Subject: CFP: New Trends in Cognitive Science 99 Message-ID: After the success of the first New Trends in Cognitive Science conference in Vienna, Austria, we are pleased to announce its successor. While in 1997 we focused on the problem of representation (for details see http://www.univie.ac.at/cognition/ntcs97.htm), we will this year put emphasis on the notion of computationalism and its future in the cognitive sciences. Please have a look at the attached Call For Papers or the conference homepage at http://www.univie.ac.at/cognition/conf/ntcs99/ for more information. We are looking forward to welcoming you! Alex Riegler Austrian Society of Cognitive Science New Trends in Cognitive Science 1999 C o m p u t a t i o n a l i s m -- T h e N e x t G e n e r a t i o n International Conference and Workshop Vienna, Austria, May 17-20, 1999 http://www.univie.ac.at/cognition/conf/ntcs99/ Deadline for submissions: February 15, 1999 Invited speakers ---------------- Phil AGRE University of California, Los Angeles Rainer BORN University of Linz Jack B. COPELAND University of Canterbury Adrian CUSSINS University of Illinois, Urbana Stevan HARNAD University of Southampton John HAUGELAND University of Pittsburgh David ISRAEL SRI International Brian C. 
SMITH Indiana University, Bloomington Purpose ------- This international conference and workshop organized by the Austrian Society of Cognitive Science attempts to bring together theorists working on identifying a "successor" notion of computation--one that not only respects the classical (and critical) limiting results about algorithms, grammars, complexity bounds, etc., but that also does justice to real-world concerns of daily computational practice, and thereby offers a much better chance of serving as a possible foundation for a realistic theory of mind. The workshop will focus on the prospects for developing a theory that takes computing to be not abstract, syntactic, disembodied, isolated, and non-intentional, but concrete, semantic, embodied, interactive, and intentional. If such a successor notion of computation can be defined, the resulting rehabilitated computationalism may still be our best bet for explaining cognition. It is hoped that this conference will set the agenda for a "philosophy of computation" that will tackle such issues as: the program/process distinction; the notion of implementation and questions of physical realization; real-time constraints and real-world interaction; the use and limitations of models; relations between concrete and abstract; the proper interpretation of complexity results; etc. Addressing such questions is a critical prerequisite for providing a firm foundation for cognitive science in the new century. Paper submission ---------------- Submitted manuscripts should be between 4000 and 5000 words in length and typed double-spaced on one side of plain paper, with wide margins to allow for editorial notes. The first page of the manuscript should contain only the author's name and affiliation address, the article title, and an abstract of about 100-150 words. Each page of the manuscript should be consecutively numbered, including pages of references. References should be listed at the end of the article in alphabetical and chronological order. Notes should be placed at the bottom of each page as footnotes and numbered consecutively. Reviewing will be blind to the identities of the authors, which requires that authors exercise some care not to identify themselves in their papers. Three hard copies of the manuscript should be sent to either Matthias Scheutz Institut fuer Wissenschaftstheorie Universitaet Wien Sensengasse 8/10 A-1090 Wien AUSTRIA or Matthias Scheutz Department of Computer Science and Engineering University of Notre Dame Notre Dame, IN 46556 USA Conference Site --------------- The conference will take place in the festival hall at the University of Vienna, located in Vienna's historical first district.
More information ---------------- For details see http://www.univie.ac.at/cognition/conf/ntcs99/ or contact Matthias Scheutz at matthias.scheutz at univie.ac.at From xwu at gauss.Mines.EDU Thu Jan 21 18:19:32 1999 From: xwu at gauss.Mines.EDU (Xindong Wu) Date: Thu, 21 Jan 1999 16:19:32 -0700 (MST) Subject: Knowledge and Information Systems: Vol 1 No 1 (1999) Message-ID: <199901212319.QAA04124@gauss> Knowledge and Information Systems: An International Journal ----------------------------------------------------------- ISSN 0219-1377 by Springer-Verlag Home Page: http://kais.mines.edu/~kais/ ======================================= Volume 1 Number 1 (1999): Table of Contents ------------------------------------------- Regular Papers - Data Preparation for Mining World Wide Web Browsing Patterns, by Robert Cooley, Bamshad Mobasher, and Jaideep Srivastava - Data Mining via Discretization, Generalization and Rough Set Feature Selection, by Xiaohua Hu and Nick Cercone - Towards Automated Case Knowledge Discovery in the M2 Case-Based Reasoning System, by D. Patterson, S.S. Anand, W. Dubitzky, and J.G. Hughes - Learning from Batched Data: Model Combination Versus Data Combination, by Kai Ming Ting, Boon Toh Low, and Ian H. Witten Short Papers - Comparative Evaluation of Two Neural Network Based Techniques for Classification of Microcalcifications in Digital Mammograms, by Brijesh Verma - Managing Null Entries in Pairwise Comparisons, by Waldemar W. Koczkodaj, Michael W. Herman, and Marian Orlowski ---------------------------------------------------------------------- A subscription form and other accepted papers are available on the journal home page (http://kais.mines.edu/~kais/). From bert at mbfys.kun.nl Thu Jan 21 10:48:19 1999 From: bert at mbfys.kun.nl (Bert Kappen) Date: Thu, 21 Jan 1999 16:48:19 +0100 (MET) Subject: Postdoc position available at SNN Nijmegen Message-ID: <199901211548.QAA13514@bertus.mbfys.kun.nl> Post doc position available at SNN, University of Nijmegen, the Netherlands. Background: The group consists of 10 researchers and PhD students and conducts theoretical and applied research on neural networks and Bayesian methods. The group is part of the Laboratory of Biophysics which is involved in experimental brain science. Recent research of the group has focused on theoretical description of learning processes using the theory of stochastic processes and the design of efficient learning rules for Boltzmann machines and other graphical models using techniques from statistical mechanics; the extraction of rules from data and the integration of knowledge and data for modeling; the design of robust methods for confidence estimation with neural networks. Applied research is conducted on computer assisted medical diagnosis and prediction tasks. Since 1997, SNN Nijmegen has founded a company which sells commercial services and products in the field of neural networks, AI and statistics. For more information see also http://www.mbfys.kun.nl/SNN Job specification: The tasks of the post-doc will be to conduct independent research in one of the above areas. In addition, it is expected that the post-doc will initiate novel research and will assist in the supervision of PhD students. The postdoc should have a PhD in physics, mathematics or computer science and a strong theoretical background in neural networks. The post-doc salary will be maximally Dfl. 7396 per month, depending on experience. The position is available for 2 years with possible extension to 4 years. 
Applications: Interested candidates should send a letter with a CV and list of publications before 1997 to dr. H.J. Kappen, Stichting Neurale Netwerken, University of Nijmegen, Geert Grooteplein 21, 6525 EZ Nijmegen. For information contact dr. H.J. Kappen, +31 24 3614241. From bert at mbfys.kun.nl Fri Jan 22 04:36:06 1999 From: bert at mbfys.kun.nl (Bert Kappen) Date: Fri, 22 Jan 1999 10:36:06 +0100 (MET) Subject: postdoc available at SNN Nijmegen (correction) Message-ID: <199901220936.KAA13964@bertus.mbfys.kun.nl> The announcement for a postdoc position at SNN Nijmegen mentions 1997 as deadline for submission of applications. This should be february 20 1999. Sorry for the mess-up. Bert Kappen From fritzke at inf.tu-dresden.de Mon Jan 25 09:00:50 1999 From: fritzke at inf.tu-dresden.de (Bernd Fritzke) Date: Mon, 25 Jan 1999 15:00:50 +0100 Subject: Habilitation thesis on "Vector-based neural networks" available (in german!) Message-ID: <36AC7912.8410FACC@inf.tu-dresden.de> Dear Colleagues, This is to announce the availability of my habilitation thesis "Vektorbasierte Neuronale Netze" (engl: Vector-based neural networks) which is meant to be a comprehensive overview of those neural network architectures which are characterized by prototype vectors (e.g. SOM, Neural Gas, GNG, RBFN) as well as related statistical methods (e.g. LBG, k-means, k-NN). Due to some ancient university law I had to write this thing in German, which is kind of a pity, since this is not quite the general language of science 8v). Those of you who are able to understand German, however, can find the postscript version at http://pikas.inf.tu-dresden.de/~fritzke/papers/habil.ps.gz The thesis has also appeared as book. Details can be found at http://www.shaker.de/Online-Gesamtkatalog/Details.idc?ISBN=3-8265-4458-7 Apologies to the non-German-speaking list members for the bandwidth! Bernd -- Bernd Fritzke http://pikas.inf.tu-dresden.de/~fritzke Neural Computation Group Fax: ++49 351 463-8364 AI-Inst./CS-Dept./Dresden University of Technology Tel: ++49 351 463-8363 From shastri at ICSI.Berkeley.EDU Mon Jan 25 20:22:36 1999 From: shastri at ICSI.Berkeley.EDU (Lokendra Shastri) Date: Mon, 25 Jan 1999 17:22:36 PST Subject: Paper: symbol processing, dynamic bindings, and temporal synchrony Message-ID: <199901260122.RAA03673@lassi.ICSI.Berkeley.EDU> Dear Connectionists, The following preprint may be of interest to you. Best wishes. -- Lokendra Shastri http://www.icsi.berkeley.edu/~shastri/psfiles/shruti_adv_98.ps.gz ------------------------------------------------------------------------------ Advances in SHRUTI: A neurally motivated model of relational knowledge representation and rapid inference using temporal synchrony. Lokendra Shastri International Computer Science Institute Berkeley, CA 94704 Abstract We are capable of drawing a variety of inferences effortlessly, spontaneously, and with remarkable efficiency --- as though these inferences are a *reflex* response of our cognitive apparatus. This remarkable human ability poses a challenge for cognitive science and computational neuroscience: How can a network of slow neuron-like elements represent a large body of systematic knowledge and perform a wide range of inferences with such speed? 
The connectionist model SHRUTI attempts to address this challenge by demonstrating how a neurally plausible network can encode a large body of semantic and episodic facts, systematic rules, and knowledge about entities and types, and yet perform a wide range of explanatory and predictive inferences within a few hundred milliseconds. Relational structures (frames, schemas) are represented in SHRUTI by clusters of cells, and inference in SHRUTI corresponds to a transient propagation of rhythmic activity over such cell-clusters wherein *dynamic bindings* are represented by the synchronous firing of appropriate cells. SHRUTI encodes mappings across relational structures using high-efficacy links that enable the propagation of rhythmic activity, and it encodes items in long-term memory as coincidence and coincidence-error detector circuits that become active in response to the occurrence (or non-occurrence) of appropriate coincidences in the ongoing flux of rhythmic activity. Finally, "understanding" in SHRUTI corresponds to reverberant and coherent activity along closed loops of neural circuitry. Over the past several years, SHRUTI has undergone several enhancements that have augmented its expressiveness and inferential power. This paper describes some of these extensions that enable SHRUTI to (i) deal with negation and inconsistent beliefs, (ii) encode evidential rules and facts, (iii) perform inferences requiring the dynamic instantiation of entities, and (iv) seek coherent explanations of observations. Keywords: knowledge representation; inference; evidential reasoning; dynamic binding; temporal synchrony. To appear in Applied Intelligence.

From kyoung at itsa.ucsf.edu Mon Jan 25 20:57:19 1999 From: kyoung at itsa.ucsf.edu (Karl Young) Date: Mon, 25 Jan 1999 17:57:19 -0800 (PST) Subject: Post-Doctoral Research in Novel Brain Imaging Methods Message-ID: Postdoctoral position at the University of California, San Francisco, in the Laboratory of Dr. Michael Weiner. The research involves the development and comparison of statistical classification techniques for medical imaging data. Knowledge of some subset of the following topics is important: feature extraction techniques, such as projection pursuit or principal component analysis, and cluster analysis techniques, such as linear discriminant analysis, CART, neural networks, or Bayesian techniques. While these are important, significant stress will be placed on the development of novel classification techniques based on recently proposed measures of structural complexity. Development of metrics for comparison of the new techniques with more standard ones will be an important component of the research. The candidate will also contribute to the design of a medical image data warehouse that is optimized for use by the automated classification techniques. Some familiarity with medical imaging would be useful but is not required; primary interaction will be with research physicists and mathematicians in Dr. Weiner's group and from the Santa Fe Institute. Send CV and references to: Dr. Karl Young University of California, SF Phone: (415) 750-2158 lab VA Medical Center, MRS Unit (114M) FAX: (415) 668-2864 4150 Clement Street Email: kyoung at itsa.ucsf.edu San Francisco, CA 94121

From rpare at allstate.com Tue Jan 26 18:58:08 1999 From: rpare at allstate.com (Rajesh Parekh) Date: Tue, 26 Jan 1999 17:58:08 -0600 Subject: Grammatical Inference Homepage Message-ID: An embedded and charset-unspecified text was scrubbed...
Name: not available Url: https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/40905dbd/attachment.ksh From Dave_Touretzky at cs.cmu.edu Wed Jan 27 13:48:27 1999 From: Dave_Touretzky at cs.cmu.edu (Dave_Touretzky@cs.cmu.edu) Date: Wed, 27 Jan 1999 13:48:27 -0500 Subject: CNS*99 submission deadline extended due to computer failure Message-ID: <21284.917462907@skinner.boltz.cs.cmu.edu> If you were trying to make the Tuesday 11:59 pm submission deadline for CNS*99 (the Eighth Annual Computational Neuroscience Meeting, July 18-22, Pittsburgh, PA) and couldn't, don't sweat it. The web server at Caltech was swamped by heavy traffic and was going up and down all day. We will continue to accept submissions through the end of the week. Sorry for the inconvenience. More details about the conference can be found on the conference home page (when the server isn't crashing) at: http://numedeon.com/cns-meetings/CNS99/ -- Dave Touretzky From harnad at coglit.soton.ac.uk Wed Jan 27 15:24:08 1999 From: harnad at coglit.soton.ac.uk (Stevan Harnad) Date: Wed, 27 Jan 1999 20:24:08 +0000 (GMT) Subject: Lexical Processing: BBS Call for Commentators Message-ID: Below is the abstract of a forthcoming BBS target article *** please see also 5 important announcements about new BBS policies and address change at the bottom of this message) *** LEXICAL ENTRIES AND RULES OF LANGUAGE: A MULTIDISCIPLINARY STUDY OF GERMAN INFLECTION by Harald Clahsen This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please send EMAIL to: bbs at cogsci.soton.ac.uk or write to [PLEASE NOTE SLIGHTLY CHANGED ADDRESS]: Behavioral and Brain Sciences ECS: New Zepler Building University of Southampton Highfield, Southampton SO17 1BJ UNITED KINGDOM http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/ ftp://ftp.princeton.edu/pub/harnad/BBS/ ftp://ftp.cogsci.soton.ac.uk/pub/bbs/ gopher://gopher.princeton.edu:70/11/.libraries/.pujournals If you are not a BBS Associate, please send your CV and the name of a BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work. All past BBS authors, referees and commentators are eligible to become BBS Associates. To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. An electronic draft of the full text is available for inspection with a WWW browser, anonymous ftp or gopher according to the instructions that follow after the abstract. _____________________________________________________________ LEXICAL ENTRIES AND RULES OF LANGUAGE: A MULTIDISCIPLINARY STUDY OF GERMAN INFLECTION Harald Clahsen Dept. 
of Linguistics University of Essex Colchester C04 3SQ UK harald at essex.ac.uk http://privatewww.essex.ac.uk/~harald ABSTRACT: It is hypothesized following much work in linguistic theory that the language faculty has a modular structure and consists of two basic components, a lexicon of (structured) entries and a computational system of combinatorial operations to form larger linguistic expressions from lexical entries. This target article provides evidence for the dual nature of the language faculty by describing some recent results from a multidisciplinary investigation of German inflection. We have examined (i) its linguistic representation focussing on noun plurals and verb inflection (participles), (ii) processes involved in the way adults produce and comprehend inflected words, (iii) brain potentials generated during the processing of inflected words and (iv) the way children acquire and use inflection. It will be shown that the evidence from all these sources converges and supports the distinction between lexical entries and combinatorial operations. Our experimental results indicate that adults have access to two distinct processing routes, one accessing (irregularly) inflected entries from the mental lexicon, and another involving morphological decomposition of (regularly) inflected words into stem+affix representations. These two processing routes correspond to the dual structure of the linguistic system. Results from event-related potentials confirm this linguistic distinction at the level of brain structures. In children's language, we found these two processes also to be clearly dissociated; regular and irregular inflection are used under different circumstances, and the constraints under which children apply them are identical to those of the adult linguistic system. Our findings will be explained in terms of a linguistic model, which maintains the distinction between the lexicon and the computational system, but replaces the traditional view of the lexicon as a simple list of idiosyncrasies with the notion of internally structured lexical representations. KEYWORDS: grammar, psycholinguistics, neuroscience of language, child language acquisition, human language processing, development of inflection ____________________________________________________________ To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the World Wide Web or by anonymous ftp from the US or UK BBS Archive. Ftp instructions follow below. Please do not prepare a commentary on this draft. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. 
The URLs you can use to get to the BBS Archive: http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/Archive/bbs.clahsen.html ftp://ftp.princeton.edu/pub/harnad/BBS/bbs.clahsen ftp://ftp.cogsci.soton.ac.uk/pub/bbs/Archive/bbs.clahsen To retrieve a file by ftp from an Internet site, type either: ftp ftp.princeton.edu or ftp 128.112.128.1 When you are asked for your login, type: anonymous Enter password as queried (your password is your actual userid: yourlogin at yourhost.whatever.whatever - be sure to include the "@") cd /pub/harnad/BBS To show the available files, type: ls Next, retrieve the file you want with (for example): get bbs.clahsen When you have the file(s) you want, type: quit ____________________________________________________________ *** FIVE IMPORTANT ANNOUNCEMENTS *** ------------------------------------------------------------------ (1) There have been some very important developments in the area of Web archiving of scientific papers very recently. Please see: Science: http://www.cogsci.soton.ac.uk/~harnad/science.html Nature: http://www.cogsci.soton.ac.uk/~harnad/nature.html American Scientist: http://www.cogsci.soton.ac.uk/~harnad/amlet.html Chronicle of Higher Education: http://www.chronicle.com/free/v45/i04/04a02901.htm --------------------------------------------------------------------- (2) All authors in the biobehavioral and cognitive sciences are strongly encouraged to archive all their papers (on their Home-Servers as well as) on CogPrints: http://cogprints.soton.ac.uk/ It is extremely simple to do so and will make all of our papers available to all of us everywhere at no cost to anyone. --------------------------------------------------------------------- (3) BBS has a new policy of accepting submissions electronically. Authors can specify whether they would like their submissions archived publicly during refereeing in the BBS under-refereeing Archive, or in a referees-only, non-public archive. Upon acceptance, preprints of final drafts are moved to the public BBS Archive: ftp://ftp.princeton.edu/pub/harnad/BBS/.WWW/index.html http://www.cogsci.soton.ac.uk/bbs/Archive/ -------------------------------------------------------------------- (4) BBS has expanded its annual page quota and is now appearing bimonthly, so the service of Open Peer Commentary can now be be offered to more target articles. The BBS refereeing procedure is also going to be considerably faster with the new electronic submission and processing procedures. Authors are invited to submit papers to: Email: bbs at cogsci.soton.ac.uk Web: http://cogprints.soton.ac.uk http://bbs.cogsci.soton.ac.uk/ INSTRUCTIONS FOR AUTHORS: http://www.princeton.edu/~harnad/bbs/instructions.for.authors.html http://www.cogsci.soton.ac.uk/bbs/instructions.for.authors.html --------------------------------------------------------------------- (5) Call for Book Nominations for BBS Multiple Book Review In the past, Behavioral and Brain Sciences (BBS) journal had only been able to do 1-2 BBS multiple book treatments per year, because of our limited annual page quota. BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates and biobehavioral/cognitive scientists in general to nominate books you would like to see accorded BBS multiple book review. (Authors may self-nominate, but books can only be selected on the basis of multiple nominations.) 
It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!).

From bengioy at IRO.UMontreal.CA Wed Jan 27 10:41:16 1999 From: bengioy at IRO.UMontreal.CA (Yoshua Bengio) Date: Wed, 27 Jan 1999 10:41:16 -0500 Subject: post-doc position in Montreal / statistical data analysis + machine learning Message-ID: <19990127104116.07451@IRO.UMontreal.CA> POST DOCTORAL RESEARCH STAFF MEMBER IN STATISTICAL DATA ANALYSIS AND MACHINE LEARNING FOR HIGH-DIMENSIONAL DATA SETS MATHEMATICS OF INFORMATION TECHNOLOGY AND COMPLEX SYSTEMS (MITACS: a new Canadian Network of Centers of Excellence) Position to be held jointly at the Department of Computer Science & Operations Research and the Department of Mathematics and Statistics, at the UNIVERSITY OF MONTREAL, Quebec, Canada NATURE AND SCOPE OF THE POSITION: A post-doctoral position is available at the University of Montreal within the MITACS network of centers of excellence. The main research area will be the statistical data analysis of high-dimensional data sets with machine learning algorithms, also known as "database mining". The main research questions that will be addressed in this research are the following: - how to deal with the "curse of dimensionality": algorithms based on variable selection or based on combining many variables with different importance, while controlling generalization to avoid overfitting; - how to make inferences on the models obtained with such methods, mostly using resampling methods such as the BOOTSTRAP (see the short sketch below). This research will be performed within the MITACS project entitled "INFERENCE FROM HIGH-DIMENSIONAL DATA". See http://www.iro.umontreal.ca/~bengioy/mitacs.html for more information on the project and http://www.mitacs.math.ca for more information on the MITACS network. The candidate will be working under the supervision of professors Yoshua Bengio (computer science and operations research) and Christian Leger (mathematics and statistics). See http://www.iro.umontreal.ca/~bengioy and http://www.iro.umontreal.ca/~lisa for more information respectively on Yoshua Bengio and his laboratory. See http://www.dms.umontreal.ca/~leger for more information on Christian Leger. ESSENTIAL SKILLS, KNOWLEDGE, AND ABILITIES: Candidates must possess a recent Ph.D. in computer science, statistics, mathematics, or a related discipline, with a research background in machine learning (in particular neural networks) and/or computational statistical methods such as the bootstrap. Candidates must have excellent programming skills, with demonstrated experience. Experience in the following areas will be given particular consideration: - Statistical data analysis in general, - bootstrapping methods in particular. - Machine learning algorithms in general, - artificial neural networks in particular. - Programming skills in general, - object-oriented programming, - participation in large-scale, multiple-author software projects, - experience with C, C++ and S-Plus languages, in particular. LENGTH OF EMPLOYMENT: 1 year (with possible renewal for 2 years total), starting as soon as possible.
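The resampling idea referred to above can be illustrated with a minimal percentile-bootstrap sketch in Python; the sample values, function name, and parameters below are hypothetical and independent of the project's actual methods.

import random

def bootstrap_ci(data, statistic, n_resamples=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for an arbitrary statistic.

    Resamples the data with replacement, recomputes the statistic on each
    resample, and reads the interval off the empirical percentiles.
    """
    rng = random.Random(seed)
    n = len(data)
    stats = sorted(statistic([rng.choice(data) for _ in range(n)])
                   for _ in range(n_resamples))
    lo = stats[int((alpha / 2) * n_resamples)]
    hi = stats[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi


# Example: a 95% interval for the mean of a small (made-up) sample.
sample = [2.1, 2.5, 1.9, 3.2, 2.8, 2.4, 2.0, 2.7]
print(bootstrap_ci(sample, lambda xs: sum(xs) / len(xs)))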
FOR FURTHER INFORMATION, PLEASE CONTACT: Yoshua Bengio bengioy at iro.umontreal.ca, 514-343-6804, fax 514-343-5834 or Christian Leger leger at dms.umontreal.ca 514-343-7824, fax 514-343-5700 Electronic applications (preferably as a postscript, raw text, or pdf file) are encouraged, in the form of a Curriculum Vitae with information on academic experience, academic standing, research experience, programming experience, and any other relevant information (e.g., pointer to your web site, if any). -- Yoshua Bengio Associate Professor Département d'Informatique et Recherche Opérationnelle Université de Montréal, postal address: C.P. 6128 Succ. Centre-Ville, Montreal, Quebec, Canada H3C 3J7 street address: 2920 Chemin de la Tour, Montreal, Quebec, Canada H3T 1J8 Tel: 514-343-6804. Fax: 514-343-5834. Office 3339. http://www.iro.umontreal.ca/~bengioy http://www.iro.umontreal.ca/labs/neuro

From campber at CLEMSON.EDU Wed Jan 27 10:10:06 1999 From: campber at CLEMSON.EDU (Robert L. Campbell) Date: Wed, 27 Jan 1999 10:10:06 -0500 Subject: Conference on Consciousness and Cognition (2nd call; deadline extended) Message-ID: Mind 4 Dublin City University, Dublin, Ireland, August 16-20, 1999 Theme: "Two Sciences of Mind" Confirmed invited speakers include: Bernard Baars David Galin Karl Pribram Stuart Hameroff Kathy McGovern Steven Nachmanovitch Jacob Needleman Program Committee: Bernard Baars Mark Bickhard Robert Campbell Christian de Quincey Stuart Hameroff Paul Mc Kevitt Kathy McGovern Steven Nachmanovitch Jacob Needleman Sean O Nuallain Yoshi Nakamura Max Velmans Terry Winegar Keynote addresses: Jacob Needleman: "Inner and Outer Empiricism in Consciousness Research" Bernard Baars: "The Compassionate Implications of Brain Imaging of Conscious Pain: New Vistas in Applied Cognitive Science." Stream 1: Outer and Inner empiricism in consciousness research This stream will feature papers that attempt to show how "inner" states can be elucidated with reference to external phenomena. "Inner empiricism" designates experience, or qualia. They are shaped (somehow) by brain processes or states which sense and interpret the external phenomena. The physical nature of these processes or states may tell us much about consciousness. Likewise, the argument that we are conscious of only one thing at a time because of the gating action of the nuclei reticularis thalami (Taylor, Baars, etc.) is indicative of the kind of thinking we are trying to encourage. In this vein, pain experience and its imperfect relationship to neural activity are similarly relevant. We particularly welcome papers that feature empirical data, or, lacking these data, show a grasp of the range of disciplines necessary to do justice to the topic. Papers are also invited that - Interpret qualia in terms of a quantum-mechanics-based panpsychism (or, in current terms, pan-protopsychism) - Establish links with developments like Whitehead's pan-experientialism and process thought - Interrelate physiological processes at the neural level with current thought in QM - Emphasize "relational empiricism", i.e. second-person considerations - Investigate the brain processes or states giving rise to qualia at whatever level the writer considers appropriate (e.g. intra-cellular cytoskeletal activities and/or quantum-level phenomena). - Involve studies of central pain states as well as other curiosities like allodynia, spontaneous analgesia, pain asymbolia, and hypnotic analgesia.
The invited talks include: David Galin "The Experience of 'Spirit' in Cognitive Terms." Stuart Hameroff "Quantum Computing and Consciousness" Steve Nachmanovitch "Creativity and Consciousness" Each of these talks will be followed by a panel discussion addressing, respectively, consciousness as explored experientially, through scientific investigation, and in the arts. Stream 2: Foundations of Cognitive Science Co-chairs: Sean O Nuallain Dublin City University, Dublin 9, Ireland (sonualla at compapp.dcu.ie) Robert L. Campbell Department of Psychology, Clemson University, Clemson, SC USA (campber at clemson.edu) WHAT THE STREAM IS ABOUT Though deep and contentious questions of theory and metatheory have always been prevalent in Cognitive Science--they arise whenever an attempt is made to define CS as a discipline--they have frequently been downrated by researchers, in favor of empirical work that remains safely within the confines of established theories and methods. Our goal is to redress the balance. We encourage participants in this stream to raise and discuss such questions as: * the adequacy of computationalist accounts of mind * the adequacy of conceptions of mental representation as structures that encode structures out in the environment * the consequences of excluding emotions, consciousness, and the social realm from the purview of cognitive studies * the consequences of Newell and Simon's "scientific bet" that developmental constraints do not have to be studied until detailed models of adult cognition have been constructed and tested * the relationship between cognitive science and formal logic A wide range of theoretical perspectives is welcome, so long as the presenters are willing to engage in serious discussion with the proponents of perspectives that are different from their own: * Vygotskian approaches to culture and cognition * Dynamic Systems theories * Piagetian constructivism * interactivism * neuroscience accounts such as those of Edelman and Grossberg * accounts of emergence in general, and emergent knowledge in particular * perception and action robotics * functional linguistics * genetic algorithms * Information Processing * connectionism * evolutionary epistemology ******************** Contributors will be asked to submit short papers (3000 word limit) in the form of ASCII text files (HTML files are also welcome, but are optional) to Robert Campbell (for stream 2) and Sean O Nuallain (stream 1). The deadline is March 1, 1999. We will email notification of acceptance or rejection by April 1. The standard presentations during the streams will be 20-minute talks and poster sessions. *********** The "MIND" conferences have normally had their proceedings published by John Benjamins. We have already been approached by prospective publishers for Mind 4. All accepted papers and posters will be included in a preprint. Robert L. Campbell Professor, Psychology Brackett Hall 410A Clemson University Clemson, SC 29634-1511 USA phone (864) 656-4986 fax (864) 656-0358 http://hubcap.clemson.edu/~campber/index.html Editor, Dialogues in Psychology http://hubcap.clemson.edu/psych/Dialogues/dialogues.html

From bert at mbfys.kun.nl Thu Jan 28 08:37:43 1999 From: bert at mbfys.kun.nl (Bert Kappen) Date: Thu, 28 Jan 1999 14:37:43 +0100 (MET) Subject: Postdoc position available at SNN Nijmegen Message-ID: <199901281337.OAA17607@bertus.mbfys.kun.nl> Post doc position available at SNN, University of Nijmegen, the Netherlands.
Background: The group consists of 10 researchers and PhD students and conducts theoretical and applied research on neural networks and Bayesian methods. The group is part of the Laboratory of Biophysics, which is involved in experimental brain science. Recent research of the group has focused on the theoretical description of learning processes using the theory of stochastic processes and the design of efficient learning rules for Boltzmann machines and other graphical models using techniques from statistical mechanics; the extraction of rules from data and the integration of knowledge and data for modeling; and the design of robust methods for confidence estimation with neural networks. Applied research is conducted on computer-assisted medical diagnosis and prediction tasks. In 1997, SNN Nijmegen founded a company which sells commercial services and products in the field of neural networks, AI and statistics. For more information see also http://www.mbfys.kun.nl/SNN

Job specification: The tasks of the post-doc will be to conduct independent research in one of the above areas. In addition, it is expected that the post-doc will initiate novel research and will assist in the supervision of PhD students. The post-doc should have a PhD in physics, mathematics or computer science and a strong theoretical background in neural networks. The post-doc salary will be maximally Dfl. 7396 per month, depending on experience. The position is available for 2 years, with a possible extension to 4 years.

Applications: Interested candidates should send a letter with a CV and list of publications before February 20, 1999 to Dr. H.J. Kappen, SNN, University of Nijmegen, Geert Grooteplein 21, 6525 EZ Nijmegen. For information contact Dr. H.J. Kappen, +31 24 3614241 or bert at mbfys.kun.nl.

From a.sharkey at dcs.shef.ac.uk Thu Jan 28 05:43:45 1999 From: a.sharkey at dcs.shef.ac.uk (Amanda Sharkey) Date: Thu, 28 Jan 1999 10:43:45 +0000 (GMT) Subject: New Book Announcement Message-ID:

The following book is now available:

COMBINING ARTIFICIAL NEURAL NETS: Ensemble and Modular Multi-Net Systems
Perspectives in Neural Computing Series
Edited by Amanda J.C. Sharkey
Publisher: Springer-Verlag London Ltd 1999 http://www.springer.co.uk
Price: 39.50 pounds UK, 89.95 dollars US. ISBN 1-85233-004-X

This volume consists of articles written by leading researchers in the field of Combining Artificial Neural Nets, and as such provides unique coverage of the area. The techniques presented include ensemble-based approaches, where a variety of methods are used to create a set of different nets trained on the same task, and modular approaches, where a task is decomposed into simpler problems. The focus is on the combination of Neural Nets, but many of the techniques are applicable to statistical methods more generally. The presentation of techniques is accompanied by analysis and evaluation of their relative effectiveness, and by reports on their application to a variety of problems.

Chapters:
1. "Multi-Net Systems" A. Sharkey
2. "Combining Predictors" L. Breiman
3. "Boosting Using Neural Networks" H. Drucker
4. "A Genetic Algorithm Approach for Creating Neural Network Ensembles" D. Opitz and J. Shavlik
5. "Treating Harmful Collinearity in Neural Network Ensembles" S. Hashem
6. "Linear and Order Statistics Combiners for Pattern Classification" K. Tumer and J. Ghosh
7. "Variance Reduction via Noise and Bias Constraints" Y. Raviv and N. Intrator
8. "A Comparison of Visual Cue Combination Models" I. Fine and R. Jacobs
"Model Selection of Combined Neural Nets for Speech Recognition" C. Furlanello, D. Giuliani, S. Merler and E. Trentin 10. "Self-Organised Modular Neural Networks for Encoding Data" S. Luttrell 11. "Mixtures of X" R. Jacobs and M. Tanner Queries: postmaster at svl.co.uk From koenig at iee.et.tu-dresden.de Thu Jan 28 13:05:04 1999 From: koenig at iee.et.tu-dresden.de (Andreas Koenig) Date: Thu, 28 Jan 1999 19:05:04 +0100 Subject: QuickCog Development Environment Message-ID: <199901281805.TAA27909@eeebwm.et.tu-dresden.de> Dear Colleagues, this is to announce the availability of a novel PC based (Windows 95/98/NT) development environment, which serves as rapid and efficient system design tool for image processing and general pattern recognition applications, e.g. for automated visual inspection tasks. Fast and transparent design of technical cognitive systems is provided by the following key features: - Visual programming and sample set oriented processing - Interactive menu-driven creation of arbitrary sample sets, ROI definition/object partitioning and consistent preclassification - Proven methods of image processing, feature computation, classification and evaluation unified in a single system - Feature space visualisation and interactive analysis by dimension reducing mappings and interactive feature maps, assessment functions for system discriminance, method selection, and parameter settings. (The enclosed QuickMine-Toolbox also allows visual data analysis for arbitrary data) - Automatic feature selection by parametric and nonparametric methods - Efficient and easy to use classification methods, including centroid classifier, nearest-neighbor-techniques, artificial neural networks, and a rule-based classifier Currently, the system is available in German only, as it was commercialised on the German market in 1998, but an English version will be available soon. The description of the various included methods can be found in my doctoral thesis from 1995 (http://www.iee.et.tu-dresden.de/~koeniga), also in German. Several publications, that describe the main features of the system, are available in English from the QuickCog home page. Apologies to the non-German speaking list members. More information and a demo-version (for research purposes, restrictions with regard to the full version are negligible) can be obtained from: http://www.iee.et.tu-dresden.de/~koeniga/QuickCog.html Best regards Andreas Koenig -------------------------------------------- Doz. Dr.-Ing. Andreas Koenig Chair of Electronic Devices and Integrated Circuits Faculty of Electrical Engineering, IEE Dresden University of Technology Mommsenstr. 13 01062 Dresden Phone: +49 351 463 2805 Fax : +49 351 463 7260 E-Mail: koenig at iee.et.tu-dresden.de From jbower at bbb.caltech.edu Thu Jan 28 18:26:05 1999 From: jbower at bbb.caltech.edu (James M. Bower) Date: Thu, 28 Jan 1999 15:26:05 -0800 Subject: Postdoctoral Position Message-ID: Postdoctoral Position Available in Neuroimaging and Computational Neuroscience: A postdoctoral position is available in an NINDS-funded research project combining human neuroimaging (e.g., fMRI), healthy and patient population psychophysics, rat electrophysiology (multi-unit recordings), and computational modeling to investigate novel hypotheses about cerebellar function. The project is a collaboration between James Bower (at Caltech) and Lawrence Parsons and Peter Fox (at the Research Imaging Center, University of Texas Health Science Center at San Antonio). 
Term of position is 2-3 years, available immediately. We are particularly interested in candidates with a Ph.D. or M.D. who have experience in neuroimaging, experimental psychophysics, or neurology. Qualified women and minority candidates are strongly encouraged to apply. Send a CV, three letters of recommendation, and two reprints to either: Professor J. M. Bower, Division of Biology 216-76, California Institute of Technology, Pasadena, CA 91125, USA; or Professor L. M. Parsons, Research Imaging Center, University of Texas Health Science Center, 7703 Floyd Curl Drive, San Antonio, TX 78284, USA. Email inquiries can be made to parsons at uthscsa.edu or jbower at bbb.caltech.edu. For related publications, see: Gao, Parsons, Bower et al., Science 272, 545, 1996; and the chapters by Bower and by Parsons & Fox in Cerebellum & Cognition (J. Schmahmann, Ed., Academic Press, 1997).

From zhaoping at gatsby.ucl.ac.uk Fri Jan 29 10:37:25 1999 From: zhaoping at gatsby.ucl.ac.uk (Dr Zhaoping Li) Date: Fri, 29 Jan 1999 15:37:25 GMT Subject: Paper on Computational differences between asymmetrical and symmetrical networks Message-ID: <199901291537.PAA14881@vision.gatsby.ucl.ac.uk>

Paper available at http://www.gatsby.ucl.ac.uk/~zhaoping/papers.html

Title: Computational differences between asymmetrical and symmetrical networks
Authors: Zhaoping Li and Peter Dayan

Abstract: Symmetrically connected recurrent networks have recently been used as models of a host of neural computations. However, biological neural networks have asymmetrical connections, at the very least because of the separation between excitatory and inhibitory neurons in the brain. We study characteristic differences between asymmetrical networks and their symmetrical counterparts in cases for which they act as selective amplifiers for particular classes of input patterns. We show that the dramatically different dynamical behaviours to which they have access often make the asymmetrical networks computationally superior. We illustrate our results in networks that selectively amplify oriented bars and smooth contours in visual inputs.

To appear in: Network: Computation in Neural Systems 10(1):59-77, 1999.

From vaina at enga.bu.edu Fri Jan 29 18:37:50 1999 From: vaina at enga.bu.edu (Lucia M. Vaina) Date: Fri, 29 Jan 1999 19:37:50 -0400 Subject: Please post In-Reply-To: <199811111522.QAA06995@anaxagoras> References: (vaina@enga.bu.edu) Message-ID:

Graduate Student or Postdoctoral Position in Computational Functional MRI: from Fall 1999 (possibly summer 1999)

This exciting new venture in the Brain and Vision Research Laboratory at Boston University, Department of Biomedical Engineering, involves visualisation of the working (plasticity and restorative plasticity) of the human brain during sensory-motor tasks. Specifically, the postholder will:
* establish new protocols of functional imaging on a research MRI scanner, including the stimulus presentation system, software and interfacing
* work with other members of the laboratory to define, establish and run suitable test paradigms on healthy volunteers and patients
* explore the uses of real-time and near-real-time analysis techniques in fMRI studies
* model the changes of functional connectivity of brain activations using structural equation models and functional connectivity models

It is possible that for part of the time the RA will work at an NIH laboratory in Bethesda, MD.
Candidates should have a good background in C programming, mathematics, signal processing and probability, and should be familiar with the Unix environment. Knowledge of human neuroanatomy is a plus. Please send a letter of application along with a CV, publication list if available, a brief statement of current research and background, and two letters of recommendation before March 10, 1999, to Professor Lucia M. Vaina, Brain and Vision Research Laboratory, Biomedical Engineering Department, College of Engineering, Boston University, 44 Cummington St, Boston, MA 02215, USA; fax: 617-353-6766.

Lucia M. Vaina Ph.D., D.Sc.
Professor of Biomedical Engineering and Neurology
Brain and Vision Research Laboratory, Boston University, Department of Biomedical Engineering, College of Engineering, 44 Cummington St, Room 315, Boston, MA 02215 USA
tel: 617-353-2455 fax: 617-353-6766
From jagota at cse.ucsc.edu Tue Jan 5 18:52:41 1999 From: jagota at cse.ucsc.edu (Arun Jagota) Date: Tue, 5 Jan 1999 15:52:41 -0800 (PST) Subject: connectionist symbol processing collective survey Message-ID: <199901052352.PAA07507@arapaho.cse.ucsc.edu>

1. New NCS e-publication: collective survey
2. Forward references to NCS articles solicited

----- 1 -----

A. Jagota, T. Plate, L. Shastri, R. Sun (eds), Connectionist Symbol Processing: Dead or Alive?, Neural Computing Surveys 2, 1--40, 1999, 220 references. http://www.icsi.berkeley.edu/~jagota/NCS

PREFACE: In August 1998 Dave Touretzky asked on the connectionists e-mailing list, ``Is connectionist symbol processing dead?'' This query led to an interesting discussion and exchange of ideas. We thought it might be useful to capture this exchange in an article. We solicited contributions, and this collective article is the result. Contributions were solicited by a public call on the connectionists e-mailing list. All contributions received were subjected to two to three informal reviews. Almost all were accepted with varying degrees of revision. Given the number and variety of contributions, the articles cover a wide, though by no means complete, range of the work in the field. The pieces in this article are of a varying nature: position summaries, individual research summaries, historical accounts, discussion of controversial issues, etc. We have not attempted to connect the various pieces together, or to organize them within a coherent framework. Despite this, we think the reader will find this collection useful.

CONTRIBUTORS: D. S. Blank, M. Coltheart, J. Diederich, B.M. Garner, R.W. Gayler, C.L. Giles, L. Goldfarb, M. Hadeishi, B. Hazlehurst, M. J. Healy, J. Henderson, N. G. Jani, D. S. Levine, S. Lucas, T. Plate, G. Reeke, D. Roth, L. Shastri, J. Sougne, R. Sun, W. Tabor, B. B. Thompson, S. Wermter

----- 2 -----

NCS would like to begin to maintain FORWARD references to published articles.
If you are aware of any such, e-mail to jagota at cse.ucsc.edu the following information for each such reference CITING ARTICLE: FULL REFERENCE NCS CITED ARTICLE -------------- From baluja at grr.ius.cs.cmu.edu Tue Jan 5 22:26:37 1999 From: baluja at grr.ius.cs.cmu.edu (baluja@grr.ius.cs.cmu.edu) Date: Tue, 5 Jan 99 22:26:37 EST Subject: Probabilistic Modeling for Face Orientation Discrimination Message-ID: The following paper is available from: http://www.cs.cmu.edu/~baluja Probabilistic Modeling for Face Orientation Discrimination: Learning from Labeled and Unlabeled Data Shumeet Baluja Abstract: This paper presents probabilistic modeling methods to solve the problem of discriminating between five facial orientations with very little labeled data. Three models are explored. The first model maintains no inter-pixel dependencies, the second model is capable of modeling a set of arbitrary pair-wise dependencies, and the last model allows dependencies only between neighboring pixels. We show that for all three of these models, the accuracy of the learned models can be greatly improved by augmenting a small number of labeled training images with a large set of unlabeled images using Expectation-Maximization. This is important because it is often difficult to obtain image labels, while many unlabeled images are readily available. Through a large set of empirical tests, we examine the benefits of unlabeled data for each of the models. By using only two randomly selected labeled examples per class, we can discriminate between the five facial orientations with an accuracy of 94%; with six labeled examples, we achieve an accuracy of 98%. This work was completed while the author was at: Justsystem Pittsburgh Research Center & School of Computer Science, Carnegie Mellon University Comments and Questions welcome. Please send all feedback to sbaluja at lycos.com. From shimone at cogs.susx.ac.uk Wed Jan 6 09:04:01 1999 From: shimone at cogs.susx.ac.uk (Shimon Edelman) Date: Wed, 6 Jan 1999 14:04:01 +0000 Subject: research fellowship (vision / neural computation; UK) Message-ID: Research Fellowship -- Visual Object Representation and Categorization School of Cognitive and Computing Sciences University of Sussex Applications are invited for a 2-year EPSRC-funded research fellowship (salary scale 15,735 to 23,651 GBP p/a), starting as soon as possible, and in any case not later than April 1st, 1999. The successful applicant will join an interdisciplinary research group within the School of Cognitive and Computing Sciences, and will work on the development of computer algorithms for representation, recognition and categorization of visual objects and scenes. Candidates should have a PhD degree in Computer Science or equivalent, and should be familiar with issues and modeling techniques in neural computation and computational neuroscience. Relevant research experience and proficiency with rapid prototyping programming environments such as Matlab will be advantageous. Send full CV and letter of application to Professor Shimon Edelman, COGS, University of Sussex, Brighton, BN1 9QH, England, by 30th January (informal enquiries may be directed to shimone at cogs.susx.ac.uk). 
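The Expectation-Maximization approach described in Baluja's face-orientation abstract above (augmenting a few labeled images with a large pool of unlabeled ones) can be illustrated with a short sketch. This is not the paper's code: it assumes a Bernoulli naive-Bayes model over binarized pixels, roughly in the spirit of the paper's first model, which keeps no inter-pixel dependencies, and every name in it is made up for illustration.

    import numpy as np

    def em_naive_bayes(X_lab, y_lab, X_unl, n_classes, n_iter=20, eps=1e-9):
        # X_lab, X_unl: {0,1} arrays of shape (n_images, n_pixels); y_lab: class ids.
        n_pix = X_lab.shape[1]
        prior = np.zeros(n_classes)
        theta = np.zeros((n_classes, n_pix))        # P(pixel = 1 | class)
        # Initialise from the few labeled images (add-one smoothing).
        for c in range(n_classes):
            Xc = X_lab[y_lab == c]
            prior[c] = len(Xc) + 1.0
            theta[c] = (Xc.sum(axis=0) + 1.0) / (len(Xc) + 2.0)
        prior /= prior.sum()
        for _ in range(n_iter):
            # E-step: class responsibilities for the unlabeled images.
            log_p = (np.log(prior)
                     + X_unl @ np.log(theta.T + eps)
                     + (1.0 - X_unl) @ np.log(1.0 - theta.T + eps))
            log_p -= log_p.max(axis=1, keepdims=True)
            resp = np.exp(log_p)
            resp /= resp.sum(axis=1, keepdims=True)
            # M-step: labeled images count with weight 1, unlabeled images
            # with their fractional responsibilities.
            for c in range(n_classes):
                w = resp[:, c]
                n_c = (y_lab == c).sum() + w.sum()
                pix = X_lab[y_lab == c].sum(axis=0) + X_unl.T @ w
                prior[c] = n_c + 1.0
                theta[c] = (pix + 1.0) / (n_c + 2.0)
            prior /= prior.sum()
        return prior, theta

Classifying a new image then amounts to choosing the class with the highest posterior under the learned prior and theta.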
From Jakub.Zavrel at kub.nl Wed Jan 6 09:31:44 1999 From: Jakub.Zavrel at kub.nl (Jakub.Zavrel@kub.nl) Date: Wed, 6 Jan 1999 15:31:44 +0100 (MET) Subject: Software release: Timbl 2.0 Message-ID: <199901061431.PAA20575@kubsuw.kub.nl>

----------------------------------------------------------------------
Software release: TiMBL 2.0
Tilburg Memory Based Learner
ILK Research Group, http://ilk.kub.nl/
----------------------------------------------------------------------

(sorry if you get this more than once)

The ILK (Induction of Linguistic Knowledge) Research Group at Tilburg University, The Netherlands, announces the release of a new version of TiMBL, the Tilburg Memory Based Learner (version 2.0). TiMBL is a machine learning program implementing a family of Memory-Based Learning techniques. TiMBL stores a representation of the training set explicitly in memory (hence `Memory Based'), and classifies new cases by extrapolating from the most similar stored cases.

TiMBL features the following (optional) metrics and speed-up optimizations that enhance the underlying k-nearest neighbor classifier engine:
- Information Gain weighting for dealing with features of differing importance (the IB1-IG learning algorithm).
- Stanfill & Waltz's / Cost & Salzberg's (Modified) Value Difference metric for making graded guesses of the match between two different symbolic values.
- Conversion of the flat instance memory into a decision tree, and inverted indexing of the instance memory, both yielding faster classification.
- Further compression and pruning of the decision tree, guided by feature information gain differences, for an even larger speed-up (the IGTREE learning algorithm).
(A minimal illustrative sketch of the IB1-IG idea follows the download section below.)

The current version is a complete rewrite of the software, and offers a number of new features:
- Support for numeric features.
- The TRIBL algorithm, a hybrid between decision tree and nearest neighbor search.
- An API to access the functionality of TiMBL from your own C++ programs.
- Increased ability to monitor the process of extrapolation from nearest neighbors.
- Many bug fixes and small improvements.

TiMBL accepts command-line arguments by which these metrics and optimizations can be selected and combined. TiMBL can read the C4.5 and WEKA's ARFF data file formats as well as column files and compact (fixed-width, delimiter-less) data.

-[download]-----------------------------------------------------------

You are invited to download the TiMBL package for educational or non-commercial research purposes. When downloading the package you are asked to register, and to express your agreement with the license terms. TiMBL is *not* shareware or public domain software. If you have registered for version 1.0, please be so kind as to re-register for the current version. The TiMBL software package can be downloaded from http://ilk.kub.nl/software.html or by following the `Software' link under the ILK home page at http://ilk.kub.nl/ .

The TiMBL package contains the following:
- Source code (C++) with a Makefile.
- A reference guide containing descriptions of the incorporated algorithms, detailed descriptions of the command-line options, and a brief hands-on tutorial.
- Some example datasets.
- The text of the license agreement.
- A postscript version of the paper that describes IGTREE.

The package should be easy to install on most UNIX systems.
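For readers unfamiliar with the IB1-IG algorithm mentioned in the feature list above, here is the promised minimal sketch. It is not TiMBL code, only an illustration of the idea: a 1-nearest-neighbour classifier over symbolic feature vectors whose overlap metric charges each mismatching feature its information gain on the training set. The class and function names are invented for the example.

    import numpy as np
    from collections import Counter

    def entropy(labels):
        counts = np.array(list(Counter(labels).values()), dtype=float)
        p = counts / counts.sum()
        return float(-(p * np.log2(p)).sum())

    def info_gain(column, labels):
        # H(class) minus the expected H(class) after splitting on this feature.
        gain = entropy(labels)
        for v in set(column):
            sub = [l for x, l in zip(column, labels) if x == v]
            gain -= (len(sub) / len(labels)) * entropy(sub)
        return gain

    class IB1IG:
        def fit(self, X, y):          # X: list of equal-length tuples of symbols
            self.X, self.y = X, y
            self.w = [info_gain([row[j] for row in X], y) for j in range(len(X[0]))]
            return self

        def predict(self, query):
            # Weighted overlap distance: sum the weights of the mismatching features.
            dists = [sum(w for w, a, b in zip(self.w, row, query) if a != b)
                     for row in self.X]
            return self.y[int(np.argmin(dists))]

    # e.g. IB1IG().fit([("a","x"), ("b","x"), ("a","y")],
    #                  ["pos", "pos", "neg"]).predict(("a","x"))   # -> "pos"

Replacing the 0/1 mismatch with a graded distance between pairs of symbolic values is roughly what the (Modified) Value Difference metric provides, and the IGTREE and TRIBL options trade exact nearest-neighbour search for faster tree-based approximations.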
-[background]--------------------------------------------------------- Memory-based learning (MBL) has proven to be quite successful in a large number of tasks in Natural Language Processing (NLP) -- MBL of NLP tasks (text-to-speech, part-of-speech tagging, chunking, light parsing) is the main theme of research of the ILK group. At one point it was decided to build a well-coded and generic tool that would combine the group's algorithms, favorite optimization tricks, and interface desiderata. The current incarnation of this is now version 2.0 of TiMBL. We think TiMBL can make a useful tool for NLP research, and, for that matter, for any other domain in machine learning. For information on the ILK Research Group, visit our site at http://ilk.kub.nl/ On this site you can find links to (postscript versions of) publications relating to the algorithms incorporated in TiMBL and on their application to NLP tasks. The reference guide ("TiMBL: Tilburg Memory-Based Learner, version 2.0, Reference Guide.", Walter Daelemans, Jakub Zavrel, Ko van der Sloot, and Antal van den Bosch. ILK Technical Report 99-01) can be downloaded separately and directly from http://ilk.kub.nl/~ilk/papers/ilk9901.ps.gz For comments and bugreports relating to TiMBL, please send mail to Timbl at kub.nl Please also send a mail to this address if you do not wish to receive further mail about Timbl. ---------------------------------------------------------------------- From mpp at us.ibm.com Wed Jan 6 15:07:04 1999 From: mpp at us.ibm.com (mpp@us.ibm.com) Date: Wed, 6 Jan 1999 15:07:04 -0500 Subject: Job announcement: IBM Pen Computing Group Message-ID: <852566F1.00700E84.00@D51MTA03.pok.ibm.com> ======> R&D Positions at IBM <====== The Pen Computing Group at the IBM T.J. Watson Research Center is looking for highly motivated individuals to fill R&D positions in the area of large-vocabulary, unconstrained, handwriting recognition. Candidates should have the following qualifications: - MS/PhD in EE, CS, Math, Physics or similar field - Strong mathematics/probability background - Excellent programming skills (in C and C++) - Creativity - Ability to work well as part of a team - Ability to direct their own research - Interest in creating a marketable product ======> Background <====== Our current projects include: - HMM-based, unconstrained, handwriting recognition - Language and grammar modeling - Accurate, high-speed, search methods - Document understanding and processing - Pen computing We have licensed our group's technology for the recently introduced CrossPad (http://www.cross-pcg.com) which enables users to efficiently index, search and manipulate their hand-written notes on their PC using a normal pad of paper. The IBM T.J. Watson Research Center is one of the top industrial laboratories in the world. We offer an exciting research environment with the opportunity to become involved in all aspects of cutting edge technology in the computer industry. ======> Contact Information <====== Please send CV's to: Michael P. Perrone mpp at us.ibm.com -or- Michael P. Perrone IBM T.J. 
Watson Research Center - 36-207 Route 134 Yorktown Heights, NY 10598 914-945-1779

From haim at fiz.huji.ac.il Wed Jan 6 07:21:16 1999 From: haim at fiz.huji.ac.il (Haim Sompolinsky) Date: Wed, 6 Jan 1999 14:21:16 +0200 Subject: Postdoc position at Hebrew University Message-ID: <00e201be396f$0f516c20$85504084@yesod.huji.ac.il>

POST-DOCTORAL POSITION AVAILABLE

The Neural Computational Theory Group in the Racah Institute of Physics at the Hebrew University of Jerusalem has a post-doctoral position open, beginning in the fall of 1999. The position will involve working on theoretical problems in two areas:
1. Supervised and unsupervised learning
2. Modeling visual and motor processing in cortical and subcortical neuronal structures

The work to be done in the area of modeling is expected to involve interaction with experimental groups, which are part of the Interdisciplinary Center for Neural Computation at the Hebrew University. Candidates with a strong background in statistical mechanics, in neural network theory, or in computational neuroscience are encouraged to apply. Those interested should send an application letter along with a CV, publications list, brief research statement, and three letters of recommendation to Professor Haim Sompolinsky. The position will be for one year, with the possibility of an extension of up to three years.

Airmail submissions: Professor Haim Sompolinsky, Racah Institute of Physics, Hebrew University of Jerusalem, Givat Ram, Jerusalem 91904, ISRAEL
Fax submissions: 972-2-6584437
Email submissions: haim at fiz.huji.ac.il

From espaa at exeter.ac.uk Thu Jan 7 07:36:09 1999 From: espaa at exeter.ac.uk (ESPAA) Date: Thu, 7 Jan 1999 12:36:09 +0000 (GMT Standard Time) Subject: CFP Special Issue PAA journal Message-ID:

PATTERN ANALYSIS AND APPLICATIONS JOURNAL http://www.dcs.exeter.ac.uk/paa Springer-Verlag Ltd.

CALL FOR PAPERS FOR SPECIAL ISSUE ON 'DOCUMENT IMAGE ANALYSIS AND RECOGNITION' (Guest Editor: Adnan Amin, University of New South Wales, Australia) (DEADLINE FOR PAPER SUBMISSION: 1 April, 1999).

Optical Character Recognition and document image analysis have become very important areas of research. Advanced computer and communication technologies now offer better ways to store, retrieve, and distribute the information contained in documents. Document Image Analysis and Recognition (DIAR) research provides the technology for automated systems for extracting and organising information from paper-based documents. Generally, these applications apply image processing and pattern recognition techniques. A document image may contain text, graphics, pictures, or a combination of these. Some commercial products are already available, such as OCR systems for reading pages of machine-printed text, but research is still required to improve their performance and their coverage of the full range of real-world variability in typography, image quality, and context. Performance evaluation of DIAR systems requires experimental design, a large training and test database, and sophisticated analysis of the results.

The aim of this special issue is to showcase state-of-the-art achievements in DIAR. Submitted papers should report the solution of a significant open problem: theoretical, algorithmic, and systems-architectural studies are welcome, as are papers describing practical applications supported by performance evaluation on a large scale.
Topics appropriate for this special issue include, but are not limited to:
- Document analysis algorithms and tools
- Physical and logical page/image segmentation
- Character and symbol recognition methods
- Pre- and post-processing algorithms
- Graphical object recognition (e.g. maps and engineering drawings)
- System performance measures
- Recognition/classification methodologies for DIAR systems
- Innovative and industrial applications

Send four copies of your manuscript (marked "DIAR SPECIAL ISSUE") by April 1, 1999 to the following address: Sameer Singh, Editor-in-Chief, Pattern Analysis and Applications, Department of Computer Science, University of Exeter, Exeter EX4 4PT, UK

__________________________________
Oliver Jenkin, Editorial Secretary, Pattern Analysis and Applications, Department of Computer Science, University of Exeter, Exeter EX4 4PT tel: +44-1392-264066 fax: +44-1392-264067 email: espaa at exeter.ac.uk
____________________________

From smola at first.gmd.de Thu Jan 7 12:14:30 1999 From: smola at first.gmd.de (Alex Smola) Date: Thu, 07 Jan 1999 18:14:30 +0100 Subject: PhD Thesis available: Learning with Kernels Message-ID: <3694EB76.5520A7D9@first.gmd.de>

Dear Connectionists,

I am pleased to announce that my PhD thesis "Learning with Kernels" is now available at http://svm.first.gmd.de/papers/Smola98.ps.gz

Abstract

Support Vector (SV) Machines combine several techniques from statistics, machine learning and neural networks. One of the most important ingredients is the kernel, i.e. the concept of transforming linear algorithms into nonlinear ones via a map into feature spaces. The present work focuses on the following issues:
- Extensions of Support Vector Machines.
- Extensions of kernel methods to other algorithms such as unsupervised learning.
- Capacity bounds which are particularly well suited for kernel methods.

After a brief introduction to SV regression, it is shown how the classical epsilon-insensitive loss function can be replaced by other cost functions while keeping the original advantages or adding other features such as automatic parameter adaptation. Moreover, the connection between kernels and regularization is pointed out. A theoretical analysis of several common kernels follows, and criteria to check Mercer's condition more easily are presented. Further modifications lead to semiparametric models and greedy approximation schemes. Next, three different types of optimization algorithms, namely interior point codes, subset selection algorithms, and sequential minimal optimization (including pseudocode), are presented. The primal-dual framework is used as an analytic tool in this context.

Unsupervised learning is an extension of kernel methods to new problems. Besides Kernel PCA, one can use the regularization framework to obtain more general feature extractors. A second approach leads to regularized quantization functionals which allow a smooth transition between the Generative Topographic Map and Principal Curves.

The second part of the thesis deals with uniform convergence bounds for the algorithms and concepts presented so far. It starts with a brief self-contained overview of existing techniques and an introduction to functional analytic tools which play a crucial role in this problem. By viewing the class of kernel expansions as an image of a linear operator, it is possible to give bounds on the generalization ability of kernel expansions even when standard concepts like the VC dimension fail or give estimates that are far too conservative.
In particular it is shown that it is possible to compute the covering numbers of the given hypothesis classes directly instead of taking the detour via the VC dimension. Applications of the new tools to SV machines, convex combinations of hypotheses (i.e. boosting and sparse coding), greedy approximation schemes, and principal curves conclude the presentation. -- / Alex J. Smola GMD FIRST / / Room 214 Rudower Chaussee 5 / / Tel: (+49)30-6392-1833 12489 Berlin, Germany / / Fax: (+49)30-6392-1805 smola at first.gmd.de / / URL: http://www.first.gmd.de/~smola / From risc at lps.ens.fr Fri Jan 8 04:45:45 1999 From: risc at lps.ens.fr (risc@lps.ens.fr) Date: Fri, 8 Jan 1999 10:45:45 +0100 (CET) Subject: Workshop: Neurophysics and Physiology of the Motor System Message-ID: Neurophysics and Physiology of the Motor System Les Houches (France), February, 7-12, 1999 Les Houches school is located in a well known ski resort near Chamonix (in the french Alps). It has been for more than forty years a highly prestigious institution for physicists all over the world. The present session will be the first to be ever dedicated, at Les Houches, to integrated Neurosciences. It will bring together physiologists and physicists, who will discuss recent experimental and theoretical advances on the structure, dynamics and functions of the motor system. It will provide a unique opportunity for the participants to become familiar with many of the fundamental issues related to the elaboration, the execution and the control of movement in the central nervous system. It will also bring to light how Neurophysics can contribute to understanding the motor system. Among the topics discussed will be: emerging functional properties, the role of the nonlinearities of neural dynamics, the synchrony of neural activity. These questions will be introduced in the framework of the different nervous structures involved (cortex, cerebellum, basal ganglia, spinal cord), and further discussed in the wider context of the integrated physiology of the motor system. Mornings (8:30 AM to 11:30 AM) and late afternoons (5:00 PM to 7:15 PM) will be devoted to lectures. Organization: D. Hansel and C. Meunier, C.Ph.T. UMR 7644 CNRS Ecole Polytechnique 91128 Palaiseau, France. D. Golomb, Dept. of Physiology, Faculty of Health Sciences Ben Gurion University of the Negev, Beersheva, 84105, Israel Scientific committee: H. Bergman (Jerusalem), P. Collet (Paris), C. Masson (Dijon), A. Schmied (Marseille), I. Segev (Jerusalem) Speakers: M. Abeles (Jerusalem), H. Bergman (Jerusalem), E. Fetz (Seattle), C. Feuerstein (Grenoble), D. Golomb (Beersheva), D. Hansel (Paris), L. Jami (Paris), R. Lemon (London), Y. Manor (Beersheva), C. Meunier (Paris), A. Riehle (Marseille), A. Schmied (Marseille), I. Segev (Jerusalem), H. Sompolinsky (Jerusalem), E. Vaadia (Jerusalem), C. van Vreeswijk (London), Y. Yarom (Jerusalem), J. Yelnik (Paris), D. Zytnicki (Paris). Les Houches is a resort village in the Chamonix valley of the French Alps. Established in 1951, the School is located in a group of mountain chalets surrounded by meadows and woods at an altitude of 1150 m. It is above the village, facing the Mont-Blanc range. Registration fees are 2200FF including accommodation and meals during the whole session. Number of participants is limited to 40, speakers included. The participation of students and young scientists is encouraged, and they may benefit from reduced fees (limited number). 
Applications (short curriculum vitae and publications list) should be sent before January 15, 1999 to Mrs. Martine Escoute, URA 1448, UFR biomédicale, 45 rue des Saints-Pères, 75270 Paris cedex 06; telephone: 0142862138; fax: 0149279062; e-mail: Martine.Escoute at biomedicale.univ-paris5.fr.

Les Houches Physics school is affiliated to Université Joseph Fourier (Grenoble) and Institut National Polytechnique de Grenoble. It is subsidized by the Ministère de l'Education Nationale et de l'Enseignement Supérieur, the Centre National de la Recherche Scientifique and the Commissariat à l'Energie Atomique.

----------------------
David Hansel hansel at cpht.polytechnique.fr

From baluja at vie.ius.cs.cmu.edu Sat Jan 9 23:16:29 1999 From: baluja at vie.ius.cs.cmu.edu (baluja@vie.ius.cs.cmu.edu) Date: Sat, 9 Jan 99 23:16:29 EST Subject: Making Templates Rotationally Invariant: An Application to Rotated Digit Recognition Message-ID:

The following paper is available from: http://www.cs.cmu.edu/~baluja

Making Templates Rotationally Invariant: An Application to Rotated Digit Recognition
Shumeet Baluja

Abstract: This paper describes a simple and efficient method to make template-based object classification invariant to in-plane rotations. The task is divided into two parts: orientation discrimination and classification. The key idea is to perform the orientation discrimination before the classification. This can be accomplished by hypothesizing, in turn, that the input image belongs to each class of interest. The image can then be rotated to maximize its similarity to the training images in each class (these contain the prototype object in an upright orientation). This process yields a set of images, at least one of which will have the object in an upright position. The resulting images can then be classified by models which have been trained with only upright examples. This approach has been successfully applied to two real-world vision-based tasks: rotated handwritten digit recognition and rotated face detection in cluttered scenes.

This work was completed while the author was at: Justsystem Pittsburgh Research Center & School of Computer Science, Carnegie Mellon University. Comments and questions welcome. Please send all feedback to sbaluja at lycos.com.

From Jon.Baxter at syseng.anu.edu.au Sun Jan 10 20:55:09 1999 From: Jon.Baxter at syseng.anu.edu.au (Jonathan Baxter) Date: Mon, 11 Jan 1999 12:55:09 +1100 (EST) Subject: Trieste Workshop Message-ID: <199901110155.MAA15672@reid.anu.edu.au>

Apologies if you receive this announcement more than once.

---------------------------------------------------------------------------
SCHOOL ON NEURAL INFORMATION PROCESSING (3 - 28 May 1999)

A School on Neural Information Processing will be held at the Abdus Salam International Centre for Theoretical Physics, Trieste, from 3 to 28 May 1999. This Bulletin contains the preliminary programme of the course, the request for participation form and miscellaneous information.

DIRECTORS: J.A. Hertz (NORDITA, Copenhagen), S.A. Solla (Northwestern University, Evanston), R. Zecchina (The Abdus Salam ICTP, Trieste)

I. PURPOSE AND NATURE

The goal of the school will be to present a systematic description of the theoretical approaches that provide tools to investigate the processing, transmission, and storage of information in the brain.
Techniques based on the principles of statistical physics and information theory will be presented and applied to the analysis of a variety of problems in computational neuroscience, especially neural encoding and sensory processing. The lectures will concentrate on biological neural networks, but will also cover some important developments in artificial networks and optimization theory. The lectures presented during the school are intended to offer broad and comprehensive training in order to provide a complete perspective of the field. In addition to the scheduled lectures (an average of 4 lectures per day, 5 days per week) there will be formal and informal seminars on a variety of research topics by lecturers, participants and visiting experts. Computer experimentation will also be organized.

II. PRELIMINARY LIST OF LECTURERS AND TOPICS

W. Bialek (NEC Research Institute, Princeton) Neural coding
M. Biehl (University of Wuerzburg) The dynamics of learning
P. Dayan (University College, London) Generative models
A. Engel (University of Magdeburg) Statistical physics theory of learning
D. Hansel (Ecole Polytechnique, Paris) Circuitry of the visual cortex
J. Hertz (Nordita, Copenhagen) Neural computation and encoding
Li Zhaoping (University College, London) Sensory processing
R. Monasson (Ecole Normale Superieure, Paris) Optimization problems
J. Rinzel (New York University, New York) Modelling of cell/network dynamics
T. Sejnowski (The Salk Institute, San Diego) Neural computation and encoding
S. Solla (Northwestern University, Chicago) Neural networks for Bayesian inference
M. Tsodyks (The Weizmann Institute, Rehovot) Synaptic dynamics
L. van Hemmen (Technical University, Munich) Modelling neural circuitry
R. Zecchina (ICTP, Trieste) Optimization problems

III. PARTICIPATION

Scientists and students from all countries that are members of the United Nations, UNESCO or IAEA can attend the School. The main purpose of the Centre is to help research workers from developing countries through a programme of training activities within a framework of international cooperation. However, students and post-doctoral scientists from developed countries are most welcome to attend. As the School will be conducted in English, participants should have an adequate working knowledge of that language. Participants should preferably have completed several years of study and research after a first degree. Applications from graduate students about to finish their PhD, fresh post-docs and young, active faculty members are encouraged.

As a rule, travel and subsistence expenses of the participants are borne by the home institutions. However, some funds are available which permit the Centre to grant a subsistence allowance to a limited number of people from developing countries who will be selected by the Organizers. As scarcity of funds allows travel to be granted only in a few exceptional cases, every effort should be made by candidates to secure support for their fares (or at least partial fares) from their home country. Such financial support is available only to those attending the entire School. Scientists from developed countries are welcome to join on their own funds. There is no registration fee for attending the School.
Deadline for the RECEIPT of request for participation form: 20 January 1999 Candidates should complete and sign the attached "Request for Participation" form (also obtainable via e-mail: smr1157 at ictp.trieste.it, using as subject "get bulletin", or via WWW Server: http://www.ictp.trieste.it/), and send it to: International Centre for Theoretical Physics School on Neural Information Processing P.O. Box 586 (Strada Costiera 11: for courier delivery) I-34100 Trieste, Italy Please note that no LATEX/TEX files are permitted. Any attachments to the request for participation, relevant to extra information for selection purposes, should not exceed 6 pages. The decision of the Organizing Committee will be communicated to all candidates as soon as possible. UNITED NATIONS EDUCATIONAL SCIENTIFIC AND CULTURAL ORGANIZATION and INTERNATIONAL ATOMIC ENERGY AGENCY ABDUS SALAM INTERNATIONAL CENTRE FOR THEORETICAL PHYSICS (ICTP) P.O. Box 586 Telephone: +39 040 2240111 I-34100 Trieste Telex: +39 460392 ICTP I Italy Telefax: +39 040 224163 REQUEST FOR PARTICIPATION *) School on Neural Information Processing ( 3 - 28 May 1999 ) ____________________________________________________________________________ INSTRUCTIONS Each question must be answered clearly and A recent photo of the completely. Type or print in ink. If more candidate should be space is required, attach additional pages. attached here, signed The request for participation form should legibly on the reverse. be forwarded to the ICTP, School on Neural Information Processing, P.O. Box 586, I-34100 Trieste, Italy, to arrive before 20th January 1999 ________________________________________________________________________________ PERSONAL DATA PLEASE NOTE THAT UNLESS ALL REQUESTED PERSONAL DATA ARE PROVIDED, THE ICTP CANNOT PROCESS ANY VISA REQUESTS ________________________________________________________________________________ For women (if applicable) SURNAME: MAIDEN NAME: First name: Middle name(s): Sex: Please also indicate SURNAME, NAME, on passport if different from above: ________________________________________________________________________________ Place of birth (City and Country): Present nationality: Date of birth Year - Month - Day: ________________________________________________________________________________ Full name & address of permanent Institution: Tel. No. : Cable : Telex : Telefax : E-mail : ________________________________________________________________________________ Full name & address of present Institution (if different from permanent) Tel. No. : Cable : Telex : Telefax : E-mail : until: ________________________________________________________________________________ Home address: Tel. No. : ________________________________________________________________________________ Mailing address - please indicate whether: Permanent Institute __ Present Institute __ Home address __ ________________________________________________________________________________ Name and address of person to notify in case of emergency - Relationship: ________________________________________________________________________________ *) PLEASE NOTE that no request will be processed unless the permanent address (and present address, if different) is clearly indicated. 
EDUCATION (higher degrees) University or equivalent Years attended Degrees Name and place From to ________________________________________________________________________________ Seminars, summer schools, conferences or research Name and place Year ________________________________________________________________________________ SCIENTIFIC EMPLOYMENT AND ACADEMIC RESPONSIBILITY Research Institution or University Period of duty Academic Name and place From to responsibilities Present employment and duties, and foreseen employment upon return to home country after the activity: ________________________________________________________________________________ Have you participated in past ICTP activities: Yes ____ No If yes, which? Mention briefly your previous research experience, and explain your reasons for wishing to participate in this activity: NB: Our Scientific Information System keeps track of all applications made by the candidate to earlier ICTP activities. As a consequence, when the subject of the present activity is far from his/her previous applications, an explanation (not more than 200 words) of his/her change of interest should be included. ________________________________________________________________________________ Kindly supply (strictly within indicated lengths) a keyword description of your current scientific activities as follows: 1) Area of research (e.g. statistical physics, information theory): _______________________ (no more than 15 characters) 2) Specific topic of interest (e.g. learning theory, neural coding): ________________________________________________ (no more than 30 characters) - 2 Present field of interest (please indicate on the list below - up to 5 fields - underlining primary field) 10. PHYSICS OF CONDENSED MATTER 60. PHYSICS TEACHING 11. Solid State Physics 61. English 12. Atomic and Molecular Physics 62. French 13. Materials Science 63. Spanish 14. Surfaces and Interfaces 64. Arab 15. Statistical Physics 16. Computational Physics in Condensed Matter 80. MISCELLANEOUS 20. PHYSICS OF HIGH AND INTERMEDIATE ENERGIES 81. Others 82. Digital Communications and Computer Networking 21. High Energy and Particle Physics 22. Relativity, Cosmology and Astrophysics 23 Plasma Physics 90. PHYSICS OF THE LIVING STATE 24. Nuclear Physics 91. Neurophysics 92. Biophysics 30. MATHEMATICS 93. Medical Physics 31. Applicable Mathematics including: - Mathematical Ecology, - Systems Analysis, - Mathematical Economy - Mathematics in Industry AO. APPLIED PHYSICS 33. Algebra 34. Geometry A1. Physics in Industry 35. Topology A2. Microelectronics 36. Differential Equations A3. Fibre Optics for Communications 37. Analysis A4. Instrumentation 38. Mathematical Physics A5. Synchrotron Radiation A6. Non-destructive Evaluation A7. Lasers 40. PHYSICS AND ENERGY AA. Applied Superconductivity 41. Physics of Nuclear Reactors 42. Physics of Controlled Fusion 43. Non-Conventional Energy (Solar, Wind and others) 50. PHYSICS AND ENVIRONMENT B1. SPACE PHYSICS 51. Solid Earth Geophysics 52. Soil Physics 53. Climatology and Meteorology 54. Physics of the Oceans 55. Physics of Desertification 56. Physics of the Atmosphere, Troposphere Magnetosphere, Aeronomy 57. Environmental Monitoring and Remote Sensing ________________________________________________________________________________ List your scientific publications including books and articles (authors, title, Journal) in the period 1992-1999 - 3 - Please respond to the following questions regarding your expertise with computers. 
1) Which is the operating system you use most often?
   _____ DOS - Windows   _____ Unix   _____ VMS   _____ MacOS   _____ Other (specify)

2) How experienced are you with the Unix environment?
   _____ Very experienced   _____ Experienced   _____ Somewhat familiar   _____ Never used

3) Which programming languages are you familiar with?
                      Very good   Good    Average   Poor
   Fortran 77          _____      _____   _____     _____
   Fortran 90          _____      _____   _____     _____
   C                   _____      _____   _____     _____
   C++                 _____      _____   _____     _____
   Pascal              _____      _____   _____     _____
   Other (specify)     _____      _____   _____     _____

4) Have you written programmes for your research? ____ Yes  _____ No
   If yes, briefly describe for what type of research applications:
   Approximately how many lines had the longest programme (or piece of programme) you ever wrote? _________

5) Do you use computer programmes written by others (including commercial ones) for your research? _____ Yes  _____ No
   If yes, briefly describe for what type of research applications:

Kindly state any positions you hold in the scientific administration of your Institution or any of the national scientific Institutions.

If appropriate, especially for junior physicists, it would be of assistance to the Selection Committee if this request for participation were accompanied by a letter of recommendation.
________________________________________________________________________________
Indicate below your proficiency in the English language:
   Reading:  Good ___  Average ___  Poor ___
   Writing:  Good ___  Average ___  Poor ___
   Speaking: Good ___  Average ___  Poor ___
________________________________________________________________________________
APPLICABLE ONLY FOR CANDIDATES FROM DEVELOPING COUNTRIES
(Important: Please note that very few travel grants are available, and preference in selecting participants will be given to eligible candidates who can guarantee travel coverage by own local sources).
Please tick as appropriate:
- I can definitely find complete travel funds from local sources ____
  or
- I can definitely find half my travel funds from local sources ____
I am requesting financial support from the ICTP for:
- Half travel ____
- Full travel ____
- Living allowance ____
I am NOT requesting financial support from the ICTP ___
I certify that if granted funds for my travel I will attend the whole activity
...................................................
Signature
____________________________________________________________________________
I certify that the statements made by me above are true and complete. If accepted, I undertake to refrain from engaging in any political or other activities which would reflect unfavourably on the international status of the Centre. I understand that any breach of this undertaking may result in the termination of the arrangements relating to my visit at the Centre.

_________________________________________        ______________________
Signature of candidate                            Date

From mccallum at sandbox.jprc.com Mon Jan 11 19:15:50 1999 From: mccallum at sandbox.jprc.com (Andrew McCallum) Date: Mon, 11 Jan 1999 19:15:50 -0500 Subject: ML Papers & Cora: Two search engines for postscript papers Message-ID: <199901120015.TAA12092@sandbox.jprc.com>

We are pleased to announce the availability of two search engines for Postscript papers on the Web. "ML Papers" provides access to Machine Learning papers. "Cora" provides access to papers on computer science as a whole. Both allow keyword searches over partial text of postscript-formatted papers they have found by spidering the Web.
* For ML Papers: http://gubbio.cs.berkeley.edu/mlpapers/
* For Cora: http://www.cora.justresearch.com

About ML Papers: "ML Papers", first released in 1997, is a search engine that automatically extracts titles, authors and abstracts from postscript papers found on the Web; it was (to our knowledge) the first of its kind. Its index currently consists of about 12,000 postscript papers, mostly related to Machine Learning, Data Mining, Statistics, etc., and a Web interface provides search functionality over them. Links to postscript files and their referring pages are returned in response to queries. Titles/authors/abstracts of these papers are also displayed. You can see ML Papers at http://gubbio.cs.berkeley.edu/mlpapers/

"ML Papers," which was recently moved from MIT to UC Berkeley, was created by Andrew Ng. Its companion "Vision Papers" search engine can also be accessed at http://www.ai.mit.edu/people/ayn/cgi/vpapers.

About Cora: "Cora" provides access to over 50,000 research papers on all computer science subjects. Search queries can include special operators such as +, -, "", title:, author:, reference:, and url: (all with their typical meanings). Citation references have been processed to provide forward and backward crosslinks---showing both (1) papers referenced by the current paper, and (2) papers that reference the current paper. References have also been parsed in order to provide automatically-generated BibTeX entries. The papers are categorized into a "Yahoo-like" topic hierarchy with 75 leaves. In the near future, the citation structure will be analyzed in order to automatically identify seminal and survey articles in each category. Cora is at http://www.cora.justresearch.com

"Cora" is the result of a continuing research project at Just Research, led by Andrew McCallum with interns Kamal Nigam, Jason Rennie and Kristie Seymore. Just Research is the U.S. research organization of Justsystem Corporation, the leading independent software company in Japan, and is located near the Carnegie Mellon campus. A paper describing Cora will be presented at the AAAI Spring Symposium, and can be found at http://www.cs.cmu.edu/~mccallum/papers/cora-aaaiss98.ps.

Feel free to share this announcement with others. Enjoy and please send feedback.

Andrew Ng ang at cs.berkeley.edu "ML Papers"
Andrew McCallum mccallum at justresearch.com, mccallum at cs.cmu.edu "Cora"

From jordan at CS.Berkeley.EDU Tue Jan 12 12:13:32 1999 From: jordan at CS.Berkeley.EDU (Michael Jordan) Date: Tue, 12 Jan 1999 09:13:32 -0800 (PST) Subject: Learning in Graphical Models Message-ID: <199901121713.JAA09319@orvieto.CS.Berkeley.EDU>

The following book is available from MIT Press; see http://mitpress.mit.edu/promotions/books/JORLPS99

LEARNING IN GRAPHICAL MODELS
Michael I. Jordan, Ed.

Graphical models, a marriage between probability theory and graph theory, provide a natural tool for dealing with two problems that occur throughout applied mathematics and engineering--uncertainty and complexity. In particular, they play an increasingly important role in the design and analysis of machine learning algorithms. Fundamental to the idea of a graphical model is the notion of modularity: a complex system is built by combining simpler parts. Probability theory serves as the glue whereby the parts are combined, ensuring that the system as a whole is consistent and providing ways to interface models to data.
Graph theory provides both an intuitively appealing interface by which humans can model highly interacting sets of variables and a data structure that lends itself naturally to the design of efficient general-purpose algorithms. PART I: INFERENCE Robert G. Cowell Uffe Kjaerulff Rina Dechter Michael I. Jordan, Zoubin Ghahramani, Tommi S. Jaakkola, and Lawrence K. Saul Tommi S. Jaakkola and Michael I. Jordan David J. C. MacKay Radford M. Neal PART II: INDEPENDENCE Thomas S. Richardson Milan Studeny and Jirina Vejnarova PART III: FOUNDATIONS FOR LEARNING David Heckerman Radford M. Neal and Geoffrey E. Hinton PART IV: LEARNING FROM DATA Christopher M. Bishop Joachim M. Buhmann Nir Friedman and Moises Goldszmidt Dan Geiger, David Heckerman, and Christopher Meek Geoffrey E. Hinton, Brian Sallans, and Zoubin Ghahramani Michael J. Kearns, Yishay Mansour, and Andrew Y. Ng Stefano Monti and Gregory F. Cooper Lawrence K. Saul and Michael I. Jordan Peter W. F. Smith and Joe Whittaker David J. Spiegelhalter, Nicky G. Best, Wally R. Gilks, and Hazel Inskip Christopher K. I. Williams Adaptive Computation and Machine Learning series 7 x 10, 648 pp. paper ISBN 0-262-60032-3 From berthouz at aidan.etl.go.jp Tue Jan 12 19:40:37 1999 From: berthouz at aidan.etl.go.jp (Luc Berthouze) Date: Wed, 13 Jan 1999 09:40:37 +0900 Subject: Emergence and Development of Embodied Cognition: - Symposium Announcement Message-ID: <199901130040.JAA03476@aidan.etl.go.jp> First International Symposium on Emergence and Development of Embodied Cognition (EDEC99) February 9, 1999. At AIST Tsukuba Research Center, Auditorium. Sponsored by Electrotechnical Laboratory (ETL), AIST, MITI and COE program by STA, Japan. Co-organized by: Dr. Yasuo Kuniyoshi (ETL) and Prof. Rolf Pfeifer (Univ. Zurich) Language: English. Participation: Open to public, limited capacity (140seats). Pre-registration strongly recommended (see our web page). Content: This is a one-day open symposium consisting of invited talks by world leading researchers converging onto the issue of interaction dynamics of embodied cognition from the fields of complex systems, biology, neuroscience, psychology, cognitive science, autonomous agents and robotics. The symposium intends to provide an overview and to demonstrate the importance of interdisciplinary collaborative efforts into this common research issue. Conference home page: All important information is located at http://www.etl.go.jp/etl/robotics/EDEC99/ Pre-Registration: Through our web page, as soon as possible, no later than Feb. 1. Fees: Please pay at the on-site registration desk in cash. Symposium: 1,000yen. (Including a handout and coffee break.) Lunchbox: Fee TBA (Recommended as the cafeteria may be crowded.) Reception: 3,000yen. (Need prior registration.) Symposium secretariats: Registration handling & web page manager: For registration, see our web page. Dr. Luc Berthouze, Email: berthouz at etl.go.jp Humanoid Interaction Lab., Intelligent Systems Division, Electrotechnical Laboratory. Local arrangements: Ms. Yoko Sato, Email: yosato at etl.go.jp Tel.:+81-298-54-5180 Fax.:+81-298-54-5971 Humanoid Interaction Lab., Intelligent Systems Division, Electrotechnical Laboratory. Program 9:00 Registration 9:30 Yasuo Kuniyoshi Opening Address - Interdisciplinary EDEC initiative. 
9:40 Rolf Pfeifer Dynamics, Morphology, and Materials in The Emergence of Cognition 10:15 Esther Thelen Developmental Foundations of Embodied Cognition 10:50 Linda Smith The Task Dynamics of the A not-B Error 11:25 Kazuo Hiraki Prediction, Habituation and Attention in the Development of Spatial Cognition: Eye-Tracking Data of Infants. 12:00 Lunch 13:00 Shoji Itakura Comparative Cognitive Approach -- Ontogeny, Phylogeny, and 'Robogeny': In the Case of Primate Social Cognition 13:35 Gentaro Taga Complex Systems Approach to Development of Action and Perception of Infants 14:10 Yasuo Kuniyoshi Towards Emergence and Development of Meaningful Interaction Structures through Complex Embodiment - A Humanoid Robot 14:35 Luc Berthouze Emergence of Embodied Interaction: The Internal Dynamics Perspective 15:00 Break 15:15 Olaf Sporns Synthetic Neural Modeling: An Approach to Study the Interaction of Neural Dynamics and Behavior 15:50 Philippe Gaussier From dynamical behaviors to dynamical perception 16:25 Gregor Schoener The Dynamic Field and Its Preshaping: Concepts toward a General Theoretical Framework of Embodied Cognition. 17:00 Takashi Ikegami Simulating a "Theory of Mind" in Coupled Dynamical Recognizers --- Embodiment as Dynamic Interfaces --- 17:35 Closing Discussions 18:00 Symposium Closes 18:30 Reception. List of Presenters 1. Prof. Rolf PFEIFER Director of AI Lab, Computer Science Department, University of Zurich 2. Prof. Esther THELEN Professor, Psychology & Cognitive Science, Indiana University 3. Prof. Linda SMITH Professor, Psychology & Cognitive Science, Indiana University 4. Dr. Kazuo HIRAKI Senior Research Scientist, Information Science Division, Electrotechnical Laboratory 5. Prof. Shoji ITAKURA Associate Professor, Department of Health Sciences, Oita University of Nursing and Health Sciences 6. Prof. Gentaro TAGA Research Assistant Professor, Department of Pure and Applied Sciences, The University of Tokyo 7. Dr. Yasuo KUNIYOSHI Senior Research Scientist, Intelligent Systems Division, Electrotechnical Laboratory 8. Dr. Luc BERTHOUZE Research Scientist, Intelligent Systems Division, Electrotechnical Laboratory 9. Dr. Olaf SPORNS Senior Fellow in Theoretical and Experimental Neurobiology, The Neurosciences Institute 10. Prof. Philippe GAUSSIER Professor, The Image and Signal Processing Lab, ENSEA - The Cergy Pontoise University 11. Prof. Gregor SCHOENER Director, National Research Center of Cognitive Neuroscience, CNRS 12. Prof. Takashi IKEGAMI Associate Professor, Institute of Physics, College of Arts and Sciences, The University of Tokyo --------- Dr. Luc Berthouze, Research Scientist, Intelligent Systems Division Electrotechnical Laboratory (ETL), AIST, MITI, Japan. Tel.+81-298-54-5369 Fax.+81-298-54-5971 1-1-4 Umezono, Tsukuba 305-8568, Japan. Email: berthouz at etl.go.jp http://www.etl.go.jp/~berthouz From aslin at cvs.rochester.edu Wed Jan 13 11:19:10 1999 From: aslin at cvs.rochester.edu (Richard N. Aslin) Date: Wed, 13 Jan 1999 11:19:10 -0500 Subject: postdoc positions at University of Rochester Message-ID: POSTDOCTORAL FELLOWSHIPS, UNIVERSITY OF ROCHESTER. The Department of Brain and Cognitive Sciences seeks two outstanding postdoctoral fellows with research interests in learning and/or developmental cognitive science. One fellowship is affiliated with a NIH training grant in Learning, Development, and Behavior, and the other fellowship is affiliated with a NSF grant on Learning and Intelligent Systems. 
Supervising faculty for both fellowships work on problems of learning and development using behavioral, computational, and neurobiological approaches. Candidates should have prior background in at least one of these approaches and an interest in working collaboratively in a highly interdisciplinary setting. Several faculty have special interest in statistical learning in the domains of language and perception, although a commitment to this interest is associated only with the NSF fellowship. The NIH fellowship is open only to US citizens or permanent residents. Applicants should send a letter describing their graduate training and research interests, a curriculum vitae, and arrange to have three letters of recommendation sent to: Professor Richard N. Aslin, Department of Brain and Cognitive Sciences, Meliora Hall, University of Rochester, Rochester, NY 14627-0268. Review of applications will begin on February 15, 1999 and continue until one or both positions are filled, with an expected start date of June/August, 1999. Applicants can learn about the department, its faculty, and the opportunities for training by referring to our Web page (http://www.bcs.rochester.edu). Applications from women and members of underrepresented minority groups are especially welcome. The University of Rochester is an Equal Opportunity Employer. -------------------------------------------------------- Richard N. Aslin Department of Brain and Cognitive Sciences and the Center for Visual Science Meliora Hall University of Rochester Rochester, NY 14627 email: aslin at cvs.rochester.edu FAX: (716) 442-9216 Office: (716) 275-8687 http://www.cvs.rochester.edu/people/r_aslin/r_aslin.html From jagota at cse.ucsc.edu Thu Jan 14 14:49:58 1999 From: jagota at cse.ucsc.edu (Arun Jagota) Date: Thu, 14 Jan 1999 11:49:58 -0800 (PST) Subject: new survey e-publication Message-ID: <199901141949.LAA29667@arapaho.cse.ucsc.edu> New refereed e-publication action editor: John Shawe-Taylor T. B. Ludermir, A. de Carvalho, A. P. Braga, M. C. P. de Souto, Weightless neural models: a review of current and past works Neural Computing Surveys 2, 41--61, 1999. 108 references. http://www.icsi.berkeley.edu/~jagota/NCS Abstract: This paper presents a survey of a class of neural models known as Weightless Neural Networks (WNNs). As the name suggests, these models do not use weighted connections between nodes. Instead, a different kind of neuron model, usually based on RAM memory devices, is used. In the literature, the terms ``RAM-based'' and ``n-tuple based'' systems are also commonly used to refer to WNNs. WNNs are being widely investigated, motivating relevant applications and two international workshops in the last few years. The paper describes the most important works in WNNs found in the literature, pointing out the challenges and future directions in the area. A comparative study between weightless and weighted models is also presented. From petridis at eng.auth.gr Mon Jan 18 05:41:38 1999 From: petridis at eng.auth.gr (Vassilis Petridis) Date: Mon, 18 Jan 1999 12:41:38 +0200 Subject: A new paper on clustering disparate data Message-ID: <36A30FE2.388413BD@vergina.eng.auth.gr> Dear Connectionists, We would like to inform you that the paper V. Petridis, and V.G. Kaburlasos, "Fuzzy Lattice Neural Network (FLNN): A Hybrid Model for Learning", IEEE Transactions on Neural Networks, vol. 9, no. 5, September 1998, pp. 
877-890, can be accessed at http://skiron.control.ee.auth.gr/post1990.html It has already been stated in the abstract of our above paper that the FLNN draws on Carpenter-Grossberg's Adaptive Resonance Theory (ART) as well as it draws on Simpson's Min-Max neurocomputing principles. Nevertheless we are going to point out herein that the FLNN is not merely a modified version of the previous well known neural paradigms. The important difference is that the FLNN is applicable to data types with the structure of a mathematical lattice. As a consequence, the FLNN is applicable apart from the conventional Euclidean space to other domains as well. For instance in the above paper we demonstrate a learning example in the domain of fuzzy sets over a universe of discourse. We treat other domains in forthcoming publications. Learning and decision-making by the FLNN can both make common sense and be subject to rigorous mathematical analysis. FLNN is a specific scheme within the framework of fuzzy lattices (or, FL-framework) which is presented briefly in the above paper. Apart from (possible) interest in FLNN's "off the mainstream" mathematics, there exists a significant practical potential underlying the employment of the FLNN. That is the capacity to treat jointly and rigorously disparate data. For example the FLNN can treat simultaneously - and with rigour - such disparate data as real numbers, fuzzy sets, propositional statements, symbols, etc. An example of processing disparate data is information filtering and retrieval from the web. We point out that to the best of our knowledge this is a unique capacity of the FLNN alone. In this sense the FLNN can emulate human's capacity for processing jointly disparate data. Comments are welcome. --------------------------------------------------------------------- Professor Vassilios Petridis Dept. of Electrical and Computer Eng. Faculty of Engineering Aristotle University of Thessaloniki GR54006 Thessaloniki, GREECE --------------------------------------------------------------------- email: petridis at vergina.eng.auth.gr phone: +3031 996331 fax : +3031 996367 web : http://control.ee.auth.gr/ From maass at igi.tu-graz.ac.at Mon Jan 18 13:24:10 1999 From: maass at igi.tu-graz.ac.at (Wolfgang Maass) Date: Mon, 18 Jan 1999 19:24:10 +0100 Subject: Book on Pulsed Neural Networks Message-ID: <36A37C4A.D1BC3AAF@igi.tu-graz.ac.at> The following book has just appeared at MIT-Press: PULSED NEURAL NETWORKS edited by Wolfgang Maass and Christopher M. Bishop Contributors: Peter S. Burge, Stephen R. Deiss, Rodney J. Douglas, John G. Elias, Wulfram Gerstner, Alister Hamilton, David Horn, Axel Jahnke, Richard Kempter, Wolfgang Maass, Alessandro Mortara, Alan F. Murray, David P. M. Northmore, Irit Opher, Kostas A. Papathanasiou, Michael Recce, Barry J. P. Rising, Ulrich Roth, Tim Sch?nauer, Terrence J. Sejnowski, John Shawe-Taylor, Max R. van Daalen, J. Leo van Hemmen, Philippe Venier, Hermann Wagner, Adrian M. Whatley, Anthony M. Zador. Most artificial neural network models are inspired by models for biological neural systems where the output of a neuron is encoded exclusively in its firing rate: The output of a computational unit in an artificial neural network is a (static) binary or continuous variable that may be viewed as a representation (or abstraction) of the current firing rate of a biological neuron. 
In recent years, however, data from neurobiological experiments have made it increasingly clear that biological neural networks, which communicate through pulses (called action potentials or spikes), also use the timing of these pulses to transmit information and to perform computation. This realization has stimulated a significant growth of research activity in the area of pulsed neural networks ranging from neurobiological modeling and theoretical analyses, to algorithm development and hardware implementations. Obviously quite different theoretical tools and models have to be developed for this purpose, since almost all traditional computational models (including most artificial neural network models) are based on the assumption that the timing of atomic computational events does not depend in an essential way on the input to the computation (an example is the common assumption that parallel computation steps are synchronized, another example is the assumption that their timing is largely stochastic). For implementations in novel electronic hardware artificial pulsed neural networks offer the possibility to create intriguing combinations of ideas from analog and digital circuits: a pulse has a stereotyped form, hence it may be viewed as a digital signal. On the other hand the timing of a pulse may encode an analog variable. The research reported in this book is motivated both by the desire to enhance our understanding of information processing in biological networks, as well as by the goal of developing new information processing technologies. Our aim in producing this book has been to provide a first comprehensive treatment of the field of pulsed neural networks, which will be accessible to researchers from diverse disciplines such as electrical engineering, signal processing, computer science, physics, and computational neuroscience. By virtue of its pedagogical emphasis, it will also find a place in many of the advanced undergraduate and graduate courses in neural networks now taught in many universities. Terrence J. Sejnowski's foreword, "Neural Pulse Coding," presents an overview of the topic. The first half of the book consists of longer tutorial articles spanning neurobiology, theory, algorithms, and hardware. The second half contains a larger number of shorter research chapters that present more advanced concepts. The contributors use consistent notation and terminology throughout the book. 408 pp., 195 illus., cloth ISBN 0-262-13350-4 MIT-Press; A Bradford Book For further information on this book visit http://www.cis.tu-graz.ac.at/igi/maass/PNN.html MIT-Press catalogue: http://mitpress.mit.edu/promotions/books/MAAPHS99 Amazon bookstore ordering information: http://www.amazon.com/exec/obidos/ASIN/0262133504/qid%3D916171995/002-1728549-3620249 -------------------------------------------------------------------------- Christopher M. Bishop is Senior Researcher at Microsoft Research, Cambridge, and Professor of Computer Science at the University of Edinburgh. Wolfgang Maass is Professor at the Institute for Theoretical Computer Science, Technische Universit?t Graz, Austria. 
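To make the coding idea in the announcement above concrete: in a time-to-first-spike code, a stronger analog input drives a neuron to threshold sooner, so the latency of its stereotyped pulse carries the value. The sketch below is a generic leaky integrate-and-fire illustration of that point only; it is not code from the book, and all parameter values are arbitrary textbook choices.

# Generic leaky integrate-and-fire sketch of time-to-first-spike coding.
# Not taken from the book; all parameter values are arbitrary.

def time_to_first_spike(i_input, tau=20.0, v_thresh=1.0, dt=0.1, t_max=200.0):
    """Integrate dV/dt = (-V + i_input) / tau until V crosses v_thresh.
    Returns the latency of the first spike in ms, or None if none occurs."""
    v, t = 0.0, 0.0
    while t < t_max:
        v += dt * (-v + i_input) / tau   # Euler step of the leaky integration
        t += dt
        if v >= v_thresh:                # threshold crossing -> emit a pulse
            return t
    return None                          # input too weak ever to spike

# The pulse itself is stereotyped ("digital"), but its latency carries the
# analog input value: stronger inputs reach threshold earlier.
for current in (1.1, 1.5, 2.0, 4.0):
    latency = time_to_first_spike(current)
    print(f"input {current:3.1f} -> first spike at {latency:5.1f} ms")

Latency coding is only one of several ways pulse timing can carry information; it is used here simply because it is the easiest to simulate in a few lines.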
From wsenn at iam.unibe.ch Tue Jan 19 05:17:47 1999 From: wsenn at iam.unibe.ch (Walter Senn) Date: Tue, 19 Jan 1999 10:17:47 +0000 Subject: Position Announcements Message-ID: <36A45BCB.6E351A6E@iam.unibe.ch> POSTDOC AND PhD POSITIONS IN COMPUTATIONAL NEUROSCIENCE The University of Bern, Switzerland, is building an interdisciplinary research group in Computational Neuroscience supported by the Department of Physiology and the Institute of Mathematics. As part of this effort, several positions will become available on 1/4/1999: 1 Post-doctoral position in Computational Neuroscience (up to 3 years) 1 Post-doctoral position in Experimental Neuroscience (up to 3 years) 1 Pre-doctoral (Ph.D.) position in Computational Neuroscience (4 years) The theoretical work of the existing group focuses on short-term synaptic adaptation, single neuron computation and dynamics of networks of spiking neurons. The recent experimental studies at the Physiological Institute concern LTP/LTD experiments on activity-dependent cortical synapses, dendritic signal processing and the culturing of networks on multi-electrode arrays. Candidates with research activities in one of these fields are encouraged to apply, but strong candidates with research in a related topic will also be considered. For additional information see http://pylp76.unibe.ch/~fniwww/jobs/NCjobsMore.html . Please send a letter of application along with CV, publication list, brief statement of current research, and two letters of recommendation before February 20, 1999, to Dr. Walter Senn Department of Physiology University of Bern Bhlplatz 5 CH-3012 Bern, Switzerland e-mail:wsenn at iam.unibe.ch FAX ++41 31 631 46 11 From bricolo at sissa.it Tue Jan 19 08:15:22 1999 From: bricolo at sissa.it (Emanuela Bricolo) Date: Tue, 19 Jan 1999 14:15:22 +0100 (NFT) Subject: pre and postdoctoral positions Message-ID: <199901191315.OAA89766@shannon.sissa.it> A non-text attachment was scrubbed... Name: not available Type: text Size: 3015 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/63c8d4c7/attachment-0001.ksh From gert at cogsci.ed.ac.uk Tue Jan 19 14:17:06 1999 From: gert at cogsci.ed.ac.uk (gert@cogsci.ed.ac.uk) Date: Tue, 19 Jan 1999 19:17:06 +0000 (GMT) Subject: CFP: Workshop Biologically Inspired Machine Learning Message-ID: <8505.199901191917@finlay.cogsci.ed.ac.uk> A non-text attachment was scrubbed... Name: not available Type: text Size: 4004 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/4c09234c/attachment-0001.ksh From oreilly at grey.colorado.edu Tue Jan 19 20:00:43 1999 From: oreilly at grey.colorado.edu (Randall C. O'Reilly) Date: Tue, 19 Jan 1999 18:00:43 -0700 Subject: TR on Hippocampus and Neocortex Message-ID: <199901200100.SAA21605@grey.colorado.edu> The following technical report is now available for downloading as: ftp://grey.colorado.edu/pub/oreilly/papers/oreillyrudy99_hipconj_tr.ps - Randy Conjunctive Representations in Learning and Memory: Principles of Cortical and Hippocampal Function Randall C. O'Reilly and Jerry W. Rudy Department of Psychology University of Colorado Boulder, CO 80309 ICS Technical Report 99-01 Abstract: We present a theoretical framework for understanding the roles of the hippocampus and neocortex in learning and memory. 
This framework incorporates a theme found in many theories of hippocampal function: that the hippocampus is responsible for developing conjunctive representations binding together stimulus elements into a unitary representation that can later be recalled from partial input cues. This idea appears problematic, however, because it is contradicted by the fact that hippocampally lesioned rats can learn nonlinear discrimination problems that require conjunctive representations. Our framework accommodates this finding by establishing a principled division of labor between the cortex and hippocampus, where the cortex is responsible for slow learning that integrates over multiple experiences to extract generalities, while the hippocampus performs rapid learning of the arbitrary contents of individual experiences. This framework shows that nonlinear discrimination problems are not good tests of hippocampal function, and suggests that tasks involving rapid, incidental conjunctive learning are better. We implement this framework in a computational neural network model, and show that it can account for a wide range of data in animal learning, thus validating our theoretical ideas, and providing a number of insights and predictions about these learning phenomena.

Dr. Randall C. O'Reilly
Assistant Professor
Department of Psychology           Phone: (303) 492-0054
University of Colorado Boulder     Fax:   (303) 492-2967
Muenzinger D251C                   Home:  (303) 448-1810
Campus Box 345                     email: oreilly at psych.colorado.edu
Boulder, CO 80309-0345             www:   http://psych.colorado.edu/~oreilly

From sahami at Robotics.Stanford.EDU Tue Jan 19 22:01:39 1999 From: sahami at Robotics.Stanford.EDU (Mehran Sahami) Date: Tue, 19 Jan 1999 19:01:39 -0800 (PST) Subject: PhD Thesis on Machine Learning/Information Access Message-ID: <199901200301.TAA28996@luminous.Stanford.EDU>

[Apologies if you receive this more than once.]

Dear colleagues,

I am very pleased to announce the availability of my PhD thesis, entitled "Using Machine Learning to Improve Information Access", at the following URL: http://robotics.stanford.edu/users/sahami/papers-dir/thesis.ps

The dissertation examines the use of novel clustering, feature selection and classification algorithms applied to text data (as well as some non-text domains). It also presents a working system, SONIA, that makes use of these technologies to enable the automatic topical organization of retrieval results. The table of contents and a more detailed abstract are appended below.

Best,
Mehran

------------------+---------------------------------- Mehran Sahami | http://xenon.stanford.edu/~sahami Systems Scientist | phone: (650) 496-2399 Epiphany, Inc.
| http://www.epiphany.com ------------------+---------------------------------- ---------------------------------------------------------------------- Using Machine Learning to Improve Information Access Part I: Preliminaries Chapter 1: Introduction 1.1 Challenges of Information Access 1.2 System Overview 1.3 Reader's Guide Chapter 2: Document Representation 2.1 Defining a Vector Space 2.2 Controlling Dimensionality Chapter 3: Probabilistic Framework 3.1 Bayesian Networks 3.2 Machine Learning Overview Chapter 4: Related Work in Information Access 4.1 Probabilistic Retrieval 4.2 Feature Selection for Text 4.3 Document Clustering 4.4 Document Classification Part II: Clustering Chapter 5: Feature Selection for Clustering 5.1 Introduction 5.2 Mixture Modeling Revisited 5.3 Theoretical Underpinnings 5.4 Feature Selection Algorithms 5.5 Empirical Results 5.6 Conclusions Chapter 6: A New Model for Document Clustering 6.1 Introduction 6.2 Probabilistic Document Overlap 6.3 Clustering Algorithms 6.4 Results 6.5 Comparison With Mixture Modeling 6.6 Conclusion Part III: Classification Chapter 7: Feature Selection for Classification 7.1 Introduction 7.2 Theoretical Framework 7.3 An Approximate Algorithm 7.4 Initial Results on Non-Text Domains 7.5 Results on Text Domains 7.6 Conclusions Chapter 8: Limited Dependence Bayesian Classifiers 8.1 Introduction 8.2 Probabilistic Classification Models 8.3 The KDB Algorithm 8.4 Initial Results on Non-Text Domains 8.5 Results on Text Domains 8.6 Conclusions and Related Work Chapter 9: Hierarchical Classification 9.1 Introduction 9.2 Hierarchical Classification Scheme 9.3 Results 9.4 Extensions to Directed Acyclic Graphs 9.5 Conclusions Part IV: Putting It All Together Chapter 10: SONIA -- A Complete System 10.1 Introduction 10.2 SONIA on the InfoBus 10.3 A Component View SONIA 10.4 Examples of System Usage 10.5 Conclusions Chapter 11: Conclusions and Future Work 11.1 Where Have We Been? 11.2 Where Are We Going? ABSTRACT The explosion of on-line information has given rise to many query-based search engines (such as Alta Vista) and manually constructed topic hierarchies (such as Yahoo!). But with the current growth rate in the amount of information, query results grow incomprehensibly large and manual classification in topic hierarchies creates an immense information bottleneck. Therefore, these tools are rapidly becoming inadequate for addressing users' information needs. In this dissertation, we address these problems with a system for topical information space navigation that combines the query-based and taxonomic approaches. Our system, named SONIA (Service for Organizing Networked Information Autonomously), is implemented as part of the Stanford Digital Libraries testbed. It enables the creation of dynamic hierarchical document categorizations based on the full-text of articles. Using probability theory as a formal foundation, we develop several Machine Learning methods to allow document collections to be automatically organized at a topical level. First, to generate such topical hierarchies, we employ a novel probabilistic clustering scheme that outperforms traditional methods used in both Information Retrieval and Probabilistic Reasoning. Furthermore, we develop methods for classifying new articles into such automatically generated, or existing manually generated, hierarchies. 
In contrast to standard classification approaches which do not make use of the taxonomic relations in a topic hierarchy, our method explicitly uses the existing hierarchical relationships between topics, leading to improvements in classification accuracy. Much of this improvement is derived from the fact that the classification decisions in such a hierarchy can be made by considering only the presence (or absence) of a small number of features (words) in each document. The choice of relevant words is made using a novel information theoretic algorithm for feature selection. Many of the components developed as part of SONIA are also general enough that they have been successfully applied to data mining problems in different domains than text. The integration of hierarchical clustering and classification will allow large amounts of information to be organized and presented to users in a individualized and comprehensible way. By alleviating the information bottleneck, we hope to help users with the problems of information access on the Internet. From giacomo at ini.phys.ethz.ch Wed Jan 20 12:23:27 1999 From: giacomo at ini.phys.ethz.ch (Giacomo Indiveri) Date: Wed, 20 Jan 1999 17:23:27 +0000 Subject: TELLURIDE NEUROMORPHIC ENGINEERING WORKSHOP Message-ID: <36A6110F.A2AAC191@ini.phys.ethz.ch> We invite applications for an exciting three week summer workshop on Neuromorphic Engineering that will be held in Telluride, Colorado from Sunday, June 27 to Saturday, July 17, 1999. Details of the workshop and application instructions are at the URL: http://www.ini.unizh.ch/telluride99 -- Avis COHEN (University of Maryland) Rodney DOUGLAS (Institute of Neuroinformatics, UNI/ETH Zurich, Switzerland) Christof KOCH (California Institute of Technology) Terrence SEJNOWSKI (Salk Institute and UCSD) Shihab SHAMMA (University of Maryland) and Timmer HORIUCHI (Johns Hopkins University) Giacomo INDIVERI (Institute of Neuroinformatics, UNI/ETH Zurich, Switzerland) From a.sharkey at dcs.shef.ac.uk Wed Jan 20 10:34:13 1999 From: a.sharkey at dcs.shef.ac.uk (Amanda Sharkey) Date: Wed, 20 Jan 1999 15:34:13 +0000 (GMT) Subject: IEE Colloquium on Condition Monitoring Message-ID: IEE Colloquium/NCAF Meeting : Birmingham, UK, 22nd-23rd APRIL 1999 Condition Monitoring: machinery, external structures and health. --------------------------------------------------------------- Neural computing and other computational intelligence techniques can be usefully employed in the areas of Condition Monitoring and Fault Diagnosis of machines and external structures (e.g bridges), and of Health Monitoring in medicine. Although these areas are usually treated quite separately, they share a number of common issues and solutions, and should benefit from a cross-fertilisation of ideas. The speakers at this two day colloquium will provide comprehensive reviews of recent research and techniques employed in the domains of condition monitoring, health monitoring, and fault diagnosis; with the underlying aim of facilitating an exchange of ideas and solutions. Organisers: Dr Peter Cowley (Rolls-Royce), Dr Amanda Sharkey (University of Sheffield), Dr Keith Worden (University of Sheffield).. 
Provisional Programme: 22nd April 09.00-09.30 Registration and Coffee 09.30-10.00 Welcome and Introduction Dr Peter Cowley Rolls-Royce 10.00-11.00 VIBRO-ACOUSTIC CONDITION MONITORING Professor Czeslaw Cempel Poznan University of Technology 11.00-11.30 Coffee and POSTER Session 11.30-12.30 THE LOS ALAMOS HEALTH MONITORING SURVEY Dr Chuck Farrar Los Alamos National Laboratories 12.30-13.30 Lunch 13.30-14.30 CONDITION MONITORING OF ELECTROMECHANICAL PLANT AND CIVIL STRUCTURES Professor James Penman University of Aberdeen 14.30-15.30 NOVELTY DETECTION IN JET ENGINES Professor Lionel Tarassenko University of Oxford. 15.30-16.00 TEA and POSTER Session 16.00-17.00 FAULT DIAGNOSIS FROM A PROCESS CONTROL PERSPECTIVE Professor Ron Patton University of Hull. Provisional Programme: 23rd April 09.30-10.30 ACOUSTIC EMISSION LOCATION USING FOUR SENSORS Dr. Paul Wells British Aerospace 10.30-11.00 Coffee and POSTER Session 11.00-12.00 FAULT DIAGNOSIS FOR CLOSED-LOOP DRUG INFUSION Professor Derek Linkens and Dr M.F. Abbod University of Sheffield 12.00-13.30 Lunch and POSTER Session 13.30-14.30 TECHNICAL AND MEDICAL CONSULTING SYSTEMS USING BAYES NETS Dr Volker Tresp Siemens 14.30-15.30 WHY I AM NOT A NON-BAYESIAN Professor Mahesan Niranjan University of Sheffield 15.30-16.00 Tea 16.00-17.00 Panel Discussion 17.00 Close To register for the above event (both or either of the days) please contact: Events Office, IEE Savoy Place, London WC2R OBL, tel: +44(0)171 240 1871 ext 2205/6 fax +44 (0)171 497 3633 or email: events at iee.org.uk Those interested in presenting a poster summarising research related to the themes of condition monitoring are invited to submit an abstract. Abstracts will be reviewed, and authors will be notified of acceptance. Deadlines for poster abstracts: March 1st 1999 Method for submitting poster abstracts: email to A.Sharkey at dcs.shef.ac.uk or send a hard copy to Dr Amanda Sharkey, Department of Computer Science, Regent Court, Portobello Rd, University of Sheffield, S1 4DP Notification of acceptance of poster abstract: 2nd April 1999 From Alexander.Riegler at univie.ac.at Wed Jan 20 09:19:50 1999 From: Alexander.Riegler at univie.ac.at (Alexander Riegler) Date: Wed, 20 Jan 1999 15:19:50 +0100 (MEZ) Subject: CFP: New Trends in Cognitive Science 99 Message-ID: After the success of the first New Trends in Cognitive Science conference in Vienna, Austria, we are pleased to announce its successor. While in 1997 we focused on the problem of representation (for details see http://www.univie.ac.at/cognition/ntcs97.htm), we will this year put emphasis on the notion of computationalism and its future in the cognitive sciences. Please have a look at the attached Call For Papers or the conference homepage at http://www.univie.ac.at/cognition/conf/ntcs99/ for more information. We are looking forward to welcoming you! Alex Riegler Austrian Society of Cognitive Science New Trends in Cognitive Science 1999 C o m p u t a t i o n a l i s m -- T h e N e x t G e n e r a t i o n International Conference and Workshop Vienna, Austria, May 17-20, 1999 http://www.univie.ac.at/cognition/conf/ntcs99/ Deadline for submissions: February 15, 1999 Invited speakers ---------------- Phil AGRE University of California, Los Angeles Rainer BORN University of Linz Jack B. COPELAND University of Canterbury Adrian CUSSINS University of Illinois, Urbana Stevan HARNAD University of Southampton John HAUGELAND University of Pittsburgh David ISRAEL SRI International Brian C. 
SMITH Indiana University, Bloomington Purpose ------- This international conference and workshop organized by the Austrian Society of Cognitive Science attemps to bring together theorists working on identifying a "successor" notion of computation--one that not only respects the classical (and critical) limiting results about algorithms, grammars, complexity bounds, etc., but that also does justice to real-world concerns of daily computational practice, and thereby offers a much better chance of serving as a possible foundation for a realistic theory of mind. The workshop will focus on the prospects for developing a theory that takes computing not to be not abstract, syntactic, disembodied, isolated, and non-intentional, but concrete, semantic, embodied, interactive, and intentional. If such a successor notion of computation can be defined, the resulting rehabilitated computationalism may still be our best bet for explaining cognition. It is hoped that this conference will set the agenda for a "philosophy of computation" that will tackle such as issues as: the program/process distinction; the notion of implementation and questions of physical realization; real-time constraint and real-world interaction; the use and limitations of models; relations between concrete and abstract; the proper interpretation of complexity results; etc. Addressing such questions is a critical prerequisite for providing a firm foundation for cognitive science in the new century. Paper submission ---------------- Submitted manuscripts should be between 4000 and 5000 words in length and typed doublespaced on one side of plain paper, with wide margins to allow for editorial notes. The first page of the manuscript should only contain the author's name and affiliation address, the article title, and an abstract of about 100-150 words. Each page of the manuscript should be consecutively numbered, including pages of references. References should be listed at the end of the article in alphabetical and chronological order. Notes should be placed at the bottom of each page as footnotes and numbered consecutively. Reviewing will be blind to the identities of the authors, which requires that authors exercise some care not to identify themselves in their papers. 3 hard copies of the manuscript should be sent to either Matthias Scheutz Institut fuer Wissenschaftstheorie Universitaet Wien Sensengasse 8/10 A-1090 Wien AUSTRIA or Matthias Scheutz Department of Computer Science and Engineering University of Notre Dame Notre Dame, IN 46556 USA Conference Site --------------- The conference will take place in the festival hall at the University of Vienna, located in Vienna's historical first district. 
More information ---------------- For details see http://www.univie.ac.at/cognition/conf/ntcs99/ or contact Matthias Scheutz at matthias.scheutz at univie.ac.at From xwu at gauss.Mines.EDU Thu Jan 21 18:19:32 1999 From: xwu at gauss.Mines.EDU (Xindong Wu) Date: Thu, 21 Jan 1999 16:19:32 -0700 (MST) Subject: Knowledge and Information Systems: Vol 1 No 1 (1999) Message-ID: <199901212319.QAA04124@gauss> Knowledge and Information Systems: An International Journal ----------------------------------------------------------- ISSN 0219-1377 by Springer-Verlag Home Page: http://kais.mines.edu/~kais/ ======================================= Volume 1 Number 1 (1999): Table of Contents ------------------------------------------- Regular Papers - Data Preparation for Mining World Wide Web Browsing Patterns, by Robert Cooley, Bamshad Mobasher, and Jaideep Srivastava - Data Mining via Discretization, Generalization and Rough Set Feature Selection, by Xiaohua Hu and Nick Cercone - Towards Automated Case Knowledge Discovery in the M2 Case-Based Reasoning System, by D. Patterson, S.S. Anand, W. Dubitzky, and J.G. Hughes - Learning from Batched Data: Model Combination Versus Data Combination, by Kai Ming Ting, Boon Toh Low, and Ian H. Witten Short Papers - Comparative Evaluation of Two Neural Network Based Techniques for Classification of Microcalcifications in Digital Mammograms, by Brijesh Verma - Managing Null Entries in Pairwise Comparisons, by Waldemar W. Koczkodaj, Michael W. Herman, and Marian Orlowski ---------------------------------------------------------------------- A subscription form and other accepted papers are available on the journal home page (http://kais.mines.edu/~kais/). From bert at mbfys.kun.nl Thu Jan 21 10:48:19 1999 From: bert at mbfys.kun.nl (Bert Kappen) Date: Thu, 21 Jan 1999 16:48:19 +0100 (MET) Subject: Postdoc position available at SNN Nijmegen Message-ID: <199901211548.QAA13514@bertus.mbfys.kun.nl> Post doc position available at SNN, University of Nijmegen, the Netherlands. Background: The group consists of 10 researchers and PhD students and conducts theoretical and applied research on neural networks and Bayesian methods. The group is part of the Laboratory of Biophysics which is involved in experimental brain science. Recent research of the group has focused on theoretical description of learning processes using the theory of stochastic processes and the design of efficient learning rules for Boltzmann machines and other graphical models using techniques from statistical mechanics; the extraction of rules from data and the integration of knowledge and data for modeling; the design of robust methods for confidence estimation with neural networks. Applied research is conducted on computer assisted medical diagnosis and prediction tasks. Since 1997, SNN Nijmegen has founded a company which sells commercial services and products in the field of neural networks, AI and statistics. For more information see also http://www.mbfys.kun.nl/SNN Job specification: The tasks of the post-doc will be to conduct independent research in one of the above areas. In addition, it is expected that the post-doc will initiate novel research and will assist in the supervision of PhD students. The postdoc should have a PhD in physics, mathematics or computer science and a strong theoretical background in neural networks. The post-doc salary will be maximally Dfl. 7396 per month, depending on experience. The position is available for 2 years with possible extension to 4 years. 
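For readers who have not seen them, the "learning rules for Boltzmann machines" mentioned in the background paragraph above are, in their classical textbook form, contrastive updates that match correlations measured with the data clamped against correlations of the free-running network. The sketch below is a generic illustration of that classical rule for a small, fully visible network trained by Gibbs sampling; it is not the SNN group's code, and the toy data and parameters are invented for the example.

# Generic textbook sketch of the classical Boltzmann machine learning rule
# (small fully visible network, +/-1 units, Gibbs sampling).  Not the SNN
# group's code; data and parameters are invented for the illustration.
import numpy as np

rng = np.random.default_rng(0)

def gibbs_sample(W, b, sweeps=200):
    """Draw an approximate sample from the free-running network."""
    n = len(b)
    s = rng.choice([-1.0, 1.0], size=n)
    for _ in range(sweeps):
        for i in range(n):
            field = W[i] @ s + b[i]            # W has zero diagonal
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * field))
            s[i] = 1.0 if rng.random() < p_plus else -1.0
    return s

def train(data, epochs=100, lr=0.1, n_free=10):
    """Contrastive update: dW_ij ~ <s_i s_j>_clamped - <s_i s_j>_free."""
    n = data.shape[1]
    W, b = np.zeros((n, n)), np.zeros(n)
    clamped_corr = data.T @ data / len(data)   # clamped-phase statistics
    clamped_mean = data.mean(axis=0)
    for _ in range(epochs):
        free = np.array([gibbs_sample(W, b) for _ in range(n_free)])
        free_corr = free.T @ free / n_free     # free-phase statistics
        W += lr * (clamped_corr - free_corr)
        np.fill_diagonal(W, 0.0)               # no self-connections
        b += lr * (clamped_mean - free.mean(axis=0))
    return W, b

# Toy data: units 0 and 1 always agree, unit 2 varies independently.
data = np.array([[1, 1, 1], [1, 1, -1], [-1, -1, 1], [-1, -1, -1]], dtype=float)
W, b = train(data)
print(np.round(W, 2))   # expect a clearly positive coupling W[0, 1]

With hidden units the same contrast applies, but the free- and clamped-phase correlations must themselves be estimated or approximated, which is where techniques such as those mentioned in the background paragraph come in.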
Applications: Interested candidates should send a letter with a CV and list of publications before 1997 to dr. H.J. Kappen, Stichting Neurale Netwerken, University of Nijmegen, Geert Grooteplein 21, 6525 EZ Nijmegen. For information contact dr. H.J. Kappen, +31 24 3614241. From bert at mbfys.kun.nl Fri Jan 22 04:36:06 1999 From: bert at mbfys.kun.nl (Bert Kappen) Date: Fri, 22 Jan 1999 10:36:06 +0100 (MET) Subject: postdoc available at SNN Nijmegen (correction) Message-ID: <199901220936.KAA13964@bertus.mbfys.kun.nl> The announcement for a postdoc position at SNN Nijmegen mentions 1997 as deadline for submission of applications. This should be february 20 1999. Sorry for the mess-up. Bert Kappen From fritzke at inf.tu-dresden.de Mon Jan 25 09:00:50 1999 From: fritzke at inf.tu-dresden.de (Bernd Fritzke) Date: Mon, 25 Jan 1999 15:00:50 +0100 Subject: Habilitation thesis on "Vector-based neural networks" available (in german!) Message-ID: <36AC7912.8410FACC@inf.tu-dresden.de> Dear Colleagues, This is to announce the availability of my habilitation thesis "Vektorbasierte Neuronale Netze" (engl: Vector-based neural networks) which is meant to be a comprehensive overview of those neural network architectures which are characterized by prototype vectors (e.g. SOM, Neural Gas, GNG, RBFN) as well as related statistical methods (e.g. LBG, k-means, k-NN). Due to some ancient university law I had to write this thing in German, which is kind of a pity, since this is not quite the general language of science 8v). Those of you who are able to understand German, however, can find the postscript version at http://pikas.inf.tu-dresden.de/~fritzke/papers/habil.ps.gz The thesis has also appeared as book. Details can be found at http://www.shaker.de/Online-Gesamtkatalog/Details.idc?ISBN=3-8265-4458-7 Apologies to the non-German-speaking list members for the bandwidth! Bernd -- Bernd Fritzke http://pikas.inf.tu-dresden.de/~fritzke Neural Computation Group Fax: ++49 351 463-8364 AI-Inst./CS-Dept./Dresden University of Technology Tel: ++49 351 463-8363 From shastri at ICSI.Berkeley.EDU Mon Jan 25 20:22:36 1999 From: shastri at ICSI.Berkeley.EDU (Lokendra Shastri) Date: Mon, 25 Jan 1999 17:22:36 PST Subject: Paper: symbol processing, dynamic bindings, and temporal synchrony Message-ID: <199901260122.RAA03673@lassi.ICSI.Berkeley.EDU> Dear Connectionists, The following preprint may be of interest to you. Best wishes. -- Lokendra Shastri http://www.icsi.berkeley.edu/~shastri/psfiles/shruti_adv_98.ps.gz ------------------------------------------------------------------------------ Advances in SHRUTI: A neurally motivated model of relational knowledge representation and rapid inference using temporal synchrony. Lokendra Shastri International Computer Science Institute Berkeley, CA 94704 Abstract We are capable of drawing a variety of inferences effortlessly, spontaneously, and with remarkable efficiency --- as though these inferences are a *reflex* response of our cognitive apparatus. This remarkable human ability poses a challenge for cognitive science and computational neuroscience: How can a network of slow neuron-like elements represent a large body of systematic knowledge and perform a wide range of inferences with such speed? 
The connectionist model SHRUTI attempts to address this challenge by demonstrating how a neurally plausible network can encode a large body of semantic and episodic facts, systematic rules, and knowledge about entities and types, and yet perform a wide range of explanatory and predictive inferences within a few hundred milliseconds. Relational structures (frames, schemas) are represented in SHRUTI by clusters of cells, and inference in SHRUTI corresponds to a transient propagation of rhythmic activity over such cell-clusters wherein *dynamic bindings* are represented by the synchronous firing of appropriate cells. SHRUTI encodes mappings across relational structures using high-efficacy links that enable the propagation of rhythmic activity, and it encodes items in long-term memory as coincidence and conincidence-error detector circuits that become active in response to the occurrence (or non-occurrence) of appropriate coincidences in the on going flux of rhythmic activity. Finally, ``understanding'' in SHRUTI corresponds to reverberant and coherent activity along closed loops of neural circuitry. Over the past several years, SHRUTI has undergone several enhancements that have augmented its expressiveness and inferential power. This paper describes some of these extensions that enable SHRUTI to (i) deal with negation and inconsistent beliefs, (ii) encode evidential rules and facts, (iii) perform inferences requiring the dynamic instantiation of entities, and (iv) seek coherent explanations of observations. Keywords: knowledge representation; inference; evidential reasoning; dynamic binding; temporal synchrony. To appear in Applied Intelligence. From kyoung at itsa.ucsf.edu Mon Jan 25 20:57:19 1999 From: kyoung at itsa.ucsf.edu (Karl Young) Date: Mon, 25 Jan 1999 17:57:19 -0800 (PST) Subject: Post-Doctoral Research in Novel Brain Imaging Methods Message-ID: Postdoctoral position at the University of California, San Francisco; in the Laboratory of Dr. Michael Weiner. The research involves the development and comparison of statistical classification techniques for medical imaging data. Knowledge of some subset of the following topics is important: feature extraction techniques, such as projection pursuit or principal component analysis, and cluster analysis techniques, such as linear discriminant analysis, CART, neural networks, or Bayesian techniques. While these are important, significant stress will be placed on the development of novel classification techniques based on recently proposed measures of structural complexity. Development of metrics for comparison of the new techniques with more standard ones will be an important component of the research. The candidate will also contribute to the design of a medical image data warehouse that is optimized for use by the automated classification techniques. Some familiarity with medical imaging would be useful but is not required; primary interaction will be with research physicists and mathematicians in Dr. Weiner's group and from the Santa Fe Institute. Send CV and references to: Dr. Karl Young University of California, SF Phone: (415) 750-2158 lab VA Medical Center, MRS Unit (114M) FAX: (415) 668-2864 4150 Clement Street Email: kyoung at itsa.ucsf.edu San Francisco, CA 94121 From rpare at allstate.com Tue Jan 26 18:58:08 1999 From: rpare at allstate.com (Rajesh Parekh) Date: Tue, 26 Jan 1999 17:58:08 -0600 Subject: Grammatical Inference Homepage Message-ID: An embedded and charset-unspecified text was scrubbed... 
Name: not available Url: https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/40905dbd/attachment-0001.ksh From Dave_Touretzky at cs.cmu.edu Wed Jan 27 13:48:27 1999 From: Dave_Touretzky at cs.cmu.edu (Dave_Touretzky@cs.cmu.edu) Date: Wed, 27 Jan 1999 13:48:27 -0500 Subject: CNS*99 submission deadline extended due to computer failure Message-ID: <21284.917462907@skinner.boltz.cs.cmu.edu> If you were trying to make the Tuesday 11:59 pm submission deadline for CNS*99 (the Eighth Annual Computational Neuroscience Meeting, July 18-22, Pittsburgh, PA) and couldn't, don't sweat it. The web server at Caltech was swamped by heavy traffic and was going up and down all day. We will continue to accept submissions through the end of the week. Sorry for the inconvenience. More details about the conference can be found on the conference home page (when the server isn't crashing) at: http://numedeon.com/cns-meetings/CNS99/ -- Dave Touretzky From harnad at coglit.soton.ac.uk Wed Jan 27 15:24:08 1999 From: harnad at coglit.soton.ac.uk (Stevan Harnad) Date: Wed, 27 Jan 1999 20:24:08 +0000 (GMT) Subject: Lexical Processing: BBS Call for Commentators Message-ID: Below is the abstract of a forthcoming BBS target article *** please see also 5 important announcements about new BBS policies and address change at the bottom of this message) *** LEXICAL ENTRIES AND RULES OF LANGUAGE: A MULTIDISCIPLINARY STUDY OF GERMAN INFLECTION by Harald Clahsen This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please send EMAIL to: bbs at cogsci.soton.ac.uk or write to [PLEASE NOTE SLIGHTLY CHANGED ADDRESS]: Behavioral and Brain Sciences ECS: New Zepler Building University of Southampton Highfield, Southampton SO17 1BJ UNITED KINGDOM http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/ ftp://ftp.princeton.edu/pub/harnad/BBS/ ftp://ftp.cogsci.soton.ac.uk/pub/bbs/ gopher://gopher.princeton.edu:70/11/.libraries/.pujournals If you are not a BBS Associate, please send your CV and the name of a BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work. All past BBS authors, referees and commentators are eligible to become BBS Associates. To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. An electronic draft of the full text is available for inspection with a WWW browser, anonymous ftp or gopher according to the instructions that follow after the abstract. _____________________________________________________________ LEXICAL ENTRIES AND RULES OF LANGUAGE: A MULTIDISCIPLINARY STUDY OF GERMAN INFLECTION Harald Clahsen Dept. 
of Linguistics University of Essex Colchester C04 3SQ UK harald at essex.ac.uk http://privatewww.essex.ac.uk/~harald ABSTRACT: It is hypothesized following much work in linguistic theory that the language faculty has a modular structure and consists of two basic components, a lexicon of (structured) entries and a computational system of combinatorial operations to form larger linguistic expressions from lexical entries. This target article provides evidence for the dual nature of the language faculty by describing some recent results from a multidisciplinary investigation of German inflection. We have examined (i) its linguistic representation focussing on noun plurals and verb inflection (participles), (ii) processes involved in the way adults produce and comprehend inflected words, (iii) brain potentials generated during the processing of inflected words and (iv) the way children acquire and use inflection. It will be shown that the evidence from all these sources converges and supports the distinction between lexical entries and combinatorial operations. Our experimental results indicate that adults have access to two distinct processing routes, one accessing (irregularly) inflected entries from the mental lexicon, and another involving morphological decomposition of (regularly) inflected words into stem+affix representations. These two processing routes correspond to the dual structure of the linguistic system. Results from event-related potentials confirm this linguistic distinction at the level of brain structures. In children's language, we found these two processes also to be clearly dissociated; regular and irregular inflection are used under different circumstances, and the constraints under which children apply them are identical to those of the adult linguistic system. Our findings will be explained in terms of a linguistic model, which maintains the distinction between the lexicon and the computational system, but replaces the traditional view of the lexicon as a simple list of idiosyncrasies with the notion of internally structured lexical representations. KEYWORDS: grammar, psycholinguistics, neuroscience of language, child language acquisition, human language processing, development of inflection ____________________________________________________________ To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the World Wide Web or by anonymous ftp from the US or UK BBS Archive. Ftp instructions follow below. Please do not prepare a commentary on this draft. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. 
The URLs you can use to get to the BBS Archive: http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/Archive/bbs.clahsen.html ftp://ftp.princeton.edu/pub/harnad/BBS/bbs.clahsen ftp://ftp.cogsci.soton.ac.uk/pub/bbs/Archive/bbs.clahsen To retrieve a file by ftp from an Internet site, type either: ftp ftp.princeton.edu or ftp 128.112.128.1 When you are asked for your login, type: anonymous Enter password as queried (your password is your actual userid: yourlogin at yourhost.whatever.whatever - be sure to include the "@") cd /pub/harnad/BBS To show the available files, type: ls Next, retrieve the file you want with (for example): get bbs.clahsen When you have the file(s) you want, type: quit ____________________________________________________________ *** FIVE IMPORTANT ANNOUNCEMENTS *** ------------------------------------------------------------------ (1) There have been some very important developments in the area of Web archiving of scientific papers very recently. Please see: Science: http://www.cogsci.soton.ac.uk/~harnad/science.html Nature: http://www.cogsci.soton.ac.uk/~harnad/nature.html American Scientist: http://www.cogsci.soton.ac.uk/~harnad/amlet.html Chronicle of Higher Education: http://www.chronicle.com/free/v45/i04/04a02901.htm --------------------------------------------------------------------- (2) All authors in the biobehavioral and cognitive sciences are strongly encouraged to archive all their papers (on their Home-Servers as well as) on CogPrints: http://cogprints.soton.ac.uk/ It is extremely simple to do so and will make all of our papers available to all of us everywhere at no cost to anyone. --------------------------------------------------------------------- (3) BBS has a new policy of accepting submissions electronically. Authors can specify whether they would like their submissions archived publicly during refereeing in the BBS under-refereeing Archive, or in a referees-only, non-public archive. Upon acceptance, preprints of final drafts are moved to the public BBS Archive: ftp://ftp.princeton.edu/pub/harnad/BBS/.WWW/index.html http://www.cogsci.soton.ac.uk/bbs/Archive/ -------------------------------------------------------------------- (4) BBS has expanded its annual page quota and is now appearing bimonthly, so the service of Open Peer Commentary can now be offered to more target articles. The BBS refereeing procedure is also going to be considerably faster with the new electronic submission and processing procedures. Authors are invited to submit papers to: Email: bbs at cogsci.soton.ac.uk Web: http://cogprints.soton.ac.uk http://bbs.cogsci.soton.ac.uk/ INSTRUCTIONS FOR AUTHORS: http://www.princeton.edu/~harnad/bbs/instructions.for.authors.html http://www.cogsci.soton.ac.uk/bbs/instructions.for.authors.html --------------------------------------------------------------------- (5) Call for Book Nominations for BBS Multiple Book Review In the past, the Behavioral and Brain Sciences (BBS) journal had only been able to do 1-2 BBS multiple book treatments per year, because of our limited annual page quota. BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates and biobehavioral/cognitive scientists in general to nominate books you would like to see accorded BBS multiple book review. (Authors may self-nominate, but books can only be selected on the basis of multiple nominations.)
It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!). From bengioy at IRO.UMontreal.CA Wed Jan 27 10:41:16 1999 From: bengioy at IRO.UMontreal.CA (Yoshua Bengio) Date: Wed, 27 Jan 1999 10:41:16 -0500 Subject: post-doc position in Montreal / statistical data analysis + machine learning Message-ID: <19990127104116.07451@IRO.UMontreal.CA> POST DOCTORAL RESEARCH STAFF MEMBER IN STATISTICAL DATA ANALYSIS AND MACHINE LEARNING FOR HIGH-DIMENSIONAL DATA SETS MATHEMATICS OF INFORMATION TECHNOLOGY AND COMPLEX SYSTEMS (MITACS: a new Canadian Network of Centers of Excellence) Position to be held jointly at the Department of Computer Science & Operations Research and the Department of Mathematics and Statistics, at the UNIVERSITY OF MONTREAL, Quebec, Canada NATURE AND SCOPE OF THE POSITION: A post-doctoral position is available at the University of Montreal within the MITACS network of centers of excellence. The main research area will be the statistical data analysis of high-dimensional data sets with machine learning algorithms, also known as "database mining". The main research questions to be addressed are the following: - how to deal with the "curse of dimensionality": algorithms based on variable selection or based on combining many variables with different importance, while controlling generalization to avoid overfitting; - how to make inferences on the models obtained with such methods, mostly using resampling methods such as the BOOTSTRAP. (A toy illustration of the bootstrap idea is sketched below, just before the contact information.) This research will be performed within the MITACS project entitled "INFERENCE FROM HIGH-DIMENSIONAL DATA". See http://www.iro.umontreal.ca/~bengioy/mitacs.html for more information on the project and http://www.mitacs.math.ca for more information on the MITACS network. The candidate will be working under the supervision of professors Yoshua Bengio (computer science and operations research) and Christian Leger (mathematics and statistics). See http://www.iro.umontreal.ca/~bengioy and http://www.iro.umontreal.ca/~lisa for more information respectively on Yoshua Bengio and his laboratory. See http://www.dms.umontreal.ca/~leger for more information on Christian Leger. ESSENTIAL SKILLS, KNOWLEDGE, AND ABILITIES: Candidates must possess a recent Ph.D. in computer science, statistics, mathematics, or a related discipline, with a research background in machine learning (in particular neural networks) and/or computational statistical methods such as the bootstrap. Candidates must have excellent programming skills, with demonstrated experience. Experience in the following areas will be considered particularly relevant: - Statistical data analysis in general, - bootstrapping methods in particular. - Machine learning algorithms in general, - artificial neural networks in particular. - Programming skills in general, - object-oriented programming, - participation in large-scale, multiple-author software projects, - experience with C, C++ and S-Plus languages, in particular. LENGTH OF EMPLOYMENT: 1 year (with possible renewal for 2 years total), starting as soon as possible.
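As a pointer for the bootstrap-based inference mentioned above: the sketch below is a toy percentile-bootstrap example in modern Python/NumPy (tools that postdate this posting), with invented data and a deliberately trivial statistic. It illustrates only the general resampling idea, not the methods to be developed in the project.

    import numpy as np

    rng = np.random.default_rng(0)

    # Invented toy sample; in practice the quantity of interest would be
    # derived from a fitted model rather than being a simple sample mean.
    data = rng.normal(loc=1.0, scale=2.0, size=200)

    def statistic(sample):
        return sample.mean()

    # Draw bootstrap resamples (sampling with replacement) and recompute
    # the statistic on each one.
    n_boot = 2000
    boot_stats = np.array([
        statistic(rng.choice(data, size=data.size, replace=True))
        for _ in range(n_boot)
    ])

    # Percentile bootstrap 95% confidence interval.
    low, high = np.percentile(boot_stats, [2.5, 97.5])
    print("point estimate: %.3f" % statistic(data))
    print("95%% bootstrap CI: (%.3f, %.3f)" % (low, high))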
FOR FURTHER INFORMATION, PLEASE CONTACT: Yoshua Bengio bengioy at iro.umontreal.ca, 514-343-6804, fax 514-343-5834 or Christian Leger leger at dms.umontreal.ca 514-343-7824, fax 514-343-5700 Electronic applications (preferably as a postscript, raw text, or pdf file) are encouraged, in the form of a Curriculum Vitae with information on academic experience, academic standing, research experience, programming experience, and any other relevant information (e.g., pointer to your web site, if any). -- Yoshua Bengio Professeur agrégé Département d'Informatique et Recherche Opérationnelle Université de Montréal, mailing address: C.P. 6128 Succ. Centre-Ville, Montreal, Quebec, Canada H3C 3J7 street address: 2920 Chemin de la Tour, Montreal, Quebec, Canada H3T 1J8 Tel: 514-343-6804. Fax: 514-343-5834. Bureau 3339. http://www.iro.umontreal.ca/~bengioy http://www.iro.umontreal.ca/labs/neuro From campber at CLEMSON.EDU Wed Jan 27 10:10:06 1999 From: campber at CLEMSON.EDU (Robert L. Campbell) Date: Wed, 27 Jan 1999 10:10:06 -0500 Subject: Conference on Consciousness and Cognition (2nd call; deadline extended) Message-ID: Mind 4 Dublin City University, Dublin, Ireland, August 16-20, 1999 Theme: "Two Sciences of Mind" Confirmed invited speakers include: Bernard Baars David Galin Karl Pribram Stuart Hameroff Kathy McGovern Steven Nachmanovitch Jacob Needleman Program Committee: Bernard Baars Mark Bickhard Robert Campbell Christian de Quincey Stuart Hameroff Paul Mc Kevitt Kathy McGovern Steven Nachmanovitch Jacob Needleman Sean O Nuallain Yoshi Nakamura Max Velmans Terry Winegar Keynote addresses: Jacob Needleman: "Inner and Outer Empiricism in Consciousness Research" Bernard Baars: "The Compassionate Implications of Brain Imaging of Conscious Pain: New Vistas in Applied Cognitive Science." Stream 1: Outer and Inner empiricism in consciousness research This stream will feature papers that attempt to show how "inner" states can be elucidated with reference to external phenomena. "Inner empiricism" designates experience, or qualia. They are shaped (somehow) by brain processes or states which sense and interpret the external phenomena. The physical nature of these processes or states may tell us much about consciousness. Likewise, the argument that we are conscious of only one thing at a time because of the gating action of the nuclei reticularis thalami (Taylor, Baars, etc.) is indicative of the kind of thinking we are trying to encourage. In this vein, pain experience and its imperfect relationship to neural activity are similarly relevant. We particularly welcome papers that feature empirical data, or, lacking these data, show a grasp of the range of disciplines necessary to do justice to the topic. Papers are also invited that - Interpret qualia in terms of a quantum-mechanics based panpsychism (or, in current terms, pan-protopsychism) - Establish links with developments like Whitehead's pan-experientialism and process thought - Interrelate physiological processes at the neural level with current thought in QM - Emphasize "relational empiricism", i.e. second-person considerations - Investigate the brain processes or states giving rise to qualia at whatever level the writer considers appropriate (e.g. intra-cellular cytoskeletal activities and/or quantum-level phenomena). - Involve studies of central pain states as well as other curiosities like allodynia, spontaneous analgesia, pain asymbolia, and hypnotic analgesia.
The invited talks include: David Galin "The Experience of 'Spirit' in Cognitive Terms." Stuart Hameroff "Quantum Computing and Consciousness" Steve Nachmanovitch "Creativity and Consciousness" Each of these talks will be followed by a panel discussion addressing, respectively, consciousness as explored experientially, through scientific investigation, and in the arts. Stream 2: Foundations of Cognitive Science Co-chairs: Sean O Nuallain Dublin City University, Dublin 9, Ireland (sonualla at compapp.dcu.ie) Robert L. Campbell Department of Psychology, Clemson University, Clemson, SC USA (campber at clemson.edu) WHAT THE STREAM IS ABOUT Though deep and contentious questions of theory and metatheory have always been prevalent in Cognitive Science--they arise whenever an attempt is made to define CS as a discipline--they have frequently been downrated by researchers, in favor of empirical work that remains safely within the confines of established theories and methods. Our goal is to redress the balance. We encourage participants in this stream to raise and discuss such questions as: * the adequacy of computationalist accounts of mind * the adequacy of conceptions of mental representation as structures that encode structures out in the environment * the consequences of excluding emotions, consciousness, and the social realm from the purview of cognitive studies * the consequences of Newell and Simon's "scientific bet" that developmental constraints do not have to be studied until detailed models of adult cognition have been constructed and tested * the relationship between cognitive science and formal logic A wide range of theoretical perspectives is welcome, so long as the presenters are willing to engage in serious discussion with the proponents of perspectives that are different from their own: * Vygotskian approaches to culture and cognition * Dynamic Systems theories * Piagetian constructivism * interactivism * neuroscience accounts such as those of Edelman and Grossberg * accounts of emergence in general, and emergent knowledge in particular * perception and action robotics * functional linguistics * genetic algorithms * Information Processing * connectionism * evolutionary epistemology ******************** Contributors will be asked to submit short papers (3000 word limit) in the form of ASCII text files (HTML files are also welcome, but are optional) to Robert Campbell (for stream 2) and Sean O Nuallain (stream 1). The deadline is March 1, 1999. We will email notification of acceptance or rejection by April 1. The standard presentations during the streams will be 20-minute talks and poster sessions. *********** The "MIND" conferences have normally had their proceedings published by John Benjamins. We have already been approached by prospective publishers for Mind 4. All accepted papers and posters will be included in a preprint. Robert L. Campbell Professor, Psychology Brackett Hall 410A Clemson University Clemson, SC 29634-1511 USA phone (864) 656-4986 fax (864) 656-0358 http://hubcap.clemson.edu/~campber/index.html Editor, Dialogues in Psychology http://hubcap.clemson.edu/psych/Dialogues/dialogues.html From bert at mbfys.kun.nl Thu Jan 28 08:37:43 1999 From: bert at mbfys.kun.nl (Bert Kappen) Date: Thu, 28 Jan 1999 14:37:43 +0100 (MET) Subject: Postdoc position available at SNN Nijmegen Message-ID: <199901281337.OAA17607@bertus.mbfys.kun.nl> Postdoc position available at SNN, University of Nijmegen, the Netherlands.
Background: The group consists of 10 researchers and PhD students and conducts theoretical and applied research on neural networks and Bayesian methods. The group is part of the Laboratory of Biophysics, which is involved in experimental brain science. Recent research of the group has focused on the theoretical description of learning processes using the theory of stochastic processes; the design of efficient learning rules for Boltzmann machines and other graphical models using techniques from statistical mechanics; the extraction of rules from data and the integration of knowledge and data for modeling; and the design of robust methods for confidence estimation with neural networks. Applied research is conducted on computer-assisted medical diagnosis and prediction tasks. In 1997, SNN Nijmegen founded a company which sells commercial services and products in the field of neural networks, AI and statistics. For more information see also http://www.mbfys.kun.nl/SNN Job specification: The tasks of the post-doc will be to conduct independent research in one of the above areas. In addition, it is expected that the post-doc will initiate novel research and will assist in the supervision of PhD students. The postdoc should have a PhD in physics, mathematics or computer science and a strong theoretical background in neural networks. The post-doc salary will be maximally Dfl. 7396 per month, depending on experience. The position is available for 2 years with possible extension to 4 years. Applications: Interested candidates should send a letter with a CV and list of publications before February 20, 1999 to Dr. H.J. Kappen, SNN, University of Nijmegen, Geert Grooteplein 21, 6525 EZ Nijmegen. For information contact Dr. H.J. Kappen, +31 24 3614241 or bert at mbfys.kun.nl. From a.sharkey at dcs.shef.ac.uk Thu Jan 28 05:43:45 1999 From: a.sharkey at dcs.shef.ac.uk (Amanda Sharkey) Date: Thu, 28 Jan 1999 10:43:45 +0000 (GMT) Subject: New Book Announcement Message-ID: The following book is now available: COMBINING ARTIFICIAL NEURAL NETS: Ensemble and Modular Multi-Net Systems Perspectives in Neural Computing Series Edited by: Amanda J.C. Sharkey Publisher: Springer-Verlag London Ltd 1999 http://www.springer.co.uk Price: 39.50 pounds UK, 89.95 dollars US. ISBN 1-85233-004-X This volume consists of articles written by leading researchers in the field of Combining Artificial Neural Nets, and as such provides unique coverage of the area. The techniques presented include ensemble-based approaches, where a variety of methods are used to create a set of different nets trained on the same task, and modular approaches, where a task is decomposed into simpler problems. The focus is on the combination of Neural Nets, but many of the methods are applicable to a wider variety of statistical methods. The presentation of techniques is accompanied by analysis and evaluation of their relative effectiveness, and by reports on their application to a variety of problems. Chapters: 1. "Multi-Net Systems" A. Sharkey 2. "Combining Predictors" L. Breiman 3. "Boosting Using Neural Networks" H. Drucker 4. "A Genetic Algorithm Approach for Creating Neural Network Ensembles" D. Opitz and J. Shavlik 5. "Treating Harmful Collinearity in Neural Network Ensembles" S. Hashem 6. "Linear and Order Statistics Combiners for Pattern Classification" K. Tumer and J. Ghosh 7. "Variance Reduction via Noise and Bias Constraints" Y. Raviv and N. Intrator 8. "A Comparison of Visual Cue Combination Models" I. Fine and R. Jacobs 9.
"Model Selection of Combined Neural Nets for Speech Recognition" C. Furlanello, D. Giuliani, S. Merler and E. Trentin 10. "Self-Organised Modular Neural Networks for Encoding Data" S. Luttrell 11. "Mixtures of X" R. Jacobs and M. Tanner Queries: postmaster at svl.co.uk From koenig at iee.et.tu-dresden.de Thu Jan 28 13:05:04 1999 From: koenig at iee.et.tu-dresden.de (Andreas Koenig) Date: Thu, 28 Jan 1999 19:05:04 +0100 Subject: QuickCog Development Environment Message-ID: <199901281805.TAA27909@eeebwm.et.tu-dresden.de> Dear Colleagues, this is to announce the availability of a novel PC based (Windows 95/98/NT) development environment, which serves as rapid and efficient system design tool for image processing and general pattern recognition applications, e.g. for automated visual inspection tasks. Fast and transparent design of technical cognitive systems is provided by the following key features: - Visual programming and sample set oriented processing - Interactive menu-driven creation of arbitrary sample sets, ROI definition/object partitioning and consistent preclassification - Proven methods of image processing, feature computation, classification and evaluation unified in a single system - Feature space visualisation and interactive analysis by dimension reducing mappings and interactive feature maps, assessment functions for system discriminance, method selection, and parameter settings. (The enclosed QuickMine-Toolbox also allows visual data analysis for arbitrary data) - Automatic feature selection by parametric and nonparametric methods - Efficient and easy to use classification methods, including centroid classifier, nearest-neighbor-techniques, artificial neural networks, and a rule-based classifier Currently, the system is available in German only, as it was commercialised on the German market in 1998, but an English version will be available soon. The description of the various included methods can be found in my doctoral thesis from 1995 (http://www.iee.et.tu-dresden.de/~koeniga), also in German. Several publications, that describe the main features of the system, are available in English from the QuickCog home page. Apologies to the non-German speaking list members. More information and a demo-version (for research purposes, restrictions with regard to the full version are negligible) can be obtained from: http://www.iee.et.tu-dresden.de/~koeniga/QuickCog.html Best regards Andreas Koenig -------------------------------------------- Doz. Dr.-Ing. Andreas Koenig Chair of Electronic Devices and Integrated Circuits Faculty of Electrical Engineering, IEE Dresden University of Technology Mommsenstr. 13 01062 Dresden Phone: +49 351 463 2805 Fax : +49 351 463 7260 E-Mail: koenig at iee.et.tu-dresden.de From jbower at bbb.caltech.edu Thu Jan 28 18:26:05 1999 From: jbower at bbb.caltech.edu (James M. Bower) Date: Thu, 28 Jan 1999 15:26:05 -0800 Subject: Postdoctoral Position Message-ID: Postdoctoral Position Available in Neuroimaging and Computational Neuroscience: A postdoctoral position is available in an NINDS-funded research project combining human neuroimaging (e.g., fMRI), healthy and patient population psychophysics, rat electrophysiology (multi-unit recordings), and computational modeling to investigate novel hypotheses about cerebellar function. The project is a collaboration between James Bower (at Caltech) and Lawrence Parsons and Peter Fox (at the Research Imaging Center, University of Texas Health Science Center at San Antonio). 
Term of position is 2-3 years, available immediately. Of particular interest are candidates with a Ph.D. or M.D. who have experience in neuroimaging, experimental psychophysics, or neurology. Qualified women and minority candidates are strongly encouraged to apply. Send CV, three letters of recommendation, and two reprints to either: Professor J. M. Bower, Division of Biology 216-76, California Institute of Technology, Pasadena, CA 91125, USA; or Professor L. M. Parsons, Research Imaging Center, University of Texas Health Science Center, 7703 Floyd Curl Drive, San Antonio, TX 78284, USA. Email inquiries can be made to parsons at uthscsa.edu or jbower at bbb.caltech.edu. For related publications, see: Gao, Parsons, Bower et al., Science 272, 545, 1996; and the chapters by Bower and by Parsons & Fox in Cerebellum & Cognition (J. Schmahmann, Ed., Academic Press, 1997). From zhaoping at gatsby.ucl.ac.uk Fri Jan 29 10:37:25 1999 From: zhaoping at gatsby.ucl.ac.uk (Dr Zhaoping Li) Date: Fri, 29 Jan 1999 15:37:25 GMT Subject: Paper on Computational differences between asymmetrical and symmetrical networks Message-ID: <199901291537.PAA14881@vision.gatsby.ucl.ac.uk> Paper Available at http://www.gatsby.ucl.ac.uk/~zhaoping/papers.html Title: Computational differences between asymmetrical and symmetrical networks Authors: Zhaoping Li and Peter Dayan Abstract: Symmetrically connected recurrent networks have recently been used as models of a host of neural computations. However, biological neural networks have asymmetrical connections, at the very least because of the separation between excitatory and inhibitory neurons in the brain. We study characteristic differences between asymmetrical networks and their symmetrical counterparts in cases for which they act as selective amplifiers for particular classes of input patterns. We show that the dramatically different dynamical behaviours to which they have access often make the asymmetrical networks computationally superior. We illustrate our results in networks that selectively amplify oriented bars and smooth contours in visual inputs. To appear in: Network: Computation in Neural Systems 10(1): 59-77, 1999. From vaina at enga.bu.edu Fri Jan 29 18:37:50 1999 From: vaina at enga.bu.edu (Lucia M. Vaina) Date: Fri, 29 Jan 1999 19:37:50 -0400 Subject: Please post In-Reply-To: <199811111522.QAA06995@anaxagoras> References: (vaina@enga.bu.edu) Message-ID: Graduate Student or Postdoctoral Position in Computational Functional MRI: from Fall 1999 (possibly Summer 1999). This exciting new venture in the Brain and Vision Research Laboratory at Boston University, Department of Biomedical Engineering, involves visualisation of the working (plasticity and restorative plasticity) of the human brain during sensory-motor tasks. Specifically, the postholder will: * establish new protocols of functional imaging on a research MRI scanner, including the stimulus presentation system, software and interfacing. * work with other members of the laboratory to define, establish and run suitable test paradigms on healthy volunteers and patients. * explore the uses of real-time and near-real-time analysis techniques in fMRI studies. * model the changes of functional connectivity of brain activations using structural equation models and functional connectivity models. It is possible that for part of the time the RA will work at an NIH laboratory in Bethesda, MD.
Candidates should have a good background in C programming, mathematics, signal processing and probability, and should be familiar with the Unix environment. Knowledge of human neuroanatomy is a plus. Please send a letter of application along with CV, publication list if available, brief statement of current research and background, and two letters of recommendation before March 10, 1999, to Professor Lucia M. Vaina, Brain and Vision Research Laboratory, Biomedical Engineering Department, College of Engineering, Boston University, 44 Cummington St., Boston, MA 02115, USA, fax: 617-353-6766. Lucia M. Vaina Ph.D., D.Sc. Professor of Biomedical Engineering and Neurology Brain and Vision Research Laboratory Boston University, Department of Biomedical Engineering College of Engineering 44 Cummington St., Room 315 Boston University Boston, MA 02215 USA tel: 617-353-2455 fax: 617-353-6766
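A brief numerical footnote to the Li & Dayan abstract above: one elementary reason symmetric and asymmetric recurrent networks have access to different dynamics is that a real symmetric weight matrix has only real eigenvalues, whereas an asymmetric one (for example, with separate excitatory and inhibitory connections) can have complex eigenvalues and hence oscillatory modes. The Python/NumPy sketch below uses two invented 2x2 matrices to make this point; it is not the model from the paper.

    import numpy as np

    # Symmetric connectivity: W equals its transpose.
    W_sym = np.array([[0.5, 0.3],
                      [0.3, 0.5]])

    # Asymmetric connectivity, loosely in the excitatory/inhibitory style:
    # unit 1 excites unit 2, while unit 2 inhibits unit 1.
    W_asym = np.array([[0.5, -1.0],
                       [1.0, -0.5]])

    for name, W in [("symmetric", W_sym), ("asymmetric", W_asym)]:
        eig = np.linalg.eigvals(W)
        has_oscillatory_modes = np.any(np.abs(np.imag(eig)) > 1e-12)
        print(name, "eigenvalues:", eig,
              "(complex -> oscillatory modes)" if has_oscillatory_modes
              else "(real -> no intrinsic oscillation)")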