From petitot at poly.polytechnique.fr Sun Apr 2 09:12:15 2000 From: petitot at poly.polytechnique.fr (Jean Petitot) Date: Sun, 2 Apr 2000 15:12:15 +0200 Subject: New Book: NATURALIZING PHENOMENOLOGY Message-ID: (Our apologies for multiple copies of this message) The following book is available from Stanford University Press; see http://www.scico.u-bordeaux2.fr/episteme/bbooka/ NATURALIZING PHENOMENOLOGY ISSUES IN CONTEMPORARY PHENOMENOLOGY AND COGNITIVE SCIENCE Edited by Jean Petitot, Francisco J. Varela, Bernard Pachoud, Jean-Michel Roy Stanford University Press This work aims to shed new light on the relations between Husserlian phenomenology and present-day efforts towards a scientific theory of cognition. Its primary goal is not to present a new interpretation of Husserl's writings. Rather, the contributors consider how Husserlian phenomenology might contribute to specific contemporary theories, either by complementing or by questioning them. What clearly emerges is that Husserlian phenomenology cannot become instrumental to cognitive science without undergoing a substantial transformation. Therefore, the book's central concern is not only the progress of contemporary theories of cognition but also the reorientation of Husserlian phenomenology. Because a single volume could not encompass the numerous facets of this wide-ranging interrogation, the contributors focus on the issue of naturalization. This perspective is far-reaching enough to allow for the coverage of a great variety of topics, ranging from the general structures of intentionality, the nature of temporality and perception, and the mathematical modeling of their phenomenological descriptions, to the founding epistemological and ontological principles of cognitive science. "Naturalizing Phenomenology" is thus a collective reflection on the possibility of bringing a naturalized Husserlian phenomenology to bear on a scientific theory of cognition that fills the explanatory gap between the phenomenological mind and brain. CONTENTS 1. Beyond the Gap: An Introduction to Naturalizing Phenomenology Jean-Michel Roy, Jean Petitot, Bernard Pachoud, and Francisco J. Varela PART ONE: INTENTIONALITY, MOVEMENT, AND TEMPORALITY INTENTIONALITY 2. Intentionality Naturalized? David Woodruff Smith 3. Saving Intentional Phenomena: Intentionality, Representation, and Symbol Jean-Michel Roy 4. Leibhaftigkeit and Representational Theories of Perception Elisabeth Pacherie MOVEMENT 5. Perceptual Completion: A Case Study in Phenomenology and Cognitive Science Evan Thompson, Alva Noe, and Luiz Pessoa 6. The Teleological Dimension of Perceptual and Motor Intentionality Bernard Pachoud 7. Constitution by Movement: Husserl in Light of Recent Neurobiological Findings Jean-Luc Petit TEMPORALITY 8. Wooden Iron? Husserlian Phenomenology Meets Cognitive Science Tim Van Gelder 9. The Specious Present: A Neurophenomenology of Time Consciousness Francisco J. Varela PART TWO: MATHEMATICS IN PHENOMENOLOGY FORMAL MODELS 10. Truth and the Visual Field Barry Smith 11. Morphological Eidetics for a Phenomenology of Perception Jean Petitot 12. Formal Structures in the Phenomenology of Motion Roberto Casati PHENOMENOLOGY AND MATHEMATICS 13. Gödel and Husserl Dagfinn Føllesdal 14. The Mathematical Continuum: From Intuition to Logic Giuseppe Longo PART THREE: THE NATURE AND LIMITS OF NATURALIZATION PHILOSOPHICAL STRATEGIES OF NATURALIZATION 15. Naturalizing Phenomenology? Dretske on Qualia Ronald McIntyre 16. The Immediately Given as Ground and Background Juan-José Botero 17.
When Transcendental Genesis Encounters the Naturalization Project Natalie Depraz SKEPTICAL ATTITUDES 18. Sense and Continuum in Husserl Jean-Michel Salanskis 19. Cognitive Psychology and the Transcendental Theory of Knowledge Maria Villela-Petit 20. The Movement of the Living as the Originary Foundation of Perceptual Intentionality Renaud Barbaras HISTORICAL PERSPECTIVES 21. Philosophy and Cognition: Historical Roots Jean-Pierre Dupuy Writing Science series edited by Timothy Lenoir and Hans Ulrich Gumbrecht 7 x 10, 648 pp. ISBN 0-8047-3322-8 (cloth) ISBN 0-8047-3610-3 (pbk) _______________________________________________ The book is published by Stanford University Press http://www.sup.org It is distributed by Cambridge University Press http://www.cup.cam.ac.uk Mail/fax order: FOR: US, Canada, Mexico, Central America Cambridge University Press Distribution Ctr 110 Midland Avenue Port Chester, NY 10573-4930 USA (914)-937-4712 FOR: other countries Cambridge University Press The Edinburgh Building Shaftesbury Road Cambridge CB2 2RU UK (44)-1223-325959 In Paris: available at Librairie VRIN 6 Place de la Sorbonne 75005 Paris, France (33)-(0)1 43 54 32 75 The book is also available through online bookshops such as http://www.amazon.com http://www.amazon.com/uk http://www.Barnesnobles.com http://www.bookshop.blackwell.com http://www.fatbrain.com http://www.Kingbooks.com http://www.1bookstreet.com From terry at salk.edu Mon Apr 3 22:10:48 2000 From: terry at salk.edu (terry@salk.edu) Date: Mon, 3 Apr 2000 19:10:48 -0700 (PDT) Subject: NEURAL COMPUTATION 12:4 Message-ID: <200004040210.TAA26780@hebb.salk.edu> Neural Computation - Contents - Volume 12, Number 4 - April 1, 2000 ARTICLE The Multifractal Structure of Contrast Changes in Natural Images: From eytan at dpt-info.u-strasbg.fr Tue Apr 4 05:15:09 2000 From: eytan at dpt-info.u-strasbg.fr (Michel Eytan) Date: Tue, 4 Apr 2000 11:15:09 +0200 Subject: Paper available In-Reply-To: Message-ID: Thus hath held forth a member of Connectionists list at 31-03-2000 re Paper available: > The following IJCNN'00 paper is accepted and available on-line. > I will be happy about any feedback. [snip] Folks, I have already asked several times to *please* give SIGNIFICANT TITLES to the mails sent to the list. It so happens that I archive some of the mails to this list (and others). Just imagine what happens when I search for a mail and see about 200 of them all with Subject: Paper available... Thank you and sorry for the bother. -- Michel Eytan eytan at dpt-info.u-strasbg.fr I say what I mean and mean what I say From B344DSL at UTARLG.UTA.EDU Tue Apr 4 19:05:53 2000 From: B344DSL at UTARLG.UTA.EDU (B344DSL@UTARLG.UTA.EDU) Date: Tue, 04 Apr 2000 17:05:53 -0600 (CST) Subject: Second edition of neural networks textbook by Levine Message-ID: <01JNUG3U9DKI00506S@UTARLG.UTA.EDU> INTRODUCTION TO NEURAL AND COGNITIVE MODELING SECOND EDITION DANIEL S.
LEVINE LAWRENCE ERLBAUM ASSOCIATES COPYRIGHT 2000 (First edition published by LEA 1991) 491 pages Paperback: ISBN 0-8058-2006-X, $36.00 Cloth: ISBN 0-8058-2005-1, $99.95 From vera at cs.cas.cz Wed Apr 5 16:35:37 2000 From: vera at cs.cas.cz (Vera Kurkova) Date: Wed, 5 Apr 00 16:35:37 CET Subject: 1st call for papers ICANNGA 2001 Message-ID: <59738.vera@uivt1.uivt.cas.cz> **************************************************************** * * * > > > >>> CALL FOR PAPERS <<< < < < * * * **************************************************************** * * * ICANNGA 2001 * * * * 5th International Conference on * * Artificial Neural Networks and Genetic Algorithms * * * * including a special session on * * Computer-Intensive Methods in Control and Data Processing * * * * organized by * * the Institute of Computer Science, * * Academy of Sciences of the Czech Republic * * to be held at * * the Lichtenstein Palace, Prague, Czech Republic * * * * April 22-25, 2001 * * * * * * http://www.cs.cas.cz/icannga * * * **************************************************************** The focus of ICANNGA is on theoretical aspects and practical applications of computational paradigms inspired by natural processes, especially ARTIFICIAL NEURAL NETWORKS and EVOLUTIONARY ALGORITHMS. ICANNGA 2001 will include invited plenary talks, contributed papers, poster session, tutorials and a social program. CONFERENCE TOPICS The following list indicates some areas of interest, but is not exhaustive: * Neural Networks: Architectures, Algorithms, Approximation, Complexity, Biological Foundations, Computational Neuroscience * Evolutionary Computation: Genetic Algorithms, Genetic Programming, Classifier Systems, Artificial Life * Hybrid Systems: Fuzzy Logic, Soft Computing, Neuro-Fuzzy Controllers, Genetic Learning of Neural Networks * Applications: Pattern Recognition, Signal Processing, Control, Simulation, Robotics, Data Mining, Transport, Defense, Security, Environment, Finance and Business ================================================================= We invite contributed papers for ICANNGA 2001 on topics as above. Draft papers will be refereed, and papers accepted for oral presentation and selected papers for poster presentation will appear in the conference proceedings, to be published by Springer. 
Submission of draft versions of papers September 20, 2000 Notification of acceptance January 10, 2001 Delivery of revised papers February 7, 2001 ICANNGA conference April 22-25, 2001 ================================================================= Vera Kurkova, Institute of Computer Science, Prague, Chair of the Organizing Committee and of the Program Committee International Advisory Committee Rudolf Albrecht, University of Innsbruck, Austria Andrej Dobnikar, University of Ljubljana, Slovenia David Pearson, University of Saint Etienne, France Nigel Steele, Coventry University, United Kingdom ================================================================= For more information, please, visit the conference web site at http://www.cs.cas.cz/icannga ================================================================= From stefan.wermter at sunderland.ac.uk Wed Apr 5 10:26:31 2000 From: stefan.wermter at sunderland.ac.uk (Stefan.Wermter) Date: Wed, 05 Apr 2000 15:26:31 +0100 Subject: International emernet workshop on NN and neuroscience Message-ID: <38EB4D16.D2624687@sunderland.ac.uk> Emerging computational neural Network architectures based on neuroscience (EmerNet): International EPSRC Workshop on Current Computational Architectures Integrating Neural Networks and Neuroscience. Date: 8-9 August 2000 Location: Durham Castle, Durham, United Kingdom Workshop web page is http://www.his.sunderland.ac.uk/worksh3 Organising Committee ----------------------- Prof. Stefan Wermter Chair Hybrid Intelligent Systems Group University of Sunderland Prof. Jim Austin Advanced Computer Architecture Group Department of Computer Science University of York Prof. David Willshaw Institute for Adaptive and Neural Computation Division of Informatics University of Edinburgh Call for Papers and Participation -------------------------------- Description and Motivation --------------------------- Although there is a massive body of research and knowledge regarding how processing occurs in the brain this has had little impact on the design and development of computational systems. Many challenges remain in the development of computational systems, such as robustness, learning capability, modularity, massive parallelism for speed, simple programming, more reliability etc. This workshop aims to consider if the design of computational systems can learn from the integration of cognitive neuroscience, neurobiology and artificial neural networks. The main objective is the transfer of knowledge by bringing Together researchers in the twin domains of artificial and real neural networks. The goal is to enable computer scientists to comprehend how the brain processes information to generate new techniques for computation and encourage neuroscientists to consider computational factors when performing their research. Areas of Interest for Workshop -------------------------------- The main areas of interest for the workshop bring together Neural Network Architectures and Neuroscience Robustness: What are the characteristics that enable the human brain to carry on operating despite failure of its elements? How can the brain's slow but robust memory be utilised to replace the brittle but fast memory presently found in conventional computers? Modular construction: How can the brain provide ideas for Bringing together the current small artificial neural networks to create larger modular systems that can solve more complex tasks like associative retrieval, vision and language understanding? 
Learning in context: There is evidence from neuron, network and brain levels that the internal state of such a neurobiological system has an influence on processing and learning. Is it possible to build computational models of these processes and states, and design incremental learning algorithms and dynamic architectures? Synchronisation: How does the brain synchronise its processing when using millions of processors? How can large asynchronous computerised systems be produced that do not rely on a central clock? Timing: Undertaking actions before a given deadline is vital. What structural and processing characteristics enable the brain to deal with real time situations? How can these be incorporated into a computerised approach? Processing speed: Despite having relatively slow computing elements, how is real-time performance achieved? Preliminary Invited Speakers We plan to have around 30 participants, including the invited speakers. -------------------------------------- Dr Jim Fleming - EPSRC Prof. Michael Fourman - University of Edinburgh Prof. Angela Friederici - Max Planck Institute of Cognitive Neuroscience Prof. Stephen Hanson - Rutgers University Prof. Stevan Harnad - University of Southampton Prof. Vasant Honavar - Iowa State University Dr Hermann Moisl - University of Newcastle upon Tyne Prof. Heiko Neumann - Universität Ulm Prof. Günther Palm - Universität Ulm Prof. Kim Plunkett (tbc) - Oxford University Prof. James A. Reggia - University of Maryland Prof. John Taylor - King's College London Workshop Details ------------------- In order to have a workshop of the highest quality, it incorporates a combination of paper presentations by the participants on one of the six areas of interest and more open, discussion-oriented activities. The discussion element of the EmerNet Workshop will be related to the questions above, and it is highly desirable that those wishing to participate focus on one or more of these issues in an extended abstract or position paper of up to 4 pages. Papers should be sent in either ps, pdf or doc format via email for consideration to Professor Stefan Wermter and Mark Elshaw by the 1st of June 2000. THE KEY QUESTION IS: What can we learn from cognitive neuroscience and the brain for building new computational neural architectures? It is intended that registration, meals and accommodation at Durham Castle will be provided free of charge for all participants. Further, specially invited participants will have reasonable travel expenses reimbursed, and additional participants their rail travel costs within the UK. We also plan to have six places for PhD students or recent post-doctorates and encourage applications. Extended versions of papers can be published as book chapters in a book with Springer. Location - Durham Castle ------------------------- The EmerNet Workshop is to be held at Durham Castle, Durham (chosen as it lies between Sunderland, York and Edinburgh) in the North East of England. There are few places in the world that can match the historic City of Durham, with its dramatic setting on a rocky horseshoe bend in the River Wear and beautiful local countryside. Furthermore, it offers easy accessibility by rail from anywhere in Great Britain and is close to the international airport at Newcastle. The workshop provides the chance to stay at a real English castle that was constructed under the orders of King William the Conqueror in 1072, shortly after the Norman Conquest.
It has many rooms of interest including a Norman Chapel that has some of the most fascinating Norman sculptures in existence and the Great Hall that acts as the dining area. Holding the EmerNet Workshop at this excellent location provides the chance for interesting and productive discussion in a peaceful and historic atmosphere. It is possible to gain a flavour of Durham Castle and Cathedral on the on-line tour at http://www.dur.ac.uk/~dla0www/c_tour/tour.html Contact Details --------------- Mark Elshaw (Workshop Organiser) Hybrid Intelligent Systems Group Informatics Centre SCET University of Sunderland St Peter's Way Sunderland SR6 0DD United Kingdom Phone: +44 191 515 3249 Fax: +44 191 515 2781 E-mail: Mark.Elshaw at sunderland.ac.uk Prof. Stefan Wermter (Chair) Informatics Centre, SCET University of Sunderland St Peter's Way Sunderland SR6 0DD United Kingdom Phone: +44 191 515 3279 Fax: +44 191 515 2781 E-mail: Stefan.Wermter at sunderland.ac.uk http://www.his.sunderland.ac.uk/~cs0stw/ http://www.his.sunderland.ac.uk/ From rfrench at ulg.ac.be Fri Apr 7 14:44:35 2000 From: rfrench at ulg.ac.be (Robert French) Date: Fri, 07 Apr 2000 20:44:35 +0200 Subject: Sixth Neural Computation and Psychology Workshop, Liege, Belgium, Sept 16-18 Message-ID: <4.1.20000407202804.00a77250@pop3.mailst.ulg.ac.be> SIXTH NEURAL COMPUTATION AND PSYCHOLOGY WORKSHOP (NCPW6) Connectionist models of evolution, learning and development University of Liege, Liege, Belgium Saturday, September 16 to Monday, September 18, 2000 URL: http://www.fapse.ulg.ac.be/ncpw6/ AIMS AND OBJECTIVES This workshop, the sixth in a series, has the explicit goal of bringing together connectionist modelers, and especially connectionist modelers from Europe, who are primarily focused on some aspect of psychology or neuropsychology. Each year there is a relatively wide-ranging theme for the Workshop. For example, "Neurodynamics and Psychology" was the theme of the first workshop, "Perception", the third, "Connectionist representations" the fourth, and so on. This year the theme will be "Development, learning and evolution." This is a broad topic and intentionally so. Although we aren't interested in, say, connectionist applications to submarine warfare, we will consider all papers that have something to do with the announced topic, even if rather tangentially. The organization of the final program will depend on the submissions received. As in previous years, the Workshop will be reasonably small. There are, for example, no parallel sessions and no poster sessions. And plenty of time will be left for discussion among participants. The atmosphere is designed to be congenial but rigorous. Even though participation in the Workshop is by no means limited to Europeans, one of its explicit goals is to bring together European connectionist modelers interested in psychology or neuropsychology. This is the first year that NCPW is being held on the Continent, a move that was explicitly designed to attract not only the usual contingent of British connectionists, but also our colleagues from other European countries as well. CALL FOR ABSTRACTS The Workshop will last from Saturday, September 16 through the morning of Monday, September 18, 2000. There will be approximately 25-30 paper presentations. Abstracts (approximately 200 words) are due by June 7. Notification of acceptance for a paper presentation will be sent by June 22. The finished paper must be ready by the time of the Workshop.
See the NCPW6 Web page for all details: http://www.fapse.ulg.ac.be/ncpw6/ REGISTRATION, ETC. The registration fee will be 100 euros. (Reductions will be made upon request for a limited number of students, especially students from abroad who also have significant travel expenses.) Included in this fee will be the welcome reception (evening of September 15), coffee breaks, lunches, and a copy of the Workshop Proceedings after the Workshop (we have a tentative publishing agreement with Springer-Verlag, as in the past). Special conference rates have been arranged at a number of hotels, as indicated on the Web page for the Workshop. ORGANIZING COMMITTEE Bob French (rfrench at ulg.ac.be) University of Liege, Belgium, organizer Axel Cleeremans (axcleer at ulb.ac.be) Universite Libre de Bruxelles Nick Chater (Nick.Chater at warwick.ac.uk) University of Warwick Denis Mareschal (d.mareschal at bbk.ac.uk) Birkbeck College, London Jacques Sougné (j.sougne at ulg.ac.be) University of Liège, Belgium CONTACT DETAILS For any problems or questions, please send e-mail to mailto:cogsci at ulg.ac.be From bsrc at neuron.kaist.ac.kr Thu Apr 6 21:53:07 2000 From: bsrc at neuron.kaist.ac.kr (=?EUC-KR?B?vcXHysij?=) Date: Fri, 07 Apr 2000 10:53:07 +0900 Subject: CALL FOR PAPERS - ICONIP 2000 Message-ID: <38ED3F81.166079D2@neuron.kaist.ac.kr> <<< ICONIP 2000 >>> - 7th International Conference on Neural Information Processing Taejon, Korea November 14-18, 2000 Paper Submission Due : May 15, 2000 ICONIP-2000, the annual conference sponsored by the Asia-Pacific Neural Network Assembly (APNNA), will be held in Taejon, Korea, from November 14 to 18, 2000. The theme of the conference, Neural Information Processing, is broad enough to promote wide interactions among researchers in many academic disciplines. The conference will consist of a one-day tutorial, three and a half days of oral and poster presentations, and a half-day historic tour. In addition to mathematical and engineering approaches, we also include cognitive science in the mainstream. The conference topics include but are not limited to: - NEURAL NETWORK MODELS Learning algorithms Neural network architectures Neurodynamics & spiking neuron Statistical neural network models - COGNITIVE SCIENCE Data representation Learning and memory Neurobiological systems Perception, emotion, and cognition Selective attention Vision and auditory models - HYBRID NEURAL SYSTEMS Evolutionary neural systems Fuzzy neural systems Soft computing Symbolic-neural hybrid systems - NEURAL HARDWARE IMPLEMENTATION Analog, digital, and hybrid neuro-chips Artificial retina and cochlear chips DSP and software implementation - NEURAL NETWORK APPLICATIONS Computer vision Data mining Expert systems Finance and electronic commerce Human-computer interaction Intelligent control Natural language processing Pattern recognition Robotics Sensorimotor systems Signal processing Speech recognition Time series prediction Papers should be submitted by 15 May 2000 and registered on the conference web page. They will be reviewed by senior researchers in the field and the authors will be informed about the decision of the review process by 15 July 2000. The accepted papers must be submitted in a camera-ready format by 15 Aug 2000. All accepted papers will be presented at the conference either in an oral or in a poster session and included in the conference proceedings. At least one author for each accepted paper should make advance registration. The conference proceedings will be published as a hardcopy and a CD-ROM.
To facilitate a fast review process and publication of the proceedings, electronic submission of compressed PostScript or PDF files is required. If absolutely unavoidable, send four copies of the paper by mail to the Conference Secretariat. The paper must be written in English. The paper should not exceed six pages when printed on A4-format white paper with 2.5cm margins on all four sides, in a single-spaced two-column single-page format, in Times or similar font of 10 points. Centered at the top of the first page should be the complete title, author(s), mailing and e-mailing addresses, followed by an abstract and the text. Details of the paper format may be found at http://braintech.kaist.ac.kr/ICONIP2000/authors.html. Proposals are solicited for tutorials. The proposals for tutorials should outline the subject area with a brief biography of the organizer, and should be submitted to the Conference Secretariat by May 1, 2000. Proposals for special sessions are solicited. Each proposal for a special session should include the following information: title of the special session, special session organizer and affiliation, subject areas to be covered by the special session, and potential authors. The proposals should be submitted by May 1, 2000 via e-mail to the special sessions committee chair at btzhang at scai.snu.ac.kr. More details on proposals for special sessions can be found at http://braintech.kaist.ac.kr/ICONIP2000/information.html. - May 1, 2000 : Proposals for Tutorials and Special Invited Sessions - May 15, 2000 : Paper Submission - July 15, 2000 : Acceptance Notification - August 15, 2000 : Camera-ready Manuscripts and Advance Registration * Regular registration - US$475 (advance registration), US$550 (on-site) * Student registration - US$200 (advance registration), US$250 (on-site) * Both Regular and Student registration include free decent lunches from November 15th to 18th and a half-day historic tour in the afternoon of November 16th. Free coffee, welcoming reception, and closing reception will also be provided to all registrants. * Regular registration includes both paper and CD-ROM proceedings, while Student registration includes only CD-ROM proceedings. * Additional paper proceedings : US$70 * Additional CD-ROM proceedings: US$30 * Banquet on November 17, 2000 : US$50 Full-time students who will present a paper at ICONIP2000 are eligible to apply for the Student Conference Travelling Fund supported by "Amari/Kasabov and co-authors of the Brain-like Computing Book". More detailed information is available at http://braintech.kaist.ac.kr/ICONIP2000/travellingfund.htm Asia-Pacific Neural Network Assembly (APNNA) Brain Science Research Center, KAIST International Neural Network Society (INNS) IEEE Neural Network Council (IEEE NNC) European Neural Network Society (ENNS) Asian Office of Aerospace Research & Development, US Air Force Office of Scientific Research Brain Science Research Center 3rd Floor, LG Semicon Hall Korea Advanced Institute of Science & Technology 373-1 Kusong-dong, Yusong-ku Taejon, 305-701, Korea Tel: +82-42-869-5431 Fax: +82-42-869-8492 Email : ICONIP2000 at braintech.kaist.ac.kr Web: http://braintech.kaist.ac.kr/ICONIP2000 Conference Chair : Soo-Young Lee KAIST Conference Co-Chairs : Kunihiko Fukushima Univ. of Electro-Comm. Jung-Mo Lee Sungkyunkwan Univ. Program Committee Chair : Seong-Whan Lee Korea University Co-Chairs : Erkki Oja Helsinki Univ. of Tech.
Minoru Tsukada Tamagawa University Secretary : Sung-Bae Cho Yonsei University Members : Kazuyuki Aihara Univ. of Tokyo Hye-Ran Byun Yonsei Univ. Laiwan Chan The Cheinese Univ. of Hong Kong Sungzoon Cho Seoul National University Chan-Sup Chung Yonsei Univ. Myung Jin Chung KAIST Andrzej Cichocki RIKEN BSI Joydeep Ghosh Univ. of Texas, Austin Seung Kee Han Chungbuk National Univ. Zhen-Ya He Southeast Univ. Yuzo Hirai University of Tsukuba Jenq-neng Hwang Univ. of Washington, Seattle Matsumi Ishikawa Kyushu Institute of Tech. Marwan Jabri University of Sydney Anil Jain Michigan State University Janusz Kacprzyk Polish Academy of Science Nikola Kasabov University of Otago Hideki Kawahara Wakayama University Doh-Suk Kim Samsung Adv. Inst. Tech. Irwin King Chinese Univ. Hong Kong Chong-Ho Lee Inha University Choongkil Lee Seoul National University Daniel Lee Lucent Technologies Te-Won Lee Salk Institute Yillbyung Lee Yonsei University Jianchang Mao IBM Almaden Research Ctr. Gen Matsumoto RIKEN BSI Jong-Seop Moon Korea University Mike Mozer Univ. of Colorado at Boulder Takashi Omori Tokyo Univ. of Agriculture & Tech. Dong Chul Park Myungji University Seung Kwon Park Hanyang University Elie Sanchez NEURINFO Yasuji Sawada Tohoku University Sebastian Seung MIT Jude Shavlik Univ. of Wisc. at Madison Jang Kyoo Shin Kyungpook National Univ. Jonghan Shin RIKEN BSI Satoshi Shioiri Chiba University Keiji Tanaka RIKEN BSI Keiji Uchikawa Tokyo Institute of Technology Deliang Wang Ohio State University Patrick Wong Univ. of New South Wales Hyun-Seung Yang KAIST Ramin Yasdi GMD Myung-Hyun Yoo Korea University Shuji Yoshizawa University of Tokyo Byoung-Tak Zhang Seoul National University General Affairs Committee Chair : Rhee-Man Kil KAIST Members : Jin Young Choi Seoul National University Sung Ho Kim KAIST Cheol-Hoon Park KAIST Local Arrangement Committee Chair : Chang-Dong Yoo KAIST Members : Yong-Soo Kim Taejon Univ. Kwee-Bo Sim Chungang Univ. Finance Committee Chair : Hong-Tae Jeon Chungang Univ. Members : Mun-Sung Han ETRI Hong Jeong Postech Seong-Gon Kong Soongsil Univ. Yung-Bin Kwon Chungang Univ. Publication Committee Chair : Seungwhan Kim Postech Members : Min-Shik Kim Yonsei University Seungjin Choi Chungbuk National Univ. Publicity Committee: Chair : Chong-Ho Lee Inha University Members : Hoon Kang Chungang Univ. Hyung-Cheul Shin Hallym University International Advisory Committee: Chair : Sung-Yang Bang Postech Co-Chairs : Shunichi Amari RIKEN BSI Harold Szu NSWC Members : Jim Bezdek Univ. of West Florida David Casasent Carnegie Mellon University Chan-Sup Chung Yonsei Univ. Rolf Eckmiller University of Bonn Tom Gedeon Univ. of New South Wales Zhen-Ya He Southeast University Nikola Kasabov Univ. of Otago Jin Hyung Kim KAIST Myung-Won Kim Soongsil University Cliff Lau Office of naval Researches Sukhan Lee Samsung Advanced Institute Hansperter Mallot Max-Planck Institute Se-Yung Oh Postech Nikhil R. Pal Indian Statistical Institute John Taylor King's College of London Shiro Usui Toyohashi University of Tech. Lipo Wang Nanyang Tech. Univ. Patrick Wong UNSW Lei Xu Chinese Univ. Hong Kong Youshou Wu Tsinghua Univ. Takeshi Yamakawa Kyushu Inst. of Technology Jacek Zurada Univ. of Louisville Exhibition Committee Chair : Dong-Jo Park KAIST Tutorial Committee Chair : Sungzoon Cho Seoul National University Members : Seungjin Choi Chungbuk National Univ. Sungbae Cho Yonsei University Hyukjoon Lee Kwangwoon University Minho Lee Kyungpook National Univ. 
Special Session Committee Chair : Byoung-Tak Zhang Seoul National University Members : Daniel D. Lee Bell Lab, Lucent Technologies Nando de Freitas Univ. of California, Berkeley Te-Won Lee Univ. of California, San Diego From Ramin.Yasdi at gmd.de Fri Apr 7 09:44:56 2000 From: Ramin.Yasdi at gmd.de (Ramin Yasdi) Date: Fri, 07 Apr 2000 15:44:56 +0200 Subject: ICONIP-2000 Special Session Message-ID: <38EDE658.376CD298@gmd.de> CALL FOR PAPERS ICONIP-2000 Special Session ON NEURAL NETWORKS FOR INTELLIGENT USER INTERFACES User interfaces that adapt themselves to individual needs, preferences, and knowledge of their users are becoming more and more important. Personalized interfaces are of special importance to deal with information overload and navigation by personalizing and improving the quality of information retrieval and filtering, information restructuring and annotation, as well as information visualization. The development of these new intelligent user interfaces requires techniques that enable computer programs to learn how to serve the user most efficiently. Neural networks are not yet widely used within this challenging domain. But the domain seems to be an interesting new application area for neural networks due to the availability of large sets of data and the required automatic adaptation to new situations and users. Therefore, growing interest in using various powerful learning methods known from neural network models for intelligent user interfaces is arising among researchers. The scope of the session includes, but is not limited to, the following topics: * user models * adaptive hypermedia * classifying and recognizing users, emotions and situations * information retrieval * adapting complex user interfaces * intelligent student systems * representation of application domains We solicit reports on actual neural network applications, and discussion contributions on their usefulness. Since most successful applications in this area use symbolic AI methods, it is under debate if and how neural networks can contribute to this area. SESSION FORMAT: The session will give participants the possibility of short presentations (talks or demos, about 20 minutes) on their vision or work in the area. Most of the session, however, will have the format of an open discussion forum. At the end of the session, a discussion will take place to deal with questions on how to combine research efforts and how to link the community. SUBMISSION INSTRUCTIONS: Please send your paper to: ramin.yasdi at gmd.de by the submission deadline below. See the guidelines for authors for more details. http://braintech.kaist.ac.kr/ICONIP2000 Letter of interest in submitting a paper to the special session: May 1, 2000. Deadline for paper submission: June 15, 2000. Notification of acceptance: July 15, 2000. Camera-ready papers due: August 15, 2000. SESSION ORGANIZERS: Ramin Yasdi German National Research Centre for Information Technology (GMD) Schloss Birlinghoven, 53754 Sankt Augustin, Germany Email: Ramin.Yasdi at gmd.de From planning at icsc.ab.ca Sat Apr 8 13:48:58 2000 From: planning at icsc.ab.ca (Jeanny S. Ryffel) Date: Sat, 8 Apr 2000 11:48:58 -0600 Subject: cfp for symbol processing session for ISA'2000 Message-ID: <000501bfa17a$1e2b4ee0$984722cf@compusmart.ab.ca> SPECIAL SESSION ON SYMBOLS, SYMBOL PROCESSING and NEURAL NETWORKS http://www.icsc.ab.ca/150-prog.html#Scientific Organizer: Bernadette M.
Garner Bernadette.Garner at infotech.monash.edu.au Topics to be covered in the special session include: symbol processing and the nature of symbols. Not specifically language processing but - how the biological brain handles symbols - how symbols can be stored - how symbols can be manipulated - the definition of symbols - how artificial neural networks can be trained using symbols However, any topic relevant to symbols and symbol processing will be considered for discussion. Interested researchers should submit manuscripts of up to 5,000 words. Submission by electronic mail is strongly recommended. Alternatively, fax 2 copies to B. M. Garner CSSE Monash University Clayton, 3168, Australia. Fax: +61 3 9905 5146 PROCEEDINGS AND PUBLICATIONS All accepted and invited papers will be included in the congress proceedings, published in print and on CD-ROM by ICSC Academic Press, Canada/Switzerland. A selected number of papers will be expanded and revised for possible inclusion in special issues of some prestigious journals. IMPORTANT DATES May 15, 2000: Submission deadline June 15, 2000: Notification of acceptance July 30, 2000: Delivery of full papers December 12-15, 2000: ISA'2000 congress This session is part of the International Congress on INTELLIGENT SYSTEMS AND APPLICATIONS (ISA'2000) University of Wollongong (near Sydney), Australia December 12-15, 2000 http://www.icsc.ab.ca/isa2000.htm SPONSORS University of Wollongong, Industrial Automation Research Centre Nortel Networks IEE The Institution of Electrical Engineers IEAust The Institution of Engineers, Australia CRC IMST Cooperative Research Centre for Intelligent Manufacturing Systems and Technologies Ltd. ICSC International Computer Science Conventions From b344dsl at utarlg.uta.edu Sat Apr 8 14:35:25 2000 From: b344dsl at utarlg.uta.edu (Dan Levine) Date: Sat, 8 Apr 2000 13:35:25 -0500 Subject: Levine's textbook, 2nd edition Message-ID: <003c01bfa189$34ee7ee0$bd1a6b81@uta.edu> In my announcement about the second edition of my textbook with Erlbaum coming out, I forgot to include contact information. Any questions or comments can be e-mailed to me at levine at uta.edu. Also there is some additional information about the book (though it needs to be updated) at my web site, www.uta.edu/psychology/faculty/levine. Dan Levine From thilo.reski at gmx.de Mon Apr 10 05:33:12 2000 From: thilo.reski at gmx.de (thilo.reski@gmx.de) Date: Mon, 10 Apr 2000 11:33:12 +0200 (MEST) Subject: PhD Thesis: Mapping and Parallel Simulation of ANN Message-ID: <31366.955359192@www6.gmx.net> Dear Connectionists, My PhD thesis on mapping and parallel simulation of neural networks is now available. Title: Mapping and Parallel, Distributed Simulation of Neural Networks on Message Passing Multiprocessors Abstract: This thesis introduces an entire policy for the parallelization and parallel simulation of artificial neural networks (ANN). The idea is to hide the parallelization effort from the ANN developer and from the ANN user. The main issues are i) analysis of the ANN in terms of parallel execution, ii) mapping the ANN to an abstract (scalable) message-passing computer system, and iii) parallel simulation on such an architecture. Results indicate that transparent parallelization of neural networks is useful in order to efficiently develop and apply non-trivial neural networks. If you are interested, send an empty e-mail to "thilo.reski at gmx.de" with the subject "PhD Thesis" Best regards, Thilo Reski -- Dr.
Thilo Reski Am Wolfsberg 9a 64569 Nauheim Germany Fax/Tel: +49 / 6152 / 637977 Mobil : +49 / 178 / 637977 8 e-mail: kontakt at thilo-reski.de http://www.thilo-reski.de Sent through GMX FreeMail - http://www.gmx.net From murphyk at cs.berkeley.edu Mon Apr 10 16:09:35 2000 From: murphyk at cs.berkeley.edu (Kevin Murphy) Date: Mon, 10 Apr 2000 13:09:35 -0700 Subject: Bayes Net Toolbox 2.0 for Matlab Message-ID: <38F234FF.17780AE1@cs.berkeley.edu> I am pleased to announce a major new release of the Bayes Net Toolbox, a software package for Matlab 5 that supports inference and learning in directed graphical models. Specifically, it supports exact and approximate inference, discrete and continuous variables, static and dynamic networks, and parameter and structure learning. Hence it can handle a large number of popular statistical models, such as the following: PCA/factor analysis, logistic regression, hierarchical mixtures of experts, QMR, DBNs, factorial HMMs, switching Kalman filters, etc. For more details, and to download the software, please go to http://www.cs.berkeley.edu/~murphyk/Bayes/bnt.html The new version (2.0) has been completely rewritten, making it much easier to read, use and extend. It is also somewhat faster. The main change is that I now make extensive use of objects. (I used to use structs, and a dispatch mechanism based on the type-tag system in Abelson and Sussman.) In addition, each inference algorithm (junction tree, sampling, loopy belief propagation, etc.) is now an object. This makes the code and documentation much more modular. It also makes it easier to add special-case algorithms, and to combine algorithms in novel ways (e.g., combining sampling and exact inference). I have gone to great lengths to make the source code readable, so it should prove an invaluable teaching tool. In addition, I am hoping that people will contribute algorithms to the toolbox, in the spirit of the open source movement. Kevin Murphy From moatl at cs.tu-berlin.de Mon Apr 10 02:58:56 2000 From: moatl at cs.tu-berlin.de (Martin Stetter) Date: Mon, 10 Apr 2000 08:58:56 +0200 Subject: Final Call: EU Advanced Course in Computational Neuroscience Message-ID: <38F17BB0.7A0CE64A@cs.tu-berlin.de> Second Call for the EU ADVANCED COURSE IN COMPUTATIONAL NEUROSCIENCE (AN IBRO NEUROSCIENCE SCHOOL) AUGUST 21 - SEPTEMBER 15, 2000 INTERNATIONAL CENTRE FOR THEORETICAL PHYSICS, TRIESTE, ITALY DIRECTORS: Erik De Schutter (University of Antwerp, Belgium) Klaus Obermayer (Technical University Berlin, Germany) Alessandro Treves (SISSA, Trieste, Italy) Eilon Vaadia (Hebrew University, Jerusalem, Israel) The EU Advanced Course in Computational Neuroscience introduces students to the panoply of problems and methods of computational neuroscience, simultaneously addressing several levels of neural organisation, from subcellular processes to operations of the entire brain. The course consists of two complementary parts. A distinguished international faculty gives morning lectures on topics in experimental and computational neuroscience. The rest of the day is devoted to practicals, including learning how to use simulation software and how to implement a model of the system the student wishes to study on individual unix workstations. The first week of the course introduces students to essential neurobiological concepts and to the most important techniques in modeling single cells, networks and neural systems.
Students learn how to apply software packages like GENESIS, MATLAB, NEURON, XPP, etc. to the solution of their problems. During the following three weeks the lectures will cover specific brain functions. Each week, topics ranging from modeling single cells and subcellular processes through the simulation of simple circuits and large neuronal networks to system-level models of the brain will be covered. The course ends with a presentation of the students' projects. The EU Advanced Course in Computational Neuroscience is designed for advanced graduate students and postdoctoral fellows in a variety of disciplines, including neuroscience, physics, electrical engineering, computer science and psychology. Students are expected to have a basic background in neurobiology as well as some computer experience. Students of any nationality can apply. A total of 32 students will be accepted. About 20 students will be from the European Union and affiliated countries (Iceland, Israel, Liechtenstein and Norway plus all countries which are negotiating future membership with the EU). These students are supported by the European Commission and we specifically encourage applications from researchers who work in less-favoured regions of the EU, from women and from researchers from industry. IBRO and ICTP provide support for participation from students of non-Western countries, in particular countries from the former Soviet Union, Africa and Asia, while The Brain Science Foundation supports Japanese students. Students receiving support from the mentioned sources will receive travel grants and free full board at the Adriatico Guest House. More information and application forms can be obtained: - http://www.bbf.uia.ac.be/EU_course.shtml Please apply electronically using a web browser if possible. - email: eucourse at bbf.uia.ac.be - by mail: Prof. E. De Schutter Born-Bunge Foundation University of Antwerp - UIA, Universiteitsplein 1 B2610 Antwerp Belgium FAX: +32-3-8202669 APPLICATION DEADLINE: April 15, 2000. Applicants will be notified of the results of the selection procedures by May 31, 2000. COURSE FACULTY: Moshe Abeles (Hebrew University of Jerusalem, Israel), Carol Barnes (University of Arizona, USA), Avrama Blackwell (George Mason University, Washington, USA), Valentino Braitenberg (MPI Tuebingen, Germany), Jean Bullier (Universite Paul Sabatier, Toulouse, France), Ron Calabrese (Emory University, Atlanta, USA), Carol Colby (University Pittsburgh, USA), Virginia de Sa (University California San Francisco, USA), Alain Destexhe (Laval University, Canada), Opher Donchin (Hebrew University of Jerusalem, Israel), Karl J. Friston (Institute of Neurology, London, England), Bruce Graham (University of Edinburgh, Scotland), Julian J.B. Jack (Oxford University, England), Mitsuo Kawato (ATR HIP Labs, Kyoto, Japan), Jennifer Lund (University College London, England), Miguel Nicolelis (Duke University, Durham, USA), Klaus Obermayer (Technical University Berlin, Germany), Stefano Panzeri (University of Newcastle, England), Alex Pouget (University of Rochester, USA), John M. Rinzel (New York University, USA), Nicolas Schweighofer (ATR ERATO, Kyoto, Japan), Idan Segev (Hebrew University of Jerusalem, Israel), Terry Sejnowski (Salk Institute, USA), Haim Sompolinsky (Hebrew University of Jerusalem, Israel), Martin Stetter (Siemens AG Muenchen, Germany), Shigeru Tanaka (RIKEN, Japan), Alex M.
Thomson (Royal Free Hospital, London, England), Naftali Tishby (Hebrew University of Jerusalem, Israel), Alessandro Treves (SISSA, Trieste, Italy), Eilon Vaadia (Hebrew University of Jerusalem, Israel), Charlie Wilson (University of Texas, San Antonio, USA), More to be announced... The 2000 EU Advanced Course in Computational Neuroscience is supported by the European Commission (5th Framework program), by the International Centre for Theoretical Physics (Trieste), by the Boehringer Ingelheim Foundation, by the International Brain Research Organization and by The Brain Science Foundation (Tokyo). -- ---------------------------------------------------------------------- Dr. Martin Stetter phone: ++49-30-314-73117 FR2-1, Informatik fax: ++49-30-314-73121 Technische Universitaet Berlin web: http://www.ni.cs.tu-berlin.de Franklinstrasse 28/29 D-10587 Berlin, Germany ---------------------------------------------------------------------- From ASJagath at ntu.edu.sg Tue Apr 11 05:01:18 2000 From: ASJagath at ntu.edu.sg (Jagath C Rajapakse (Asst Prof)) Date: Tue, 11 Apr 2000 17:01:18 +0800 Subject: ICONIP2000: Special Session on Brain Imaging Message-ID: CALL FOR PAPERS ICONIP 2000: SPECIAL SESSION ON BRAIN IMAGING Today much of what we know about neural information processing and diseases of the human brain have been derived from images of human brain, produced by various imaging modalities. Although the brain images are direct measurement of its structure and often its function, the full potential of these images remains largely unexploited today. This session will focus on recent advances in brain imaging research to explore natural neural information processing mechanisms and to investigate characteristics of brain diseases from imaging data. Research papers will be solicited for presentation in both structural and functional brain imaging but not restricted to the following areas. 1. Structural brain imaging Xray CT; MRI; Brain shelling; Detection of sulcul and gyral patterns; Cortical segmentation; Cortical parcellation; Segmentation of subcortical structures, hippocampus, cerebellum; 3-D visualization, rendering; Morphometrical correlates of neurological and psychiatric disease. 2. Functional brain imaging EEG; MEG; fMRI,; PET; Optical; Near infrared; Source localization; Statistical parameter maps; Time-series analysis; Multi-modality imaging, registration. SUBMISSION INSTRUCTIONS: Please send your letter of interest and paper by email to asjagath at ntu.edu.sg by the submission deadline below and see guidelines for authors for more details. http://braintech.kaist.ac.kr/ICONIP2000 Letter of interest in submitting a paper: May1, 2000. Deadline for paper submission: June 15, 2000. Notification of acceptance: July 15, 2000. Camera-ready papers due: August 15, 2000. SESSION CHAIRS Dr: Jagath C. Rajapakse Dr. Frithjof Kruggel School of Applied Science Max Planck Institute of Cognitive Neuroscience Nanyang Technological University Stephanstrasse 1 N4, Nanyang Avenue 04103 Leipzig Singapore. 
Germany Email: asjagath at ntu.edu.sg Email: kruggel at cns.mpg.de From dario.floreano at epfl.ch Wed Apr 12 06:47:39 2000 From: dario.floreano at epfl.ch (Dario Floreano) Date: Wed, 12 Apr 2000 12:47:39 +0200 Subject: PhD studentships available Message-ID: 3 PhD Studentships (Research Assistant) @ Swiss Federal Institute of Technology in Lausanne (EPFL) Three postgraduate research positions leading to a PhD in Engineering at the Institute of Robotic Systems of the Swiss Federal Institute of Technology in Lausanne (EPFL) are offered for a project in BIO-INSPIRED AND ADAPTIVE ROBOTICS by Dario Floreano. The research topics are: 1- Methods in Evolutionary Robotics 2- Evolutionary Embedded Vision 3- Interactive Adaptation for Personal and Service Robotics For project descriptions and application procedures, please see: http://diwww.epfl.ch/lami/team/floreano/jobs.html --------------------------------------- Prof. Dario Floreano Autonomous Systems Laboratory (ASL) Institute of Robotic Systems (ISR-DMT) Swiss Federal Institute of Technology (EPFL) CH-1015 Lausanne, Switzerland Dario.Floreano at epfl.ch Phone: ++41 21 693 5230 Fax: ++41 21 693 5263 http://diwww.epfl.ch/lami/team/floreano From nick.jakobi at animaths.com Thu Apr 13 10:46:01 2000 From: nick.jakobi at animaths.com (Nick Jakobi) Date: Thu, 13 Apr 2000 15:46:01 +0100 Subject: Job Vacancies at MASA (U.K. office) Message-ID: <01BFA560.7F9025E0.nick.jakobi@animaths.com> Founded in Paris in 1997, MASA specializes in the production of cutting-edge adaptive technologies - software and hardware that seeks to emulate and exploit many of the properties of living things. The company is heavily research orientated and now employs over 50 people including many PhDs from the areas of Artificial Life, Artificial Intelligence, Mathematics and Scientific computing. This makes it one of the largest laboratories (public or private) of its kind in the world. In early 1999, MASA opened its British division on the campus of Sussex University (Brighton, UK) to take advantage of close links with academia. This division has a wide remit and current projects include financial prediction, constraint satisfaction, the development of controllers for unmanned vehicles, path-finding algorithms and the creation of original and powerful tools for industrial combinatorial optimization problems. As part of its continued expansion, MASA is currently looking for exceptional candidates to fill the following posts at its UK offices: 3 Research Scientists. Prospective candidates must have recently obtained (or be about to obtain) a PhD or similar high-level research experience in a relevant discipline. Ideally, they will have skills in the following areas: computing, mathematical modeling and visualization, evolutionary and adaptive systems, optimization. They will be expected to perform creative research that produces innovative solutions to hard industrial problems within commercial constraints. After an initial training period they will also be expected to manage 1-2 developers on a day-to-day basis who will help them implement their ideas and work with them as part of a team. 4 Developers. Prospective candidates will hold a MSc or equivalent in a relevant discipline. Ideally, they will have commercial software development experience, but the ability to learn new techniques quickly and to understand and implement complex algorithms is more important. MASA offers a very attractive salary and benefits package. 
The UK offices are situated in the Brighton Innovation Centre on the University of Sussex Campus, surrounded by the beautiful South Downs and 3 miles from Brighton town centre and the sea. Please email a C.V. with the names of at least two referees and a covering letter to nick.jakobi at animaths.com. From gasser at cs.indiana.edu Thu Apr 13 23:14:57 2000 From: gasser at cs.indiana.edu (Michael Gasser) Date: Thu, 13 Apr 2000 22:14:57 -0500 (EST) Subject: Postdocs in cognitive development In-Reply-To: Message-ID: The Developmental Training Grant at Indiana University has several post-doctoral traineeships open for application. We are interested in individuals with backgrounds in cognitive science, cognitive development, language, linguistics, and connectionist modelling who would benefit from interdisciplinary training. Information about the Training Grant may be found at http://www.indiana.edu/~psych/postdoc/multidis.html; by writing to Multidisciplinary Training in Developmental Process, c/o Melissa Foster, Department of Psychology, Indiana University, Bloomington, Indiana 4740, e-mail: mefoster at indiana.edu; or by contacting any member of the training faculty. Linda Smith, smith4 at indiana.edu Michael Gasser, gasser at indiana.edu Indiana University is an Equal Opportunity/Affirmative Action institution. Positions open until filled. From DominikD at cruxfe.com Fri Apr 14 02:22:30 2000 From: DominikD at cruxfe.com (Dominik Dersch) Date: Fri, 14 Apr 2000 16:22:30 +1000 Subject: Job Offer at CruxFE (Sydney) Message-ID: <610AC1238DA7D111B80E0020AFF2B9D1576137@mail.ucs.com.au> Financial Engineering Analyst Crux Financial Engineering (CruxFE) develops advanced systems for finance and industry, including artificial intelligence trading systems and trading tools. CruxFE has experienced significant growth over the last few years. In order to keep the competitive edge sharp, we are seeking to appoint an outstanding Financial Engineering Analyst with a strong artificial neural networks background. Your role will focus on developing and implementing new time series forecasting and trading tools for a broad range of derivative instruments. Ideally you will possess: a recent Ph.D. in Physics, Engineering, Mathematics or a related field, a track record in financial time series forecasting, artificial neural networks, signal processing, pattern recognition and statistics, excellent programming skills in C/C++, Perl, and Matlab across NT and LINUX platforms, and very good communication and problem-solving skills. CruxFE offers a highly professional, creative working environment in a fast-paced industry that is experiencing fundamental change. We are located in the central business district of Sydney. For further details about the position and CruxFE please visit our web page at www.cruxfe.com or contact Dominik Dersch (dominik at cruxfe.com). Please reply quoting Ref.
No.0405 to recruitment at cruxfe.com.au or in writing to Crux Financial Engineering Pty Ltd Level 7, 50 Carrington Street Sydney 2000 PO Box 656, Grosvenor Place NSW 1220 ______________________________________________________________________ Dr Dominik Dersch Research and Development Crux Financial Engineering Australia email: dominikd at cruxfe.com tel: +61 (02) 90040637 From mdorigo at iridia0.ulb.ac.be Fri Apr 14 09:38:45 2000 From: mdorigo at iridia0.ulb.ac.be (Marco Dorigo) Date: Fri, 14 Apr 2000 15:38:45 +0200 (CEST) Subject: CFP: IEEE Transactions on Evolutionary Computation Special Issue on Ant Algorithms and Swarm Intelligence Message-ID: <200004141338.PAA26995@iridia0.ulb.ac.be> ============================================= IEEE Transactions on Evolutionary Computation Special Issue on Ant Algorithms and Swarm Intelligence ============================================= ============================================= We apologize if you receive multiple copies ============================================= ================== CALL FOR PAPERS ================== The IEEE Transactions on Evolutionary Computation will publish a special issue on Ant Algorithms and Swarm Intelligence. The behavior of social insects in general, and of ants living in colonies in particular, has fascinated researchers in ethology and animal behavior for a long time. Many models have been proposed to explain their capabilities. Recently, ant algorithms and swarm intelligence systems have been offered as a novel computational approach that replaces the traditional emphasis on control, preprogramming, and centralization with designs featuring autonomy, emergence, and distributed functioning. These designs are proving flexible and robust, able to adapt quickly to changing environments and to continue functioning even when individual elements fail. The special issue will be dedicated to the publication of original research results on ant algorithms and, more in general, on swarm intelligence. Papers that prove new theoretical results on ant algorithms and swarm intelligence systems behavior, or that describe their successful applications to real-world problems are particularly welcome. The submission of papers is open to any researcher in ant algorithms. Researchers taking part in "ANTS'2000 - From Ant Colonies to Artificial Ants: Second International Workshop on Ant Colony Optimization" will be invited to submit a significantly extended version of their workshop submission to the special issue. Up-to-date information on the special issue is maintained at: http://iridia.ulb.ac.be/~ants/ants2000/ants2000-pub.html =================== EXPECTED TIMELINE =================== The ANTS'2000 workshop will take place September 8 to 9, 2000 in Brussels, Belgium. The deadline for the submission of papers to the special issue will be approximately four months after the workshop, so that authors have the time to improve their papers according to the feedback obtained at the workshop. Every paper will be refereed by at least two experts in the field and by at least one of the editors. To make the review process run smoother a special panel will be formed before the submission deadline. Based on our previous experience, we expect that a high percentage of the papers that will eventually be published will need to undergo a revision process consisting of at least two iterations. The tentative schedule is as follows: December 31, 2000. Deadline for submissions to the special issue. March 15, 2001. 
Referees' reports and editors' decisions are sent to authors. May 31, 2001. Deadline for the revised versions of the papers. July 10, 2001. Referees' reports and editors' decisions are sent to authors. September 15, 2001. Authors send final versions of the papers to the editors. October 15, 2001. Editors send the special issue to the publisher. The expected publication year of the special issue will be 2002. =================== THE GUEST EDITORS =================== Marco Dorigo, Université Libre de Bruxelles, Belgium Luca Maria Gambardella, IDSIA, Manno, Switzerland Martin Middendorf, Universitaet Karlsruhe, Germany Thomas Stuetzle, Technische Universitaet Darmstadt, Germany From nnk at hip.atr.co.jp Mon Apr 17 02:34:40 2000 From: nnk at hip.atr.co.jp (Neural Networks Japan Office) Date: Mon, 17 Apr 2000 15:34:40 +0900 Subject: Neural Networks 13(3) Message-ID: NEURAL NETWORKS 13(3) Contents - Volume 13, Number 3 - 2000 _______________________________________________________________ NEURAL NETWORKS LETTERS: Self-organized hierarchical structure in a plastic network of chaotic units J. Ito, K. Kaneko Improving local minima of Hopfield networks with augmented Lagrange multipliers for large scale TSPs M. Martin-Valdivia, A. Ruiz-Sepulveda, F. Triguero-Ruiz CURRENT OPINIONS: Learning non-stationary conditional probability distributions D. Husmeier ARTICLES: *** Psychology and Cognitive Science *** The patchwork engine: image segmentation from symmetries G.J. Van Tonder, Y. Ejima *** Neuroscience and Neuropsychology *** Position invariant recognition in the visual system with cluttered environments S.M. Stringer, E.T. Rolls *** Mathematical and Computational Analysis *** Local minima and plateaus in hierarchical structures of multilayer perceptrons K. Fukumizu, S.-I. Amari Learning in higher order Boltzmann machines using linear response M.A.R. Leisink, H.J. Kappen A recurrent neural network for solving linear projection equations J. Xia, J. Wang Efficient perceptron learning using constrained steepest descent S.J. Perantonis, V. Virvilis Information complexity of neural networks M.A. Kon, L. Paskota *** Technology and Applications *** A connectionist model for convex-hull of a planar set A. Datta, N.R. Pal, N.R. Pal Multilayer neural networks for solving a class of partial differential equations S. He, K. Reif, R. Unbehauen BOOK REVIEW: Book review: A fruitful blend, or a trinket-box? The MIT Encyclopedia of the Cognitive Sciences R. Raizada _______________________________________________________________ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems.
Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment.
----------------------------------------------------------------------------
Membership Type         INNS                ENNS                  JNNS
----------------------------------------------------------------------------
membership with         $80             or  660 SEK           or  Y 15,000 [including
Neural Networks                                                   2,000 entrance fee]
                    or  $55 (student)       460 SEK (student)     Y 13,000 (student)
                                                                  [including 2,000
                                                                  entrance fee]
-----------------------------------------------------------------------------
membership without      $30                 200 SEK               not available to
Neural Networks                                                   non-students
                                                                  (subscribe through
                                                                  another society)
                                                                  Y 5,000 (student)
                                                                  [including 2,000
                                                                  entrance fee]
-----------------------------------------------------------------------------
Institutional rates     $1132               2230 NLG              Y 149,524
-----------------------------------------------------------------------------
Name: _____________________________________ Title: _____________________________________ Address: _____________________________________ _____________________________________ _____________________________________ Phone: _____________________________________ Fax: _____________________________________ Email: _____________________________________ Payment: [ ] Check or money order enclosed, payable to INNS or ENNS OR [ ] Charge my VISA or MasterCard card number ____________________________ expiration date ________________________ INNS Membership 19 Mantua Road Mount Royal NJ 08061 USA 856 423 0162 (phone) 856 423 3420 (fax) innshq at talley.com http://www.inns.org ENNS Membership University of Skovde P.O. Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.his.se/ida/enns JNNS Membership c/o Professor Tsukada Faculty of Engineering Tamagawa University 6-1-1, Tamagawa Gakuen, Machida-city Tokyo 113-8656 Japan 81 42 739 8431 (phone) 81 42 739 8858 (fax) jnns at jnns.inf.eng.tamagawa.ac.jp http://jnns.inf.eng.tamagawa.ac.jp/home-j.html ***************************************************************** end. ==================================================================== NEURAL NETWORKS Editorial Office ATR Human Information Processing Research Laboratories 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan TEL +81-774-95-1058 FAX +81-774-95-1008 E-MAIL nnk at hip.atr.co.jp ==================================================================== From wmking at physics.bell-labs.com Mon Apr 17 16:38:33 2000 From: wmking at physics.bell-labs.com (Wayne M. King) Date: Mon, 17 Apr 2000 16:38:33 -0400 Subject: Workshop for the Analysis of Neural Data Message-ID: <38FB7649.1E7EC6DF@physics.bell-labs.com> Hello all, I wanted to share with you this announcement for an upcoming workshop on the analysis of neural data to be held this summer at Woods Hole, MA. The workshop has been extremely productive in past years and we anticipate another good working group this year. I am including the poster for this year's workshop below.
I would appreciate it if you could forward this e-mail to any of your colleagues who might be interested in these issues. If you have any further questions, I can be reached via e-mail at wmking at bell-labs.com or by phone at (908) 582-2669. Applications should be mailed or faxed to the address listed in the poster, but I would be happy to answer any questions you might have about the workshop. Please check out the workshop's web site in order to see what has been discussed in previous years, as well as a list of participants. Thank you for your attention and help in disseminating this information. Sincerely, Wayne King Analysis of Neural Data Modern methods and open issues in the analysis and interpretation of multivariate time series and imaging data in the neurosciences 20 August - 2 September 2000 Marine Biological Laboratories - Woods Hole, MA A working group of scientists committed to quantitative approaches to problems in neuroscience will focus their efforts on experimental and theoretical issues related to the analysis of single and multichannel data sets. The work group is motivated by issues in two complementary areas that are critical to an understanding of brain function. The first involves advanced signal processing methods, particularly those appropriate for emerging multisite recording and noninvasive imaging techniques. The second involves the development of a calculus to study the dynamical behavior of nervous systems and the computations they perform. A distinguishing feature of the work group is a close collaboration between experimentalists and theorists with regard to the analysis of data and the planning of experiments. The work group will have a limited number of research lectures, supplemented by tutorials on relevant computational, experimental, and mathematical techniques. The topics covered in the work group will maintain continuity with past years and will include the analysis of point process data (spike trains) as well as continuous processes (LFP, imaging data), and miscellaneous topics such as spike waveform classification. We will have two one-day workshops in addition to the scheduled activities: 28 August-Neuronal Control Signals for Prosthetic Devices 1 September-Statistical Inference for fMRI Time Series Participants: About twenty-five participants, both experimentalists and theorists. Experimentalists are encouraged to bring data records to the work group; appropriate computational facilities will be provided. The work group will further take advantage of interested investigators and course faculty concurrently present at the MBL. We encourage graduate students and postdoctoral fellows as well as senior researchers to apply. PARTICIPANT FEE: $300 Accepted participants will be provided with shared dormitory accommodations at MBL and board. Support: National Institutes of Health-NIMH, NIA, NIAAA, NICHD/NCRR, NIDCD, NIDA, and NINDS. Organizers: David Kleinfeld (UCSD) and Partha P. Mitra (Bell Laboratories, Lucent Technologies). Website: www.vis.caltech.edu/~WAND/ Application: Send a copy of your c.v. together with a cover letter that contains a brief (ca. 200 word) paragraph on why you wish to attend the work group to: Ms. Jean B. Ainge Bell Laboratories, Lucent Technologies 700 Mountain Avenue 1D-427 Murray Hill, NJ 07974 908-582-4702 (fax) or Graduate students and postdoctoral fellows are encouraged to include a brief letter of support from their research advisor.
Applications must be received by 19 May 2000 Participants will be notified by 29 May 2000 From ms at acl.icnet.uk Tue Apr 18 12:08:43 2000 From: ms at acl.icnet.uk (Margarita Sordo) Date: Tue, 18 Apr 2000 17:08:43 +0100 (BST) Subject: DPhil Thesis: A Neurosymbolic Approach to the Classification of Scarce and Complex Data Message-ID: <200004181608.RAA02502@marr.acl.icnet.uk> Dear Connectionists, My DPhil thesis on knowledge-based neural networks for classification of scarce and complex medical data is now available. Title: "A Neurosymbolic Approach to the Classification of Scarce and Complex Data" It can be found at: http://www.acl.icnet.uk/lab/aclsanchez.html Abstract: Artificial neural networks possess characteristics that make them a useful tool for pattern recognition and classification. Important features such as generalization, tolerance to noise and graceful degradation make them a robust learning paradigm. However, their performance strongly relies on large amounts of data for training. Therefore, their applicability is precluded in domains where data are scarce. Knowledge-based artificial neural networks (KBANNs) provide a means for combining symbolic and connectionist approaches into a hybrid methodology capable of dealing with small datasets. The suitability of such networks has been evaluated in binary-valued domain theories. After replicating some initial results with a binary-valued domain theory, this thesis presents new results with scarce and complex real-valued medical data. 31P magnetic resonance spectroscopy (MRS) of normal and cancerous breast tissues provide good testbeds to assess the advantages of such a methodology over other, more traditional connectionist approach for classification purposes in constrained domains. Experimental work confirmed the suitability of the proposed neurosymbolic approach for real-life applications with such constraints. Knowledge in the symbolic module helps to overcome the difficulties found by the connectionist module when confronted with small datasets. Details of breast tissue metabolism and MRS are presented. Knowledge acquisition methodologies for gathering the required knowledge for the definition of the domain theories are also described. Future directions for improving the KBANN methodology are discussed. =============================================================================== Margarita Sordo Sanchez ms at acl.icnet.uk Advanced Computation Laboratory Imperial Cancer Research Fund 61 Lincoln's Inn Fields, London WC2A 3PX, England, United Kingdom phone: 44 (020) 7242 0200 Ext 2911 44 (020) 7269 2911 (direct) fax: 44 (020) 7269 3186 =============================================================================== From sam26 at cam.ac.uk Tue Apr 18 10:57:15 2000 From: sam26 at cam.ac.uk (Stu McLellan) Date: Tue, 18 Apr 2000 15:57:15 +0100 Subject: Post Doctoral Position Message-ID: <02bd01bfa946$62d6c370$0cbe6f83@psychol.cam.ac.uk> UNIVERSITY OF CAMBRIDGE CENTRE FOR SPEECH AND LANGUAGE DEPARTMENT OF EXPERIMENTAL PSYCHOLOGY Post-doctoral Research Associate (full time) Connectionist modeller Applications are invited for a post-doctoral RA to work as part of a multi-disciplinary team, led by Professor L K. Tyler, investigating the interface between functional and neural accounts of the language system. 
Candidates should have a strong background and training in connectionist modelling and an interest in semantics and/or morphology and will be expected to contribute to the research programme by developing and analysing computational models with the aim of extending and testing theoretical accounts and generating novel predictions. Related experience in psycholinguistics, neuropsychology or neuroimaging would also be an advantage. This post is funded for a maximum of 5 years, starting as soon as possible. Salary will be on the RA1A scale £15,735 - £23,651 (under review) according to age and experience. Applications in the form of a covering letter, full c.v., and the names and addresses of three referees (including email address) should be sent to Professor L. K. Tyler, Department of Experimental Psychology, University of Cambridge, Downing Street, Cambridge CB2 3EB to arrive no later than 15 May 2000. Informal enquiries can be emailed to hem10 at cam.ac.uk (until 28.4.00) and thereafter to lktyler at csl.psychol.cam.ac.uk. The University of Cambridge is an equal opportunities employer. From josh at vlsia.uccs.edu Tue Apr 18 14:12:16 2000 From: josh at vlsia.uccs.edu (Alspector) Date: Tue, 18 Apr 2000 12:12:16 -0600 (MDT) Subject: Research programmer position at university spinoff Internet startup Message-ID: Personalogy, Inc. is a fast growing company in Colorado Springs, CO that develops and applies state-of-the-art machine learning techniques to personalize information on the Internet. We are currently looking for a research programmer with the following credentials: -Master's or PhD degree in Computer Science, EE or related field -1-4 years in research and professional programming experience -strong programming skills in Perl/C/C++ -good knowledge of HTML/CGI -strong interests in information retrieval and user modeling -background in machine learning/neural networks/intelligent data mining -comfortable with Windows/NT/2K and Unix/Solaris/Linux systems -very good communication and problem solving skills The person will be responsible for enhancing and optimizing the core algorithms of the company as well as researching novel ways of reorganizing internet page contents using user-specific information. The person will also be involved in the overall system design. Personalogy offers a competitive salary, benefits and stock options. Please respond by sending a resume to personalogy at personalogy.net. Professor Joshua Alspector Univ. of Colorado at Col. Springs Dept. of Elec. & Comp. Eng. P.O. Box 7150 Colorado Springs, CO 80933-7150 (719) 262 3510 (719) 262 3589 (fax) josh at eas.uccs.edu From anderson at europa.cog.brown.edu Tue Apr 18 16:49:16 2000 From: anderson at europa.cog.brown.edu (anderson) Date: Tue, 18 Apr 2000 16:49:16 -0400 (EDT) Subject: Position available at Simpli.com Message-ID: <200004182049.QAA04226@europa.cog.brown.edu> Computer Scientist with Experience in Linguistics and Neural Networks Apply your skills to a new problem developing advanced language tools for Web search and other applications. Work in a small, dynamic internet start-up. Simpli.com (www.simpli.com) seeks a computer scientist or software engineer with a strong background in computational linguistics, preferably having experience with natural language processing, neural networks or advanced statistics. MA or PhD or 3-4 years experience required. Salary and Benefits: Salary commensurate with experience. Competitive benefits package.
Environment: Simpli.com is an up-and-coming internet start-up devoted to improving web search. We were a "Company to Watch" at Demo 2000 in February. We are located in downtown Providence, Rhode Island within walking distance of Brown University and the Rhode Island School of Design. Providence has a vibrant arts community, museums, theaters, and world-class dining. Living costs are extremely reasonable. Providence is an hour from Boston and 4 hours from New York. To apply: Mail, fax, or e-mail a cover letter, current resume and names of three references to: Andrew Duchon, Simpli.com, Inc., 203 S. Main, Providence, RI 02903. fax: 401-621-3220. email: aduchon at simpli.com. From mjhealy at u.washington.edu Tue Apr 18 22:19:00 2000 From: mjhealy at u.washington.edu (M. Healy) Date: Tue, 18 Apr 2000 19:19:00 -0700 (PDT) Subject: IJCNN 2000 paper available (fwd) Message-ID: My paper accepted for IJCNN 2000, M. J. Healy (2000), "Category Theory Applied to Neural Modeling and Graphical Representations" is available (with minor revisions) on the web in postscript format at http://cialab.ee.washington.edu/pubs.htm . I can supply .pdf format for anyone who requests it from me at mjhealy at u.washington.edu . This has a larger scope than my IJCNN 99 paper. Following a brief tutorial on category theory, it shows a simplified view of how functors implement concept hierarchies in a category of neural architecture components through the use of colimits (the content of the 1999 paper). It goes on to show how natural transformations between the functors interconnect the hierarchies at all their levels of abstraction. This yields a mathematical model of the semantics of the formation of ever-more-complex concepts in a connectionist memory during adaptation, and multiple implementations in subnetworks associated with different processing functions (vision, tactile and other sensors, association regions, motor function, etc.). The multiple implementations are hierarchies directed from the abstract to the specialized. These must be interconnected in a manner consistent with abstraction. Here, we mean that, first, different implementations of concepts at different abstraction levels must maintain the same relative positions in the implementation hierarchies. Second, performing a cross-hierarchy association and then abstracting or specializing (moving "down" or "up" in the associated hierarchy) must be interchangeable with first moving "down" or "up" and then moving across, so that the order of different stages in a perception or reasoning exercise does not change the semantics of what is being perceived or reasoned about. My claim is that the functors and natural transformations capture these properties mathematically. A full paper is under construction. Mike -- =========================================================================== e Michael J. Healy A FA ----------> GA (425)865-3123 | | FAX(425)865-2964 | | Ff | | Gf c/o The Boeing Company | | PO Box 3707 MS 7L-66 \|/ \|/ Seattle, WA 98124-2207 ' ' USA FB ----------> GB -or for priority mail- e "I'm a natural man." 2760 160th Ave SE MS 7L-66 B Bellevue, WA 98008 USA michael.j.healy at boeing.com -or- mjhealy at u.washington.edu ============================================================================ From kruschke at indiana.edu Wed Apr 19 10:27:54 2000 From: kruschke at indiana.edu (John K. Kruschke) Date: Wed, 19 Apr 2000 09:27:54 -0500 Subject: Post Doc in Cognitive Modeling at Indiana U.
Message-ID: <38FDC26A.753DCB58@indiana.edu> > POSTDOCTORAL TRAINING FELLOWSHIPS in MODELING OF COGNITIVE > PROCESSES. The Psychology Department and Cognitive Science Program at > Indiana University anticipate one or more Postdoctoral Traineeships > funded by the National Institutes of Health. > Appointments will pay rates appropriate for a new or > recent Ph.D. and will be for one or two years, beginning July > 1, 2000 or later. Traineeships will be offered to qualified individuals > who wish to further their training in mathematical modeling or computer > simulation modeling, in any substantive area of cognitive psychology or > Cognitive Science. Women, minority group members, and handicapped > individuals are urged to apply. The NIMH awards are restricted to U.S. > citizens or permanent residents. Deadline for submission of application > materials has been extended to May 1, 2000, but we encourage earlier > applications. Applicants should send an up-to-date vita, > relevant reprints and preprints, a personal letter describing > their research interests, background, goals, and career plans, > and reference letters from two individuals. Send Materials to > Professor Jerome R. Busemeyer, Department of Psychology, Rm 367, Indiana > University, 1101 E. 10th St. Bloomington, IN 47405-7007. > Cognitive Science information may be obtained at > http://www.psych.indiana.edu/ Indiana University is an > Affirmative Action Employer. From X.Yao at cs.bham.ac.uk Wed Apr 19 06:44:16 2000 From: X.Yao at cs.bham.ac.uk (Xin Yao) Date: Wed, 19 Apr 2000 11:44:16 +0100 (BST) Subject: Lecturer in Computer Science (3 posts) Message-ID: Dear colleagues, We are currently inviting applications for the following three posts. Evolutionary and natural computation is one of the areas that we are particularly interested in. General enquiries should be directed to the HoS (contact info below). I'm happy to answer questions related to the research activities in evolutionary and natural computation. Preliminary announcement: A three-year research fellowship in evolutionary computation in the School of CS at the University of Birmingham will be advertised formally soon. Informal enquiries can be made to me. Best regards, Xin Yao (x.yao at cs.bham.ac.uk) ----------------------------------------------------------------------- URL for further particulars: http://www.bham.ac.uk/personnel/s35434.htm ----------------------------------------------------------------------- REFERENCE NUMBER S35434/00 JOB TITLE Lecturer in Computer Science (3 posts) DEPARTMENT/SCHOOL School of Computer Science HOURS Full time STARTING SALARY On Lecturer A or B scale in the range GBP17,238 - 30,065 per annum (Depending on experience and qualifications) DURATION Open STARTING DATE As soon as possible INFORMAL ENQUIRIES Prof Achim Jung (Head of School) phone (+44) 121 414 4776 email: A.Jung at cs.bham.ac.uk CLOSING DATE FOR RECEIPT OF APPLICATIONS 16 May 2000 Late applications may be considered APPLICATION FORMS RETURNABLE TO The Director of Personnel Services The University of Birmingham Edgbaston, Birmingham, B15 2TT England RECRUITMENT OFFICE FAX NUMBER: +44 121 414 4802 RECRUITMENT OFFICE TELEPHONE NUMBER: +44 121 414 6486 RECRUITMENT OFFICE E-MAIL ADDRESS: h.h.luong at bham.ac.uk Applications are invited for three Lectureships in Computer Science at the University of Birmingham. Applications from all areas of Computer Science will be considered but preference will be given to candidates who show promise in the areas discussed below. 
(See the web page for more details) The successful candidate should have or be about to complete a PhD in Computer Science or an appropriate, closely related field. (S)he is expected to have research experience as evidenced by publications in leading international journals or conference proceedings. The research potential of a new PhD may also be judged from his/her PhD thesis. The successful candidate must have the commitment to achieve excellence in teaching at all levels (from undergraduate teaching to research student supervision), including teaching subjects that may not be in his/her research areas. All academic staff are also expected to help with administration. ----------------------------------------------------------------------- From a540aa at email.sps.mot.com Wed Apr 19 20:16:24 2000 From: a540aa at email.sps.mot.com (Kari Torkkola (a540aa)) Date: Wed, 19 Apr 2000 17:16:24 -0700 Subject: papers available on dimension reduction Message-ID: <38FE4C58.D66F5AFA@email.mot.com> Two papers on dimension reduction are available: 1. Kari Torkkola and William M. Campbell, Mutual Information in Learning Feature Transformations Abstract We present feature transformations useful for exploratory data analysis or for pattern recognition. Transformations are learned from example data sets by maximizing the mutual information between transformed data and their class labels. We make use of Renyi's quadratic entropy, and we extend the work of Principe et al. to mutual information between continuous multidimensional variables and discrete-valued class labels. The paper can be retrieved through page http://members.home.net/torkkola/mmi.html together with some illustrative examples. 2. William M. Campbell, Kari Torkkola, and Sreeram V. Balakrishnan, Dimension Reduction Techniques for Training Polynomial Networks Abstract We propose two novel methods for reducing dimension in training polynomial networks. We consider the class of polynomial networks whose output is the weighted sum of a basis of monomials. Our first method for dimension reduction eliminates redundancy in the training process. Using an implicit matrix structure, we derive iterative methods that converge quickly. A second method for dimension reduction involves a novel application of random dimension reduction to ``feature space.'' The combination of these algorithms produces a method for training polynomial networks on large data sets with decreased computation over traditional methods and model complexity reduction and control. http://members.home.net/torkkola/sp_papers/campbell-icml2000.ps.gz or http://members.home.net/torkkola/sp_papers/campbell-icml2000.pdf Both papers will appear in the Proceedings of ICML 2000, June 29 - July 2, Stanford, CA. -- Kari Torkkola phone: +1-480-4134129 Motorola Labs, MD EL508 fax: +1-480-4137281 2100 East Elliot Road email: a540aa at email.mot.com Tempe, AZ 85284 http://members.home.net/torkkola From janetw at csee.uq.edu.au Thu Apr 20 02:12:40 2000 From: janetw at csee.uq.edu.au (Janet Wiles) Date: Thu, 20 Apr 2000 16:12:40 +1000 (EST) Subject: Lecturer in Computer Science and Electrical Engineering Message-ID: Dear colleagues, We are currently inviting applications for several tenurable posts. Neural and evolutionary computation are areas we are particularly interested in. General enquiries should be directed to the HoD (contact info below). I'm happy to answer questions related to the research activities in neural and evolutionary computation. 
Best regards, Janet Wiles -------------------------------------------- URL for CSEE Dept http://www.csee.uq.edu.au/ -------------------------------------------- THE UNIVERSITY OF QUEENSLAND (Brisbane, Australia) DEPARTMENT OF COMPUTER SCIENCE & ELECTRICAL ENGINEERING Lecturer/Senior Lecturer/Associate Professor in Computer Systems The Department of Computer Science & Electrical Engineering within the Faculty of Engineering, Physical Sciences and Architecture is one of the largest in Australia, with a strong research base and a large postgraduate research school. The position(s) available are continuing, with level of appointment depending on candidate's qualifications and experience. Appointees must have a strong commitment to research and demonstrated achievement appropriate to the level of appointment in one of the following areas of computer systems: Distributed Systems; Intelligent Machines/Systems; or Microelectronics/ Digital System Design. Opportunities for research into computer systems are extensive, particularly through involvement with the departmental Intelligent Systems and Digital Systems research groups, large research centres (such as CRC for Distributed Systems Technology and ARC Special Research Centre in Applied Genomics) and the proposed Australian Microelectronics Network. Further information on the Department may be found at http://www.csee.uq.edu.au/ Appointees will also be expected to contribute to well-established programs in Engineering (Computer Systems, Electrical, Software) and Information Technology, including supervision of Honours, Masters, and PhD students, and to contribute to the development of postgraduate coursework and to the internationalisation of the department's teaching. To these ends, appropriate levels of knowledge and demonstrated expertise in more than one of the following areas are essential: Computer Architecture; Digital System Design; Distributed Systems; Embedded/Systems Programming; and Neural Computing. The appointments will be continuing at Levels B, C or D. Salary ranges are: Aus$49,535 - $58,823 per annum (Level B); $60,681 - $69,968 per annum (Level C); $73,064 - $80,495 per annum (Level D), plus employer superannuation contribution of 17%. Contact Professor Paul Bailes, Head, Department of Computer Science and Electrical Engineering, The University of Queensland, on telephone (07) 3365 3869; fax (07) 3365 4999; email hod at csee.uq.edu.au to discuss the role or to obtain a position description and selection criteria. Alternatively, see the University web site. Send nine (9) copies of your application (an original plus eight (8) copies) to the Personnel Officer, Faculty of Engineering, Physical Sciences and Architecture, The University of Queensland, QLD 4072, Australia. Please quote Reference No. 17700, address the selection criteria and include a resume and the names and contact details of 3 referees. Closing date for applications: 26 May 2000. From RaymonB at heartsol.wmids.nhs.uk Thu Apr 20 06:17:52 2000 From: RaymonB at heartsol.wmids.nhs.uk (Raymond Ben) Date: Thu, 20 Apr 2000 11:17:52 +0100 Subject: Thesis on data visualisation and HRV analysis Message-ID: The following PhD thesis is available from http://www.eleceng.adelaide.edu.au/Personal/braymond/thesis.pdf.gz It deals mostly with the application of data visualisation techniques to a biomedical problem (heart rate variability analysis) but does delve a little into issues of training and regularisation in the LSS and GTM. Comments welcome. 
Ben Abstract Variations in heart rate on a beat-to-beat basis reflect variations in autonomic tone. Heart rate variability (HRV) analysis is a popular research tool for probing autonomic function and has found a wide range of applications. This thesis studies data visualisation and classification of HRV as alternatives to established processing methods. Data visualisation algorithms transform a high-dimensional data set into an easily visualised, two-dimensional representation, or mapping. This transformation is conducted such that the interesting structure of the data set is preserved. Visualisation techniques thus allow the researcher to investigate relationships between HRV data without the need to define fixed bands of interest within each spectrum. Two visualisation algorithms are primarily used throughout this thesis: the least-squares scaling (a form of multidimensional scaling) and the generative topographic mapping (GTM). The least-squares scaling (LSS) may be implemented using radial basis function neural networks, adding the ability to project new data onto an existing mapping. The training of such networks can be done in conjunction with the construction of the map itself, or as a separate step. It has previously been suggested that the former approach yields smoother networks and thus better generalisation; here, it is shown that, with appropriate network regularisation, the generalisation properties of the two methods are comparable. It is also shown that the incorporation of prior knowledge (such as class labels) into the LSS can improve the visual properties of the resulting mapping. A simple modification to the GTM is given to allow similar use of prior information. The visualisation of HRV data is demonstrated on two data sets. The first was from a simple study involving postural and pharmacological intervention in healthy subjects. The LSS and GTM both produced logical mappings of the data, with the ordering of the points within the map reflecting the sympathovagal balance during the various phases of the study. Data visualisation is also demonstrated on HRV data from overnight studies into the sleep apnoea/hypopnoea syndrome. The ordering of the points within the map in this case was strongly related to the power in the very low frequency region of the spectra, known to be an indicator of sleep apnoea. Subjects who suffered predominantly hypopnoeas rather than true apnoeas were found to show HRV similar to control subjects. The final section of the thesis briefly addresses the classification of HRV, emphasising the combination of HRV with information from other diagnostic signals and sources. Classification of data from the intervention study showed that mean heart rate together with HRV allowed more reliable classification than did either mean heart rate or HRV alone. In the classification of sleep apnoea data, the addition of body mass index and age did not improve classification; however, the inclusion of oxyhaemoglobin desaturation information did improve the classification accuracy. From golden at utdallas.edu Thu Apr 20 17:19:24 2000 From: golden at utdallas.edu (Richard M Golden) Date: Thu, 20 Apr 2000 16:19:24 -0500 (CDT) Subject: Special Issue on "Model Selection" in Journal of Mathematical Psychology Message-ID: I would like to call people's attention to the special issue on "model selection" in the Journal of Mathematical Psychology which just came out. Here is the table of contents. Journal of Mathematical Psychology, Vol.
44, March 2000 "Accuracy, Scope, and Flexibility of Models" James E. Cutting "How to Assess a Model's Testability and Identifiability" Bamber and van Santen "An Introduction to Model Selection" Walter Zucchini "Akaike's Information Criterion and Recent Developments in Information Complexity" Hamparsum Bozdogan "Bayesian Model Selection and Model Averaging" Larry Wasserman "Cross-Validation Methods" P. Grunwald "Statistical Tests for Comparing Possibly Misspecified and Nonnested Models" Richard Golden "Model Comparisons and Model Selections based on Generalization Criterion Methodology" Jerome Busemeyer and Yi-Min Wang "The Importance of Complexity in Model Selection" In Jae Myung "Key Concepts in Model Selection: Performance and Generalizability" Malcolm Forster ******************************************************************************* Richard M. Golden, Associate Professor Cognitive Science & Engineering The University of Texas at Dallas, Box 830688 Richardson, Texas 75083-0688, PHONE: (972) 883-2423 EMAIL: golden at utdallas.edu, WEB: http://www.utdallas.edu/~golden/index.html ******************************************************************************* From wahba at stat.wisc.edu Thu Apr 20 20:37:03 2000 From: wahba at stat.wisc.edu (Grace Wahba) Date: Thu, 20 Apr 2000 19:37:03 -0500 (CDT) Subject: Intro Model Bldg w. RKHS Message-ID: <200004210037.TAA18824@hera.stat.wisc.edu> `An Introduction to Model Building With Reproducing Kernel Hilbert Spaces' notes from a shortcourse given at Interface 2000 - now available at ftp://ftp.stat.wisc.edu/pub/wahba/interf/index.html Abstract: We assume no knowledge of reproducing kernel Hilbert spaces, but review some basic concepts, with a view towards demonstrating how this setting allows the building of interesting statistical models that allow the simultaneous analysis of heterogeneous, scattered observations, and other information. The abstract ideas will be illustrated with several data analyses including modeling risk factors for eye diseases. ............................................... Gaussian processes, radial basis functions, support vector machines, functional ANOVA decompositions and the bias-variance tradeoff fit in the framework discussed. ................................................ Grace Wahba From hzs at cns.brown.edu Fri Apr 21 12:51:24 2000 From: hzs at cns.brown.edu (Harel Z. Shouval) Date: Fri, 21 Apr 2000 12:51:24 -0400 (EDT) Subject: Symposium announcement Message-ID: SYMPOSIUM ANNOUNCEMENT: The Dynamic Brain: Molecules, Mathematics, the Mind Brown University, May 31 - June 3, 2000 The "Dynamic Brain" Meeting celebrates interdisciplinary research and education as well as the Brain Science Program at Brown University, which was launched in October 1999. Twenty-four of the world's leading researchers will present the latest findings in brain development and function. The meeting will bring together experimentalists and theoreticians to discuss how we can accelerate our understanding of the brain through interdisciplinary studies that blend mathematics, biology, computation, and cognitive and behavioral sciences; this collaboration across disciplines is the model for the Brain Science Program at Brown. In particular, there will be an emphasis on the interaction of theory and experiment that has greatly enriched both endeavors. This interaction has enabled scientists to pose new questions with precision and clarity.
Each of the four sessions will span levels of study and will include group discussion of challenges and interdisciplinary approaches to understanding brain function. We invite you to join us for this landmark event. John Donoghue and Leon Cooper, conference organizers PROGRAM Wednesday, May 31 Registration: Begins at 3:00 p.m. Opening Reception: 6:00 p.m. Thursday, June 1 Session I: 9:00 a.m.- 3:00 p.m. Receptive Field Plasticity: From Molecule to Systems Speakers: Mark Bear, Tobias Bonhieffer, Leon Cooper, Yves Fregnac, Robert Malenka, Susumu Tonegawa Session II: 3:00-6:30 p.m. Temporal Dynamics: From Synapse to Systems Speakers: Laurence Abbott, Barry Connors, John Donoghue, Eve Marder, Henry Markram, Carla Shatz Friday, June 2 Session II: (continued) 9:00-11:00 a.m. Temporal Dynamics: From Synapse to Systems Session III: 11:00 a.m.- 5:00 p.m. Memory Consolidation: From Molecule to Behavior Speakers: Cristina Alberini, Justin Fallon, Eric Kandel, Richard Morris, Larry Squire, Jerry Yin Saturday, June 3 Session IV: 9:00 a.m.- 3:00 p.m. The Neuronal Mind Speakers: Jean-Pierre Changeux, Richard Frackowiak, David Mumford, Keiji Tanaka, Michael Tarr REGISTRATION INFORMATION Fees Graduate and Postdoctoral Students $50 General Registration $100 Accommodations Westin Hotel, Waterplace Park, Providence. Special conference rates are available until April 30, 2000: US $149 single/double; $175 triple/quad. Contact the hotel directly at: Tel 401.598.8000 Fax 401.598.8200 Web site http://www.westin.com/ FOR MORE INFORMATION Tel 401.863 9524 E-mail brainscience at brown.edu Web site http://www.brainscience.brown.edu/ From rsun at cecs.missouri.edu Fri Apr 21 14:35:20 2000 From: rsun at cecs.missouri.edu (Ron Sun) Date: Fri, 21 Apr 2000 13:35:20 -0500 Subject: recent issues of Cognitive Systems Research Message-ID: <200004211835.NAA07121@pc113.cecs.missouri.edu> Contents of the recent issues of Cognitive Systems Research: ------------------------- Table of Contents for Cognitive Systems Research Volume 1, Issue 1, 1999 Ron Sun, Vasant Honavar and Gregg C. Oden Editorial: Integration of cognitive systems across disciplinary boundaries 1-3 Andy Clark Where brain, body, and world collide 5-17 Arthur M. Glenberg, David A. Robertson, Jennifer L. Jansen and Mina C. Johnson-Glenberg Not Propositions 19-33 Arthur C. Graesser, Katja Wiemer-Hastings, Peter Wiemer-Hastings and Roger Kreuz AutoTutor: A simulation of a human tutor 35-51 Pentti Kanerva Book Review: Artificial Minds, Stan Franklin, MIT Press, Cambridge, MA, 1995 53-57 Xin Yao Conference Report: Evolutionary computation comes of age 59-64 ------------------------- Table of Contents for Cognitive Systems Research Volume 1, Issue 2, January 2000 Mark H. Bickhard Information and representation in autonomous agents 65-75 Valerie Gray Hardcastle The development of the self 77-86 Umberto Castiello et al. Human inferior parietal cortex `programs' the action class of grasping 89-97 Marsha C. Lovett, Larry Z. Daily and Lynne M. Reder A source activation theory of working memory: cross-task prediction of performance in ACT-R 99-118 A. El Imrani, A. Bouroumi, H. Zine El Abidine, M. Limouri and A. Essad A fuzzy clustering-based niching approach to multimodal function optimization 119-133 ------------------------- Table of Contents for Cognitive Systems Research Volume 1, Issue 3, April 2000 Brijesh Verma and Chris Lane Vertical jump height prediction using EMG characteristics and neural networks 135-141 Scott A. 
Huettel and Gregory Lockhead Psychologically rational choice: selection between alternatives in a multiple-equilibrium game 143-160 Robert C. Mathews, Lewis G. Roussel, Barbara P. Cochran, Ann E. Cook and Deborah L. Dunaway The role of implicit learning in the acquisition of generative knowledge 161-174 ------------------------------------------------------------------- Publish your work with Cognitive Systems Research --- the new journal devoted to the interdisciplinary study of cognitive science http://www.elsevier.nl/locate/cogsys Elsevier Science Co-Editors-in-Chief Ron Sun, University of Missouri-Columbia. E-mail: rsun at cecs.missouri.edu Vasant Honavar, Iowa State University. E-mail: honavar at cs.iastate.edu Gregg Oden, University of Iowa. E-mail: gregg-oden at uiowa.edu Cognitive Systems Research covers all topics of cognition, including ' Problem-Solving and Cognitive Skills ' Knowledge Representation and Reasoning ' Perception ' Action and Behavior ' Memory ' Learning ' Language and Communication ' Agents ' Integrative and Interdisciplinary Studies For a full description of subjects and submission information, access the Website: http://www.elsevier.nl/locate/cogsys or http://www.cecs.missouri.edu/~rsun/journal.html ------------------------------------------------------------------- From tommi at ai.mit.edu Fri Apr 21 16:04:43 2000 From: tommi at ai.mit.edu (Tommi Jaakkola) Date: Fri, 21 Apr 2000 16:04:43 -0400 Subject: AISTATS 2001: Call for papers Message-ID: <200004212004.QAA06446@susi.ai.mit.edu> (apologies for multiple posting) ==================================================================== AI and STATISTICS 2001 Eighth International Workshop on Artificial Intelligence and Statistics January 3-6, 2001, Hyatt Hotel, Key West, Florida http://www.ai.mit.edu/conferences/aistats2001/ This is the eighth in a series of workshops which have brought together researchers in Artificial Intelligence (AI) and in Statistics to discuss problems of mutual interest. The exchange has broadened research in both fields and has strongly encouraged interdisciplinary work. Papers on all aspects of the interface between AI & Statistics are encouraged. To encourage interaction and a broad exchange of ideas, the presentations will be limited to about 20 discussion papers in single session meetings over three days (Jan. 4-6). Focused poster sessions will provide the means for presenting and discussing the remaining research papers. Papers for poster sessions will be treated equally with papers for presentation in publications. Attendance at the workshop will not be limited. The three days of research presentations will be preceded by a day of tutorials (Jan. 3). These are intended to expose researchers in each field to the methodology and techniques used in other related areas. 
The Eighth workshop especially encourages submissions related to the following workshop themes in the interface between information retrieval and statistics: Statistical natural language processing Game theory Missing information; unlabeled examples Error correcting codes In addition, papers on all aspects of the interface between AI & Statistics are strongly encouraged, including but not limited to Automated data analysis Cluster analysis and unsupervised learning Statistical advisory systems, experimental design Integrated man-machine modeling methods Interpretability in modelling Knowledge discovery in databases Metadata and the design of statistical data bases Model uncertainty, multiple models Multivariate graphical models, belief networks, causal modeling Online analytic processing in statistics Pattern recognition Prediction: classification and regression Probabilistic neural networks Probability and search Statistical strategy Vision, robotics, natural language processing, speech recognition Visualization of very large datasets Submission Requirements: ----------------------- Electronic submission of abstracts is required. The abstracts (up to 4 pages in length) should be submitted through the AI and Statistics Conference Management page supported by Microsoft Research. More specific instructions will be made available at http://cmt.research.microsoft.com/AISTATS2001/ In special circumstances other arrangements can be made to facilitate submission. For more information about possible arrangements, please contact the conference chairs. Submissions will be considered if they are received by midnight July 1, 2000. Please indicate the theme and/or the topic(s) your abstract addresses. Receipt of all submissions will be confirmed via electronic mail. Acceptance notices will be emailed by September 1, 2000. Preliminary papers (up to 12 pages, double column) must be received by November 1, 2000. These preliminary papers will be copied and distributed at the workshop. 
Program Chairs: -------------- Thomas Richardson, University of Washington, tsr at stat.washington.edu Tommi Jaakkola, MIT, tommi at ai.mit.edu Program Committee: ----------------- Russell Almond, Educational Testing Service, Princeton Hagai Attias, Microsoft Research, Cambridge Yoshua Bengio, University of Montreal Max Chickering, Microsoft Research, Redmond Greg Cooper, University of Pittsburgh Robert Cowell, City University, London Phil Dawid, University College, London Vanessa Didelez, University of Munich David Dowe, Monash University Brendan Frey, University of Waterloo Nir Friedman, Hebrew University, Jerusalem Dan Geiger, Technion Edward George, University of Texas Paolo Giudici, University of Pavia Zoubin Ghahramani, University College, London Clark Glymour, Carnegie-Mellon University David Heckerman, Microsoft Research, Redmond Thomas Hofmann, Brown University Reimar Hofmann, Siemens Michael Jordan, University of California, Berkeley David Madigan, Soliloquy Chris Meek, Microsoft Research, Redmond Marina Meila, Carnegie-Mellon University Kevin Murphy, University of California, Berkeley Mahesan Niranjan, University of Sheffield John Platt, Microsoft Research, Redmond Greg Ridgeway, University of Washington Lawrence Saul, AT&T Research Prakash Shenoy, University of Kansas Dale Schuurmans, University of Waterloo Padhraic Smyth, University of California, Irvine David Spiegelhalter, University of Cambridge Peter Spirtes, Carnegie-Mellon University Milan Studeny, Academy of Sciences, Czech Republic Michael Tipping, Microsoft Research, Cambridge Henry Tirri, University of Helsinki Volker Tresp, Siemens Chris Watkins, Royal Holloway and Bedford New College, Nanny Wermuth, University of Mainz Joe Whittaker, Lancaster University Chris Williams, University of Edinburgh From maass at igi.tu-graz.ac.at Sat Apr 22 11:46:47 2000 From: maass at igi.tu-graz.ac.at (Wolfgang Maass) Date: Sat, 22 Apr 2000 17:46:47 +0200 Subject: Program of the NeuroCOLT Workshop May 2000 in Graz (Austria) Message-ID: <3901C967.A8561F95@igi.tu-graz.ac.at> Program of the NeuroCOLT Workshop NEW PERSPECTIVES IN THE THEORY OF NEURAL NETS May 3 to 5 , 2000 at Schloss St. Martin in Graz (Austria). 
Organizer: Wolfgang Maass, Graz University of Technology ---------------------------------------------------- Wednesday, May 3 morning: Shai Ben-David, Israel: An Efficient Agnostic Learning Algorithm for Half-Spaces Michael Schmitt, Germany: On the Complexity of Computing and Learning with Multiplicative Neural Networks Pekka Orponen, Finland: Some new results on the computational properties of analog recurrent neural networks Georg Dorffner, Austria: Recurrent neural networks and symbolic dynamics afternoon: Juergen Schmidhuber, CH: Long Short-Term Memory and Context Sensitive Languages Volker Tresp, Germany: The Generalized Bayesian Committee Machine Nicol N.Schraudolph, CH: Stochastic Meta-Descent Ron Meir, Israel: Localized Boosting Algorithms and Weak Learning Klaus Obermeyer, Germany: TBA ---------------------------------------------------------- Thursday, May 4 morning: Rodney Douglas, CH: Transposing cortical processing into neuromorphic analog VLSI circuits Wolfgang Maass/Robert Legenstein, Austria: Foundations of a Circuit Complexity Theory for Sensory Processing Georg Schnitger, Germany: Neural Circuits for Elementary Vision Problems Andreas Herz, Germany: Neural representation of acoustic communication signals afternoon: Wulfram Gerstner, CH: Spike-time Dependent Hebbian Learning Thomas Natschlaeger, Austria: Dynamic Synapses as Nonlinear Filters Peter Koenig, CH: Learning and synchronization (at 4pm we leave for the opening of the exhibition gr2000az in Schloss Eggenberg, see http://www.comm.gr2000az.at/ ) --------------------------------------------------- Friday, May 5 morning: Bob Williamson, Australia: Margins, Sparsity and Perceptrons Bernhard Schoelkopf, GB: Kernels: Similarities and Dissimilarities Chris Bishop, GB: The Variational Relevance Vector Machine John Shawe-Taylor, GB: Bounds Combining Sparsity and Margins afternoon: Helene Paugam-Moisy, France: Multiclass discrimination, SVM, and multimodal learning Manfred Opper, GB: The TAP Mean Field approach for probabilistic models Martin Anthony, GB: Some Results on Cross-Validation Bhaskar Das Gupta, USA: On Approximate Learning by Multi-layered Feedforward Circuits Pascal Koiran, France: The stability of saturated linear dynamical systems is undecidable Zoubin Ghahramani, GB: Bayesian Learning of Model Structure (evening: discussion of the future of research on neural nets in Europe, including meetings and funding options) ----------------------------------------------------- POSTERS: Nello Cristianini, Harald Burgsteiner (A Learning Algorithm for Winner-Take-All Circuits), Jyrki Kivinen, Robert Legenstein (Circuit Complexity Theory for Sensory Processing), Gabor Lugosi, Bojan Novak (Conditions and Requirements for an Efficient Parallel Implementation of Various Learning Algorithms), Laurent Perrinet (Networks of Integrate-and-Fire Neuron using Rank Order Coding: How to implement Hebbian Learning), Jiri Sima, Eva Volna (Optimal Neural Network Topology for Real Problem of Pattern Recognition). 
------------------------------------------------------------- ABSTRACTS, REGISTRATION AND TRAVEL INFORMATION: http://www.tu-graz.ac.at/igi/maass/nn2000/ A few slots are still open for registration; for lodging see http://www.graztourism.at/ From russell at CS.Berkeley.EDU Sun Apr 23 16:58:27 2000 From: russell at CS.Berkeley.EDU (Stuart Russell) Date: Sun, 23 Apr 2000 13:58:27 -0700 (PDT) Subject: UC Berkeley postdoc position(s) Message-ID: <200004232058.NAA01632@tower.CS.Berkeley.EDU> POSTDOC POSITION: MOTOR CONTROL LEARNING The Complex Motor Learning project at UC Berkeley would like to hire one or more postdoctoral research scientists. The project combines approaches from reinforcement learning, adaptive control theory, and biological motor control in order to study and develop systems that learn complex motor control behaviors such as walking, running, throwing, and flying. Experimental subjects include humans, insects, and robotic systems. Faculty investigators include Fearing, Russell, and Sastry (EECS); Dickinson, Farley, and Full (Int. Biol.); and Ivry (Psych.). A more complete project description appears at http://www.cs.berkeley.edu/~russell/cml/ Candidates should have: - a very strong background in at least one and preferably two of the three disciplines listed above; - excellent mathematical skills; and proficiency in at least one of - software development (ideally, physical simulation systems) - experimental robotics - behavioral studies of human or animal subjects Interested candidates should send a short letter stating interest and a CV with names and email addresses of three references to Stuart Russell, Computer Science Division, University of California, Berkeley, CA 94720. Email applications (PLAIN TEXT AND/OR UNENCODED POSTSCRIPT ONLY) to russell at cs.berkeley.edu From xjliu at wspc.com.sg Sun Apr 23 20:58:01 2000 From: xjliu at wspc.com.sg (Xuejun) Date: Mon, 24 Apr 2000 08:58:01 +0800 Subject: New Books on Robotics Message-ID: <00Apr24.085802sst.14978@gateway.wspc.com.sg> **** Apologies if you receive multiple copies **** Dear Colleagues, I wish to inform you on the following recent books and volumes by World Scientific and Imperial College Press. ==================================== Series in Robotics and Intelligent Systems Vol. 24 Robot Learning - An Interdisciplinary Approach Edited by J. Demiris (Univ. Edinburgh) & A. Birk (Vrije Universiteit Brussel) 220pp, May 2000 ISBN: 981-02-4320-0 http://www.worldscientific.com/books/compsci/4436.html ==================================== Associative Learning for a Robot Intelligence By J. H. Andreae (Univ. Canterbury) 360pp, Sept 1998 ISBN: 1-86094-132-x http://www.worldscientific.com/books/compsci/p113.html ===================================== Geometrical Foundations of Robotics Edited by J. M. Selig (South Bank Univ. UK) 164pp, Mar 2000 ISBN: 981-02-4113-5 http://www.worldscientific.com/books/compsci/4257.html ===================================== Series in Intelligent Control and Intelligent Automation - Vol. 11 Multisensor Fusion - A Minimal Representation Framework By R.Joshi (Real-Time Innovations Inc., USA) & A. C Sanderson (Rensselaer Polytechnic Institute, USA) 336pp, Dec 1999 ISBN: 981-02-3880-0 http://www.worldscientific.com/books/compsci/4106.html ===================================== For more information on these and previous books and volumes see the WWW page of the series: ***************************************** Series in Robotics and Intelligent Systems Edited by C. J. Harris (Univ. 
Southampton) http://www.worldscientific.com/books/series/wssris_series.html ***************************************** Series on Machine Perception & Artificial Intelligence Edited by H. Bunke (Univ. Bern) & P. S. P. Wang (Northeastern Univ.) http://www.worldscientific.com/books/series/wsmpai_series.html ***************************************** Series in Intelligent Control and Intelligent Automation Edited by F. Y. Wang (Univ. Arizona) http://www.worldscientific.com/books/series/sicia_series.html ***************************************** Many new books and volumes are coming soon in these series. With my best regards. Sincerely yours, Xuejun Liu World Scientific and Imperial College Press http://www.worldscientific.com/ From terry at salk.edu Mon Apr 24 14:34:28 2000 From: terry at salk.edu (terry at salk.edu) Date: Mon, 24 Apr 2000 11:34:28 -0700 (PDT) Subject: NEURAL COMPUTATION 12:5 Message-ID: <200004241834.LAA03549@hebb.salk.edu> Neural Computation - Contents - Volume 12, Number 5 - May 1, 2000 REVIEW Expanding NEURON's Repertoire of Mechanisms With NMODL M. L. Hines and N. Y. Carnevale ARTICLE Latent Attractors: A Model For Context-Dependent Place Representations in The Hippocampus Simona Doboli, Ali A. Minai and Phillip J. Best NOTE The Approach of A Neuron Population Firing Rate to A New Equilibrium: An Exact Theoretical Result B.W. Knight, A. Omurtag and L. Sirovich Formation of Direction Selectivity in Natural Scene Environments Brian Blais, Harel Shouval and Leon N. Cooper LETTERS A Phase Model of Temperature-Dependent Mammalian Cold Receptors Peter Roper, Paul C. Bressloff and Andre Longtin The Number of Synaptic Inputs and The Synchrony of Large Sparse Neuronal Networks D. Golomb and D. Hansel Neural Network Architecture For Visual Selection Yali Amit Using Bayes Rule to Model Multisensory Enhancement in the Superior Colliculus Thomas J. Anastasio, Paul E. Patton, and Kamel Belkacem-Boussaid Choice and Value Flexibility Jointly Contribute to The Capacity of a Subsampled Quadratic Classifier Panayiota Poirazi and Bartlett W. Mel New Support Vector Algorithms Alex Smola, Bernhard Scholkopf, Robert Williamson, Peter Bartlett ----- ON-LINE - http://neco.mitpress.org/ SUBSCRIPTIONS - 2000 - VOLUME 12 - 12 ISSUES
                 USA      Canada*    Other Countries
Student/Retired  $60      $64.20     $108
Individual       $88      $94.16     $136
Institution      $430     $460.10    $478
* includes 7% GST
MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 258-6779 mitpress-orders at mit.edu ----- From mayank at MIT.EDU Tue Apr 25 20:59:59 2000 From: mayank at MIT.EDU (Mayank R. Mehta) Date: Tue, 25 Apr 2000 20:59:59 -0400 Subject: Effect of learning on receptive field shape and direction selectivity. Message-ID: <200004260100.VAA01957@all-night-tool.mit.edu> The following two papers on the effect of Hebbian learning, or temporally asymmetric NMDA-dependent LTP, on receptive field shape [1], and direction selectivity in V1 [2], are available at: http://www.mit.edu/~mayank/ -Mayank -------------------------------------------------------------------- Mayank R. Mehta http://www.mit.edu/~mayank/ E25-236, 45 E Carleton Street Work: 617 252 1841 Massachusetts Institute of Technology FAX: 617 258 7978 Cambridge, MA 02139 Email: Mayank at MIT.edu -------------------------------------------------------------------- Paper 1: `Experience-dependent, asymmetric shape of hippocampal receptive fields'. Mayank R. Mehta, Michael C. Quirk & Matthew A. Wilson. Neuron (2000) 25:707-715.
Abstract: We propose a novel parameter, namely the skewness or the asymmetry of the shape of a receptive field, and use this measure to characterize two properties of hippocampal place fields that may reflect the underlying mechanism of experience-dependent plasticity. First, a majority of hippocampal receptive fields on linear tracks are negatively skewed, such that during a single pass the firing rate is low as the rat enters the field, but high as it exits. Second, while the place fields are symmetric at the beginning of a session, they become highly asymmetric with experience. Further experiments suggest that these results are likely to arise due to synaptic plasticity during behavior, and not due to other non-specific mechanisms. Using a purely feedforward neural network model we show that following repeated directional activation, the temporally asymmetric nature of NMDA-dependent LTP/D could result in an experience-dependent asymmetrization of receptive fields. Paper 2: `From Hippocampus to V1: Effect of LTP on spatio-temporal dynamics of receptive fields'. Mayank R. Mehta & Matthew A. Wilson. To appear in Neurocomputing (2000). Recent studies have revealed novel effects of patterns of neuronal activity and synaptic plasticity on the size and specificity of receptive fields. However, little has been done to quantify their effect on the receptive field *shape*. It has been shown that place fields are highly asymmetric such that the firing rate of a place cell rises slowly as a rat enters a place field but the firing rate drops off abruptly at the end of the place field in an experience-dependent fashion. Here we present a computational model that can explain the results, based on NMDA-dependent LTP. Striking similarities between the hippocampal and striate receptive field dynamics are pointed out. Our model suggests that LTP/D could result in diverse phenomena such as phase precession in the hippocampal neurons and the origin of directional receptive fields in the striate cortex. It is suggested that the key feature underlying directionality and inseparable spatio-temporal dynamics is the asymmetric shape of the receptive field. From allan at biomedica.org Wed Apr 26 10:28:42 2000 From: allan at biomedica.org (Allan Kardec Barros) Date: Wed, 26 Apr 2000 11:28:42 -0300 Subject: Biomedical Engineering - Mailing list Message-ID: <3906FD1A.EE7A9E03@biomedica.org> Dear connectionists, We have created in Nagoya University a new mailing list for those who are interested in biomedical engineering. Further details can be found at: http://www.ohnishi.nuie.nagoya-u.ac.jp/BME. Best regards, Allan Kardec Barros, Yoshinori Takeuchi, Hiroaki Kudo From robert.smith at uwe.ac.uk Wed Apr 26 11:21:09 2000 From: robert.smith at uwe.ac.uk (Robert E. Smith) Date: Wed, 26 Apr 2000 16:21:09 +0100 Subject: Call for Abstracts: Workshop on Self-Organising Multi-Agent Systems (UK) Message-ID: Sent to: connectionists at MAILBOX.SRV.CS.CMU.EDU Self-organisation in Multi-agent Systems (SOMAS) Date: July 27-28, 2000 Milton Keynes, UK A Workshop organised by the Emergent Computing Network http://images.ee.umist.ac.uk/emergent/ Call for Contributors 1 Introduction Multi-agent systems (MAS) are collections of interacting autonomous entities. The behaviour of the MAS is a result of the repeated asynchronous action and interaction of the agents. Understanding how to engineer self-organisation is thus central to the application of agents on a large scale.
Multi-agent simulations can also be used to study emergent behaviour in real systems. Interest in large-scale systems of agents is growing, as is illustrated by the recent Framework Five (Future and Emergent Technologies) action on the so-called Universal Information Ecosystem. Advances in telecommunications and the spread of the Internet, electronic commerce, etc. mean that information infrastructure operates as a global dynamic system. As time passes, the density and diversity of interconnections in such systems will increase rapidly. Moreover, such systems are being required to service the needs of a diverse set of users (whatever their distinctive needs), not just a virtual 'representative' user. Thus, such systems must adapt to personal requirements, by providing highly customised packages of services. Simultaneously providing highly diverse services to a huge user population in an enormous, interconnected system is a task beyond centralised management techniques. The only way to manage this form of agent-based system is to utilise its emergent properties to make it self-organising and self-regulating. Desirable self-organisation is observed in many biological, social and physical systems. However, fostering these conditions in artificial systems proves to be difficult and offers the potential for undesirable behaviours to emerge. Thus, it is vital to be able to understand and shape emergent behaviours in agent based systems. Current mathematical and empirical tools give only a partial insight into emergent behaviour in large, agent-based societies. The goal of this workshop is to open a dialog among practitioners from diverse fields, including: agent based systems, complex systems, AI, optimisation theory and non-linear systems, neural networks, evolutionary computation, neuro-biology, and computer science. The workshop will focus on localised means of measuring, understanding, and shaping emergent behaviour in large scale distributed systems. The workshop represents an important opportunity for those active or interested in emergent behaviour research, to hear about current work, discuss future directions and priorities, and form invaluable research contacts. 2 Venue and Format The Workshop will commence on July 27th and will take place over 2 days at the BT Conference Facility in Milton Keynes, UK. Since the primary goal of this workshop is to provide time for communication between presenters and attendees, ample opportunity will be provided for structured discussion. A detailed programme will be issued by June 2000. 3 Invited Speakers We have a promising list of invited speakers to address the workshop's theme from a variety of perspectives. Names will be announced as they are confirmed. 4 Call for Contribution You are invited to contribute to this workshop. Your 250-300 word abstract should include title, authors, affiliations, keywords, source of external support (if any) and the body of the abstract should stress the relevance of your work to the workshop topic. All accepted talks will be allocated 20-25 minutes. Speakers will be asked to provide copies of their overheads for inclusion in the Workshop information pack. Selected contributions from this workshop and others in the series will be published by Springer-Verlag as a highlighted volume of their "Lecture Notes in Computer Science" series. Detailed submission instructions for this series publication will be issued later. 5 Registration Details of workshop registration are forthcoming.
Please contact one of the organizers, listed below. 6 General Enquiries To obtain submission or registration instructions, or to ask any general questions relating to this Workshop, please contact one of the organizers: Paul Kearney BT Labs, Adastral Park Phone: 01473 605544 Email: paul.3.kearney at bt.com Robert Smith The Intelligent Computer Systems Centre The University of The West of England Phone: 0117 942 1495 Email: robert.smith at uwe.ac.uk Andy Wright BAe Sowerby Research (currently a Visiting Fellow at Bristol University) Phone: 0117 95 46883 Email: Andy.Wright at bristol.ac.uk 7 Important Dates: Abstract Submission Deadline: May 21st, 2000 Notification of Acceptance of Abstract: June 30, 2000. Workshop Dates: July 27th-28th, 2000 8 Emergent Computing Workshop Series This is the 6th workshop being organised by the `Emergent Computing' network, to bring together multi-disciplinary ideas from complex systems, AI, optimisation theory and non-linear systems, neural networks, neuro-biology and computer science. The workshops are: 1. Self-Organising Systems at the University of Manchester Institute of Science and Technology, UK 2. Spatially Distributed Nonlinear Systems at the University of Leeds, UK (December 1999) 3. Associative Computing at the University of York, UK (February 2000) 4. Emergent Computation in Molecular and Cellular Biology at the University of Hertfordshire, UK (April 2000) 5. Strategies for Implementing Large Scale Emergent Computing Systems at the University of Wales, Cardiff, UK (June 2000) 6. Self-organisation in Multi-agent Systems (SOMAS) at The BT Conference Centre, Milton Keynes, UK (July 27-28, 2000). From hali at theophys.kth.se Wed Apr 26 20:04:46 2000 From: hali at theophys.kth.se (Hans Liljenström) Date: Thu, 27 Apr 2000 02:04:46 +0200 Subject: Nordic Symposium on Computational Biology Message-ID: <3907841E.1DCE3F88@theophys.kth.se> NORDIC SYMPOSIUM ON COMPUTATIONAL BIOLOGY 2000 18-23 JUNE 2000 AGORA FOR BIOSYSTEMS, SIGTUNA, SWEDEN Co-organized by Agora for Biosystems and Nordita The symposium addresses questions of high current interest in biology and related fields. The main focus is on bioinformatics, but computational methods applied to molecular and cellular biology, as well as computational neurobiology and ecology will also be included. The symposium is primarily intended to attract young scientists in the Nordic and surrounding countries, but all interested are welcome as long as space allows. A major objective is to give an introduction to some of the most important problems and challenges within the field. This will be accomplished by several tutorials, in addition to invited talks given by top scientists from abroad and from the region. There will also be ample time for poster presentations and for formal and informal discussions. TOPICS INCLUDE: - Bioinformatics - Macromolecular dynamics - Computational (functional) genomics - Computational neurobiology - Computational ecology INVITED SPEAKERS INCLUDE: Edward Cox, Dept. of Molecular Biology, Princeton University Mats Gyllenberg, Dept. of Mathematics, University of Turku John Hopfield, Dept. of Molecular Biology, Princeton University Eric Jakobsson, Dept. of Molecular and Integrative Physiology, University of Illinois Inge Jonassen, Dept. of Informatics, University of Bergen Erik Lindahl, Dept. of Physics, Royal Institute of Technology Kristian Lindgren, Dept.
of Physical Resource Theory, Chalmers and Göteborg University Michael Mackey, Department of Physiology, McGill University Johan Paulsson, Dept. of Molecular Biology, Uppsala University Hans Plesser, Dept. of Physics, Göttingen Dirk Repsilber, Inst. for Forest Genetics and Forest Tree Breeding, University of Hamburg Mattias Wahde, Dept. of Mechanical Engineering, Chalmers ORGANIZING COMMITTEE Soren Brunak, Center for Biological Sequence Analysis, Technical University of Denmark Olle Edholm, Theoretical Physics, Royal Institute of Technology, Stockholm Gaute Einevoll, Dept. of Physics, Agricultural University of Norway Jarl-Thure Eriksson, Electrical Engineering Labs, Tampere University Gunnar von Heijne, Center for Bioinformatics, Stockholm University John Hertz, Nordita, Copenhagen Hans Liljenström, Agora for Biosystems, Sigtuna Dietrich von Rosen, Dept. of Biometrics, SLU, Uppsala More information and the registration form can be obtained at: - http://www.agora.kva.se/meetings/CompBio2000 Please register electronically using a web browser if possible. Abstract submissions can also be made via the online registration form. - email: hans.liljenstrom at sdi.slu.se - by mail: Hans Liljenstrom Dept. of Biometrics, SLU Box 7013 SE-750 07 Uppsala Sweden FAX: +46-18-673502 REGISTRATION DEADLINE: May 31, 2000. From Alex.Smola at anu.edu.au Wed Apr 26 23:25:57 2000 From: Alex.Smola at anu.edu.au (Alex Smola) Date: Thu, 27 Apr 2000 13:25:57 +1000 Subject: New Website on Kernel Machines Message-ID: <3907B345.C7497895@anu.edu.au> We are pleased to announce a new website on Kernel Machines and related methods. It can be found at http://www.kernel-machines.org It is a superset of the Support Vector website at GMD FIRST. Most links to http://svm.first.gmd.de will still be operational and should result in the near future (as soon as the changes are made to the site in Berlin) in a redirect of your browser to the new site. However, we would like to ask you to update any existing links. Compared to the old site, the main difference is that we have enlarged the scope to form a repository not only for research on SVMs, but also Gaussian Process prediction, Mathematical Programming with Kernels, Regularization Networks, Reproducing Kernel Hilbert Spaces, and related methods. The aim is to serve as a central information source by providing links to papers, upcoming events, datasets, code, a discussion board, etc. On the technical side, novelties include the fact that data entry is now fully automatic, papers can be uploaded to the website, there exists a search option for papers, and data can also be provided in BibTeX format, which should make it easier to reference papers available at the site. We would like to express thanks to GMD FIRST for allowing us to host the webpage in the past three years in Berlin. Links to research at the Berlin group on Intelligent Data Analysis can now be found at http://ida.first.gmd.de The changes were needed since the boundaries between Support Vectors and other methods have become less well defined and we feel that there is scientific benefit in bringing the various research areas together. Moreover, Alex Smola and Bernhard Scholkopf have moved to the Australian National University (Canberra), and Microsoft Research (Cambridge), respectively. The server for http://www.kernel-machines.org is located at the Australian National University. We thank ANU for the resources. The organizational structure is novel.
From the beginning, we have strived to create a forum which would provide a balanced representation of the emerging field of SVM and kernel methods research. It is our hope that this forum has contributed its share to the exciting developments that all of us have witnessed over the last years. Now that the field has become more mature, we felt that it was time to support the website with an editorial board. This change also reflects the increasing importance of dissemination of research via the world wide web. If web sites gradually take over part of what journals were responsible for in the past, then they should also adhere to comparable levels of scholarly standards. We believe that the changes will ensure that the website will continue to be a useful resource for researchers. The editorial board comprises Nello Cristianini, Royal Holloway College, University of London Bernhard Scholkopf, Microsoft Research, Cambridge (UK) John Shawe-Taylor, Royal Holloway College, University of London Alex Smola, Australian National University, Canberra Vladimir Vapnik, AT&T, New Jersey Bob Williamson, Australian National University, Canberra. -- / Alexander J. Smola / spigot.anu.edu.au/~smola / / Australian National University / Alex.Smola at anu.edu.au / / Dept. of Engineering and RSISE / Tel: (+61) 410 457 686 / / Canberra, ACT 0200 / Fax: (+61) 2 6249 0506 / From Mark.Butler at unilever.com Thu Apr 27 08:34:16 2000 From: Mark.Butler at unilever.com (mark butler) Date: Thu, 27 Apr 2000 13:34:16 +0100 (GMT Daylight Time) Subject: JOB: Research Scientist in Adaptive Computation Message-ID: Adaptive Computation Scientist - Unilever Research Wirral - North West A truly multi-local multinational, Unilever are taking the needs of millions of people across the globe seriously. We invest 550 million in pioneering new research to ensure that our products remain the preferred choice, 150 million times a day. Adaptive Computation techniques play a key role in the development of a wide spectrum of our product and manufacturing applications for household brands such as Impulse, Organics, Persil, Flora and Wall's Ice Cream. Our recently established Centre of Excellence in Port Sunlight is committed to this area as well as developing and maintaining links with leading academics. Due to success and expansion of the Group we have a number of exciting opportunities for talented individuals who will relish the challenge of developing leading-edge solutions to complex industrial problems. A combination of world-class research and an understanding of how technology may be applied in practice will enable you to extend both your technological capabilities and application areas. Flexible, self motivated and a strong team player with a broad scientific interest, you will need a high level of numeracy and the ability to work to tight deadlines. Your good first Degree encompassing a strong mathematical component should be supported by a PhD, MSc, academic or industrial experience. Specialists in one or more of the following areas would be of particular interest: Neuro-Fuzzy, Genetic Algorithms, Neural Networks, Data Mining, Evolutionary Systems, Adaptive Agents, Fitness Landscapes, Machine Learning, Pattern Recognition. As a world-leading organisation we can offer an attractive salary and benefits package and excellent career opportunities. To take up the challenge, please write with full CV quoting ref 22429/NS to Vanessa Gilroy, TMP Response Management, 32 Aybrook Street, London W1M 3JL. 
Or e-mail your details to: response at tmpw.co.uk Closing date for applications is 19th May 2000. For more information about Unilever Research and Unilever visit our Internet Web Site at http://research.unilever.com From rsun at cecs.missouri.edu Thu Apr 27 13:10:25 2000 From: rsun at cecs.missouri.edu (Ron Sun) Date: Thu, 27 Apr 2000 12:10:25 -0500 Subject: IJCNN'2000 Call for Participation Message-ID: <200004271710.MAA21742@pc113.cecs.missouri.edu> Call For Participation *** I J C N N -2 0 0 0 *** IEEE-INNS-ENNS INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS to be held in Grand Hotel di Como, Como, Italy -- July 24-27, 2000 This is the premier international neural networks conference, sponsored by the IEEE Neural Network Council, the International Neural Network Society, and the European Neural Network Society, and with the technical cooperation of the Japanese Neural Network Society, AEI (the Italian Association of Electrical and Electronic Engineers), SIREN (the Italian Association of Neural Networks), and AI*IA (the Italian Association for Artificial Intelligence). The list of accepted papers and the tentative conference program is now available on the web page. For complete information regarding the conference (including information about Como, Italy), visit the conference web site at: http://www.ims.unico.it/2000ijcnn.html The organizers may be contacted by email at ijcnn2000 at elet.polimi.it. From hinton at gatsby.ucl.ac.uk Fri Apr 28 11:52:33 2000 From: hinton at gatsby.ucl.ac.uk (Geoffrey Hinton) Date: Fri, 28 Apr 2000 16:52:33 +0100 Subject: technical reports available Message-ID: <200004281552.QAA08520@axon.gatsby.ucl.ac.uk> Two new technical reports are now available at http://www.gatsby.ucl.ac.uk/hinton/chronological.html _______________________________ Training Products of Experts by Maximizing Contrastive Divergence Geoffrey Hinton Technical Report GCNU TR 2000-004 ABSTRACT It is possible to combine multiple probabilistic models of the same data by multiplying their probability distributions together and then renormalizing. This is a very efficient way to model high-dimensional data which simultaneously satisfies many different low-dimensional constraints because each individual expert model can focus on giving high probability to data vectors that satisfy just one of the constraints. Data vectors that satisfy this one constraint but violate other constraints will be ruled out by their low probability under the other experts. Training a product of experts appears difficult because, in addition to maximizing the probability that each individual expert assigns to the observed data, it is necessary to make the experts be as different as possible. This ensures that the product of their distributions is small which allows the renormalization to magnify the probability of the data under the product of experts model. Fortunately, if the individual experts are tractable there is an efficient way to train a product of experts. __________________________________ Learning Distributed Representations of Concepts Using Linear Relational Embedding Alberto Paccanaro and Geoffrey Hinton Technical Report GCNU TR 2000-002 ABSTRACT In this paper we introduce Linear Relational Embedding as a means of learning a distributed representation of concepts from data consisting of binary relations between concepts. 
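As a concrete, entirely invented toy version of the scheme the abstract describes next, the following Python sketch represents concepts as vectors and one relation as a matrix, and applies the relation by matrix-vector multiplication; here the matrix is fitted by ordinary least squares rather than by the gradient-ascent learning used in the report, and all names and dimensions are illustrative only.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy vocabulary: each concept gets a random 4-dimensional vector.
    concepts = ["Alice", "Bob", "Carol", "Dave"]
    vec = {c: rng.normal(size=4) for c in concepts}

    # One binary relation given as (argument, result) pairs.
    pairs = [("Alice", "Carol"), ("Bob", "Carol"), ("Carol", "Dave")]

    # Represent the relation as a matrix R with R @ vec[a] ~ vec[b] for each
    # pair; here R is obtained by least squares (the report instead learns R
    # and the concept vectors jointly by maximizing a goodness function).
    A = np.stack([vec[a] for a, _ in pairs])
    B = np.stack([vec[b] for _, b in pairs])
    R = np.linalg.lstsq(A, B, rcond=None)[0].T    # solves A @ R.T ~ B

    # Applying the relation to a concept is a matrix-vector multiplication;
    # the nearest concept vector is taken as the answer.
    pred = R @ vec["Alice"]
    best = min(concepts, key=lambda c: np.linalg.norm(pred - vec[c]))
    print("relation(Alice) is closest to:", best)  # expected: Carol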
The key idea is to represent concepts as vectors, binary relations as matrices, and the operation of applying a relation to a concept as a matrix-vector multiplication that produces an approximation to the related concept. A representation for concepts and relations is learned by maximizing an appropriate discriminative goodness function using gradient ascent. On a task involving family relationships, learning is fast and leads to good generalization. From reggia at cs.umd.edu Fri Apr 28 13:13:32 2000 From: reggia at cs.umd.edu (James A. Reggia) Date: Fri, 28 Apr 2000 13:13:32 -0400 (EDT) Subject: Post-Doc Position: Computational Neuroscience of Language Message-ID: <200004281713.NAA25338@avion.cs.umd.edu> Post-Doctoral Fellowship in Computational Neuroscience of Language A post-doctoral fellowship position is available in the area of computational modeling of the neurobiological basis of language. The fellowship can begin summer/fall of 2000, and is jointly at the Neurology Dept. (Baltimore campus) and Computer Science Dept. (near Washington DC) of the University of Maryland. Research in this position could focus on any topic related to normal or impaired language, although we have special interest in areas such as cerebral specialization, reading disorders, origins of language, aphasia recovery following stroke, and functional imaging correlates of language. Applicants are expected to have a recent PhD in either a biological/cognitive/linguistic discipline or a computational discipline (e.g., computer science, applied mathematics, physics, engineering). US citizenship is required. For full consideration, applications should be received by May 26, 2000. To apply, send two copies (*hard copies only*) of a CV and research interest statement, how to contact you by mail/fax/email/phone, and the names and contact information of two references to: James A. Reggia, Dept. of Computer Science, A. V. Williams Bldg., University of Maryland, College Park, MD 20742 USA. Questions may be directed to reggia at cs.umd.edu
From terry at salk.edu Mon Apr 3 22:10:48 2000 From: terry at salk.edu (terry@salk.edu) Date: Mon, 3 Apr 2000 19:10:48 -0700 (PDT) Subject: NEURAL COMPUTATION 12:4 Message-ID: <200004040210.TAA26780@hebb.salk.edu> Neural Computation - Contents - Volume 12, Number 4 - April 1, 2000 ARTICLE The Multifractal Structure of Contrast Changes in Natural Images: From eytan at dpt-info.u-strasbg.fr Tue Apr 4 05:15:09 2000 From: eytan at dpt-info.u-strasbg.fr (Michel Eytan) Date: Tue, 4 Apr 2000 11:15:09 +0200 Subject: Paper available In-Reply-To: Message-ID: Thus hath held forth a member of Connectionists list at 31-03-2000 re Paper available: > The following IJCNN'00 paper is accepted and available on-line. > I will be happy about any feedback. [snip] Folks, I have already asked several times to *please* give SIGNIFICANT TITLES to the mails sent to the list. It so happens that I archive some of the mails to this list (and others). Just imagine what happens when I search for a mail and see about 200 of them all with Subject: Paper available... Thank you and sorry for the bother. -- Michel Eytan eytan at dpt-info.u-strasbg.fr I say what I mean and mean what I say From B344DSL at UTARLG.UTA.EDU Tue Apr 4 19:05:53 2000 From: B344DSL at UTARLG.UTA.EDU (B344DSL@UTARLG.UTA.EDU) Date: Tue, 04 Apr 2000 17:05:53 -0600 (CST) Subject: Second edition of neural networks textbook by Levine Message-ID: <01JNUG3U9DKI00506S@UTARLG.UTA.EDU> INTRODUCTION TO NEURAL AND COGNITIVE MODELING SECOND EDITION DANIEL S.
LEVINE LAWRENCE ERLBAUM ASSOCIATES COPYRIGHT 2000 (First edition published by LEA 1991) 491 pages Paperback: ISBN 0-8058-2006-X, $36.00 Cloth: ISBN 0-8058-2005-1, $99.95 From vera at cs.cas.cz Wed Apr 5 16:35:37 2000 From: vera at cs.cas.cz (Vera Kurkova) Date: Wed, 5 Apr 00 16:35:37 CET Subject: 1st call for papers ICANNGA 2001 Message-ID: <59738.vera@uivt1.uivt.cas.cz> **************************************************************** * * * > > > >>> CALL FOR PAPERS <<< < < < * * * **************************************************************** * * * ICANNGA 2001 * * * * 5th International Conference on * * Artificial Neural Networks and Genetic Algorithms * * * * including a special session on * * Computer-Intensive Methods in Control and Data Processing * * * * organized by * * the Institute of Computer Science, * * Academy of Sciences of the Czech Republic * * to be held at * * the Lichtenstein Palace, Prague, Czech Republic * * * * April 22-25, 2001 * * * * * * http://www.cs.cas.cz/icannga * * * **************************************************************** The focus of ICANNGA is on theoretical aspects and practical applications of computational paradigms inspired by natural processes, especially ARTIFICIAL NEURAL NETWORKS and EVOLUTIONARY ALGORITHMS. ICANNGA 2001 will include invited plenary talks, contributed papers, poster session, tutorials and a social program. CONFERENCE TOPICS The following list indicates some areas of interest, but is not exhaustive: * Neural Networks: Architectures, Algorithms, Approximation, Complexity, Biological Foundations, Computational Neuroscience * Evolutionary Computation: Genetic Algorithms, Genetic Programming, Classifier Systems, Artificial Life * Hybrid Systems: Fuzzy Logic, Soft Computing, Neuro-Fuzzy Controllers, Genetic Learning of Neural Networks * Applications: Pattern Recognition, Signal Processing, Control, Simulation, Robotics, Data Mining, Transport, Defense, Security, Environment, Finance and Business ================================================================= We invite contributed papers for ICANNGA 2001 on topics as above. Draft papers will be refereed, and papers accepted for oral presentation and selected papers for poster presentation will appear in the conference proceedings, to be published by Springer. 
Submission of draft versions of papers: September 20, 2000
Notification of acceptance: January 10, 2001
Delivery of revised papers: February 7, 2001
ICANNGA conference: April 22-25, 2001
================================================================= Vera Kurkova, Institute of Computer Science, Prague, Chair of the Organizing Committee and of the Program Committee International Advisory Committee Rudolf Albrecht, University of Innsbruck, Austria Andrej Dobnikar, University of Ljubljana, Slovenia David Pearson, University of Saint Etienne, France Nigel Steele, Coventry University, United Kingdom ================================================================= For more information, please visit the conference web site at http://www.cs.cas.cz/icannga ================================================================= From stefan.wermter at sunderland.ac.uk Wed Apr 5 10:26:31 2000 From: stefan.wermter at sunderland.ac.uk (Stefan.Wermter) Date: Wed, 05 Apr 2000 15:26:31 +0100 Subject: International emernet workshop on NN and neuroscience Message-ID: <38EB4D16.D2624687@sunderland.ac.uk> Emerging computational neural Network architectures based on neuroscience (EmerNet): International EPSRC Workshop on Current Computational Architectures Integrating Neural Networks and Neuroscience. Date: 8-9 August 2000 Location: Durham Castle, Durham, United Kingdom Workshop web page is http://www.his.sunderland.ac.uk/worksh3 Organising Committee ----------------------- Prof. Stefan Wermter Chair Hybrid Intelligent Systems Group University of Sunderland Prof. Jim Austin Advanced Computer Architecture Group Department of Computer Science University of York Prof. David Willshaw Institute for Adaptive and Neural Computation Division of Informatics University of Edinburgh Call for Papers and Participation -------------------------------- Description and Motivation --------------------------- Although there is a massive body of research and knowledge regarding how processing occurs in the brain, this has had little impact on the design and development of computational systems. Many challenges remain in the development of computational systems, such as robustness, learning capability, modularity, massive parallelism for speed, simple programming, more reliability, etc. This workshop aims to consider if the design of computational systems can learn from the integration of cognitive neuroscience, neurobiology and artificial neural networks. The main objective is the transfer of knowledge by bringing together researchers in the twin domains of artificial and real neural networks. The goal is to enable computer scientists to comprehend how the brain processes information to generate new techniques for computation and encourage neuroscientists to consider computational factors when performing their research. Areas of Interest for Workshop -------------------------------- The main areas of interest for the workshop bring together Neural Network Architectures and Neuroscience. Robustness: What are the characteristics that enable the human brain to carry on operating despite failure of its elements? How can the brain's slow but robust memory be utilised to replace the brittle but fast memory presently found in conventional computers? Modular construction: How can the brain provide ideas for bringing together the current small artificial neural networks to create larger modular systems that can solve more complex tasks like associative retrieval, vision and language understanding?
Learning in context: There is evidence from neuron, network and brain levels that the internal state of such a neurobiological system has an influence on processing and learning. Is it possible to build computational models of these processes and states, and design incremental learning algorithms and dynamic architectures? Synchronisation: How does the brain synchronise its processing when using millions of processors? How can large asynchronous computerised systems be produced that do not rely on a central clock? Timing: Undertaking actions before a given deadline is vital. What structural and processing characteristics enable the brain to deal with real time situations? How can these be incorporated into a computerised approach? Processing speed: despite having relatively slow computing elements, how is real-time performance achieved? Preliminary Invited Speakers We plan to have around 30 participants, including the invited speakers. -------------------------------------- Dr Jim Fleming - EPSRC Prof. Michael Fourman - University of Edinburgh Prof. Angela Friederici - Max Planck Institute of Cognitive NeuroScience Prof. Stephen Hanson - Rutgers University Prof. Stevan Harnad - University of Southampton Prof. Vasant Honavar - Iowa State University Dr Hermann Moisl - University of Newcastle upon Tyne Prof. Heiko Neumann - Universität Ulm Prof. Günther Palm - Universität Ulm Prof. Kim Plunkett (tbc) - Oxford University Prof. James A. Reggia - University of Maryland Prof. John Taylor - King's College London Workshop Details ------------------- In order to have a workshop of the highest quality, it incorporates a combination of paper presentations on one of the six areas of interest by the participants and more open discussion oriented activities. The discussion element of the EmerNet Workshop will be related to the questions above and it is highly desirable that those wishing to participate focus on one or more of these issues in an extended abstract or position paper of up to 4 pages. Papers should be sent in either ps, pdf or doc format via email for consideration to Professor Stefan Wermter and Mark Elshaw by the 1st of June 2000. THE KEY QUESTION IS: What can we learn from cognitive neuroscience and the brain for building new computational neural architectures? It is intended that for all participants registration, meals and accommodation at Durham Castle for the Workshop will be provided free of charge. Further, specially invited participants will have reasonable travel expenses reimbursed, and additional participants their rail travel costs within the UK. We also plan to have six places for PhD students or recent post-doctorates and encourage applications. Extended versions of papers can be published as book chapters in a book with Springer. Location - Durham Castle ------------------------- The EmerNet Workshop is to be held at Durham Castle, Durham (chosen as it lies between Sunderland, York and Edinburgh) in the North East of England. There are few places in the world that can match the historic City of Durham, with its dramatic setting on a rocky horseshoe bend in the River Wear and beautiful local countryside. Furthermore, it offers easy accessibility by rail from anywhere in Great Britain and is close to the international airport at Newcastle. The workshop provides the chance to stay at a real English castle that was constructed under the orders of King William the Conqueror in 1072, shortly after the Norman Conquest.
It has many rooms of interest including a Norman Chapel that has some of the most fascinating Norman sculptures in existence and the Great Hall that acts as the dining area. Holding the EmerNet Workshop at this excellent location provides the chance for interesting and productive discussion in a peaceful and historic atmosphere. It is possible to gain a flavour of Durham Castle and Cathedral on the on-line tour at http://www.dur.ac.uk/~dla0www/c_tour/tour.html Contact Details --------------- Mark Elshaw (Workshop Organiser) Hybrid Intelligent Systems Group Informatics Centre SCET University of Sunderland St Peter's Way Sunderland SR6 0DD United Kingdom Phone: +44 191 515 3249 Fax: +44 191 515 2781 E-mail: Mark.Elshaw at sunderland.ac.uk Prof. Stefan Wermter (Chair) Informatics Centre, SCET University of Sunderland St Peter's Way Sunderland SR6 0DD United Kingdom Phone: +44 191 515 3279 Fax: +44 191 515 2781 E-mail: Stefan.Wermter at sunderland.ac.uk http://www.his.sunderland.ac.uk/~cs0stw/ http://www.his.sunderland.ac.uk/ From rfrench at ulg.ac.be Fri Apr 7 14:44:35 2000 From: rfrench at ulg.ac.be (Robert French) Date: Fri, 07 Apr 2000 20:44:35 +0200 Subject: Sixth Neural Computation and Psychology Workshop, Liege, Belgium, Sept 16-18 Message-ID: <4.1.20000407202804.00a77250@pop3.mailst.ulg.ac.be> SIXTH NEURAL COMPUTATION AND PSYCHOLOGY WORKSHOP (NCPW6) Connectionist models of evolution, learning and development University of Liege, Liege, Belgium Saturday, September 16 to Monday, September 18, 2000 URL: http://www.fapse.ulg.ac.be/ncpw6/ AIMS AND OBJECTIVES This workshop, the sixth in a series, has the explicit goal of bringing together connectionist modelers, and especially connectionist modelers from Europe, who are primarily focused on some aspect of psychology or neuropsychology. Each year there is a relatively wide-ranging theme for the Workshop. For example, "Neurodynamics and Psychology" was the theme of the first workshop, "Perception", the third, "Connectionist representations" the fourth, and so on. This year the theme will be "Development, learning and evolution." This is a broad topic and intentionally so. Although we aren't interested in, say, connectionist applications to submarine warfare, we will consider all papers that have something to do with the announced topic, even if rather tangentially. The organization of the final program will depend on the submissions received. As in previous years, the Workshop will be reasonably small. There are, for example, no parallel sessions and no poster sessions. And plenty of time will be left for discussion among participants. The atmosphere is designed to be congenial but rigorous. Even though participation in the Workshop is by no means limited to Europeans, one of its explicit goals is to bring together European connectionist modelers interested in psychology or neuropsychology. This is the first year that NCPW is being held on the Continent, a move that was explicitly designed to attract not only the usual contingent of British connectionists, but also our colleagues from other European countries as well. CALL FOR ABSTRACTS The Workshop will last from Saturday, September 16 through the morning of Monday, September 18, 2000. There will be approximately 25-30 paper presentations. Abstracts (approximately 200 words) are due by June 7. Notification of acceptance for a paper presentation will be done by June 22. The finished paper must be ready by the time of the Workshop.
See the NCPW6 Web page for all details: http://www.fapse.ulg.ac.be/ncpw6/ REGISTRATION, ETC. The registration fee will be 100 euros. (Reductions will be made upon request for a limited number of students, especially students from abroad who also have significant travel expenses.) Included in this fee will be the welcome reception (evening of September 15), coffee breaks, lunches, and a copy of Workshop Proceedings after the Workshop (we have a tentative publishing agreement with Springer-Verlag, as in the past). Special conference rates have been arranged at a number of hotels, as indicated on the Web page for the Workshop. ORGANIZING COMMITTEE Bob French (rfrench at ulg.ac.be) University of Liege, Belgium, organizer Axel Cleeremans (axcleer at ulb.ac.be) Universite Libre de Bruxelles Nick Chater (Nick.Chater at warwick.ac.uk) University of Warwick Denis Mareschal (d.mareschal at bbk.ac.uk) Birkbeck College, London Jacques Sougné (j.sougne at ulg.ac.be) University of Liège, Belgium CONTACT DETAILS For any problems or questions, please send e-mail to mailto:cogsci at ulg.ac.be From bsrc at neuron.kaist.ac.kr Thu Apr 6 21:53:07 2000 From: bsrc at neuron.kaist.ac.kr (=?EUC-KR?B?vcXHysij?=) Date: Fri, 07 Apr 2000 10:53:07 +0900 Subject: CALL FOR PAPERS - ICONIP 2000 Message-ID: <38ED3F81.166079D2@neuron.kaist.ac.kr> <<< ICONIP 2000 >>> - 7th International Conference on Neural Information Processing Taejon, Korea November 14-18, 2000 Paper Submission Due : May 15, 2000 ICONIP-2000, the annual conference sponsored by the Asia-Pacific Neural Network Assembly (APNNA), will be held in Taejon, Korea, from November 14 to 18, 2000. The theme of the conference, Neural Information Processing, is broad enough to promote wide interactions among researchers in many academic disciplines. The conference will consist of a one-day tutorial, three-and-a-half days of oral and poster presentations, and a half-day historic tour. In addition to mathematical and engineering approaches, we also include cognitive science in the main stream. The conference topics include but are not limited to: - NEURAL NETWORK MODELS Learning algorithms Neural network architectures Neurodynamics & spiking neuron Statistical neural network models - COGNITIVE SCIENCE Data representation Learning and memory Neurobiological systems Perception, emotion, and cognition Selective attention Vision and auditory models - HYBRID NEURAL SYSTEMS Evolutionary neural systems Fuzzy neural systems Soft computing Symbolic-neural hybrid systems - NEURAL HARDWARE IMPLEMENTATION Analog, digital, and hybrid neuro-chips Artificial retina and cochlear chips DSP and software implementation - NEURAL NETWORK APPLICATIONS Computer vision Data mining Expert systems Finance and electronic commerce Human-computer interaction Intelligent control Natural language processing Pattern recognition Robotics Sensorimotor systems Signal processing Speech recognition Time series prediction Papers should be submitted by 15 May 2000 and registered on the conference web page. They will be reviewed by senior researchers in the field and the authors will be informed about the decision of the review process by 15 July 2000. The accepted papers must be submitted in a camera-ready format by 15 Aug 2000. All accepted papers will be presented at the conference either in an oral or in a poster session and included in the conference proceedings. At least one author for each accepted paper should make advance registration. The conference proceedings will be published as a hardcopy and a CD-ROM.
To facilitate a fast review process and publication of the proceedings, electronic submission of compressed PostScript or PDF files is required. If absolutely unavoidable, send four copies of the paper by mail to the Conference Secretariat. The paper must be written in English. The paper should not exceed six pages when printed on an A4-format white paper with 2.5cm margins on all four sides, in a single-spaced two-column single-page format, in Times or similar font of 10 points. Centered at the top of the first page should be the complete title, author(s), mailing and e-mailing addresses, followed by an abstract and the text. Detailed paper format may be found at http://braintech.kaist.ac.kr/ICONIP2000/authors.html. Proposals are solicited for tutorials. The proposals for tutorials should outline the subject area with a brief biography of the organizer, and should be submitted to the Conference Secretariat by May 1, 2000. Proposals for special sessions are solicited. Each proposal for a special session should include the following information: title of the special session, special session organizer and affiliation, subject areas to be covered by the special session, and potential authors. The proposals should be submitted by May 1, 2000 via e-mail to the special sessions committee chair at btzhang at scai.snu.ac.kr. More details on proposals for special sessions can be found at http://braintech.kaist.ac.kr/ICONIP2000/information.html. - May 1, 2000 : Proposals for Tutorials and Special Invited Sessions - May 15, 2000 : Paper Submission - July 15, 2000 : Acceptance Notification - August 15, 2000 : Camera-ready Manuscripts and Advance Registration * Regular registration - US$475 (advance registration), US$550 (on-site) * Student registration - US$200 (advance registration), US$250 (on-site) * Both Regular and Student registration include free lunches from November 15th to 18th and a half-day historic tour in the afternoon of November 16th. Free coffee, welcoming reception, and closing reception will also be provided to all registrants. * Regular registration includes both paper and CD-ROM proceedings, while Student registration includes only CD-ROM proceedings. * Additional paper proceedings : US$70 * Additional CD-ROM proceedings: US$30 * Banquet on November 17, 2000 : US$50 Full time students who will present a paper at ICONIP2000 are eligible to apply for the Student Conference Travelling Fund supported by "Amari/Kasabov and co-authors of the Brain-like Computing Book". More detailed information is available at http://braintech.kaist.ac.kr/ICONIP2000/travellingfund.htm Asia-Pacific Neural Network Assembly (APNNA) Brain Science Research Center, KAIST International Neural Network Society (INNS) IEEE Neural Network Council (IEEE NNC) European Neural Network Society (ENNS) Asian Office of Aerospace Research & Development, US Air Force Office of Scientific Research Brain Science Research Center 3rd Floor, LG Semicon Hall Korea Advanced Institute of Science & Technology 373-1 Kusong-dong, Yusong-ku Taejon, 305-701, Korea Tel: +82-42-869-5431 Fax: +82-42-869-8492 Email : ICONIP2000 at braintech.kaist.ac.kr Web: http://braintech.kaist.ac.kr/ICONIP2000 Conference Chair : Soo-Young Lee KAIST Conference Co-Chairs : Kunihiko Fukushima Univ. of Electro-Comm. Jung-Mo Lee Sungkyunkwan Univ. Program Committee Chair : Seong-Whan Lee Korea University Co-Chairs : Erkki Oja Helsinki Univ. of Tech.
Minoru Tsukada Tamagawa University Secretary : Sung-Bae Cho Yonsei University Members : Kazuyuki Aihara Univ. of Tokyo Hye-Ran Byun Yonsei Univ. Laiwan Chan The Chinese Univ. of Hong Kong Sungzoon Cho Seoul National University Chan-Sup Chung Yonsei Univ. Myung Jin Chung KAIST Andrzej Cichocki RIKEN BSI Joydeep Ghosh Univ. of Texas, Austin Seung Kee Han Chungbuk National Univ. Zhen-Ya He Southeast Univ. Yuzo Hirai University of Tsukuba Jenq-neng Hwang Univ. of Washington, Seattle Matsumi Ishikawa Kyushu Institute of Tech. Marwan Jabri University of Sydney Anil Jain Michigan State University Janusz Kacprzyk Polish Academy of Sciences Nikola Kasabov University of Otago Hideki Kawahara Wakayama University Doh-Suk Kim Samsung Adv. Inst. Tech. Irwin King Chinese Univ. Hong Kong Chong-Ho Lee Inha University Choongkil Lee Seoul National University Daniel Lee Lucent Technologies Te-Won Lee Salk Institute Yillbyung Lee Yonsei University Jianchang Mao IBM Almaden Research Ctr. Gen Matsumoto RIKEN BSI Jong-Seop Moon Korea University Mike Mozer Univ. of Colorado at Boulder Takashi Omori Tokyo Univ. of Agriculture & Tech. Dong Chul Park Myungji University Seung Kwon Park Hanyang University Elie Sanchez NEURINFO Yasuji Sawada Tohoku University Sebastian Seung MIT Jude Shavlik Univ. of Wisc. at Madison Jang Kyoo Shin Kyungpook National Univ. Jonghan Shin RIKEN BSI Satoshi Shioiri Chiba University Keiji Tanaka RIKEN BSI Keiji Uchikawa Tokyo Institute of Technology Deliang Wang Ohio State University Patrick Wong Univ. of New South Wales Hyun-Seung Yang KAIST Ramin Yasdi GMD Myung-Hyun Yoo Korea University Shuji Yoshizawa University of Tokyo Byoung-Tak Zhang Seoul National University General Affairs Committee Chair : Rhee-Man Kil KAIST Members : Jin Young Choi Seoul National University Sung Ho Kim KAIST Cheol-Hoon Park KAIST Local Arrangement Committee Chair : Chang-Dong Yoo KAIST Members : Yong-Soo Kim Taejon Univ. Kwee-Bo Sim Chungang Univ. Finance Committee Chair : Hong-Tae Jeon Chungang Univ. Members : Mun-Sung Han ETRI Hong Jeong Postech Seong-Gon Kong Soongsil Univ. Yung-Bin Kwon Chungang Univ. Publication Committee Chair : Seungwhan Kim Postech Members : Min-Shik Kim Yonsei University Seungjin Choi Chungbuk National Univ. Publicity Committee: Chair : Chong-Ho Lee Inha University Members : Hoon Kang Chungang Univ. Hyung-Cheul Shin Hallym University International Advisory Committee: Chair : Sung-Yang Bang Postech Co-Chairs : Shunichi Amari RIKEN BSI Harold Szu NSWC Members : Jim Bezdek Univ. of West Florida David Casasent Carnegie Mellon University Chan-Sup Chung Yonsei Univ. Rolf Eckmiller University of Bonn Tom Gedeon Univ. of New South Wales Zhen-Ya He Southeast University Nikola Kasabov Univ. of Otago Jin Hyung Kim KAIST Myung-Won Kim Soongsil University Cliff Lau Office of Naval Research Sukhan Lee Samsung Advanced Institute Hanspeter Mallot Max-Planck Institute Se-Yung Oh Postech Nikhil R. Pal Indian Statistical Institute John Taylor King's College of London Shiro Usui Toyohashi University of Tech. Lipo Wang Nanyang Tech. Univ. Patrick Wong UNSW Lei Xu Chinese Univ. Hong Kong Youshou Wu Tsinghua Univ. Takeshi Yamakawa Kyushu Inst. of Technology Jacek Zurada Univ. of Louisville Exhibition Committee Chair : Dong-Jo Park KAIST Tutorial Committee Chair : Sungzoon Cho Seoul National University Members : Seungjin Choi Chungbuk National Univ. Sungbae Cho Yonsei University Hyukjoon Lee Kwangwoon University Minho Lee Kyungpook National Univ.
Special Session Committee Chair : Byoung-Tak Zhang Seoul National University Members : Daniel D. Lee Bell Lab, Lucent Technologies Nando de Freitas Univ. of California, Berkeley Te-Won Lee Univ. of California, San Diego From Ramin.Yasdi at gmd.de Fri Apr 7 09:44:56 2000 From: Ramin.Yasdi at gmd.de (Ramin Yasdi) Date: Fri, 07 Apr 2000 15:44:56 +0200 Subject: ICONIP-2000 Special Session Message-ID: <38EDE658.376CD298@gmd.de> CALL FOR PAPERS ICONIP-2000 Special Session ON NEURAL NETWORKS FOR INTELLIGENT USER INTERFACES User interfaces that adapt themselves to individual needs, preferences, and knowledge of their users are becoming more and more important. Personalized interfaces are of special importance to deal with information overload and navigation by personalizing and improving the quality of information retrieval and filtering, information restructuring and annotation, as well as information visualization. The development of these new intelligent user interfaces requires techniques that enable computer programs to learn how to serve the user most efficiently. Neural networks are not yet widely used within this challenging domain. But the domain seems to be an interesting new application area for neural networks due to the availability of large sets of data and the required automatic adaptation to new situations and users. Therefore, interest in using the powerful learning methods known from neural network models for intelligent user interfaces is growing among researchers. The scope of the session includes but is not limited to the following topics: * user models * adaptive hypermedia * classifying and recognizing users, emotions and situations * information retrieval * adapting complex user interfaces * intelligent student systems * representation of application domains We solicit reports on actual neural network applications, and discussion contributions on their usefulness. Since most successful applications in this area use symbolic AI methods, it is under debate if and how neural networks can contribute to this area. SESSION FORMAT: The session will give participants the possibility of short presentations (talks or demos, about 20 minutes) on their vision or work in the area. Most of the session, however, will have the format of an open discussion forum. At the end of the session, a discussion will take place to deal with questions on how to combine research efforts and how to link the community. SUBMISSION INSTRUCTIONS: Please send your paper to: ramin.yasdi at gmd.de by the submission deadline below. See the guidelines for authors for more details. http://braintech.kaist.ac.kr/ICONIP2000 Letter of interest in submitting a paper to the special session: May 1, 2000. Deadline for paper submission: June 15, 2000. Notification of acceptance: July 15, 2000. Camera-ready papers due: August 15, 2000. SESSION ORGANIZERS: Ramin Yasdi German National Research Centre for Information Technology (GMD) Schloss Birlinghoven, 53754 Sankt Augustin, Germany Email: Ramin.Yasdi at gmd.de From planning at icsc.ab.ca Sat Apr 8 13:48:58 2000 From: planning at icsc.ab.ca (Jeanny S. Ryffel) Date: Sat, 8 Apr 2000 11:48:58 -0600 Subject: cfp for symbol processing session for ISA'2000 Message-ID: <000501bfa17a$1e2b4ee0$984722cf@compusmart.ab.ca> SPECIAL SESSION ON SYMBOLS, SYMBOL PROCESSING and NEURAL NETWORKS http://www.icsc.ab.ca/150-prog.html#Scientific Organizer: Bernadette M.
Garner Bernadette.Garner at infotech.monash.edu.au Topics to be covered in the special session include: symbol processing and the nature of symbols. Not specifically language processing but - how the biological brain handles symbols - how symbols can be stored - how symbols can be manipulated - the definition of symbols - how artificial neural networks can be trained using symbols However, any topic relevant to symbols and symbol processing will be considered for discussion. Interested researchers should submit manuscripts of up to 5,000 words. Submission by electronic mail is strongly recommended. Or fax 2 copies to B. M. Garner CSSE Monash University Clayton, 3168, Australia. Fax: +61 3 9905 5146 PROCEEDINGS AND PUBLICATIONS All accepted and invited papers will be included in the congress proceedings, published in print and on CD-ROM by ICSC Academic Press, Canada/Switzerland. A selected number of papers will be expanded and revised for possible inclusion in special issues of some prestigious journals. IMPORTANT DATES May 15, 2000: Submission deadline June 15, 2000: Notification of acceptance July 30, 2000: Delivery of full papers December 12-15, 2000: ISA'2000 congress This session is part of the International Congress on INTELLIGENT SYSTEMS AND APPLICATIONS (ISA'2000) University of Wollongong (near Sydney), Australia December 12-15, 2000 http://www.icsc.ab.ca/isa2000.htm SPONSORS University of Wollongong, Industrial Automation Research Centre Nortel Networks IEE The Institution of Electrical Engineers IEAust The Institution of Engineers, Australia CRC IMST Cooperative Research Centre for Intelligent Manufacturing Systems and Technologies Ltd. ICSC International Computer Science Conventions From b344dsl at utarlg.uta.edu Sat Apr 8 14:35:25 2000 From: b344dsl at utarlg.uta.edu (Dan Levine) Date: Sat, 8 Apr 2000 13:35:25 -0500 Subject: Levine's textbook, 2nd edition Message-ID: <003c01bfa189$34ee7ee0$bd1a6b81@uta.edu> In my announcement about the second edition of my textbook with Erlbaum coming out, I forgot to include contact information. Any questions or comments can be e-mailed to me at levine at uta.edu. Also there is some additional information about the book (though it needs to be updated) at my web site, www.uta.edu/psychology/faculty/levine. Dan Levine From thilo.reski at gmx.de Mon Apr 10 05:33:12 2000 From: thilo.reski at gmx.de (thilo.reski@gmx.de) Date: Mon, 10 Apr 2000 11:33:12 +0200 (MEST) Subject: PhD Thesis: Mapping and Parallel Simulation of ANN Message-ID: <31366.955359192@www6.gmx.net> Dear Connectionists, My PhD thesis on mapping and parallel simulation of neural networks is now available. Title: Mapping and Parallel, Distributed Simulation of Neural Networks on Message Passing Multiprocessors Abstract: This thesis introduces a complete approach to the parallelization and parallel simulation of artificial neural networks (ANN). The idea is to hide the parallelization effort from the ANN developer and the ANN user. Main issues are i) Analysis of the ANN in terms of parallel execution, ii) Mapping the ANN to an abstract (scalable) message passing computer system, iii) parallel simulation on such an architecture. Results indicate that transparent parallelization of neural networks is useful in order to efficiently develop and apply non-trivial neural networks. If you are interested, send an empty e-Mail to "thilo.reski at gmx.de" with subject "PhD Thesis" Best regards, Thilo Reski -- Dr.
Thilo Reski Am Wolfsberg 9a 64569 Nauheim Germany Fax/Tel: +49 / 6152 / 637977 Mobil : +49 / 178 / 637977 8 e-mail: kontakt at thilo-reski.de http://www.thilo-reski.de Sent through GMX FreeMail - http://www.gmx.net From murphyk at cs.berkeley.edu Mon Apr 10 16:09:35 2000 From: murphyk at cs.berkeley.edu (Kevin Murphy) Date: Mon, 10 Apr 2000 13:09:35 -0700 Subject: Bayes Net Toolbox 2.0 for Matlab Message-ID: <38F234FF.17780AE1@cs.berkeley.edu> I am pleased to announce a major new release of the Bayes Net Toolbox, a software package for Matlab 5 that supports inference and learning in directed graphical models. Specifically, it supports exact and approximate inference, discrete and continuous variables, static and dynamic networks, and parameter and structure learning. Hence it can handle a large number of popular statistical models, such as the following: PCA/factor analysis, logistic regression, hierarchical mixtures of experts, QMR, DBNs, factorial HMMs, switching Kalman filters, etc. For more details, and to download the software, please go to http://www.cs.berkeley.edu/~murphyk/Bayes/bnt.html The new version (2.0) has been completely rewritten, making it much easier to read, use and extend. It is also somewhat faster. The main change is that I now make extensive use of objects. (I used to use structs, and a dispatch mechanism based on the type-tag system in Abelson and Sussman.) In addition, each inference algorithm (junction tree, sampling, loopy belief propagation, etc.) is now an object. This makes the code and documentation much more modular. It also makes it easier to add special-case algorithms, and to combine algorithms in novel ways (e.g., combining sampling and exact inference). I have gone to great lengths to make the source code readable, so it should prove an invaluable teaching tool. In addition, I am hoping that people will contribute algorithms to the toolbox, in the spirit of the open source movement. Kevin Murphy From moatl at cs.tu-berlin.de Mon Apr 10 02:58:56 2000 From: moatl at cs.tu-berlin.de (Martin Stetter) Date: Mon, 10 Apr 2000 08:58:56 +0200 Subject: Final Call: EU Advanced Course in Computational Neuroscience Message-ID: <38F17BB0.7A0CE64A@cs.tu-berlin.de> Second Call for the EU ADVANCED COURSE IN COMPUTATIONAL NEUROSCIENCE (AN IBRO NEUROSCIENCE SCHOOL) AUGUST 21 - SEPTEMBER 15, 2000 INTERNATIONAL CENTRE FOR THEORETICAL PHYSICS, TRIESTE, ITALY DIRECTORS: Erik De Schutter (University of Antwerp, Belgium) Klaus Obermayer (Technical University Berlin, Germany) Alessandro Treves (SISSA, Trieste, Italy) Eilon Vaadia (Hebrew University, Jerusalem, Israel) The EU Advanced Course in Computational Neuroscience introduces students to the panoply of problems and methods of computational neuroscience, simultaneously addressing several levels of neural organisation, from subcellular processes to operations of the entire brain. The course consists of two complementary parts. A distinguished international faculty gives morning lectures on topics in experimental and computational neuroscience. The rest of the day is devoted to practicals, including learning how to use simulation software and how to implement a model of the system the student wishes to study on individual unix workstations. The first week of the course introduces students to essential neurobiological concepts and to the most important techniques in modeling single cells, networks and neural systems.
Students learn how to apply software packages like GENESIS, MATLAB, NEURON, XPP, etc. to the solution of their problems. During the following three weeks the lectures will cover specific brain functions. Each week, topics ranging from modeling single cells and subcellular processes through the simulation of simple circuits, large neuronal networks and system level models of the brain will be covered. The course ends with a presentation of the students' projects. The EU Advanced Course in Computational Neuroscience is designed for advanced graduate students and postdoctoral fellows in a variety of disciplines, including neuroscience, physics, electrical engineering, computer science and psychology. Students are expected to have a basic background in neurobiology as well as some computer experience. Students of any nationality can apply. A total of 32 students will be accepted. About 20 students will be from the European Union and affiliated countries (Iceland, Israel, Liechtenstein and Norway plus all countries which are negotiating future membership with the EU). These students are supported by the European Commission and we specifically encourage applications from researchers who work in less-favoured regions of the EU, from women and from researchers from industry. IBRO and ICTP provide support for participation from students of non-Western countries, in particular countries from the former Soviet Union, Africa and Asia, while The Brain Science Foundation supports Japanese students. Students receiving support from the mentioned sources will receive travel grants and free full board at the Adriatico Guest House. More information and application forms can be obtained: - http://www.bbf.uia.ac.be/EU_course.shtml Please apply electronically using a web browser if possible. - email: eucourse at bbf.uia.ac.be - by mail: Prof. E. De Schutter Born-Bunge Foundation University of Antwerp - UIA, Universiteitsplein 1 B2610 Antwerp Belgium FAX: +32-3-8202669 APPLICATION DEADLINE: April 15, 2000. Applicants will be notified of the results of the selection procedures by May 31, 2000. COURSE FACULTY: Moshe Abeles (Hebrew University of Jerusalem, Israel), Carol Barnes (University of Arizona, USA), Avrama Blackwell (George Mason University, Washington, USA), Valentino Braitenberg (MPI Tuebingen, Germany), Jean Bullier (Universite Paul Sabatier, Toulouse, France), Ron Calabrese (Emory University, Atlanta, USA), Carol Colby (University Pittsburgh, USA), Virginia de Sa (University California San Francisco, USA), Alain Destexhe (Laval University, Canada), Opher Donchin (Hebrew University of Jerusalem, Israel), Karl J. Friston (Institute of Neurology, London, England), Bruce Graham (University of Edinburgh, Scotland), Julian J.B. Jack (Oxford University, England), Mitsuo Kawato (ATR HIP Labs, Kyoto, Japan), Jennifer Lund (University College London, England), Miguel Nicolelis (Duke University, Durham, USA), Klaus Obermayer (Technical University Berlin, Germany), Stefano Panzeri (University of Newcastle, England), Alex Pouget (University of Rochester, USA), John M. Rinzel (New York University, USA), Nicolas Schweighofer (ATR ERATO, Kyoto, Japan), Idan Segev (Hebrew University of Jerusalem, Israel), Terry Sejnowski (Salk Institute, USA), Haim Sompolinsky (Hebrew University of Jerusalem, Israel), Martin Stetter (Siemens AG Muenchen, Germany), Shigeru Tanaka (RIKEN, Japan), Alex M.
Thomson (Royal Free Hospital, London, England), Naftali Tishby (Hebrew University of Jerusalem, Israel), Alessandro Treves (SISSA, Trieste, Italy), Eilon Vaadia (Hebrew University of Jerusalem, Israel), Charlie Wilson (University of Texas, San Antonio, USA), More to be announced... The 2000 EU Advanced Course in Computational Neuroscience is supported by the European Commission (5th Framework program), by the International Centre for Theoretical Physics (Trieste), by the Boehringer Ingelheim Foundation, by the International Brain Research Organization and by The Brain Science Foundation (Tokyo). -- ---------------------------------------------------------------------- Dr. Martin Stetter phone: ++49-30-314-73117 FR2-1, Informatik fax: ++49-30-314-73121 Technische Universitaet Berlin web: http://www.ni.cs.tu-berlin.de Franklinstrasse 28/29 D-10587 Berlin, Germany ---------------------------------------------------------------------- From ASJagath at ntu.edu.sg Tue Apr 11 05:01:18 2000 From: ASJagath at ntu.edu.sg (Jagath C Rajapakse (Asst Prof)) Date: Tue, 11 Apr 2000 17:01:18 +0800 Subject: ICONIP2000: Special Session on Brain Imaging Message-ID: CALL FOR PAPERS ICONIP 2000: SPECIAL SESSION ON BRAIN IMAGING Today, much of what we know about neural information processing and diseases of the human brain has been derived from images of the human brain, produced by various imaging modalities. Although brain images are direct measurements of the brain's structure and often its function, the full potential of these images remains largely unexploited today. This session will focus on recent advances in brain imaging research to explore natural neural information processing mechanisms and to investigate characteristics of brain diseases from imaging data. Research papers are solicited on both structural and functional brain imaging, including but not restricted to the following areas. 1. Structural brain imaging X-ray CT; MRI; Brain shelling; Detection of sulcal and gyral patterns; Cortical segmentation; Cortical parcellation; Segmentation of subcortical structures, hippocampus, cerebellum; 3-D visualization, rendering; Morphometrical correlates of neurological and psychiatric disease. 2. Functional brain imaging EEG; MEG; fMRI; PET; Optical; Near infrared; Source localization; Statistical parameter maps; Time-series analysis; Multi-modality imaging, registration. SUBMISSION INSTRUCTIONS: Please send your letter of interest and paper by email to asjagath at ntu.edu.sg by the submission deadline below and see the guidelines for authors for more details. http://braintech.kaist.ac.kr/ICONIP2000 Letter of interest in submitting a paper: May 1, 2000. Deadline for paper submission: June 15, 2000. Notification of acceptance: July 15, 2000. Camera-ready papers due: August 15, 2000. SESSION CHAIRS Dr. Jagath C. Rajapakse, School of Applied Science, Nanyang Technological University, N4, Nanyang Avenue, Singapore. Dr. Frithjof Kruggel, Max Planck Institute of Cognitive Neuroscience, Stephanstrasse 1, 04103 Leipzig,
Germany Email: asjagath at ntu.edu.sg Email: kruggel at cns.mpg.de From dario.floreano at epfl.ch Wed Apr 12 06:47:39 2000 From: dario.floreano at epfl.ch (Dario Floreano) Date: Wed, 12 Apr 2000 12:47:39 +0200 Subject: PhD studentships available Message-ID: 3 PhD Studentships (Research Assistant) @ Swiss Federal Institute of Technology in Lausanne (EPFL) Three postgraduate research positions leading to a PhD in Engineering at the Institute of Robotic Systems of the Swiss Federal Institute of Technology in Lausanne (EPFL) are offered for a project in BIO-INSPIRED AND ADAPTIVE ROBOTICS by Dario Floreano. The research topics are: 1- Methods in Evolutionary Robotics 2- Evolutionary Embedded Vision 3- Interactive Adaptation for Personal and Service Robotics For project descriptions and application procedures, please see: http://diwww.epfl.ch/lami/team/floreano/jobs.html --------------------------------------- Prof. Dario Floreano Autonomous Systems Laboratory (ASL) Institute of Robotic Systems (ISR-DMT) Swiss Federal Institute of Technology (EPFL) CH-1015 Lausanne, Switzerland Dario.Floreano at epfl.ch Phone: ++41 21 693 5230 Fax: ++41 21 693 5263 http://diwww.epfl.ch/lami/team/floreano From nick.jakobi at animaths.com Thu Apr 13 10:46:01 2000 From: nick.jakobi at animaths.com (Nick Jakobi) Date: Thu, 13 Apr 2000 15:46:01 +0100 Subject: Job Vacancies at MASA (U.K. office) Message-ID: <01BFA560.7F9025E0.nick.jakobi@animaths.com> Founded in Paris in 1997, MASA specializes in the production of cutting-edge adaptive technologies - software and hardware that seeks to emulate and exploit many of the properties of living things. The company is heavily research orientated and now employs over 50 people including many PhDs from the areas of Artificial Life, Artificial Intelligence, Mathematics and Scientific computing. This makes it one of the largest laboratories (public or private) of its kind in the world. In early 1999, MASA opened its British division on the campus of Sussex University (Brighton, UK) to take advantage of close links with academia. This division has a wide remit and current projects include financial prediction, constraint satisfaction, the development of controllers for unmanned vehicles, path-finding algorithms and the creation of original and powerful tools for industrial combinatorial optimization problems. As part of its continued expansion, MASA is currently looking for exceptional candidates to fill the following posts at its UK offices: 3 Research Scientists. Prospective candidates must have recently obtained (or be about to obtain) a PhD or similar high-level research experience in a relevant discipline. Ideally, they will have skills in the following areas: computing, mathematical modeling and visualization, evolutionary and adaptive systems, optimization. They will be expected to perform creative research that produces innovative solutions to hard industrial problems within commercial constraints. After an initial training period they will also be expected to manage 1-2 developers on a day-to-day basis who will help them implement their ideas and work with them as part of a team. 4 Developers. Prospective candidates will hold a MSc or equivalent in a relevant discipline. Ideally, they will have commercial software development experience, but the ability to learn new techniques quickly and to understand and implement complex algorithms is more important. MASA offers a very attractive salary and benefits package. 
The UK offices are situated in the Brighton Innovation Centre on the University of Sussex Campus, surrounded by the beautiful South Downs and 3 miles from Brighton town centre and the sea. Please email a C.V. with the names of at least two referees and a covering letter to nick.jakobi at animaths.com. From gasser at cs.indiana.edu Thu Apr 13 23:14:57 2000 From: gasser at cs.indiana.edu (Michael Gasser) Date: Thu, 13 Apr 2000 22:14:57 -0500 (EST) Subject: Postdocs in cognitive development In-Reply-To: Message-ID: The Developmental Training Grant at Indiana University has several post-doctoral traineeships open for application. We are interested in individuals with backgrounds in cognitive science, cognitive development, language, linguistics, and connectionist modelling who would benefit from interdisciplinary training. Information about the Training Grant may be found at http://www.indiana.edu/~psych/postdoc/multidis.html; by writing to Multidisciplinary Training in Developmental Process, c/o Melissa Foster, Department of Psychology, Indiana University, Bloomington, Indiana 4740, e-mail: mefoster at indiana.edu; or by contacting any member of the training faculty. Linda Smith, smith4 at indiana.edu Michael Gasser, gasser at indiana.edu Indiana University is an Equal Opportunity/Affirmative Action institution. Positions open until filled. From DominikD at cruxfe.com Fri Apr 14 02:22:30 2000 From: DominikD at cruxfe.com (Dominik Dersch) Date: Fri, 14 Apr 2000 16:22:30 +1000 Subject: Job Offer at CruxFE (Sydney) Message-ID: <610AC1238DA7D111B80E0020AFF2B9D1576137@mail.ucs.com.au> Financial Engineering Analyst Crux Financial Engineering (CruxFE) develops advanced systems for finance and industry, including artificial intelligence trading systems and trading tools. CruxFE has experienced significant growth over the last few years. In order to keep our competitive edge sharp, we are seeking to appoint an outstanding Financial Engineering Analyst with a strong artificial neural networks background. Your role will focus on developing and implementing new time series forecasting and trading tools for a broad range of derivative instruments. Ideally you will possess: a recent Ph.D. in Physics, Engineering, Mathematics or a related field, a track record in financial time series forecasting, artificial neural networks, signal processing, pattern recognition and statistics, excellent programming skills in C/C++, Perl, and Matlab across NT and LINUX platforms, and very good communication and problem solving skills. CruxFE offers a highly professional, creative working environment in a fast-paced industry that is experiencing fundamental change. We are located in the central business district of Sydney. For further details about the position and CruxFE, please visit our web page at www.cruxfe.com or contact Dominik Dersch (dominik at cruxfe.com). Please reply quoting Ref.
No.0405 to recruitment at cruxfe.com.au or in writing to Crux Financial Engineering Pty Ltd Level 7, 50 Carrington Street Sydney 2000 PO Box 656, Grosvenor Place NSW 1220 ______________________________________________________________________ Dr Dominik Dersch Research and Development Crux Financial Engineering Australia email: dominikd at cruxfe.com tel: +61 (02) 90040637 From mdorigo at iridia0.ulb.ac.be Fri Apr 14 09:38:45 2000 From: mdorigo at iridia0.ulb.ac.be (Marco Dorigo) Date: Fri, 14 Apr 2000 15:38:45 +0200 (CEST) Subject: CFP: IEEE Transactions on Evolutionary Computation Special Issue on Ant Algorithms and Swarm Intelligence Message-ID: <200004141338.PAA26995@iridia0.ulb.ac.be> ============================================= IEEE Transactions on Evolutionary Computation Special Issue on Ant Algorithms and Swarm Intelligence ============================================= ============================================= We apologize if you receive multiple copies ============================================= ================== CALL FOR PAPERS ================== The IEEE Transactions on Evolutionary Computation will publish a special issue on Ant Algorithms and Swarm Intelligence. The behavior of social insects in general, and of ants living in colonies in particular, has fascinated researchers in ethology and animal behavior for a long time. Many models have been proposed to explain their capabilities. Recently, ant algorithms and swarm intelligence systems have been offered as a novel computational approach that replaces the traditional emphasis on control, preprogramming, and centralization with designs featuring autonomy, emergence, and distributed functioning. These designs are proving flexible and robust, able to adapt quickly to changing environments and to continue functioning even when individual elements fail. The special issue will be dedicated to the publication of original research results on ant algorithms and, more in general, on swarm intelligence. Papers that prove new theoretical results on ant algorithms and swarm intelligence systems behavior, or that describe their successful applications to real-world problems are particularly welcome. The submission of papers is open to any researcher in ant algorithms. Researchers taking part in "ANTS'2000 - From Ant Colonies to Artificial Ants: Second International Workshop on Ant Colony Optimization" will be invited to submit a significantly extended version of their workshop submission to the special issue. Up-to-date information on the special issue is maintained at: http://iridia.ulb.ac.be/~ants/ants2000/ants2000-pub.html =================== EXPECTED TIMELINE =================== The ANTS'2000 workshop will take place September 8 to 9, 2000 in Brussels, Belgium. The deadline for the submission of papers to the special issue will be approximately four months after the workshop, so that authors have the time to improve their papers according to the feedback obtained at the workshop. Every paper will be refereed by at least two experts in the field and by at least one of the editors. To make the review process run smoother a special panel will be formed before the submission deadline. Based on our previous experience, we expect that a high percentage of the papers that will eventually be published will need to undergo a revision process consisting of at least two iterations. The tentative schedule is as follows: December 31, 2000. Deadline for submissions to the special issue. March 15, 2001. 
Referees reports and editors' decisions are sent to authors. May 31, 2001. Deadline for the revised versions of the papers. July 10, 2001. Referees reports and editors' decisions are sent to authors. September 15, 2001. Authors send final versions of the papers to the editors. October 15, 2001. Editors send the special issue to the publisher. The expected publication year of the special issue will be 2002. =================== THE GUEST EDITORS =================== Marco Dorigo, Universite' Libre de Bruxelles, Belgium Luca Maria Gambardella, IDSIA, Manno, Switzerland Martin Middendorf, Universitaet Karlsruhe, Germany Thomas Stuetzle, Technische Universitaet Darmstadt, Germany From nnk at hip.atr.co.jp Mon Apr 17 02:34:40 2000 From: nnk at hip.atr.co.jp (Neural Networks Japan Office) Date: Mon, 17 Apr 2000 15:34:40 +0900 Subject: Neural Networks 13(3) Message-ID: NEURAL NETWORKS 13(3) Contents - Volume 13, Number 3 - 2000 _______________________________________________________________ NEURAL NETWORKS LETTERS: Self-organized hierarchial structure in a plastic network of chaotic units J. Ito, K. Kaneko Improving local minima of Hopfield networks with augmented Lagrange multipliers for large scale TSPs M. Martin-Valdivia, A. Ruiz-Sepulveda, F. Triguero-Ruiz CURRENT OPINIONS: Learning non-stationary conditional probability distributions D. Husmeier ARTICLES: *** Psychology and Cognitive Science *** The patchwork engine: image segmentation from symmetries G.J. Van Tonder, Y. Ejima *** Neuroscience and Neuropsychology *** Position invariant recognition in the visual system with cluttered environments S.M. Stringer, E.T. Rolls *** Mathematical and Computational Analysis *** Local minima and plateaus in hierarchical structures of multilayer perceptrons K. Fukumizu, S.-I. Amari Learning in higher order Boltzmann machines using linear response M.A.R. Leisink, H.J. Kappen A recurrent neural network for solving linear projection equations J. Xia, J. Wang Efficient perceptron learning using constrained steepest descent S.J. Perantonis, V. Virvilis Information complexity of neural networks M.A. Kon, L. Paskota *** Technology and Applications *** A connectionist model for convex-hull of a planar set A. Datta, N.R. Pal, N.R. Pal Multilayer neural networks for solving a class of partial differential equations S. He, K. Reif, R. Unbehauen BOOK REVIEW: Book review: A fruitful blend, or a trinket-box? The MIT Encyclopedia of the Cognitive Sciences R. Raizada _______________________________________________________________ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. 
Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment.
----------------------------------------------------------------------------
Membership Type (INNS / ENNS / JNNS)
----------------------------------------------------------------------------
Membership with Neural Networks:
  INNS $80; ENNS 660 SEK; JNNS Y 15,000 [including Y 2,000 entrance fee]
  Students: INNS $55; ENNS 460 SEK; JNNS Y 13,000 [including Y 2,000 entrance fee]
----------------------------------------------------------------------------
Membership without Neural Networks:
  INNS $30; ENNS 200 SEK; JNNS not available to non-students (subscribe
  through another society), students Y 5,000 [including Y 2,000 entrance fee]
----------------------------------------------------------------------------
Institutional rates:
  INNS $1132; ENNS 2230 NLG; JNNS Y 149,524
----------------------------------------------------------------------------
Name: _____________________________________ Title: _____________________________________ Address: _____________________________________ _____________________________________ _____________________________________ Phone: _____________________________________ Fax: _____________________________________ Email: _____________________________________ Payment: [ ] Check or money order enclosed, payable to INNS or ENNS OR [ ] Charge my VISA or MasterCard card number ____________________________ expiration date ________________________ INNS Membership 19 Mantua Road Mount Royal NJ 08061 USA 856 423 0162 (phone) 856 423 3420 (fax) innshq at talley.com http://www.inns.org ENNS Membership University of Skovde P.O. Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.his.se/ida/enns JNNS Membership c/o Professor Tsukada Faculty of Engineering Tamagawa University 6-1-1, Tamagawa Gakuen, Machida-city Tokyo 113-8656 Japan 81 42 739 8431 (phone) 81 42 739 8858 (fax) jnns at jnns.inf.eng.tamagawa.ac.jp http://jnns.inf.eng.tamagawa.ac.jp/home-j.html ***************************************************************** end. ==================================================================== NEURAL NETWORKS Editorial Office ATR Human Information Processing Research Laboratories 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan TEL +81-774-95-1058 FAX +81-774-95-1008 E-MAIL nnk at hip.atr.co.jp ==================================================================== From wmking at physics.bell-labs.com Mon Apr 17 16:38:33 2000 From: wmking at physics.bell-labs.com (Wayne M. King) Date: Mon, 17 Apr 2000 16:38:33 -0400 Subject: Workshop for the Analysis of Neural Data Message-ID: <38FB7649.1E7EC6DF@physics.bell-labs.com> Hello all, I wanted to share with you this announcement for an upcoming workshop on the analysis of neural data to be held this summer at Woods Hole, MA. The workshop has been extremely productive in past years and we anticipate another good working group this year. I am including the poster for this year's workshop below.
I would appreciate it if you could forward this e-mail to any of your colleagues who might be interested in these issues. If you have any further questions I can be reached via e-mail at wmking at bell-labs.com or by phone at (908) 582-2669. Applications should be mailed or faxed to the address listed in the poster, but I would be happy to answer any questions you might have about the workshop. Please check out the workshop's web site in order to see what has been discussed in previous years, as well as a list of participants. Thank you for your attention and help in disseminating this information. Sincerely, Wayne King Analysis of Neural Data Modern methods and open issues in the analysis and interpretation of multivariate time series and imaging data in the neurosciences 20 August - 2 September 2000 Marine Biological Laboratories - Woods Hole, MA A working group of scientists committed to quantitative approaches to problems in neuroscience will focus their efforts on experimental and theoretical issues related to the analysis of single and multichannel data sets. The work group is motivated by issues in two complementary areas that are critical to an understanding of brain function. The first involves advanced signal processing methods, particularly those appropriate for emerging multisite recording and noninvasive imaging techniques. The second involves the development of a calculus to study the dynamical behavior of nervous systems and the computations they perform. A distinguishing feature of the work group is a close collaboration between experimentalists and theorists with regard to the analysis of data and the planning of experiments. The work group will have a limited number of research lectures, supplemented by tutorials on relevant computational, experimental, and mathematical techniques. The topics covered in the work group will maintain continuity with past years and will include the analysis of point process data (spike trains) as well as continuous processes (LFP, imaging data), and miscellaneous topics such as spike waveform classification. We will have two one-day workshops in addition to the scheduled activities: 28 August - Neuronal Control Signals for Prosthetic Devices 1 September - Statistical Inference for fMRI Time Series Participants: About twenty-five participants, both experimentalists and theorists. Experimentalists are encouraged to bring data records to the work group; appropriate computational facilities will be provided. The work group will further take advantage of interested investigators and course faculty concurrently present at the MBL. We encourage graduate students and postdoctoral fellows as well as senior researchers to apply. PARTICIPANT FEE: $300 Accepted participants will be provided with shared dormitory accommodations at MBL and board. Support: National Institutes of Health-NIMH, NIA, NIAAA, NICHD/NCRR, NIDCD, NIDA, and NINDS. Organizers: David Kleinfeld (UCSD) and Partha P. Mitra (Bell Laboratories, Lucent Technologies). Website: www.vis.caltech.edu/~WAND/ Application: Send a copy of your c.v. together with a cover letter that contains a brief (ca. 200 word) paragraph on why you wish to attend the work group to: Ms. Jean B. Ainge Bell Laboratories, Lucent Technologies 700 Mountain Avenue 1D-427 Murray Hill, NJ 07974 908-582-4702 (fax) or Graduate students and postdoctoral fellows are encouraged to include a brief letter of support from their research advisor.
Applications must be received by 19 May 2000 Participants will be notified by 29 May 2000 From ms at acl.icnet.uk Tue Apr 18 12:08:43 2000 From: ms at acl.icnet.uk (Margarita Sordo) Date: Tue, 18 Apr 2000 17:08:43 +0100 (BST) Subject: DPhil Thesis: A Neurosymbolic Approach to the Classification of Scarce and Complex Data Message-ID: <200004181608.RAA02502@marr.acl.icnet.uk> Dear Connectionists, My DPhil thesis on knowledge-based neural networks for classification of scarce and complex medical data is now available. Title: "A Neurosymbolic Approach to the Classification of Scarce and Complex Data" It can be found at: http://www.acl.icnet.uk/lab/aclsanchez.html Abstract: Artificial neural networks possess characteristics that make them a useful tool for pattern recognition and classification. Important features such as generalization, tolerance to noise and graceful degradation make them a robust learning paradigm. However, their performance strongly relies on large amounts of data for training. Therefore, their applicability is precluded in domains where data are scarce. Knowledge-based artificial neural networks (KBANNs) provide a means for combining symbolic and connectionist approaches into a hybrid methodology capable of dealing with small datasets. The suitability of such networks has been evaluated in binary-valued domain theories. After replicating some initial results with a binary-valued domain theory, this thesis presents new results with scarce and complex real-valued medical data. 31P magnetic resonance spectroscopy (MRS) of normal and cancerous breast tissues provide good testbeds to assess the advantages of such a methodology over other, more traditional connectionist approach for classification purposes in constrained domains. Experimental work confirmed the suitability of the proposed neurosymbolic approach for real-life applications with such constraints. Knowledge in the symbolic module helps to overcome the difficulties found by the connectionist module when confronted with small datasets. Details of breast tissue metabolism and MRS are presented. Knowledge acquisition methodologies for gathering the required knowledge for the definition of the domain theories are also described. Future directions for improving the KBANN methodology are discussed. =============================================================================== Margarita Sordo Sanchez ms at acl.icnet.uk Advanced Computation Laboratory Imperial Cancer Research Fund 61 Lincoln's Inn Fields, London WC2A 3PX, England, United Kingdom phone: 44 (020) 7242 0200 Ext 2911 44 (020) 7269 2911 (direct) fax: 44 (020) 7269 3186 =============================================================================== From sam26 at cam.ac.uk Tue Apr 18 10:57:15 2000 From: sam26 at cam.ac.uk (Stu McLellan) Date: Tue, 18 Apr 2000 15:57:15 +0100 Subject: Post Doctoral Position Message-ID: <02bd01bfa946$62d6c370$0cbe6f83@psychol.cam.ac.uk> UNIVERSITY OF CAMBRIDGE CENTRE FOR SPEECH AND LANGUAGE DEPARTMENT OF EXPERIMENTAL PSYCHOLOGY Post-doctoral Research Associate (full time) Connectionist modeller Applications are invited for a post-doctoral RA to work as part of a multi-disciplinary team, led by Professor L K. Tyler, investigating the interface between functional and neural accounts of the language system. 
Candidates should have a strong background and training in connectionist modelling and an interest in semantics and/or morphology and will be expected to contribute to the research programme by developing and analysing computational models with the aim of extending and testing theoretical accounts and generating novel predictions. Related experience in psycholinguistics, neuropsychology or neuroimaging would also be an advantage. This post is funded for a maximum of 5 years, starting as soon as possible. Salary will be on the RA1A scale GBP15,735 - 23,651 (under review) according to age and experience. Applications in the form of a covering letter, full c.v., and the names and addresses of three referees (including email address) should be sent to Professor L. K. Tyler, Department of Experimental Psychology, University of Cambridge, Downing Street, Cambridge CB2 3EB, to arrive no later than 15 May 2000. Informal enquiries can be emailed to hem10 at cam.ac.uk (until 28.4.00) and thereafter to lktyler at csl.psychol.cam.ac.uk. The University of Cambridge is an equal opportunities employer. From josh at vlsia.uccs.edu Tue Apr 18 14:12:16 2000 From: josh at vlsia.uccs.edu (Alspector) Date: Tue, 18 Apr 2000 12:12:16 -0600 (MDT) Subject: Research programmer position at university spinoff Internet startup Message-ID: Personalogy, Inc. is a fast-growing company in Colorado Springs, CO that develops and applies state-of-the-art machine learning techniques to personalize information on the Internet. We are currently looking for a research programmer with the following credentials: -Master's or PhD degree in Computer Science, EE or related field -1-4 years of research and professional programming experience -strong programming skills in Perl/C/C++ -good knowledge of HTML/CGI -strong interests in information retrieval and user modeling -background in machine learning/neural networks/intelligent data mining -comfortable with Windows/NT/2K and Unix/Solaris/Linux systems -very good communication and problem solving skills The person will be responsible for enhancing and optimizing the core algorithms of the company as well as researching novel ways of reorganizing internet page contents using user-specific information. The person will also be involved in the overall system design. Personalogy offers a competitive salary, benefits and stock options. Please respond by sending a resume to personalogy at personalogy.net. Professor Joshua Alspector Univ. of Colorado at Col. Springs Dept. of Elec. & Comp. Eng. P.O. Box 7150 Colorado Springs, CO 80933-7150 (719) 262 3510 (719) 262 3589 (fax) josh at eas.uccs.edu From anderson at europa.cog.brown.edu Tue Apr 18 16:49:16 2000 From: anderson at europa.cog.brown.edu (anderson) Date: Tue, 18 Apr 2000 16:49:16 -0400 (EDT) Subject: Position available at Simpli.com Message-ID: <200004182049.QAA04226@europa.cog.brown.edu> Computer Scientist with Experience in Linguistics and Neural Networks Apply your skills to a new problem developing advanced language tools for Web search and other applications. Work in a small, dynamic internet start-up. Simpli.com (www.simpli.com) seeks a computer scientist or software engineer with a strong background in computational linguistics, preferably having experience with natural language processing, neural networks or advanced statistics. MA or PhD or 3-4 years experience required. Salary and Benefits: Salary commensurate with experience. Competitive benefits package.
Environment: Simpli.com is an up-and-coming internet start-up devoted to improving web search. We were a "Company to Watch" at Demo 2000 in February. We are located in downtown Providence, Rhode Island, within walking distance of Brown University and the Rhode Island School of Design. Providence has a vibrant arts community, museums, theaters, and world class dining. Living costs are extremely reasonable. Providence is an hour from Boston and 4 hours from New York. To apply: Mail, fax, or e-mail a cover letter, current resume and names of three references to: Andrew Duchon, Simpli.com, Inc., 203 S. Main, Providence, RI 02903. fax: 401-621-3220. email: aduchon at simpli.com. From mjhealy at u.washington.edu Tue Apr 18 22:19:00 2000 From: mjhealy at u.washington.edu (M. Healy) Date: Tue, 18 Apr 2000 19:19:00 -0700 (PDT) Subject: IJCNN 2000 paper available (fwd) Message-ID: My paper accepted for IJCNN 2000, M. J. Healy (2000), "Category Theory Applied to Neural Modeling and Graphical Representations" is available (with minor revisions) on the web in postscript format at http://cialab.ee.washington.edu/pubs.htm. I can supply .pdf format for anyone who requests it from me at mjhealy at u.washington.edu. This has a larger scope than my IJCNN 99 paper. Following a brief tutorial on category theory, it shows a simplified view of how functors implement concept hierarchies in a category of neural architecture components through the use of colimits (the content of the 1999 paper). It goes on to show how natural transformations between the functors interconnect the hierarchies at all their levels of abstraction. This yields a mathematical model of the semantics of the formation of ever-more-complex concepts in a connectionist memory during adaptation, and multiple implementations in subnetworks associated with different processing functions (vision, tactile and other sensors, association regions, motor function, etc.). The multiple implementations are hierarchies directed from the abstract to the specialized. These must be interconnected in a manner consistent with abstraction. Here, we mean that, first, different implementations of concepts at different abstraction levels must maintain the same relative positions in the implementation hierarchies. Second, performing a cross-hierarchy association and then abstracting or specializing (moving "down" or "up" in the associated hierarchy) must be interchangeable with first moving "down" or "up" and then moving across, so that the order of different stages in a perception or reasoning exercise does not change the semantics of what is being perceived or reasoned about. My claim is that the functors and natural transformations capture these properties mathematically. A full paper is under construction. Mike
--
===========================================================================
Michael J. Healy                             (425) 865-3123
c/o The Boeing Company                       FAX (425) 865-2964
PO Box 3707 MS 7L-66                         michael.j.healy at boeing.com
Seattle, WA 98124-2207 USA                   -or- mjhealy at u.washington.edu
-or for priority mail-
2760 160th Ave SE MS 7L-66
Bellevue, WA 98008 USA

          e_A
   FA ----------> GA
   |              |
   | Ff           | Gf
  \|/            \|/
   FB ----------> GB
          e_B

"I'm a natural man."
============================================================================
From kruschke at indiana.edu Wed Apr 19 10:27:54 2000 From: kruschke at indiana.edu (John K. Kruschke) Date: Wed, 19 Apr 2000 09:27:54 -0500 Subject: Post Doc in Cognitive Modeling at Indiana U.
Message-ID: <38FDC26A.753DCB58@indiana.edu> > POSTDOCTORAL TRAINING FELLOWSHIPS in MODELING OF COGNITIVE > PROCESSES. The Psychology Department and Cognitive Science Program at > Indiana University anticipate one or more Postdoctoral Traineeships > funded by the National Institutes of Health. > Appointments will pay rates appropriate for a new or > recent Ph.D. and will be for one or two years, beginning July > 1, 2000 or later. Traineeships will be offered to qualified individuals > who wish to further their training in mathematical modeling or computer > simulation modeling, in any substantive area of cognitive psychology or > Cognitive Science. Women, minority group members, and handicapped > individuals are urged to apply. The NIMH awards are restricted to U.S. > citizens or permanent residents. Deadline for submission of application > materials has been extended to May 1, 2000, but we encourage earlier > applications. Applicants should send an up-to-date vita, > relevant reprints and preprints, a personal letter describing > their research interests, background, goals, and career plans, > and reference letters from two individuals. Send Materials to > Professor Jerome R. Busemeyer, Department of Psychology, Rm 367, Indiana > University, 1101 E. 10th St. Bloomington, IN 47405-7007. > Cognitive Science information may be obtained at > http://www.psych.indiana.edu/ Indiana University is an > Affirmative Action Employer. From X.Yao at cs.bham.ac.uk Wed Apr 19 06:44:16 2000 From: X.Yao at cs.bham.ac.uk (Xin Yao) Date: Wed, 19 Apr 2000 11:44:16 +0100 (BST) Subject: Lecturer in Computer Science (3 posts) Message-ID: Dear colleagues, We are currently inviting applications for the following three posts. Evolutionary and natural computation is one of the areas that we are particularly interested in. General enquiries should be directed to the HoS (contact info below). I'm happy to answer questions related to the research activities in evolutionary and natural computation. Preliminary announcement: A three-year research fellowship in evolutionary computation in the School of CS at the University of Birmingham will be advertised formally soon. Informal enquiries can be made to me. Best regards, Xin Yao (x.yao at cs.bham.ac.uk) ----------------------------------------------------------------------- URL for further particulars: http://www.bham.ac.uk/personnel/s35434.htm ----------------------------------------------------------------------- REFERENCE NUMBER S35434/00 JOB TITLE Lecturer in Computer Science (3 posts) DEPARTMENT/SCHOOL School of Computer Science HOURS Full time STARTING SALARY On Lecturer A or B scale in the range GBP17,238 - 30,065 per annum (Depending on experience and qualifications) DURATION Open STARTING DATE As soon as possible INFORMAL ENQUIRIES Prof Achim Jung (Head of School) phone (+44) 121 414 4776 email: A.Jung at cs.bham.ac.uk CLOSING DATE FOR RECEIPT OF APPLICATIONS 16 May 2000 Late applications may be considered APPLICATION FORMS RETURNABLE TO The Director of Personnel Services The University of Birmingham Edgbaston, Birmingham, B15 2TT England RECRUITMENT OFFICE FAX NUMBER: +44 121 414 4802 RECRUITMENT OFFICE TELEPHONE NUMBER: +44 121 414 6486 RECRUITMENT OFFICE E-MAIL ADDRESS: h.h.luong at bham.ac.uk Applications are invited for three Lectureships in Computer Science at the University of Birmingham. Applications from all areas of Computer Science will be considered but preference will be given to candidates who show promise in the areas discussed below. 
(See the web page for more details) The successful candidate should have or be about to complete a PhD in Computer Science or an appropriate, closely related field. (S)he is expected to have research experience as evidenced by publications in leading international journals or conference proceedings. The research potential of a new PhD may also be judged from his/her PhD thesis. The successful candidate must have the commitment to achieve excellence in teaching at all levels (from undergraduate teaching to research student supervision), including teaching subjects that may not be in his/her research areas. All academic staff are also expected to help with administration. ----------------------------------------------------------------------- From a540aa at email.sps.mot.com Wed Apr 19 20:16:24 2000 From: a540aa at email.sps.mot.com (Kari Torkkola (a540aa)) Date: Wed, 19 Apr 2000 17:16:24 -0700 Subject: papers available on dimension reduction Message-ID: <38FE4C58.D66F5AFA@email.mot.com> Two papers on dimension reduction are available: 1. Kari Torkkola and William M. Campbell, Mutual Information in Learning Feature Transformations Abstract We present feature transformations useful for exploratory data analysis or for pattern recognition. Transformations are learned from example data sets by maximizing the mutual information between transformed data and their class labels. We make use of Renyi's quadratic entropy, and we extend the work of Principe et al. to mutual information between continuous multidimensional variables and discrete-valued class labels. The paper can be retrieved through page http://members.home.net/torkkola/mmi.html together with some illustrative examples. 2. William M. Campbell, Kari Torkkola, and Sreeram V. Balakrishnan, Dimension Reduction Techniques for Training Polynomial Networks Abstract We propose two novel methods for reducing dimension in training polynomial networks. We consider the class of polynomial networks whose output is the weighted sum of a basis of monomials. Our first method for dimension reduction eliminates redundancy in the training process. Using an implicit matrix structure, we derive iterative methods that converge quickly. A second method for dimension reduction involves a novel application of random dimension reduction to ``feature space.'' The combination of these algorithms produces a method for training polynomial networks on large data sets with decreased computation over traditional methods and model complexity reduction and control. http://members.home.net/torkkola/sp_papers/campbell-icml2000.ps.gz or http://members.home.net/torkkola/sp_papers/campbell-icml2000.pdf Both papers will appear in the Proceedings of ICML 2000, June 29 - July 2, Stanford, CA. -- Kari Torkkola phone: +1-480-4134129 Motorola Labs, MD EL508 fax: +1-480-4137281 2100 East Elliot Road email: a540aa at email.mot.com Tempe, AZ 85284 http://members.home.net/torkkola From janetw at csee.uq.edu.au Thu Apr 20 02:12:40 2000 From: janetw at csee.uq.edu.au (Janet Wiles) Date: Thu, 20 Apr 2000 16:12:40 +1000 (EST) Subject: Lecturer in Computer Science and Electrical Engineering Message-ID: Dear colleagues, We are currently inviting applications for several tenurable posts. Neural and evolutionary computation are areas we are particularly interested in. General enquiries should be directed to the HoD (contact info below). I'm happy to answer questions related to the research activities in neural and evolutionary computation. 
Best regards, Janet Wiles -------------------------------------------- URL for CSEE Dept http://www.csee.uq.edu.au/ -------------------------------------------- THE UNIVERSITY OF QUEENSLAND (Brisbane, Australia) DEPARTMENT OF COMPUTER SCIENCE & ELECTRICAL ENGINEERING Lecturer/Senior Lecturer/Associate Professor in Computer Systems The Department of Computer Science & Electrical Engineering within the Faculty of Engineering, Physical Sciences and Architecture is one of the largest in Australia, with a strong research base and a large postgraduate research school. The position(s) available are continuing, with level of appointment depending on candidate's qualifications and experience. Appointees must have a strong commitment to research and demonstrated achievement appropriate to the level of appointment in one of the following areas of computer systems: Distributed Systems; Intelligent Machines/Systems; or Microelectronics/ Digital System Design. Opportunities for research into computer systems are extensive, particularly through involvement with the departmental Intelligent Systems and Digital Systems research groups, large research centres (such as CRC for Distributed Systems Technology and ARC Special Research Centre in Applied Genomics) and the proposed Australian Microelectronics Network. Further information on the Department may be found at http://www.csee.uq.edu.au/ Appointees will also be expected to contribute to well-established programs in Engineering (Computer Systems, Electrical, Software) and Information Technology, including supervision of Honours, Masters, and PhD students, and to contribute to the development of postgraduate coursework and to the internationalisation of the department's teaching. To these ends, appropriate levels of knowledge and demonstrated expertise in more than one of the following areas are essential: Computer Architecture; Digital System Design; Distributed Systems; Embedded/Systems Programming; and Neural Computing. The appointments will be continuing at Levels B, C or D. Salary ranges are: Aus$49,535 - $58,823 per annum (Level B); $60,681 - $69,968 per annum (Level C); $73,064 - $80,495 per annum (Level D), plus employer superannuation contribution of 17%. Contact Professor Paul Bailes, Head, Department of Computer Science and Electrical Engineering, The University of Queensland, on telephone (07) 3365 3869; fax (07) 3365 4999; email hod at csee.uq.edu.au to discuss the role or to obtain a position description and selection criteria. Alternatively, see the University web site. Send nine (9) copies of your application (an original plus eight (8) copies) to the Personnel Officer, Faculty of Engineering, Physical Sciences and Architecture, The University of Queensland, QLD 4072, Australia. Please quote Reference No. 17700, address the selection criteria and include a resume and the names and contact details of 3 referees. Closing date for applications: 26 May 2000. From RaymonB at heartsol.wmids.nhs.uk Thu Apr 20 06:17:52 2000 From: RaymonB at heartsol.wmids.nhs.uk (Raymond Ben) Date: Thu, 20 Apr 2000 11:17:52 +0100 Subject: Thesis on data visualisation and HRV analysis Message-ID: The following PhD thesis is available from http://www.eleceng.adelaide.edu.au/Personal/braymond/thesis.pdf.gz It deals mostly with the application of data visualisation techniques to a biomedical problem (heart rate variability analysis) but does delve a little into issues of training and regularisation in the LSS and GTM. Comments welcome. 
Ben Abstract Variations in heart rate on a beat-to-beat basis reflect variations in autonomic tone. Heart rate variability (HRV) analysis is a popular research tool for probing autonomic function and has found a wide range of applications. This thesis studies data visualisation and classification of HRV as alternatives to established processing methods. Data visualisation algorithms transform a high-dimension data set into an easily-visualised, two-dimensional representation, or mapping. This transformation is conducted such that the interesting structure of the data set is preserved. Visualisaton techniques thus allow the researcher to investigate relationships between HRV data without the need to define fixed bands of interest within each spectrum. Two visualisation algorithms are primarily used throughout this thesis: the least-squares scaling (a form of multidimensional scaling) and the generative topographic mapping (GTM). The least-squares scaling (LSS) may be implemented using radial basis function neural networks, adding the ability to project new data onto an existing mapping. The training of such networks can be done in conjunction with the construction of the map itself, or as a separate step. It has previously been suggested that the former approach yields smoother networks and thus better generalisation; here, it is shown that, with appropriate network regularisation, the generalisation properties of the two methods are comparable. It is also shown that the incorporation of prior knowledge (such as class labels) into the LSS can improve the visual properties of the resulting mapping. A simple modification to the GTM is given to allow similar use of prior information. The visualisation of HRV data is demonstrated on two data sets. The first was from a simple study involving postural and pharmacological intervention in healthy subjects. The LSS and GTM both produced logical mappings of the data, with the ordering of the points within the map reflecting the sympathovagal balance during the various phases of the study. Data visualisation is also demonstrated on HRV data from overnight studies into the sleep apnoea/hypopnoea syndrome. The ordering of the points within the map in this case was strongly related to the power in the very low frequency region of the spectra, known to be an indicator of sleep apnoea. Subjects who suffered predominantly hypopnoeas rather than true apnoeas were found to show HRV similar to control subjects. The final section of the thesis briefly addresses the classification of HRV, emphasising the combination of HRV with information from other diagnostic signals and sources. Classification of data from the intervention study showed that mean heart rate together with HRV allowed more reliable classification than did either mean heart rate or HRV alone. In the classification of sleep apnoea data, the addition of body mass index and age did not improve classification; however, the inclusion of oxyhaemoglobin desaturation information did improve the classification accuracy. From golden at utdallas.edu Thu Apr 20 17:19:24 2000 From: golden at utdallas.edu (Richard M Golden) Date: Thu, 20 Apr 2000 16:19:24 -0500 (CDT) Subject: Special Issue on "Model Selection" in Journal of Mathematical Psychology Message-ID: I would like to call people's attention to the special issue on "model selection" in the Journal of Mathematical Psychology which just came out. Here is the table of contents. Journal of Mathematical Psychology, Vol. 
44, March 2000 "Accuracy, Scope, and Flexibility of Models" James E. Cutting "How to Assess a Model's Testability and Identifiability" Bamber and van Santen "An Introduction to Model Selection" Walter Zucchini "Akaike's Information Criterion and Recent Developments in Information Complexity" Hamparsum Bozdogan "Bayesian Model Selection and Model Averaging" Larry Wasserman "Cross-Validation Methods" P. Grunwald "Statistical Tests for Comparing Possibly Misspecified and Nonnested Models" Richard Golden "Model Comparisons and Model Selections based on Generalization Criterion Methodology" Jerome Busemeyer and Yi-Min Wang "The Importance of Complexity in Model Selection" In Jae Myung "Key Concepts in Model Selection: Performance and Generalizability" Malcolm Forster ******************************************************************************* Richard M. Golden, Associate Professor Cognitive Science & Engineering The University of Texas at Dallas, Box 830688 Richardson, Texas 75083-0688, PHONE: (972) 883-2423 EMAIL: golden at utdallas.edu, WEB: http://www.utdallas.edu/~golden/index.html ******************************************************************************* From wahba at stat.wisc.edu Thu Apr 20 20:37:03 2000 From: wahba at stat.wisc.edu (Grace Wahba) Date: Thu, 20 Apr 2000 19:37:03 -0500 (CDT) Subject: Intro Model Bldg w. RKHS Message-ID: <200004210037.TAA18824@hera.stat.wisc.edu> `An Introduction to Model Building With Reproducing Kernel Hilbert Spaces' notes from a shortcourse given at Interface 2000 - now available at ftp://ftp.stat.wisc.edu/pub/wahba/interf/index.html Abstract: We assume no knowledge of reproducing kernel Hilbert spaces, but review some basic concepts, with a view towards demonstrating how this setting allows the building of interesting statistical models that allow the simultaneous analysis of heterogeneous, scattered observations, and other information. The abstract ideas will be illustrated with several data analyses including modeling risk factors for eye diseases. ............................................... Gaussian processes, radial basis functions, support vector machines, functional ANOVA decompositions and the bias-variance tradeoff fit in the framework discussed. ................................................ Grace Wahba From hzs at cns.brown.edu Fri Apr 21 12:51:24 2000 From: hzs at cns.brown.edu (Harel Z. Shouval) Date: Fri, 21 Apr 2000 12:51:24 -0400 (EDT) Subject: Symposium announcement Message-ID: SYMPOSIUM ANNOUNCEMENT: The Dynamic Brain: Molecules, Mathematics, the Mind Brown University, May 31 - June 3, 2000 The "Dynamic Brain" Meeting celebrates interdisciplinary research and education as well as the Brain Science Program at Brown University, which was launched in October 1999. Twenty-four of the world's leading researchers will present the latest findings in brain development and function. The meeting will bring together experimentalists and theoreticians to discuss how we can accelerate our understanding of the brain through interdisciplinary studies that blend mathematics, biology, computation, and cognitive and behavioral sciences; this collaboration across disciplines is the model for the Brain Science Program at Brown. In particular, there will be an emphasis on the interaction of theory and experiment that has greatly enriched both endeavors. This interaction has enabled scientists to pose new questions with precision and clarity.
Each of the four sessions will span levels of study and will include group discussion of challenges and interdisciplinary approaches to understanding brain function. We invite you to join us for this landmark event. John Donoghue and Leon Cooper, conference organizers PROGRAM Wednesday, May 31 Registration: Begins at 3:00 p.m. Opening Reception: 6:00 p.m. Thursday, June 1 Session I: 9:00 a.m.- 3:00 p.m. Receptive Field Plasticity: From Molecule to Systems Speakers: Mark Bear, Tobias Bonhieffer, Leon Cooper, Yves Fregnac, Robert Malenka, Susumu Tonegawa Session II: 3:00-6:30 p.m. Temporal Dynamics: From Synapse to Systems Speakers: Laurence Abbott, Barry Connors, John Donoghue, Eve Marder, Henry Markram, Carla Shatz Friday, June 2 Session II: (continued) 9:00-11:00 a.m. Temporal Dynamics: From Synapse to Systems Session III: 11:00 a.m.- 5:00 p.m. Memory Consolidation: From Molecule to Behavior Speakers: Cristina Alberini, Justin Fallon, Eric Kandel, Richard Morris, Larry Squire, Jerry Yin Saturday, June 3 Session IV: 9:00 a.m.- 3:00 p.m. The Neuronal Mind Speakers: Jean-Pierre Changeux, Richard Frackowiak, David Mumford, Keiji Tanaka, Michael Tarr REGISTRATION INFORMATION Fees Graduate and Postdoctoral Students $50 General Registration $100 Accommodations Westin Hotel, Waterplace Park, Providence. Special conference rates are available until April 30, 2000: US $149 single/double; $175 triple/quad. Contact the hotel directly at: Tel 401.598.8000 Fax 401.598.8200 Web site http://www.westin.com/ FOR MORE INFORMATION Tel 401.863 9524 E-mail brainscience at brown.edu Web site http://www.brainscience.brown.edu/ From rsun at cecs.missouri.edu Fri Apr 21 14:35:20 2000 From: rsun at cecs.missouri.edu (Ron Sun) Date: Fri, 21 Apr 2000 13:35:20 -0500 Subject: recent issues of Cognitive Systems Research Message-ID: <200004211835.NAA07121@pc113.cecs.missouri.edu> Contents of the recent issues of Cognitive Systems Research: ------------------------- Table of Contents for Cognitive Systems Research Volume 1, Issue 1, 1999 Ron Sun, Vasant Honavar and Gregg C. Oden Editorial: Integration of cognitive systems across disciplinary boundaries 1-3 Andy Clark Where brain, body, and world collide 5-17 Arthur M. Glenberg, David A. Robertson, Jennifer L. Jansen and Mina C. Johnson-Glenberg Not Propositions 19-33 Arthur C. Graesser, Katja Wiemer-Hastings, Peter Wiemer-Hastings and Roger Kreuz AutoTutor: A simulation of a human tutor 35-51 Pentti Kanerva Book Review: Artificial Minds, Stan Franklin, MIT Press, Cambridge, MA, 1995 53-57 Xin Yao Conference Report: Evolutionary computation comes of age 59-64 ------------------------- Table of Contents for Cognitive Systems Research Volume 1, Issue 2, January 2000 Mark H. Bickhard Information and representation in autonomous agents 65-75 Valerie Gray Hardcastle The development of the self 77-86 Umberto Castiello et al. Human inferior parietal cortex `programs' the action class of grasping 89-97 Marsha C. Lovett, Larry Z. Daily and Lynne M. Reder A source activation theory of working memory: cross-task prediction of performance in ACT-R 99-118 A. El Imrani, A. Bouroumi, H. Zine El Abidine, M. Limouri and A. Essad A fuzzy clustering-based niching approach to multimodal function optimization 119-133 ------------------------- Table of Contents for Cognitive Systems Research Volume 1, Issue 3, April 2000 Brijesh Verma and Chris Lane Vertical jump height prediction using EMG characteristics and neural networks 135-141 Scott A. 
Huettel and Gregory Lockhead Psychologically rational choice: selection between alternatives in a multiple-equilibrium game 143-160 Robert C. Mathews, Lewis G. Roussel, Barbara P. Cochran, Ann E. Cook and Deborah L. Dunaway The role of implicit learning in the acquisition of generative knowledge 161-174 ------------------------------------------------------------------- Publish your work with Cognitive Systems Research --- the new journal devoted to the interdisciplinary study of cognitive science http://www.elsevier.nl/locate/cogsys Elsevier Science Co-Editors-in-Chief Ron Sun, University of Missouri-Columbia. E-mail: rsun at cecs.missouri.edu Vasant Honavar, Iowa State University. E-mail: honavar at cs.iastate.edu Gregg Oden, University of Iowa. E-mail: gregg-oden at uiowa.edu Cognitive Systems Research covers all topics of cognition, including ' Problem-Solving and Cognitive Skills ' Knowledge Representation and Reasoning ' Perception ' Action and Behavior ' Memory ' Learning ' Language and Communication ' Agents ' Integrative and Interdisciplinary Studies For a full description of subjects and submission information, access the Website: http://www.elsevier.nl/locate/cogsys or http://www.cecs.missouri.edu/~rsun/journal.html ------------------------------------------------------------------- From tommi at ai.mit.edu Fri Apr 21 16:04:43 2000 From: tommi at ai.mit.edu (Tommi Jaakkola) Date: Fri, 21 Apr 2000 16:04:43 -0400 Subject: AISTATS 2001: Call for papers Message-ID: <200004212004.QAA06446@susi.ai.mit.edu> (apologies for multiple posting) ==================================================================== AI and STATISTICS 2001 Eighth International Workshop on Artificial Intelligence and Statistics January 3-6, 2001, Hyatt Hotel, Key West, Florida http://www.ai.mit.edu/conferences/aistats2001/ This is the eighth in a series of workshops which have brought together researchers in Artificial Intelligence (AI) and in Statistics to discuss problems of mutual interest. The exchange has broadened research in both fields and has strongly encouraged interdisciplinary work. Papers on all aspects of the interface between AI & Statistics are encouraged. To encourage interaction and a broad exchange of ideas, the presentations will be limited to about 20 discussion papers in single session meetings over three days (Jan. 4-6). Focused poster sessions will provide the means for presenting and discussing the remaining research papers. Papers for poster sessions will be treated equally with papers for presentation in publications. Attendance at the workshop will not be limited. The three days of research presentations will be preceded by a day of tutorials (Jan. 3). These are intended to expose researchers in each field to the methodology and techniques used in other related areas. 
The Eighth workshop especially encourages submissions related to the following workshop themes in the interface between information retrieval and statistics: Statistical natural language processing Game theory Missing information; unlabeled examples Error correcting codes In addition, papers on all aspects of the interface between AI & Statistics are strongly encouraged, including but not limited to Automated data analysis Cluster analysis and unsupervised learning Statistical advisory systems, experimental design Integrated man-machine modeling methods Interpretability in modelling Knowledge discovery in databases Metadata and the design of statistical data bases Model uncertainty, multiple models Multivariate graphical models, belief networks, causal modeling Online analytic processing in statistics Pattern recognition Prediction: classification and regression Probabilistic neural networks Probability and search Statistical strategy Vision, robotics, natural language processing, speech recognition Visualization of very large datasets Submission Requirements: ----------------------- Electronic submission of abstracts is required. The abstracts (up to 4 pages in length) should be submitted through the AI and Statistics Conference Management page supported by Microsoft Research. More specific instructions will be made available at http://cmt.research.microsoft.com/AISTATS2001/ In special circumstances other arrangements can be made to facilitate submission. For more information about possible arrangements, please contact the conference chairs. Submissions will be considered if they are received by midnight July 1, 2000. Please indicate the theme and/or the topic(s) your abstract addresses. Receipt of all submissions will be confirmed via electronic mail. Acceptance notices will be emailed by September 1, 2000. Preliminary papers (up to 12 pages, double column) must be received by November 1, 2000. These preliminary papers will be copied and distributed at the workshop. 
Program Chairs: -------------- Thomas Richardson, University of Washington, tsr at stat.washington.edu Tommi Jaakkola, MIT, tommi at ai.mit.edu Program Committee: ----------------- Russell Almond, Educational Testing Service, Princeton Hagai Attias, Microsoft Research, Cambridge Yoshua Bengio, University of Montreal Max Chickering, Microsoft Research, Redmond Greg Cooper, University of Pittsburgh Robert Cowell, City University, London Phil Dawid, University College, London Vanessa Didelez, University of Munich David Dowe, Monash University Brendan Frey, University of Waterloo Nir Friedman, Hebrew University, Jerusalem Dan Geiger, Technion Edward George, University of Texas Paolo Giudici, University of Pavia Zoubin Ghahramani, University College, London Clark Glymour, Carnegie-Mellon University David Heckerman, Microsoft Research, Redmond Thomas Hofmann, Brown University Reimar Hofmann, Siemens Michael Jordan, University of California, Berkeley David Madigan, Soliloquy Chris Meek, Microsoft Research, Redmond Marina Meila, Carnegie-Mellon University Kevin Murphy, University of California, Berkeley Mahesan Niranjan, University of Sheffield John Platt, Microsoft Research, Redmond Greg Ridgeway, University of Washington Lawrence Saul, AT&T Research Prakash Shenoy, University of Kansas Dale Schuurmans, University of Waterloo Padhraic Smyth, University of California, Irvine David Spiegelhalter, University of Cambridge Peter Spirtes, Carnegie-Mellon University Milan Studeny, Academy of Sciences, Czech Republic Michael Tipping, Microsoft Research, Cambridge Henry Tirri, University of Helsinki Volker Tresp, Siemens Chris Watkins, Royal Holloway and Bedford New College, Nanny Wermuth, University of Mainz Joe Whittaker, Lancaster University Chris Williams, University of Edinburgh From maass at igi.tu-graz.ac.at Sat Apr 22 11:46:47 2000 From: maass at igi.tu-graz.ac.at (Wolfgang Maass) Date: Sat, 22 Apr 2000 17:46:47 +0200 Subject: Program of the NeuroCOLT Workshop May 2000 in Graz (Austria) Message-ID: <3901C967.A8561F95@igi.tu-graz.ac.at> Program of the NeuroCOLT Workshop NEW PERSPECTIVES IN THE THEORY OF NEURAL NETS May 3 to 5 , 2000 at Schloss St. Martin in Graz (Austria). 
Organizer: Wolfgang Maass, Graz University of Technology ---------------------------------------------------- Wednesday, May 3 morning: Shai Ben-David, Israel: An Efficient Agnostic Learning Algorithm for Half-Spaces Michael Schmitt, Germany: On the Complexity of Computing and Learning with Multiplicative Neural Networks Pekka Orponen, Finland: Some new results on the computational properties of analog recurrent neural networks Georg Dorffner, Austria: Recurrent neural networks and symbolic dynamics afternoon: Juergen Schmidhuber, CH: Long Short-Term Memory and Context Sensitive Languages Volker Tresp, Germany: The Generalized Bayesian Committee Machine Nicol N.Schraudolph, CH: Stochastic Meta-Descent Ron Meir, Israel: Localized Boosting Algorithms and Weak Learning Klaus Obermeyer, Germany: TBA ---------------------------------------------------------- Thursday, May 4 morning: Rodney Douglas, CH: Transposing cortical processing into neuromorphic analog VLSI circuits Wolfgang Maass/Robert Legenstein, Austria: Foundations of a Circuit Complexity Theory for Sensory Processing Georg Schnitger, Germany: Neural Circuits for Elementary Vision Problems Andreas Herz, Germany: Neural representation of acoustic communication signals afternoon: Wulfram Gerstner, CH: Spike-time Dependent Hebbian Learning Thomas Natschlaeger, Austria: Dynamic Synapses as Nonlinear Filters Peter Koenig, CH: Learning and synchronization (at 4pm we leave for the opening of the exhibition gr2000az in Schloss Eggenberg, see http://www.comm.gr2000az.at/ ) --------------------------------------------------- Friday, May 5 morning: Bob Williamson, Australia: Margins, Sparsity and Perceptrons Bernhard Schoelkopf, GB: Kernels: Similarities and Dissimilarities Chris Bishop, GB: The Variational Relevance Vector Machine John Shawe-Taylor, GB: Bounds Combining Sparsity and Margins afternoon: Helene Paugam-Moisy, France: Multiclass discrimination, SVM, and multimodal learning Manfred Opper, GB: The TAP Mean Field approach for probabilistic models Martin Anthony, GB: Some Results on Cross-Validation Bhaskar Das Gupta, USA: On Approximate Learning by Multi-layered Feedforward Circuits Pascal Koiran, France: The stability of saturated linear dynamical systems is undecidable Zoubin Ghahramani, GB: Bayesian Learning of Model Structure (evening: discussion of the future of research on neural nets in Europe, including meetings and funding options) ----------------------------------------------------- POSTERS: Nello Cristianini, Harald Burgsteiner (A Learning Algorithm for Winner-Take-All Circuits), Jyrki Kivinen, Robert Legenstein (Circuit Complexity Theory for Sensory Processing), Gabor Lugosi, Bojan Novak (Conditions and Requirements for an Efficient Parallel Implementation of Various Learning Algorithms), Laurent Perrinet (Networks of Integrate-and-Fire Neuron using Rank Order Coding: How to implement Hebbian Learning), Jiri Sima, Eva Volna (Optimal Neural Network Topology for Real Problem of Pattern Recognition). 
------------------------------------------------------------- ABSTRACTS, REGISTRATION AND TRAVEL INFORMATION: http://www.tu-graz.ac.at/igi/maass/nn2000/ A few slots are still open for registration; for lodging see http://www.graztourism.at/ From russell at CS.Berkeley.EDU Sun Apr 23 16:58:27 2000 From: russell at CS.Berkeley.EDU (Stuart Russell) Date: Sun, 23 Apr 2000 13:58:27 -0700 (PDT) Subject: UC Berkeley postdoc position(s) Message-ID: <200004232058.NAA01632@tower.CS.Berkeley.EDU> POSTDOC POSITION: MOTOR CONTROL LEARNING The Complex Motor Learning project at UC Berkeley would like to hire one or more postdoctoral research scientists. The project combines approaches from reinforcement learning, adaptive control theory, and biological motor control in order to study and develop systems that learn complex motor control behaviors such as walking, running, throwing, and flying. Experimental subjects include humans, insects, and robotic systems. Faculty investigators include Fearing, Russell, and Sastry (EECS); Dickinson, Farley, and Full (Int. Biol.); and Ivry (Psych.). A more complete project description appears at http://www.cs.berkeley.edu/~russell/cml/ Candidates should have: - a very strong background in at least one and preferably two of the three disciplines listed above; - excellent mathematical skills; and proficiency in at least one of - software development (ideally, physical simulation systems) - experimental robotics - behavioral studies of human or animal subjects Interested candidates should send a short letter stating interest and a CV with names and email addresses of three references to Stuart Russell, Computer Science Division, University of California, Berkeley, CA 94720. Email applications (PLAIN TEXT AND/OR UNENCODED POSTSCRIPT ONLY) to russell at cs.berkeley.edu From xjliu at wspc.com.sg Sun Apr 23 20:58:01 2000 From: xjliu at wspc.com.sg (Xuejun) Date: Mon, 24 Apr 2000 08:58:01 +0800 Subject: New Books on Robotics Message-ID: <00Apr24.085802sst.14978@gateway.wspc.com.sg> **** Apologies if you receive multiple copies **** Dear Colleagues, I wish to inform you on the following recent books and volumes by World Scientific and Imperial College Press. ==================================== Series in Robotics and Intelligent Systems Vol. 24 Robot Learning - An Interdisciplinary Approach Edited by J. Demiris (Univ. Edinburgh) & A. Birk (Vrije Universiteit Brussel) 220pp, May 2000 ISBN: 981-02-4320-0 http://www.worldscientific.com/books/compsci/4436.html ==================================== Associative Learning for a Robot Intelligence By J. H. Andreae (Univ. Canterbury) 360pp, Sept 1998 ISBN: 1-86094-132-x http://www.worldscientific.com/books/compsci/p113.html ===================================== Geometrical Foundations of Robotics Edited by J. M. Selig (South Bank Univ. UK) 164pp, Mar 2000 ISBN: 981-02-4113-5 http://www.worldscientific.com/books/compsci/4257.html ===================================== Series in Intelligent Control and Intelligent Automation - Vol. 11 Multisensor Fusion - A Minimal Representation Framework By R.Joshi (Real-Time Innovations Inc., USA) & A. C Sanderson (Rensselaer Polytechnic Institute, USA) 336pp, Dec 1999 ISBN: 981-02-3880-0 http://www.worldscientific.com/books/compsci/4106.html ===================================== For more information on these and previous books and volumes see the WWW page of the series: ***************************************** Series in Robotics and Intelligent Systems Edited by C. J. Harris (Univ. 
Southampton) http://www.worldscientific.com/books/series/wssris_series.html ***************************************** Series on Machine Perception & Artificial Intelligence Edited by H. Bunke (Univ. Bern) & P. S. P. Wang (Northeastern Univ.) http://www.worldscientific.com/books/series/wsmpai_series.html ***************************************** Series in Intelligent Control and Intelligent Automation Edited by F. Y. Wang (Univ. Arizona) http://www.worldscientific.com/books/series/sicia_series.html ***************************************** Many new books and volumes are coming soon in these series. With my best regards. Sincerely yours, Xuejun Liu World Scientific and Imperial College Press http://www.worldscientific.com/ From terry at salk.edu Mon Apr 24 14:34:28 2000 From: terry at salk.edu (terry@salk.edu) Date: Mon, 24 Apr 2000 11:34:28 -0700 (PDT) Subject: NEURAL COMPUTATION 12:5 Message-ID: <200004241834.LAA03549@hebb.salk.edu> Neural Computation - Contents - Volume 12, Number 5 - May 1, 2000 REVIEW Expanding NEURON's Repertoire of Mechanisms With NMODL M. L. Hines and N. Y. Carnevale ARTICLE Latent Attractors: A Model For Context-Dependent Place Representations in The Hippocampus Simona Doboli, Ali A. Minai and Phillip J. Best NOTE The Approach of A Neuron Population Firing Rate to A New Equilibrium: An Exact Theoretical Result B.W. Knight, A. Omurtag and L. Sirovich Formation of Direction Selectivity in Natural Scene Environments Brian Blais, Harel Shouval and Leon N. Cooper LETTERS A Phase Model of Temperature-Dependent Mammalian Cold Receptors Peter Roper, Paul C. Bressloff and Andre Longtin The Number of Synaptic Inputs and The Synchrony of Large Sparse Neuronal Networks D. Golomb and D. Hansel Neural Network Architecture For Visual Selection Yali Amit Using Bayes Rule to Model Multisensory Enhancement in the Superior Colliculus Thomas J. Anastasio, Paul E. Patton, and Kamel Belkacem-Boussaid Choice and Value Flexibility Jointly Contribute to The Capacity of a Subsampled Quadratic Classifier Panayiota Poirazi and Bartlett W. Mel New Support Vector Algorithms Alex Smola, Bernhard Scholkopf, Robert Williamson, Peter Bartlett ----- ON-LINE - http://neco.mitpress.org/ SUBSCRIPTIONS - 2000 - VOLUME 12 - 12 ISSUES USA Canada* Other Countries Student/Retired $60 $64.20 $108 Individual $88 $94.16 $136 Institution $430 $460.10 $478 * includes 7% GST MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 258-6779 mitpress-orders at mit.edu ----- From mayank at MIT.EDU Tue Apr 25 20:59:59 2000 From: mayank at MIT.EDU (Mayank R. Mehta) Date: Tue, 25 Apr 2000 20:59:59 -0400 Subject: Effect of learning on receptive field shape and direction selectivity. Message-ID: <200004260100.VAA01957@all-night-tool.mit.edu> The following two papers on the effect of Hebbian learning, or temporally asymmetric NMDA-dependent LTP, on receptive field shape [1], and direction selectivity in V1 [2], are available at: http://www.mit.edu/~mayank/ -Mayank -------------------------------------------------------------------- Mayank R. Mehta http://www.mit.edu/~mayank/ E25-236, 45 E Carleton Street Work: 617 252 1841 Massachusetts Institute of Technology FAX: 617 258 7978 Cambridge, MA 02139 Email: Mayank at MIT.edu -------------------------------------------------------------------- Paper 1: `Experience-dependent, asymmetric shape of hippocampal receptive fields'. Mayank R. Mehta, Michael C. Quirk & Matthew A. Wilson. Neuron (2000) 25:707-715.
Abstract: We propose a novel parameter, namely the skewness or the asymmetry of the shape of a receptive field, and use this measure to characterize two properties of hippocampal place fields that may reflect the underlying mechanism of experience-dependent plasticity. First, a majority of hippocampal receptive fields on linear tracks are negatively skewed, such that during a single pass the firing rate is low as the rat enters the field, but high as it exits. Second, while the place fields are symmetric at the beginning of a session, they become highly asymmetric with experience. Further experiments suggest that these results are likely to arise due to synaptic plasticity during behavior, and not due to other non-specific mechanisms. Using a purely feedforward neural network model we show that following repeated directional activation, the temporally asymmetric nature of NMDA-dependent LTP/D could result in an experience-dependent asymmetrization of receptive fields. Paper 2: `From Hippocampus to V1: Effect of LTP on spatio-temporal dynamics of receptive fields'. Mayank R. Mehta & Matthew A. Wilson. To appear in Neurocomputing (2000). Recent studies have revealed novel effects of patterns of neuronal activity and synaptic plasticity on the size and specificity of receptive fields. However, little has been done to quantify their effect on the receptive field *shape*. It has been shown that place fields are highly asymmetric such that the firing rate of a place cell rises slowly as a rat enters a place field but the firing rate drops off abruptly at the end of the place field, in an experience-dependent fashion. Here we present a computational model that can explain the results, based on NMDA-dependent LTP. Striking similarities between the hippocampal and striate receptive field dynamics are pointed out. Our model suggests that LTP/D could result in diverse phenomena such as phase precession in the hippocampal neurons and the origin of directional receptive fields in the striate cortex. It is suggested that the key feature underlying directionality and inseparable spatio-temporal dynamics is the asymmetric shape of the receptive field. From allan at biomedica.org Wed Apr 26 10:28:42 2000 From: allan at biomedica.org (Allan Kardec Barros) Date: Wed, 26 Apr 2000 11:28:42 -0300 Subject: Biomedical Engineering - Mailing list Message-ID: <3906FD1A.EE7A9E03@biomedica.org> Dear connectionists, We have created at Nagoya University a new mailing list for those who are interested in biomedical engineering. You can find further details at: http://www.ohnishi.nuie.nagoya-u.ac.jp/BME. Best regards, Allan Kardec Barros, Yoshinori Takeuchi, Hiroaki Kudo From robert.smith at uwe.ac.uk Wed Apr 26 11:21:09 2000 From: robert.smith at uwe.ac.uk (Robert E. Smith) Date: Wed, 26 Apr 2000 16:21:09 +0100 Subject: Call for Abstracts: Workshop on Self-Organising Multi-Agent Systems (UK) Message-ID: Sent to: connectionists at MAILBOX.SRV.CS.CMU.EDU Self-organisation in Multi-agent Systems (SOMAS) Date: July 27-28, 2000 Milton Keynes, UK A Workshop organised by the Emergent Computing Network http://images.ee.umist.ac.uk/emergent/ Call for Contributors 1 Introduction Multi-agent systems (MAS) are collections of interacting autonomous entities. The behaviour of the MAS is a result of the repeated asynchronous action and interaction of the agents. Understanding how to engineer self-organisation is thus central to the application of agents on a large scale.
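-----
A brief aside on the skewness measure proposed in the Mehta et al. abstracts above: if a place cell's firing-rate profile along the track is treated as a density over position, the asymmetry can be quantified as the rate-weighted third standardized moment of position. The short Python sketch below only illustrates that computation under this reading; it is not code from the papers, and the example rate profile and the function name are made up here.

import numpy as np

def place_field_skewness(position, rate):
    # Rate-weighted third standardized moment of position. Negative values
    # correspond to fields whose rate rises slowly and drops off abruptly
    # in the direction of travel, as described in the abstracts.
    w = rate / rate.sum()                    # rate profile as a density over position
    mu = np.sum(w * position)                # weighted mean position
    var = np.sum(w * (position - mu) ** 2)   # weighted variance
    return np.sum(w * (position - mu) ** 3) / var ** 1.5

# Hypothetical example: a field that ramps up slowly and ends abruptly.
x = np.linspace(0.0, 1.0, 101)               # normalized position on a linear track
r = np.where(x < 0.8, x / 0.8, 0.0) ** 2     # slow rise, sharp cutoff at x = 0.8
print(place_field_skewness(x, r))            # prints a negative value (negatively skewed)
-----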
Multi-agent simulations can also be used to study emergent behaviour in real systems. Interest in large-scale systems of agents is growing, as is illustrated by the recent Framework Five (Future and Emergent Technologies) action on the so-called Universal Information Ecosystem. Advances in telecommunications and the spread of the Internet, electronic commerce, etc. mean that information infrastructure operates as a global dynamic system. As time passes, the density and diversity of interconnections in such system will increase rapidly. Moreover, such systems are being required to service the needs of a diverse set of users (whatever their distinctive needs), not just a virtual 'representative' user. Thus, such systems must adapt to personal requirements, by providing highly customised packages of services. Simultaneously providing highly diverse services to a huge user population in an enormous, interconnected system is a task beyond centralised management techniques. The only way to manage this form of agent-based system is to utilise its emergent properties to make it self-organising and self-regulating. Desirable self-organisation is observed in many biological, social and physical systems. However, fostering these conditions in artificial systems proves to be difficult and offers the potential for undesirable behaviours to emerge. Thus, it is vital to be able to understand and shape emergent behaviours in agent based systems. Current mathematical and empirical tools give only a partial insight into emergent behaviour in large, agent-based societies. The goal of this workshop is to open a dialog among practitioners from diverse fields, including: agent based systems, complex systems, AI, optimisation theory and non-linear systems, neural networks, evolutionary computation, neuro-biology, and computer science The workshop will focus on localised means of measuring, understanding, and shaping emergent behaviour in large scale distributed systems. The workshop represents an important opportunity for those active or interested in emergent behaviour research, to hear about current work, discuss future directions and priorities, and form invaluable research contacts. 2 Venue and Format The Workshop will commence on July, 27th and will take place over 2 days at the BT Conference Facility in Milton Keynes, UK. Since the primary goal of this workshop is to provide time for communication between presenters and attendees ample opportunity will be provided for structured discussion. A detailed programme will be issued by June, 2000. 3 Invited Speakers We have a promising list of invited speakers to address the workshop's theme from a variety of perspectives. Names will be announced as they are confirmed. 4 Call for Contribution You are invited to contribute to this workshop. Your 250-300 word abstract should include title, authors, affiliations, keywords, source of external support (if any) and the body of the abstract should stress the relevance of your work to the workshop topic. All accepted talks will be allocated 20-25 minutes. Speakers will be asked to provide copies of their overheads for inclusion in the Workshop information pack. Selected contributions from this workshop and others in the series will be published by Springer-Verlag as a highlighted volume of their "Lecture Notes in Computer Science" series. Detailed submission instructions for this series publication will be issued later. 5 Registration Details of workshop registration are forthcoming. 
Please contact one of the organizers, listed below. 6 General Enquiries To obtain submission or registration instructions, or to ask any general questions relating to this Workshop, please contact one of the organizers: Paul Kearney BT Labs, Adastral Park Phone: 01473 605544 Email: paul.3.kearney at bt.com Robert Smith The Intelligent Computer Systems Centre The University of The West of England Phone: 0117 942 1495 Email: robert.smith at uwe.ac.uk Andy Wright BAe Sowerby Research (currently a Visiting Fellow at Bristol University) Phone: 0117 95 46883 Email: Andy.Wright at bristol.ac.uk 7 Important Dates: Abstract Submission Deadline: May 21st, 2000 Notification of Acceptance of Abstract: June 30, 2000. Workshop Dates: July 27th-28th, 2000 8 Emergent Computing Workshop Series This is the 6th workshop being organised by the `Emergent Computing' network, to bring together multi-disciplinary ideas from complex systems, AI, optimisation theory and non-linear systems, neural networks, neuro-biology and computer science. The workshops are: 1. Self-Organising Systems at the University of Manchester Institute of Science and Technology, UK 2. Spatially Distributed Nonlinear Systems at the University of Leeds, UK (December 1999) 3. Associative Computing at the University of York, UK (February 2000) 4. Emergent Computation in Molecular and Cellular Biology at the University of Hertfordshire, UK (April 2000) 5. Strategies for Implementing Large Scale Emergent Computing Systems at the University of Wales, Cardiff, UK (June 2000) 6. Self-organisation in Multi-agent Systems (SOMAS) at The BT Conference Centre, Milton Keynes, UK (July 27-28, 2000). From hali at theophys.kth.se Wed Apr 26 20:04:46 2000 From: hali at theophys.kth.se (Hans Liljenström) Date: Thu, 27 Apr 2000 02:04:46 +0200 Subject: Nordic Symposium on Computational Biology Message-ID: <3907841E.1DCE3F88@theophys.kth.se> NORDIC SYMPOSIUM ON COMPUTATIONAL BIOLOGY 2000 18-23 JUNE 2000 AGORA FOR BIOSYSTEMS, SIGTUNA, SWEDEN Co-organized by Agora for Biosystems and Nordita The symposium addresses questions of high current interest in biology and related fields. The main focus is on bioinformatics, but computational methods applied to molecular and cellular biology, as well as computational neurobiology and ecology will also be included. The symposium is primarily intended to attract young scientists in the Nordic and surrounding countries, but all interested are welcome as long as space allows. A major objective is to give an introduction to some of the most important problems and challenges within the field. This will be accomplished by several tutorials, in addition to invited talks given by top scientists from abroad and from the region. There will also be ample time for poster presentations and for formal and informal discussions. TOPICS INCLUDE: - Bioinformatics - Macromolecular dynamics - Computational (functional) genomics - Computational neurobiology - Computational ecology INVITED SPEAKERS INCLUDE: Edward Cox, Dept. of Molecular Biology, Princeton University Mats Gyllenberg, Dept. of Mathematics, University of Turku John Hopfield, Dept. of Molecular Biology, Princeton University Eric Jakobsson, Dept. of Molecular and Integrative Physiology, University of Illinois Inge Jonassen, Dept. of Informatics, University of Bergen Erik Lindahl, Dept. of Physics, Royal Institute of Technology Kristian Lindgren, Dept.
of Physical Resource Theory, Chalmers and Göteborg University Michael Mackey, Department of Physiology, McGill University Johan Paulsson, Dept. of Molecular Biology, Uppsala University Hans Plesser, Dept. of Physics, Göttingen Dirk Repsilber, Inst. for Forest Genetics and Forest Tree Breeding, University of Hamburg Mattias Wahde, Dept. of Mechanical Engineering, Chalmers ORGANIZING COMMITTEE Soren Brunak, Center for Biological Sequence Analysis, Technical University of Denmark Olle Edholm, Theoretical Physics, Royal Institute of Technology, Stockholm Gaute Einevoll, Dept. of Physics, Agricultural University of Norway Jarl-Thure Eriksson, Electrical Engineering Labs, Tampere University Gunnar von Heijne, Center for Bioinformatics, Stockholm University John Hertz, Nordita, Copenhagen Hans Liljenström, Agora for Biosystems, Sigtuna Dietrich von Rosen, Dept. of Biometrics, SLU, Uppsala More information and the registration form can be obtained: - http://www.agora.kva.se/meetings/CompBio2000 Please register electronically using a web browser if possible. Abstract submissions can also be made via the online registration form. - email: hans.liljenstrom at sdi.slu.se - by mail: Hans Liljenström Dept. of Biometrics, SLU Box 7013 SE-750 07 Uppsala Sweden FAX: +46-18-673502 REGISTRATION DEADLINE: May 31, 2000. From Alex.Smola at anu.edu.au Wed Apr 26 23:25:57 2000 From: Alex.Smola at anu.edu.au (Alex Smola) Date: Thu, 27 Apr 2000 13:25:57 +1000 Subject: New Website on Kernel Machines Message-ID: <3907B345.C7497895@anu.edu.au> We are pleased to announce a new website on Kernel Machines and related methods. It can be found at http://www.kernel-machines.org It is a superset of the Support Vector website at GMD FIRST. Most links to http://svm.first.gmd.de will still be operational and should result in the near future (as soon as the changes are made to the site in Berlin) in a redirect of your browser to the new site. However, we would like to ask you to update any existing links. Compared to the old site, the main difference is that we have enlarged the scope to form a repository not only for research on SVMs, but also Gaussian Process prediction, Mathematical Programming with Kernels, Regularization Networks, Reproducing Kernel Hilbert Spaces, and related methods. The aim is to serve as a central information source by providing links to papers, upcoming events, datasets, code, a discussion board, etc. On the technical side, novelties include the fact that data entry is now fully automatic, papers can be uploaded to the website, there exists a search option for papers, and data can also be provided in BibTeX format, which should make it easier to reference papers available at the site. We would like to express thanks to GMD FIRST for allowing us to host the webpage in the past three years in Berlin. Links to research at the Berlin group on Intelligent Data Analysis can now be found at http://ida.first.gmd.de The changes were needed since the boundaries between Support Vectors and other methods have become less well defined and we feel that there is scientific benefit in bringing the various research areas together. Moreover, Alex Smola and Bernhard Scholkopf have moved to the Australian National University (Canberra), and Microsoft Research (Cambridge), respectively. The server for http://www.kernel-machines.org is located at the Australian National University. We thank ANU for the resources. The organizational structure is novel.
From the beginning, we have strived to create a forum which would provide a balanced representation of the emerging field of SVM and kernel methods research. It is our hope that this forum has contributed its share to the exciting developments that all of us have witnessed over the last years. Now that the field has become more mature, we felt that it was time to support the website with an editorial board. This change also reflects the increasing importance of dissemination of research via the world wide web. If web sites gradually take over part of what journals were responsible for in the past, then they should also adhere to comparable levels of scholarly standards. We believe that the changes will ensure that the website will continue to be a useful resource for researchers. The editorial board comprises Nello Cristianini, Royal Holloway College, University of London Bernhard Scholkopf, Microsoft Research, Cambridge (UK) John Shawe-Taylor, Royal Holloway College, University of London Alex Smola, Australian National University, Canberra Vladimir Vapnik, AT&T, New Jersey Bob Williamson, Australian National University, Canberra. -- / Alexander J. Smola / spigot.anu.edu.au/~smola / / Australian National University / Alex.Smola at anu.edu.au / / Dept. of Engineering and RSISE / Tel: (+61) 410 457 686 / / Canberra, ACT 0200 / Fax: (+61) 2 6249 0506 / From Mark.Butler at unilever.com Thu Apr 27 08:34:16 2000 From: Mark.Butler at unilever.com (mark butler) Date: Thu, 27 Apr 2000 13:34:16 +0100 (GMT Daylight Time) Subject: JOB: Research Scientist in Adaptive Computation Message-ID: Adaptive Computation Scientist - Unilever Research Wirral - North West A truly multi-local multinational, Unilever are taking the needs of millions of people across the globe seriously. We invest 550 million in pioneering new research to ensure that our products remain the preferred choice, 150 million times a day. Adaptive Computation techniques play a key role in the development of a wide spectrum of our product and manufacturing applications for household brands such as Impulse, Organics, Persil, Flora and Wall's Ice Cream. Our recently established Centre of Excellence in Port Sunlight is committed to this area as well as developing and maintaining links with leading academics. Due to success and expansion of the Group we have a number of exciting opportunities for talented individuals who will relish the challenge of developing leading-edge solutions to complex industrial problems. A combination of world-class research and an understanding of how technology may be applied in practice will enable you to extend both your technological capabilities and application areas. Flexible, self motivated and a strong team player with a broad scientific interest, you will need a high level of numeracy and the ability to work to tight deadlines. Your good first Degree encompassing a strong mathematical component should be supported by a PhD, MSc, academic or industrial experience. Specialists in one or more of the following areas would be of particular interest: Neuro-Fuzzy, Genetic Algorithms, Neural Networks, Data Mining, Evolutionary Systems, Adaptive Agents, Fitness Landscapes, Machine Learning, Pattern Recognition. As a world-leading organisation we can offer an attractive salary and benefits package and excellent career opportunities. To take up the challenge, please write with full CV quoting ref 22429/NS to Vanessa Gilroy, TMP Response Management, 32 Aybrook Street, London W1M 3JL. 
Or e-mail your details to: response at tmpw.co.uk Closing date for applications is 19th May 2000. For more information about Unilever Research and Unilever visit our Internet Web Site at http://research.unilever.com From rsun at cecs.missouri.edu Thu Apr 27 13:10:25 2000 From: rsun at cecs.missouri.edu (Ron Sun) Date: Thu, 27 Apr 2000 12:10:25 -0500 Subject: IJCNN'2000 Call for Participation Message-ID: <200004271710.MAA21742@pc113.cecs.missouri.edu> Call For Participation *** I J C N N -2 0 0 0 *** IEEE-INNS-ENNS INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS to be held in Grand Hotel di Como, Como, Italy -- July 24-27, 2000 This is the premier international neural networks conference, sponsored by the IEEE Neural Network Council, the International Neural Network Society, and the European Neural Network Society, and with the technical cooperation of the Japanese Neural Network Society, AEI (the Italian Association of Electrical and Electronic Engineers), SIREN (the Italian Association of Neural Networks), and AI*IA (the Italian Association for Artificial Intelligence). The list of accepted papers and the tentative conference program is now available on the web page. For complete information regarding the conference (including information about Como, Italy), visit the conference web site at: http://www.ims.unico.it/2000ijcnn.html The organizers may be contacted by email at ijcnn2000 at elet.polimi.it. From hinton at gatsby.ucl.ac.uk Fri Apr 28 11:52:33 2000 From: hinton at gatsby.ucl.ac.uk (Geoffrey Hinton) Date: Fri, 28 Apr 2000 16:52:33 +0100 Subject: technical reports available Message-ID: <200004281552.QAA08520@axon.gatsby.ucl.ac.uk> Two new technical reports are now available at http://www.gatsby.ucl.ac.uk/hinton/chronological.html _______________________________ Training Products of Experts by Maximizing Contrastive Divergence Geoffrey Hinton Technical Report GCNU TR 2000-004 ABSTRACT It is possible to combine multiple probabilistic models of the same data by multiplying their probability distributions together and then renormalizing. This is a very efficient way to model high-dimensional data which simultaneously satisfies many different low-dimensional constraints because each individual expert model can focus on giving high probability to data vectors that satisfy just one of the constraints. Data vectors that satisfy this one constraint but violate other constraints will be ruled out by their low probability under the other experts. Training a product of experts appears difficult because, in addition to maximizing the probability that each individual expert assigns to the observed data, it is necessary to make the experts be as different as possible. This ensures that the product of their distributions is small which allows the renormalization to magnify the probability of the data under the product of experts model. Fortunately, if the individual experts are tractable there is an efficient way to train a product of experts. __________________________________ Learning Distributed Representations of Concepts Using Linear Relational Embedding Alberto Paccanaro and Geoffrey Hinton Technical Report GCNU TR 2000-002 ABSTRACT In this paper we introduce Linear Relational Embedding as a means of learning a distributed representation of concepts from data consisting of binary relations between concepts. 
The key idea is to represent concepts as vectors, binary relations as matrices, and the operation of applying a relation to a concept as a matrix-vector multiplication that produces an approximation to the related concept. A representation for concepts and relations is learned by maximizing an appropriate discriminative goodness function using gradient ascent. On a task involving family relationships, learning is fast and leads to good generalization. From reggia at cs.umd.edu Fri Apr 28 13:13:32 2000 From: reggia at cs.umd.edu (James A. Reggia) Date: Fri, 28 Apr 2000 13:13:32 -0400 (EDT) Subject: Post-Doc Position: Computational Neuroscience of Language Message-ID: <200004281713.NAA25338@avion.cs.umd.edu> Post-Doctoral Fellowship in Computational Neuroscience of Language A post-doctoral fellowship position is available in the area of computational modeling of the neurobiological basis of language. The fellowship can begin summer/fall of 2000, and is jointly at the Neurology Dept. (Baltimore campus) and Computer Science Dept. (near Washington DC) of the University of Maryland. Research in this position could focus on any topic related to normal or impaired language, although we have special interest in areas such as cerebral specialization, reading disorders, origins of language, aphasia recovery following stroke, and functional imaging correlates of language. Applicants are expected to have a recent PhD in either a biological/cognitive/linguistic discipline or a computational discipline (e.g., computer science, applied mathematics, physics, engineering). US citizenship is required. For full consideration, applications should be received by May 26, 2000. To apply, send two copies (*hard copies only*) of a CV and research interest statement, how to contact you by mail/fax/email/phone, and the names and contact information of two references to: James A. Reggia, Dept. of Computer Science, A. V. Williams Bldg., University of Maryland, College Park, MD 20742 USA. Questions may be directed to reggia at cs.umd.edu
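-----
Two short illustrative sketches, referring back to the Hinton technical report abstracts above (GCNU TR 2000-004 and GCNU TR 2000-002). First, one standard way to write down a product of experts and its log-likelihood gradient; this is generic notation chosen here for concreteness (experts p_m with parameters theta_m over discrete data vectors x), not an excerpt from the report:

\[
p(\mathbf{x}\mid\theta_1,\dots,\theta_M)
  = \frac{\prod_{m=1}^{M} p_m(\mathbf{x}\mid\theta_m)}
         {\sum_{\mathbf{x}'} \prod_{m=1}^{M} p_m(\mathbf{x}'\mid\theta_m)},
\qquad
\frac{\partial \log p(\mathbf{x})}{\partial \theta_m}
  = \frac{\partial \log p_m(\mathbf{x}\mid\theta_m)}{\partial \theta_m}
  - \mathbb{E}_{\mathbf{x}'\sim p}\!\left[
      \frac{\partial \log p_m(\mathbf{x}'\mid\theta_m)}{\partial \theta_m}
    \right].
\]

The expectation under the renormalized product is the part that is generally intractable; contrastive divergence training is usually described as replacing it with an average over samples obtained from a few steps of Gibbs sampling started at the data.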
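Second, a minimal numerical sketch of the Linear Relational Embedding idea: concepts as vectors, relations as matrices, and relation application as a matrix-vector product, trained by gradient ascent on a softmax-over-distances goodness. This is an illustrative toy written for this digest, not the authors' code; the triples, dimensions, learning rate and the exact form of the goodness function are assumptions made here.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: (concept, relation, related-concept) index triples.
n_concepts, n_relations, dim = 6, 2, 3
triples = [(0, 0, 1), (1, 0, 2), (3, 1, 4), (4, 1, 5)]

C = rng.normal(scale=0.5, size=(n_concepts, dim))        # one vector per concept
R = rng.normal(scale=0.5, size=(n_relations, dim, dim))  # one matrix per relation

lr = 0.05
for epoch in range(3000):
    for a, rel, b in triples:
        v = R[rel] @ C[a]                        # applying a relation = matrix-vector product
        d = np.sum((v - C) ** 2, axis=1)         # squared distance to every concept vector
        e = np.exp(-(d - d.min()))               # stabilized softmax over negative distances
        s = e / e.sum()
        # Goodness G = log s[b]; take a gradient-ascent step on G.
        coef = s.copy()
        coef[b] -= 1.0                           # s_j - delta_{jb} = dG/dd_j
        grad_v = 2.0 * coef @ (v - C)            # dG/dv
        grad_R = np.outer(grad_v, C[a])          # dG/dR[rel]
        grad_C = -2.0 * coef[:, None] * (v - C)  # dG/dC through the distances
        grad_C[a] += R[rel].T @ grad_v           # extra term: C[a] also feeds into v
        R[rel] += lr * grad_R
        C += lr * grad_C

# Applying relation 0 to concept 0 should now land nearest to concept 1's vector.
print(int(np.argmin(np.sum((R[0] @ C[0] - C) ** 2, axis=1))))

On a toy set like this, ascent on the goodness pulls R[rel] @ C[a] toward the correct target vector while pushing it away from the other concept vectors, which is the kind of behaviour needed for the family-relationships task mentioned in the abstract.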