From S.Singh-1 at plymouth.ac.uk Wed Oct 1 18:21:50 1997 From: S.Singh-1 at plymouth.ac.uk (Sameer Singh) Date: Wed, 1 Oct 1997 18:21:50 BST Subject: PhD Studentship available Message-ID: <33B2B973B9A@cs_fs15.csd.plym.ac.uk> ADVERTISEMENT FOR: EPSRC PhD STUDENTSHIP IN THE UNIVERSITY OF PLYMOUTH, UK SUBJECT: Engineering/Computing TITLE: EPSRC PhD Research Studentship in Development of a Tool for Intelligent Analysis of Stammering in Telephonic Speech CITY: Plymouth, UK DETAILS: Research Opportunity. Applications are invited for a PhD Studentship due to begin in late October/early November 1997. The studentship will be a collaborative project between Frenchay Hospital (Bristol), BT Research Labs and the University of Plymouth. You should have a first or upper second class degree in Engineering or Computing (preferably an MSc). A knowledge of C/C++ and AI is essential; however, a knowledge of speech characterisation and/or pattern recognition/neural networks would be a distinct advantage. Informal enquiries for this post can be made to Dr K Burn-Thornton, telephone (01752) 232519, e-mail: kburn-thornton at plymouth.ac.uk. Application forms and further particulars for this Studentship are available from Ms S Lock, Administrative Assistant (Research), School of Electronic, Communication and Electrical Engineering, University of Plymouth, Drake Circus, Plymouth PL4 8AA, telephone (01752) 232608; e-mail: slocke at plymouth.ac.uk. Closing date: 20th October, 1997. NOTE: The university would welcome applications from non-UK resident students. UNIVERSITY OF PLYMOUTH Promoting equal opportunity. A leading centre for teaching and research. From patrik at enterprise.cs.unm.edu Wed Oct 1 18:11:59 1997 From: patrik at enterprise.cs.unm.edu (Patrik D'haeseleer) Date: Wed, 1 Oct 1997 16:11:59 -0600 (MDT) Subject: Adaptive Computation: new web site + post-doc ad Message-ID: <199710012211.QAA05600@consequences.cs.unm.edu> A non-text attachment was scrubbed... 
Name: not available Type: text Size: 2092 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/3591c080/attachment.ksh From jlm at cnbc.cmu.edu Thu Oct 2 07:32:47 1997 From: jlm at cnbc.cmu.edu (Jay McClelland) Date: Thu, 2 Oct 1997 07:32:47 -0400 (EDT) Subject: Experimental Physics Opening / Neuroscience Candidates Welcome Message-ID: <199710021132.HAA23892@eagle.cnbc.cmu.edu> The Department of Physics at CMU has initiated a search for an experimental physicist (see announcement below). The search for this position will be broad. Several members of the CMU physics department have grown increasingly enthusiastic about considering candidates in biological physics, and the external advisory board for the department has strongly suggested biological physics, including topics in the area of neuroscience. Candidates will be considered from many areas. Candidates with interests in experimental neuroscience who have strong physics backgrounds and excellent research programs will receive very serious consideration. There is a strong emphasis on neuroscience in Pittsburgh, both at CMU and the University of Pittsburgh. The Center for the Neural Basis of Cognition (CNBC) serves as a bridge between the relevant communities on the two campuses. While a physics appointee would be expected to establish a laboratory in physics, CMU emphasizes interdisciplinary research, so significant interaction with the CNBC would be viewed positively. The following announcement is being published in Physics Today. Individuals interested in further information about the position may contact Prof. Michael Widom, as indicated below. 
Jay McClelland Co-Director, Center for the Neural Basis of Cognition --------------------------------------- Tenure Track Faculty Position Experimental Condensed Matter/Biological Physics Carnegie Mellon University The Department of Physics at Carnegie Mellon University invites applications for a tenure-track experimentalist in condensed matter and/or biological physics. The appointment will be at a junior faculty level and will take effect July 1998 or later. We seek an individual of exceptional ability and promise to establish a vigorous research program. Excellent candidates in any area of specialization (including neuroscience) will be considered. Preference will be given to candidates who are likely to interact synergistically with current projects and facilities at Carnegie Mellon. Departmental interests include interfaces, lipid membranes, magnetic nanoparticles, semiconductors, scanning probe microscopy and x-ray scattering (see http://www-cmp.phys.cmu.edu). Applicants should send their curriculum vitae, publication list, a statement of research and teaching interests, and have at least three letters of reference sent before November 30, 1997 to Prof. Michael Widom, Chair, Search Committee, Department of Physics, Carnegie Mellon University, Pittsburgh, PA 15213. Carnegie Mellon University is an equal opportunity/affirmative action employer. 
From terry at salk.edu Thu Oct 2 17:53:31 1997 From: terry at salk.edu (Terry Sejnowski) Date: Thu, 2 Oct 1997 14:53:31 -0700 (PDT) Subject: NEURAL COMPUTATION 9:8 Message-ID: <199710022153.OAA20194@helmholtz.salk.edu> Neural Computation - Contents Volume 9, Number 8 - November 15, 1997 ARTICLE Minimax Entropy Principle and its Application to Texture Modeling Song Chun Zhu, Ying Nian Wu, and David Mumford NOTE A Local Learning Rule that Enables Information Maximization for Arbitrary Input Distributions Ralph Linsker Convergence and Ordering of Kohonen's Batch Map Yizong Cheng LETTER Solitary Waves of Integrate-and-Fire Neural Fields David Horn and Irit Opher Time Series Segmentation Using Predictive Modular Neural Networks Athanasios Kehagias and Vassilios Petridis Adaptive Mixtures of Probabilistic Transducers Yoram Singer Long Short-Term Memory Sepp Hochreiter and Jurgen Schmidhuber Factor Analysis Using Delta-Rule Wake-Sleep Learning Radford M. Neal and Peter Dayan Data Clustering Using a Model Granular Magnet Marcelo Blatt, Shai Wiseman, and Eytan Domany ----- ABSTRACTS - http://mitpress.mit.edu/NECO/

SUBSCRIPTIONS - 1998 - VOLUME 10 - 8 ISSUES

                 USA    Canada*   Other Countries
Student/Retired  $50    $53.50    $78
Individual       $82    $87.74    $110
Institution      $285   $304.95   $318

* includes 7% GST

(Back issues from Volumes 1-8 are regularly available for $28 each to institutions and $14 each for individuals. Add $5 for postage per issue outside USA and Canada. Add 7% GST for Canada.) MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. 
Tel: (617) 253-2889 FAX: (617) 258-6779 mitpress-orders at mit.edu ----- From zhaoping at ai.mit.edu Fri Oct 3 14:25:33 1997 From: zhaoping at ai.mit.edu (Zhaoping Li) Date: Fri, 3 Oct 97 14:25:33 EDT Subject: TR available --- Visual segmentation without classification in a model of V1 Message-ID: <9710031825.AA24096@sewer.mit.edu> Dear Connectionists, The following Technical report published by the AI publications office at MIT may be accessed via ftp://publications.ai.mit.edu/ai-publications/1500-1999/AIM-1613.ps TITLE: Visual segmentation without classification in a model of the primary visual cortex AUTHOR: Zhaoping Li ABSTRACT: Stimuli outside classical receptive fields significantly influence the neurons' activities in primary visual cortex. We propose that such contextual influences are used to segment regions by detecting the breakdown of homogeneity or translation invariance in the input, thus computing GLOBAL region boundaries using LOCAL interactions. This is implemented in a biologically based model of V1, and demonstrated in examples of texture segmentation and figure-ground segregation. By contrast with traditional approaches, segmentation occurs without classification or comparison of features within or between regions and is performed by exactly the same neural circuit responsible for the dual problem of the grouping and enhancement of contours. 
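[As a loose illustration of the abstract's central idea -- that purely local comparisons can trace out global region boundaries -- the following sketch flags texture boundaries in a 1-D feature map. This is not the V1 circuit of the report; the feature values and threshold are invented for the example.]

```python
# Illustrative sketch only: detect where local homogeneity breaks down.
# Each unit compares itself with its immediate neighbor (a local
# interaction); the union of local detections forms the global boundary.

def boundary_map(features, threshold=0.5):
    """Mark positions where neighboring feature values differ sharply."""
    boundaries = [False] * len(features)
    for i in range(1, len(features)):
        if abs(features[i] - features[i - 1]) > threshold:
            boundaries[i] = True
    return boundaries

# Two homogeneous texture regions (values ~0 and ~2) meet at index 4;
# no feature classification is needed, only local difference detection.
texture = [0.0, 0.1, 0.0, 0.1, 2.0, 2.1, 2.0, 2.1]
print(boundary_map(texture))
# -> [False, False, False, False, True, False, False, False]
```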
Sincerely, Zhaoping Li (zhaoping at ai.mit.edu) From dwang at cis.ohio-state.edu Sun Oct 5 15:31:48 1997 From: dwang at cis.ohio-state.edu (DeLiang Wang) Date: Sun, 5 Oct 1997 15:31:48 -0400 (EDT) Subject: Faculty position in image understanding Message-ID: <199710051931.PAA00423@shirt.cis.ohio-state.edu> JOB Description: Image Understanding For one of several open faculty positions resulting from a major initiative for image understanding, the Ohio State University is seeking a tenure-track computer scientist who works on theoretical aspects of image understanding and whose work relates to human visual perception. We are particularly interested in candidates who are capable of interdisciplinary and collaborative research. This position will be a joint appointment between the Department of Computer and Information Science and the Center for Cognitive Science, and is related to the image understanding initiative involving a number of departments and centers at OSU. The successful candidate will hold a Ph.D. and have a demonstrated record of research accomplishments in image understanding. Applicants should send a curriculum vitae, along with a cover letter, by e-mail to fsearch at cis.ohio-state.edu, or by hardcopy to:

Chair, Faculty Search Committee
Department of Computer and Information Science
The Ohio State University
395 Dreese Lab
2015 Neil Avenue
Columbus, OH 43210-1277

Applications will be accepted until the position is filled. 
From dld at cs.monash.edu.au Mon Oct 6 10:39:23 1997 From: dld at cs.monash.edu.au (David L Dowe) Date: Tue, 7 Oct 1997 00:39:23 +1000 Subject: Final Call for Abstracts, 1998 Pacific Symposium on Biocomputing (fwd) Message-ID: <199710061439.AAA23151@dec11.cs.monash.edu.au> > From news at nlm.nih.gov Tue Oct 7 00:34:25 1997 > To: bionet-announce at net.bio.net > From: < @nlm.nih.gov> > Newsgroups: bionet.announce,bionet.biology.computational, bionet.molbio.proteins,news.announce.conferences > Subject: Final Call for Abstracts, 1998 Pacific Symposium on Biocomputing CALL FOR ABSTRACTS Pacific Symposium on Biocomputing Kapalua, Maui, Hawaii -- January 4-9, 1998 http://www.cgl.ucsf.edu/psb The Pacific Symposium on Biocomputing (PSB) is soliciting the submission of abstracts for poster presentation and computer demonstrations. Abstracts may report on research results in any aspect of computational biology. PSB is a forum for the presentation of research in databases, algorithms, interfaces, visualization, modeling and other computational methods, as applied to biological problems, with emphasis on applications in data-rich areas of molecular biology. For more information on the program of PSB, see our web site, which contains the full list of peer reviewed papers being presented this year and the topical sessions into which the conference is organized. Abstracts MUST be submitted electronically by November 1st to Russ Altman (russ.altman at stanford.edu). Hardcopy abstracts will not be accepted. Abstracts must be in ASCII format with header information in the following order: Title, Authors, Institution, Mailing Address, Email Addresses, body of abstract. The abstracts should be no longer than 500 words, including header information. There can be no figures in abstracts. Workstations and internet connections will be available for demonstrations, in addition to space in poster presentation sessions. 
Please submit detailed requests for demonstration facilities along with your abstract. Remember, the FINAL DEADLINE IS NOVEMBER 1. We look forward to seeing you in Hawaii! -- , PhD. National Library of Medicine, Bldg. 38A, 9th fl, MS-54, Bethesda, MD 20894 USA; phone: +1 (301) 496-9303; fax: +1 (301) 496-0673; email: @nlm.nih.gov From prior at MIT.EDU Tue Oct 7 08:41:48 1997 From: prior at MIT.EDU (Robert Prior) Date: Tue, 7 Oct 97 08:41:48 EDT Subject: Series Announcement and Call for Proposals Message-ID: ------------------------------------------------------------------ The MIT Press - Adaptive Computation and Machine Learning Series ------------------------------------------------------------------ Tom Dietterich, Series Editor Christopher Bishop, David Heckerman, Michael Jordan, and Michael Kearns, Associate Editors The goal of building systems that can adapt to their environments and learn from their experience has attracted researchers from many fields, including computer science, engineering, mathematics, physics, neuroscience, and cognitive science. Out of this research has come a wide variety of learning techniques, including methods for learning decision trees, decision rules, neural networks, statistical classifiers, and probabilistic graphical models. These learning methods have the potential to transform many industrial and scientific fields. Many successful and profitable applications have already been developed. The researchers in these various areas have also produced several different theoretical frameworks for understanding these methods, such as computational learning theory, Bayesian learning theory, classical statistical theory, minimum description length theory, and statistical mechanics approaches. These theories provide insight into experimental results and help to guide the development of improved learning algorithms. 
Recently, the many separate research communities have begun to converge on a common set of issues surrounding supervised, unsupervised, and reinforcement learning problems. A goal of the series is to promote the unification of the many diverse strands of machine learning research and to foster high quality research and innovative applications. This book series will publish works of the highest quality that advance the understanding and practical application of machine learning and adaptive computation. Books appropriate for the series include:

* Research monographs on any of the topics listed above
* Textbooks at the introductory or advanced level
* How-to books aimed at practitioners
* Books intended to introduce the main goals and challenges of this area to a general technical audience

For information on the submission of proposals and manuscripts, please contact the editor, the publisher, or any of the associate editors listed above:

Thomas G. Dietterich
Computer Science Department
Oregon State University
Corvallis, OR 97331-3202
(541) 737-5559
Fax: (541) 737-3014
tgd at cs.orst.edu

Robert V. Prior
The MIT Press
5 Cambridge Center
Cambridge, MA 02142
(617) 253-1584
Fax: (617) 258-6779
prior at mit.edu

From ericr at mech.gla.ac.uk Wed Oct 8 07:32:22 1997 From: ericr at mech.gla.ac.uk (Eric Ronco) Date: Wed, 8 Oct 1997 12:32:22 +0100 (BST) Subject: No subject Message-ID: <4311.199710081132@googie.mech.gla.ac.uk> From shavlik at cs.wisc.edu Fri Oct 10 12:19:29 1997 From: shavlik at cs.wisc.edu (Jude Shavlik) Date: Fri, 10 Oct 1997 11:19:29 -0500 (CDT) Subject: Call for Workshop and Tutorial Proposals: 1998 ML Conf Message-ID: <199710101619.LAA02888@jersey.cs.wisc.edu> ICML-98: Call for Workshop and Tutorial Proposals ------------------------------------------------- The Fifteenth International Conference on Machine Learning (ICML-98) will be held at the University of Wisconsin, Madison, USA from July 24 to July 26, 1998. 
ICML-98 will be co-located with the Eleventh Annual Conference on Computational Learning Theory (COLT-98) and the Fourteenth Annual Conference on Uncertainty in Artificial Intelligence (UAI-98). Seven additional AI conferences, including the Fifteenth National Conference on Artificial Intelligence (AAAI-98), will also be held in Madison next summer (see http://www.cs.wisc.edu/icml98/ for a complete list). Since ICML is being co-located with AAAI, there will NOT be a separate ICML workshop and tutorial program in 1998. Instead, people interested in submitting ML-related workshop or tutorial proposals should submit to the corresponding AAAI program. Members of the ML community are serving as AAAI workshop and tutorial co-chairs, and they are aware of the plan to have joint AAAI/ICML workshops and tutorials. Joint AAAI/ICML workshops and tutorials will be scheduled for Monday, July 27, 1998, the day between the ICML and AAAI technical programs. (AAAI has agreed to allow ICML attendees to attend AAAI workshops and tutorials without requiring attendance at AAAI.) Please note that the deadlines for these programs are near. October 31, 1997 is the deadline for AAAI workshop proposals, while November 14, 1997 is the deadline for AAAI tutorial proposals. For those who like to plan far ahead, the deadline for ICML technical-paper submissions will be March 2, 1998. A preliminary call for papers, as well as additional conference information including copies of the AAAI calls for tutorial and workshop proposals, is available at: http://www.cs.wisc.edu/icml98/ Jude Shavlik ICML-98, Chair icml98 at cs.wisc.edu PS - As usual, my apologies to those who receive this posting multiple times. 
From barto at cs.umass.edu Fri Oct 10 13:53:46 1997 From: barto at cs.umass.edu (Andy Barto) Date: Fri, 10 Oct 1997 13:53:46 -0400 Subject: Post Doctoral Position Message-ID: Below is a notice concerning a postdoctoral position in our lab for someone interested in motor control, learning, and development. ------------------------------------------- Adaptive Networks Laboratory, Department of Computer Science, University of Massachusetts, Amherst Postdoctoral Fellowship A postdoctoral fellowship will be available starting Jan 1, 1998 for interdisciplinary research related to the development of motor skills in human infants. The project involves collaboration between computer science researchers and developmental psychologists focusing on the development of reaching skills in infants. The goals are to develop computational models of motor learning that are consistent with modern developmental data and to design and conduct behavioral experiments related to these models. Applicants should have a Ph.D. in Computer Science, Psychology, or a related discipline; knowledge of motor control and modeling techniques; programming experience; and evidence of exceptional research promise. The position is in the Adaptive Networks Laboratory, Department of Computer Science, University of Massachusetts, Amherst. For further information, contact: Dr. Andrew Barto, Tel. 413-545-2109; FAX: 413-545-1249; Email: Barto at cs.umass.edu. For further information regarding the laboratory, the department, and the pleasant environs, look at the WWW pages http://www.cs.umass.edu/ and http://www-anw.cs.umass.edu/. Also see http://forte.sbs.umass.edu/~berthier for relevant developmental literature. The University of Massachusetts is an Affirmative Action/Equal Opportunity employer. 
From brychcy at informatik.tu-muenchen.de Tue Oct 14 06:01:32 1997 From: brychcy at informatik.tu-muenchen.de (Till Brychcy) Date: Tue, 14 Oct 1997 12:01:32 +0200 Subject: CALL FOR PAPERS: FNS '98 in Munich, Germany Message-ID: <97Oct14.120144+0200met_dst.49366+9@papa.informatik.tu-muenchen.de> (A copy of this message has also been posted to the following newsgroups: comp.ai.neural-nets, comp.ai.fuzzy, de.sci.informatik.ki, de.sci.informatik.misc) CALL FOR PAPERS 5th International GI-Workshop Fuzzy-Neuro Systems '98 - Computational Intelligence - 18 - 20 March 1998, Munich Fuzzy-Neuro Systems '98 is the fifth event of a well-established series of workshops with international participation. Its aim is to give an overview of the state of the art in research and development of fuzzy systems and artificial neural networks. Another aim is to highlight applications of these methods and to forge innovative links between theory and application by means of creative discussions. Fuzzy-Neuro Systems '98 is being organized by the Research Committee 1.2 "Inference Systems" (Fachausschuss 1.2 "Inferenzsysteme") of the German Society of Computer Science GI (Gesellschaft fuer Informatik e. V.) and Technische Universitaet Muenchen with support by Siemens AG. The workshop takes place at the European Patent Office in Munich from March 18 to 20, 1998. Scientific Topics:
- theory and principles of multivalued logic and fuzzy logic
- representation of fuzzy knowledge
- approximate reasoning
- fuzzy control in theory and practice
- fuzzy logic in data analysis, signal processing and pattern recognition
- fuzzy classification systems
- fuzzy decision support systems
- fuzzy logic in non-technical areas like business administration, management etc. 
- fuzzy databases
- theory and principles of artificial neural networks
- hybrid learning algorithms
- neural networks in pattern recognition, classification, process monitoring and production control
- theory and principles of evolutionary algorithms: genetic algorithms and evolution strategies
- discrete parameter and structure optimization
- hybrid systems like neuro-fuzzy systems, connectionistic expert systems etc.
- special hardware and software

Please send four copies of your scientific contribution (4 to 6 pages) by 30 Nov. 1997 to:

Prof. Dr. Dr. h.c. W. Brauer - FNS '98 -
Institut fuer Informatik
Technische Universitaet Muenchen
D-80290 Muenchen
Germany

For further information visit the Internet homepage at: http://wwwbrauer.informatik.tu-muenchen.de/~fns98/ From A_BROWNE at europa.nene.ac.uk Wed Oct 15 10:12:16 1997 From: A_BROWNE at europa.nene.ac.uk (Tony Browne) Date: Wed, 15 Oct 1997 14:12:16 +0000 Subject: Two New Books Message-ID: <7B677E7061@europa.nene.ac.uk> Two New Books on Neural Networks Volume 1: Neural Network Perspectives on Cognition and Adaptive Robotics A. Browne (Ed.). Institute of Physics Press, Bristol, UK. ISBN 0-7503-0455-3 Volume 2: Neural Network Analysis, Architectures and Applications. A. Browne (Ed.). Institute of Physics Press, Bristol, UK. ISBN 0-7503-0499-5 CONTENTS: Volume 1: Part 1: Representation Challenges for Neural Computing. Antony Browne Representing Structure and Structured Representations in Connectionist Networks. Lars Niklasson and Mikael Boden. Chaos, Dynamics and Computational Power in Biologically Plausible Neural Networks. Robert Kentridge. Information-Theoretic Approaches to Neural Network Learning. Mark Plumbley. Part 2: Cognitive Modelling Exploring Different Approaches towards Everyday Commonsense Reasoning. Ron Sun Natural Language Processing with Subsymbolic Neural Networks. Risto Miikkulainen. The Relational Mind. Professor John Taylor. Neuroconsciousness: A Fundamental Postulate. 
Professor Igor Aleksander. Part 3: Adaptive Robotics The Neural Mind and The Robot. Professor Noel Sharkey and Jan Heemskerk. Teaching a Robot to See How it Moves. Patrick van der Smagt. Designing a Nervous System for an Adaptive Mobile Robot. Tom Scutt and Robert Damper. Bibliography (347 References) VOLUME 2: Part 1: Understanding and Simplifying Networks Analysing the Internal Representations of Trained Neural Networks. John Bullinaria. Information Maximization to Simplify Internal Representation. Ryotaro Kamimura. Rule Extraction from Trained Artificial Neural Networks. Robert Andrews, Alan Tickle, Mostefa Golea and Joachim Diederich. Part 2: Novel Architectures and Algorithms Pulse-Stream Techniques and Circuits for Implementing Neural Networks. Robin Woodburn and Professor Alan Murray. Cellular Neural Networks. Mark Joy. Efficient Training of Feed-Forward Neural Networks. Martin Moller. Exploiting Local Optima in Multiversion Neural Computing. Professor Derek Partridge. Part 3: Applications Neural and Neuro-Fuzzy Control Systems. Phil Picton. Image Compression using Neural Networks. Christopher Cramer and Erol Gelenbe. Oil Spill Detection: A Case Study using Recurrent Artificial Neural Networks. Tom Ziemke, Mikael Boden and Lars Niklasson. Bibliography (216 References) Dr A. Browne School of Information Systems Nene College Northampton NN2 7AL, UK From michal at neuron.tau.ac.il Wed Oct 15 06:11:36 1997 From: michal at neuron.tau.ac.il (Michal Finkelman) Date: Wed, 15 Oct 1997 12:11:36 +0200 (IST) Subject: Symposium on Neural Computation in honor of Prof. 
David Horn - Second Notice Message-ID: %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% TEL AVIV UNIVERSITY THE RAYMOND & BEVERLY SACKLER FACULTY OF EXACT SCIENCES SCHOOL OF PHYSICS AND ASTRONOMY SYMPOSIUM ON PARTICLE PHYSICS AND NEURAL COMPUTATION ---------------------------------------------------- IN HONOR OF DAVID HORN'S 60TH BIRTHDAY -------------------------------------- Monday, October 27th 1997 (9:15 AM - 5:30 PM) Lev Auditorium, Tel-Aviv University

PROGRAM
----------
9:15 AM: Opening addresses: Nili Cohen, Rector of Tel-Aviv University; Yuval Ne'eman (Tel Aviv)
9:30 - 10:30: Gabriele Veneziano (CERN) - From s-t-u Duality to S-T-U Duality
10:30 - 11:00: Coffee break
11:00 - 12:00: Fredrick J Gilman (Carnegie Mellon) - CP Violation
12:00 - 1:30: Lunch break
1:30 - 2:30: Leon N Cooper (Brown) - From Receptive Fields to the Cellular Basis for Learning and Memory Storage: A Unified Learning Hypothesis
2:30 - 3:30: John J Hopfield (Princeton) - How Can We Be So Smart? Information Representation and Neurobiological Computation
3:30 - 4:00: Coffee break
4:00 - 5:00: Yakir Aharonov (Tel Aviv) - A New Approach to Quantum Mechanics
5:00 PM: David Horn - Closing Remarks

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% 1. Colleagues and friends who wish to attend the symposium are kindly requested to NOTIFY US IN ADVANCE by e-mailing to michal at neuron.tau.ac.il. fax: 972-3-6407932 2. http://neuron.tau.ac.il/Symposium %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% From Peter.Nordin at dacapo.se Wed Oct 15 11:56:06 1997 From: Peter.Nordin at dacapo.se (Peter Nordin) Date: Wed, 15 Oct 1997 16:56:06 +0100 Subject: Announcement: Thesis available Message-ID: <01BCD98B.3BA54660@dhcp1.dacapo.se> My thesis on evolutionary induction of register machine code is now available from the publisher Krehl-verlag in Germany. 
The described machine learning technique is a variant of genetic programming applied to evolution of binary machine code for a real computer--a technique which is very machine efficient. The title of the thesis is: Evolutionary Program Induction of Binary Machine Code and its Applications The thesis includes applications to robot control as well as image and sound processing. It can be ordered from:

Krehl Verlag
Postfach 51 01 42
D-48163 Muenster
GERMANY
Tel/Fax +49 231 261550
email: krehl-verlag at t-online.de
Price: DM 68

Below you will find the abstract, table of contents and details on how to order. Best regards, Peter Nordin --------------------------------------------------------------------- Evolutionary Program Induction of Binary Machine Code and its Applications Author: Peter Nordin Supervisor: Wolfgang Banzhaf ISBN 3-931546-07-1 290 pages Abstract This thesis presents the Compiling Genetic Programming System (CGPS), a machine learning method for automatic program induction. The objective of the system is to automatically produce computer programs. CGPS is the marriage of two new ideas: (1) a special evolutionary program induction algorithm variant and (2) the use of large-scale meta-manipulation of binary code. Both ideas may have merits on their own, but it is by combining the two that the real benefits emerge, making CGPS a powerful machine learning paradigm. (1) The evolutionary program induction method is an instance of an evolutionary algorithm (EA), a class of algorithms that borrow metaphors from biology, evolution and natural selection. It uses a linear program representation, in contrast to other widely used methods such as Koza's genetic programming (GP) approach, which has a hierarchical tree-based program structure. One way to view CGPS is as a large-alphabet genetic algorithm (GA) where each letter in the alphabet corresponds to a syntactically closed computer program structure. 
A letter in the GA could, for example, be a line in a computer program, i.e. an assignment a := a + 1. CGPS uses recombination (crossover) between letters, which guarantees syntactic closure during evolution. However, from a GA point of view, the letter in the linear string normally does not have an internal structure. A program line in a computer language or a machine code instruction, by contrast, has plenty of internal structure determining the operation to be performed and the operands used. A better metaphor could therefore be the gene concept in nature. Genes in DNA are syntactically closed sequences providing the recipe of a protein. As in CGPS, crossover normally acts between genes/instructions, preserving the syntactic closure of the object. There is a mutation operator to produce variation within the internal structure of the gene/instruction. In CGPS, the mutation operator changes the syntactically closed program object into another syntactically closed program object: it replaces a valid letter with another valid letter in the GA alphabet. CGPS also uses two significant features in the program individual (genome): the header and the footer. The header is a prefix part of the individual while the footer is the suffix part. The genetic operators prevent the header and the footer from being changed, and the header and the footer are used to ensure syntactic closure of the whole individual. (2) CGPS in this thesis is mostly applied to program induction of binary machine code. Binary machine code is the code in the computer that is directly executed by the processor. The program individual in CGPS is a binary machine code function and the genetic operators operate directly on the binary machine code. This implies that CGPS is a meta-manipulating program. In low-level programming, a meta-manipulating program is a program that changes its own binary code or the binary code of another program. Usually, this only means changing a few bytes, e.g. 
dynamic linking of a function. However, CGPS is the first real instance of a large scale meta manipulating program which constantly manipulates, shuffles around and changes large chunks of binary code as a part of the learning process. So, the output from CGPS is a binary machine code program that can be executed directly by the processor. The meta manipulation of the individual (genome) is done with the evolutionary algorithm, mentioned in (1)" above. The headers and footers are used to make sure that the individual always preserves syntactic closure during evolution, no matter what the genetic operators do with the code in between. The marriage of these two ideas produce a system which can induce turing-complete machine code programs very efficiently. The system also has several other attractive properties, such as constant memory usage, compact representation and uncomplicated memory management. In this thesis the background to CGPS is presented together with other evolutionary algorithms and other evolutionary program induction techniques. The detailed descrip- tion of CGPS and its implementation is described together with several evaluations or case studies in different feasible domains such as robotic control, image processing and natural language processing. Some of the evaluations make comparisons to other ma- chine learning algorithms; neural networks and hierarchical genetic programming. The formal definition of CGPS is given followed by aninvestigation of generalization in vari- able length evolutionary algorithms. Finally, several possible directions for the future of CGPS research are presented. Table of Contents 1 Introduction .....21 1.1 Evolutionary Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . 22 1.2 Genetic Programming . . .. . .. .. . .. . .. . .. . .. . .. .. . . 25 1.3 Machine Language Genetic Programming . . .. . .. . .. . .. .. . . 31 1.4 Introduction to CGPS . . . . . . . . . . . . . . . . . . . . . . . . . . . 
38 I Implementation 47 2 Computer Hardware 49 2.1 Introduction to Computer Hardware. . .. . .. . .. . .. . .. .. . . 50 2.2 Von Neumann Computers . . . . . . . . . . . . . . . . . . . . . . . . . 50 3 The SPARC Architecture and CGPS 55 3.1 Fundamentals of CGPS . .. . .. .. . .. . .. . .. . .. . .. .. . . 56 3.2 Implementation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58 3.3 Floating Point Instructions . . .. .. . .. . .. . .. . .. . .. .. . 69 3.4 Self Modifying Code in C language . . . . . . . . . . . . . . . . . . . . 69 3.5 How to Call a Self|made Function from C . .. . .. . .. . .. .. . . 69 3.6 Genetic Operators . .. . .. . .. .. . .. . .. . .. . .. . .. .. . . 70 3.7 Initialisation . .. . .. . .. . .. .. . .. . .. . .. . .. . .. .. . 72 4 Additional Features of the System 75 4.1 Leaf Procedures and Primitives.... . .. . .. . .. . .. . .. .. . . 77 4.2 Memory in Tree{Based GP . . .. .. . .. . .. . .. . .. . .. .. . . 78 4.3 Conditionals . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80 4.4 Automatically De ned Subroutines in CGPS .. . .. . .. . .. .. . . 80 4.5 Leaf Procedure Examples .. . .. .. . .. . .. . .. . .. . .. .. . . 84 4.6 Loops and Recursion in Tree{Based GP . . . . . . . . . . . . . . . . . 85 4.7 Loops and Recursion in CGPS .... . .. . .. . .. . .. . .. .. . . 88 4.8 Loop Example in CGPS . . . . . . . . . . . . . . . . . . . . . . . . . . 88 4.9 External Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91 4.10 Strings and Lists . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91 4.11 Parameters to the System . . . . . . . . . . . . . . . . . . . . . . . . . 92 4.12 C-language Output . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93 4.13 Platforms for CGPS .. . .. . .. .. . .. . .. . .. . .. . .. .. . . 94 4.14 Portability Methods .. . .. . .. .. . .. . .. . .. . .. . .. .. . . 94 4.15 Caveats . .. . .. . .. . .. . .. .. . .. . .. . .. . .. . .. .. . . 
94 4.16 How to Get Started . . . 95 4.17 Using Tree Representation . . . 96 4.18 Speed Evaluation . . . 97 4.19 Why is Binary Manipulation so Fast? . . . 98 4.20 Applications . . . 99 5 A Walkthrough of an Example System 101 5.1 Variables, Constants and Parameters . . . 102 5.2 Random Number Generation . . . 104 5.3 Initialisation . . . 107 5.4 Output Functions . . . 107 5.5 The Fitness Function . . . 108 5.6 Reproduce an Individual . . . 109 5.7 Crossover . . . 110 5.8 Mutation . . . 110 5.9 Tournament Selection . . . 111 5.10 Read Training Data . . . 111 5.11 The Main Procedure . . . 112 II Evaluations 115 6 On-line CGPS 117 6.1 Background . . . 118 6.2 Introduction . . . 119 6.3 Methods . . . 122 6.4 Results . . . 130 6.5 Conclusions of On-line CGPS . . . 135 7 Control Using Memory of Past Events 139 7.1 Introduction . . . 140 7.2 The Evolutionary Algorithm . . .
140 7.3 Setup . . . 141 7.4 Objectives . . . 141 7.5 The Memory Based GP Control Architecture . . . 142 7.6 Results . . . 146 7.7 Future Directions of Memory Based GP in Control . . . 152 7.8 Summary . . . 153 8 High-performance Applications 155 8.1 Historic Remarks on CGPS and Image Coding . . . 156 8.2 Introduction . . . 156 8.3 Implementation . . . 157 8.4 Results . . . 159 8.5 Summary and Conclusions . . . 161 9 CGPS and Programmatic Compression 163 9.1 Introduction . . . 164 9.2 Programmatic Compression (PC) . . . 164 9.3 Compression of Sound . . . 166 9.4 Compression of Pictures . . . 171 9.5 Summary and Conclusion . . . 173 10 CGPS and Tree-Based GP 175 10.1 Introduction . . . 176 10.2 Methods . . . 176 10.3 The Evolutionary Algorithm . . . 177 10.4 Results . . . 177 10.5 Discussion and Conclusions . . . 177 11 Neural Networks and CGPS 179 11.1 The Sample Problem . . . 180 11.2 The Neural Network
. . . 180 11.3 Results . . . 182 11.4 Summary . . . 183 12 Neural Networks and Generalisation 185 12.1 The Problems Used In This Study . . . 186 12.2 Classification as Symbolic Regression . . . 187 12.3 Introns and Explicitly Defined Introns . . . 187 12.4 Results . . . 187 12.5 Summary . . . 190 III Explanation 193 13 Formalisation of CGPS 195 13.1 Register Machines . . . 196 13.2 Evolutionary Algorithms . . . 199 13.3 Compiling Genetic Programming . . . 203 14 Complexity, Compression and Evolution 205 14.1 Introduction . . . 206 14.2 Complexity, Effective Fitness and Evolution . . . 209 14.3 Example . . . 212 14.4 Kolmogorov Complexity and Generalisation . . . 217 14.5 Empirical Results . . . 220 14.6 Other Evolutionary Techniques . . . 223 14.7 Summary and Conclusions . . . 223 15 Explicitly Defined Introns 229 15.1 Introduction . . . 230 15.2 Definitions . . . 232 15.3 The Experimental Setup . . . 233 15.4 Results . . .
235 16 Conclusions and Outline of Future Perspectives 245 16.1 A Computer Language for Meta-manipulation . . . 246 16.2 Typed GP, Constrained Crossover, Grammar and Search Bias . . . 247 16.3 Other Machine Learning Algorithms . . . 251 16.4 Special Processors . . . 252 16.5 Large Number of Input Parameters . . . 253 16.6 Reasoning about Machine Code Programs with a Meta GP System . . . 254 16.7 The Logic of Genetic Reasoning . . . 258 16.8 Some Brief Initial Results . . . 259 Appendix A: Flow Charts of CGPS 263 * Customers within Germany can easily order in any bookstore (DM 68,-); firms/institutions or (Master or Visa) credit card holders can also order directly (shipping and credit card service charge included). * International sales are possible for Master or Visa card holders: - the order should arrive via fax (+49 2501 261550) or PGP-signed (and optionally encrypted) email (our PGP public key is available upon request). At the moment, these are the only channels protecting the customer's sensitive payment data (we are working on secure ordering via WWW but have no final timeline for this yet). The order should mention the credit card type, number and expiration date, as well as the card holder's name and address. - Besides the cost of the book (w/o VAT), i.e. DM 68,- minus 7%, the interested party has to decide between "surface" and "air" mail (rates available upon request for different areas, e.g. USA DM 3.50 for 3-5 weeks delivery or DM 24,-- for 1-2 weeks delivery). The total amount is drawn from the credit card in German currency. From ataxr at IMAP1.ASU.EDU Wed Oct 15 22:01:31 1997 From: ataxr at IMAP1.ASU.EDU (Asim Roy) Date: Wed, 15 Oct 1997 22:01:31 -0400 (EDT) Subject: COULD THERE BE REAL-TIME, INSTANTANEOUS LEARNING IN THE BRAIN?
Message-ID: I am posting this memo to various newsgroups. So my apologies if you get multiple copies. I thought this would be an interesting topic of discussion in these scientific communities. Please respond to me directly and I will post all responses at the appropriate time. Asim Roy Arizona State University ------------------------------------------------ COULD THERE BE REAL-TIME, INSTANTANEOUS LEARNING IN THE BRAIN? One of the fundamental beliefs in neuroscience, cognitive science and artificial neural networks is that the brain learns in real-time. That is, it learns instantaneously from each and every learning example provided to it by adjusting the synaptic strengths or connection weights in a network of neurons. The learning is generally thought to be accomplished using a Hebbian-style mechanism or some other variation of the idea (a local learning law). In these scientific fields, real-time learning also implies memoryless learning. In memoryless learning, no training examples are stored explicitly in the memory of the learning system, such as the brain. It can use any particular training example presented to it to adjust whatever network it is learning in, but must forget that example before examining others. The idea is to obviate the need for large amounts of memory to store a large number of training examples. This section looks at the possibility of real-time learning in the brain from two different perspectives. First, some factual behavioral evidence from a recent neuroscience study on learning of motor skills is examined. Second, the idea of real-time learning is examined from a broader behavioral perspective. A recent study by Shadmehr and Holcomb [1997] may lend some interesting insight on how the brain learns. In this study, a positron emission tomography (PET) device was used to monitor neural activity in the brain as subjects were taught and then retested on a motor skill. 
The task required them to manipulate an object on a computer screen by using a motorized robot arm. It required making precise and rapid reaching movements to a series of targets while holding the handle of the robot. These movements could be learned only through practice. During practice, the blood flow was most active in the prefrontal cerebral cortex of the brain. After the practice session, some of the subjects were allowed to do unrelated routine things for five to six hours and then retested on their recently acquired motor skill. During retesting of this group, it was found that they had learned the motor skill quite well. But it was also found that the blood flow was now most active in a different part of the brain, in the posterior parietal and cerebellar areas. The remaining test subjects were trained on a new motor task immediately after practicing the first one. Later, those subjects were retested on the first motor task to find out how much of it they had learnt. It was found that they had reduced levels of skill (learning) on the first task compared to the other group. So Shadmehr and Holcomb [1997] conclude that after practicing a new motor skill, it takes five to six hours for the memory of the new skill to move from a temporary storage site in the front of the brain to a permanent storage site at the back. But if that storage process is interrupted by practicing another new skill, the learning of the first skill is hindered. They also conclude that the shift of location of the memory in the brain is necessary to render it invulnerable and permanent. That is, it is necessary to consolidate the motor skill. What are the real implications of this study? One of the most important facts is that although both groups had identical training sessions, they had different levels of learning of the motor task because of what they did subsequent to practice.
From this fact alone one can conclude with some degree of certainty that real-time, instantaneous learning is not used for learning motor skills. How can one say that? One can make that conclusion because if real-time learning were used, there would have been continuous and instantaneous adjustment of the synaptic strengths or connection weights during practice in whatever net the brain was using to learn the motor task. This means that all persons trained in that particular motor task should have had more or less the same "trained net," performance-wise, at the end of that training session, regardless of what they did subsequently. (It is assumed here that the task was learnable, given enough practice, and that both groups had enough practice.) With complete, permanent learning (weight-adjustments) from "real-time learning," there should have been no substantial differences in the learnt skill between the two groups resulting from any activity subsequent to practice. But this study demonstrates the opposite, that there were differences in the learnt skill simply because of the nature of subsequent activity. So real-time, instantaneous and permanent weight-adjustment (real-time learning) is contradicted by the results here. Second, from a broader behavioral perspective, all types of "learning" by the brain involve collection and storage of information prior to actual learning. As is well known, the fundamental process of learning involves: (1) collection and storage of information about a problem, (2) examination of the information at hand to determine the complexity of the problem, (3) development of trial solutions (nets) for the problem, (4) testing of trial solutions (nets), (5) discarding such trial solutions (nets) if they are not good enough, and (6) repetition of these processes until an acceptable solution is found. Real-time learning is not compatible with these learning processes. One has to remember that the essence of learning is generalization.
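[Editorial aside: the memo's contrast — adjusting weights from each example and then forgetting it, versus storing the whole body of examples before fitting — can be sketched with a deliberately tiny toy model. All names and numbers below are illustrative only; they come from neither the memo nor the Shadmehr and Holcomb study.]

```python
# Toy contrast between "memoryless" per-example learning and
# memory-based learning on a noiseless linear target y = 3*x + 1.
data = [(x, 3 * x + 1) for x in [0.0, 0.5, 1.0, 1.5, 2.0]]

# Memoryless ("real-time") learning: one pass, each example nudges the
# weights with a local error-correction rule and is then forgotten.
w, b, lr = 0.0, 0.0, 0.1
for x, y in data:                      # each example is seen exactly once
    err = (w * x + b) - y
    w -= lr * err * x                  # local weight adjustment
    b -= lr * err

# Memory-based learning: store the whole body of examples first, then
# fit them jointly (here, closed-form ordinary least squares).
n = len(data)
sx = sum(x for x, _ in data); sy = sum(y for _, y in data)
sxx = sum(x * x for x, _ in data); sxy = sum(x * y for x, y in data)
w_batch = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b_batch = (sy - w_batch * sx) / n

print(w, b)                # only partially adapted after a single pass
print(w_batch, b_batch)    # exact fit from the stored batch: 3.0 1.0
```

On this task the stored batch can be fitted exactly, while a single memoryless pass leaves the weights far from the target; this is only a caricature of the distinction, not a model of the brain.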
In order to generalize well, one has to look at the whole body of information relevant to a problem, not just bits and pieces of the information at a time as in real-time learning. So the argument against real-time learning is simple: one cannot learn (generalize) unless one knows what is there to learn (generalize). One finds out what is there to learn (generalize) by collecting and storing information about the problem. In other words, no system, biological or otherwise, can prepare itself to learn (generalize) without having any information about what is to be learnt (generalized). Learning of motor skills is no exception to this process. The process of training is simply to collect and store information on the skill to be learnt. For example, in learning any sport, one not only remembers the various live demonstrations given by an instructor (pictures are worth a thousand words), but one also remembers the associated verbal explanations and other great words of advice. Instructions, demonstrations and practice of any motor skill are simply meant to provide the rules, exemplars and examples to be used for learning (e.g. a certain type of body, arm or leg movement in order to execute a certain task). During actual practice of a motor skill, humans not only try to follow the rules and exemplars to perform the actual task, but they also observe and store new information about which trial worked (example trial execution of a certain task) and which didn't. One only ought to think back to the days of learning tennis, swimming or some such sport in order to verify information collection and storage by humans to learn motor skills. It shouldn't be too hard to explain the "loss of skill" phenomenon, from back-to-back instructions on new motor skills, that was observed in the study. The explanation shouldn't be different from the one for the "forgetting of instructions" phenomenon that occurs with back-to-back instructions in any learning situation.
A logical explanation perhaps for the "loss of motor skill" phenomenon, as for any other similar phenomenon, is that the brain has a limited amount of working or short-term memory. When encountering important new information, the brain stores it simply by erasing some old information from the working memory, and that prior information gets erased before the brain has had time to transfer it to a more permanent or semi-permanent location for actual learning. So "loss of information" in working memory leads to a "loss of skill." Another fact from the study that is highly significant is that the brain takes time to learn. Learning is not quick and instantaneous. Reference: Shadmehr, R. and Holcomb, H. (August 1997). "Neural Correlates of Motor Memory Consolidation." Science, Vol. 277, pp. 821-825. From protasi at mat.utovrm.it Thu Oct 16 07:31:16 1997 From: protasi at mat.utovrm.it (Marco Protasi) Date: Thu, 16 Oct 1997 13:31:16 +0200 Subject: Special issue of DAM Message-ID: CALL FOR PAPERS Special Issue on Discrete vs analog computation: Links between computational complexity and local minima Discrete Applied Mathematics (Elsevier) M. Gori and M. Protasi (Eds) ================================================================================ In the last few years, the resurgence of interest in fields like artificial neural networks has been raising some very intriguing theoretical questions on the links between discrete and analog computation. Basically, most analog schemes widely adopted in machine learning and combinatorial optimization rely on function optimization. In this setting, the inherent complexity of the problem at hand seems to appear in terms of local minima and, more generally, in terms of numerical problems of the chosen optimization algorithm.
While most practitioners tend to accept without reluctance the air of suspicion which arises from function optimization (in which one often has no guarantee of reaching the global optimum), most theoreticians are instead quite skeptical. On the other hand, the success of analog computation for either learning or problem solving is often related to the problem at hand and, therefore, one can expect excellent behavior for one class of problems while raising serious doubts about the solution of others. To the best of our knowledge, this intuitive idea has not been satisfactorily explained from a theoretical point of view. Basically, there is neither a theory that naturally supports the intuitive concept of suspicion which arises from approaches based on continuous optimization, nor a theory that relates this concept to computational complexity, which has traditionally been developed in the discrete setting. The study of the complexity of algorithms has essentially been performed on discrete structures, and an impressive theory has been developed in the last two decades. On the other hand, optimization theory has a twofold face: discrete optimization and continuous optimization. There are in fact some important approaches of computational complexity theory that were proposed for the continuous case, for instance information-based complexity (Traub) and real Turing machines (Blum-Shub-Smale). These approaches can be fruitfully applied to problems arising in continuous optimization but, generally speaking, the formal study of the efficiency of algorithms and problems has received much more attention in the discrete environment, where the theory can be easily used and error and precision problems are absent. The face that complexity assumes in the analog setting is not very clear and, moreover, the links with traditional computational complexity in the discrete setting still need to be explored.
The aim of the special issue is mainly to study the links between discrete and continuous versions of the same problem. Since, until now, these links have rarely been explored, the special issue is intended to stimulate different points of view; people working on discrete optimization are in fact likely not to be experts on the continuous side, and vice versa. Prospective authors include theoretical computer scientists working in the field of algorithms, operations research people, researchers working in neural networks, and researchers in nonlinear system theory. Possible topics for papers submitted to the special issue include, but are not limited to: - Links between the complexity of algorithms in the continuous and discrete settings - Approximate solutions of problems in the continuous and discrete settings - Analog computing and dynamical systems - Combinatorial optimization by neural networks: Complexity issues - Learning in artificial neural networks: Local minima and complexity issues All submissions will undergo a thorough refereeing process, according to the usual standards of the journal. Prospective authors should submit six copies of a manuscript to one of the Guest Editors by April 10, 1998.
================================================================================ Marco Gori Dipartimento di Ingegneria dell'Informazione Universita' di Siena Via Roma, 56 53100 Siena (Italy) Voice: +39 (577) 26.36.10 Fax: +39 (577) 26.36.02 E-mail: marco at ing.unisi.it WWW: http://www-dsi.ing.unifi.it/neural/~marco Marco Protasi Dipartimento di Matematica Universita' di Roma "Tor Vergata" Via della Ricerca Scientifica 00133 Roma (Italy) Voice: +39 (6) 72.59.46.78 Fax: +39 (6) 72.59.46.99 or 72.59.42.95 E-mail: protasi at mat.utovrm.it WWW: http://www.mat.utovrm.it ---------------------------------------------------------------------------- From kmccumb at mri.jhu.edu Thu Oct 16 08:23:32 1997 From: kmccumb at mri.jhu.edu (Karen McCumber) Date: Thu, 16 Oct 1997 08:23:32 -0400 Subject: Distinguished Postdoctoral Fellowships at Hopkins Message-ID: <1.5.4.32.19971016122332.006fc028@128.220.160.41> Johns Hopkins University Biomedical Engineering Department invites applications for the Distinguished Postdoctoral Fellowship Program. This program is funded by the Whitaker Foundation. It aims to promote an inter-disciplinary approach to the study of complex biomedical systems and to provide the very best of recently graduated PhDs with an opportunity to perform independent research in a supportive environment. Each recipient will be sponsored by two faculty members, at least one of whom must have a primary appointment in BME. Of particular interest are candidates with a background in computational neuroscience who wish to pursue projects in the laboratories of the Systems Neuroscience or the Theoretical and Computational Biology faculty groups.
Salary of the fellows will start at $30,000 per year, with additional funds to cover health insurance. The duration of the fellowship is two years. Interested applicants should submit the following documents: * CV * Two letters of reference * Two-page summary of research interests Send your application materials to: Dr. Murray Sachs Dept. of Biomedical Engineering, Johns Hopkins School of Medicine 720 Rutland Ave Baltimore, MD 21205 Deadline for receipt of applications is Dec. 15, 1997. For further information: http://www.bme.jhu.edu From thimm at idiap.ch Thu Oct 16 08:45:22 1997 From: thimm at idiap.ch (Georg Thimm) Date: Thu, 16 Oct 1997 14:45:22 +0200 Subject: Events on Neural Networks on a WWW page (new WWW address!) Message-ID: <199710161245.OAA27738@rotondo.idiap.ch> WWW page for Announcements of Conferences, Workshops and Other Events on Neural Networks and Related Fields (e.g. Vision and Speech) ----------------------------------------- This WWW page allows you to enter and look up announcements for conferences, workshops, and other events on neural networks and related fields (e.g. vision and speech). The event list, which is updated almost daily, contains more than 150 forthcoming events and can be accessed via the URL: http://www.idiap.ch/NN-events The entries are ordered chronologically and presented in a format for fast and easy lookup of: - the date and place of the event, - the title of the event, - a contact address (surface mail, email, ftp, and WWW address, as well as telephone or fax number), - deadlines for submissions, registration, etc., and - the topics of the event. Conference organizers are kindly asked to enter their conference into the database. The list is published in part in the journal Neurocomputing by Elsevier Science B.V. Information on past conferences is also available. Regards, Georg Thimm P.S. Please distribute this announcement to neural network, vision or speech related mailing lists. Comments and suggestions are welcome!
From reggia at cs.umd.edu Fri Oct 17 13:06:54 1997 From: reggia at cs.umd.edu (James A. Reggia) Date: Fri, 17 Oct 1997 13:06:54 -0400 (EDT) Subject: CFP: Neural Models Brain & Cognitive Disorders Message-ID: <199710171706.NAA13289@avion.cs.umd.edu> SECOND INTERNATIONAL WORKSHOP ON NEURAL MODELING OF BRAIN AND COGNITIVE DISORDERS ** Initial Announcement and Call for Abstracts ** Sponsors: National Institute of Mental Health Whitaker Foundation Univ. of Maryland Inst. for Advanced Computer Studies Center for Neural Basis of Cognition, CMU & Univ. of Pittsburgh Adams Super Center for Brain Studies, Tel Aviv Neuroscience and Cognitive Science Program, UMCP A workshop on Neural Modeling of Brain and Cognitive Disorders will be held on June 4 - 6, 1998 at the University of Maryland, College Park, just outside of Washington, DC. The focus of this meeting will be on the lesioning of neural network models to study disorders in neurology, neuropsychology and psychiatry, such as Alzheimer's disease, amnesia, aphasia, depression, epilepsy, neglect, parkinsonism, schizophrenia, and stroke. These models attempt to explain how specific pathological neuroanatomical and neurophysiological changes can result in various clinical manifestations, and to investigate the functional organization of the symptoms that result from specific brain pathologies. A Proceedings consisting of abstracts from the presentations will be available for attendees. The emphasis at the workshop will be on reviewing and discussing new contributions to this field since the first meeting was held in 1995. Many of the invited contributions from the first workshop appeared in a World Scientific book last year; see web page indicated below. *** CALL FOR ABSTRACTS *** Individuals wishing to present a poster related to any aspect of the workshop's themes should submit an abstract describing the nature of their presentation. 
The single page submission should include title, author(s), contact information (address and email/fax), and abstract. One inch margins and a typesize of at least 10 points should be used. Abstracts will be reviewed by the Program Committee; those accepted will be published in the workshop proceedings. Six copies of the camera-ready abstract should be mailed TO ARRIVE by February 1, 1998 to James A. Reggia, Dept. of Computer Science, A.V. Williams Bldg., University of Maryland, College Park, MD 20742 USA. Web Page -------- The latest information about this meeting can be found at http://www.cs.umd.edu/~reggia/workshop/ Travel Fellowships: ------------------ Funding is expected for a few fellowships to offset travel cost of students, postdocs, and/or residents. Further details will be forthcoming. CME Credit: ---------- The possibility of offering CME credits for attendance is currently being explored. Program Committee: ----------------- Rita Berndt (UMAB), Avis Cohen (UMCP), Tim Gale (Univ. Hertfordshire), Helen Gigley (ONR), Dennis Glanzman (NIMH), Barry Gordon (Hopkins), Michael Hasselmo (Harvard), James McClelland (CMU), James Reggia (UMCP), Eytan Ruppin (Tel Aviv), Greg Siegel (San Diego), Nitish Thankor (Hopkins). Registration and Further Information: ----------------------------------- To receive registration materials (distributed most likely in January/February), please send your name, address, email address, phone number and fax number to Cecilia Kullman, UMIACS, A. V. Williams Bldg., University of Maryland, College Park, MD 20742 USA. (Tel: (301) 405-0304, Fax: (301) 314-9658, and email: cecilia at umiacs.umd.edu). Further questions about conference administration, hotel reservations, etc. should also be directed to Ms. Kullman. For questions about the workshop technical/scientific content or abstract submissions, please contact Jim Reggia (address above, Fax: (301) 405-6707, email: reggia at cs.umd.edu). 
Preliminary List of Speakers ---------------------------- PARKINSONISM/OTHER BASAL GANGLIA DISORDERS Discussant and Chair: Steven Wise, NIMH Jose Contreras-Vidal, Arizona State University A Neural Network Model of the Effects of L-dopa Therapy in Parkinson's Disease Donald Borrett, Toronto East General Hospital Recurrent Neural Networks and Parkinson's Disease Rolf Kotter, University of Dusseldorf Striatal Mechanisms in Parkinson's Disease: Insights from Computer Modeling LANGUAGE/COGNITIVE DISORDERS Discussant and Chair: Gary Dell, University of Illinois Kate Mayall, University of Birmingham A Connectionist Model of Peripheral Dyslexia Jay McClelland, Carnegie-Mellon University Reopening the Critical Period: A Hebbian Account of Interventions that Induce Change in Language Perception Risto Miikkulainen, University of Texas at Austin Dyslexic and Aphasic Impairments in a Self-Organizing Model of the Lexicon David Plaut, Carnegie-Mellon University Systematicity and Specialization in Semantics: A Connectionist Account of Optic Aphasia STROKE AND EPILEPSY Discussant and Chair: Mark Hallett, NINDS Bill Lytton, Univ. of Wisconsin & Wm. S.
Middleton VA Hospital Modeling Recovery from Experimental Ablation Jim Reggia, University of Maryland Modeling the Interhemispheric Effects of Stroke Eytan Ruppin, Tel-Aviv University The Pathogenesis of Spreading Tissue Damage Following Acute Focal Stroke: A Computational Investigation Terry Sejnowski, Howard Hughes Medical Institute and Salk Institute Thalamic Model of Absence Epilepsy NEGLECT AND RELATED DISORDERS Discussant and Chair: Marlene Behrmann, Carnegie Mellon University Mike Mozer, University of Colorado Modeling Neglect of Objects and Space Alexandre Pouget, Georgetown University A Neural Theory of Hemineglect Richard Shillcock, University of Edinburgh Connectionist Modelling of Unilateral Visual Neglect: the Crossover Effect in Line Bisection Rita Sloan Berndt & Carol Whitney, University of Maryland Positional Reading Errors: A New Interpretation of Right Neglect Dyslexia AMNESIA/ALZHEIMER'S DISEASE Discussant and Chair: John Lisman, Brandeis University Pablo Alvarez, Boston University A Neural Model of Retrograde Amnesia and Memory Consolidation Mike Hasselmo, Harvard University Memory Function and Dysfunction in a Network Simulation of the Hippocampal Formation David Horn, Tel Aviv University Response of Multimodular Memory Networks to Different Lesion Types Mark Gluck, Rutgers University Empirical Tests of Models of Hippocampal Function with Amnesic and Elderly Populations SCHIZOPHRENIA/PSYCHIATRY Discussant and Chair: William Carpenter, Maryland Psychiatric Research Center and UMAB Jonathan Cohen, University of Pittsburgh and Carnegie-Mellon University The Role of Dopamine in Regulating Access to Prefrontal Cortex: Normal Function and Disturbances in Schizophrenia Ralph Hoffman, Yale University Modeling Postnatal Neurodevelopment, Psychosis Induction, and the Locus of Action of Antipsychotic Drugs Sunjay Berdia, Yale University Neural Network Modeling of the Wisconsin Card Sorting Test Greg Siegle, San Diego State Univ. and Univ. 
of California, San Diego A Neural Network Model of Affective Interference in Depression THE FUTURE: SCIENTIFIC AND FUNDING EXPECTATIONS Dennis Glanzman, National Institute of Mental Health From kaspar.althoefer at kcl.ac.uk Fri Oct 17 13:11:36 1997 From: kaspar.althoefer at kcl.ac.uk (Althoefer, Kaspar) Date: Fri, 17 Oct 1997 18:11:36 +0100 Subject: PhD-Thesis by Kaspar Althoefer - "Neuro-Fuzzy Motion Planning ...." Message-ID: <34479C48.4C46370C@kcl.ac.uk> The following PhD thesis is now available: "Neuro-Fuzzy Motion Planning for Robotic Manipulators" by Kaspar ALTHOEFER. The thesis will not be available on a Web or ftp site, but I would be pleased to send my thesis as a PostScript file to anybody who wants a copy. Please contact me via e-mail if you are interested. My e-mail address is Kaspar.Althoefer at kcl.ac.uk. Below you will find the abstract and the table of contents. Best regards, Kaspar Althoefer. --------------------------------------------------------------------- Abstract On-going research efforts in robotics aim at providing mechanical systems, such as robotic manipulators and mobile robots, with more intelligence so that they can operate autonomously. Advancing in this direction, this thesis proposes and investigates novel manipulator path planning and navigation techniques which have their roots in the fields of neural networks and fuzzy logic. Path planning in the configuration space requires a transformation of the workspace into the configuration space. A radial-basis-function neural network is proposed to construct the configuration space by repeatedly mapping individual workspace obstacle points into so-called C-space patterns. The method is extended to compute the transformation for planar manipulators with n links as well as for manipulators with revolute and prismatic joints. A neural-network-based implementation of a computer-emulated resistive grid is described and investigated.
The grid, which is a collection of nodes laterally connected by weights, carries out global path planning in the manipulator's configuration space. In response to a specific obstacle constellation, the grid generates an activity distribution whose gradient can be exploited to construct collision-free paths. A novel update algorithm, the To&Fro algorithm, which rapidly spreads the activity distribution over the nodes, is proposed. Extensions to the basic grid technique are presented.

A novel fuzzy-based system, the fuzzy navigator, is proposed to solve the navigation and obstacle avoidance problem for robotic manipulators. The system is divided into separate fuzzy units which individually control each manipulator link. The competing functions of goal following and obstacle avoidance are combined in each unit, providing intelligent behaviour. An on-line reinforcement learning method is introduced which continuously adapts the performance of the fuzzy units to changes in the environment.

All of the above methods have been tested in different environments on simulated manipulators as well as on a physical manipulator. The results proved these methods to be feasible for real-world applications.

........................................................................................
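The grid-planning idea in the abstract above, activity spreading from the goal followed by gradient ascent, can be illustrated with a short sketch. This is a toy version under assumed simplifications (a max-based update with a decay factor on a small 2-D occupancy grid), not the thesis's To&Fro algorithm or its actual grid dynamics: activity is clamped to 1 at the goal cell, obstacle cells stay at zero, and a collision-free path is read off by ascending the activity gradient from the start.

```python
# Toy resistive-grid-style planner (illustrative only, assumed update rule).
# Activity converges to decay**d, where d is the grid distance to the goal,
# so greedy ascent of the activity distribution yields a shortest free path.

def propagate(grid, goal, iterations=200, decay=0.9):
    """Spread activity from the goal cell over the free cells of the grid."""
    rows, cols = len(grid), len(grid[0])
    activity = [[0.0] * cols for _ in range(rows)]
    gr, gc = goal
    for _ in range(iterations):
        activity[gr][gc] = 1.0          # goal node clamped to maximum activity
        new = [[0.0] * cols for _ in range(rows)]
        for r in range(rows):
            for c in range(cols):
                if grid[r][c] == 1:     # obstacle cell: activity stays zero
                    continue
                neighbours = [activity[nr][nc]
                              for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                              if 0 <= nr < rows and 0 <= nc < cols]
                new[r][c] = decay * max(neighbours)   # attenuated max of neighbours
        new[gr][gc] = 1.0
        activity = new
    return activity

def follow_gradient(activity, start, goal):
    """Greedy ascent of the activity distribution gives a collision-free path."""
    path, pos = [start], start
    while pos != goal:
        r, c = pos
        candidates = [(nr, nc)
                      for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                      if 0 <= nr < len(activity) and 0 <= nc < len(activity[0])]
        nxt = max(candidates, key=lambda p: activity[p[0]][p[1]])
        if activity[nxt[0]][nxt[1]] <= activity[r][c] and nxt != goal:
            break                       # stuck (cannot happen on connected free space)
        path.append(nxt)
        pos = nxt
    return path

grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 0]]                   # 1 = obstacle (a "C-space pattern")
act = propagate(grid, goal=(0, 3))
print(follow_gradient(act, start=(2, 0), goal=(0, 3)))
```

Because the fixed point of this update is decay**d at distance d from the goal, every greedy step strictly increases activity until the goal is reached, which is the same gradient-exploiting principle the abstract describes for the neuro-resistive grid.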
TABLE OF CONTENTS

Abstract ii
Acknowledgments iii
Table of Contents iv
List of Figures vii
List of Tables ix

1 Introduction 1
1.1 Methodology 2
1.2 Constructing the C-space, Global Path Planning, Local Navigation: An Overview 4
1.2.1 Manipulator Motion Planning 4
1.2.2 The Computation of the C-space and the Building of Maps 5
1.2.3 Global Path Planning in C-space 6
1.2.4 Local Navigation 8
1.3 Contributions made by this Thesis 10

2 Workspace to C-space Transformation 11
2.1 Introduction 11
2.2 The Configuration Space in Context 13
2.3 The Configuration Space of a Robotic Manipulator 15
2.4 The Mapping of Obstacle Points into their C-space Counterpart 16
2.4.1 The Single Point Mapping 16
2.4.2 The C-space of a 2-Link Revolute Arm 18
2.4.3 The C-space of a 2-Link Arm with Prismatic and Revolute Joints 23
2.5 The C-space of an n-Link Arm 26
2.5.1.1 Comparison of configuration space representations 32
2.5.1.2 Reduction in Complexity 33
2.6 A Radial Basis Function Network for the Workspace to C-space Transformation 35
2.6.1 The RBF-Network for the C-space Calculation 35
2.6.2 The Training of the Network: Insertion of Nodes 37
2.7 A Three-link Manipulator 39
2.8 Real-world Applications 41
2.8.1 C-space Patterns for a Physical Manipulator 41
2.8.2 Timing Considerations 45
2.8.3 A Real-World Planning System 47
2.8.3.1 Image Processing 48
2.8.3.2 Input to the Radial-Basis-Function Network 50
2.9 Summary 51

3 A Neuro-Resistive Grid for Path Planning 53
3.1 Problem Definition and Overview of the Algorithm 53
3.2 Related Work 55
3.2.1 Resistive Grids for Path Planning 55
3.2.2 The Hopfield Network 56
3.2.3 Cellular Neural Network 57
3.2.4 Dynamic Programming 58
3.3 Path Planning in the Configuration Space 60
3.4 The Neuro-Resistive Grid 61
3.4.1 Implementation of the Neuro-Resistive Grid 61
3.4.2 Functioning of the Resistive Grid 63
3.4.3 Harmonic Functions 66
3.4.4 Boundary Conditions - Dirichlet vs. Neumann 68
3.4.5 Convergence Criterion for the Neuro-resistive Grid 72
3.5 Enhanced Activity Propagation 74
3.5.1 Methodology 74
3.5.2 Higher Dimensions 77
3.5.3 Global Extremum and Collision-free Path 78
3.5.4 A Non-Topologically-Ordered Grid 82
3.5.5 Soft Safety Margin 83
3.6 Experiments 84
3.6.1 Real-World Experiments with the MA 2000 Manipulator 84
3.6.2 A Planar Three-Link Manipulator 94
3.6.3 A Three-dimensional SCARA Manipulator 101
3.6.4 A Mobile Robot in a 3D-Workspace 101
3.7 Comparative Studies 102
3.7.1 Comparisons to Other Update Rules 102
3.7.2 Comparison to Other Update Sequences 105
3.7.3 Comparison to the A*-Algorithm 106
3.8 Summary 108

4 Fuzzy-Based Navigation and Obstacle Avoidance for Robotic Manipulators 110
4.1 Problem Definition and System Overview 110
4.2 Local Navigation in Context 113
4.2.1 Artificial Potential Fields 113
4.2.2 An Overview of Fuzzy-Based Navigation Techniques for Mobile Robots 115
4.2.3 Unreachable Situations and Local Minima 116
4.3 Fuzzy Navigation and Obstacle Avoidance for Robotic Manipulators 117
4.3.1 Introduction to Fuzzy Control 117
4.3.2 Manipulator-Specific Implementation Aspects 121
4.3.3 The Fuzzy Algorithm 123
4.4 Computer Simulations 128
4.4.1 Two-Link Manipulator 128
4.4.2 Three-Link Manipulator 132
4.4.3 Moving Obstacles 134
4.4.4 Safety Aspects 134
4.5 Fuzzy Navigation for the MA 2000 Manipulator 135
4.5.1 Simulated MA 2000 and Comparison to the Resistive Grid Approach 135
4.5.2 Real-World Results 138
4.6 Reinforcement Learning 138
4.7 Summary and Discussion 144

5 Conclusions and Future Work 147
5.1 Conclusions 147
5.1.1 Workspace to C-space Transformation for Robotic Manipulators 147
5.1.2 A Neural Resistive Grid for Path Planning 147
5.1.3 Fuzzy-based Navigation and Obstacle Avoidance for Robotic Manipulators 149
5.2 Future work 150
5.2.1 Hybrid System 150
5.2.2 Implementational Aspects 151
5.2.3 Sensors 152
5.2.4 Transformation of Complex Obstacle Primitives 152

Appendix A-1 153
Appendix A-2 155
Appendix A-3 160
Appendix B 165
Bibliography 169

--
|_/ I N G'S     Dr Kaspar ALTHOEFER
| \ COLLEGE     Ph.D., Dipl.-Ing., AMIEE
 L O N D O N    Department of Mechanical Engineering
 Founded 1829   King's College, Strand, London WC2R 2LS, UK
TEL: +44 (0)171 873 2431, FAX: +44 (0)171 836 4781
http://www.eee.kcl.ac.uk/~kaspar

From jhf at stat.Stanford.EDU Fri Oct 17 16:38:10 1997
From: jhf at stat.Stanford.EDU (Jerome H. Friedman)
Date: Fri, 17 Oct 1997 13:38:10 -0700 (PDT)
Subject: Technical Report Available.
Message-ID: <199710172038.NAA21535@rgmiller.Stanford.EDU>

*** Technical Report Available ***

Bump Hunting in High-Dimensional Data

Jerome H. Friedman
Stanford University

Nicholas I. Fisher
CMIS - CSIRO, Sydney

ABSTRACT

Many data analytic questions can be formulated as (noisy) optimization problems. They explicitly or implicitly involve finding simultaneous combinations of values for a set of ("input") variables that imply unusually large (or small) values of another designated ("output") variable. Specifically, one seeks a set of subregions of the input variable space within which the value of the output variable is considerably larger (or smaller) than its average value over the entire input domain. In addition, it is usually desired that these regions be describable in an interpretable form involving simple statements ("rules") concerning the input values. This paper describes a new procedure directed towards this goal, based on the notion of "patient" rule induction. This patient strategy is contrasted with the greedy ones used by most rule induction methods, and the semi-greedy ones used by some partitioning tree techniques such as CART. Applications involving scientific and commercial databases are presented.

Keywords: noisy function optimization, classification, association, rule induction, data mining.

Available by ftp from: "ftp://stat.stanford.edu/pub/friedman/prim.ps.Z"

Note: This postscript does not view properly on some older versions of ghostview.
It seems to print OK on nearly all postscript printers. From mschmitt at igi.tu-graz.ac.at Mon Oct 20 11:18:55 1997 From: mschmitt at igi.tu-graz.ac.at (Michael Schmitt) Date: Mon, 20 Oct 1997 17:18:55 +0200 Subject: Preprint available Message-ID: <344B765F.4C7C@igi.tu-graz.ac.at> Dear Connectionists, the following preprint (21 pages) is available at http://www.cis.tu-graz.ac.at/igi/maass/96.ps.gz (104463 bytes, gzipped PostScript) or at http://www.cis.tu-graz.ac.at/igi/mschmitt/spikingneurons.ps.Z (158163 bytes, compressed PostScript). TITLE: On the Complexity of Learning for Spiking Neurons with Temporal Coding AUTHORS: Wolfgang Maass and Michael Schmitt ABSTRACT: In a network of spiking neurons a new set of parameters becomes relevant which has no counterpart in traditional neural network models (such as threshold or sigmoidal networks): the time that a pulse needs to travel through a connection between two neurons (also known as delay of a connection). We investigate the VC-dimension of networks of spiking neurons where the delays are viewed as programmable parameters and we prove tight bounds for this VC-dimension. Thus we get quantitative estimates for the diversity of functions that a network with fixed architecture can compute with different settings of its delays. In particular, it turns out that a network of spiking neurons with $k$ adjustable delays is able to compute a much richer class of functions than a threshold circuit with $k$ adjustable weights. The results also yield bounds for the number of training examples that an algorithm needs for tuning the delays of a network of spiking neurons. Results about the computational complexity of such algorithms are also given. 
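The role of delays as programmable parameters in temporally coded networks, as described in the abstract above, can be made concrete with a small sketch. The neuron model below is an assumed toy (rectangular postsynaptic pulses of unit height and fixed width), not the formal spiking-neuron model analysed in the preprint: with input spike times and weights held fixed, changing only the connection delays changes whether enough pulses coincide to reach threshold, i.e. which function the neuron computes.

```python
# Toy temporally coded spiking neuron (illustrative model, assumed dynamics).
# A presynaptic spike at time s arriving through delay d contributes a
# rectangular pulse of height 1 on the interval [s + d, s + d + window];
# the neuron fires at the earliest time the summed potential reaches threshold.

def output_spike_time(input_times, delays, threshold=2.0, window=1.0):
    """Earliest time at which enough delayed pulses overlap, else None."""
    arrivals = [s + d for s, d in zip(input_times, delays)]
    # With rectangular pulses, the potential can only reach a new maximum
    # at an arrival instant, so those are the only candidate firing times.
    for t in sorted(arrivals):
        potential = sum(1 for a in arrivals if a <= t < a + window)
        if potential >= threshold:
            return t
    return None  # neuron stays silent

# Two inputs spiking at times 0.0 and 0.5, same weights in both cases:
inputs = [0.0, 0.5]
print(output_spike_time(inputs, delays=[1.0, 0.5]))  # pulses align: fires at 1.0
print(output_spike_time(inputs, delays=[1.0, 3.0]))  # pulses disjoint: None
```

The first delay setting aligns the two pulses so the neuron fires; the second leaves them disjoint and the neuron is silent. Tuning delays is thus a genuine degree of freedom for learning, which is why bounding the VC-dimension with respect to the delays, as the preprint does, is a meaningful measure of the network's functional diversity.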
--
Michael Schmitt
Institute for Theoretical Computer Science
TU Graz, Klosterwiesgasse 32/2, A-8010 Graz, Austria
Tel: +43 316 873-5814, Fax: +43 316 873-5805
E-mail: mschmitt at igi.tu-graz.ac.at
http://www.cis.tu-graz.ac.at/igi/mschmitt/

From young at psy.ox.ac.uk Mon Oct 20 06:46:33 1997
From: young at psy.ox.ac.uk (Steven Young)
Date: Mon, 20 Oct 1997 11:46:33 +0100 (BST)
Subject: Oxford Summer School on Connectionist Modelling
Message-ID: <199710201046.LAA08721@axp01.mrc-bbc.ox.ac.uk>

** CALL FOR ATTENDANCE **

Oxford Summer School on Connectionist Modelling
Department of Experimental Psychology, University of Oxford
19 - 31 July 1998

Applications are invited for participation in a 2-week residential Summer School on techniques in connectionist modelling. The course is aimed primarily at researchers who wish to exploit neural network models in their teaching and/or research, and it will provide a general introduction to connectionist modelling, biologically plausible neural networks and brain function through lectures and exercises on Macintoshes and PCs. The course is interdisciplinary in content, though many of the illustrative examples are taken from cognitive and developmental psychology, and cognitive neuroscience. The instructors with primary responsibility for teaching the course are Kim Plunkett and Edmund Rolls. No prior knowledge of computational modelling will be required, though simple word processing skills will be assumed. Participants will be encouraged to start work on their own modelling projects during the Summer School.

The cost of participation in the Summer School is £950 for Faculty and £750 for Graduate Students. This figure covers the cost of accommodation (bed and breakfast at St. John's College), registration and all literature required for the Summer School. Participants will be expected to cover their own travel and meal costs. A small number of partial bursaries will be available for graduate students.
Applicants should indicate whether they wish to be considered for a graduate student scholarship but are advised to seek their own funding as well, since in previous years the number of graduate student applications has far exceeded the number of scholarships available. Further information about the contents of the course can be obtained from Steven.Young at psy.ox.ac.uk. If you are interested in participating in the Summer School, please contact:

Mrs Sue King
Department of Experimental Psychology
South Parks Road
University of Oxford
Oxford OX1 3UD
Tel: +44 (1865) 271 353
Email: sking at psy.ox.ac.uk

Please send a brief description of your background with an explanation of why you would like to attend the Summer School (one page maximum) no later than 31st January 1998.

--
Computer Officer, IRC for Cognitive Neuroscience, Department of Experimental Psychology, Oxford University

From kompe at fb.sony.de Mon Oct 20 13:29:29 1997
From: kompe at fb.sony.de (Ralf Kompe)
Date: Mon, 20 Oct 1997 19:29:29 +0200
Subject: NEW BOOK: Applications of NNs to speech understanding
Message-ID: <344B94F9.1BF1CF3E@fb.sony.de>

(Sorry, if you receive this message more than once)

To whom it may concern: The following book, which describes the application of neural networks to real-world data, is now available:

Ralf Kompe
Prosody in Speech Understanding Systems
Lecture Notes in Artificial Intelligence, Vol. 1307
Subseries of Lecture Notes in Computer Science
Springer, Berlin, New York 1997 (370 pages)
ISBN 3-540-63580-7

------------------

ABSTRACT

Prosody covers acoustic phenomena of speech which are not specific to phonemes. These are mainly intonation, indicators for phrase boundaries, and accentuation. This information can support the intelligibility of speech or even sometimes disambiguate the meaning.
The aim of this book is to describe algorithms developed by the author for the use of prosodic information on many levels of speech understanding, such as syntax, semantics, dialog, and translation. An implementation of these algorithms has successfully been integrated into the speech-to-speech translation system Verbmobil and into the dialog system Evar. This is the first time that prosody has been used in a fully operational speech understanding and translation system. The Verbmobil prototype system has been publicly demonstrated at several conferences and industrial fairs.

The emphasis of the book lies on the improvement of parsing of spontaneous speech with the help of prosodic clause boundary information. Prosody reduces the parse time of word hypotheses graphs by 92% and the number of parse trees by 96%. This is achieved by integrating several knowledge sources, such as probabilities for prosodic events computed by neural networks and n-grams, in an A*-search for the optimal parse. Without prosody the automatic interpretation of spontaneous speech would be infeasible.

The book gives a comprehensive review of the mathematical and computational background of the algorithms and statistical models useful for the integration of prosody in speech understanding. It also shows unconventional applications of hidden Markov models, stochastic language models, and neural networks; the latter, for example, are used not only for several classification tasks but also for the inverse filtering of speech signals. The book also explains in detail the acoustic-prosodic phenomena of speech and their functional role in communication. In contrast to many other reports, it gives a lot of examples taken from real human-human dialogs; many examples are supported by speech signals accessible over the WWW.
The use of prosodic information relies on the robust extraction of relevant features from digitized speech signals, on adequate labeling of large speech databases for training classifiers, and on the detection of prosodic events; the methods used in Verbmobil and Evar are summarized in this book as well. Furthermore, an overview of these state-of-the-art speech understanding systems is given.

------------------

The book has been awarded the "Dissertation Prize" of the German Institutes for Artificial Intelligence.

Sincerely yours,
Ralf Kompe

__________________________________________________________________________
Ralf Kompe
Sony International (Europe) GmbH . . o o O O
European Research and Development Stuttgart (ERDS) . . o o O
Advanced Developments . . o o O
Stuttgarter Str. 106 . . o o O O
D-70736 Fellbach . . o o O O O
Germany . . o o O O O
. . o o O O
Phone: +49-711-5858-366
Fax: +49-711-58-31-85
E-mail: kompe at fb.sony.de
__________________________________________________________________________

From marcusg at elec.uq.edu.au Mon Oct 20 03:28:29 1997
From: marcusg at elec.uq.edu.au (Marcus Gallagher)
Date: Mon, 20 Oct 1997 17:28:29 +1000
Subject: ACNN'98: 2nd Call for Papers
Message-ID: <344B081D.41C67EA6@elec.uq.edu.au>

-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
The Ninth Australian Conference on Neural Networks
ACNN'98
Brisbane, Australia
Feb 11-13, 1998
Second Announcement and Final Call for Papers
http://www.elec.uq.edu.au/acnn98/
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-

The ninth Australian conference on neural networks will be held in Brisbane on Feb 11-13 1998 at the University of Queensland. ACNN'98 is the annual national meeting of the Australian neural network community. It is a multi-disciplinary meeting and seeks contributions from Neuroscientists, Engineers, Computer Scientists, Mathematicians, Physicists and Psychologists.
Invited Speakers
----------------
The program will include several invited talks, at least 3 of which will be presented by overseas speakers.

Pre-Conference Workshop
-----------------------
A Pre-Conference Workshop will be held on Tuesday 10th February 1998 at the same venue as the conference. The emphasis of the workshop will be on neural network simulation and robotics. Further details (including registration, etc.) will be made available in the near future.

Special Sessions
----------------
The following Special Sessions have been proposed for ACNN'98. These will consist of invited talks followed by regular presentations of relevant papers. Submissions which are relevant to a special session are being invited for inclusion in the session. All papers presented in special sessions will appear in the conference proceedings.

    Rule Extraction & Connectionist Knowledge Representation
    Modelling Higher Cognitive Processes
    Computational Learning Theory
    Modularity in Neural Networks

Submissions
-----------
Papers should be submitted to the ACNN'98 Secretariat as close as possible to final form and must not exceed 5 single pages (2 column format, 10pt or more). The ACNN'98 web page (http://www.elec.uq.edu.au/acnn98/) includes LaTeX style and template files for authors using LaTeX. An example document showing the general layout of submissions is also available via the web page in Postscript and PDF formats - authors are encouraged to use this document as a general guide to the formatting of papers. For full submission details please refer to the web page. Following notification of acceptance results, authors will be required to submit a one-page extended abstract for accepted papers. These abstracts will be distributed to delegates at the conference. Camera-ready papers will then be required for submission shortly after the conference, and proceedings will then be sent out to delegates.
Submissions of initial papers, extended abstracts and camera-ready papers in electronic format (via email) are encouraged, to give authors the maximum available time to prepare submissions.

ACNN'98 will include a special poster session devoted to recent work and work-in-progress. Abstracts are solicited for this session (1 page limit) and may be submitted up to 10 days before the commencement of the conference. They will not be refereed or included in the proceedings, but will be distributed to attendees upon arrival. Students are especially encouraged to participate in this session.

Submission Categories
---------------------
Submissions are encouraged in, but not limited to, the following topics:

Applications: Examples - Signal processing and analysis; Pattern recognition; Speech; Machine vision; Motor control; Robotics; Forecasting; Medical
Architectures and Learning Algorithms: New architectures and learning algorithms; Hierarchy; Modularity; Learning pattern sequences; Information integration; Evolutionary computation; Machine learning
Cognitive Science: Models of perception and pattern recognition; Memory; Concept formation; Problem solving and reasoning; Language acquisition and production
Neuroscience: Vision; Audition; Motor, Somatosensory and Autonomic functions; Synaptic function; Cellular information processing
Theory: Learning; Generalisation; Complexity; Stability; Dynamics
Implementation: Hardware implementations of neural nets; Analog and digital VLSI implementation; Optical implementation

Important Dates
---------------
Paper Submissions Due            Mon 17th Nov 97
Workshop & Tutorial Proposals    Mon 6th Oct 97
Notification of Acceptance       Fri 19th Dec 97
Work In Progress Abstracts Due   Mon 2nd Feb 98
Extended Abstracts Due           Mon 2nd Feb 98

Registration Fees
-----------------
           Students   Regular   After 15th Jan 98
ACNN'98    A$100      A$300     A$400

Organising Committee:
---------------------
Prof. Tom Downs          University of Queensland (Chair)
Dr. Janet Wiles          University of Queensland
Prof. Joachim Diederich  Queensland University of Technology
Dr. P Suganthan          University of Queensland
Dr. Marcus Frean         University of Queensland
Marcus Gallagher         University of Queensland
Robert Andrews           Queensland University of Technology
Ian Wood                 University of Queensland
Peter Stratton           University of Queensland

Contact Information:
--------------------
ACNN'98 Secretariat
Dept of Electrical & Computer Engineering
University of Queensland QLD. 4072.
WWW: http://www.elec.uq.edu.au/acnn98/
email: acnn98 at elec.uq.edu.au

From jbower at bbb.caltech.edu Tue Oct 21 19:56:07 1997
From: jbower at bbb.caltech.edu (James M. Bower)
Date: Tue, 21 Oct 1997 15:56:07 -0800
Subject: CNS*98
Message-ID:

**********************************************************************

With this email we announce CNS*98, the next Computational Neuroscience Annual Meeting, to be held this coming July in Santa Barbara, California. For those of you attending the Society for Neuroscience Annual Meeting this coming week in New Orleans, you can pick up a freshly printed poster for the meeting by ERICA, at the NIMH booth. Those of you already on the CNS meeting mailing list will receive the poster and call for papers through the mail in the next few weeks. Please note in the following call for papers that we have introduced a new all-electronic form of paper submission using a custom-designed JAVA/HTML interface. Additional information on paper submission and the meeting itself is available at: http://www.bbb.caltech.edu/cns-meetings/cns98/

**********************************************************************

CALL FOR PAPERS

Seventh Annual Computational Neuroscience Meeting
CNS*98
July 26 - 30, 1998
Santa Barbara, California

DEADLINE FOR SUMMARIES AND ABSTRACTS: 11:59pm January 26, 1998

This is the seventh annual meeting of an interdisciplinary conference addressing a broad range of research approaches and issues involved in the field of computational neuroscience.
These meetings bring together experimental and theoretical neurobiologists along with engineers, computer scientists, cognitive scientists, physicists, and mathematicians interested in the functioning of biological nervous systems. Peer-reviewed papers, all related to understanding how nervous systems compute, are presented. As in previous years, CNS*98 will equally emphasize experimental, model-based, and more abstract theoretical approaches to understanding neurobiological computation.

The meeting in 1998 will take place at Fess Parker's Double Tree Resort in Santa Barbara, California, and include plenary, contributed, and poster sessions. The first session starts at 9 am, Sunday, July 26th and ends with the annual banquet on Thursday evening, July 30th. There will be no parallel sessions. The meeting will include time for informal workshops focused on current issues in computational neuroscience. Travel funds will be available for students and postdoctoral fellows presenting papers. Child day care will also be available.

Santa Barbara, California is approximately 1 1/2 hours by car from Los Angeles International airport. Shuttles from the airport to Santa Barbara run regularly. In addition, Santa Barbara has its own small airport. The hotel itself is located on the ocean and within walking distance of distinctive downtown Santa Barbara.

**NEW** SUBMISSION INSTRUCTIONS:

With this announcement we solicit the submission of papers for presentation, all of which will be refereed. Peer review will be conducted based on a 1000-word (or less) summary describing the methods, nature, and importance of your results. Authors will be notified of acceptance by the second week of May, 1998. This year, for the first time, submission of papers will be performed electronically using a custom-designed JAVA/HTML interface. Full instructions for submission can be found at the meeting web site: http://www.bbb.caltech.edu/cns-meetings/cns98/.
However, in brief, authors should cut and paste text from their own word processors into the forms available on the web site. It is important that all requested information be provided, including a 100-word abstract for publication in the conference program, all author information, and selection of the appropriate category and theme from the list provided. Authors should especially note the mechanisms used for handling figures and mathematical equations. All submissions will be acknowledged immediately by email. Program committee decisions will be sent to the designated correspondence author only. Submissions will not be considered if they lack category information, abstracts, or author addresses, or if they are late.

FURTHER MEETING CORRESPONDENCE

We would like to strongly encourage authors to make their submissions electronically. However, if you do not have ready access to the internet or a web server, we will send you instructions for paper submission if you contact us either by email to cns98 at smaug.bbb.caltech.edu or at the following address:

CNS*98
Division of Biology 216-76
Caltech
Pasadena, CA 91125

ADDITIONAL INFORMATION concerning the meeting, including hotel and travel arrangements or information on past meetings, can be obtained by:

o Using our on-line WWW information and registration server, URL of: http://www.bbb.caltech.edu/cns/cns98/cns98.html

o ftp-ing to our ftp site.
yourhost% ftp ftp.bbb.caltech.edu
Name (ftp.bbb.caltech.edu:): ftp
Password: yourname at yourhost.yourside.yourdomain
ftp> cd cns98
ftp> ls

o Sending Email to: cns98 at smaug.bbb.caltech.edu

CNS*98 ORGANIZING COMMITTEE:

Co-meeting Chair / Logistics - John Miller, Montana State University
Co-meeting Chair / Finances and Program - Jim Bower, Caltech
Governmental Liaison - Dennis Glanzman, NIMH/NIH
Workshop Organizer - to be announced

1998 Program Committee:
Axel Borst, Max-Planck Inst., Tuebingen, Germany
Leif Finkel, University of Pennsylvania
Anders Lansner, Royal Institute of Technology, Sweden
Linda Larson-Prior, Pennsylvania State University Medical College
David Touretzky, Carnegie Mellon University
Gina Turrigiano, Brandeis University
Ranu Jung, University of Kentucky
Simon Thorpe, CNRS, Toulouse, France

1998 Regional Organizers:
Europe - Erik DeSchutter (Belgium)
Middle East - Idan Segev (Jerusalem)
Down Under - Mike Paulin (New Zealand)
South America - Renato Sabbatini (Brazil)
Asia - Zhaoping Li (MIT)
India - Upinder Bhalla (Bangalore)

====

From stein at biodec.wustl.edu Tue Oct 21 14:57:16 1997
From: stein at biodec.wustl.edu (Paul S.G. Stein)
Date: Tue, 21 Oct 1997 13:57:16 -0500
Subject: NeuronsNetworksMotorBehavior
Message-ID: <9710211857.AA17081@biodec.wustl.edu>

The volume "NEURONS, NETWORKS, AND MOTOR BEHAVIOR" edited by PSG Stein, S Grillner, AI Selverston, and DG Stuart is now available from the MIT Press. See the MIT Press website for additional information: mitpress.mit.edu/book-home.tcl?isbn=0262193906

Many of the authors of chapters of this volume were speakers at the 1995 Tucson conference on Neurons, Networks, and Motor Behavior. The volume can be viewed at the MIT Press booth at the Society for Neuroscience meeting in New Orleans. MIT Press offers a 20% discount for orders placed using forms available at the meeting. A similar discount is available for attendees of the 1995 Tucson conference.
An EMail to 1995 meeting attendees was recently sent by Pat Pierce in Doug Stuart's lab. Contact Pat at DGStuart at U.Arizona.EDU if you did not receive her EMail.

-----The following is additional information about the volume-----

Recent advances in motor behavior research rely on detailed knowledge of the characteristics of the neurons and networks that generate motor behavior. At the cellular level, Neurons, Networks, and Motor Behavior describes the computational characteristics of individual neurons and how these characteristics are modified by neuromodulators. At the network and behavioral levels, the volume discusses how network structure is dynamically modulated to produce adaptive behavior. Comparisons of model systems throughout the animal kingdom provide insights into general principles of motor control. Contributors describe how networks generate such motor behaviors as walking, swimming, flying, scratching, reaching, breathing, feeding, and chewing. An emerging principle of organization is that nervous systems are remarkably efficient in constructing neural networks that control multiple tasks and dynamically adapt to change. The volume contains six sections: selection and initiation of motor patterns; generation and formation of motor patterns: cellular and systems properties; generation and formation of motor patterns: computational approaches; modulation and reconfiguration; short-term modulation of pattern generating circuits; and sensory modification of motor output to control whole body orientation.

-----TABLE OF CONTENTS OF NEURONS, NETWORKS, AND MOTOR BEHAVIOR-----

SELECTION AND INITIATION OF MOTOR PATTERNS
1. Selection and Initiation of Motor Behavior
   Sten Grillner, Apostolos P. Georgopoulos, and Larry M. Jordan
2. The Role of Population Coding in the Control of Movement
   David L. Sparks, William B. Kristan, Jr., and Brian K. Shaw
3. Neural Substrates for Initiation of Startle Responses
   Roy E. Ritzmann and Robert C. Eaton

GENERATION AND FORMATION OF MOTOR PATTERNS: CELLULAR AND SYSTEMS PROPERTIES
4. Basic Building Blocks of Vertebrate Spinal Central Pattern Generators
   Ole Kiehn, Jorn Hounsgaard, and Keith T. Sillar
5. Neural and Biomechanical Control Strategies for Different Forms of Vertebrate Hindlimb Motor Tasks
   Paul S.G. Stein and Judith L. Smith
6. Spinal Networks and Sensory Feedback in the Control of Undulatory Swimming in Lamprey
   Peter Wallen
7. Spinal Networks Controlling Swimming in Hatchling Xenopus Tadpoles
   Alan Roberts, Steve R. Soffe, and Ray Perrins
8. Role of Ionic Currents in the Operation of Motor Circuits in the Xenopus Embryo
   Nicholas Dale
9. Integration of Cellular and Network Mechanisms in Mammalian Oscillatory Motor Circuits: Insights from the Respiratory Oscillator
   Jeffrey C. Smith
10. Shared Features of Invertebrate Central Pattern Generators
    Allen I. Selverston, Yuri V. Panchin, Yuri I. Arshavsky, and Grigori N. Orlovsky
11. Intrinsic Membrane Properties and Synaptic Mechanisms in Motor Rhythm Generators
    Ronald L. Calabrese and Jack L. Feldman
12. Organization of Neural Networks for the Control of Posture and Locomotion in an Insect
    Malcolm Burrows

GENERATION AND FORMATION OF MOTOR PATTERNS: COMPUTATIONAL APPROACHES
13. How Computation Aids in Understanding Biological Networks
    Eve Marder, Nancy Kopell, and Karen Sigvardt
14. Dynamical Systems Analyses of Real Neuronal Networks
    John Guckenheimer and Peter Rowat
15. Realistic Modeling of Burst Generation and Swimming in Lamprey
    Anders Lansner, Orjan Ekeberg, and Sten Grillner
16. Integrate-and-Fire Simulations of Two Molluscan Neural Circuits
    William N. Frost, James R. Lieb, Jr., Mark J. Tunstall, Brett D. Mensh, and Paul S. Katz

MODULATION AND RECONFIGURATION
17. Chemical Modulation of Vertebrate Motor Circuits
    Keith T. Sillar, Ole Kiehn, and Norio Kudo
18. Modulation of Neural Circuits by Steroid Hormones in Rodent and Insect Model Systems
    Janis C. Weeks and Bruce McEwen
19. Chemical Modulation of Crustacean Stomatogastric Pattern Generator Networks
    Ronald M. Harris-Warrick, Deborah J. Baro, Lisa M. Coniglio, Bruce R. Johnson, Robert M. Levini, Jack H. Peck, and Bing Zhang
20. Reconfiguration of the Peripheral Plant during Various Forms of Feeding Behaviors in the Mollusc Aplysia
    Irving Kupfermann, Vladimir Brezina, Elizabeth C. Cropper, Dillip Deodhar, William C. Probst, Steven C. Rosen, Ferdinand S. Vilim, and Klaudiusz R. Weiss

SHORT-TERM MODULATION OF PATTERN GENERATING CIRCUITS
21. Sensory Modulation of Pattern Generating Circuits
    Keir G. Pearson and Jan-Marino Ramirez
22. Presynaptic Mechanisms during Rhythmic Activity in Vertebrates and Invertebrates
    Michael P. Nusbaum, Abdeljabbar El Manira, Jean-Pierre Gossard, and Serge Rossignol

SENSORY MODIFICATION OF MOTOR OUTPUT TO CONTROL WHOLE BODY ORIENTATION
23. Control of Body Orientation and Equilibrium in Vertebrates
    Jane M. Macpherson, Tatiana G. Deliagina, and Grigori N. Orlovsky
24. Centrally-Patterned Behavior Generates Sensory Input for Adaptive Control
    Mark A. Willis and Edmund A. Arbas
25. Oculomotor Control in Insects: From Muscles to Elementary Motion Detectors
    Nicholas J. Strausfeld

________________________________________________________________
Paul S.G. Stein
EMail reply to: STEIN at BIODEC.WUSTL.EDU
Voice Phone: 314-935-6824
FAX Phone: 314-935-4432
Mail: Dept Biology, Washington Univ, St Louis, MO 63130 USA
Home Page: http://biosgi.wustl.edu/faculty/stein.html
Book Website for Neurons, Networks, and Motor Behavior: http://www-mitpress.mit.edu/book-home.tcl?isbn=0262193906
Conference Website for Neurons, Networks, and Motor Behavior: http://www.physiol.arizona.edu/CELL/Department/Conferences.html
________________________________________________________________

From vnissen at gwdg.de Wed Oct 22 10:50:20 1997
From: vnissen at gwdg.de (Volker Nissen)
Date: Wed, 22 Oct 1997 14:50:20 +0000
Subject: CfP 4. Symp. Softcomputing
Message-ID:

This call was sent to several lists.
We apologize should you receive it multiple times. --------------------------------- CALL FOR PAPERS ================== 4. SYMPOSIUM SOFTCOMPUTING "Softcomputing in Production and Material Management" Neural Nets, Fuzzy Set Theory, Evolutionary Algorithms University of Goettingen, Thu. 12. March 1998 (10 am - 6 pm) THEME: The theme of this year's symposium is softcomputing applications in production and material management. Softcomputing as a technical term includes the complementary core areas of artificial neural networks, fuzzy set theory, and evolutionary algorithms. While softcomputing is actively being applied in the technical sectors, we believe that the great potential of softcomputing for the management domain has not yet been sufficiently appreciated in industry. The Goettingen symposium was established to serve as a link between science and practice, focussing on innovative applications and know-how transfer. Possible topics of contributions include but are not limited to: * production planning * lot sizing and scheduling * cutting problems * inventory control * machine diagnosis * data mining * maintenance planning * system layout * line balancing * process control and optimization * waste management ORGANISATION: The symposium is strictly application-orientated. It is organised by members of the "Workgroup on Softcomputing in Business", jointly with the bureau of technology transfer of the University of Goettingen. It takes place at the University of Goettingen, central lecturing building. All three previous symposia received very positive judgements from scientists and practitioners. DEADLINES: 12. Jan. 1998 Deadline for extended abstract submission (ca. 2 pages, e-mail submission OK) 19. Jan. 1998 Notification of acceptance 16. Feb. 1998 Camera-ready full papers due 27. Feb. 1998 Registration deadline and latest date to pay conference fee 12.
Mar 1998 Symposium CONFERENCE FEE: The conference fee is DM 100,- for speakers and DM 200,- for other participants, and includes the proceedings, lunch and coffee breaks. Please pay to our account no. 9242 058 at Nord LB (German bank code 250 500 00). Account holder is the "Foerderverein FH BS-WF". Please mention "4. Symposium Softcomputing" with your payment. We accept euro cheques, but unfortunately cannot accept credit cards. PROCEEDINGS: A proceedings volume will be distributed at the conference. In preparing your manuscript, please follow these format requirements: printing area 17 x 24 cm, 12 pt Times Roman or similar font, single line spacing, no page numbering, max. 16 pages, title in 18 pt Arial bold and centered. Below, please state the authors' names and affiliations (centered). Please include a short abstract and key words. Contributions in German or English. Please send two copies of your manuscript in reproducible form as well as the formatted file on disk to the address stated below. At least one of the authors of an accepted paper is required to participate in the symposium and present the paper. Talks should not exceed 40 minutes (including 10 minutes for discussion). SYMPOSIUM WWW-PAGE: http://www.wi1.wiso.uni-goettingen.de/pa/afn/4symp_e.htm CONTACT: PLEASE SEND SUBMISSIONS TO: 4. Symposium Softcomputing Dipl.-Phys. Martin Tietze Universitaet Goettingen Abt. Wirtschaftsinformatik I Platz der Goettinger Sieben 5 D-37073 Goettingen Germany REGISTRATION ADDRESS: Dipl.-Kfm. Dipl.-Ing. Detlef Puchert FH Braunschweig-Wolfenbuettel Technologietransfer-Kontaktstelle Salzdahlumer Str. 46/48 38302 Wolfenbuettel Germany E-mail: d.puchert at verwaltung.fh-wolfenbuettel.de FOR QUESTIONS VIA E-MAIL PLEASE REFER TO: vnissen at gwdg.de (Dr.
Volker Nissen) From arbib at pollux.usc.edu Wed Oct 22 11:52:43 1997 From: arbib at pollux.usc.edu (Michael Arbib) Date: Wed, 22 Oct 1997 08:52:43 -0700 Subject: Neural Organization Message-ID: The volume "Neural Organization: Structure, Function, and Dynamics" by Michael A. Arbib, Peter Erdi, and Janos Szentagothai is now available from the MIT Press. See the MIT Press website for additional information: http://mitpress.mit.edu/book-home.tcl?isbn=026201159X The volume can be viewed at the MIT Press booth at the Society for Neuroscience meeting in New Orleans. MIT Press offers a 20% discount for orders placed using forms available at the meeting. -----The following is additional information about the volume----- "Neural Organization: Structure, Function, and Dynamics" Michael A. Arbib, Peter Erdi, and Janos Szentagothai ISBN 0-262-01159-X 328 pp. (8.5 x 11 double column), 163 illus. $60.00 (cloth) In Neural Organization, Arbib, Erdi, and Szentagothai integrate structural, functional, and dynamical approaches to the interaction of brain models and neurobiological experiments. Both structure-based "bottom-up" and function-based "top-down" models offer coherent concepts by which to evaluate the experimental data. The goal of this book is to point out the advantages of a multidisciplinary, multistrategied approach to the brain. Part I of Neural Organization provides a detailed introduction to each of the three areas of structure, function, and dynamics. Structure refers to the anatomical aspects of the brain and the relations between different brain regions. Function refers to skills and behaviors, which are explained by means of functional schemas and biologically based neural networks. Dynamics refers to the use of a mathematical framework to analyze the temporal change of neural activities and synaptic connectivities that underlie brain development and plasticity--in terms of both detailed single-cell models and large-scale network models.
In part II, the authors show how their systematic approach can be used to analyze specific parts of the nervous system--the olfactory system, hippocampus, thalamus, cerebral cortex, cerebellum, and basal ganglia--as well as to integrate data from the study of brain regions, functional models, and the dynamics of neural networks. In conclusion, they offer a plan for the use of their methods in the development of cognitive neuroscience. ********************************* Michael A. Arbib USC Brain Project University of Southern California Los Angeles, CA 90089-2520, USA arbib at pollux.usc.edu (213) 740-9220; Fax: (213) 740-5687 http://www-hbp.usc.edu/HBP/ From cns-cas at cns.bu.edu Wed Oct 22 13:37:16 1997 From: cns-cas at cns.bu.edu (Boston University - Cognitive and Neural Systems) Date: Wed, 22 Oct 1997 13:37:16 -0400 Subject: CALL FOR PAPERS - 2nd International Conference on CNS Message-ID: <3.0.3.32.19971022133716.0109b4cc@cns.bu.edu> ****CALL FOR PAPERS**** SECOND INTERNATIONAL CONFERENCE ON COGNITIVE AND NEURAL SYSTEMS May 27-30, 1998 Sponsored by Boston University's Center for Adaptive Systems and Department of Cognitive and Neural Systems with financial support from DARPA and ONR http://cns-web.bu.edu/cns-meeting/ HOW DOES THE BRAIN CONTROL BEHAVIOR? HOW CAN TECHNOLOGY EMULATE BIOLOGICAL INTELLIGENCE? The conference will include invited lectures and contributed lectures and posters by experts on the biology and technology of how the brain and other intelligent systems adapt to a changing world. The conference is aimed at researchers and students of computational neuroscience, connectionist cognitive science, artificial neural networks, neuromorphic engineering, and artificial intelligence. A single oral or poster session enables all presented work to be highly visible. The abstract submission schedule encourages presentation of the latest results. Costs are kept at a minimum without compromising the quality of meeting handouts and social events.
Although Memorial Day falls on Saturday, May 30, it is observed on Monday, May 25, 1998. CONFIRMED INVITED SPEAKERS TUTORIALS: WEDNESDAY, MAY 27, 1998 (to be announced) KEYNOTE SPEAKERS: Stephen Grossberg, Adaptive resonance theory: From biology to technology Ken Nakayama, Psychological studies of visual attention INVITED SPEAKERS: THURSDAY, MAY 28, 1998: Azriel Rosenfeld, Understanding object motion Takeo Kanade, Computational sensors: Further progress Tomaso Poggio, Sparse representations for learning Gail Carpenter, Applications of ART neural networks Rodney Brooks, Experiments in development models for a neurally controlled humanoid robot Lee Feldkamp, Recurrent networks: Promise and practice FRIDAY, MAY 29, 1998: J. Anthony Movshon, Contrast gain control in the visual cortex Hugh Wilson, Global processes at intermediate levels of form vision Mel Goodale, Biological teleassistance: Perception and action in the human visual system Ken Stevens, The categorical representation of speech and its traces in acoustics and articulation Carol Fowler, Production-perception links in speech Frank Guenther, A theoretical framework for speech acquisition and production SATURDAY, MAY 30, 1998: Howard Eichenbaum, The hippocampus and mechanisms of declarative memory Earl Miller, Neural mechanisms for working memory and cognition Bruce McNaughton, Neuronal population dynamics and the interpretation of dreams Richard Thompson, The cerebellar circuitry essential for classical conditioning of discrete behavioral responses Daniel Bullock, Cortical control of arm movements Andrew Barto, Reinforcement learning applied to large-scale dynamic optimization problems There will be contributed oral and poster sessions on each day of the conference. 
CALL FOR ABSTRACTS Contributors are requested to list a first and second choice from among the topics below in their cover letter, and to say whether it is biological (B) or technological (T) work, when they submit their abstract, as described below. Topics: vision; object recognition; image understanding; audition; speech and language; unsupervised learning; supervised learning; reinforcement and emotion; cognition, planning, and attention; spatial mapping and navigation; neural circuit models; neural system models; mathematics of neural systems; robotics; neuromorphic VLSI; hybrid systems (fuzzy, evolutionary, digital); industrial applications; other. Example: first choice: vision (T); second choice: neural system models (B). CALL FOR ABSTRACTS: Contributed Abstracts must be received, in English, by January 31, 1998. Notification of acceptance will be given by February 28, 1998. A meeting registration fee of $45 for regular attendees and $30 for students must accompany each Abstract. See Registration Information for details. The fee will be returned if the Abstract is not accepted for presentation and publication in the meeting proceedings. Registration fees of accepted abstracts will be returned on request only until April 15, 1998. Each Abstract should fit on one 8.5" x 11" white page with 1" margins on all sides, single-column format, single-spaced, Times Roman or similar font of 10 points or larger, printed on one side of the page only. Fax submissions will not be accepted. Abstract title, author name(s), affiliation(s), mailing and email address(es) should begin each Abstract. An accompanying cover letter should include: Full title of Abstract; corresponding author and presenting author name, address, telephone, fax, and email address; and preference for oral or poster presentation. (Talks will be 15 minutes long. Posters will be up for a full day. Overhead, slide, and VCR facilities will be available for talks.)
Abstracts which do not meet these requirements or which are submitted with insufficient funds will be returned. The original and 3 copies of each Abstract should be sent to: Cynthia Bradford, Boston University, Department of Cognitive and Neural Systems, 677 Beacon Street, Boston, MA 02215. REGISTRATION INFORMATION: Early registration is recommended. To register, please fill out the registration form below. Student registrations must be accompanied by a letter of verification from a department chairperson or faculty/research advisor. If accompanied by an Abstract or if paying by check, mail to the address above. If paying by credit card, mail as above, or fax to (617) 353-7755, or email to cindy at cns.bu.edu. The registration fee will help to pay for a reception, 6 coffee breaks, and the meeting proceedings. STUDENT FELLOWSHIPS: Fellowships for PhD candidates and postdoctoral fellows are available to cover meeting travel and living costs. The deadline to apply for fellowship support is January 31, 1998. Applicants will be notified by February 28, 1998. Each application should include the applicant's CV, including name; mailing address; email address; current student status; faculty or PhD research advisor's name, address, and email address; relevant courses and other educational data; and a list of research articles. A letter from the listed faculty or PhD advisor on official institutional stationery should accompany the application and summarize how the candidate may benefit from the meeting. Students who also submit an Abstract need to include the registration fee with their Abstract. Reimbursement checks will be distributed after the meeting. 
________________________________________________________________ REGISTRATION FORM Second International Conference on Cognitive and Neural Systems Department of Cognitive and Neural Systems Boston University, 677 Beacon Street Boston, Massachusetts 02215 Tutorials: May 27, 1998, Meeting: May 28-30, 1998 FAX: (617) 353-7755 (Please Type or Print) Mr/Ms/Dr/Prof: __________________________________________________ Name: ___________________________________________________________ Affiliation: ____________________________________________________ Address: ________________________________________________________ City, State, Postal Code: _______________________________________ Phone and Fax: __________________________________________________ Email: _________________________________________________________ The conference registration fee includes the meeting program, reception, two coffee breaks each day, and meeting proceedings. The tutorial registration fee includes tutorial notes and two coffee breaks. CHECK ONE: ( ) $70 Conference plus Tutorial (Regular) ( ) $45 Conference plus Tutorial (Student) ( ) $45 Conference Only (Regular) ( ) $30 Conference Only (Student) ( ) $25 Tutorial Only (Regular) ( ) $15 Tutorial Only (Student) Method of Payment: (Please FAX or mail) [ ] Enclosed is a check made payable to "Boston University". Checks must be made payable in US dollars and issued by a US correspondent bank. Each registrant is responsible for any and all bank charges. [ ] I wish to pay my fees by credit card (MasterCard, Visa, or Discover Card only).
Name as it appears on the card: __________________________________ Type of card: _____________ Account number: ______________________ Signature: ____________________________ Expiration date: _________ From S.Singh-1 at plymouth.ac.uk Wed Oct 22 16:27:26 1997 From: S.Singh-1 at plymouth.ac.uk (Sameer Singh) Date: Wed, 22 Oct 1997 16:27:26 BST Subject: PhD Studentship available Message-ID: <53153C87C1F@cs_fs15.csd.plym.ac.uk> University of Plymouth, UK School of Computing PhD Research Studentship Available Salary: See below Applications are now invited for a PhD studentship in the School of Computing in the area of unstructured information processing and extraction using intelligent techniques such as neural networks. The research project will be carried out in collaboration with Ranco Controls Ltd., Plymouth, a world leading manufacturer of control equipment. The project will also collaborate with the School of Electronic, Communication and Electrical Engineering. You should have a background in computer science or engineering with a good honours degree, and preferably with a Masters qualification. The project requires good knowledge in areas including information systems, artificial intelligence and C/C++. The studentship covers the tuition fee and a maintenance of Pounds 5510 per year. Application forms and further details are available from the School Office on +44-1752- 232 541. Further information and informal enquiries on the project should be directed to Dr. Sameer Singh, School of Computing, University of Plymouth, UK (tel: +44-1752-232 612, fax: +44-1752-232 540, e-mail: s1singh at plym.ac.uk). UK and EU country residents are especially encouraged to apply. 
Closing date: Completed application forms should arrive by 7th November 1997. Promoting equal opportunities A Leading Centre for Teaching and Research From Dave_Touretzky at cs.cmu.edu Thu Oct 23 00:17:28 1997 From: Dave_Touretzky at cs.cmu.edu (Dave_Touretzky@cs.cmu.edu) Date: Thu, 23 Oct 1997 00:17:28 -0400 Subject: faculty position in computational neuroscience Message-ID: <13994.877580248@skinner.boltz.cs.cmu.edu> The Computer Science Department at Carnegie Mellon University, in collaboration with the Center for the Neural Basis of Cognition (CNBC), is soliciting applications for a tenure-track faculty position in computational neuroscience. The successful applicant will be expected to carry out a research program in the theoretical analysis of computational properties of real neural systems, and how these properties contribute to aspects of cognition such as perception, memory, language, or the planning and coordination of action. Theoretically-oriented CNBC faculty have many opportunities to collaborate with experimentalists using a variety of techniques, including primate and rat electrophysiology, MRI and PET functional brain imaging, and neuropsychological assessment of clinical populations. Researchers with a strong background in mathematical analysis, dynamical systems theory, probabilistic and statistical approaches, or other analytical techniques relevant to the study of brain function are especially encouraged to apply. The deadline for initial review of applications is February 1, 1998, but applications arriving after that date will be considered until the position is filled. Send a vita, a statement of research interests, copies of relevant publications, and three letters of reference to: Dr. David S. Touretzky, Computer Science Department Carnegie Mellon University Pittsburgh, PA 15213-3891 Additional information about the CNBC may be found at http://www.cnbc.cmu.edu. Carnegie Mellon is an Equal Opportunity Employer.
From atick at monaco.rockefeller.edu Thu Oct 23 08:44:49 1997 From: atick at monaco.rockefeller.edu (Joseph Atick) Date: Thu, 23 Oct 1997 08:44:49 -0400 Subject: Network:CNS Table of Contents Message-ID: <9710230844.ZM8556@monaco.rockefeller.edu> NETWORK: COMPUTATION IN NEURAL SYSTEMS Table of Contents Volume 8, Issue 4, November 1997 Pages: R77--R109, 355--466 TOPICAL REVIEW R77 Basal ganglia: structure and computations J Wickens PAPERS 355 Networks of spiking neurons can emulate arbitrary Hopfield nets in temporal coding W Maass and T Natschläger 373 Dynamics of a recurrent network of spiking neurons before and following learning D J Amit and N Brunel 405 Information processing by a noisy binary channel E Korutcheva, N Parga and J-P Nadal 425 Generalisation and discrimination emerge from a self-organising componential network: a speech example C J S Webber 441 Unsupervised discovery of invariances S Eglen, A Bray and J Stone 453 Second-hand supervised learning in Hebbian perceptrons M A P Idiart 465 AUTHOR INDEX (with titles), Volume 8 -- Joseph J. Atick Rockefeller University 1230 York Avenue New York, NY 10021 Tel: 212 327 7421 Fax: 212 327 7422 From pbolland at lbs.ac.uk Fri Oct 24 09:44:06 1997 From: pbolland at lbs.ac.uk (Peter Bolland) Date: Fri, 24 Oct 1997 13:44:06 UTC Subject: Computational Finance 97: Call for participation Message-ID: <4CAB1457A11@deimos.lbs.ac.uk> ANNOUNCEMENT AND CALL FOR PARTICIPATION PRELIMINARY PROGRAMME & REGISTRATION FORM _________________________________________________________ COMPUTATIONAL FINANCE 1997 _________________________________________________________ The Fifth International Conference on NEURAL NETWORKS IN THE CAPITAL MARKETS Monday-Wednesday, December 15-17, 1997 London Business School, London, England. After four years of continuous success and evolution, NNCM has emerged as a truly multi-disciplinary international conference.
Born out of neurotechnology, NNCM now provides an international focus for innovative research on the application of a multiplicity of advanced decision technologies to many areas of financial engineering. It draws upon theoretical advances in financial economics and robust methodological developments in the statistical, econometric and computer sciences. The fifth NNCM conference will be held in London, December 15-17, 1997, under the new title COMPUTATIONAL FINANCE 1997 to reflect its multi-disciplinary nature. COMPUTATIONAL FINANCE 1997 is a research meeting where original, high-quality contributions are presented and discussed. In addition, a day of introductory tutorials (Monday, December 15) will be included to familiarise participants from different backgrounds with the financial and methodological aspects of the field. Location: The conference will be held at London Business School, which is situated near Regent's Park, London and is a short walk from Baker Street Underground Station. Further directions including a map will be sent to all registrants. Registration and Mailing List: if you wish to be added to the mailing list or register for COMPUTATIONAL FINANCE 1997, please send your postal address, e-mail address, and fax number to the secretariat. __________________________________________________ Secretariat: __________________________________________________ Ms Deborah King, London Business School, Sussex Place, Regent's Park, London NW1 4SA, UK. E-mail: boguntula at lbs.ac.uk. Phone (+44) (0171)-262 50 50, Fax (+44) (0171) 724 78 75.
__________________________________________________ WEB PAGE __________________________________________________ For more information on COMPUTATIONAL FINANCE 1997, please visit the NNCM web site at London Business School, homepage http://www.lbs.lon.ac.uk/desci/nncmhome.html registration form http://www.lbs.lon.ac.uk/desci/nncm97reg.html preliminary programme http://www.lbs.lon.ac.uk/desci/nncm97prog.html __________________________________________________ Preliminary programme: __________________________________________________ Computational Finance 1997 15-17th December 1997 London Business School __________________________________________________ Tutorials: Monday 15th December __________________________________________________ 9.15 - 10.15 Computational Finance: Challenges and Prospects Dr. Domingo Tavella Align Risk Analytics 10.45 - 12.45 An Introduction to Yield Curve Models Dr. Piotr Karazinski Citibank, UK 14.00 - 15.45 Uncertainty Analysis and Model Identification Prof. Chris Chatfield University of Bath, UK 16.15 - 18.00 Structural Time Series Analysis and the Kalman Filter Prof. Andrew Harvey Cambridge University, UK __________________________________________________ Day One: Tuesday 16th December __________________________________________________ ----------------Oral Session 1: Market Dynamics---------------- Invited Talk: Volatility Forecasting and Risk Management Prof. Francis X. Diebold, (University of Pennsylvania) Cross bicorrelations in high-frequency exchange rates: testing and forecasting. C. Brooks (ISMA Centre, University of Reading, UK), M. J. Hinich (University of Texas) Volume and return in the stock market: stability analysis and forecasting implications. J. del Hoyo and J. G. Llorente, (Madrid University) The multiplicative statistical mechanics of stock markets S. Solomon, (Hebrew University, Jerusalem) Time-varying risk premia from an asset allocation perspective - a GMDH analysis M. Steiner and S. 
Schneider, (Augsburg University, Germany) A data matrix to investigate independence, over-reaction and/or shock persistence in financial data R. Dacco and S. Satchell, (University of Cambridge, UK) With discussion by B. LeBaron, (MIT) ------Oral Session 2: Trading and Arbitrage Strategies--------------- Nonlinear equilibrium dynamics and investment strategy evaluation with cointegration R. N. Markellos, (Loughborough University, UK) Modelling asset prices using a portfolio of cointegration models approach A. N. Burgess, (London Business School) Technical analysis and central bank intervention C. Neely (Federal Reserve Bank of St. Louis), P. Weller, (University of Iowa, USA) With discussion by A. Timmerman (UCSD) Multitask learning in a neural VEC approach for exchange rate forecasting F. Rauscher, (Daimler Benz Research) Immediate and future rewards: reinforcement learning for trading systems and portfolios J. E. Moody, M. Saffell, Y. Liao, L. Wu (Oregon Graduate Institute, Portland) An evolutionary bootstrap method for selecting dynamic trading strategies B. LeBaron, (MIT) With discussion by A. S. Weigend (Stern Business School, New York University) ------------------------Poster Session 1---------------------------- __________________________________________________ Day Two: Wednesday 17th December __________________________________________________ ---Oral Session 3: Volatility Modelling and Option Pricing------- Invited Talk: Modelling S&P 100 volatility: the information content of stock returns Prof. S. Taylor, (Lancaster University, UK) Forecasting properties of neural network generated volatility estimates P. Ahmed, (University of North Carolina) Bootstrapping GARCH(1,1) models G. Maerker (Institut fur Techno- und Wirtschaftsmathematik, Kaiserslautern, Germany) Pricing and hedging derivative securities with neural networks and the homogeneity hint R. Gencay, (University of Windsor, Canada) With discussion by Y.
Abu-Mostafa (CalTech) Recovering risk aversion from option prices and realised returns J. Jackwerth and M. Rubinstein, (Haas Business School, University of California, Berkeley) With discussion by A. Neuberger (London Business School) --------Oral Session 4: Term Structure and Factor Models------------- Kalman filtering of generalised Vasicek term-structure models S. H. Babbs and K. B. Nowman, (First National Bank of Chicago, London, UK) Modelling the term structure of interest rates: a neural network perspective J. T. Connor and N. Towers, (London Business School) A non-parametric test for nonlinear cointegration J. Breitung, (Humboldt University, Berlin) With discussion by H. White (UCSD) Unconstrained and constrained time-varying factor sensitivities in equity investment management Y. Bentz and J. T. Connor, (London Business School) Discovering structure in finance using independent component analysis D. Back, (Frontier Research Program, RIKEN, Japan), A. S. Weigend, (Stern Business School, New York University) ---------------------Poster Session 2-------------------------------- __________________________________________________ Posters (oral presentations plus those below) __________________________________________________ Classification of sector allocation in the German stock market, E. Steurer, (Daimler-Benz Research, Germany) Are neural network and econometric forecasts good for trading? The case of the Italian stock index future, R. Bramante, R. Colombo, G. Gabbi, (University Bocconi, Milan) Interest rates structure dynamics: a non-parametric approach, M. Cottrell, E. Bodt and P. Gregoire (Paris I University) Modifying the distributional properties of financial ratios to improve the performance of linear and non-linear discriminant systems, G. Albanis, J. A. Long and M.
Hiscock (City University, London, UK) Time series techniques and neural networks: a combined framework to forecast USD/DEM exchange rate, F. Bourgoin, (Millenium Global Investments Limited, London), A. Vigier (Decalog, Paris) Credit assessment using evolutionary MLP networks, A. Carvalho, E.F.M. Filho, A. B. Matias (Sao Paulo University, Brazil) Exploring corporate bankruptcy with two-level self-organising map, K. Kiviluoto, (Helsinki University of Technology, Finland), P. Gergius, (Kera Ltd, Finland) Incorporating prior knowledge about financial markets through neural multi-task learning, K. Bartlmae, S. Gutjahr and G. Nakhaeizadeh (University of Karlsruhe, Germany) Predicting time-series with a committee of independent experts based on fuzzy rules, M. Rast, (Ludwig-Maximilians-Universitat, Munich, Germany) Multiscale analysis of time-series based on a neuro-fuzzy chaos methodology applied to financial data, N. K. Kasabov, R. Kozma, (University of Otago, N.Z.) Estimating and forecasting non-stationary financial data with IIR-filters and CT (composed threshold) models, M. Wildi, (Switzerland) Probabilistic neural network for company failure prediction, Z. Yang, H. James, A. Packer (University of Portsmouth, UK) An improved parametric density model for risk analysis of FX returns, J. Utans (London Business School), P. Sondhi (Citibank) Currency forecasting using recurrent RBF networks optimised by genetic algorithms, A. Adamopoulos, A. Andreou, et al., (University of Patras, Greece) On the market timing ability of neural networks: an empirical study testing the forecasting performance, T. H. Hann, (Karlsruhe University, Germany), J. Hofmeister (University of Ulm) Exchange rate trading using a fast retraining procedure for generalised RBF networks, D. R. Dersch, B. Flower, and S. J. Pickard (Crux Cybernetics, Australia) Prediction of volatility and option prices using an extended Kalman filter, V. P. Kumar and S.
Mukherjee, (MIT) The ex-ante classification of take-over targets using neural networks, D. Fairclough, (Buckinghamshire College, UK), J. Hunter (Brunel University, UK) Portfolio optimisation with cap weight restrictions, N. Wagner, (Bayerische Vereinsbank AG, Munich) Management of a futures portfolio using conditional mean-variance estimates from wavelet-encoding neural networks, D. L. Toulson and S. P. Toulson, (Intelligent Financial Systems Ltd., London) Selecting relative value stocks with nonlinear cointegration, C. Kollias, (Hughes Financial Analytics), K. Metaxas, (University of Athens) Dynamic hedging of property liability portfolios in a multiple objective framework, G. H. Dash Jnr, R. C. Hanumara, N. Kajiji (University of Rhode Island, US) Model complexity versus transparency: an empirical comparison of the tradeoffs among different trading models, R. Madhavan, V. Dhar and A. S. Weigend (Stern Business School, New York University) A constrained hybrid approach to option pricing, P. Lajbcygier (Monash University, Australia), J. T. Connor (London Business School) Predicting corporate financial distress using quantitative and qualitative data: a comparison of traditional and collapsible neural networks, Q. Booker, R. E. Dorsey, and J. D. Johnson, (University of Mississippi, USA) State space ARCH: forecasting volatility with a stochastic coefficient model, A. Veiga, M. C. Medeiros and C. Fernandes (PUC, Rio de Janeiro) Using option prices to recover probability distributions, F. Gonzales-Miranda (Swedish School of Economics, Helsinki) A. N. Burgess (London Business School) Multivariate mutual funds analysis using neural networks, A. d'Almeida Monteiro, C. E. Pedreira and C. P. Samanez (PUC, Rio de Janeiro, Brazil) Cointegration by MCA and modular MCA, L. Xu and W. M. Leung (Chinese University, Hong Kong) On the complexity of stock returns, M. A. Kaboudan, (Penn State University, US) Modelling volatility using state-space models, J.
Timmer (Freiburg University, Germany), A. S. Weigend (Stern School of Business, New York University) __________________________________________________ COMPUTATIONAL FINANCE 97 Registration Form December 15-17, 1997 Name:___________________________________________________ Affiliation:________________________________________________ Mailing Address: __________________________________________ ________________________________________________________ Telephone:_______________________________________________ ***Please circle the applicable fees and write the total below**** Main Conference (December 16-17): Registration fee £450 Discounted fee for academicians £250 (letter on university letterhead required) Discounted fee for full-time students £100 (letter from registrar or faculty advisor required) Tutorials (December 15): You must be registered for the main conference in order to register for the tutorials. Morning Session Only £100 Afternoon Session Only £100 Both Sessions £150 Full-time students £50 (letter from registrar or faculty advisor required) TOTAL: £ Payment may be made by: (please tick) * Check payable to London Business School * VISA * Access * American Express Card Number:___________________________________ __________________________________________________ From aapo at myelin.hut.fi Fri Oct 24 10:33:51 1997 From: aapo at myelin.hut.fi (Aapo Hyvarinen) Date: Fri, 24 Oct 1997 17:33:51 +0300 Subject: Two TechReps on ICA and PP Message-ID: <199710241433.RAA02744@myelin.hut.fi> The following technical reports on independent component analysis and projection pursuit are available at: http://www.cis.hut.fi/~aapo/pub.html Aapo Hyvarinen: INDEPENDENT COMPONENT ANALYSIS BY MINIMIZATION OF MUTUAL INFORMATION Independent component analysis (ICA) is a statistical method for transforming an observed multidimensional random vector into components that are statistically as independent from each other as possible.
In this paper, the linear version of the ICA problem is approached from an information-theoretic viewpoint, using Comon's framework of minimizing mutual information of the components. Using maximum entropy approximations of differential entropy, we introduce a family of new contrast (objective) functions for ICA, which can also be considered 1-D projection pursuit indexes. The statistical properties of the estimators based on such contrast functions are analyzed under the assumption of the linear mixture model. It is shown how to choose optimal contrast functions according to different criteria. Novel algorithms for maximizing the contrast functions are then introduced. Hebbian-like learning rules are shown to result from gradient descent methods. Finally, in order to speed up the convergence, a family of fixed-point algorithms for maximization of the contrast functions is introduced.

Aapo Hyvarinen: NEW APPROXIMATIONS OF DIFFERENTIAL ENTROPY FOR INDEPENDENT COMPONENT ANALYSIS AND PROJECTION PURSUIT
(To appear in NIPS*97)

We derive a first-order approximation of the density of maximum entropy for a continuous 1-D random variable, given a number of simple constraints. This results in a density expansion which is somewhat similar to the classical polynomial density expansions by Gram-Charlier and Edgeworth. Using this approximation of density, an approximation of 1-D differential entropy is derived. The approximation of entropy is both more exact and more robust against outliers than the classical approximation based on the polynomial density expansions, without being computationally more expensive. The approximation has applications, for example, in independent component analysis and projection pursuit.
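[Editorial aside: the "fixed-point algorithms" mentioned in the first abstract can be illustrated with a minimal one-unit sketch in Python/NumPy. The toy sources, the mixing matrix, and the choice of a tanh contrast below are illustrative assumptions, not details taken from the reports.]

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (illustrative): two independent non-Gaussian sources, linearly mixed.
n = 5000
s = np.vstack([rng.uniform(-1.0, 1.0, n),   # sub-Gaussian source
               rng.laplace(0.0, 1.0, n)])   # super-Gaussian source
A = np.array([[2.0, 1.0],
              [1.0, 1.5]])                  # hypothetical mixing matrix
x = A @ s

# Whiten the mixtures: zero mean, identity covariance.
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = E @ np.diag(d ** -0.5) @ E.T @ x

def one_unit_fixed_point(z, iters=200, tol=1e-9):
    """Estimate one independent component by the fixed-point iteration
    w <- E{z g(w'z)} - E{g'(w'z)} w  (with g = tanh), then renormalize."""
    w = rng.standard_normal(z.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(iters):
        wz = w @ z
        g = np.tanh(wz)
        g_prime = 1.0 - g ** 2
        w_new = (z * g).mean(axis=1) - g_prime.mean() * w
        w_new /= np.linalg.norm(w_new)
        if abs(abs(w_new @ w) - 1.0) < tol:   # converged (up to sign flip)
            return w_new
        w = w_new
    return w

w = one_unit_fixed_point(z)
y = w @ z
# The recovered component should align, up to sign and scale, with one source.
corrs = [abs(np.corrcoef(y, s[i])[0, 1]) for i in range(2)]
print("best |correlation| with a true source:", round(max(corrs), 3))
```

Note the usual ICA ambiguities: the component is recovered only up to sign, scale and ordering, which is why the check above uses the absolute correlation against each source.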
From brychcy at informatik.tu-muenchen.de Fri Oct 24 09:37:32 1997
From: brychcy at informatik.tu-muenchen.de (Till Brychcy)
Date: Fri, 24 Oct 1997 15:37:32 +0200
Subject: (Detailed) CFP: Fuzzy-Neuro Systems '98 in Munich
Message-ID: <97Oct24.153739+0200met_dst.49149+255@papa.informatik.tu-muenchen.de>

C A L L   F O R   P A P E R S

5th International GI-Workshop
Fuzzy-Neuro Systems '98
- Computational Intelligence -
18 - 20 March 1998, Munich

Fuzzy-Neuro Systems '98 is the fifth event in a well-established series of workshops with international participation. Its aim is to give an overview of the state of the art in research and development of fuzzy systems and artificial neural networks. Another aim is to highlight applications of these methods and to forge innovative links between theory and application by means of creative discussions. Fuzzy-Neuro Systems '98 is being organized by the Research Committee 1.2 "Inference Systems" (Fachausschuss 1.2 "Inferenzsysteme") of the German Society of Computer Science GI (Gesellschaft fur Informatik e. V.) and Technische Universitat Munchen in cooperation with Siemens AG. The workshop takes place at the European Patent Office in Munich from March 18 to 20, 1998.

Scientific Topics
-----------------
* theory and principles of multivalued logic and fuzzy logic
* representation of fuzzy knowledge
* approximate reasoning
* fuzzy control in theory and practice
* fuzzy logic in data analysis, signal processing and pattern recognition
* fuzzy classification systems
* fuzzy decision support systems
* fuzzy logic in non-technical areas like business administration, management etc.
* fuzzy databases
* theory and principles of artificial neural networks
* hybrid learning algorithms
* neural networks in pattern recognition, classification, process monitoring and production control
* theory and principles of evolutionary algorithms: genetic algorithms and evolution strategies
* discrete parameter and structure optimization
* hybrid systems like neuro-fuzzy systems, connectionistic expert systems etc.
* special hardware and software

Program Committee
-----------------
Prof. Dr. W. Banzhaf, University of Dortmund
Dr. M. Berthold, University of Karlsruhe
Prof. Dr. Dr. h.c. W. Brauer, Technische Universitat Munchen (Chairman)
Prof. Dr. G. Brewka, University of Leipzig
Dr. K. Eder, Kratzer Automation GmbH, Unterschleißheim
Prof. Dr. C. Freksa, University of Hamburg
Prof. Dr. M. Glesner, Technical University of Darmstadt
Prof. Dr. S. Gottwald, University of Leipzig
Prof. Dr. A. Grauel, University of Paderborn, Dept. Soest
Prof. Dr. H.-M. Gross, Technical University of Ilmenau
Dr. A. Gunter, University of Bremen
Dr. J. Hollatz, Siemens AG, Munich
Prof. Dr. R. Isermann, Technical University of Darmstadt
Prof. Dr. P. Klement, University of Linz, Austria
Prof. Dr. R. Kruse, University of Magdeburg (Vice Chairman)
Prof. Dr. B. Mertsching, University of Hamburg
Prof. Dr. R. Nakaeizadeh, Daimler Research Laboratory, Ulm
Prof. Dr. K. Obermayer, Technical University of Berlin
Prof. Dr. G. Palm, University of Ulm
Dr. R. Palm, Siemens AG, Munich
Dr. L. Peters, Institute for System Design Technology, St. Augustin
Prof. Dr. F. Pichler, University of Linz, Austria
Dr. P. Protzel, FORWISS, Erlangen
Prof. Dr. B. Reusch, University of Dortmund
Prof. Dr. Rigoll, University of Duisburg
Prof. Dr. R. Rojas, University of Halle
Prof. Dr. B. Schurmann, Siemens AG, Munich (Vice Chairman)
Prof. Dr. W. von Seelen, University of Bochum
Prof. Dr. H. Thiele, University of Dortmund
Prof. Dr. W. Wahlster, University of Saarbrucken
Prof. Dr. H.-J. Zimmermann, Technical University of Aachen

Organization Committee
----------------------
Prof. Dr. Dr. h.c. W. Brauer, Technische Universitat Munchen (Chairman)
Dr. J. Hollatz, Siemens AG, Munich (Vice Chairman)
C. Kirchmair, Technische Universitat Munchen
C. Harms, GMD National Research Center for Information Technology, St. Augustin

Organizational Information
--------------------------
30.11.1997: abridged version (English, 4 to 6 pages DIN A4 size) of the following structure:
* title
* author(s), address, phone, fax, e-mail
* contents: 1. abstract 2. key words (not more than 5) 3. state of the art 4. new aspects 5. theory, simulation or experiment 6. results and conclusion 7. references
December '97: notification of acceptance or rejection of contribution
28.01.1998: final camera-ready papers for proceedings (up to 8 pages DIN A4 size)

Formatting instructions will be available soon at: http://wwwbrauer.informatik.tu-muenchen.de/~fns98/format.html

Please send four copies of your scientific contribution to:
Prof. Dr. Dr. h.c. W. Brauer - FNS '98 -
Institut fur Informatik
Technische Universitat Munchen
D-80290 Munchen
Germany

If you would like to take part in the workshop without submitting a paper, please send your email address to fns98 at tiki.informatik.tu-muenchen.de We will send you a copy of the workshop program as soon as it is available.

For further information visit the Internet homepage at: http://wwwbrauer.informatik.tu-muenchen.de/~fns98

From tony at salk.edu Sun Oct 26 11:09:20 1997
From: tony at salk.edu (Tony Bell)
Date: Sun, 26 Oct 1997 08:09:20 -0800
Subject: NIPS hotel/registration reminders
Message-ID: <3.0.32.19971026080903.0068e620@salk.edu>

The deadline for early registration for NIPS is October 31. After that the costs go up a bit. You can register online on the NIPS web page, using your Visa or Mastercard.
http://www.cs.cmu.edu/Groups/NIPS/NIPS97/

In addition, hotel rooms are held for us only till early November, as follows:
Denver Marriott Hotel (Conference), Nov 14. tel. 800-228-9290
Beaver Run Resort (Workshop), Nov 6. tel. 800-288-1282
Breckenridge Hilton (Workshop), Nov 4. tel. 800-321-8444
Further accommodation details (international telephone numbers etc.) are all on the NIPS web page.

- Tony Bell, NIPS Publicity

From smc at decsai.ugr.es Sun Oct 26 20:47:34 1997
From: smc at decsai.ugr.es (Serafin Moral)
Date: Mon, 27 Oct 1997 01:47:34 +0000
Subject: UAI-98 Call for Papers
Message-ID: <3453F2B6.2DB3C53D@decsai.ugr.es>

We apologize if you receive multiple copies of this message. Please distribute to interested persons.

======================================================
C A L L   F O R   P A P E R S
======================================================

** U A I 98 **
THE FOURTEENTH ANNUAL CONFERENCE ON UNCERTAINTY IN ARTIFICIAL INTELLIGENCE
July 24-26, 1998
University of Wisconsin Business School
Madison, Wisconsin, USA
=======================================

Please visit the UAI-98 WWW page at http://www.uai98.cbmi.upmc.edu

**************************************************************
CO-LOCATION ANNOUNCEMENT

The 1998 UAI Conference will be co-located with ICML-98 (International Conference on Machine Learning) and COLT-98 (Computational Learning Theory). Registrants to any of the three conferences will be allowed to attend, at no additional cost, the technical sessions of the other conferences. Joint invited speakers, poster sessions and a panel session are planned for the three conferences. The day after the co-located conferences (Monday, July 27, 1998), full-day workshops and/or tutorials will be offered by each of ICML, COLT, and UAI. UAI will offer a full-day course in which an overview of the field of uncertain reasoning will be presented by a faculty of its distinguished researchers.
The AAAI-98 conference technical program begins on Tuesday, July 28th.
==========================================================================
UAI-98 will meet at the University of Wisconsin Business School, in close proximity to the Convention Center, where AAAI-98 will be held.

* * *

CALL FOR PAPERS

Uncertainty management in artificial intelligence has now been established as a well-founded discipline, with a degree of development that has allowed the construction of practical applications that are able to solve difficult AI problems. Since 1985, the Conference on Uncertainty in Artificial Intelligence (UAI) has served as the central meeting on advances in methods for reasoning under uncertainty in computer-based systems. The conference is a primary international forum for exchanging results on the use of principled uncertain-reasoning methods, and it has helped the scientific community move along the path from theoretical foundations, to efficient algorithms, to successful applications. The UAI Proceedings have become a basic reference for researchers and practitioners who want to know about both theoretical advances and the latest applied developments in the field.

We are very pleased to announce that UAI-98 will be co-located with ICML-98 (International Conference on Machine Learning) and COLT-98 (Computational Learning Theory). This will be an outstanding opportunity for members of the three communities to share ideas and techniques.

The scope of UAI covers a broad spectrum of approaches to automated reasoning and decision making under uncertainty. Contributions to the proceedings address topics that advance theoretical principles or provide insights through empirical study of applications. Interests include quantitative and qualitative approaches, and traditional as well as alternative paradigms of uncertain reasoning.
We encourage the submission of papers proposing new methodologies and tools for model construction, representation, learning, inference and experimental validation. Innovative ways to increase the expressive power and the applicability spectrum of existing methods are encouraged as well; hybrid approaches may, for example, provide one way to achieve these goals. Papers are welcome that present new applications of uncertain reasoning that stress the methodological aspects of their construction and use. Highlighting difficulties in existing procedures and pointing at the necessary advances in foundations and algorithms is considered an important role of presentations of applied research.

Topics of interest include (but are not limited to):

>> Foundations
* Theoretical foundations of uncertain belief and decision
* Uncertainty and models of causality
* Representation of uncertainty and preference
* Generalization of semantics of belief
* Conceptual relationships among alternative calculi
* Models of confidence in model structure and belief
* Knowledge revision and combination

>> Principles and Methods
* Planning under uncertainty
* Temporal reasoning
* Markov processes and decisions under uncertainty
* Qualitative methods and models
* Automated construction of decision models
* The representation and discovery of causal relationships
* Uncertainty and methods for learning and data mining
* Abstraction in representation and inference
* Computation and action under limited resources
* Control of computational processes under uncertainty
* Time-dependent utility and time-critical decisions
* Uncertainty and economic models of problem solving
* Integration of logical and probabilistic inference
* Statistical methods for automated uncertain reasoning
* Hybridization of methodologies and techniques
* Algorithms for uncertain reasoning
* Advances in diagnosis, troubleshooting, and test selection
* Formal languages to represent uncertain information
* Data structures for representation and inference
* Fusion of models
* Uncertain reasoning and information retrieval
* Enhancing the human-computer interface with uncertain reasoning
* Automated explanation of results of uncertain reasoning

>> Empirical Study and Applications
* Empirical validation of methods for planning, learning, and diagnosis
* Uncertain reasoning in embedded, situated systems (e.g., softbots)
* Nature and performance of architectures for real-time reasoning
* Experimental studies of inference strategies
* Experience with knowledge-acquisition methods
* Comparison of representation and inferential adequacy of different calculi
* Methodologies for problem modeling

For papers focused on applications in specific domains, we suggest that the following issues be addressed in the submission:
- Why was it necessary to represent uncertainty in your domain?
- What are the distinguishing properties of the domain and problem?
- What kind of uncertainties does your application address?
- Why did you decide to use your particular uncertainty formalism?
- Which practical procedure did you follow to build the application?
- What theoretical problems, if any, did you encounter?
- What practical problems did you encounter?
- Did users/clients of your system find the results useful?
- Did your system lead to improvements in decision making?
- What approaches were effective (ineffective) in your domain?
- What methods were used to validate the effectiveness of the system?

=================================
SUBMISSION AND REVIEW OF PAPERS
=================================
Papers submitted for review should represent original, previously unpublished work. Papers should not be under review for presentation in any other conference; however, an extended version of the paper may be under review for publication in a scientific journal. Submitted papers will be carefully evaluated on the basis of originality, significance, technical soundness, and clarity of exposition.
Papers may be accepted for presentation in plenary or poster sessions. There will be a joint poster session with the ICML-98 and COLT-98 Conferences. Some of the papers selected for this poster session may also have a plenary presentation at UAI. All accepted papers will be included in the Proceedings of the Fourteenth Conference on Uncertainty in Artificial Intelligence, published by Morgan Kaufmann Publishers. An outstanding student paper will be selected for special distinction. Submitted papers must be at most 20 pages of 12pt Latex article style or equivalent (about 4500 words). We strongly encourage the electronic submission of papers. To submit a paper electronically, send an electronic version of the paper (Postscript format) to the following address: uai98 at cbmi.upmc.edu The subject line of this message should be: $.ps, where $ is an identifier created from the last name of the first author, followed by the first initial of the author's first name. Multiple submissions by the same first author should be indicated by adding a number (e.g., pearlj2.ps) to the end of the identifier. Additionally, the paper abstract and data should be sent by using the electronic form at the following address: http://www.uai98.cbmi.upmc.edu/data.html Authors unable to submit papers electronically should send 5 copies of the complete paper to one of the Program Chairs at the addresses listed below. Authors unable to use the electronic form to submit the abstract should provide the following information (by sending a message to the e-mail address above): * Paper title (plain text) * Author names, including student status (plain text) * Surface mail address, e-mail address, and voice phone number for a contact author (plain text) * A short abstract including keywords (plain text) * Primary and secondary classification indices selected from conference topics listed above. 
* Indicate whether the paper is appropriate for a joint session with ICML-98 and COLT-98 * Indicate the preferred type of presentation: poster or plenary ++++++++++++++++++++++++++++++ Important Dates ++++++++++++++++++++++++++++++ >> All submissions must be received by: Monday, February 23, 1998 >> Notification of acceptance on or before: Friday, April 10, 1998 >> Camera-ready copy due: Friday, May 8, 1998 >> Conference dates: July 24, 25, 26, 1998 >> Full day course on Uncertain Reasoning: Monday, July 27, 1998 ========================== Conference E-mail Address ========================= Please send all inquiries (submissions and conference organization) to the following e-mail address: uai98 at cbmi.upmc.edu Program Co-chairs: =================== Gregory F. Cooper Center for Biomedical Informatics University of Pittsburgh Suite 8084 Forbes Tower 200 Lothrop Street Pittsburgh, PA 15213-2582 USA Phone: (412) 647-7113 Fax: (412) 647-7190 E-mail: gfc at cbmi.upmc.edu Serafin Moral Dpto. Ciencias de la Computacion e IA Universidad de Granada 18071 - Granada SPAIN Phone: +34 58 242819 Fax: +34 58 243317 E-mail: smc at decsai.ugr.es WWW: http://decsai.ugr.es/~smc General Conference Chair: ======================================================== Prakash P. 
Shenoy University of Kansas School of Business Summerfield Hall Lawrence, KS 66045-2003 USA Phone: (913) 864-7551 Fax: (913) 864-5328 E-mail: pshenoy at ukans.edu WWW: http://stat1.cc.ukans.edu/~pshenoy ======================================================== Refer to the UAI-98 WWW home page for late-breaking information: http://www.uai98.cbmi.upmc.edu From scheler at informatik.tu-muenchen.de Mon Oct 27 06:24:50 1997 From: scheler at informatik.tu-muenchen.de (Gabriele Scheler) Date: Mon, 27 Oct 1997 12:24:50 +0100 Subject: Two preprints on Language, Brain and Computation Message-ID: <97Oct27.122503+0100met_dst.49139+105@papa.informatik.tu-muenchen.de> Two preprints on Lexical Feature Learning and Narrative Understanding are available from Gabriele Scheler's homepage: http://www7.informatik.tu-muenchen.de/~scheler/publications.html ------------------------------------------------------------- 1. Lexical feature Learning Constructing semantic representations using the MDL principle Niels Fertig and Gabriele Scheler Words receive a significant part of their meaning from use in communicative settings. The formal mechanisms of lexical acquisition, as they apply to rich situational settings, may also be studied in the limited case of corpora of written texts. This work constitutes an approach to deriving semantic representations for lexemes using techniques from statistical induction. In particular, a number of variations on the MDL principle were applied to selected sample sets and their influence on emerging theories of word meaning explored. We found that by changing the definition of description length for data and theory - which is equivalent to different encodings of data and theory - we may customize the emerging theory, augmenting and altering frequency effects. Also the influence of stochastic properties of the data on the size of the theory has been demonstrated. 
The results consist of a set of distributional properties of lexemes, which reflect cognitive distinctions in the meaning of words.

-------------------------------------------------------------
2. Narrative Understanding

Connectionist Modeling of Human Event Memorization Processes with Application to Automatic Text Summarization
Maria Aretoulaki, Gabriele Scheler and Wilfried Brauer

We present a new approach to text summarization from the perspective of neuropsychological evidence and the related methodology of connectionist modeling. The goal of this project is the computational modeling of the specific neuropsychological processes involved in event and text memorization and the creation of a working model of text summarization as a specific problem area. Memorization and summarization are seen as inherently related processes: linguistic material (e.g. spoken stories or written reports) is compressed into a smaller unit, a "schema", which conveys the most central of the states and events described, making extensive use of feature representations of linguistic material. It is in this compressed form that the source material is "stored" in memory, and on this basis it is later retrieved. We discuss the ways whereby these schemata are formed in memory and the associated processes of schema selection, instantiation and change, both in order to further the understanding of these processes and to promote the development of NLP applications concerning the automatic condensation of texts.

------------------------------------------------------------------------

From xli at sckcen.be Mon Oct 27 09:31:41 1997
From: xli at sckcen.be (Xiaozhong Li)
Date: Mon, 27 Oct 1997 15:31:41 +0100
Subject: CFP: FLINS'98 workshop, September 14-16, 1998, in Antwerp, Belgium
Message-ID: <9710271431.AA16501@vitoosf1.vito.be>

Dear friends,

Attached is an announcement of FLINS'98, which will be held September 14-16, 1998, in Antwerp, Belgium.
For other information, please visit our homepage: http://www.sckcen.be/dept/flinhome/welcome.html

Regards.
X. LI
_____________________________________________________________________
* Xiaozhong Li, PhD, Currently Postdoctoral Research Fellow
* FLINS - Fuzzy Logic and Intelligent technologies in Nuclear Science
* Belgian Nuclear Research Centre (SCK.CEN)
* Boeretang 200, B-2400 Mol, Belgium
* phone: (+32-14) 33 22 30 (O); 32 25 52 (H)
* fax: (+32-14) 32 15 29
* e-mail: xli at sckcen.be   http://www.sckcen.be/people/xli
_____________________________________________________________________

FIRST CALL FOR PAPERS

FLINS'98
Third International FLINS Workshop on Fuzzy Logic and Intelligent Technologies for Nuclear Science and Industry
Organized by SCK.CEN
September 14-16, 1998
The Astrid Park Plaza Hotel, Antwerp, Belgium

Abstract submission deadline: December 15, 1997
Notification of acceptance: February 15, 1998
Final manuscript deadline: April 15, 1998

Introduction. FLINS, an acronym for Fuzzy Logic and Intelligent technologies in Nuclear Science, is a recently established international research forum aiming to advance the theory and applications of fuzzy logic and novel intelligent technologies in nuclear science and industry. Following FLINS'94 and FLINS'96, the first and second international workshops on this topic, FLINS'98 aims to bring together scientists and researchers and to introduce the principles of intelligent systems and soft computing, such as fuzzy logic (FL), neural networks (NN), genetic algorithms (GA) and any combinations of FL, NN, and GA, knowledge-based expert systems and complex problem-solving techniques, within the nuclear industry and related research fields. FLINS'98 offers a unique international forum to present and discuss techniques that are new and useful for nuclear science and industry.
Workshop organization: SCK.CEN FLINS
Co-sponsored by: OMRON Belgium NV; Ghent University, Belgium; TRACTEBEL Energy Engineering, Belgium

International Scientific Advisory Committee:
Honorary chairman: L.A. Zadeh (University of California at Berkeley, USA)
Chairman: H.-J. Zimmermann (RWTH Aachen, Germany)
Members:
Z. Bien (Korea Advanced Institute of Science & Technology)
D. Dubois (Université Paul Sabatier, France)
A. Fattah (IAEA, Austria)
P. Govaerts (SCK.CEN, Belgium)
C.F. Huang (Beijing Normal University, China)
J. Kacprzyk (Polish Academy of Science)
N. Kasabov (University of Otago, New Zealand)
E.E. Kerre (Ghent University, Belgium)
G.J. Klir (State University of New York at Binghamton, USA)
M.M. Gupta (University of Saskatchewan, Canada)
M. Modarres (University of Maryland at College Park, USA)
J. Montero (Complutense University Madrid, Spain)
Y. Nishiwaki (University of Vienna, Austria)
T. Onisawa (University of Tsukuba, Japan)
H. Prade (Université Paul Sabatier, France)
M. Roubens (Université de Liège, Belgium)
G. Resconi (Catholic University del S. Cuore, Brescia, Italy)
Ph. Smets (Université Libre de Bruxelles, Belgium)
H. Takagi (Kyushu Inst. of Design, Japan)
E. Uchino (Kyushu Institute of Technology, Japan)
A.J. van der Wal (TNO, the Netherlands)
P. P. Wang (Duke University, USA)
R.R. Yager (Iona College, USA)

Organizing Committee:
Honorary Chairman: E.E. Kerre (Ghent University, Belgium)
SCK.CEN Advisors: P. D'hondt
Chairman: D. Ruan
Members: W. Bogaerts, B. Carlé, B. De Baets, G. de Cooman, R. Roman, B. Van de Walle, S. Lenaerts, C. Poortmans, H. Quets, H. Aït Abderrahim

TOPICS
Contributions describing original work, research projects, and state-of-the-art reviews are solicited on the following (nonrestrictive) list of topics to be covered at FLINS'98: fuzzy logic, neural networks, genetic algorithms, learning techniques, robotics, man-machine interface, decision-support techniques, control theory, clustering, rough set theory, evidence theory, belief theory, functional modeling, with applications in nuclear science and industry, and related research fields, such as: nuclear reactor control, nuclear energy and environmental protection, safety assessment, human reliability, risk analysis, safeguards, production processes in the fuel cycle, dismantling, waste and disposal, power systems control, scheduling, load forecasting, telecommunications.

SUBMISSION
Authors are invited to submit a one-page abstract containing the paper title, the authors' names and affiliations, the complete address (including email, fax, and phone) of the corresponding author(s), five keywords and the abstract (200-250 words), before December 15, 1997, by mail, fax, or email to the FLINS'98 Chairman (see below), or to fill in the abstract submission form on the FLINS'98 Website. The organizers intend to have the conference proceedings available to the delegates (all accepted papers reviewed by the scientific committee of FLINS'98 will be published as a book by World Scientific Publishers).

Conference Fee
1. Regular:
   early registration (before April 15, 1998)  15,000 BEF
   late registration (after April 15, 1998)    18,000 BEF
2. Student/invited session chair:
   early registration (before April 15, 1998)   9,000 BEF
   late registration (after April 15, 1998)    12,000 BEF

Registration fee (21 % VAT incl.)
entitles everyone to:
* access to the conference sessions
* a copy of the final program and the proceedings (book form by World Scientific)
* coffee breaks with refreshments
* the conference reception
* lunches on all three days
* one free drink daily at the exclusive Diamond Club of the hotel
(1 US $ is approximately 35 BEF)

Important Dates
abstract submission deadline: December 15, 1997
notification of acceptance: February 15, 1998
final manuscript deadline: April 15, 1998

From jose at tractatus.rutgers.edu Tue Oct 28 11:00:18 1997
From: jose at tractatus.rutgers.edu (Stephen Hanson)
Date: Tue, 28 Oct 1997 11:00:18 -0500
Subject: COGNITIVE SCIENCE@RUTGERS(Newark Campus)
Message-ID: <199710281600.LAA22872@tractatus.rutgers.edu>

The Psychology Department at Rutgers University (Newark Campus) is pleased to announce a new track in its graduate program for Cognitive Science. Please go directly to our WEB PAGE: www-psych.rutgers.edu for information on the program, research, faculty, stipends and applications.

From Mary-Ellen_Flinn at brown.edu Tue Oct 28 11:30:38 1997
From: Mary-Ellen_Flinn at brown.edu (Mary-Ellen Flinn)
Date: Tue, 28 Oct 1997 11:30:38 -0500 (EST)
Subject: Instructor Position/Brown University
Message-ID: 

Instructor for Semester II 1997-98 to Teach Cognitive Neuroscience Course at Brown University

A temporary position is available in the Spring semester (II) of 1998 to teach a course in the Department of Neuroscience in the area of Cognitive Neuroscience. This course will deal with fundamental issues of Cognitive Neuroscience at a level appropriate for advanced undergraduate neuroscience concentrators and for graduate students. This lecture course emphasizes a systems approach to neuroscience and covers several neural systems from among consciousness, sleeping and waking, thinking, selection of action, higher visual and motor processes, sensorimotor integration, learning and memory, attention and emotion.
Discussions focus on cerebral cortical mechanisms of behavior and cognition, though subcortical neural mechanisms are also addressed. The emphasis is on experimental work from functional neuroimaging in humans, behavioral neurophysiology, and observations from human pathology. Some degree of flexibility in course content is possible.

Job Requirements: At least 1 year's prior teaching experience. Ph.D. in Brain or Behavioral Sciences.

The position is available as an adjunct assistant, associate or full professor for one semester only. Appropriately trained postdoctoral fellows will be considered for this position. Strong teaching skills are essential. We encourage applications from women and minority candidates. Brown University is an Equal Opportunity Affirmative Action Employer.

Interested individuals should send a CV and names of 3 references by November 20, 1997 to:
John P. Donoghue, Ph.D.
Chairman
Department of Neuroscience
Box 1953
Providence, RI 02912

From zenon at ruccs.rutgers.edu Tue Oct 28 12:17:30 1997
From: zenon at ruccs.rutgers.edu (Zenon Pylyshyn)
Date: Tue, 28 Oct 1997 12:17:30 -0500
Subject: Immediate Post Doc at Rutgers NB
Message-ID: <199710281717.MAA11711@ruccs.rutgers.edu>

The Center for Cognitive Science at Rutgers, New Brunswick, NJ has an opening for a two-year Post-Doctoral Fellow to start as early as this January. Salary commensurate with experience and in line with that recommended by federal funding agencies. Emphasis will be on independent research in visual attention, with special focus on multiple-object indexing and tracking. The applicant's overall interests should match those of the lab, as described in URL: http://ruccs.rutgers.edu/finstlab/finstsum.html and in the reports and papers listed in URL: http://ruccs.rutgers.edu/faculty/pylyshyn.html The candidate is expected to have a background in visual science, including the methods of visual psychophysics and/or computer modeling.
Familiarity with the use of Mac, SGI and PC platforms for vision research is required. Mail applications with a letter and CV to: Zenon Pylyshyn, Rutgers Center for Cognitive Science, Rutgers University, Busch Campus, Psychology Building Addition, New Brunswick, NJ 08903

From dkim at vlsi.donga.ac.kr Wed Oct 29 04:23:32 1997
From: dkim at vlsi.donga.ac.kr (Daijin Kim)
Date: Wed, 29 Oct 1997 18:23:32 +0900
Subject: CFP: AFSS'98, June 18-21, 1998, Masan/Tongyoung, Kyungnam, Korea
Message-ID: <34570086.BA8A797F@vlsi.donga.ac.kr>

Attached is an announcement of AFSS'98, which will be held June 18-21, 1998, in Masan/Tongyoung, Kyungnam, Korea. For more information, please visit our homepage: http://www.donga.ac.kr/~djkim/afss98.html

Regards.
Daijin Kim
----------------------------------------------------------------------------
AFSS'98
THE 3RD ASIAN FUZZY SYSTEMS SYMPOSIUM
June 18-21, 1998
Masan / Tongyoung, Kyungnam, Korea

In Cooperation With
Korea Fuzzy Logic and Intelligent Systems Society (KFIS)

Organizing Institution
Kyungnam University, Masan, Korea

AFSS'98: Conference Information
The KFIS (Korea Fuzzy Logic and Intelligent Systems Society) is pleased to announce that the 3rd AFSS (AFSS'98) will be held in Masan and Tongyoung, Kyungnam province, during June 18-21, 1998. The first conference of this symposium was held in Singapore in November 1993, and the second in Kenting, Taiwan, in December 1996. AFSS'98 aims to encourage information exchange between researchers in Asia and the rest of the world in the fields of fuzzy systems and related areas. Masan and Tongyoung are famous scenic sites as well as historic harbor cities in Kyungnam province. The beautiful scenery and fine climate during the conference period will provide all participants with a comfortable and satisfying meeting.
---------------------------------------------------------------------------- Conference Theme The state of the art and the future of soft computing and intelligent systems ---------------------------------------------------------------------------- Conference Objective The objective of the symposium is to encourage information exchange between researchers in Asia and the Western world in the fields of fuzzy systems and related areas. It is also an opportunity to present industrial applications, new technologies, and products based on fuzzy sets, fuzzy logic, and other soft computing methods. ---------------------------------------------------------------------------- Topics: AFSS'98 covers the following topics, but others relevant to fuzzy sets and systems are also welcome. Soft computing Fundamentals of fuzzy sets and fuzzy logic Approximate reasoning Qualitative and approximate modeling Learning and acquisition of approximate models Integration of fuzzy logic and neural networks Integration of fuzzy logic and evolutionary computing Hardware implementation of fuzzy logic and algorithms Design and synthesis of fuzzy logic controllers Applications to: System modeling and control Computer vision Robotics and manufacturing Signal processing Image understanding Decision systems Finance Databases Information systems Virtual reality Man-machine interfaces, etc. ---------------------------------------------------------------------------- Important Dates: Submission of Abstracts of Papers: November 15, 1997 Notification of Acceptance: December 15, 1997 Submission of Camera-Ready Papers: March 15, 1998 Deadline for Registration: May 15, 1998 Conference: June 18-21, 1998 ---------------------------------------------------------------------------- Conference Organization Honorary Chair: L.A. Zadeh, UC Berkeley, USA General Co-Chairs: C.K. Park, Kyunghee University, Korea K.C. Min, Yonsei University, Korea International Advisory Committee Chair: Z.
Bien, KAIST, Korea Members: K. Hirota, Tokyo Institute of Technology, Japan M. Mizumoto, Osaka Electro-Communication Univ., Japan M. Mukaidono, Meiji University, Japan M. Sugeno, Tokyo Institute of Technology, Japan H. Tanaka, Osaka Prefecture University, Japan L.C. Jain, University of South Australia, Australia M.R. Pal, Indian Statistical Institute, India Y.M. Liu, Sichuan Union University, China H.W. Lee, Samsung Electronics Co., Korea M.S. No, LG Electronics Co., Korea H.F. Wang, National Tsing-hua University, Taiwan L. Ding, Institute of System Sciences, Singapore J.C. Bezdek, U.S.A. J.M. Keller, U.S.A. ---------------------------------------------------------------------------- Sponsors International Fuzzy Systems Association Japan Society of Fuzzy Theory and Systems, Japan Fuzzy Mathematics and Systems Association of China Indian Society for Fuzzy Mathematics and Information Processing Chinese Fuzzy Systems Association, Taiwan, China Kyungnam University, Masan, Kyungnam, Korea Ministry of Information and Communication, Korea Korea Research Foundation, Korea Korea Science & Engineering Foundation, Korea ---------------------------------------------------------------------------- Venue The conference will take place on the campus of Kyungnam University (15,000 students), which is located in the southern part of Masan. Kyungnam University is one of the leading Korean private universities, very well equipped with modern audiovisual technology and projection facilities for all available media, as well as with all kinds of Internet connections. The banquet and social events of AFSS'98 will be held in Tongyoung at the Tongyoung Marina Resort. Tongyoung, often called the Naples of Korea, is a beautiful port city. ---------------------------------------------------------------------------- Paper Submission All materials must be written in English. Extended abstracts (3 copies): double-spaced, 2 pages on A4 paper.
Choose the session from the topics above. Please indicate title, author(s), affiliation(s), mailing address(es), telephone and fax numbers, and e-mail address. Abstracts must be received by November 15, 1997. Papers for those registered and presenting at the conference will be published in the conference proceedings; specific details on format and paper length will be sent upon acceptance. Send abstracts to: Prof. Yong Gi Kim, Department of Computer Science, Kyungsang National University, 900 Gajwa, Chinju, Kyungnam, 660-701, Korea Tel: +82-591-751-5997 Fax: +82-591-762-1944 E-mail: ygkim at nongae.gsun.ac.kr ---------------------------------------------------------------------------- Pre-registration Form AFSS'98 The Third Asian Fuzzy Systems Symposium June 18-21, 1998 Masan/Tongyoung, Kyungnam, Korea Pre-registration Form Please type or write in block letters Last Name _______________________________________ First Name __________________ [ ] Mr. [ ] Ms. [ ] Prof. [ ] Dr. Organization ___________________________________ ___________________________________ Address _________________________________________ _________________________________________ ZIP ____________________ City ______________________ Country _________________________________________ Phone __________________________________________ Fax _____________________________________________ E-mail __________________________________________ Please mark the appropriate boxes.
[ ] I intend to attend the conference [ ] I intend to submit a paper on the following topic ______________________________________________ ______________________________________________ [ ] I wish to receive further information [ ] I intend to attend the industrial visit [ ] I intend to attend the cultural tour Date __________________ Signature _________________ To be sent as soon as possible, but no later than November 15, 1997 (by fax, e-mail, or mail) ---------------------------------------------------------------------------- All information concerning the conference and its program can be obtained from: Secretariat of AFSS'98 Prof. Seung Gook Hwang Program Committee Chair of AFSS'98 Department of Industrial Engineering Kyungnam University 449 Weolyoung, Happo, Masan, Kyungnam, 631-701 Korea Tel: +82-551-49-2705 Fax: +82-551-43-8133 E-mail: hwangsg at hanma.kyungnam.ac.kr http://www.donga.ac.kr/~djkim/afss98.html ---------------------------------------------------------------------------- From jose at tractatus.rutgers.edu Tue Oct 28 11:02:23 1997 From: jose at tractatus.rutgers.edu (Stephen Hanson) Date: Tue, 28 Oct 1997 11:02:23 -0500 Subject: RUTGERS UNIVERSITY (Newark Campus) Junior Position in Cognitive Science Message-ID: <199710281602.LAA22890@tractatus.rutgers.edu> Please Post - Oct 27, 1997 Announcement -- ASSISTANT PROFESSOR -- Rutgers University (Newark Campus). Rutgers University-Newark Campus: The Department of Psychology anticipates making one tenure-track appointment in Cognitive Science at the Assistant Professor level. Candidates should have an active research program in one or more of the following areas: action, learning, high-level vision, language. Of particular interest are candidates who combine one or more of these research interests with cognitive neuroscience, mathematical, and/or computational approaches. The position calls for candidates who are effective teachers at both the graduate and undergraduate levels.
Review of applications will begin on December 15, 1997. Rutgers University is an equal opportunity/affirmative action employer. Qualified women and minority candidates are especially encouraged to apply. Send a CV and three letters of recommendation to Professor S. J. Hanson, Chair, Department of Psychology - Cognitive Search, Rutgers University, Newark, NJ 07102. Email inquiries can be made to cogsci at psychology.rutgers.edu From jose at tractatus.rutgers.edu Tue Oct 28 10:26:46 1997 From: jose at tractatus.rutgers.edu (Stephen Hanson) Date: Tue, 28 Oct 1997 10:26:46 -0500 Subject: SENIOR POSITION-RUTGERS UNIVERSITY (Newark Campus) COGNITIVE NEUROSCIENCE Message-ID: <199710281526.KAA22530@tractatus.rutgers.edu> Please Post - Oct 28, 1997 **** NEW ANNOUNCEMENT -- SENIOR POSITION AT RUTGERS UNIVERSITY (Newark Campus) **** *** COGNITIVE NEUROSCIENCE *** Rutgers University-Newark Campus: The Department of Psychology anticipates making one senior-level appointment in Cognitive Neuroscience. We seek applicants with a demonstrated program of interdisciplinary research and teaching in areas such as cognitive psychology, computation, imaging, or neuroscience. Areas of research are open; however, we hope to find candidates who can expand potential connections with the nearby engineering school (NJIT) and/or UMDNJ (with a focus on an fMRI research facility). The position calls for candidates who are effective teachers at both the graduate and undergraduate levels. Review of applications will begin on December 15, 1997. Rutgers University is an equal opportunity/affirmative action employer. Qualified women and minority candidates are especially encouraged to apply. Please send a current CV and three letters of recommendation to Professor S. J. Hanson, Chair, Department of Psychology Cognitive Neuroscience Search, Rutgers University, Newark, NJ 07102.
From mephu at cril.univ-artois.fr Wed Oct 29 12:00:50 1997 From: mephu at cril.univ-artois.fr (Engelbert Mephu-Nguifo) Date: Wed, 29 Oct 1997 18:00:50 +0100 Subject: JFA'98 - 1st Call for papers Message-ID: <34576BC2.310A2D5D@cril.univ-artois.fr> First Call for Papers Thirteenth French-Speaking Conference on Machine Learning Arras, May 18-20, 1998 Submission deadline: February 13, 1998 WWW temporary location: http://www.lifl.fr/~mephu-ng/conf/jfa98_uk.html WWW permanent location: http://www.univ-artois.fr/jfa98 General Information JFA'98 is the annual French-speaking conference on Machine Learning. It consists of refereed papers, invited lectures, and tutorials. JFA welcomes both symbolic and numeric approaches to machine learning, and both research and application papers in this area. Papers may be submitted in English, but the final version must be written in French. The thirteenth JFA conference will be held on May 18-20, 1998 at Artois University (Arras). Submissions Papers relevant to the discipline of Machine Learning are solicited, including, but not limited to: Applications of Machine Learning Case-based Learning Computational Learning Theory Data Mining Evolutionary Computation Hybrid Learning Systems Inductive Learning Inductive Logic Programming Knowledge Discovery in Databases Language Learning Learning and Problem Solving Learning by Analogy Learning in Multi-Agent Systems Learning in Dynamic Domains Learning to Search Multistrategy Learning Neural Networks Reinforcement Learning Robot Learning Scientific Discovery Papers are limited to 12 pages (using a 10pt Times Roman font, single-spaced, with 3cm margins on all sides) including figures, title page, references, and appendices (see format instructions). Papers will be refereed according to clarity and overall quality criteria, focusing primarily on their relevance to the conference. Email submissions are strongly preferred.
Please send an attached PostScript file to jnicolas at irisa.fr Those unable to produce a PostScript file may send 4 hardcopies of their paper submission to the program chair: Jacques Nicolas IRISA - INRIA Campus Universitaire de Beaulieu, 35042 Rennes Cedex, France Tel: (+33) 2 99 84 73 12 E.mail: jnicolas at irisa.fr Timetable February 13, 1998 deadline for submission April 6, 1998 notification of acceptance/rejection May 4, 1998 deadline for final versions of papers May 18-20, 1998 JFA'98 Program committee Conference Chair: Jacques Nicolas, IRISA - INRIA, Rennes, jnicolas at irisa.fr Members: Michel Benaim Université Paul Sabatier, Toulouse Francesco Bergadano Università di Torino, Italy Gilles Bisson Imag, Grenoble Pierre Brezellec Université de Paris 13 Stéphane Canu Heudiasyc, Compiègne Antoine Cornuejols LRI, Orsay Colin De la Higuera Université de St Etienne Patrick Gallinari Laforia, Paris Frédérick Garcia Inra, Toulouse Olivier Gascuel Lirmm, Montpellier Engelbert Mephu Nguifo CRIL, Université d'Artois, Lens Laurent Miclet Enssat, Lannion Mohamed Quafafou IRIN, Nantes Céline Rouveirol LRI, Orsay Michèle Sebag LMS, Paris Dominique Snyers ENSTbr, Brest Christel Vrain Université d'Orléans Jean Daniel Zucker Laforia, Paris Organizers: Engelbert Mephu Nguifo Local Arrangement Chair CRIL - IUT de Lens - Université d'Artois Claire Nédellec Publicity Chair LRI, Orsay Jean Daniel Zucker Tutorial Chair Laforia, Paris Secretariat: CRIL - JFA'98 IUT de Lens - Université d'Artois Rue de l'Université
SP 16 62307 Lens cedex, France Tel: (33) 3 21 79 32 55 / 73 Fax: (33) 3 21 79 32 72 E.mail: jfa98 at cril.univ-artois.fr
Jay McClelland Co-Director, Center for the Neural Basis of Cognition --------------------------------------- Tenure Track Faculty Position Experimental Condensed Matter/Biological Physics Carnegie Mellon University The Department of Physics at Carnegie Mellon University invites applications for a tenure-track experimentalist in condensed matter and/or biological physics. The appointment will be at a junior faculty level and will take effect in July 1998 or later. We seek an individual of exceptional ability and promise to establish a vigorous research program. Excellent candidates in any area of specialization (including neuroscience) will be considered. Preference will be given to candidates who are likely to interact synergistically with current projects and facilities at Carnegie Mellon. Departmental interests include interfaces, lipid membranes, magnetic nanoparticles, semiconductors, scanning probe microscopy, and x-ray scattering (see http://www-cmp.phys.cmu.edu). Applicants should send their curriculum vitae, publication list, and a statement of research and teaching interests, and have at least three letters of reference sent before November 30, 1997 to Prof. Michael Widom, Chair, Search Committee, Department of Physics, Carnegie Mellon University, Pittsburgh, PA 15213. Carnegie Mellon University is an equal opportunity/affirmative action employer.
From terry at salk.edu Thu Oct 2 17:53:31 1997 From: terry at salk.edu (Terry Sejnowski) Date: Thu, 2 Oct 1997 14:53:31 -0700 (PDT) Subject: NEURAL COMPUTATION 9:8 Message-ID: <199710022153.OAA20194@helmholtz.salk.edu> Neural Computation - Contents Volume 9, Number 8 - November 15, 1997 ARTICLE Minimax Entropy Principle and its Application to Texture Modeling Song Chun Zhu, Ying Nian Wu, and David Mumford NOTE A Local Learning Rule that Enables Information Maximization for Arbitrary Input Distributions Ralph Linsker Convergence and Ordering of Kohonen's Batch Map Yizong Cheng LETTER Solitary Waves of Integrate-and-Fire Neural Fields David Horn and Irit Opher Time Series Segmentation Using Predictive Modular Neural Networks Athanasios Kehagias and Vassilios Petridis Adaptive Mixtures of Probabilistic Transducers Yoram Singer Long Short-Term Memory Sepp Hochreiter and Jurgen Schmidhuber Factor Analysis Using Delta-Rule Wake-Sleep Learning Radford M. Neal and Peter Dayan Data Clustering Using a Model Granular Magnet Marcelo Blatt, Shai Wiseman, and Eytan Domany ----- ABSTRACTS - http://mitpress.mit.edu/NECO/ SUBSCRIPTIONS - 1998 - VOLUME 10 - 8 ISSUES

                   USA    Canada*   Other Countries
  Student/Retired  $50    $53.50    $78
  Individual       $82    $87.74    $110
  Institution      $285   $304.95   $318

* includes 7% GST (Back issues from Volumes 1-8 are regularly available for $28 each for institutions and $14 each for individuals. Add $5 for postage per issue outside the USA and Canada. Add 7% GST for Canada.) MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902.
Tel: (617) 253-2889 FAX: (617) 258-6779 mitpress-orders at mit.edu ----- From zhaoping at ai.mit.edu Fri Oct 3 14:25:33 1997 From: zhaoping at ai.mit.edu (Zhaoping Li) Date: Fri, 3 Oct 97 14:25:33 EDT Subject: TR available --- Visual segmentation without classification in a model of V1 Message-ID: <9710031825.AA24096@sewer.mit.edu> Dear Connectionists, The following technical report, published by the AI Publications Office at MIT, may be accessed via ftp://publications.ai.mit.edu/ai-publications/1500-1999/AIM-1613.ps TITLE: Visual segmentation without classification in a model of the primary visual cortex AUTHOR: Zhaoping Li ABSTRACT: Stimuli outside classical receptive fields significantly influence the neurons' activities in primary visual cortex. We propose that such contextual influences are used to segment regions by detecting the breakdown of homogeneity or translation invariance in the input, thus computing GLOBAL region boundaries using LOCAL interactions. This is implemented in a biologically based model of V1, and demonstrated in examples of texture segmentation and figure-ground segregation. By contrast with traditional approaches, segmentation occurs without classification or comparison of features within or between regions, and is performed by exactly the same neural circuit responsible for the dual problem of the grouping and enhancement of contours.
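The abstract's central idea — finding global region boundaries purely from local tests for the breakdown of homogeneity — can be illustrated with a toy one-dimensional sketch. This is only an illustration of the principle, not Li's V1 circuit; the function name `boundary_map`, the window size, and the threshold are all invented for the example.

```python
import numpy as np

def boundary_map(signal, half=3, thresh=0.5):
    """Mark positions where local statistics change abruptly.

    For each position, compare the mean of a short window to the left
    with the mean of a short window to the right; a large difference
    signals a breakdown of local homogeneity, i.e. a region boundary.
    Only local neighborhoods are ever compared, yet the marked points
    trace out the global boundary between regions.
    """
    signal = np.asarray(signal, dtype=float)
    n = len(signal)
    out = np.zeros(n, dtype=bool)
    for i in range(half, n - half):
        left = signal[i - half:i].mean()    # local statistic on the left
        right = signal[i:i + half].mean()   # local statistic on the right
        out[i] = abs(left - right) > thresh
    return out

# Two homogeneous "textures" joined at index 10: the boundary is
# detected only near the junction, using purely local comparisons.
x = [0.0] * 10 + [1.0] * 10
print(np.flatnonzero(boundary_map(x)))
```

Note how no pixel is ever classified as belonging to texture A or B; only the *change* in local statistics is detected, which is the sense in which segmentation here proceeds "without classification".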
Sincerely, Zhaoping Li (zhaoping at ai.mit.edu) From dwang at cis.ohio-state.edu Sun Oct 5 15:31:48 1997 From: dwang at cis.ohio-state.edu (DeLiang Wang) Date: Sun, 5 Oct 1997 15:31:48 -0400 (EDT) Subject: Faculty position in image understanding Message-ID: <199710051931.PAA00423@shirt.cis.ohio-state.edu> Job Description: Image Understanding For one of several open faculty positions resulting from a major initiative in image understanding, The Ohio State University is seeking a tenure-track computer scientist who works on theoretical aspects of image understanding and whose work relates to human visual perception. We are particularly interested in candidates who are capable of interdisciplinary and collaborative research. This position will be a joint appointment between the Department of Computer and Information Science and the Center for Cognitive Science, and is related to the image understanding initiative involving a number of departments and centers at OSU. The successful candidate will hold a Ph.D. and have a demonstrated record of research accomplishments in image understanding. Applicants should send a curriculum vitae, along with a cover letter, by e-mail to fsearch at cis.ohio-state.edu, or by hardcopy to: Chair, Faculty Search Committee, Department of Computer and Information Science, The Ohio State University, 395 Dreese Lab, 2015 Neil Avenue, Columbus, OH 43210-1277 Applications will be accepted until the position is filled.
From dld at cs.monash.edu.au Mon Oct 6 10:39:23 1997 From: dld at cs.monash.edu.au (David L Dowe) Date: Tue, 7 Oct 1997 00:39:23 +1000 Subject: Final Call for Abstracts, 1998 Pacific Symposium on Biocomputing (fwd) Message-ID: <199710061439.AAA23151@dec11.cs.monash.edu.au> > From news at nlm.nih.gov Tue Oct 7 00:34:25 1997 > To: bionet-announce at net.bio.net > From: < @nlm.nih.gov> > Newsgroups: bionet.announce,bionet.biology.computational, bionet.molbio.proteins,news.announce.conferences > Subject: Final Call for Abstracts, 1998 Pacific Symposium on Biocomputing CALL FOR ABSTRACTS Pacific Symposium on Biocomputing Kapalua, Maui, Hawaii -- January 4-9, 1998 http://www.cgl.ucsf.edu/psb The Pacific Symposium on Biocomputing (PSB) is soliciting the submission of abstracts for poster presentation and computer demonstrations. Abstracts may report on research results in any aspect of computational biology. PSB is a forum for the presentation of research in databases, algorithms, interfaces, visualization, modeling and other computational methods, as applied to biological problems, with emphasis on applications in data-rich areas of molecular biology. For more information on the program of PSB, see our web site, which contains the full list of peer reviewed papers being presented this year and the topical sessions into which the conference is organized. Abstracts MUST be submitted electronically by November 1st to Russ Altman (russ.altman at stanford.edu). Hardcopy abstracts will not be accepted. Abstracts must be in ASCII format with header information in the following order: Title, Authors, Institution, Mailing Address, Email Addresses, body of abstract. The abstracts should be no longer than 500 words, including header information. There can be no figures in abstracts. Workstations and internet connections will be available for demonstrations, in addition to space in poster presentation sessions. 
Please submit detailed requests for demonstration facilities along with your abstract. Remember, the FINAL DEADLINE IS NOVEMBER 1. We look forward to seeing you in Hawaii! -- , PhD. National Library of Medicine phone: +1 (301) 496-9303 Bldg. 38A, 9th fl, MS-54 fax: +1 (301) 496-0673 Bethesda, MD 20894 USA email: @nlm.nih.gov From prior at MIT.EDU Tue Oct 7 08:41:48 1997 From: prior at MIT.EDU (Robert Prior) Date: Tue, 7 Oct 97 08:41:48 EDT Subject: Series Announcement and Call for Proposals Message-ID: ------------------------------------------------------------------ The MIT Press - Adaptive Computation and Machine Learning Series ------------------------------------------------------------------ Tom Dietterich, Series Editor Christopher Bishop, David Heckerman, Michael Jordan, and Michael Kearns, Associate Editors The goal of building systems that can adapt to their environments and learn from their experience has attracted researchers from many fields, including computer science, engineering, mathematics, physics, neuroscience, and cognitive science. Out of this research has come a wide variety of learning techniques, including methods for learning decision trees, decision rules, neural networks, statistical classifiers, and probabilistic graphical models. These learning methods have the potential to transform many industrial and scientific fields. Many successful and profitable applications have already been developed. The researchers in these various areas have also produced several different theoretical frameworks for understanding these methods, such as computational learning theory, Bayesian learning theory, classical statistical theory, minimum description length theory, and statistical mechanics approaches. These theories provide insight into experimental results and help to guide the development of improved learning algorithms.
Recently, the many separate research communities have begun to converge on a common set of issues surrounding supervised, unsupervised, and reinforcement learning problems. A goal of the series is to promote the unification of the many diverse strands of machine learning research and to foster high-quality research and innovative applications. This book series will publish works of the highest quality that advance the understanding and practical application of machine learning and adaptive computation. Books appropriate for the series include: * Research monographs on any of the topics listed above * Textbooks at the introductory or advanced level * How-to books aimed at practitioners * Books intended to introduce the main goals and challenges of this area to a general technical audience. For information on the submission of proposals and manuscripts, please contact the editor, the publisher, or any of the associate editors listed above: Thomas G. Dietterich, Computer Science Department, Oregon State University, Corvallis, OR 97331-3202; Tel: (541) 737-5559; Fax: (541) 737-3014; tgd at cs.orst.edu Robert V. Prior, The MIT Press, 5 Cambridge Center, Cambridge, MA 02142; Tel: (617) 253-1584; Fax: (617) 258-6779; prior at mit.edu From ericr at mech.gla.ac.uk Wed Oct 8 07:32:22 1997 From: ericr at mech.gla.ac.uk (Eric Ronco) Date: Wed, 8 Oct 1997 12:32:22 +0100 (BST) Subject: No subject Message-ID: <4311.199710081132@googie.mech.gla.ac.uk> From shavlik at cs.wisc.edu Fri Oct 10 12:19:29 1997 From: shavlik at cs.wisc.edu (Jude Shavlik) Date: Fri, 10 Oct 1997 11:19:29 -0500 (CDT) Subject: Call for Workshop and Tutorial Proposals: 1998 ML Conf Message-ID: <199710101619.LAA02888@jersey.cs.wisc.edu> ICML-98: Call for Workshop and Tutorial Proposals ------------------------------------------------- The Fifteenth International Conference on Machine Learning (ICML-98) will be held at the University of Wisconsin, Madison, USA from July 24 to July 26, 1998.
ICML-98 will be co-located with the Eleventh Annual Conference on Computational Learning Theory (COLT-98) and the Fourteenth Annual Conference on Uncertainty in Artificial Intelligence (UAI-98). Seven additional AI conferences, including the Fifteenth National Conference on Artificial Intelligence (AAAI-98), will also be held in Madison next summer (see http://www.cs.wisc.edu/icml98/ for a complete list). Since ICML is being co-located with AAAI, there will NOT be a separate ICML workshop and tutorial program in 1998. Instead, people interested in submitting ML-related workshop or tutorial proposals should submit to the corresponding AAAI program. Members of the ML community are serving as AAAI workshop and tutorial co-chairs, and they are aware of the plan to have joint AAAI/ICML workshops and tutorials. Joint AAAI/ICML workshops and tutorials will be scheduled for Monday, July 27, 1998, the day between the ICML and AAAI technical programs. (AAAI has agreed to allow ICML attendees to attend AAAI workshops and tutorials without requiring attendance at AAAI.) Please note that the deadlines for these programs are near. October 31, 1997 is the deadline for AAAI workshop proposals, while November 14, 1997 is the deadline for AAAI tutorial proposals. For those who like to plan far ahead, the deadline for ICML technical-paper submissions will be March 2, 1998. A preliminary call for papers, as well as additional conference information including copies of the AAAI calls for tutorial and workshop proposals, is available at: http://www.cs.wisc.edu/icml98/ Jude Shavlik ICML-98 Chair icml98 at cs.wisc.edu PS - As usual, my apologies to those who receive this posting multiple times.
From barto at cs.umass.edu Fri Oct 10 13:53:46 1997 From: barto at cs.umass.edu (Andy Barto) Date: Fri, 10 Oct 1997 13:53:46 -0400 Subject: Post Doctoral Position Message-ID: Below is a notice concerning a postdoctoral position in our lab for someone interested in motor control, learning, and development. ------------------------------------------- Adaptive Networks Laboratory, Department of Computer Science, University of Massachusetts, Amherst Postdoctoral Fellowship A postdoctoral fellowship will be available starting Jan 1, 1998 for interdisciplinary research related to the development of motor skills in human infants. The project involves collaboration between computer science researchers and developmental psychologists focusing on the development of reaching skills in infants. The goals are to develop computational models of motor learning that are consistent with modern developmental data and to design and conduct behavioral experiments related to these models. Applicants should have a Ph.D. in Computer Science, Psychology, or a related discipline, have knowledge of motor control and modeling techniques, have programming experience, and show evidence of exceptional research promise. The position is in the Adaptive Networks Laboratory, Department of Computer Science, University of Massachusetts, Amherst. For further information, contact: Dr. Andrew Barto, Tel. 413-545-2109; FAX: 413-545-1249; Email: Barto at cs.umass.edu. For further information regarding the laboratory, the department, and the pleasant environs, look at the WWW pages http://www.cs.umass.edu/ and http://www-anw.cs.umass.edu/. Also see http://forte.sbs.umass.edu/~berthier for relevant developmental literature. The University of Massachusetts is an Affirmative Action/Equal Opportunity employer.
From brychcy at informatik.tu-muenchen.de Tue Oct 14 06:01:32 1997 From: brychcy at informatik.tu-muenchen.de (Till Brychcy) Date: Tue, 14 Oct 1997 12:01:32 +0200 Subject: CALL FOR PAPERS: FNS '98 in Munich, Germany Message-ID: <97Oct14.120144+0200met_dst.49366+9@papa.informatik.tu-muenchen.de> (A copy of this message has also been posted to the following newsgroups: comp.ai.neural-nets, comp.ai.fuzzy, de.sci.informatik.ki, de.sci.informatik.misc) CALL FOR PAPERS 5th International GI-Workshop Fuzzy-Neuro Systems '98 - Computational Intelligence - 18-20 March 1998, Munich Fuzzy-Neuro Systems '98 is the fifth event in a well-established series of workshops with international participation. Its aim is to give an overview of the state of the art in research and development of fuzzy systems and artificial neural networks. Another aim is to highlight applications of these methods and to forge innovative links between theory and application by means of creative discussions. Fuzzy-Neuro Systems '98 is being organized by the Research Committee 1.2 "Inference Systems" (Fachausschuss 1.2 "Inferenzsysteme") of the German Society of Computer Science GI (Gesellschaft fuer Informatik e.V.) and Technische Universitaet Muenchen, with support from Siemens AG. The workshop takes place at the European Patent Office in Munich from March 18 to 20, 1998. Scientific Topics: - theory and principles of multivalued logic and fuzzy logic - representation of fuzzy knowledge - approximate reasoning - fuzzy control in theory and practice - fuzzy logic in data analysis, signal processing and pattern recognition - fuzzy classification systems - fuzzy decision support systems - fuzzy logic in non-technical areas like business administration, management etc.
- fuzzy databases - theory and principles of artificial neural networks - hybrid learning algorithms - neural networks in pattern recognition, classification, process monitoring and production control - theory and principles of evolutionary algorithms: genetic algorithms and evolution strategies - discrete parameter and structure optimization - hybrid systems like neuro-fuzzy systems, connectionistic expert systems etc. - special hardware and software Please send four copies of your scientific contribution (4 to 6 pages) by 30 Nov. 1997 to: Prof. Dr. Dr. h.c. W. Brauer - FNS '98 - Institut fuer Informatik Technische Universitaet Muenchen D-80290 Muenchen Germany For further information visit the Internet homepage at: http://wwwbrauer.informatik.tu-muenchen.de/~fns98/ From A_BROWNE at europa.nene.ac.uk Wed Oct 15 10:12:16 1997 From: A_BROWNE at europa.nene.ac.uk (Tony Browne) Date: Wed, 15 Oct 1997 14:12:16 +0000 Subject: Two New Books Message-ID: <7B677E7061@europa.nene.ac.uk> Two New Books on Neural Networks Volume 1: Neural Network Perspectives on Cognition and Adaptive Robotics A. Browne (Ed.). Institute of Physics Press, Bristol, UK. ISBN 0-7503-0455-3 Volume 2: Neural Network Analysis, Architectures and Applications. A. Browne (Ed.). Institute of Physics Press, Bristol, UK. ISBN 0-7503-0499-5 CONTENTS: Volume 1: Part 1: Representation Challenges for Neural Computing. Antony Browne Representing Structure and Structured Representations in Connectionist Networks. Lars Niklasson and Mikael Boden. Chaos, Dynamics and Computational Power in Biologically Plausible Neural Networks. Robert Kentridge. Information-Theoretic Approaches to Neural Network Learning. Mark Plumbley. Part 2: Cognitive Modelling Exploring Different Approaches towards Everyday Commonsense Reasoning. Ron Sun Natural Language Processing with Subsymbolic Neural Networks. Risto Miikkulainen. The Relational Mind. Professor John Taylor. Neuroconsciousness: A Fundamental Postulate.
Professor Igor Aleksander. Part 3: Adaptive Robotics The Neural Mind and The Robot. Professor Noel Sharkey and Jan Heemskerk. Teaching a Robot to See How it Moves. Patrick van der Smagt. Designing a Nervous System for an Adaptive Mobile Robot. Tom Scutt and Robert Damper. Bibliography (347 References) VOLUME 2: Part 1: Understanding and Simplifying Networks Analysing the Internal Representations of Trained Neural Networks. John Bullinaria. Information Maximization to Simplify Internal Representation. Ryotaro Kamimura. Rule Extraction from Trained Artificial Neural Networks. Robert Andrews, Alan Tickle, Mostefa Golea and Joachim Diederich. Part 2: Novel Architectures and Algorithms Pulse-Stream Techniques and Circuits for Implementing Neural Networks. Robin Woodburn and Professor Alan Murray. Cellular Neural Networks. Mark Joy. Efficient Training of Feed-Forward Neural Networks. Martin Moller. Exploiting Local Optima in Multiversion Neural Computing. Professor Derek Partridge. Part 3: Applications Neural and Neuro-Fuzzy Control Systems. Phil Picton. Image Compression using Neural Networks. Christopher Cramer and Erol Gelenbe. Oil Spill Detection: A Case Study using Recurrent Artificial Neural Networks. Tom Ziemke, Mikael Boden and Lars Niklasson. Bibliography (216 References) Dr A. Browne School of Information Systems Nene College Northampton NN2 7AL, UK From michal at neuron.tau.ac.il Wed Oct 15 06:11:36 1997 From: michal at neuron.tau.ac.il (Michal Finkelman) Date: Wed, 15 Oct 1997 12:11:36 +0200 (IST) Subject: Symposium on Neural Computation in honor of Prof. 
David Horn - Second Notice Message-ID: %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% TEL AVIV UNIVERSITY THE RAYMOND & BEVERLY SACKLER FACULTY OF EXACT SCIENCES SCHOOL OF PHYSICS AND ASTRONOMY SYMPOSIUM ON PARTICLE PHYSICS AND NEURAL COMPUTATION ---------------------------------------------------- IN HONOR OF DAVID HORN'S 60TH BIRTHDAY -------------------------------------- Monday, October 27th 1997 (9:15 AM - 5:30 PM) Lev Auditorium, Tel-Aviv University PROGRAM ---------- 9:15 AM: Opening addresses: Nili Cohen, Rector of Tel-Aviv University Yuval Ne'eman (Tel Aviv) 9:30 - 10:30: Gabriele Veneziano (CERN) - From s-t-u Duality to S-T-U Duality 10:30 - 11:00: Coffee break 11:00 - 12:00: Fredrick J Gilman (Carnegie Mellon) - CP Violation 12:00 - 1:30: Lunch break 1:30 - 2:30: Leon N Cooper (Brown) - From Receptive Fields to the Cellular Basis for Learning and Memory Storage: A Unified Learning Hypothesis 2:30 - 3:30: John J Hopfield (Princeton) - How Can We Be So Smart? Information Representation and Neurobiological Computation. 3:30 - 4:00: Coffee break 4:00 - 5:00: Yakir Aharonov (Tel Aviv) - A New Approach to Quantum Mechanics 5:00 PM: David Horn - Closing Remarks %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% 1. Colleagues and friends who wish to attend the symposium are kindly requested to NOTIFY US IN ADVANCE by e-mailing michal at neuron.tau.ac.il. fax: 972-3-6407932 2. http://neuron.tau.ac.il/Symposium %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% From Peter.Nordin at dacapo.se Wed Oct 15 11:56:06 1997 From: Peter.Nordin at dacapo.se (Peter Nordin) Date: Wed, 15 Oct 1997 16:56:06 +0100 Subject: Announcement: Thesis available Message-ID: <01BCD98B.3BA54660@dhcp1.dacapo.se> My thesis on evolution or induction of register machine code is now available from the publisher Krehl-verlag in Germany.
The described machine learning technique is a variant of genetic programming applied to the evolution of binary machine code for a real computer--a technique which is very machine-efficient. The title of the thesis is: Evolutionary Program Induction of Binary Machine Code and its Applications The thesis includes applications to robot control as well as image and sound processing. It can be ordered from: Krehl Verlag Postfach 51 01 42 D-48163 Muenster GERMANY Tel/Fax +49 231 261550 email: krehl-verlag at t-online.de Price: 68 DM Below you will find the abstract, table of contents and details on how to order. Best regards, Peter Nordin --------------------------------------------------------------------- Evolutionary Program Induction of Binary Machine Code and its Applications Author: Peter Nordin Supervisor: Wolfgang Banzhaf ISBN 3-931546-07-1 290 pages Abstract This thesis presents the Compiling Genetic Programming System (CGPS), a machine learning method for automatic program induction. The objective of the system is to automatically produce computer programs. CGPS is the marriage of two new ideas: (1) a special evolutionary program induction algorithm variant, and (2) the use of large-scale meta-manipulation of binary code. Both ideas may have merits on their own, but it is by combining the two that the real benefits emerge, making CGPS a powerful machine learning paradigm. (1) The evolutionary program induction method is an instance of an evolutionary algorithm (EA)--a class of algorithms that borrow metaphors from biology, evolution and natural selection. It uses a linear program representation, in contrast to other widely used methods such as Koza's genetic programming (GP) approach, which has a hierarchical tree-based program structure. One way to view CGPS is as a large-alphabet genetic algorithm (GA) where each letter in the alphabet corresponds to a syntactically closed computer program structure.
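This linear, gene-like representation and its closure-preserving operators can be illustrated with a toy sketch. CGPS itself operates on real SPARC machine code; the instruction set, register count and function names below are all invented for illustration and are not the thesis's implementation:

```python
import random

# Hypothetical toy instruction set: each "letter"/gene is a syntactically
# closed register-machine instruction (dest, op, src1, src2) over r0..r3.
OPS = {"add": lambda a, b: a + b,
       "sub": lambda a, b: a - b,
       "mul": lambda a, b: a * b}

def random_instruction():
    return (random.randrange(4), random.choice(sorted(OPS)),
            random.randrange(4), random.randrange(4))

def random_individual(length=8):
    # Fixed header/footer stand in for CGPS's untouchable prefix/suffix code.
    return ["HEADER"] + [random_instruction() for _ in range(length)] + ["FOOTER"]

def crossover(a, b):
    # Recombine whole instructions only, never inside one, so offspring
    # remain syntactically closed; header and footer are never exchanged.
    i, j = random.randrange(1, len(a) - 1), random.randrange(1, len(b) - 1)
    return a[:i] + b[j:-1] + ["FOOTER"], b[:j] + a[i:-1] + ["FOOTER"]

def mutate(ind):
    # Replace one valid instruction with another valid instruction.
    out = list(ind)
    out[random.randrange(1, len(out) - 1)] = random_instruction()
    return out

def run(ind, x):
    # Execute the instruction list on registers initialised from the input.
    regs = [x, 0.0, 0.0, 0.0]
    for dest, op, s1, s2 in ind[1:-1]:   # skip header and footer
        regs[dest] = OPS[op](regs[s1], regs[s2])
    return regs[0]
```

Selection plus repeated crossover and mutation over a population of such individuals drives the search; the point of the fixed header/footer and instruction-boundary operators is that every individual ever produced remains directly executable.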
A letter in the GA could, for example, be a line in a computer program, i.e. an assignment a := a + 1. CGPS uses recombination (crossover) between letters, which guarantees syntactic closure during evolution. However, from a GA point of view, the letter in the linear string normally does not have an internal structure. A program line in a computer language or a machine code instruction, by contrast, has plenty of internal structure determining the operation to be performed and the operands used. A better metaphor could therefore be the gene concept in nature. Genes in DNA are syntactically closed sequences providing the recipe for a protein. As in CGPS, crossover normally acts between genes/instructions, preserving the syntactic closure of the object. There is a mutation operator to produce variation within the internal structure of the gene/instruction. In CGPS, the mutation operator changes the syntactically closed program object into another syntactically closed program object--it replaces a valid letter with another valid letter in the GA alphabet. CGPS also uses two significant features in the program individual (genome): the header and the footer. The header is a prefix part of the individual, while the footer is the suffix part. The genetic operators prevent the header and the footer from being changed, and the header and the footer are used to ensure syntactic closure of the whole individual. (2) CGPS in this thesis is mostly applied to program induction of binary machine code. Binary machine code is the code in the computer that is directly executed by the processor. The program individual in CGPS is a binary machine code function, and the genetic operators operate directly on the binary machine code. This implies that CGPS is a meta-manipulating program. In low-level programming, a meta-manipulating program is a program that changes its own binary code or the binary code of another program. Usually, this only means changing a few bytes, e.g.
dynamic linking of a function. However, CGPS is the first real instance of a large-scale meta-manipulating program, one which constantly manipulates, shuffles around and changes large chunks of binary code as part of the learning process. So, the output from CGPS is a binary machine code program that can be executed directly by the processor. The meta-manipulation of the individual (genome) is done with the evolutionary algorithm mentioned in (1) above. The headers and footers are used to make sure that the individual always preserves syntactic closure during evolution, no matter what the genetic operators do with the code in between. The marriage of these two ideas produces a system which can induce Turing-complete machine code programs very efficiently. The system also has several other attractive properties, such as constant memory usage, compact representation and uncomplicated memory management. In this thesis the background to CGPS is presented together with other evolutionary algorithms and other evolutionary program induction techniques. A detailed description of CGPS and its implementation is given, together with several evaluations or case studies in different feasible domains such as robotic control, image processing and natural language processing. Some of the evaluations make comparisons to other machine learning algorithms: neural networks and hierarchical genetic programming. The formal definition of CGPS is given, followed by an investigation of generalization in variable-length evolutionary algorithms. Finally, several possible directions for the future of CGPS research are presented.

Table of Contents

1 Introduction ..... 21
  1.1 Evolutionary Algorithms ..... 22
  1.2 Genetic Programming ..... 25
  1.3 Machine Language Genetic Programming ..... 31
  1.4 Introduction to CGPS ..... 38

I Implementation ..... 47
2 Computer Hardware ..... 49
  2.1 Introduction to Computer Hardware ..... 50
  2.2 Von Neumann Computers ..... 50
3 The SPARC Architecture and CGPS ..... 55
  3.1 Fundamentals of CGPS ..... 56
  3.2 Implementation ..... 58
  3.3 Floating Point Instructions ..... 69
  3.4 Self-Modifying Code in C Language ..... 69
  3.5 How to Call a Self-made Function from C ..... 69
  3.6 Genetic Operators ..... 70
  3.7 Initialisation ..... 72
4 Additional Features of the System ..... 75
  4.1 Leaf Procedures and Primitives ..... 77
  4.2 Memory in Tree-Based GP ..... 78
  4.3 Conditionals ..... 80
  4.4 Automatically Defined Subroutines in CGPS ..... 80
  4.5 Leaf Procedure Examples ..... 84
  4.6 Loops and Recursion in Tree-Based GP ..... 85
  4.7 Loops and Recursion in CGPS ..... 88
  4.8 Loop Example in CGPS ..... 88
  4.9 External Functions ..... 91
  4.10 Strings and Lists ..... 91
  4.11 Parameters to the System ..... 92
  4.12 C-Language Output ..... 93
  4.13 Platforms for CGPS ..... 94
  4.14 Portability Methods ..... 94
  4.15 Caveats ..... 94
  4.16 How to Get Started ..... 95
  4.17 Using Tree Representation ..... 96
  4.18 Speed Evaluation ..... 97
  4.19 Why is Binary Manipulation so Fast? ..... 98
  4.20 Applications ..... 99
5 A Walkthrough of an Example System ..... 101
  5.1 Variables, Constants and Parameters ..... 102
  5.2 Random Number Generation ..... 104
  5.3 Initialisation ..... 107
  5.4 Output Functions ..... 107
  5.5 The Fitness Function ..... 108
  5.6 Reproduce an Individual ..... 109
  5.7 Crossover ..... 110
  5.8 Mutation ..... 110
  5.9 Tournament Selection ..... 111
  5.10 Read Training Data ..... 111
  5.11 The Main Procedure ..... 112

II Evaluations ..... 115
6 On-line CGPS ..... 117
  6.1 Background ..... 118
  6.2 Introduction ..... 119
  6.3 Methods ..... 122
  6.4 Results ..... 130
  6.5 Conclusions of On-line CGPS ..... 135
7 Control Using Memory of Past Events ..... 139
  7.1 Introduction ..... 140
  7.2 The Evolutionary Algorithm ..... 140
  7.3 Setup ..... 141
  7.4 Objectives ..... 141
  7.5 The Memory-Based GP Control Architecture ..... 142
  7.6 Results ..... 146
  7.7 Future Directions of Memory-Based GP in Control ..... 152
  7.8 Summary ..... 153
8 High-Performance Applications ..... 155
  8.1 Historic Remarks on CGPS and Image Coding ..... 156
  8.2 Introduction ..... 156
  8.3 Implementation ..... 157
  8.4 Results ..... 159
  8.5 Summary and Conclusions ..... 161
9 CGPS and Programmatic Compression ..... 163
  9.1 Introduction ..... 164
  9.2 Programmatic Compression (PC) ..... 164
  9.3 Compression of Sound ..... 166
  9.4 Compression of Pictures ..... 171
  9.5 Summary and Conclusion ..... 173
10 CGPS and Tree-Based GP ..... 175
  10.1 Introduction ..... 176
  10.2 Methods ..... 176
  10.3 The Evolutionary Algorithm ..... 177
  10.4 Results ..... 177
  10.5 Discussion and Conclusions ..... 177
11 Neural Networks and CGPS ..... 179
  11.1 The Sample Problem ..... 180
  11.2 The Neural Network ..... 180
  11.3 Results ..... 182
  11.4 Summary ..... 183
12 Neural Networks and Generalisation ..... 185
  12.1 The Problems Used in This Study ..... 186
  12.2 Classification as Symbolic Regression ..... 187
  12.3 Introns and Explicitly Defined Introns ..... 187
  12.4 Results ..... 187
  12.5 Summary ..... 190

III Explanation ..... 193
13 Formalisation of CGPS ..... 195
  13.1 Register Machines ..... 196
  13.2 Evolutionary Algorithms ..... 199
  13.3 Compiling Genetic Programming ..... 203
14 Complexity, Compression and Evolution ..... 205
  14.1 Introduction ..... 206
  14.2 Complexity, Effective Fitness and Evolution ..... 209
  14.3 Example ..... 212
  14.4 Kolmogorov Complexity and Generalisation ..... 217
  14.5 Empirical Results ..... 220
  14.6 Other Evolutionary Techniques ..... 223
  14.7 Summary and Conclusions ..... 223
15 Explicitly Defined Introns ..... 229
  15.1 Introduction ..... 230
  15.2 Definitions ..... 232
  15.3 The Experimental Setup ..... 233
  15.4 Results ..... 235
16 Conclusions and Outline of Future Perspectives ..... 245
  16.1 A Computer Language for Meta-manipulation ..... 246
  16.2 Typed GP, Constrained Crossover, Grammar and Search Bias ..... 247
  16.3 Other Machine Learning Algorithms ..... 251
  16.4 Special Processors ..... 252
  16.5 Large Number of Input Parameters ..... 253
  16.6 Reasoning about Machine Code Programs with a Meta GP System ..... 254
  16.7 The Logic of Genetic Reasoning ..... 258
  16.8 Some Brief Initial Results ..... 259
Appendix A: Flow Charts of CGPS ..... 263

* Customers within Germany can order in any bookstore (DM 68,-); firms/institutions and holders of a Master or Visa credit card can also order directly (shipping and credit card service charge included).
* International sales are possible to Master or Visa card holders: - The order should arrive via FAX (+49 2501 261550) or by PGP-signed (and optionally encrypted) email; our PGP key is available upon request. At the moment these are the only channels protecting the customer's sensitive payment data (we are working on secure ordering via WWW but have no final timeline for this yet). The order should mention the credit card type, number and expiration date, as well as the card holder's name and address. - Besides the cost of the book (w/o VAT), i.e. DM 68,- minus 7%, the interested party has to choose between "surface" or "air" mail (rates available upon request for different areas, e.g. USA: DM 3.50 for 3-5 weeks delivery or DM 24,- for 1-2 weeks delivery). The total amount is drawn from the credit card in German currency.
From ataxr at IMAP1.ASU.EDU Wed Oct 15 22:01:31 1997 From: ataxr at IMAP1.ASU.EDU (Asim Roy) Date: Wed, 15 Oct 1997 22:01:31 -0400 (EDT) Subject: COULD THERE BE REAL-TIME, INSTANTANEOUS LEARNING IN THE BRAIN?
Message-ID: I am posting this memo to various newsgroups. So my apologies if you get multiple copies. I thought this would be an interesting topic of discussion in these scientific communities. Please respond to me directly and I will post all responses at the appropriate time. Asim Roy Arizona State University ------------------------------------------------ COULD THERE BE REAL-TIME, INSTANTANEOUS LEARNING IN THE BRAIN? One of the fundamental beliefs in neuroscience, cognitive science and artificial neural networks is that the brain learns in real-time. That is, it learns instantaneously from each and every learning example provided to it by adjusting the synaptic strengths or connection weights in a network of neurons. The learning is generally thought to be accomplished using a Hebbian-style mechanism or some other variation of the idea (a local learning law). In these scientific fields, real-time learning also implies memoryless learning. In memoryless learning, no training examples are stored explicitly in the memory of the learning system, such as the brain. It can use any particular training example presented to it to adjust whatever network it is learning in, but must forget that example before examining others. The idea is to obviate the need for large amounts of memory to store a large number of training examples. This section looks at the possibility of real-time learning in the brain from two different perspectives. First, some factual behavioral evidence from a recent neuroscience study on learning of motor skills is examined. Second, the idea of real-time learning is examined from a broader behavioral perspective. A recent study by Shadmehr and Holcomb [1997] may lend some interesting insight on how the brain learns. In this study, a positron emission tomography (PET) device was used to monitor neural activity in the brain as subjects were taught and then retested on a motor skill. 
The task required them to manipulate an object on a computer screen by using a motorized robot arm. It required making precise and rapid reaching movements to a series of targets while holding the handle of the robot. And these movements could be learned only through practice. During practice, the blood flow was most active in the prefrontal cerebral cortex of the brain. After the practice session, some of the subjects were allowed to do unrelated routine things for five to six hours and then retested on their recently acquired motor skill. During retesting of this group, it was found that they had learned the motor skill quite well. But it was also found that the blood flow now was most active in a different part of the brain, in the posterior parietal and cerebellar areas. The remaining test subjects were trained on a new motor task immediately after practicing the first one. Later, those subjects were retested on the first motor task to find out how much of it they had learnt. It was found that they had reduced levels of skill (learning) on the first task compared to the other group. So Shadmehr and Holcomb [1997] conclude that after practicing a new motor skill, it takes five to six hours for the memory of the new skill to move from a temporary storage site in the front of the brain to a permanent storage site at the back. But if that storage process is interrupted by practicing another new skill, the learning of the first skill is hindered. They also conclude that the shift of location of the memory in the brain is necessary to render it invulnerable and permanent. That is, it is necessary to consolidate the motor skill. What are the real implications of this study? One of the most important facts is that although both groups had identical training sessions, they had different levels of learning of the motor task because of what they did subsequent to practice.
From this fact alone one can conclude with some degree of certainty that real-time, instantaneous learning is not used for learning motor skills. How can one say that? One can make that conclusion because if real-time learning were used, there would have been continuous and instantaneous adjustment of the synaptic strengths or connection weights during practice in whatever net the brain was using to learn the motor task. This means that all persons trained in that particular motor task should have had more or less the same "trained net," performance-wise, at the end of that training session, regardless of what they did subsequently. (It is assumed here that the task was learnable, given enough practice, and that both groups had enough practice.) With complete, permanent learning (weight-adjustments) from "real-time learning," there should have been no substantial differences in the learnt skill between the two groups resulting from any activity subsequent to practice. But this study demonstrates the opposite, that there were differences in the learnt skill simply because of the nature of subsequent activity. So real-time, instantaneous and permanent weight-adjustment (real-time learning) is contradictory to the results here. Second, from a broader behavioral perspective, all types of "learning" by the brain involve collection and storage of information prior to actual learning. As is well known, the fundamental process of learning involves: (1) collection and storage of information about a problem, (2) examination of the information at hand to determine the complexity of the problem, (3) development of trial solutions (nets) for the problem, (4) testing of trial solutions (nets), (5) discarding such trial solutions (nets) if they are not good enough, and (6) repetition of these processes until an acceptable solution is found. Real-time learning is not compatible with these learning processes. One has to remember that the essence of learning is generalization.
In order to generalize well, one has to look at the whole body of information relevant to a problem, not just bits and pieces of the information at a time as in real-time learning. So the argument against real-time learning is simple: one cannot learn (generalize) unless one knows what is there to learn (generalize). One finds out what is there to learn (generalize) by collecting and storing information about the problem. In other words, no system, biological or otherwise, can prepare itself to learn (generalize) without having any information about what is to be learnt (generalized). Learning of motor skills is no exception to this process. The process of training is simply to collect and store information on the skill to be learnt. For example, in learning any sport, one not only remembers the various live demonstrations given by an instructor (pictures are worth a thousand words), but one also remembers the associated verbal explanations and other great words of advice. Instructions, demonstrations and practice of any motor skill are simply meant to provide the rules, exemplars and examples to be used for learning (e.g. a certain type of body, arm or leg movement in order to execute a certain task). During actual practice of a motor skill, humans not only try to follow the rules and exemplars to perform the actual task, but they also observe and store new information about which trial worked (example trial execution of a certain task) and which didn't. One only ought to think back to the days of learning tennis, swimming or some such sport in order to verify information collection and storage by humans to learn motor skills. It shouldn't be too hard to explain the "loss of skill" phenomenon, from back-to-back instructions on new motor skills, that was observed in the study. The explanation shouldn't be different from the one for the "forgetting of instructions" phenomenon that occurs with back-to-back instructions in any learning situation.
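The distinction this argument leans on -- memoryless, per-example weight adjustment versus learning from a stored body of examples -- can be sketched in miniature. The toy one-weight learner below is purely illustrative (not a model of the brain or of any system discussed here):

```python
def learn_online(examples, lr=0.1):
    # Memoryless ("real-time") learning: each (x, y) example nudges the
    # weight via a delta-rule update and is then forgotten.
    w = 0.0
    for x, y in examples:
        w += lr * (y - w * x) * x
    return w

def learn_batch(examples):
    # Memory-based learning: store every example first, then fit the
    # weight against the whole body of information (least squares).
    sxx = sum(x * x for x, _ in examples)
    sxy = sum(x * y for x, y in examples)
    return sxy / sxx
```

On data drawn from y = 2x, the batch learner recovers w = 2 exactly, while a single memoryless pass only moves part of the way there; on inconsistent data, the memoryless learner's answer also depends on the order in which the examples arrive.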
A logical explanation perhaps for the "loss of motor skill" phenomenon, as for any other similar phenomenon, is that the brain has a limited amount of working or short term memory. And when encountering important new information, the brain stores it simply by erasing some old information from the working memory. And the prior information gets erased from the working memory before the brain has the time to transfer it to a more permanent or semi-permanent location for actual learning. So "loss of information" in working memory leads to a "loss of skill." Another fact from the study that is highly significant is that the brain takes time to learn. Learning is not quick and instantaneous. Reference: Shadmehr, R. and Holcomb, H. (August 1997). "Neural Correlates of Motor Memory Consolidation." Science, Vol. 277, pp. 821-825. From protasi at mat.utovrm.it Thu Oct 16 07:31:16 1997 From: protasi at mat.utovrm.it (Marco Protasi) Date: Thu, 16 Oct 1997 13:31:16 +0200 Subject: Special issue of DAM Message-ID: CALL FOR PAPERS Special Issue on Discrete vs analog computation: Links between computational complexity and local minima Discrete Applied Mathematics (Elsevier) M. Gori and M. Protasi (Eds) ================================================================================ In the last few years, the resurgence of interest in fields like artificial neural networks has been raising some very intriguing theoretical questions on the links between discrete and analog computation. Basically, most analog schemes massively adopted in machine learning and combinatorial optimization rely on function optimization. In this setting, the inherent complexity of the problem at hand seems to appear in terms of local minima and, more generally, in terms of numerical problems of the chosen optimization algorithm. 
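The point just made -- that in the analog setting the hardness of a problem surfaces as local minima of the function being optimized -- can be seen in its most elementary form with plain gradient descent on a nonconvex objective. The one-dimensional "energy" below is invented for illustration and is not an example from the call itself:

```python
def grad_descent(df, x0, lr=0.01, steps=5000):
    # Follow the local slope only; no global information is used.
    x = x0
    for _ in range(steps):
        x -= lr * df(x)
    return x

# A nonconvex energy with two minima: f(x) = x^4 - 3x^2 + x.
f = lambda x: x**4 - 3*x**2 + x
df = lambda x: 4*x**3 - 6*x + 1

x_a = grad_descent(df, x0=1.0)    # settles in the shallower minimum, near x ~ 1.1
x_b = grad_descent(df, x0=-1.0)   # settles in the global minimum, near x ~ -1.3
```

Both runs converge to stationary points, but only one starting point finds the global optimum -- the "no guarantee" just referred to; in the discrete analogues of such problems, the number and depth of these minima is where the combinatorial complexity reappears.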
While most practitioners are used to accepting without reluctance the flavor of suspiciousness which arises from function optimization (in which one often has no guarantee of reaching the global optimum), most theoreticians are instead quite skeptical. On the other hand, the success of analog computation for either learning or problem solving is often related to the problem at hand and, therefore, one can expect an excellent behavior for a class of problems, while one can raise serious suspicions about the solution of others. To the best of our knowledge, this intuitive idea has not been satisfactorily explained from a theoretical point of view. Basically, there is neither a theory to naturally support the intuitive concept of suspiciousness which arises from approaches based on continuous optimization, nor a theory to relate this concept to computational complexity, traditionally developed in the discrete setting. The study of the complexity of algorithms has been essentially performed on discrete structures and an impressive theory has been developed in the last two decades. On the other hand, optimization theory has a twofold face: discrete optimization and continuous optimization. Actually, there are some important approaches of computational complexity theory that were proposed for the continuous case, for instance, information-based complexity (Traub) and real Turing machines (Blum-Shub-Smale). These approaches can be fruitfully applied to problems arising in continuous optimization but, generally speaking, the formal study of the efficiency of algorithms and problems has received much more attention in the discrete environment, where the theory can be easily used and error and precision problems are not present. The face that complexity assumes in the analog setting is not very clear and, moreover, the links with traditional computational complexity for the discrete setting still need to be explored.
The aim of the special issue is mainly to study the links between discrete and continuous versions of the same problem. Since these links have so far been rarely explored, the special issue is intended to stimulate different points of view; people working on discrete optimization are in fact unlikely to be experts on the continuous side, and vice versa. Prospective authors include theoretical computer scientists working in the field of algorithms, operations researchers, researchers working in neural networks, and researchers in nonlinear system theory. Possible topics for papers submitted to the special issue include, but are not limited to:
- Links between the complexity of algorithms in the continuous and discrete settings
- Approximate solutions of problems in the continuous and discrete settings
- Analog computing and dynamical systems
- Combinatorial optimization by neural networks: Complexity issues
- Learning in artificial neural networks: Local minima and complexity issues
All submissions will undergo a thorough refereeing process, according to the usual standards of the journal. Prospective authors should submit six copies of a manuscript to one of the Guest Editors by April 10, 1998.
================================================================================ Marco Gori Dipartimento di Ingegneria dell'Informazione Universita' di Siena Via Roma, 56 53100 Siena (Italy) Voice: +39 (577) 26.36.10 Fax: +39 (577) 26.36.02 E-mail: marco at ing.unisi.it WWW: http://www-dsi.ing.unifi.it/neural/~marco Marco Protasi Dipartimento di Matematica Universita' di Roma "Tor Vergata" Via della Ricerca Scientifica 00133 Roma (Italy) Voice: +39 (6) 72.59.46.78 Fax: +39 (6) 72.59.46.99 or 72.59.42.95 E-mail: protasi at mat.utovrm.it WWW: http://www.mat.utovrm.it From kmccumb at mri.jhu.edu Thu Oct 16 08:23:32 1997 From: kmccumb at mri.jhu.edu (Karen McCumber) Date: Thu, 16 Oct 1997 08:23:32 -0400 Subject: Distinguished Postdoctoral Fellowships at Hopkins Message-ID: <1.5.4.32.19971016122332.006fc028@128.220.160.41> Johns Hopkins University Biomedical Engineering Department invites applications for the Distinguished Postdoctoral Fellowship Program. This program is funded by the Whitaker Foundation. It aims to promote an interdisciplinary approach to the study of complex biomedical systems and to provide the very best of recently graduated PhDs an opportunity to perform independent research in a supportive environment. Each recipient will be sponsored by two faculty, at least one of whom must have a primary appointment in BME. Of particular interest are candidates with a background in computational neuroscience who wish to pursue projects in laboratories of the Systems Neuroscience or Theoretical and Computational Biology faculty groups.
Salary of the fellows will start at $30,000 per year, with additional funds to cover health insurance. The duration of the fellowship is two years. Interested applicants should submit the following documents:
* CV
* Two letters of reference
* Two-page summary of research interests
Send your application materials to:
Dr. Murray Sachs
Dept. of Biomedical Engineering, Johns Hopkins School of Medicine
720 Rutland Ave
Baltimore, MD 21205
Deadline for receipt of applications is Dec. 15, 1997. For further information: http://www.bme.jhu.edu From thimm at idiap.ch Thu Oct 16 08:45:22 1997 From: thimm at idiap.ch (Georg Thimm) Date: Thu, 16 Oct 1997 14:45:22 +0200 Subject: Events on Neural Networks on a WWW page (new WWW address!) Message-ID: <199710161245.OAA27738@rotondo.idiap.ch> WWW page for Announcements of Conferences, Workshops and Other Events on Neural Networks and Related Fields (e.g. Vision and Speech) ----------------------------------------- This WWW page allows you to enter and look up announcements for conferences, workshops, and other events on neural networks and related fields (e.g. vision and speech). The event list, which is updated almost daily, contains more than 150 forthcoming events and can be accessed via the URL: http://www.idiap.ch/NN-events The entries are ordered chronologically and presented in a format for fast and easy lookup of:
- the date and place of the event,
- the title of the event,
- a contact address (surface mail, email, ftp, and WWW address, as well as telephone or fax number),
- deadlines for submissions, registration, etc., and
- topics of the event.
Conference organizers are kindly asked to enter their conference into the database. The list is published in part in the journal Neurocomputing by Elsevier Science B.V. Information on past conferences is also available. Regards, Georg Thimm P.S. Please distribute this announcement to neural network, vision or speech related mailing lists. Comments and suggestions are welcome!
From reggia at cs.umd.edu Fri Oct 17 13:06:54 1997 From: reggia at cs.umd.edu (James A. Reggia) Date: Fri, 17 Oct 1997 13:06:54 -0400 (EDT) Subject: CFP: Neural Models Brain & Cognitive Disorders Message-ID: <199710171706.NAA13289@avion.cs.umd.edu> SECOND INTERNATIONAL WORKSHOP ON NEURAL MODELING OF BRAIN AND COGNITIVE DISORDERS ** Initial Announcement and Call for Abstracts ** Sponsors: National Institute of Mental Health Whitaker Foundation Univ. of Maryland Inst. for Advanced Computer Studies Center for Neural Basis of Cognition, CMU & Univ. of Pittsburgh Adams Super Center for Brain Studies, Tel Aviv Neuroscience and Cognitive Science Program, UMCP A workshop on Neural Modeling of Brain and Cognitive Disorders will be held on June 4 - 6, 1998 at the University of Maryland, College Park, just outside of Washington, DC. The focus of this meeting will be on the lesioning of neural network models to study disorders in neurology, neuropsychology and psychiatry, such as Alzheimer's disease, amnesia, aphasia, depression, epilepsy, neglect, parkinsonism, schizophrenia, and stroke. These models attempt to explain how specific pathological neuroanatomical and neurophysiological changes can result in various clinical manifestations, and to investigate the functional organization of the symptoms that result from specific brain pathologies. A Proceedings consisting of abstracts from the presentations will be available for attendees. The emphasis at the workshop will be on reviewing and discussing new contributions to this field since the first meeting was held in 1995. Many of the invited contributions from the first workshop appeared in a World Scientific book last year; see web page indicated below. *** CALL FOR ABSTRACTS *** Individuals wishing to present a poster related to any aspect of the workshop's themes should submit an abstract describing the nature of their presentation. 
The single page submission should include title, author(s), contact information (address and email/fax), and abstract. One inch margins and a typesize of at least 10 points should be used. Abstracts will be reviewed by the Program Committee; those accepted will be published in the workshop proceedings. Six copies of the camera-ready abstract should be mailed TO ARRIVE by February 1, 1998 to James A. Reggia, Dept. of Computer Science, A.V. Williams Bldg., University of Maryland, College Park, MD 20742 USA. Web Page -------- The latest information about this meeting can be found at http://www.cs.umd.edu/~reggia/workshop/ Travel Fellowships: ------------------ Funding is expected for a few fellowships to offset travel cost of students, postdocs, and/or residents. Further details will be forthcoming. CME Credit: ---------- The possibility of offering CME credits for attendance is currently being explored. Program Committee: ----------------- Rita Berndt (UMAB), Avis Cohen (UMCP), Tim Gale (Univ. Hertfordshire), Helen Gigley (ONR), Dennis Glanzman (NIMH), Barry Gordon (Hopkins), Michael Hasselmo (Harvard), James McClelland (CMU), James Reggia (UMCP), Eytan Ruppin (Tel Aviv), Greg Siegel (San Diego), Nitish Thakor (Hopkins). Registration and Further Information: ----------------------------------- To receive registration materials (distributed most likely in January/February), please send your name, address, email address, phone number and fax number to Cecilia Kullman, UMIACS, A. V. Williams Bldg., University of Maryland, College Park, MD 20742 USA. (Tel: (301) 405-0304, Fax: (301) 314-9658, and email: cecilia at umiacs.umd.edu). Further questions about conference administration, hotel reservations, etc. should also be directed to Ms. Kullman. For questions about the workshop technical/scientific content or abstract submissions, please contact Jim Reggia (address above, Fax: (301) 405-6707, email: reggia at cs.umd.edu).
Preliminary List of Speakers ---------------------------- PARKINSONISM/OTHER BASAL GANGLIA DISORDERS Discussant and Chair: Steven Wise, NIMH Jose Contreras-Vidal, Arizona State University A Neural Network Model of the Effects of L-dopa Therapy in Parkinson's Disease Donald Borrett, Toronto East General Hospital Recurrent Neural Networks and Parkinson's Disease Rolf Kotter, University of Dusseldorf Striatal Mechanisms in Parkinson's Disease: Insights from Computer Modeling LANGUAGE/COGNITIVE DISORDERS Discussant and Chair: Gary Dell, University of Illinois Kate Mayall, University of Birmingham A Connectionist Model of Peripheral Dyslexia Jay McClelland, Carnegie-Mellon University Reopening the Critical Period: A Hebbian Account of Interventions that Induce Change in Language Perception Risto Miikkulainen, University of Texas at Austin Dyslexic and Aphasic Impairments in a Self-Organizing Model of the Lexicon David Plaut, Carnegie-Mellon University Systematicity and Specialization in Semantics: A Connectionist Account of Optic Aphasia STROKE AND EPILEPSY Discussant and Chair: Mark Hallett, NINDS Bill Lytton, Univ. of Wisconsin & Wm. S.
Middleton VA Hospital Modeling Recovery from Experimental Ablation Jim Reggia, University of Maryland Modeling the Interhemispheric Effects of Stroke Eytan Ruppin, Tel-Aviv University The Pathogenesis of Spreading Tissue Damage Following Acute Focal Stroke: A Computational Investigation Terry Sejnowski, Howard Hughes Medical Institute and Salk Institute Thalamic Model of Absence Epilepsy NEGLECT AND RELATED DISORDERS Discussant and Chair: Marlene Behrmann, Carnegie Mellon University Mike Mozer, University of Colorado Modeling Neglect of Objects and Space Alexandre Pouget, Georgetown University A Neural Theory of Hemineglect Richard Shillcock, University of Edinburgh Connectionist Modelling of Unilateral Visual Neglect: the Crossover Effect in Line Bisection Rita Sloan Berndt & Carol Whitney, University of Maryland Positional Reading Errors: A New Interpretation of Right Neglect Dyslexia AMNESIA/ALZHEIMER'S DISEASE Discussant and Chair: John Lisman, Brandeis University Pablo Alvarez, Boston University A Neural Model of Retrograde Amnesia and Memory Consolidation Mike Hasselmo, Harvard University Memory Function and Dysfunction in a Network Simulation of the Hippocampal Formation David Horn, Tel Aviv University Response of Multimodular Memory Networks to Different Lesion Types Mark Gluck, Rutgers University Empirical Tests of Models of Hippocampal Function with Amnesic and Elderly Populations SCHIZOPHRENIA/PSYCHIATRY Discussant and Chair: William Carpenter, Maryland Psychiatric Research Center and UMAB Jonathan Cohen, University of Pittsburgh and Carnegie-Mellon University The Role of Dopamine in Regulating Access to Prefrontal Cortex: Normal Function and Disturbances in Schizophrenia Ralph Hoffman, Yale University Modeling Postnatal Neurodevelopment, Psychosis Induction, and the Locus of Action of Antipsychotic Drugs Sunjay Berdia, Yale University Neural Network Modeling of the Wisconsin Card Sorting Test Greg Siegle, San Diego State Univ. and Univ. 
of California, San Diego A Neural Network Model of Affective Interference in Depression THE FUTURE: SCIENTIFIC AND FUNDING EXPECTATIONS Dennis Glanzman, National Institute of Mental Health From kaspar.althoefer at kcl.ac.uk Fri Oct 17 13:11:36 1997 From: kaspar.althoefer at kcl.ac.uk (Althoefer, Kaspar) Date: Fri, 17 Oct 1997 18:11:36 +0100 Subject: PhD-Thesis by Kaspar Althoefer - "Neuro-Fuzzy Motion Planning ...." Message-ID: <34479C48.4C46370C@kcl.ac.uk> The following PhD thesis is now available: "Neuro-Fuzzy Motion Planning for Robotic Manipulators" by Kaspar ALTHOEFER. The thesis will not be available on a Web or ftp site, but I would be pleased to send my thesis as a postscript file to anyone who wants a copy. Please contact me via e-mail if you are interested. My e-mail address is Kaspar.Althoefer at kcl.ac.uk. Below you will find the abstract and the table of contents. Best regards, Kaspar Althoefer. --------------------------------------------------------------------- Abstract On-going research efforts in robotics aim at providing mechanical systems, such as robotic manipulators and mobile robots, with more intelligence so that they can operate autonomously. Advancing in this direction, this thesis proposes and investigates novel manipulator path planning and navigation techniques which have their roots in the fields of neural networks and fuzzy logic. Path planning in the configuration space requires a transformation of the workspace into the configuration space. A radial-basis-function neural network is proposed to construct the configuration space by repeatedly mapping individual workspace obstacle points into so-called C-space patterns. The method is extended to compute the transformation for planar manipulators with n links as well as for manipulators with revolute and prismatic joints. A neural-network-based implementation of a computer-emulated resistive grid is described and investigated.
The grid, which is a collection of nodes laterally connected by weights, carries out global path planning in the manipulator's configuration space. In response to a specific obstacle constellation, the grid generates an activity distribution whose gradient can be exploited to construct collision-free paths. A novel update algorithm, the To&Fro algorithm, which rapidly spreads the activity distribution over the nodes, is proposed. Extensions to the basic grid technique are presented. A novel fuzzy-based system, the fuzzy navigator, is proposed to solve the navigation and obstacle avoidance problem for robotic manipulators. The presented system is divided into separate fuzzy units which individually control each manipulator link. The competing functions of goal following and obstacle avoidance are combined in each unit, providing intelligent behaviour. An on-line reinforcement learning method is introduced which adapts the performance of the fuzzy units continuously to any changes in the environment. All of the above methods have been tested in different environments on simulated manipulators as well as on a physical manipulator. The results proved these methods to be feasible for real-world applications. ........................................................................................
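The resistive-grid idea in the abstract can be illustrated with a toy sketch (my own reconstruction, not the thesis code; the grid size, clamp values, and Jacobi-style update rule are illustrative assumptions). Activity is clamped high at the goal node and low at obstacle nodes, the free nodes are relaxed toward a harmonic function, and a collision-free path is then read off by steepest ascent on the resulting activity distribution:

```python
def plan(n, obstacles, start, goal, iters=3000):
    """Relax an n-by-n resistive grid toward a harmonic function and
    extract a collision-free path from start to goal by steepest ascent."""
    v = [[0.0] * n for _ in range(n)]

    def neighbors(r, c):
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < n and 0 <= cc < n:
                yield rr, cc

    for _ in range(iters):
        new = [[0.0] * n for _ in range(n)]
        for r in range(n):
            for c in range(n):
                if (r, c) == goal:
                    new[r][c] = 1.0        # goal clamped high
                elif (r, c) in obstacles:
                    new[r][c] = 0.0        # obstacles clamped low (Dirichlet)
                else:                      # free node: average of neighbors
                    ns = list(neighbors(r, c))
                    new[r][c] = sum(v[rr][cc] for rr, cc in ns) / len(ns)
        v = new

    # Steepest ascent on the activity distribution.
    path, cur = [start], start
    for _ in range(n * n):
        if cur == goal:
            break
        cur = max(neighbors(*cur), key=lambda p: v[p[0]][p[1]])
        path.append(cur)
    return path

obstacles = {(2, 0), (2, 1), (2, 2), (2, 3)}   # a wall with a gap on the right
path = plan(6, obstacles, start=(0, 0), goal=(5, 5))
```

Because a converged harmonic function has no interior local maxima other than the goal, the greedy ascent cannot get trapped; this is the property that makes resistive grids attractive for global path planning.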
TABLE OF CONTENTS
Abstract ii
Acknowledgments iii
Table of Contents iv
List of Figures vii
List of Tables ix
1 Introduction 1
1.1 Methodology 2
1.2 Constructing the C-space, Global Path Planning, Local Navigation: An Overview 4
1.2.1 Manipulator Motion Planning 4
1.2.2 The Computation of the C-space and the Building of Maps 5
1.2.3 Global Path Planning in C-space 6
1.2.4 Local Navigation 8
1.3 Contributions made by this Thesis 10
2 Workspace to C-space Transformation 11
2.1 Introduction 11
2.2 The Configuration Space in Context 13
2.3 The Configuration Space of a Robotic Manipulator 15
2.4 The Mapping of Obstacle Points into their C-space Counterpart 16
2.4.1 The Single Point Mapping 16
2.4.2 The C-space of a 2-Link Revolute Arm 18
2.4.3 The C-space of a 2-Link Arm with Prismatic and Revolute Joints 23
2.5 The C-space of an n-Link Arm 26
2.5.1.1 Comparison of configuration space representations 32
2.5.1.2 Reduction in Complexity 33
2.6 A Radial Basis Function Network for the Workspace to C-space Transformation 35
2.6.1 The RBF-Network for the C-space Calculation 35
2.6.2 The Training of the Network: Insertion of Nodes 37
2.7 A Three-link Manipulator 39
2.8 Real-world Applications 41
2.8.1 C-space Patterns for a Physical Manipulator 41
2.8.2 Timing Considerations 45
2.8.3 A Real-World Planning System 47
2.8.3.1 Image Processing 48
2.8.3.2 Input to the Radial-Basis-Function Network 50
2.9 Summary 51
3 A Neuro-Resistive Grid for Path Planning 53
3.1 Problem Definition and Overview of the Algorithm 53
3.2 Related Work 55
3.2.1 Resistive Grids for Path Planning 55
3.2.2 The Hopfield Network 56
3.2.3 Cellular Neural Network 57
3.2.4 Dynamic Programming 58
3.3 Path Planning in the Configuration Space 60
3.4 The Neuro-Resistive Grid 61
3.4.1 Implementation of the Neuro-Resistive Grid 61
3.4.2 Functioning of the Resistive Grid 63
3.4.3 Harmonic Functions 66
3.4.4 Boundary Conditions - Dirichlet vs. Neumann 68
3.4.5 Convergence Criterion for the Neuro-resistive Grid 72
3.5 Enhanced Activity Propagation 74
3.5.1 Methodology 74
3.5.2 Higher Dimensions 77
3.5.3 Global Extremum and Collision-free Path 78
3.5.4 A Non-Topologically-Ordered Grid 82
3.5.5 Soft Safety Margin 83
3.6 Experiments 84
3.6.1 Real-World Experiments with the MA 2000 Manipulator 84
3.6.2 A Planar Three-Link Manipulator 94
3.6.3 A Three-dimensional SCARA Manipulator 101
3.6.4 A Mobile Robot in a 3D-Workspace 101
3.7 Comparative Studies 102
3.7.1 Comparisons to Other Update Rules 102
3.7.2 Comparison to Other Update Sequences 105
3.7.3 Comparison to the A*-Algorithm 106
3.8 Summary 108
4 Fuzzy-Based Navigation and Obstacle Avoidance for Robotic Manipulators 110
4.1 Problem Definition and System Overview 110
4.2 Local Navigation in Context 113
4.2.1 Artificial Potential Fields 113
4.2.2 An Overview of Fuzzy-Based Navigation Techniques for Mobile Robots 115
4.2.3 Unreachable Situations and Local Minima 116
4.3 Fuzzy Navigation and Obstacle Avoidance for Robotic Manipulators 117
4.3.1 Introduction to Fuzzy Control 117
4.3.2 Manipulator-Specific Implementation Aspects 121
4.3.3 The Fuzzy Algorithm 123
4.4 Computer Simulations 128
4.4.1 Two-Link Manipulator 128
4.4.2 Three-Link Manipulator 132
4.4.3 Moving Obstacles 134
4.4.4 Safety Aspects 134
4.5 Fuzzy Navigation for the MA 2000 Manipulator 135
4.5.1 Simulated MA 2000 and Comparison to the Resistive Grid Approach 135
4.5.2 Real-World Results 138
4.6 Reinforcement Learning 138
4.7 Summary and Discussion 144
5 Conclusions and Future Work 147
5.1 Conclusions 147
5.1.1 Workspace to C-space Transformation for Robotic Manipulators 147
5.1.2 A Neural Resistive Grid for Path Planning 147
5.1.3 Fuzzy-based Navigation and Obstacle Avoidance for Robotic Manipulators 149
5.2 Future work 150
5.2.1 Hybrid System 150
5.2.2 Implementational Aspects 151
5.2.3 Sensors 152
5.2.4 Transformation of Complex Obstacle Primitives 152
Appendix A-1 153
Appendix A-2 155
Appendix A-3 160
Appendix B 165
Bibliography 169
-- |_/ I N G'S Dr Kaspar ALTHOEFER | \ COLLEGE Ph.D., Dipl.-Ing., AMIEE L O N D O N Department of Mechanical Engineering Founded 1829 King's College, Strand, London WC2R 2LS, UK TEL: +44 (0)171 873 2431, FAX: +44 (0)171 836 4781 http://www.eee.kcl.ac.uk/~kaspar From jhf at stat.Stanford.EDU Fri Oct 17 16:38:10 1997 From: jhf at stat.Stanford.EDU (Jerome H. Friedman) Date: Fri, 17 Oct 1997 13:38:10 -0700 (PDT) Subject: Technical Report Available. Message-ID: <199710172038.NAA21535@rgmiller.Stanford.EDU> *** Technical Report Available *** Bump Hunting in High-Dimensional Data Jerome H. Friedman Stanford University Nicholas I. Fisher CMIS - CSIRO, Sydney ABSTRACT Many data analytic questions can be formulated as (noisy) optimization problems. They explicitly or implicitly involve finding simultaneous combinations of values for a set of ("input") variables that imply unusually large (or small) values of another designated ("output") variable. Specifically, one seeks a set of subregions of the input variable space within which the value of the output variable is considerably larger (or smaller) than its average value over the entire input domain. In addition it is usually desired that these regions be describable in an interpretable form involving simple statements ("rules") concerning the input values. This paper describes a new procedure directed towards this goal based on the notion of "patient" rule induction. This patient strategy is contrasted with the greedy ones used by most rule induction methods, and semi-greedy ones used by some partitioning tree techniques such as CART. Applications involving scientific and commercial databases are presented. Keywords: noisy function optimization, classification, association, rule induction, data mining. Available by ftp from: "ftp://stat.stanford.edu/pub/friedman/prim.ps.Z" Note: This postscript does not view properly on some older versions of ghostview.
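The contrast between "patient" and greedy search described in the abstract can be caricatured in a few lines. The sketch below is a toy, one-dimensional peeling loop written under my own assumptions (the function name, parameters, and defaults are invented, and it is not Friedman and Fisher's actual PRIM procedure): at each step a small fraction alpha of the remaining points is peeled from one edge of the current interval, choosing whichever peel most increases the mean of the output variable inside, and stopping when no peel helps or support gets too small:

```python
def peel_box(x, y, alpha=0.1, min_support=0.1):
    """Greedily shrink an interval on a single input x, peeling a
    fraction alpha of the remaining points from one edge at a time
    so as to maximize the mean of y inside the interval."""
    pts = sorted(zip(x, y))
    lo, hi = 0, len(pts)                 # current box as an index range
    while (hi - lo) * (1 - alpha) >= min_support * len(pts):
        k = max(1, int(alpha * (hi - lo)))
        mean = lambda a, b: sum(yv for _, yv in pts[a:b]) / (b - a)
        cur = mean(lo, hi)
        left, right = mean(lo + k, hi), mean(lo, hi - k)
        if max(left, right) <= cur:
            break                        # no peel improves the box mean
        if left >= right:
            lo += k                      # peel the low edge
        else:
            hi -= k                      # peel the high edge
    return pts[lo][0], pts[hi - 1][0]
```

Because each peel removes only a small fraction of the data, early decisions discard little information; that is the sense in which the strategy is "patient" compared with the large splits made by greedy tree methods.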
It seems to print OK on nearly all postscript printers. From mschmitt at igi.tu-graz.ac.at Mon Oct 20 11:18:55 1997 From: mschmitt at igi.tu-graz.ac.at (Michael Schmitt) Date: Mon, 20 Oct 1997 17:18:55 +0200 Subject: Preprint available Message-ID: <344B765F.4C7C@igi.tu-graz.ac.at> Dear Connectionists, the following preprint (21 pages) is available at http://www.cis.tu-graz.ac.at/igi/maass/96.ps.gz (104463 bytes, gzipped PostScript) or at http://www.cis.tu-graz.ac.at/igi/mschmitt/spikingneurons.ps.Z (158163 bytes, compressed PostScript). TITLE: On the Complexity of Learning for Spiking Neurons with Temporal Coding AUTHORS: Wolfgang Maass and Michael Schmitt ABSTRACT: In a network of spiking neurons a new set of parameters becomes relevant which has no counterpart in traditional neural network models (such as threshold or sigmoidal networks): the time that a pulse needs to travel through a connection between two neurons (also known as delay of a connection). We investigate the VC-dimension of networks of spiking neurons where the delays are viewed as programmable parameters and we prove tight bounds for this VC-dimension. Thus we get quantitative estimates for the diversity of functions that a network with fixed architecture can compute with different settings of its delays. In particular, it turns out that a network of spiking neurons with $k$ adjustable delays is able to compute a much richer class of functions than a threshold circuit with $k$ adjustable weights. The results also yield bounds for the number of training examples that an algorithm needs for tuning the delays of a network of spiking neurons. Results about the computational complexity of such algorithms are also given. 
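For readers less familiar with the terminology: the VC-dimension bounded in the abstract is the standard combinatorial measure of the expressive diversity of a function class, namely the size of the largest input set that the class can label in every possible way. In generic notation (my own formulation, not the paper's):

```latex
\mathrm{VCdim}(\mathcal{F}) \;=\; \max \bigl\{\, m \;:\; \exists\, x_1,\dots,x_m \ \text{with}\ \bigl|\{(f(x_1),\dots,f(x_m)) : f \in \mathcal{F}\}\bigr| = 2^m \,\bigr\}
```

A tight bound on this quantity for networks with k programmable delays then translates, via standard sample-complexity results, into bounds on the number of training examples needed to tune the delays, which is the consequence the abstract draws.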
-- Michael Schmitt Institute for Theoretical Computer Science TU Graz, Klosterwiesgasse 32/2, A-8010 Graz, Austria Tel: +43 316 873-5814, Fax: +43 316 873-5805 E-mail: mschmitt at igi.tu-graz.ac.at http://www.cis.tu-graz.ac.at/igi/mschmitt/ From young at psy.ox.ac.uk Mon Oct 20 06:46:33 1997 From: young at psy.ox.ac.uk (Steven Young) Date: Mon, 20 Oct 1997 11:46:33 +0100 (BST) Subject: Oxford Summer School on Connectionist Modelling Message-ID: <199710201046.LAA08721@axp01.mrc-bbc.ox.ac.uk> ** CALL FOR ATTENDANCE ** Oxford Summer School on Connectionist Modelling Department of Experimental Psychology, University of Oxford 19 - 31 July 1998 Applications are invited for participation in a 2-week residential Summer School on techniques in connectionist modelling. The course is aimed primarily at researchers who wish to exploit neural network models in their teaching and/or research, and it will provide a general introduction to connectionist modelling, biologically plausible neural networks and brain function through lectures and exercises on Macintoshes and PCs. The course is interdisciplinary in content, though many of the illustrative examples are taken from cognitive and developmental psychology, and cognitive neuroscience. The instructors with primary responsibility for teaching the course are Kim Plunkett and Edmund Rolls. No prior knowledge of computational modelling will be required, though simple word processing skills will be assumed. Participants will be encouraged to start work on their own modelling projects during the Summer School. The cost of participation in the Summer School is £950 for Faculty and £750 for Graduate Students. This figure covers the cost of accommodation (bed and breakfast at St. John's College), registration and all literature required for the Summer School. Participants will be expected to cover their own travel and meal costs. A small number of partial bursaries will be available for graduate students.
Applicants should indicate whether they wish to be considered for a graduate student scholarship but are advised to seek their own funding as well, since in previous years the number of graduate student applications has far exceeded the number of scholarships available. Further information about the contents of the course can be obtained from Steven.Young at psy.ox.ac.uk. If you are interested in participating in the Summer School, please contact: Mrs Sue King Department of Experimental Psychology South Parks Road University of Oxford Oxford OX1 3UD Tel: +44 (1865) 271 353 Email: sking at psy.ox.ac.uk Please send a brief description of your background with an explanation of why you would like to attend the Summer School (one page maximum) no later than 31st January 1998. -- Computer Officer, IRC for Cognitive Neuroscience, Department of Experimental Psychology, Oxford University From kompe at fb.sony.de Mon Oct 20 13:29:29 1997 From: kompe at fb.sony.de (Ralf Kompe) Date: Mon, 20 Oct 1997 19:29:29 +0200 Subject: NEW BOOK: Applications of NNs to speech understanding Message-ID: <344B94F9.1BF1CF3E@fb.sony.de> (Sorry if you receive this message more than once.) To whom it may concern: The following book, which describes the application of neural networks to real-world data, is now available: Ralf Kompe Prosody in Speech Understanding Systems Lecture Notes in Artificial Intelligence, Vol. 1307 Subseries of Lecture Notes in Computer Science Springer Berlin, New York 1997 (370 pages) ISBN 3-540-63580-7 ------------------ ABSTRACT Prosody covers acoustic phenomena of speech which are not specific to phonemes. These are mainly intonation, indicators for phrase boundaries, and accentuation. This information can support the intelligibility of speech or even sometimes disambiguate the meaning.
The aim of this book is to describe algorithms developed by the author for the use of prosodic information on many levels of speech understanding such as syntax, semantics, dialog, and translation. An implementation of these algorithms has successfully been integrated into the speech-to-speech translation system Verbmobil and into the dialog system Evar. This is the first time that prosody has been used in a fully operational speech understanding and translation system. The Verbmobil prototype system has been publicly demonstrated at several conferences and industrial fairs. The emphasis of the book lies on the improvement of parsing of spontaneous speech with the help of prosodic clause boundary information. Prosody reduces the parse time of word hypotheses graphs by 92% and the number of parse trees by 96%. This is achieved by integrating several knowledge sources, such as probabilities for prosodic events computed by neural networks and n-grams, in an A*-search for the optimal parse. Without prosody the automatic interpretation of spontaneous speech would be infeasible. The book gives a comprehensive review of the mathematical and computational background of the algorithms and statistical models useful for the integration of prosody in speech understanding. It also shows unconventional applications of hidden Markov models, stochastic language models, and neural networks. The latter, for example, are used not only for several classification tasks but also for the inverse filtering of speech signals. The book also explains in detail the acoustic-prosodic phenomena of speech and their functional role in communication. In contrast to many other reports, it gives many examples taken from real human-human dialogs; many examples are supported by speech signals accessible over the WWW.
The use of prosodic information relies on the robust extraction of relevant features from digitized speech signals, on adequate labeling of large speech databases for training classifiers, and on the detection of prosodic events; the methods used in Verbmobil and Evar are summarized as well in this book. Furthermore, an overview of these state-of-the-art speech understanding systems is given. ------------------ The book was awarded the "Dissertation Prize" of the German Institutes for Artificial Intelligence. Sincerely yours, Ralf Kompe __________________________________________________________________________ Ralf Kompe Sony International (Europe) GmbH . . o o O O European Research and Development Stuttgart (ERDS) . . o o O Advanced Developments . . o o O Stuttgarter Str. 106 . . o o O O D-70736 Fellbach . . o o O O O Germany . . o o O O O . . o o O O Phone: +49-711-5858-366 Fax: +49-711-58-31-85 E-mail: kompe at fb.sony.de __________________________________________________________________________ From marcusg at elec.uq.edu.au Mon Oct 20 03:28:29 1997 From: marcusg at elec.uq.edu.au (Marcus Gallagher) Date: Mon, 20 Oct 1997 17:28:29 +1000 Subject: ACNN'98: 2nd Call for Papers Message-ID: <344B081D.41C67EA6@elec.uq.edu.au> -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=- The Ninth Australian Conference on Neural Networks ACNN'98 Brisbane, Australia Feb 11-13, 1998 Second Announcement and Final Call for Papers http://www.elec.uq.edu.au/acnn98/ -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=- The ninth Australian conference on neural networks will be held in Brisbane on Feb 11-13 1998 at the University of Queensland. ACNN'98 is the annual national meeting of the Australian neural network community. It is a multi-disciplinary meeting and seeks contributions from Neuroscientists, Engineers, Computer Scientists, Mathematicians, Physicists and Psychologists.
Invited Speakers ---------------- The program will include several invited talks, at least 3 of which will be presented by overseas speakers. Pre-Conference Workshop ----------------------- A Pre-Conference Workshop will be held on Tuesday 10th February 1998 at the same venue as the conference. The emphasis of the workshop will be on neural network simulation and robotics. Further details (including registration, etc.) will be made available in the near future. Special Sessions ---------------- The following Special Sessions have been proposed for ACNN'98. These will consist of invited talks followed by regular presentations of relevant papers. Submissions which are relevant to a special session are being invited for inclusion in the session. All papers presented in special sessions will appear in the conference proceedings.
- Rule Extraction & Connectionist Knowledge Representation
- Modelling Higher Cognitive Processes
- Computational Learning Theory
- Modularity in Neural Networks
Submissions ----------- Papers should be submitted to the ACNN'98 Secretariat as close as possible to final form and must not exceed 5 single pages (2 column format, 10pt or more). The ACNN'98 web page (http://www.elec.uq.edu.au/acnn98/) includes LaTeX style and template files for authors using LaTeX. An example document showing the general layout of submissions is also available via the web page in PostScript and PDF formats - authors are encouraged to use this document as a general guide to formatting of papers. For full submission details please refer to the web page. Following notification of acceptance results, authors will be required to submit a one-page extended abstract for accepted papers. These abstracts will be distributed to delegates at the conference. Camera-ready papers will then be required for submission shortly after the conference, and proceedings will then be sent out to delegates.
Submissions of initial papers, extended abstracts and camera-ready papers in electronic format (via email) are encouraged, to give authors the maximum available time to prepare submissions. ACNN'98 will include a special poster session devoted to recent work and work-in-progress. Abstracts are solicited for this session (1 page limit) and may be submitted up to 10 days before the commencement of the conference. They will not be refereed or included in the proceedings, but will be distributed to attendees upon arrival. Students are especially encouraged to participate in this session. Submission Categories --------------------- Submissions are encouraged in, but not limited to, the following topics:

Applications: Examples - Signal processing and analysis; Pattern recognition; Speech; Machine vision; Motor control; Robotics; Forecasting; Medical
Architectures and Learning Algorithms: New architectures and learning algorithms; Hierarchy; Modularity; Learning pattern sequences; Information integration; Evolutionary computation; Machine learning
Cognitive Science: Models of perception and pattern recognition; Memory; Concept formation; Problem solving and reasoning; Language acquisition and production
Neuroscience: Vision; Audition; Motor, Somatosensory and Autonomic functions; Synaptic function; Cellular information processing
Theory: Learning; Generalisation; Complexity; Stability; Dynamics
Implementation: Hardware implementations of neural nets; Analog and digital VLSI implementation; Optical implementation

Important Dates
---------------
Paper Submissions Due            Mon 17th Nov 97
Workshop & Tutorial Proposals    Mon 6th Oct 97
Notification of Acceptance       Fri 19th Dec 97
Work In Progress Abstracts Due   Mon 2nd Feb 98
Extended Abstracts Due           Mon 2nd Feb 98

Registration Fees
-----------------
          Students   Regular   After 15th Jan 98
ACNN'98   A$100      A$300     A$400

Organising Committee: --------------------- Prof. Tom Downs University of Queensland (Chair) Dr. 
Janet Wiles University of Queensland Prof. Joachim Diederich Queensland University of Technology Dr. P Suganthan University of Queensland Dr. Marcus Frean University of Queensland Marcus Gallagher University of Queensland Robert Andrews Queensland University of Technology Ian Wood University of Queensland Peter Stratton University of Queensland Contact Information: -------------------- ACNN'98 Secretariat Dept of Electrical & Computer Engineering University of Queensland QLD. 4072. WWW: http://www.elec.uq.edu.au/acnn98/ email: acnn98 at elec.uq.edu.au From jbower at bbb.caltech.edu Tue Oct 21 19:56:07 1997 From: jbower at bbb.caltech.edu (James M. Bower) Date: Tue, 21 Oct 1997 15:56:07 -0800 Subject: CNS*98 Message-ID: ********************************************************************** With this email we announce CNS*98, the next Computational Neuroscience Annual Meeting to be held this coming July in Santa Barbara, California. For those of you attending the Society for Neuroscience Annual Meeting this coming week in New Orleans, you can pick up a freshly printed poster for the meeting by ERICA, at the NIMH booth. Those of you already on the CNS meeting mailing list will receive the poster and call for papers through the mail in the next few weeks. Please note in the following call for papers that we have introduced a new all-electronic form of paper submission using a custom-designed JAVA/HTML interface. Additional information on paper submission and the meeting itself is available at: http://www.bbb.caltech.edu/cns-meetings/cns98/ ********************************************************************** CALL FOR PAPERS Seventh Annual Computational Neuroscience Meeting CNS*98 July 26 - 30, 1998 Santa Barbara, California DEADLINE FOR SUMMARIES AND ABSTRACTS: 11:59pm January 26, 1998 This is the seventh annual meeting of an interdisciplinary conference addressing a broad range of research approaches and issues involved in the field of computational neuroscience. 
These meetings bring together experimental and theoretical neurobiologists along with engineers, computer scientists, cognitive scientists, physicists, and mathematicians interested in the functioning of biological nervous systems. Peer-reviewed papers, all related to understanding how nervous systems compute, are presented. As in previous years, CNS*98 will equally emphasize experimental, model-based, and more abstract theoretical approaches to understanding neurobiological computation. The meeting in 1998 will take place at Fess Parker's Double Tree Resort in Santa Barbara, California, and include plenary, contributed, and poster sessions. The first session starts at 9 am, Sunday, July 26th, and ends with the annual banquet on Thursday evening, July 30th. There will be no parallel sessions. The meeting will include time for informal workshops focused on current issues in computational neuroscience. Travel funds will be available for students and postdoctoral fellows presenting papers. Child day care will also be available. Santa Barbara, California is approximately 1 1/2 hours by car from the Los Angeles International airport. Airport shuttles from the airport to Santa Barbara run regularly. In addition, Santa Barbara has its own small airport. The hotel itself is located on the ocean and within walking distance from distinctive downtown Santa Barbara. **NEW** SUBMISSION INSTRUCTIONS: With this announcement we solicit the submission of papers, all of which will be refereed. Peer review will be conducted based on a 1000-word (or less) summary describing the methods, nature, and importance of your results. Authors will be notified of acceptance by the second week of May, 1998. This year, for the first time, submission of papers will be performed electronically using a custom-designed JAVA/HTML interface. Full instructions for submission can be found at the meeting web site: http://www.bbb.caltech.edu/cns-meetings/cns98/. 
However, in brief, authors should cut and paste text from their own word processors into the forms available on the web site. It is important that all requested information be provided, including a 100-word abstract for publication in the conference program, all author information, and selection of the appropriate category and theme from the list provided. Authors should especially note the mechanisms used for handling figures and mathematical equations. All submissions will be acknowledged immediately by email. Program committee decisions will be sent to the designated correspondence author only. Submissions will not be considered if they lack category information, abstracts, or author addresses, or if they are late. FURTHER MEETING CORRESPONDENCE We would like to strongly encourage authors to make their submissions electronically. However, if you do not have ready access to the internet or a web server, we will send you instructions for paper submission if you contact us either by email to cns98 at smaug.bbb.caltech.edu or at the following address: CNS*98 Division of Biology 216-76 Caltech Pasadena, CA 91125 ADDITIONAL INFORMATION concerning the meeting, including hotel and travel arrangements or information on past meetings can be obtained by: o Using our on-line WWW information and registration server, URL of: http://www.bbb.caltech.edu/cns/cns98/cns98.html o ftp-ing to our ftp site. 
yourhost% ftp ftp.bbb.caltech.edu
Name (ftp.bbb.caltech.edu:): ftp
Password: yourname at yourhost.yoursite.yourdomain
ftp> cd cns98
ftp> ls

o Sending Email to: cns98 at smaug.bbb.caltech.edu

CNS*98 ORGANIZING COMMITTEE:
Co-meeting Chair / Logistics - John Miller, Montana State University
Co-meeting Chair / Finances and Program - Jim Bower, Caltech
Governmental Liaison - Dennis Glanzman, NIMH/NIH
Workshop Organizer - to be announced

1998 Program Committee:
Axel Borst, Max-Planck Inst., Tuebingen, Germany
Leif Finkel, University of Pennsylvania
Anders Lansner, Royal Institute of Technology, Sweden
Linda Larson-Prior, Pennsylvania State University Medical College
David Touretzky, Carnegie Mellon University
Gina Turrigiano, Brandeis University
Ranu Jung, University of Kentucky
Simon Thorpe, CNRS, Toulouse, France

1998 Regional Organizers:
Europe - Erik DeSchutter (Belgium)
Middle East - Idan Segev (Jerusalem)
Down Under - Mike Paulin (New Zealand)
South America - Renato Sabbatini (Brazil)
Asia - Zhaoping Li (MIT)
India - Upinder Bhalla (Bangalore)

==== From stein at biodec.wustl.edu Tue Oct 21 14:57:16 1997 From: stein at biodec.wustl.edu (Paul S.G. Stein) Date: Tue, 21 Oct 1997 13:57:16 -0500 Subject: NeuronsNetworksMotorBehavior Message-ID: <9710211857.AA17081@biodec.wustl.edu> The volume "NEURONS, NETWORKS, AND MOTOR BEHAVIOR" edited by PSG Stein, S Grillner, AI Selverston, and DG Stuart is now available from the MIT Press. See the MIT Press website for additional information: mitpress.mit.edu/book-home.tcl?isbn=0262193906 Many of the authors of chapters of this volume were speakers at the 1995 Tucson conference on Neurons, Networks, and Motor Behavior. The volume can be viewed at the MIT Press booth at the Society for Neuroscience meeting in New Orleans. MIT Press offers a 20% discount for orders placed using forms available at the meeting. A similar discount is available for attendees of the 1995 Tucson conference. 
An EMail to 1995 meeting attendees was recently sent by Pat Pierce in Doug Stuart's lab. Contact Pat at DGStuart at U.Arizona.EDU if you did not receive her EMail. -----The following is additional information about the volume----- Recent advances in motor behavior research rely on detailed knowledge of the characteristics of the neurons and networks that generate motor behavior. At the cellular level, Neurons, Networks, and Motor Behavior describes the computational characteristics of individual neurons and how these characteristics are modified by neuromodulators. At the network and behavioral levels, the volume discusses how network structure is dynamically modulated to produce adaptive behavior. Comparisons of model systems throughout the animal kingdom provide insights into general principles of motor control. Contributors describe how networks generate such motor behaviors as walking, swimming, flying, scratching, reaching, breathing, feeding, and chewing. An emerging principle of organization is that nervous systems are remarkably efficient in constructing neural networks that control multiple tasks and dynamically adapt to change. The volume contains six sections: selection and initiation of motor patterns; generation and formation of motor patterns: cellular and systems properties; generation and formation of motor patterns: computational approaches; modulation and reconfiguration; short-term modulation of pattern generating circuits; and sensory modification of motor output to control whole body orientation. -----TABLE OF CONTENTS OF NEURONS, NETWORKS, AND MOTOR BEHAVIOR----- SELECTION AND INITIATION OF MOTOR PATTERNS 1. Selection and Initiation of Motor Behavior Sten Grillner, Apostolos P. Georgopoulos, and Larry M. Jordan 2. The Role of Population Coding in the Control of Movement David L. Sparks, William B. Kristan, Jr., and Brian K. Shaw 3. Neural Substrates for Initiation of Startle Responses Roy E. Ritzmann and Robert C. 
Eaton GENERATION AND FORMATION OF MOTOR PATTERNS: CELLULAR AND SYSTEMS PROPERTIES 4. Basic Building Blocks of Vertebrate Spinal Central Pattern Generators Ole Kiehn, Jorn Hounsgaard, and Keith T. Sillar 5. Neural and Biomechanical Control Strategies for Different Forms of Vertebrate Hindlimb Motor Tasks Paul S.G. Stein and Judith L. Smith 6. Spinal Networks and Sensory Feedback in the Control of Undulatory Swimming in Lamprey Peter Wallen 7. Spinal Networks Controlling Swimming in Hatchling Xenopus Tadpoles Alan Roberts, Steve R. Soffe, and Ray Perrins 8. Role of Ionic Currents in the Operation of Motor Circuits in the Xenopus Embryo Nicholas Dale 9. Integration of Cellular and Network Mechanisms in Mammalian Oscillatory Motor Circuits: Insights from the Respiratory Oscillator Jeffrey C. Smith 10. Shared Features of Invertebrate Central Pattern Generators Allen I. Selverston, Yuri V. Panchin, Yuri I. Arshavsky, and Grigori N. Orlovsky 11. Intrinsic Membrane Properties and Synaptic Mechanisms in Motor Rhythm Generators Ronald L. Calabrese and Jack L. Feldman 12. Organization of Neural Networks for the Control of Posture and Locomotion in an Insect Malcolm Burrows GENERATION AND FORMATION OF MOTOR PATTERNS: COMPUTATIONAL APPROACHES 13. How Computation Aids in Understanding Biological Networks Eve Marder, Nancy Kopell, and Karen Sigvardt 14. Dynamical Systems Analyses of Real Neuronal Networks John Guckenheimer and Peter Rowat 15. Realistic Modeling of Burst Generation and Swimming in Lamprey Anders Lansner, Orjan Ekeberg, and Sten Grillner 16. Integrate-and-Fire Simulations of Two Molluscan Neural Circuits William N. Frost, James R. Lieb, Jr., Mark J. Tunstall, Brett D. Mensh, and Paul S. Katz MODULATION AND RECONFIGURATION 17. Chemical Modulation of Vertebrate Motor Circuits Keith T. Sillar, Ole Kiehn, and Norio Kudo 18. Modulation of Neural Circuits by Steroid Hormones in Rodent and Insect Model Systems Janis C. Weeks and Bruce McEwen 19. 
Chemical Modulation of Crustacean Stomatogastric Pattern Generator Networks Ronald M. Harris-Warrick, Deborah J. Baro, Lisa M. Coniglio, Bruce R. Johnson, Robert M. Levini, Jack H. Peck, and Bing Zhang 20. Reconfiguration of the Peripheral Plant during Various Forms of Feeding Behaviors in the Mollusc Aplysia Irving Kupfermann, Vladimir Brezina, Elizabeth C. Cropper, Dillip Deodhar, William C. Probst, Steven C. Rosen, Ferdinand S. Vilim, and Klaudiusz R. Weiss SHORT-TERM MODULATION OF PATTERN GENERATING CIRCUITS 21. Sensory Modulation of Pattern Generating Circuits Keir G. Pearson and Jan-Marino Ramirez 22. Presynaptic Mechanisms during Rhythmic Activity in Vertebrates and Invertebrates Michael P. Nusbaum, Abdeljabbar El Manira, Jean-Pierre Gossard, and Serge Rossignol SENSORY MODIFICATION OF MOTOR OUTPUT TO CONTROL WHOLE BODY ORIENTATION 23. Control of Body Orientation and Equilibrium in Vertebrates Jane M. Macpherson, Tatiana G. Deliagina, and Grigori N. Orlovsky 24. Centrally-Patterned Behavior Generates Sensory Input for Adaptive Control Mark A. Willis and Edmund A. Arbas 25. Oculomotor Control in Insects: From Muscles to Elementary Motion Detectors Nicholas J. Strausfeld ________________________________________________________________ Paul S.G. Stein EMail reply to: STEIN at BIODEC.WUSTL.EDU Voice Phone: 314-935-6824 FAX Phone: 314-935-4432 Mail: Dept Biology, Washington Univ, St Louis, MO 63130 USA Home Page: http://biosgi.wustl.edu/faculty/stein.html Book Website for Neurons, Networks, and Motor Behavior: http://www-mitpress.mit.edu/book-home.tcl?isbn=0262193906 Conference Website for Neurons, Networks, and Motor Behavior: http://www.physiol.arizona.edu/CELL/Department/Conferences.html ________________________________________________________________ From vnissen at gwdg.de Wed Oct 22 10:50:20 1997 From: vnissen at gwdg.de (Volker Nissen) Date: Wed, 22 Oct 1997 14:50:20 +0000 Subject: CfP 4. Symp. Softcomputing Message-ID: This call was sent to several lists. 
We apologize should you receive it multiple times. --------------------------------- CALL FOR PAPERS ================== 4. SYMPOSIUM SOFTCOMPUTING "Softcomputing in Production and Material Management" Neural Nets, Fuzzy Set Theory, Evolutionary Algorithms University of Goettingen, Thu. 12. March 1998 (10 am - 6 pm) THEME: The theme of this year's symposium is softcomputing applications in production and material management. Softcomputing as a technical term includes the complementary core areas of artificial neural networks, fuzzy set theory, and evolutionary algorithms. While softcomputing is actively being applied in the technical sectors, we believe that the great potential of softcomputing for the management domain has not yet been sufficiently appreciated in industry. The Goettingen symposium was established to serve as a link between science and practice, focussing on innovative applications and know-how transfer. Possible topics of contributions include but are not limited to:

* production planning
* lot sizing and scheduling
* cutting problems
* inventory control
* machine diagnosis
* data mining
* maintenance planning
* system layout
* line balancing
* process control and optimization
* waste management

ORGANISATION: The symposium is strictly application-oriented. It is organised by members of the "Workgroup on Softcomputing in Business", jointly with the bureau of technology transfer of the University of Goettingen. It takes place at the University of Goettingen, central lecturing building. All three previous symposia received very positive judgements from scientists and practitioners.

DEADLINES:
12. Jan. 1998   Deadline for extended abstract submission (ca. 2 pages, e-mail submission OK)
19. Jan. 1998   Notification of acceptance
16. Feb. 1998   Camera-ready full papers due
27. Feb. 1998   Registration deadline and latest date to pay conference fee
12. 
Mar 1998 Symposium CONFERENCE FEE: The conference fee is DM 100,- for speakers and DM 200,- for other participants, and includes the proceedings, lunch and coffee breaks. Please pay to our account no. 9242 058 at Nord LB (German bank code 250 500 00). Account holder is the "Foerderverein FH BS-WF". Please mention "4. Symposium Softcomputing" with your payment. We accept euro cheques, but unfortunately cannot accept credit cards. PROCEEDINGS: A proceedings volume will be distributed at the conference. In preparing your manuscript, please follow these format requirements: printing area 17 x 24 cm, 12 pt Times Roman or similar font, single line spacing, no page numbering, max. 16 pages, title in 18 pt Arial bold and centered. Below please state authors' names and affiliations (centered). Please include a short abstract and key words. Contributions in German or English. Please send two copies of your manuscript in reproducible form as well as the formatted file on disk to the address stated below. At least one of the authors of an accepted paper is required to participate in the symposium and present the paper. Talks should not exceed 40 minutes (including 10 minutes for discussion). SYMPOSIUM WWW-PAGE: http://www.wi1.wiso.uni-goettingen.de/pa/afn/4symp_e.htm CONTACT: PLEASE SEND SUBMISSIONS TO: 4. Symposium Softcomputing Dipl.-Phys. Martin Tietze Universitaet Goettingen Abt. Wirtschaftsinformatik I Platz der Goettinger Sieben 5 D-37073 Goettingen Germany REGISTRATION ADDRESS: Dipl.-Kfm. Dipl.-Ing. Detlef Puchert FH Braunschweig-Wolfenbuettel Technologietransfer-Kontaktstelle Salzdahlumer Str. 46/48 38302 Wolfenbuettel Germany E-mail: d.puchert at verwaltung.fh-wolfenbuettel.de FOR QUESTIONS VIA E-MAIL PLEASE REFER TO: vnissen at gwdg.de (Dr. 
Volker Nissen) From arbib at pollux.usc.edu Wed Oct 22 11:52:43 1997 From: arbib at pollux.usc.edu (Michael Arbib) Date: Wed, 22 Oct 1997 08:52:43 -0700 Subject: Neural Organization Message-ID: The volume "Neural Organization: Structure, Function, and Dynamics" by Michael A. Arbib, Peter Erdi, and Janos Szentagothai is now available from the MIT Press. See the MIT Press website for additional information: http://mitpress.mit.edu/book-home.tcl?isbn=026201159X The volume can be viewed at the MIT Press booth at the Society for Neuroscience meeting in New Orleans. MIT Press offers a 20% discount for orders placed using forms available at the meeting. -----The following is additional information about the volume----- "Neural Organization: Structure, Function, and Dynamics" Michael A. Arbib, Peter Erdi, and Janos Szentagothai ISBN 0-262-01159-X 328 pp. (8.5 x 11 double column), 163 illus. $60.00 (cloth) In Neural Organization, Arbib, Erdi, and Szentagothai integrate structural, functional, and dynamical approaches to the interaction of brain models and neurobiological experiments. Both structure-based "bottom-up" and function-based "top-down" models offer coherent concepts by which to evaluate the experimental data. The goal of this book is to point out the advantages of a multidisciplinary, multistrategied approach to the brain. Part I of Neural Organization provides a detailed introduction to each of the three areas of structure, function, and dynamics. Structure refers to the anatomical aspects of the brain and the relations between different brain regions. Function refers to skills and behaviors, which are explained by means of functional schemas and biologically based neural networks. Dynamics refers to the use of a mathematical framework to analyze the temporal change of neural activities and synaptic connectivities that underlie brain development and plasticity--in terms of both detailed single-cell models and large-scale network models. 
In part II, the authors show how their systematic approach can be used to analyze specific parts of the nervous system--the olfactory system, hippocampus, thalamus, cerebral cortex, cerebellum, and basal ganglia--as well as to integrate data from the study of brain regions, functional models, and the dynamics of neural networks. In conclusion, they offer a plan for the use of their methods in the development of cognitive neuroscience. ********************************* Michael A. Arbib USC Brain Project University of Southern California Los Angeles, CA 90089-2520, USA arbib at pollux.usc.edu (213) 740-9220; Fax: (213) 740-5687 http://www-hbp.usc.edu/HBP/ From cns-cas at cns.bu.edu Wed Oct 22 13:37:16 1997 From: cns-cas at cns.bu.edu (Boston University - Cognitive and Neural Systems) Date: Wed, 22 Oct 1997 13:37:16 -0400 Subject: CALL FOR PAPERS - 2nd International Conference on CNS Message-ID: <3.0.3.32.19971022133716.0109b4cc@cns.bu.edu> ****CALL FOR PAPERS**** SECOND INTERNATIONAL CONFERENCE ON COGNITIVE AND NEURAL SYSTEMS May 27-30, 1998 Sponsored by Boston University's Center for Adaptive Systems and Department of Cognitive and Neural Systems with financial support from DARPA and ONR http://cns-web.bu.edu/cns-meeting/ HOW DOES THE BRAIN CONTROL BEHAVIOR? HOW CAN TECHNOLOGY EMULATE BIOLOGICAL INTELLIGENCE? The conference will include invited lectures and contributed lectures and posters by experts on the biology and technology of how the brain and other intelligent systems adapt to a changing world. The conference is aimed at researchers and students of computational neuroscience, connectionist cognitive science, artificial neural networks, neuromorphic engineering, and artificial intelligence. A single oral or poster session enables all presented work to be highly visible. Abstract submissions encourage submissions of the latest results. Costs are kept at a minimum without compromising the quality of meeting handouts and social events. 
Although Memorial Day falls on Saturday, May 30, it is observed on Monday, May 25, 1998. CONFIRMED INVITED SPEAKERS TUTORIALS: WEDNESDAY, MAY 27, 1998 (to be announced) KEYNOTE SPEAKERS: Stephen Grossberg, Adaptive resonance theory: From biology to technology Ken Nakayama, Psychological studies of visual attention INVITED SPEAKERS: THURSDAY, MAY 28, 1998: Azriel Rosenfeld, Understanding object motion Takeo Kanade, Computational sensors: Further progress Tomaso Poggio, Sparse representations for learning Gail Carpenter, Applications of ART neural networks Rodney Brooks, Experiments in development models for a neurally controlled humanoid robot Lee Feldkamp, Recurrent networks: Promise and practice FRIDAY, MAY 29, 1998: J. Anthony Movshon, Contrast gain control in the visual cortex Hugh Wilson, Global processes at intermediate levels of form vision Mel Goodale, Biological teleassistance: Perception and action in the human visual system Ken Stevens, The categorical representation of speech and its traces in acoustics and articulation Carol Fowler, Production-perception links in speech Frank Guenther, A theoretical framework for speech acquisition and production SATURDAY, MAY 30, 1998: Howard Eichenbaum, The hippocampus and mechanisms of declarative memory Earl Miller, Neural mechanisms for working memory and cognition Bruce McNaughton, Neuronal population dynamics and the interpretation of dreams Richard Thompson, The cerebellar circuitry essential for classical conditioning of discrete behavioral responses Daniel Bullock, Cortical control of arm movements Andrew Barto, Reinforcement learning applied to large-scale dynamic optimization problems There will be contributed oral and poster sessions on each day of the conference. 
CALL FOR ABSTRACTS Contributors are requested to list a first and second choice from among the topics below in their cover letter, and to say whether it is biological (B) or technological (T) work, when they submit their abstract, as described below.

    vision
    object recognition
    image understanding
    audition
    speech and language
    unsupervised learning
    supervised learning
    reinforcement and emotion
    cognition, planning, and attention
    spatial mapping and navigation
    neural circuit models
    neural system models
    mathematics of neural systems
    robotics
    neuromorphic VLSI
    hybrid systems (fuzzy, evolutionary, digital)
    industrial applications
    other

Example: first choice: vision (T); second choice: neural system models (B). CALL FOR ABSTRACTS: Contributed Abstracts must be received, in English, by January 31, 1998. Notification of acceptance will be given by February 28, 1998. A meeting registration fee of $45 for regular attendees and $30 for students must accompany each Abstract. See Registration Information for details. The fee will be returned if the Abstract is not accepted for presentation and publication in the meeting proceedings. Registration fees of accepted abstracts will be returned on request only until April 15, 1998. Each Abstract should fit on one 8.5" x 11" white page with 1" margins on all sides, single-column format, single-spaced, Times Roman or similar font of 10 points or larger, printed on one side of the page only. Fax submissions will not be accepted. Abstract title, author name(s), affiliation(s), mailing, and email address(es) should begin each Abstract. An accompanying cover letter should include: Full title of Abstract; corresponding author and presenting author name, address, telephone, fax, and email address; and preference for oral or poster presentation. (Talks will be 15 minutes long. Posters will be up for a full day. Overhead, slide, and VCR facilities will be available for talks.) 
Abstracts which do not meet these requirements or which are submitted with insufficient funds will be returned. The original and 3 copies of each Abstract should be sent to: Cynthia Bradford, Boston University, Department of Cognitive and Neural Systems, 677 Beacon Street, Boston, MA 02215. REGISTRATION INFORMATION: Early registration is recommended. To register, please fill out the registration form below. Student registrations must be accompanied by a letter of verification from a department chairperson or faculty/research advisor. If accompanied by an Abstract or if paying by check, mail to the address above. If paying by credit card, mail as above, or fax to (617) 353-7755, or email to cindy at cns.bu.edu. The registration fee will help to pay for a reception, 6 coffee breaks, and the meeting proceedings. STUDENT FELLOWSHIPS: Fellowships for PhD candidates and postdoctoral fellows are available to cover meeting travel and living costs. The deadline to apply for fellowship support is January 31, 1998. Applicants will be notified by February 28, 1998. Each application should include the applicant's CV, including name; mailing address; email address; current student status; faculty or PhD research advisor's name, address, and email address; relevant courses and other educational data; and a list of research articles. A letter from the listed faculty or PhD advisor on official institutional stationery should accompany the application and summarize how the candidate may benefit from the meeting. Students who also submit an Abstract need to include the registration fee with their Abstract. Reimbursement checks will be distributed after the meeting. 
________________________________________________________________ REGISTRATION FORM Second International Conference on Cognitive and Neural Systems Department of Cognitive and Neural Systems Boston University, 677 Beacon Street Boston, Massachusetts 02215 Tutorials: May 27, 1998, Meeting: May 28-30, 1998 FAX: (617) 353-7755 (Please Type or Print)

Mr/Ms/Dr/Prof: __________________________________________________
Name: ___________________________________________________________
Affiliation: ____________________________________________________
Address: ________________________________________________________
City, State, Postal Code: _______________________________________
Phone and Fax: __________________________________________________
Email: _________________________________________________________

The conference registration fee includes the meeting program, reception, two coffee breaks each day, and meeting proceedings. The tutorial registration fee includes tutorial notes and two coffee breaks.

CHECK ONE:
( ) $70 Conference plus Tutorial (Regular)
( ) $45 Conference plus Tutorial (Student)
( ) $45 Conference Only (Regular)
( ) $30 Conference Only (Student)
( ) $25 Tutorial Only (Regular)
( ) $15 Tutorial Only (Student)

Method of Payment: (Please FAX or mail) [ ] Enclosed is a check made payable to "Boston University". Checks must be made payable in US dollars and issued by a US correspondent bank. Each registrant is responsible for any and all bank charges. [ ] I wish to pay my fees by credit card (MasterCard, Visa, or Discover Card only). 
Name as it appears on the card: __________________________________ Type of card: _____________ Account number: ______________________ Signature: ____________________________ Expiration date: _________ From S.Singh-1 at plymouth.ac.uk Wed Oct 22 16:27:26 1997 From: S.Singh-1 at plymouth.ac.uk (Sameer Singh) Date: Wed, 22 Oct 1997 16:27:26 BST Subject: PhD Studentship available Message-ID: <53153C87C1F@cs_fs15.csd.plym.ac.uk> University of Plymouth, UK School of Computing PhD Research Studentship Available Salary: See below Applications are now invited for a PhD studentship in the School of Computing in the area of unstructured information processing and extraction using intelligent techniques such as neural networks. The research project will be carried out in collaboration with Ranco Controls Ltd., Plymouth, a world-leading manufacturer of control equipment. The project will also collaborate with the School of Electronic, Communication and Electrical Engineering. You should have a background in computer science or engineering with a good honours degree, and preferably with a Masters qualification. The project requires good knowledge in areas including information systems, artificial intelligence and C/C++. The studentship covers the tuition fee and a maintenance grant of Pounds 5510 per year. Application forms and further details are available from the School Office on +44-1752-232 541. Further information and informal enquiries on the project should be directed to Dr. Sameer Singh, School of Computing, University of Plymouth, UK (tel: +44-1752-232 612, fax: +44-1752-232 540, e-mail: s1singh at plym.ac.uk). UK and EU country residents are especially encouraged to apply. 
Closing date: Completed application forms should arrive by the 7th November, 1997 Promoting equal opportunities A Leading Centre for Teaching and Research From Dave_Touretzky at cs.cmu.edu Thu Oct 23 00:17:28 1997 From: Dave_Touretzky at cs.cmu.edu (Dave_Touretzky@cs.cmu.edu) Date: Thu, 23 Oct 1997 00:17:28 -0400 Subject: faculty position in computational neuroscience Message-ID: <13994.877580248@skinner.boltz.cs.cmu.edu> The Computer Science Department at Carnegie Mellon University, in collaboration with the Center for the Neural Basis of Cognition (CNBC), is soliciting applications for a tenure-track faculty position in computational neuroscience. The successful applicant will be expected to carry out a research program in the theoretical analysis of computational properties of real neural systems, and how these properties contribute to aspects of cognition such as perception, memory, language, or the planning and coordination of action. Theoretically-oriented CNBC faculty have many opportunities to collaborate with experimentalists using a variety of techniques, including primate and rat electrophysiology, MRI and PET functional brain imaging, and neuropsychological assessment of clinical populations. Researchers with a strong background in mathematical analysis, dynamical systems theory, probabilistic and statistical approaches, or other analytical techniques relevant to the study of brain function are especially encouraged to apply. The deadline for initial review of applications is February 1, 1998, but applications arriving after that date will be considered until the position is filled. Send a vita, a statement of research interests, copies of relevant publications, and three letters of reference to: Dr. David S. Touretzky, Computer Science Department Carnegie Mellon University Pittsburgh, PA 15213-3891 Additional information about the CNBC may be found at http://www.cnbc.cmu.edu. Carnegie Mellon is an Equal Opportunity Employer. 
From atick at monaco.rockefeller.edu Thu Oct 23 08:44:49 1997 From: atick at monaco.rockefeller.edu (Joseph Atick) Date: Thu, 23 Oct 1997 08:44:49 -0400 Subject: Network:CNS Table of Contents Message-ID: <9710230844.ZM8556@monaco.rockefeller.edu> NETWORK: COMPUTATION IN NEURAL SYSTEMS Table of Contents Volume 8, Issue 4, November 1997 Pages: R77--R109, 355--466 TOPICAL REVIEW R77 Basal ganglia: structure and computations J Wickens PAPERS 355 Networks of spiking neurons can emulate arbitrary Hopfield nets in temporal coding W Maass and T Natschläger 373 Dynamics of a recurrent network of spiking neurons before and following learning D J Amit and N Brunel 405 Information processing by a noisy binary channel E Korutcheva, N Parga and J-P Nadal 425 Generalisation and discrimination emerge from a self-organising componential network: a speech example C J S Webber 441 Unsupervised discovery of invariances S Eglen, A Bray and J Stone 453 Second-hand supervised learning in Hebbian perceptrons M A P Idiart 465 AUTHOR INDEX (with titles), Volume 8 -- Joseph J. Atick Rockefeller University 1230 York Avenue New York, NY 10021 Tel: 212 327 7421 Fax: 212 327 7422 From pbolland at lbs.ac.uk Fri Oct 24 09:44:06 1997 From: pbolland at lbs.ac.uk (Peter Bolland) Date: Fri, 24 Oct 1997 13:44:06 UTC Subject: Computational Finance 97: Call for participation Message-ID: <4CAB1457A11@deimos.lbs.ac.uk> ANNOUNCEMENT AND CALL FOR PARTICIPATION PRELIMINARY PROGRAMME & REGISTRATION FORM _________________________________________________________ COMPUTATIONAL FINANCE 1997 _________________________________________________________ The Fifth International Conference on NEURAL NETWORKS IN THE CAPITAL MARKETS Monday-Wednesday, December 15-17, 1997 London Business School, London, England. After four years of continuous success and evolution, NNCM has emerged as a truly multi-disciplinary international conference. 
Born out of neurotechnology, NNCM now provides an international focus for innovative research on the application of a multiplicity of advanced decision technologies to many areas of financial engineering. It draws upon theoretical advances in financial economics and robust methodological developments in the statistical, econometric and computer sciences. The fifth NNCM conference will be held in London on December 15-17, 1997 under the new title COMPUTATIONAL FINANCE 1997 to reflect its multi-disciplinary nature. COMPUTATIONAL FINANCE 1997 is a research meeting where original, high-quality contributions are presented and discussed. In addition, a day of introductory tutorials (Monday, December 15) will be included to familiarise participants from different backgrounds with the financial and methodological aspects of the field. Location: The conference will be held at London Business School, which is situated near Regent's Park, London, and is a short walk from Baker Street Underground Station. Further directions, including a map, will be sent to all registrants. Registration and Mailing List: If you wish to be added to the mailing list or register for COMPUTATIONAL FINANCE 1997, please send your postal address, e-mail address, and fax number to the secretariat. __________________________________________________ Secretariat: __________________________________________________ Ms Deborah King, London Business School, Sussex Place, Regent's Park, London NW1 4SA, UK. E-mail: boguntula at lbs.ac.uk. Phone (+44) (0171)-262 50 50, Fax (+44) (0171) 724 78 75. 
__________________________________________________ WEB PAGE __________________________________________________ For more information on COMPUTATIONAL FINANCE 1997, please visit the NNCM web site at London Business School, homepage http://www.lbs.lon.ac.uk/desci/nncmhome.html registration form http://www.lbs.lon.ac.uk/desci/nncm97reg.html preliminary programme http://www.lbs.lon.ac.uk/desci/nncm97prog.html __________________________________________________ Preliminary programme: __________________________________________________ Computational Finance 1997 15-17th December 1997 London Business School __________________________________________________ Tutorials: Monday 15th December __________________________________________________ 9.15 - 10.15 Computational Finance: Challenges and Prospects Dr. Domingo Tavella Align Risk Analytics 10.45 - 12.45 An Introduction to Yield Curve Models Dr. Piotr Karazinski Citibank, UK 14.00 - 15.45 Uncertainty Analysis and Model Identification Prof. Chris Chatfield University of Bath, UK 16.15 - 18.00 Structural Time Series Analysis and the Kalman Filter Prof. Andrew Harvey Cambridge University, UK __________________________________________________ Day One: Tuesday 16th December __________________________________________________ ----------------Oral Session 1: Market Dynamics---------------- Invited Talk: Volatility Forecasting and Risk Management Prof. Francis X. Diebold, (University of Pennsylvania) Cross bicorrelations in high-frequency exchange rates: testing and forecasting. C. Brooks (ISMA Centre, University of Reading, UK), M. J. Hinich (University of Texas) Volume and return in the stock market: stability analysis and forecasting implications. J. del Hoyo and J. G. Llorente, (Madrid University) The multiplicative statistical mechanics of stock markets S. Solomon, (Hebrew University, Jerusalem) Time-varying risk premia from an asset allocation perspective - a GMDH analysis M. Steiner and S. 
Schneider, (Augsburg University, Germany) A data matrix to investigate independence, over-reaction and/or shock persistence in financial data R. Dacco and S. Satchell, (University of Cambridge, UK) With discussion by B. LeBaron, (MIT) ------Oral Session 2: Trading and Arbitrage Strategies--------------- Nonlinear equilibrium dynamics and investment strategy evaluation with cointegration R. N. Markellos, (Loughborough University, UK) Modelling asset prices using a portfolio of cointegration models approach A. N. Burgess, (London Business School) Technical analysis and central bank intervention C. Neely (Federal Reserve Bank of St. Louis), P. Weller, (University of Iowa, USA) With discussion by A. Timmerman (UCSD) Multitask learning in a neural VEC approach for exchange rate forecasting F. Rauscher, (Daimler Benz Research) Immediate and future rewards: reinforcement learning for trading systems and portfolios J. E. Moody, M. Saffell, Y. Liao, L. Wu (Oregon Graduate Institute, Portland) An evolutionary bootstrap method for selecting dynamic trading strategies B. LeBaron, (MIT) With discussion by A. S. Weigend (Stern Business School, New York University) ------------------------Poster Session 1---------------------------- __________________________________________________ Day Two: Wednesday 17th December __________________________________________________ ---Oral Session 3: Volatility Modelling and Option Pricing------- Invited Talk: Modelling S&P 100 volatility: the information content of stock returns Prof. S. Taylor, (Lancaster University, UK) Forecasting properties of neural network generated volatility estimates P. Ahmed, (University of North Carolina) Bootstrapping GARCH(1,1) models G. Maerker (Institut fur Techno- und Wirtschaftsmathematik, Kaiserslautern, Germany) Pricing and hedging derivative securities with neural networks and the homogeneity hint R. Gencay, (University of Windsor, Canada) With discussion by Y. 
Abu-Mostafa (CalTech) Recovering risk aversion from option prices and realised returns J. Jackwerth and M. Rubinstein, (Haas Business School, University of California, Berkeley) With discussion by A. Neuberger (London Business School ) --------Oral Session 4: Term Structure and Factor Models------------- Kalman filtering of generalised Vasicek term-structure models S. H. Babbs and K. B. Nowman, (First National Bank of Chicago, London, UK) Modelling the term structure of interest rates: a neural network perspective J. T. Connor and N. Towers, (London Business School) A non-parametric test for nonlinear cointegration J. Breitung, (Humboldt University, Berlin) With discussion by H. White (UCSD) Unconstrained and constrained time-varying factor sensitivities in equity investment management Y. Bentz and J. T. Connor, (London Business School) Discovering structure in finance using independent component analysis D. Back, (Frontier Research Program, RIKEN, Japan), A. S. Weigend, (Stern Business School, New York University) ---------------------Poster Session 2-------------------------------- __________________________________________________ Posters (oral presentations plus those below) __________________________________________________ Classification of sector allocation in the German stock market, E. Steurer, (Daimler-Benz Research, Germany) Are neural network and econometric forecasts good for trading ? The case of the Italian stock index future, R. Bramante, R. Colombo, G. Gabbi, (University Bocconi, Milan) Interest rates structure dynamics: a non-parametric approach, M. Cottrell, E. Bodt and P. Gregoire (Paris I University) Modifying the distributional properties of financial ratios to improve the performance of linear and non-linear discriminant systems, G. Albanis, J. A. Long and M. 
Hiscock (City University, London, UK) Time series techniques and neural networks: a combined framework to forecast USD/DEM exchange rate, F Bourgoin, (Millenium Global Investments Limited, London), A. Vigier (Decalog, Paris) Credit assessment using evolutionary MLP networks, A. Carvalho, E.F.M. Filho, A. B. Matias (Sao Paulo University, Brazil) Exploring corporate bankruptcy with two-level self-organising map, K. Kiviluoto, (Helsinki University of Technology, Finland), P. Gergius, (Kera Ltd, Finland) Incorporating prior knowledge about financial markets through neural multi-task learning, K. Bartlmae, S. Gutjahr and G. Nakhaeizadeh (University of Karlsruhe, Germany) Predicting time-series with a committee of independent experts based on fuzzy rules, M. Rast, (Ludwig-Maximilians-Universitat, Munich, Germany) Multiscale analysis of time-series based on a neuro-fuzzy chaos methodology applied to financial data, N. K. Kasabov, R. Kozma, (University of Otago, N.Z.) Estimating and forecasting non-stationary financial data with IIR-filters and CT (composed threshold) models, M. Wildi, (Switzerland) Probabilistic neural network for company failure prediction, Z. Yang, H. James, A. Packer (University of Portsmouth, UK) An improved parametric density model for risk analysis of FX returns, J. Utans (London Business School), P. Sondhi (Citibank) Currency forecasting using recurrent RBF networks optimised by genetic algorithms, A. Adamopoulos, A. Andreou, et al., (University of Patras, Greece) On the market timing ability of neural networks: an empirical study testing the forecasting performance, T. H. Hann, (Karlsruhe University, Germany), J. Hofmeister (University of Ulm) Exchange rate trading using a fast retraining procedure for generalised RBF networks, D. R. Dersch, B. Flower, and S. J. Pickard (Crux Cybernetics, Australia) Prediction of volatility and option prices using an extended Kalman filter, V. P. Kumar and S. 
Mukhergee, (MIT) The ex-ante classification of take-over targets using neural networks, D. Fairclough, (Buckinghamshire College, UK), J. Hunter (Brunel University, UK) Portfolio optimisation with cap weight restrictions, N. Wagner, (Bayerische Vereinsbank AG, Munich) Management of a futures portfolio using conditional mean-variance estimates from wavelet-encoding neural networks, D. L. Toulson and S. P. Toulson, (Intelligent Financial Systems Ltd., London) Selecting relative value stocks with nonlinear cointegration, C. Kollias, (Hughes Financial Analytics), K. Metaxas, (University of Athens) Dynamic hedging of property liability portfolios in a multiple objective framework, G. H. Dash Jnr, R. C. Hanumara, N. Kajiji (University of Rhode Island, US) Model complexity versus transparency: an empirical comparison of the tradeoffs among different trading models, R. Madhavan, V. Dhar and A. S. Weigend (Stern Business School, New York University) A constrained hybrid approach to option pricing, P. Lajbcygier (Monash University, Australia), J. T. Connor (London Business School) Predicting corporate financial distress using quantitative and qualitative data: a comparison of traditional and collapsible neural networks, Q. Booker, R. E. Dorsey, and J. D. Johnson, (University of Mississippi, USA) State space ARCH: forecasting volatility with a stochastic coefficient model, A. Veiga, M. C. Medeiros and C. Fernandes (PUC, Rio de Janeiro) Using option prices to recover probability distributions, F. Gonzales-Miranda (Swedish School of Economics, Helsinki), A. N. Burgess (London Business School) Multivariate mutual funds analysis using neural networks, A. d'Almeida Monteiro, C. E. Pedreira and C. P. Samanez (PUC, Rio de Janeiro, Brazil) Cointegration by MCA and modular MCA, L. Xu and W. M. Leung (Chinese University, Hong Kong) On the complexity of stock returns, M. A. Kaboudan, (Penn State University, US) Modelling volatility using state-space models, J. 
Timmer (Freiburg University, Germany), A. S. Weigend (Stern School of Business, New York University) __________________________________________________ COMPUTATIONAL FINANCE 97 Registration Form December 15-17, 1997 Name:___________________________________________________ Affiliation:________________________________________________ Mailing Address: __________________________________________ ________________________________________________________ Telephone:_______________________________________________ ***Please circle the applicable fees and write the total below*** Main Conference (December 16-17): Registration fee £450 Discounted fee for academicians £250 (letter on university letterhead required) Discounted fee for full-time students £100 (letter from registrar or faculty advisor required) Tutorials (December 15): You must be registered for the main conference in order to register for the tutorials. Morning Session Only £100 Afternoon Session Only £100 Both Sessions £150 Full-time students £50 (letter from registrar or faculty advisor required) TOTAL: £ Payment may be made by: (please tick) * Check payable to London Business School * VISA * Access * American Express Card Number:___________________________________ __________________________________________________ From aapo at myelin.hut.fi Fri Oct 24 10:33:51 1997 From: aapo at myelin.hut.fi (Aapo Hyvarinen) Date: Fri, 24 Oct 1997 17:33:51 +0300 Subject: Two TechReps on ICA and PP Message-ID: <199710241433.RAA02744@myelin.hut.fi> The following technical reports on independent component analysis and projection pursuit are available at: http://www.cis.hut.fi/~aapo/pub.html Aapo Hyvarinen: INDEPENDENT COMPONENT ANALYSIS BY MINIMIZATION OF MUTUAL INFORMATION Independent component analysis (ICA) is a statistical method for transforming an observed multidimensional random vector into components that are statistically as independent from each other as possible. 
In this paper, the linear version of the ICA problem is approached from an information-theoretic viewpoint, using Comon's framework of minimizing mutual information of the components. Using maximum entropy approximations of differential entropy, we introduce a family of new contrast (objective) functions for ICA, which can also be considered 1-D projection pursuit indexes. The statistical properties of the estimators based on such contrast functions are analyzed under the assumption of the linear mixture model. It is shown how to choose optimal contrast functions according to different criteria. Novel algorithms for maximizing the contrast functions are then introduced. Hebbian-like learning rules are shown to result from gradient descent methods. Finally, in order to speed up the convergence, a family of fixed-point algorithms for maximization of the contrast functions is introduced. Aapo Hyvarinen: NEW APPROXIMATIONS OF DIFFERENTIAL ENTROPY FOR INDEPENDENT COMPONENT ANALYSIS AND PROJECTION PURSUIT (To appear in NIPS*97) We derive a first-order approximation of the density of maximum entropy for a continuous 1-D random variable, given a number of simple constraints. This results in a density expansion which is somewhat similar to the classical polynomial density expansions by Gram-Charlier and Edgeworth. Using this approximation of density, an approximation of 1-D differential entropy is derived. The approximation of entropy is both more exact and more robust against outliers than the classical approximation based on the polynomial density expansions, without being computationally more expensive. The approximation has applications, for example, in independent component analysis and projection pursuit. 
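[Editor's note] The fixed-point scheme sketched in the first abstract can be made concrete. The snippet below is an illustrative reconstruction only, not the report's exact algorithm: a one-unit fixed-point iteration on whitened data using the tanh nonlinearity (the derivative of a log-cosh contrast, one member of the contrast-function family the abstract describes). All function and variable names here are our own.

```python
import numpy as np

def whiten(X):
    """Centre and whiten X (dims x samples) so its covariance is the identity."""
    X = X - X.mean(axis=1, keepdims=True)
    e, E = np.linalg.eigh(np.cov(X))
    return (E / np.sqrt(e)) @ E.T @ X

def fastica_one_unit(Xw, n_iter=200, tol=1e-8, seed=0):
    """One-unit fixed-point iteration for a log-cosh-type contrast.

    Xw must be whitened. The update is w <- E{x g(w'x)} - E{g'(w'x)} w
    with g(u) = tanh(u), renormalised to unit length each step.
    """
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(Xw.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        u = w @ Xw
        g, dg = np.tanh(u), 1.0 - np.tanh(u) ** 2
        w_new = (Xw * g).mean(axis=1) - dg.mean() * w   # fixed-point step
        w_new /= np.linalg.norm(w_new)
        converged = abs(abs(w_new @ w) - 1.0) < tol     # direction stable?
        w = w_new
        if converged:
            break
    return w

# Toy demo: recover one component from a 2-D mixture of a uniform
# (sub-Gaussian) and a Laplacian (super-Gaussian) source.
rng = np.random.default_rng(1)
S = np.vstack([rng.uniform(-1, 1, 5000), rng.laplace(size=5000)])
A = np.array([[1.0, 0.5], [0.5, 1.0]])   # mixing matrix
Xw = whiten(A @ S)
w = fastica_one_unit(Xw)
y = w @ Xw                               # one estimated independent component
```

Up to sign and scale, y should match one of the rows of S closely; in this toy run the absolute correlation with the best-matching source is near 1.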
From brychcy at informatik.tu-muenchen.de Fri Oct 24 09:37:32 1997 From: brychcy at informatik.tu-muenchen.de (Till Brychcy) Date: Fri, 24 Oct 1997 15:37:32 +0200 Subject: (Detailed) CFP: Fuzzy-Neuro Systems '98 in Munich Message-ID: <97Oct24.153739+0200met_dst.49149+255@papa.informatik.tu-muenchen.de> C A L L F O R P A P E R S 5th International GI-Workshop Fuzzy-Neuro Systems '98 - Computational Intelligence - 18 - 20 March 1998, Munich Fuzzy-Neuro Systems '98 is the fifth event of a well-established series of workshops with international participation. Its aim is to give an overview of the state of the art in research and development of fuzzy systems and artificial neural networks. Another aim is to highlight applications of these methods and to forge innovative links between theory and application by means of creative discussions. Fuzzy-Neuro Systems '98 is being organized by the Research Committee 1.2 "Inference Systems" (Fachausschuss 1.2 "Inferenzsysteme") of the German Society of Computer Science GI (Gesellschaft fur Informatik e. V.) and Technische Universitat Munchen in cooperation with Siemens AG. The workshop takes place at the European Patent Office in Munich from March 18 to 20, 1998. Scientific Topics ----------------- * theory and principles of multivalued logic and fuzzy logic * representation of fuzzy knowledge * approximate reasoning * fuzzy control in theory and practice * fuzzy logic in data analysis, signal processing and pattern recognition * fuzzy classification systems * fuzzy decision support systems * fuzzy logic in non-technical areas like business administration, management etc. 
* fuzzy databases * theory and principles of artificial neural networks * hybrid learning algorithms * neural networks in pattern recognition, classification, process monitoring and production control * theory and principles of evolutionary algorithms: genetic algorithms and evolution strategies * discrete parameter and structure optimization * hybrid systems like neuro-fuzzy systems, connectionistic expert systems etc. * special hardware and software Program Committee ----------------- Prof. Dr. W. Banzhaf, University of Dortmund Dr. M. Berthold, University of Karlsruhe Prof. Dr. Dr. h.c. W. Brauer, Technische Universitat Munchen (Chairman) Prof. Dr. G. Brewka, University of Leipzig Dr. K. Eder, Kratzer Automation GmbH, Unterschleissheim Prof. Dr. C. Freksa, University of Hamburg Prof. Dr. M. Glesner, Technical University of Darmstadt Prof. Dr. S. Gottwald, University of Leipzig Prof. Dr. A. Grauel, University of Paderborn, Dept. Soest Prof. Dr. H.-M. Gross, Technical University of Ilmenau Dr. A. Gunter, University of Bremen Dr. J. Hollatz, Siemens AG, Munich Prof. Dr. R. Isermann, Technical University of Darmstadt Prof. Dr. P. Klement, University of Linz, Austria Prof. Dr. R. Kruse, University of Magdeburg (Vice Chairman) Prof. Dr. B. Mertsching, University of Hamburg Prof. Dr. R. Nakaeizadeh, Daimler Research Laboratory, Ulm Prof. Dr. K. Obermayer, Technical University of Berlin Prof. Dr. G. Palm, University of Ulm Dr. R. Palm, Siemens AG, Munich Dr. L. Peters, Institute for System Design Technology, St. Augustin Prof. Dr. F. Pichler, University of Linz, Austria Dr. P. Protzel, FORWISS, Erlangen Prof. Dr. B. Reusch, University of Dortmund Prof. Dr. Rigoll, University of Duisburg Prof. Dr. R. Rojas, University of Halle Prof. Dr. B. Schurmann, Siemens AG, Munich (Vice Chairman) Prof. Dr. W. von Seelen, University of Bochum Prof. Dr. H. Thiele, University of Dortmund Prof. Dr. W. Wahlster, University of Saarbrucken Prof. Dr. H.-J. 
Zimmermann, Technical University of Aachen Organization Committee ---------------------- Prof. Dr. Dr. h.c. W. Brauer, Technische Universitat Munchen (Chairman) Dr. J. Hollatz, Siemens AG, Munich (Vice Chairman) C. Kirchmair, Technische Universitat Munchen C. Harms, GMD National Research Center for Information Technology, St. Augustin Organizational Information -------------------------- 30.11.1997: abridged version (English, 4 to 6 pages DIN A4 size) of the following structure: * title * author(s), address, phone, fax, e-mail * contents: 1. abstract 2. key words (not more than 5) 3. state of the art 4. new aspects 5. theory, simulation or experiment 6. results and conclusion 7. references December '97: notification of acceptance or rejection of contribution 28.01.1998: final camera-ready papers for proceedings (up to 8 pages DIN A4 size) Formatting instructions will be available soon at: http://wwwbrauer.informatik.tu-muenchen.de/~fns98/format.html Please send four copies of your scientific contribution to: Prof. Dr. Dr. h.c. W. Brauer - FNS '98 - Institut fur Informatik Technische Universitat Munchen D-80290 Munchen Germany If you would like to take part in the workshop without submitting a paper, please send your email address to fns98 at tiki.informatik.tu-muenchen.de We will send you a copy of the workshop program as soon as available. For further information visit the Internet homepage at: http://wwwbrauer.informatik.tu-muenchen.de/~fns98 From tony at salk.edu Sun Oct 26 11:09:20 1997 From: tony at salk.edu (Tony Bell) Date: Sun, 26 Oct 1997 08:09:20 -0800 Subject: NIPS hotel/registration reminders Message-ID: <3.0.32.19971026080903.0068e620@salk.edu> The deadline for early registration for NIPS is October 31. After that the costs go up a bit. You can register online on the NIPS web page, using your Visa or Mastercard. 
http://www.cs.cmu.edu/Groups/NIPS/NIPS97/ In addition, hotel rooms are held for us only until early November as follows: Denver Marriott Hotel (Conference), Nov 14. tel. 800-228-9290 Beaver Run Resort (Workshop), Nov 6. tel. 800-288-1282 Breckenridge Hilton (Workshop), Nov 4. tel. 800-321-8444 Further accommodation details (international telephone numbers etc.) are all on the NIPS web page. - Tony Bell, NIPS Publicity From smc at decsai.ugr.es Sun Oct 26 20:47:34 1997 From: smc at decsai.ugr.es (Serafin Moral) Date: Mon, 27 Oct 1997 01:47:34 +0000 Subject: UAI-98 Call for Papers Message-ID: <3453F2B6.2DB3C53D@decsai.ugr.es> We apologize if you receive multiple copies of this message. Please distribute to interested persons. ====================================================== C A L L F O R P A P E R S ====================================================== ** U A I 98 ** THE FOURTEENTH ANNUAL CONFERENCE ON UNCERTAINTY IN ARTIFICIAL INTELLIGENCE July 24-26, 1998 University of Wisconsin Business School Madison, Wisconsin, USA ======================================= Please visit the UAI-98 WWW page at http://www.uai98.cbmi.upmc.edu ************************************************************** CO-LOCATION ANNOUNCEMENT The 1998 UAI Conference will be co-located with ICML-98 (International Conference on Machine Learning) and COLT-98 (Computational Learning Theory). Registrants to any of the three conferences will be allowed to attend the technical sessions of the other conferences at no additional cost. Joint invited speakers, poster sessions and a panel session are planned for the three conferences. The day after the co-located conferences (Monday, July 27, 1998), full-day workshops and/or tutorials will be offered by each of ICML, COLT, and UAI. UAI will offer a full-day course in which an overview of the field of uncertain reasoning will be presented by a faculty of its distinguished researchers. 
The AAAI-98 conference technical program begins on Tuesday, July 28th. ========================================================================== UAI-98 will meet at the University of Wisconsin Business School, in close proximity to the Convention Center, where AAAI-98 will be held. * * * CALL FOR PAPERS Uncertainty management in artificial intelligence has now been established as a well-founded discipline, with a degree of development that has allowed the construction of practical applications that are able to solve difficult AI problems. Since 1985, the Conference on Uncertainty in Artificial Intelligence (UAI) has served as the central meeting on advances in methods for reasoning under uncertainty in computer-based systems. The conference is a primary international forum for exchanging results on the use of principled uncertain-reasoning methods, and it has helped the scientific community move along the path from theoretical foundations, to efficient algorithms, to successful applications. The UAI Proceedings have become a basic reference for researchers and practitioners who want to know about both theoretical advances and the latest applied developments in the field. We are very pleased to announce that UAI-98 will be co-located with ICML-98 (International Conference on Machine Learning) and COLT-98 (Computational Learning Theory). This will be an outstanding opportunity for members of the three communities to share ideas and techniques. The scope of UAI covers a broad spectrum of approaches to automated reasoning and decision making under uncertainty. Contributions to the proceedings address topics that advance theoretical principles or provide insights through empirical study of applications. Interests include quantitative and qualitative approaches, and traditional as well as alternative paradigms of uncertain reasoning. 
We encourage the submission of papers proposing new methodologies and tools for model construction, representation, learning, inference and experimental validation. Innovative ways to increase the expressive power and the applicability spectrum of existing methods are encouraged as well; hybrid approaches may, for example, provide one way to achieve these goals. Papers are welcome that present new applications of uncertain reasoning that stress the methodological aspects of their construction and use. Highlighting difficulties in existing procedures and pointing at the necessary advances in foundations and algorithms is considered an important role of presentations of applied research. Topics of interest include (but are not limited to): >> Foundations * Theoretical foundations of uncertain belief and decision * Uncertainty and models of causality * Representation of uncertainty and preference * Generalization of semantics of belief * Conceptual relationships among alternative calculi * Models of confidence in model structure and belief * Knowledge revision and combination >> Principles and Methods * Planning under uncertainty * Temporal reasoning * Markov processes and decisions under uncertainty * Qualitative methods and models * Automated construction of decision models * The representation and discovery of causal relationships * Uncertainty and methods for learning and data mining * Abstraction in representation and inference * Computation and action under limited resources * Control of computational processes under uncertainty * Time-dependent utility and time-critical decisions * Uncertainty and economic models of problem solving * Integration of logical and probabilistic inference * Statistical methods for automated uncertain reasoning * Hybridization of methodologies and techniques * Algorithms for uncertain reasoning * Advances in diagnosis, troubleshooting, and test selection * Formal languages to represent uncertain information * Data structures for 
representation and inference * Fusion of models * Uncertain reasoning and information retrieval * Enhancing the human-computer interface with uncertain reasoning * Automated explanation of results of uncertain reasoning >> Empirical Study and Applications * Empirical validation of methods for planning, learning, and diagnosis * Uncertain reasoning in embedded, situated systems (e.g., softbots) * Nature and performance of architectures for real-time reasoning * Experimental studies of inference strategies * Experience with knowledge-acquisition methods * Comparison of representation and inferential adequacy of different calculi * Methodologies for problem modeling For papers focused on applications in specific domains, we suggest that the following issues be addressed in the submission: - Why was it necessary to represent uncertainty in your domain? - What are the distinguishing properties of the domain and problem? - What kind of uncertainties does your application address? - Why did you decide to use your particular uncertainty formalism? - Which practical procedure did you follow to build the application? - What theoretical problems, if any, did you encounter? - What practical problems did you encounter? - Did users/clients of your system find the results useful? - Did your system lead to improvements in decision making? - What approaches were effective (ineffective) in your domain? - What methods were used to validate the effectiveness of the system? ================================= SUBMISSION AND REVIEW OF PAPERS ================================= Papers submitted for review should represent original, previously unpublished work. Papers should not be under review for presentation in any other conference; however, an extended version of the paper may be under review for publication in a scientific journal. Submitted papers will be carefully evaluated on the basis of originality, significance, technical soundness, and clarity of exposition. 
Papers may be accepted for presentation in plenary or poster sessions. There will be a joint poster session with the ICML-98 and COLT-98 Conferences. Some of the papers selected for this poster session may also have a plenary presentation at UAI. All accepted papers will be included in the Proceedings of the Fourteenth Conference on Uncertainty in Artificial Intelligence, published by Morgan Kaufmann Publishers. An outstanding student paper will be selected for special distinction. Submitted papers must be at most 20 pages of 12pt Latex article style or equivalent (about 4500 words). We strongly encourage the electronic submission of papers. To submit a paper electronically, send an electronic version of the paper (Postscript format) to the following address: uai98 at cbmi.upmc.edu The subject line of this message should be: $.ps, where $ is an identifier created from the last name of the first author, followed by the first initial of the author's first name. Multiple submissions by the same first author should be indicated by adding a number (e.g., pearlj2.ps) to the end of the identifier. Additionally, the paper abstract and data should be sent by using the electronic form at the following address: http://www.uai98.cbmi.upmc.edu/data.html Authors unable to submit papers electronically should send 5 copies of the complete paper to one of the Program Chairs at the addresses listed below. Authors unable to use the electronic form to submit the abstract should provide the following information (by sending a message to the e-mail address above): * Paper title (plain text) * Author names, including student status (plain text) * Surface mail address, e-mail address, and voice phone number for a contact author (plain text) * A short abstract including keywords (plain text) * Primary and secondary classification indices selected from conference topics listed above. 
* Indicate whether the paper is appropriate for a joint session with ICML-98 and COLT-98 * Indicate the preferred type of presentation: poster or plenary ++++++++++++++++++++++++++++++ Important Dates ++++++++++++++++++++++++++++++ >> All submissions must be received by: Monday, February 23, 1998 >> Notification of acceptance on or before: Friday, April 10, 1998 >> Camera-ready copy due: Friday, May 8, 1998 >> Conference dates: July 24, 25, 26, 1998 >> Full day course on Uncertain Reasoning: Monday, July 27, 1998 ========================== Conference E-mail Address ========================= Please send all inquiries (submissions and conference organization) to the following e-mail address: uai98 at cbmi.upmc.edu Program Co-chairs: =================== Gregory F. Cooper Center for Biomedical Informatics University of Pittsburgh Suite 8084 Forbes Tower 200 Lothrop Street Pittsburgh, PA 15213-2582 USA Phone: (412) 647-7113 Fax: (412) 647-7190 E-mail: gfc at cbmi.upmc.edu Serafin Moral Dpto. Ciencias de la Computacion e IA Universidad de Granada 18071 - Granada SPAIN Phone: +34 58 242819 Fax: +34 58 243317 E-mail: smc at decsai.ugr.es WWW: http://decsai.ugr.es/~smc General Conference Chair: ======================================================== Prakash P. 
Shenoy University of Kansas School of Business Summerfield Hall Lawrence, KS 66045-2003 USA Phone: (913) 864-7551 Fax: (913) 864-5328 E-mail: pshenoy at ukans.edu WWW: http://stat1.cc.ukans.edu/~pshenoy ======================================================== Refer to the UAI-98 WWW home page for late-breaking information: http://www.uai98.cbmi.upmc.edu From scheler at informatik.tu-muenchen.de Mon Oct 27 06:24:50 1997 From: scheler at informatik.tu-muenchen.de (Gabriele Scheler) Date: Mon, 27 Oct 1997 12:24:50 +0100 Subject: Two preprints on Language, Brain and Computation Message-ID: <97Oct27.122503+0100met_dst.49139+105@papa.informatik.tu-muenchen.de> Two preprints on Lexical Feature Learning and Narrative Understanding are available from Gabriele Scheler's homepage: http://www7.informatik.tu-muenchen.de/~scheler/publications.html ------------------------------------------------------------- 1. Lexical Feature Learning Constructing semantic representations using the MDL principle Niels Fertig and Gabriele Scheler Words receive a significant part of their meaning from use in communicative settings. The formal mechanisms of lexical acquisition, as they apply to rich situational settings, may also be studied in the limited case of corpora of written texts. This work constitutes an approach to deriving semantic representations for lexemes using techniques from statistical induction. In particular, a number of variations on the MDL principle were applied to selected sample sets and their influence on emerging theories of word meaning explored. We found that by changing the definition of description length for data and theory - which is equivalent to different encodings of data and theory - we may customize the emerging theory, augmenting and altering frequency effects. We also demonstrated the influence of the stochastic properties of the data on the size of the theory.
The results consist of a set of distributional properties of lexemes, which reflect cognitive distinctions in the meaning of words. ------------------------------------------------------------- 2. Narrative Understanding Connectionist Modeling of Human Event Memorization Processes with Application to Automatic Text Summarization Maria Aretoulaki, Gabriele Scheler and Wilfried Brauer We present a new approach to text summarization from the perspective of neuropsychological evidence and the related methodology of connectionist modeling. The goal of this project is the computational modeling of the specific neuropsychological processes involved in event and text memorization and the creation of a working model of text summarization as a specific problem area. Memorization and summarization are seen as inherently related processes: linguistic material (e.g. spoken stories or written reports) is compressed into a smaller unit, a "schema", which conveys the most central of the states and events described, making extensive use of feature representations of linguistic material. It is in this compressed form that the source material is "stored" in memory, and on this basis it is later retrieved. We discuss the ways in which these schemata are formed in memory and the associated processes of schema selection, instantiation and change - both to further the understanding of these processes and to promote the development of NLP applications concerning the automatic condensation of texts. ------------------------------------------------------------------------ From xli at sckcen.be Mon Oct 27 09:31:41 1997 From: xli at sckcen.be (Xiaozhong Li) Date: Mon, 27 Oct 1997 15:31:41 +0100 Subject: CFP: FLINS'98 workshop, September 14-16, 1998, in Antwerp, Belgium Message-ID: <9710271431.AA16501@vitoosf1.vito.be> Dear friends, Attached is an announcement of FLINS'98, which will be held September 14-16, 1998, in Antwerp, Belgium.
For other information, please visit our homepage: http://www.sckcen.be/dept/flinhome/welcome.html Regards. X. LI _____________________________________________________________________ * Xiaozhong Li. PhD, Currently Postdoctoral Research Fellow * * FLINS-Fuzzy Logic and Intelligent technologies in Nuclear Science * * Belgian Nuclear Research Centre (SCK.CEN) *----------* * Boeretang 200, B-2400 Mol, Belgium | _L_ * * phone: (+32-14) 33 22 30(O); 32 25 52(H) | /\X/\ * * fax: (+32-14) 32 15 29 | \/Z\/ * * e-mail:xli at sckcen.be http://www.sckcen.be/people/xli | / \ @ * *________________________________________________________*----------* FIRST CALL FOR PAPERS FLINS'98 Third International FLINS Workshop on Fuzzy Logic and Intelligent Technologies for Nuclear Science and Industry Organized by SCK.CEN September 14-16, 1998 The Astrid Park Plaza Hotel, Antwerp, Belgium. Abstract submission deadline: December 15, 1997 Notification of acceptance: February 15, 1998 Final manuscript deadline: April 15, 1998 Introduction. FLINS, an acronym for Fuzzy Logic and Intelligent technologies in Nuclear Science, is a recently established international research forum aiming to advance the theory and applications of fuzzy logic and novel intelligent technologies in nuclear science and industry. Following FLINS'94 and FLINS'96, the first and second international workshops on this topic, FLINS'98 aims to bring together scientists and researchers and to introduce the principles of intelligent systems and soft computing such as fuzzy logic (FL), neural networks (NN), genetic algorithms (GA) and any combinations of FL, NN, and GA, knowledge-based expert systems and complex problem-solving techniques within the nuclear industry and related research fields. FLINS'98 offers a unique international forum to present and discuss techniques that are new and useful for nuclear science and industry.
Workshop organization: SCK.CEN FLINS Co-sponsored by: OMRON Belgium NV, Ghent University, Belgium TRACTEBEL Energy Engineering, Belgium International Scientific Advisory Committee: Honorary chairman: L.A. Zadeh (University of California at Berkeley, USA) Chairman: H.-J. Zimmermann (RWTH Aachen, Germany) Members: Z. Bien (Korea Advanced Institute of Science & Technology) D. Dubois (Université Paul Sabatier, France) A. Fattah (IAEA, Austria) P. Govaerts (SCK.CEN, Belgium) C.F. Huang (Beijing Normal University, China) J. Kacprzyk (Polish Academy of Science) N. Kasabov (University of Otago, New Zealand) E.E. Kerre (Ghent University, Belgium) G.J. Klir (State University of New York at Binghamton, USA) M.M. Gupta (University of Saskatchewan, Canada) M. Modarres (University of Maryland at College Park, USA) J. Montero (Complutense University Madrid, Spain) Y. Nishiwaki (University of Vienna, Austria) T. Onisawa (University of Tsukuba, Japan) H. Prade (Université Paul Sabatier, France) M. Roubens (Université de Liège, Belgium) G. Resconi (Catholic University del S. Cuore, Brescia, Italy) Ph. Smets (Université Libre de Bruxelles, Belgium) H. Takagi (Kyushu Inst. of Design, Japan) E. Uchino (Kyushu Institute of Technology, Japan) A.J. van der Wal (TNO, the Netherlands) P. P. Wang (Duke University, USA) R.R. Yager (Iona College, USA) Organizing Committee: Honorary Chairman: E.E. Kerre (Ghent University, Belgium) SCK.CEN Advisors: P. D'hondt Chairman: D. Ruan Members: W. Bogaerts, B. Carlé, B. De Baets, G. de Cooman, R. Roman, B. Van de Walle, S. Lenaerts, C. Poortmans, H. Quets, H.
Aït Abderrahim TOPICS Contributions describing original work, research projects, and state-of-the-art reviews are solicited on the following (nonrestrictive) list of topics to be covered at FLINS'98: fuzzy logic, neural networks, genetic algorithms, learning techniques, robotics, man-machine interface, decision-support techniques, control theory, clustering, rough set theory, evidence theory, belief theory, functional modeling, with applications in nuclear science and industry, and related research fields, such as: nuclear reactor control, nuclear energy and environmental protection, safety assessment, human reliability, risk analysis, safeguards, production processes in the fuel cycle, dismantling, waste and disposal, power systems control, scheduling, load forecasting, telecommunications. SUBMISSION Authors are invited to submit a one-page abstract containing the paper title, the authors' names and affiliations, complete address (including email, fax, and phone) of the corresponding author(s), five keywords and the abstract (200-250 words), before December 15, 1997, by mail, fax, or email to the FLINS'98 Chairman (see below), or by filling in the abstract submission form on the FLINS'98 website. The organizers intend to have the conference proceedings available to the delegates (all accepted papers reviewed by the scientific committee of FLINS'98 will be published as a book by World Scientific Publishers). Conference Fee 1. Regular early registration (before April 15, 1998) 15,000 BEF late registration (after April 15, 1998) 18,000 BEF 2. Student/invited session chair: early registration (before April 15, 1998) 9,000 BEF late registration (after April 15, 1998) 12,000 BEF Registration fee (21% VAT incl.)
entitles everyone to: access to the conference sessions, a copy of the final program and the proceedings (in book form from World Scientific), coffee breaks with refreshments, the conference reception, lunches on all three days, and one free drink daily at the exclusive Diamond Club of the hotel. 1 US $ is approximately 35 BEF. Important Dates abstract submission deadline: December 15, 1997, notification of acceptance: February 15, 1998, final manuscript deadline: April 15, 1998. From jose at tractatus.rutgers.edu Tue Oct 28 11:00:18 1997 From: jose at tractatus.rutgers.edu (Stephen Hanson) Date: Tue, 28 Oct 1997 11:00:18 -0500 Subject: COGNITIVE SCIENCE@RUTGERS(Newark Campus) Message-ID: <199710281600.LAA22872@tractatus.rutgers.edu> The Psychology Department at Rutgers University (Newark Campus) is pleased to announce a new track in its graduate program for Cognitive Science. Please go directly to our WEB PAGE: www-psych.rutgers.edu for information on the program, research, faculty, stipends and applications. From Mary-Ellen_Flinn at brown.edu Tue Oct 28 11:30:38 1997 From: Mary-Ellen_Flinn at brown.edu (Mary-Ellen Flinn) Date: Tue, 28 Oct 1997 11:30:38 -0500 (EST) Subject: Instructor Position/Brown University Message-ID: Instructor for Semester II 1997-98 to Teach Cognitive Neuroscience Course at Brown University A temporary position is available in the Spring semester (II) of 1998 to teach a course in the Department of Neuroscience in the area of Cognitive Neuroscience. This course will deal with fundamental issues of Cognitive Neuroscience at a level appropriate for advanced undergraduate neuroscience concentrators and for graduate students. This lecture course emphasizes a systems approach to neuroscience and covers several neural systems from among consciousness, sleeping and waking, thinking, selection of action, higher visual and motor processes, sensorimotor integration, learning and memory, attention and emotion.
Discussions focus on cerebral cortical mechanisms of behavior and cognition, though subcortical neural mechanisms are also covered. Emphasis is on experimental work from functional neuroimaging in humans, behavioral neurophysiology, and observations from human pathology. Some degree of flexibility in course content is possible. Job Requirements: At least 1 year's prior teaching experience. Ph.D. in Brain or Behavioral Sciences. The position is available as an adjunct assistant, associate or full professor for one semester only. Appropriately trained postdoctoral fellows will be considered for this position. Strong teaching skills are essential. We encourage applications from women and minority candidates. Brown University is an Equal Opportunity Affirmative Action Employer. Interested individuals should send a CV and names of 3 references by November 20, 1997 to: John P. Donoghue, Ph.D. Chairman Department of Neuroscience Box 1953 Providence, RI 02912 From zenon at ruccs.rutgers.edu Tue Oct 28 12:17:30 1997 From: zenon at ruccs.rutgers.edu (Zenon Pylyshyn) Date: Tue, 28 Oct 1997 12:17:30 -0500 Subject: Immediate Post Doc at Rutgers NB Message-ID: <199710281717.MAA11711@ruccs.rutgers.edu> The Center for Cognitive Science at Rutgers, New Brunswick, NJ has an opening for a two-year Post-Doctoral Fellow to start as early as this January. Salary commensurate with experience and in line with that recommended by federal funding agencies. Emphasis will be on independent research in visual attention, with special focus on multiple-object indexing and tracking. The applicant's overall interests should match those of the lab, as described in URL: http://ruccs.rutgers.edu/finstlab/finstsum.html and in the reports and papers listed in URL: http://ruccs.rutgers.edu/faculty/pylyshyn.html The candidate is expected to have a background in visual science, including the methods of visual psychophysics and/or computer modeling.
Familiarity with the use of Mac, SGI and PC platforms for vision research is required. Mail applications with letter and CV to: Zenon Pylyshyn, Rutgers Center for Cognitive Science, Rutgers University, Busch Campus, Psychology Building Addition, New Brunswick, NJ 08903 From dkim at vlsi.donga.ac.kr Wed Oct 29 04:23:32 1997 From: dkim at vlsi.donga.ac.kr (Daijin Kim) Date: Wed, 29 Oct 1997 18:23:32 +0900 Subject: CFP: AFSS'98, June 18-21, 1998, Masan/Tongyoung, Kyungnam, Korea Message-ID: <34570086.BA8A797F@vlsi.donga.ac.kr> Attached is an announcement of AFSS'98, which will be held June 18-21, 1998, in Masan/Tongyoung, Kyungnam, Korea. For more information, please visit our homepage: http://www.donga.ac.kr/~djkim/afss98.html Regards. Daijin Kim ---------------------------------------------------------------------------- AFSS'98 THE 3RD ASIAN FUZZY SYSTEMS SYMPOSIUM June 18-21, 1998 Masan / Tongyoung, Kyungnam, Korea In Cooperation With Korea Fuzzy Logic and Intelligent Systems Society (KFIS) Organizing Institution Kyungnam University, Masan, Korea AFSS'98 : Conference Information The KFIS (Korea Fuzzy Logic and Intelligent Systems Society) is pleased to announce that the 3rd AFSS (AFSS'98) will be held in Masan and Tongyoung, Kyungnam province, during June 18-21, 1998. The first conference of this symposium was held in Singapore in November 1993, and the second in Kenting, Taiwan, in December 1996. AFSS'98 aims to encourage information exchange between researchers in Asia and worldwide in the fields of fuzzy systems and related areas. Masan and Tongyoung are famous scenic sites as well as historic harbor cities in Kyungnam province. The beautiful scenery and fine climate during the conference period will provide all participants with comfortable and productive meetings.
---------------------------------------------------------------------------- Conference Theme The state of the art and the future of soft computing and intelligent systems ---------------------------------------------------------------------------- Conference Objective The objective of the symposium is to encourage information exchange between researchers in Asia and the Western world in the fields of fuzzy systems and related areas. It is also an opportunity to present industrial applications, new technologies and products based on fuzzy sets, fuzzy logic and other soft computing methods. ---------------------------------------------------------------------------- Topics: AFSS'98 covers the following topics, but others relevant to fuzzy sets and systems are also welcome. Soft computing Fundamentals of fuzzy sets and fuzzy logic Approximate reasoning Qualitative and approximate modeling Learning and acquisition of approximate models Integration of fuzzy logic and neural networks Integration of fuzzy logic and evolutionary computing Hardware implementation of fuzzy logic and algorithms Design and synthesis of fuzzy logic controllers Applications to System modeling and control Computer vision Robotics and manufacturing Signal processing Image understanding Decision systems Finance Databases Information systems Virtual reality Man-machine interfaces, etc. ---------------------------------------------------------------------------- Important Dates: Submission of Abstracts of Papers: November 15, 1997 Notification of Acceptance: December 15, 1997 Submission of Camera-Ready Papers: March 15, 1998 Deadline for registration: May 15, 1998 Conference: June 18-21, 1998 ---------------------------------------------------------------------------- Conference Organization Honorary Chair L.A. Zadeh, UC Berkeley, USA General Co-Chair C.K. Park, Kyunghee University, Korea K.C. Min, Yonsei University, Korea International Advisory Committee Chair : Z.
Bien, KAIST, Korea Members: K. Hirota, Tokyo Institute of Technology, Japan M. Mizumoto, Osaka Electro-Communication Univ., Japan M. Mukaidono, Meiji University, Japan M. Sugeno, Tokyo Institute of Technology, Japan H. Tanaka, Osaka Prefecture University, Japan L. C. Jain, University of South Australia, Australia M. R. Pal, Indian Statistical Institute, India Y.M. Liu, Sichuan Union University, China H.W. Lee, Samsung Electronics Co., Korea M.S. No, LG Electronics Co., Korea H. F. Wang, National Tsing-hua University, Taiwan L. Ding, Institute of System Sciences, Singapore J.C. Bezdek, U.S.A. J.M. Keller, U.S.A. ---------------------------------------------------------------------------- Sponsors International Fuzzy Systems Association Japan Society of Fuzzy Theory and Systems, Japan Fuzzy Mathematics and Systems Association of China Indian Society for Fuzzy Mathematics and Information Processing Chinese Fuzzy Systems Association, Taiwan, China Kyungnam University, Masan, Kyungnam, Korea Ministry of Information and Communication, Korea Korea Research Foundation, Korea Korea Science & Engineering Foundation, Korea ---------------------------------------------------------------------------- Venue The Conference will take place at the campus of Kyungnam University (15,000 students), which is located in the southern part of Masan. Kyungnam University is one of the leading Korean private universities, very well equipped with modern audiovisual technology and projection facilities for all available media as well as with all kinds of Internet connections. The banquet and social events of AFSS'98 will be held in Tongyoung at the Tongyoung Marina Resort. Tongyoung is a beautiful port city known as the Naples of Korea. ---------------------------------------------------------------------------- Paper Submission All materials must be written in English. Extended abstracts (3 copies): double-spaced, 2 pages of A4 size white paper.
Choose the session from the above topics. Please indicate title, author(s), affiliation(s), mailing address(es), telephone and fax numbers and e-mail address. Abstracts must be received by November 15, 1997. Papers for those registered and presenting at the conference will be published in the conference proceedings; specific details on format and paper length will be sent upon acceptance. Send abstracts to: Prof. Yong Gi Kim Department of Computer Science Kyungsang National University 900 Gajwa, Chinju, Kyungnam, 660-701, Korea Tel : +82-591-751-5997 Fax : +82-591-762-1944 E-mail : ygkim at nongae.gsun.ac.kr ---------------------------------------------------------------------------- Pre-registration Form AFSS'98 The Third Asian Fuzzy Systems Symposium June 18-21, 1998 Masan/Tongyoung, Kyungnam, Korea Pre-registration Form Please type or write in block letters Last Name _______________________________________ First Name __________________[ ]Mr. [ ] Ms. [ ]Prof. [ ]Dr. Organization ___________________________________ ___________________________________ Address _________________________________________ _________________________________________ ZIP____________________City______________________ Country _________________________________________ Phone __________________________________________ Fax _____________________________________________ E-mail __________________________________________ Please mark the appropriate boxes.
[ ] I intend to attend the conference [ ] I intend to submit a paper on the following topic ______________________________________________ ______________________________________________ [ ] I wish to receive further information [ ] I intend to attend the industrial visit [ ] I intend to attend the cultural tour Date __________________ Signature _________________ To be sent as soon as possible, but not later than November 15, 1997 (by fax, e-mail, or mail) ---------------------------------------------------------------------------- All the information concerning the conference and its program can be obtained from: Secretariat of AFSS98 Prof. Seung Gook Hwang Program Committee Chair of AFSS98 Department of Industrial Engineering Kyungnam University 449 Weolyoung, Happo, Masan, Kyungnam, 631-701 Korea Tel : +82-551-49-2705 Fax : +82-551-43-8133 E-mail : hwangsg at hanma.kyungnam.ac.kr http://www.donga.ac.kr/~djkim/afss98.html ---------------------------------------------------------------------------- From jose at tractatus.rutgers.edu Tue Oct 28 11:02:23 1997 From: jose at tractatus.rutgers.edu (Stephen Hanson) Date: Tue, 28 Oct 1997 11:02:23 -0500 Subject: RUTGERS UNIVERSITY (Newark Campus) Junior Position in Cognitive Science Message-ID: <199710281602.LAA22890@tractatus.rutgers.edu> Please Post- Oct 27, 1997 Announcement-- ASSISTANT PROFESSOR -- Rutgers University (Newark Campus). Rutgers University-Newark Campus: The Department of Psychology anticipates making one tenure-track appointment in Cognitive Science at the Assistant Professor level. Candidates should have an active research program in one or more of the following areas: action, learning, high-level vision, language. Of particular interest are candidates who combine one or more of these research interests with cognitive neuroscience, mathematical and/or computational approaches. The position calls for candidates who are effective teachers at both the graduate and undergraduate levels.
Review of applications will begin on December 15, 1997. Rutgers University is an equal opportunity/affirmative action employer. Qualified women and minority candidates are especially encouraged to apply. Send CV and three letters of recommendation to Professor S. J. Hanson, Chair, Department of Psychology - Cognitive Search, Rutgers University, Newark, NJ 07102. Email inquiries can be made to cogsci at psychology.rutgers.edu From jose at tractatus.rutgers.edu Tue Oct 28 10:26:46 1997 From: jose at tractatus.rutgers.edu (Stephen Hanson) Date: Tue, 28 Oct 1997 10:26:46 -0500 Subject: SENIOR POSITION-RUTGERS UNIVERSITY (Newark Campus) COGNITIVE NEUROSCIENCE Message-ID: <199710281526.KAA22530@tractatus.rutgers.edu> Please Post- Oct 28, 1997 **** NEW ANNOUNCEMENT-- SENIOR POSITION AT RUTGERS UNIVERSITY-(Newark Campus) **** ***COGNITIVE NEUROSCIENCE *** Rutgers University-Newark Campus: The Department of Psychology anticipates making one senior-level appointment in Cognitive Neuroscience. We seek applicants with a demonstrated program of interdisciplinary research and teaching in areas such as cognitive psychology, computation, imaging, or neuroscience. Areas of research are open; however, we hope to find candidates who can expand potential connections with the nearby engineering school (NJIT) and/or UMDNJ (with a focus on an fMRI research facility). The position calls for candidates who are effective teachers at both the graduate and undergraduate levels. Review of applications will begin on December 15, 1997. Rutgers University is an equal opportunity/affirmative action employer. Qualified women and minority candidates are especially encouraged to apply. Please send current CV and three letters of recommendation to Professor S. J. Hanson, Chair, Department of Psychology Cognitive Neuroscience Search, Rutgers University, Newark, NJ 07102.
From mephu at cril.univ-artois.fr Wed Oct 29 12:00:50 1997 From: mephu at cril.univ-artois.fr (Engelbert Mephu-Nguifo) Date: Wed, 29 Oct 1997 18:00:50 +0100 Subject: JFA'98 - 1st Call for papers Message-ID: <34576BC2.310A2D5D@cril.univ-artois.fr> First Call for Papers Thirteenth French-speaking Conference on Machine Learning Arras, May 18-20, 1998 Submission deadline: February 13, 1998 WWW temporary location: http://www.lifl.fr/~mephu-ng/conf/jfa98_uk.html WWW permanent location: http://www.univ-artois.fr/jfa98 General Information JFA'98 is the annual French-speaking conference on Machine Learning. It consists of reviewed papers, invited lectures and tutorials. JFA accepts both symbolic and numeric approaches in machine learning, and research or application papers in this area. Papers may be submitted in English, but the final version must be written in French. The thirteenth JFA conference will be held on May 18-20, 1998 at Artois University (Arras). Submissions Papers relevant to the discipline of Machine Learning are solicited, including, but not limited to: Applications of Machine Learning Case-based Learning Computational Learning Theory Data Mining Evolutionary Computation Hybrid Learning Systems Inductive Learning Inductive Logic Programming Knowledge Discovery in Databases Language Learning Learning and Problem Solving Learning by Analogy Learning in Multi-Agent Systems Learning in Dynamic Domains Learning to Search Multistrategy Learning Neural Networks Reinforcement Learning Robot Learning Scientific Discovery Papers are limited to 12 pages (using a 10pt Times Roman font, single-spaced, with 3cm margins on all sides) including figures, title page, references, and appendices (see Format instructions). The papers will be refereed according to clarity and overall quality criteria, focusing primarily on their relevance to the conference. Email submissions are strongly preferred.
Please send an attached PostScript file to jnicolas at irisa.fr Those unable to produce a PostScript file may send 4 hardcopies of their paper submission to the program chair: Jacques Nicolas IRISA - INRIA Campus Universitaire de Beaulieu, 35042 Rennes Cedex, France Tel: (+33) 2 99 84 73 12 E.mail: jnicolas at irisa.fr Timetable February 13, 1998 deadline for submission April 6, 1998 notification of acceptance/rejection May 4, 1998 deadline for final versions of papers May 18-20, 1998 JFA'98 Program committee Conference Chair: Jacques Nicolas, IRISA - INRIA, Rennes, jnicolas at irisa.fr Members: Michel Benaim Université Paul Sabatier, Toulouse Francesco Bergadano Università di Torino, Italy Gilles Bisson Imag, Grenoble Pierre Brezellec Université de Paris 13 Stéphane Canu Heudiasyc, Compiègne Antoine Cornuejols LRI, Orsay Colin De la Higuera Université de St Etienne Patrick Gallinari Laforia, Paris Frédérick Garcia Inra, Toulouse Olivier Gascuel Lirmm, Montpellier Engelbert Mephu Nguifo CRIL, Université d'Artois, Lens Laurent Miclet Enssat, Lannion Mohamed Quafafou IRIN, Nantes Céline Rouveirol LRI, Orsay Michèle Sebag LMS, Paris Dominique Snyers ENSTbr, Brest Christel Vrain Université d'Orléans Jean Daniel Zucker Laforia, Paris Organizers: Engelbert Mephu Nguifo Local Arrangement Chair CRIL - IUT de Lens - Université d'Artois Claire Nédellec Publicity Chair LRI, Orsay Jean Daniel Zucker Tutorial Chair Laforia, Paris Secretariat: CRIL - JFA'98 IUT de Lens - Université d'Artois Rue de l'Université SP 16 62307 Lens cedex, France Tel: (33) 3 21 79 32 55 / 73 Fax: (33) 3 21 79 32 72 E.mail: jfa98 at cril.univ-artois.fr