From cas-cns at cns.bu.edu Thu Aug 1 09:29:15 1996
From: cas-cns at cns.bu.edu (Boston University - CAS/CNS)
Date: Thu, 01 Aug 1996 09:29:15 -0400
Subject: CALL FOR PAPERS: Vision, Recognition, Action
Message-ID: <199608011329.JAA05650@cns.bu.edu>

CALL FOR PAPERS

International Conference on
VISION, RECOGNITION, ACTION: NEURAL MODELS OF MIND AND MACHINE
May 29--31, 1997

Sponsored by the Center for Adaptive Systems and the Department of Cognitive and Neural Systems, Boston University, with financial support from the Defense Advanced Research Projects Agency and the Office of Naval Research.

This conference will include 21 invited lectures and contributed lectures and posters by experts on the biology and technology of how the brain and other intelligent systems see, understand, and act upon a changing world.

CALL FOR ABSTRACTS: Contributed abstracts by active modelers of vision, recognition, or action in cognitive science, computational neuroscience, artificial neural networks, artificial intelligence, and neuromorphic engineering are welcome. They must be received, in English, by January 31, 1997. Notification of acceptance will be given by February 28, 1997. A meeting registration fee of $35 for regular attendees and $25 for students must accompany each Abstract. See Registration Information below for details. The fee will be returned if the Abstract is not accepted for presentation and publication in the meeting proceedings.

Each Abstract should fit on one 8 x 11" white page with 1" margins on all sides, single-column format, single-spaced, Times Roman or similar font of 10 points or larger, printed on one side of the page only. Fax submissions will not be accepted. Abstract title, author name(s), affiliation(s), mailing, and email address(es) should begin each Abstract. An accompanying cover letter should include: full title of Abstract; corresponding author and presenting author name, address, telephone, fax, and email address.
Preference for oral or poster presentation should be noted. (Talks will be 15 minutes long. Posters will be up for a full day. Overhead, slide, and VCR facilities will be available for talks.) Abstracts which do not meet these requirements or which are submitted with insufficient funds will be returned.

The original and 3 copies of each Abstract should be sent to: Neural Models of Mind and Machine, c/o Cynthia Bradford, Boston University, Department of Cognitive and Neural Systems, 677 Beacon Street, Boston, MA 02215. The program committee will determine whether papers will be accepted for oral or poster presentation, or rejected.

REGISTRATION INFORMATION: Since seating at the meeting is limited, early registration is recommended. To register, please fill out the registration form below. Student registrations must be accompanied by a letter of verification from a department chairperson or faculty/research advisor. If accompanied by an Abstract or if paying by check, mail to: Neural Models of Mind and Machine, c/o Cynthia Bradford, Boston University, Department of Cognitive and Neural Systems, 677 Beacon Street, Boston, MA 02215. If paying by credit card, mail as above, or fax to (617) 353-7755, or email to cindy at cns.bu.edu. The registration fee will help to pay for a reception, 6 coffee breaks, and the meeting proceedings.

STUDENT FELLOWSHIPS: A limited number of fellowships for PhD candidates and postdoctoral fellows are available to at least partially defray meeting travel and living costs. The deadline for applying for fellowship support is January 31, 1997. Applicants will be notified by February 28, 1997. Each application should include the applicant's CV, including name; mailing address; email address; current student status; faculty or PhD research advisor's name, address, and email address; relevant courses and other educational data; and a list of research articles.
A letter from the listed faculty or PhD advisor on official institutional stationery should accompany the application and summarize how the candidate may benefit from the meeting. Students who also submit an Abstract need to include the registration fee with their Abstract. Reimbursement checks will be distributed after the meeting. Their size will be determined by student need and the availability of funds.

TUTORIALS: A day of tutorials will be held on May 28. Details will follow soon.

MEETING INFORMATION: For meeting updates, see the web site at http://cns-web.bu.edu/cns-meeting/.

REGISTRATION FORM (Please Type or Print)

Vision, Recognition, Action: Neural Models of Mind and Machine
Boston University, Boston, Massachusetts
May 29--31, 1997

Mr/Ms/Dr/Prof:
Name:
Affiliation:
Address:
City, State, Postal Code:
Phone and Fax:
Email:

The registration fee includes the meeting program, reception, coffee breaks, and meeting proceedings.

[ ] $35 Regular
[ ] $25 Student

Method of Payment:

Enclosed is a check made payable to "Boston University". Checks must be made payable in US dollars and issued by a US correspondent bank. Each registrant is responsible for any and all bank charges.

I wish to pay my fees by credit card (MasterCard, Visa, or Discover Card only).
Type of card:
Name as it appears on the card:
Account number:
Expiration date:
Signature and date:

From sajda at peanut.sarnoff.com Thu Aug 1 14:54:46 1996
From: sajda at peanut.sarnoff.com (Paul Sajda x2961)
Date: Thu, 1 Aug 1996 14:54:46 -0400
Subject: Positions in Adaptive Information and Signal Processing
Message-ID: <9608011854.AA00702@ironman.sarnoff.com>

David Sarnoff Research Center

The Computing Sciences Research Group of the David Sarnoff Research Center currently has openings for two technical positions in Adaptive Information and Signal Processing.

1. Member Technical Staff

PhD in Electrical Engineering, Computer Engineering, Computer Science, or a related discipline.
Experience in image and/or signal processing (preferably medical image or signal processing), neural networks, pattern recognition, and computer vision. Will conduct research and build and explore commercial opportunities in areas of medical imaging, object recognition/detection, speech recognition, and/or data mining.

Special Requirements: Interest in applying neural networks and image understanding to applications in medical image and speech processing. Ambition to help develop new commercial neural network/vision business. Excellent communication skills and willingness to bring in new business and extend the application domain of the group. Experience with MATLAB, C and C++ programming, and the UNIX operating system.

2. Associate Member Technical Staff

Bachelor of Science in Electrical Engineering, Computer Engineering, or Computer Science. Experience in image and/or signal processing. Responsibilities will include developing and prototyping software for applications in medical imaging, signal processing, and data mining.

Special Requirements: Experience with MATLAB, C and C++ programming, and the UNIX operating system. Background in image and/or signal processing. Experience in neural networks and data mining preferred. Comfortable with building X-based graphical user interfaces.

Please send resumes and correspondence via mail to:

Dr. Paul Sajda
David Sarnoff Research Center
CN5300
Princeton, NJ 08543-5300

or preferably by e-mail to: csrg at sarnoff.com. PostScript is preferred for e-mail submissions.

From becker at curie.psychology.mcmaster.ca Thu Aug 1 22:28:19 1996
From: becker at curie.psychology.mcmaster.ca (Sue Becker)
Date: Thu, 1 Aug 1996 22:28:19 -0400 (EDT)
Subject: REGISTRATION FOR NIPS*96
Message-ID:

REGISTRATION FOR NIPS*96
Neural Information Processing Systems
Tenth Annual Conference
Monday December 2 - Saturday December 7, 1996
Denver, Colorado

The NIPS*96 registration brochure is now available online.
NIPS*96 is the tenth meeting of an interdisciplinary conference which brings together cognitive scientists, computer scientists, engineers, neuroscientists, physicists, and mathematicians interested in all aspects of neural processing and computation. The conference will include invited talks and oral and poster presentations of refereed papers. The conference is single track and is highly selective. Preceding the main session (Dec. 3-5), there will be one day of tutorial presentations (Dec. 2), both in Denver, Colorado. Following will be two days of focused workshops on topical issues at Snowmass, Colorado, a world-class ski resort (Dec. 6-7).

The registration brochure and other conference information may be retrieved via the World Wide Web at http://www.cs.cmu.edu/Web/Groups/NIPS. We expect to offer online registration soon from the NIPS web site. Registration material and other information may also be obtained by writing to:

NIPS*96 Registration
Conference Consulting Associates
451 N. Sycamore
Monticello, IA 52310
fax: (319) 465-6709 (attn: Denise Prull)
e-mail: nipsinfo at salk.edu

REGISTRATION FEES:

Conference (includes Proceedings, Reception, Banquet and 3 Continental Breakfasts)
  Regular: $285.00 ($360.00 after Oct. 31, 1996)
  Full-time students, with I.D.: $100.00 ($150.00 after Oct. 31, 1996)

Workshops
  Regular: $150.00 ($200.00 after Oct. 31, 1996)
  Full-time students, with I.D.: $75.00 ($125.00 after Oct. 31, 1996)

Tutorials
  Regular: $150.00
  Full-time students, with I.D.: $50.00

TUTORIAL PROGRAM
December 2, 1996

Session I: 09:30-11:30
Mostly statistical methods for language processing
Dan Jurafsky, University of Colorado at Boulder

From postma at cs.rulimburg.nl Fri Aug 2 09:48:43 1996
From: postma at cs.rulimburg.nl (Eric Postma)
Date: Fri, 2 Aug 96 15:48:43 +0200
Subject: geometric phase shift
Message-ID: <9608021348.AA02835@bommel.cs.rulimburg.nl>

The following paper (draft version) is available from our archive:

GEOMETRIC PHASE SHIFT IN A NEURAL OSCILLATOR

Eric Postma, Jaap van den Herik and Patrick Hudson
Computer Science Department
University of Maastricht
P.O. Box 616, 6200 MD Maastricht
The Netherlands

ABSTRACT
This paper studies the effect of slow cyclic variation of parameters on the phase of the oscillating Morris-Lecar model. In chemical oscillators it is known that a phase shift, called the geometric phase shift, is observed upon return to an initial point in parameter space. We find geometric phase shifts for a two-parameter variation in the Morris-Lecar model. As with the chemical oscillator, the magnitude of the shift is proportional to the area enclosed by the path traced through the parameter space. It is argued that the geometric phase shift may subserve many biological functions. We conclude that the geometric phase shift may be functionally relevant for neural computation.

ftp://ftp.cs.rulimburg.nl/pub/papers/postma/gps.ps.Z (or gps.ps)

We welcome your comments and suggestions.

Eric Postma

From mm at santafe.edu Fri Aug 2 18:06:16 1996
From: mm at santafe.edu (Melanie Mitchell)
Date: Fri, 2 Aug 1996 16:06:16 -0600 (MDT)
Subject: book announcement
Message-ID: <199608022206.QAA26640@shabikeschee.santafe.edu>

Announcing a new book:

ADAPTIVE INDIVIDUALS IN EVOLVING POPULATIONS: MODELS AND ALGORITHMS

edited by Richard K.
Belew and Melanie Mitchell

Proceedings Volume XXVI, Santa Fe Institute Studies in the Sciences of Complexity
Addison-Wesley, Reading, MA, 1996

ABOUT THE BOOK

The theory of evolution has been most successful explaining the emergence of new species in terms of their morphological traits. Ethologists teach that behaviors, too, qualify as first-class phenotypic features, but evolutionary accounts of behaviors have been much less satisfactory. In part this is because maturational "programs" transforming genotype to phenotype are "open" to environmental influences affected by behaviors. Further, many organisms are able to continue to modify their behavior, i.e., learn, even after fully mature. This creates an even more complex relationship between the genotypic features underlying the mechanisms of maturation and learning and the adapted behaviors ultimately selected.

A meeting held at the Santa Fe Institute during the summer of 1993 brought together a small group of biologists, psychologists, and computer scientists with shared interests in questions such as these. This volume consists of approximately two dozen papers that explore interacting adaptive systems from a range of interdisciplinary perspectives. About half the articles are classic, seminal references on the subject, ranging from biologists like Lamarck and Waddington to psychologists like Piaget and Skinner. The other papers represent new work by the workshop participants. The role played by mathematical and computational tools, both as models of natural phenomena and as algorithms useful in their own right, is particularly emphasized in these new papers. In all cases the chapters have been augmented by specially written prefaces. In the case of the reprinted classics, the prefaces help to put the older papers in a modern context.
For the new papers, the prefaces have been written by colleagues from a discipline other than the paper's authors, and highlight, for example, what a computer scientist can learn from a biologist's model, or vice versa. Through these cross-disciplinary "dialogues" and a glossary collecting multidisciplinary connotations of pivotal terms, the process of interdisciplinary investigation itself becomes a central theme.

ORDERING INFORMATION

This series is published by The Advanced Book Program, Addison-Wesley Publishing Company, One Jacob Way, Reading, MA 01867. Please contact your local bookstore or, for credit card orders, call Addison-Wesley Publishing Company at (800) 447-2226. For more information on this book, visit the web page: http://www.santafe.edu/sfi/publications/Bookinfo/aiineptofc.html

------------------------------------------------------------------

ADAPTIVE INDIVIDUALS IN EVOLVING POPULATIONS: MODELS AND ALGORITHMS

TABLE OF CONTENTS

Chapter 1: Introduction - R. K. Belew & M. Mitchell

BIOLOGY

OVERVIEW
Chapter 2: Adaptive Computation in Ecology and Evolution: A Guide to Future Research - J. Roughgarden, A. Bergman, S. Shafir, and C. Taylor

REPRINTED CLASSICS
Chapter 3: The Classics in Their Context, and in Ours - J. Schull
Chapter 4: Of the Influence of the Environment on the Activities and Habits of Animals, and the Influence of the Activities and Habits of These Living Bodies in Modifying Their Organisation and Structure - J. B. Lamarck
Chapter 5: A New Factor in Evolution - J. M. Baldwin
Chapter 6: On Modification and Variation - C. Lloyd Morgan
Chapter 7: Canalization of Development and the Inheritance of Acquired Characters - C. H. Waddington
Chapter 8: The Baldwin Effect - G. G. Simpson
Chapter 9: The Role of Somatic Change in Evolution - G. Bateson

NEW WORK
Chapter 10: A Model of Individual Adaptive Behavior in a Fluctuating Environment - L. A. Zhivotovsky, A. Bergman, and M. W. Feldman (Preface by R. K.
Belew)
Chapter 11: The Baldwin Effect in the Immune System: Learning by Somatic Hypermutation - R. Hightower, S. Forrest, and A. S. Perelson (Preface by W. Hart)
Chapter 12: The Effect of Memory Length on Individual Fitness in a Lizard - S. Shafir and J. Roughgarden (Preface by M. L. Littman and F. Menczer; Appendix by F. Menczer, W. E. Hart, and M. L. Littman)
Chapter 13: Latent Energy Environments - F. Menczer and R. K. Belew (Preface by J. Roughgarden)

PSYCHOLOGY

OVERVIEW
Chapter 14: The Causes and Effects of Evolutionary Simulation in the Behavioral Sciences - P. M. Todd

REPRINTED CLASSICS
Chapter 15: Excerpts from "Principles of Biology" - H. Spencer (Preface by P. G. Godfrey-Smith)
Chapter 16: Excerpts from "Principles of Psychology" - H. Spencer (Preface by P. G. Godfrey-Smith)
Chapter 17: William James and the Broader Implications of a Multilevel Selectionism - J. Schull
Chapter 18: Excerpts from "The Phylogeny and Ontogeny of Behavior" - B. F. Skinner
Chapter 19: Excerpts from "Adaptation and Intelligence: Organic Selection and Phenocopy" - J. Piaget (Preface by O. Miglino & R. K. Belew)
Chapter 20: Selective Costs and Benefits of Learning - T. D. Johnston (Preface by P. M. Todd)

NEW WORK
Chapter 21: Sexual Selection and the Evolution of Learning - P. M. Todd (Preface by S. Shafir)
Chapter 22: Discontinuity in Evolution: How Different Levels of Organization Imply Preadaptation - O. Miglino, S. Nolfi, and D. Parisi (Preface by M. Mitchell)
Chapter 23: The Influence of Learning on Evolution - D. Parisi and S. Nolfi (Preface by W. Hart)

COMPUTER SCIENCE

OVERVIEW
Chapter 24: Computation and the Natural Sciences - R. K. Belew, M. Mitchell, and D. Ackley

REPRINTED CLASSICS
Chapter 25: How Learning Can Guide Evolution - G. Hinton & S. Nowlan (Preface by M. Mitchell and R. K. Belew)
Natural Selection: When Learning Guides Evolution - J. Maynard Smith

NEW WORK
Chapter 26: Simulations Combining Evolution and Learning - M. L. Littman (Preface by M. Mitchell)
Chapter 27: Optimization with Genetic Algorithm Hybrids that Use Local Search - W. Hart & R. K. Belew (Preface by C. Taylor)

GLOSSARY
INDEX

From scott at cpl_mmag.nhrc.navy.mil Thu Aug 1 18:27:33 1996
From: scott at cpl_mmag.nhrc.navy.mil (Scott Makeig)
Date: Thu, 1 Aug 1996 15:27:33 -0700 (PDT)
Subject: possible postdoctoral positions in biomedical signal processing
Message-ID: <199608012227.PAA09986@cpl_mmag.nhrc.navy.mil>

POSSIBLE NATIONAL RESEARCH COUNCIL POSTDOCTORAL OPPORTUNITY

I anticipate possible funding this autumn of one or two National Research Council postdoctoral Research Associate positions in a Neural Human-System Interface Development project for Online Alertness Monitoring using EEG and video signals.

One position would involve design and testing of experiments in use of EEG-based alertness information to improve human-computer interfaces in military and transportation-related settings. An ideal candidate would have a strong background in engineering and signal processing, with interests in cognitive neuroscience and applications to human-computer interface design.

A second position would focus on combining time-series analysis, neural networks, and information theory to study applications of independent component analysis to brain imaging and EEG data. The ideal candidate would have a strong background in time-series analysis, information theory, and neural networks, with interests in blind separation and biomedical information processing.

Interested persons can learn more via www at: http://128.49.52.9. As there is an Aug. 12 deadline for application to the NRC program, immediate inquiries via email are advisable.
Scott Makeig
makeig at nhrc.navy.mil

From cristina at idsia.ch Mon Aug 5 06:08:29 1996
From: cristina at idsia.ch (Cristina Versino)
Date: Mon, 5 Aug 96 12:08:29 +0200
Subject: Paper available: Learning Fine Motion by Using the Hierarchical Extended Kohonen Map
Message-ID: <9608051008.AA04293@fava.idsia.ch>

``Learning Fine Motion by Using the Hierarchical Extended Kohonen Map''

Cristina Versino and Luca Maria Gambardella
IDSIA, Corso Elvezia 36, 6900 Lugano, Switzerland
cristina at idsia.ch, luca at idsia.ch

A Hierarchical Extended Kohonen Map (HEKM) learns to associate actions to perceptions under the supervision of a planner: they cooperate to solve path finding problems. We argue for the utility of using the hierarchical version of the KM instead of the ``flat'' KM. We measure the benefits of cooperative learning due to the interaction of neighboring neurons in the HEKM. We highlight a beneficial side-effect obtained by transferring motion skill from the planner to the HEKM, namely, smoothness of motion.

In Proc. ICANN96, International Conference on Artificial Neural Networks, Bochum, Germany, 17--19 July, pp. 221--226.

You can obtain a copy (6 pages, 135K in compressed form) via:
1) netscape: ftp://ftp.idsia.ch/pub/cristina/icann96.ps.gz
2) ftp
3) visiting the recent IDSIA papers page: http://www.idsia.ch/reports.html

From davec at cogs.susx.ac.uk Mon Aug 5 05:19:23 1996
From: davec at cogs.susx.ac.uk (Dave Cliff)
Date: Mon, 05 Aug 1996 10:19:23 +0100
Subject: Scholarships for EASy MSc at Sussex, U.K.
Message-ID: <3205BC9B.6957@cogs.susx.ac.uk>

The University of Sussex at Brighton
School of Cognitive and Computing Sciences
MSc in Evolutionary and Adaptive Systems

CYBERLIFE(tm) SCHOLARSHIPS

The University of Sussex School of Cognitive and Computing Sciences is pleased to announce the availability of up to three scholarships for the Master of Science in Evolutionary and Adaptive Systems (EASy MSc). The scholarships are funded by CyberLife Ltd, Cambridge, UK.
CyberLife Ltd is a subsidiary of Millennium Interactive Ltd, a leading producer of games and entertainment software. CyberLife Ltd conducts research and development of software based on technologies from Artificial Life and Evolutionary and Adaptive Systems.

Each scholarship will provide funding for fees and maintenance for the one-year full-time course at a level equivalent to the standard EPSRC postgraduate studentship rate (approx 4800 pounds). Candidates wishing to apply for a CyberLife Scholarship should already hold an offer of a place on the EASy MSc. Candidates who have not yet applied for admission to the EASy MSc should request an application form by contacting either the Postgraduate Office of the University of Sussex or the COGS Graduate Secretary, Linda Thompson. Contact addresses are given at the end of this e-mail.

Scholarship candidates will be shortlisted and interviewed at CyberLife's offices in Cambridge, UK. Successful applicants awarded a CyberLife Scholarship will study in Brighton at the University of Sussex from October until April, attending the EASy MSc taught courses. After successful completion of the taught courses, CyberLife scholars will transfer to Cambridge, working on their summer research project at CyberLife's offices until the completion of their Master's thesis at the end of August. The scholars remain students of the University throughout this period, and a member of Sussex faculty will make visits to Cambridge for supervisory meetings in addition to maintaining contact via the internet.

The Master's thesis should address a topic of interest to CyberLife Ltd's current research directions. Sample topic areas are listed below.
A copy of this e-mail, with a bibliography of relevant papers, is available on the world wide web at: http://www.cogs.susx.ac.uk/lab/adapt/msc_schol.html

RESEARCH TOPICS

(1) Group Behaviour
Application of Evolutionary and Adaptive Systems techniques to the genre of games and entertainment software where groups of agents react or interact with other agents, possibly adapting over time. Including cooperative and communicative behavior.

(2) Artificial Biochemistry
CyberLife Ltd have developed techniques for sensory-motor control in autonomous software agents based on neural networks which are affected by diffuse "hormones" in the agent's "biochemistry". Research projects exploring artificial biochemistry (e.g. autocatalytic sets) or diffuse modulation of activity in neural networks, or both, are possible.

(3) Adaptive Architectures for Real-Time Control
In many real-time games and entertainment software systems there is a significant need for control architectures which respond, react, and adapt to the environment (including the playing style of any human users) in real-time. Such architectures could be applied to most game software with real-time interactive elements. Entertainment examples include artificial motor-racing drivers, aeroplane pilots, etc. However, such systems could also possibly be applied to industrial tasks such as traffic-light scheduling for optimal flow-control on roads.

(4) Speculator Agents
CyberLife Ltd have an interest in developing autonomous software agents that profitably speculate on movements in the prices of stocks and shares, or financial derivatives such as warrants, options, and futures. There is particular interest in the development of agents with adaptive risk/reward profiles.

(5) Navigation in Arbitrary Domains
The task of providing software agents with robust, efficient, and plausible navigation mechanisms for negotiating virtual environments of two or more dimensions remains an open research issue.
Research in robot guidance and studies of navigation in animals could be adapted and extended to deliver techniques appropriate for interactive software applications.

(6) Strategic Planning
Research in reactive planning has been applied to simple video game environments. Further work could be undertaken to extend such techniques to provide advanced techniques for adaptively generating strategies and tactics in a variety of entertainment software applications. The techniques would not necessarily apply to in-game player agents; rather, to the provision of an overall intelligence in strategy games.

(7) Procedural Learning Mechanisms for Motor Control
The development of techniques for adaptive motor control in software agents which interact with a simulated physical environment, e.g. learning to walk. Possibly adapting existing research on walking robots.

(8) Genetic Encoding and Morphogenesis
Using genetic algorithms to develop artificial neural networks requires that the multi-dimensional network architecture is encoded as a linear string of characters (the "genome"). The process which maps from a genome to a network architecture is often referred to as "morphogenesis". Recently, a number of researchers have addressed the problem of developing encodings and associated morphogenesis techniques which are robust with respect to genetic operators such as mutation and crossover, and expressive (e.g. allowing for repeated structures and modular designs).

CONTACTS
--------

Application forms for entry to the EASy MSc in October 1996 are available from:

Postgraduate Admissions Office
Sussex House
University of Sussex
Brighton BN1 9RH
England, U.K.
Tel: +44 (0)1273 678412
Email: PG.Admissions at admin.susx.ac.uk

Applicants for the CyberLife Scholarships should register their interest in writing (letter or email) to:

Linda Thompson (COGS Graduate Admissions Secretary)
Cognitive and Computing Sciences
University of Sussex
Brighton BN1 9QH
England, U.K.
Tel: +44 (0)1273 678754
Fax: +44 (0)1273 671320
E-mail: lindat at cogs.susx.ac.uk

CyberLife is a trademark of CyberLife Ltd, Quern House, Mill Court, Great Shelford, Cambridge, UK.

From bogus@does.not.exist.com Mon Aug 5 08:23:47 1996
From: bogus@does.not.exist.com ()
Date: Mon, 5 Aug 1996 13:23:47 +0100
Subject: EMMCVPR'97 -- Final Call for Papers
Message-ID: <9608051323.ZM18689@minster.york.ac.uk>

I apologize if you receive multiple copies of this. Please note the following:
1) Deadline for submission of papers moved to SEPTEMBER 23, 1996.
2) Selection of papers to be published in a SPECIAL ISSUE of the journal PATTERN RECOGNITION.

_____________________________________________________________________________

FINAL CALL FOR PAPERS

International Workshop on
ENERGY MINIMIZATION METHODS IN COMPUTER VISION AND PATTERN RECOGNITION
Venice, Italy, May 21-23, 1997

Energy minimization methods represent a fundamental methodology in computer vision and pattern recognition, with roots in such diverse disciplines as Physics, Psychology, and Statistics. Recent manifestations of the idea include Markov random fields, relaxation labeling, and various types of neural networks. These techniques are finding application in areas such as early vision, graph matching, motion analysis, and visual reconstruction. The aim of this workshop is to consolidate research efforts in this area, and to provide a discussion forum for researchers and practitioners interested in this important yet diverse subject. The scientific program of the workshop will include the presentation of invited talks and contributed research papers. The workshop is sponsored by the International Association for Pattern Recognition (IAPR) and organized by the Department of Applied Mathematics and Computer Science of the University of Venice "Ca' Foscari."
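For readers new to the energy-minimization framework the workshop is built around, a minimal sketch may help: iterated conditional modes (ICM) performing greedy coordinate descent on a simple Markov-random-field-style energy for binary image denoising. This example is illustrative only and is not drawn from the announcement; the function and parameter names (`icm_denoise`, `lam`, `beta`) are invented.

```python
import numpy as np

def icm_denoise(noisy, lam=2.0, beta=1.0, sweeps=5):
    """Restore a binary image by iterated conditional modes (ICM):
    greedy coordinate descent on the energy
        E(x) = lam * #{i : x_i != y_i} + beta * #{neighbors (i,j) : x_i != x_j}
    where y is the observed (noisy) labeling."""
    x = noisy.copy()
    h, w = x.shape
    for _ in range(sweeps):
        for i in range(h):
            for j in range(w):
                best_label, best_e = x[i, j], np.inf
                for label in (0, 1):
                    # Data term: penalty for disagreeing with the observation.
                    e = lam * (label != noisy[i, j])
                    # Smoothness term: penalty per disagreeing 4-neighbor.
                    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < h and 0 <= nj < w:
                            e += beta * (label != x[ni, nj])
                    if e < best_e:
                        best_label, best_e = label, e
                x[i, j] = best_label  # each update can only lower E(x)
    return x

# Tiny demo: a 3x3 block of ones with a single pixel flipped by "noise".
clean = np.zeros((5, 5), dtype=int)
clean[1:4, 1:4] = 1
noisy = clean.copy()
noisy[2, 2] = 0
restored = icm_denoise(noisy)
```

ICM only finds a local minimum of the energy; the stochastic and relaxation methods in the workshop's topic list address exactly this limitation.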
Topics

Papers covering (but not limited to) the following topics are solicited:

Theory: (e.g., Bayesian contextual methods, biology-inspired methods, discrete optimization, information theory and statistics, learning and parameter estimation, Markov random fields, neural networks, relaxation processes, statistical mechanics approaches, stochastic methods, variational methods)

Methodology: (e.g., deformable models, early vision, matching, motion, object recognition, shape, stereo, texture, visual organization)

Applications: (e.g., character and text recognition, face processing, handwriting, medical imaging, remote sensing)

Program co-chairs

Marcello Pelillo, University of Venice, Italy
Edwin R. Hancock, University of York, UK

Program committee

Davi Geiger, New York University, USA
Anil K. Jain, Michigan State University, USA
Josef Kittler, University of Surrey, UK
Stan Z. Li, Nanyang Technological University, Singapore
Jean-Michel Morel, Universite' Paris Dauphine, France
Maria Petrou, University of Surrey, UK
Anand Rangarajan, Yale University, USA
Sergio Solimini, Polytechnic of Bari, Italy
Alan L. Yuille, Harvard University, USA
Josiane Zerubia, INRIA, France
Steven W. Zucker, McGill University, Canada

Invited speakers

Anil K. Jain, Michigan State University, USA
Josef Kittler, University of Surrey, UK
Alan L. Yuille, Harvard University, USA
Steven W. Zucker, McGill University, Canada

Venue

The workshop will be held at the University of Venice "Ca' Foscari." The lecture theater will be in the historic center of Venice, and accommodation will be provided in nearby hotels.
Submission procedure

Prospective authors should submit four copies of their contribution(s) by September 23, 1996 to:

Marcello Pelillo (EMMCVPR'97)
Dipartimento di Matematica Applicata e Informatica
Universita' "Ca' Foscari" di Venezia
Via Torino 155, 30173 Venezia Mestre, Italy
E-mail: pelillo at dsi.unive.it

The manuscripts submitted should be no longer than 15 pages, and the cover page should contain: title, author's name, affiliation and address, e-mail address, fax and telephone number, and an abstract no longer than 200 words. In case of joint authorship, the first name will be used for correspondence unless otherwise requested. All manuscripts will be reviewed by at least two members of the program committee. Accepted papers will appear in the proceedings, which are expected to be published in the series Lecture Notes in Computer Science by Springer-Verlag and will be distributed to all participants at the workshop. In order to get a high-quality book with a uniform and professional appearance, prospective authors are strongly encouraged to use the LaTeX style file available at the WWW site indicated below.

Important dates

Extended paper submission deadline: September 23, 1996
Notification of acceptance: December 1996
Camera-ready paper due: February 1997

Homepage

Information on the workshop is maintained at http://Dcpu1.cs.york.ac.uk:6666/~adjc/EMMCVPR97.html
This page will be updated continuously and will include information on accepted papers and the final program.

Special issue of Pattern Recognition

Selected papers from the workshop will be published in a special edition of the journal Pattern Recognition scheduled for July 1998. Prospective authors will be asked to submit enhanced papers for review at the workshop. First reviews will be returned by October 1997. The final decisions concerning inclusion will be made in January 1998.
Concomitant events

During the week following EMMCVPR'97, participants will have the opportunity to attend the 3rd International Workshop on Visual Form (IWVF3) to be held in Capri, May 28-30. For additional information please contact any of the co-chairmen Carlo Arcelli (car at imagm.na.cnr.it), Luigi Cordella (cordel at nadis.dis.unina.it), and Gabriella Sanniti di Baja (gsdb at imagm.na.cnr.it), or see http://amalfi.dis.unina.it/IWF3/iwvf3cfp.html

From listerrj at helios.aston.ac.uk Wed Aug 7 06:56:32 1996
From: listerrj at helios.aston.ac.uk (Richard Lister)
Date: Wed, 07 Aug 1996 11:56:32 +0100
Subject: Course: Neural Computing for Industrial Applications
Message-ID: <18020.199608071056@sun.aston.ac.uk>

----------------------------------------------------------------------

Neural Computing for Industrial Applications: An Intensive Hands-on Course
23-25 September 1996
Aston University, Birmingham

The Neural Computing Research Group at Aston University will be running the course "Neural Computing for Industrial Applications - An Intensive Hands-on Course" at Aston University between 23-25 September 1996. The course is aimed at applications developers as well as technical managers in industry and commerce. It will also be of direct relevance to practitioners in universities and research laboratories.

The course will focus on a principled, rather than ad-hoc, approach to neural networks, providing the main tools to enable their successful application in real-world problems. It combines lectures with supervised laboratory sessions and aims to provide participants with a coherent picture of the foundations of neural computing, as well as a deep understanding of many practical issues arising in their application to commercial tasks.
The lectures will take the student, step by step, through the process of applying neural networks to commercial tasks, including: data preparation, choice of an adequate configuration and cost function, training, and methods of performance improvement and validation. The various development steps will be demonstrated on representative commercial regression and classification tasks, emphasising their relevance to real-world problems. Lectures will cover both basic and advanced material, ranging from neural network architectures and training methods to advanced Bayesian methods and stochastic Monte Carlo techniques for tackling the difficulties of missing data, definition of error bars and model selection. Small-group laboratory sessions will follow the lectures, providing a demonstration of the methods and techniques taught in class and first-hand experience of their advantages and drawbacks for commercial applications. In addition, the course will provide hands-on experience in developing effective solutions to complex and challenging problems using the Netlab software developed at Aston.

Who should attend
-----------------
This course is aimed at applications developers as well as technical managers in industry and commerce. It will also be of direct relevance to practitioners in universities and research laboratories.

Benefits
--------
The course will provide hands-on experience in developing effective solutions to complex and challenging problems using the Netlab software developed at Aston. Participants will receive a complimentary copy of the Netlab software together with the Matlab simulation environment. They will also receive lecture notes, laboratory manuals, and a complimentary copy of the new textbook "Neural Networks for Pattern Recognition".

Laboratory sessions
-------------------
The course includes four practical sessions designed to complement and reinforce the material presented during the lectures. 
These will make use of commercial and industrial data sets and will be based on the Netlab neural network simulation system running on modern Pentium PCs under Microsoft Windows.

Course summary
--------------
The course begins with registration and a course dinner on Sunday 22 September and ends at 5.00pm on Wednesday 25 September.

Day 1
-----
The first day will include a general introduction to Neural Computing from a statistical viewpoint, an introduction to the example data sets used as case studies, data processing, the methodology of developing an application, multi-layer perceptrons and training algorithms. Some of the issues to be examined are:

* Data preparation: conventional techniques, feature extraction, dealing with missing data, linear regression, PCA and visualisation.
* The multi-layer perceptron: basic architecture, using the MLP for regression problems.
* Training algorithms: on-line and batch learning, gradient descent and conjugate gradient techniques, line search and other advanced techniques.

A laboratory session demonstrating and practising the data processing techniques introduced in the lectures will also be held in the afternoon, making use of the example data sets introduced earlier.

Day 2
-----
After introducing the architecture and training algorithms for Radial Basis Function networks, we will examine methods for monitoring and controlling network performance, including various validation and regularisation techniques. The main topics include:

* Radial Basis Function networks: basic architecture, relation to conventional methods and training paradigms.
* Generalisation: training, validation and test sets; how to monitor training success.
* Model complexity and regularisation: the Bayesian approach for controlling model complexity, incorporating prior knowledge, error bars, the evidence procedure and Monte Carlo methods. 
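The Day 1-2 recipe (an MLP trained by batch gradient descent, with model complexity controlled by a regularisation term) is taught in the course using Netlab under Matlab, which is not reproduced here. Purely as an illustrative sketch of the same idea, a tiny NumPy version on made-up data (all parameter values are arbitrary choices for the example, not course material) might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task standing in for the course's commercial data sets.
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X) + 0.1 * rng.standard_normal(X.shape)

# One-hidden-layer MLP: tanh hidden units, linear output for regression.
H = 10
W1 = 0.5 * rng.standard_normal((1, H)); b1 = np.zeros(H)
W2 = 0.5 * rng.standard_normal((H, 1)); b2 = np.zeros(1)

alpha = 1e-3  # weight-decay (regularisation) coefficient
lr = 0.1      # batch gradient-descent step size

def forward(X):
    z = np.tanh(X @ W1 + b1)   # hidden-unit activations
    return z, z @ W2 + b2      # linear output

n = len(X)
for _ in range(5000):
    z, pred = forward(X)
    err = (pred - y) / n              # gradient of 0.5 * mean squared error w.r.t. output
    # Backpropagate, adding the weight-decay gradient alpha * W to each weight matrix.
    gW2 = z.T @ err + alpha * W2
    gb2 = err.sum(axis=0)
    dz = (err @ W2.T) * (1.0 - z * z)  # tanh derivative
    gW1 = X.T @ dz + alpha * W1
    gb1 = dz.sum(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(X)
mse = float(np.mean((pred - y) ** 2))
print("training MSE:", mse)
```

In the course itself this workflow (plus validation sets, conjugate gradients and the Bayesian evidence procedure) is handled by Netlab rather than hand-written loops.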
Following the lectures, two laboratory sessions will be held during the second day, demonstrating and practising the training of MLP and RBF networks as well as regularisation and validation methods.

Day 3
-----
The last day of the course will concentrate on extending the neural network framework presented for regression tasks to accommodate classification problems. In addition, we will discuss practical issues related to using neural networks for commercial problems.

* Classification problems: network predictions as probabilities and the Bayesian approach, choice of error functions and activation functions, minimising risk, the reject option and imbalanced priors.
* Practicalities and diagnostics: measures of performance assessment, error bars and input data distribution, non-stationarity.

One laboratory session will be held on the last day, demonstrating the use of MLP and RBF networks in classification tasks and providing practice with practical diagnostic methods.

Course tutors
-------------
Professor Christopher Bishop was formerly Head of the Applied Neurocomputing Centre at AEA Technology and has developed many successful applications of neural networks in a wide range of domains. He is Chairman of the Neural Computing Applications Forum.

Professor David Lowe was previously Leader of the Pattern Processing Group at DRA Malvern, and is currently applying neural networks to problems in electricity load demand forecasting, portfolio optimisation, chemical vapour analysis and the control of internal combustion engines.

Dr Ian Nabney worked on applications of neural computing for Logica and is currently Programme Chair of the Neural Computing Applications Forum. He has worked on applications of neural networks to jet engine diagnostics, analysis of satellite radar signals, and control of distillation columns. 
Dr Richard Rohwer has research interests which include the Bayesian and differential geometry views of machine learning, ultra-fast memory-based algorithms, and practical methods for specification of prior knowledge. He works with applications ranging from speech processing to pipeline inspection.

Dr David Saad works on the foundations of neural computing from a statistical mechanics perspective with emphasis on learning and model selection, and has developed applications to problems in bar code location and identification.

Dr Christopher Williams has developed novel approaches to pattern recognition which extend conventional neural network methods, and also has strong interests in applications to machine vision.

Enrolment Details
-----------------
Please send your booking form to the address below to reserve a place on the course. Alternatively, you can reserve a place on the course by accessing the enrolment form on our World Wide Web page at http://www.ncrg.aston.ac.uk/. An invoice will be issued upon receipt of this form, and payment should be received by Friday 6th September 1996. Since the course involves laboratory classes, places are strictly limited, so early booking is strongly advised. Please complete one form per delegate. A receipt will be issued upon payment and will be sent together with an acknowledgement. Preparatory course notes will be sent four weeks before the course date.

Cancellations
-------------
All cancellations must be received in writing. Cancellations made before Friday 6th September 1996 will be subject to an administration fee of UKP50, and cancellations made after this date will be subject to the full amount of the course fee. Should a delegate become unable to attend, a substitution may be made, which must be confirmed in writing.

What Payment Includes
---------------------
* Three days' attendance on the course, including laboratory sessions and lectures.
* Free copy of Aston's Netlab neural network software (with documentation). 
* Full set of course notes and laboratory manuals.
* Free copy of the textbook "Neural Networks for Pattern Recognition" by Professor Christopher M Bishop.
* Attendance at the Course Dinner on Sunday 22nd September 1996.
* Buffet lunches and refreshments on 23, 24, 25 September.
* Three nights' Bed and Breakfast accommodation at the Aston Business School (delegates are free to make their own arrangements, and a reduced course fee is available).

Evening Meals
-------------
Evening meals on 23rd and 24th September can be taken at the Aston Business School at a cost of UKP15 each. Please indicate on the booking form if you would like either of these meals, and include payment with your registration fee.

Software
--------
Participants will receive a complimentary copy of the Netlab software and will be able to purchase Matlab (which is required to run Netlab) at a special discounted rate. Matlab software is available on PC/MS-Windows, Macintosh and UNIX platforms. Please indicate on the booking form if you are interested in receiving further details about the software.

Please complete and return this form to: Miss H E Sondermann, Neural Computing for Industrial Applications, Neural Computing Research Group, Aston University, Birmingham B4 7ET

----------------------------------------------------------------------

From async at cs.tu-berlin.de Wed Aug 7 07:30:14 1996 From: async at cs.tu-berlin.de (Stefan M. Rueger) Date: Wed, 7 Aug 1996 13:30:14 +0200 (MET DST) Subject: TR on Decimatable Boltzmann Machines vs. Gibbs Sampling Message-ID: <199608071130.NAA24872@deneb.cs.tu-berlin.de>

Technical Report Available

DECIMATABLE BOLTZMANN MACHINES VS. GIBBS SAMPLING

Stefan M. Rueger, Anton Weinberger, and Sebastian Wittchen
Fachbereich Informatik
Technische Universitaet Berlin
July 1996

Exact Boltzmann learning can be done in certain restricted networks by the technique of decimation. We have enlarged the set of decimatable Boltzmann machines by introducing a new decimation rule. 
We have compared solutions of a probability density estimation problem with decimatable Boltzmann machines to the results obtained by Gibbs sampling in unrestricted (non-decimatable) Boltzmann machines. This technical report is available in compressed Postscript by the following URLs: http://www.cs.tu-berlin.de/~async/www-pub/TR96-29.ps.gz http://www.cs.tu-berlin.de/~async/www-pub/TR96-29-ds.ps.gz (2-on-1-page vers.) Bibtex entry: @TECHREPORT{TU-Berlin-Informatik-96-29, AUTHOR = {Stefan M. R\"uger and Anton Weinberger and Sebastian Wittchen}, TITLE = {Decimatable {B}oltzmann Machines vs. {G}ibbs Sampling}, INSTITUTION={Fachbereich Informatik der Technischen Universit\"at Berlin}, YEAR = {1996}, NUMBER = {96-29} } ------------------------------------------------------------------------------ Stefan M. Rueger http://www.cs.tu-berlin.de/~async Sekr. FR 5-9, Franklinstr. 28/29 async at cs.tu-berlin.de Technische Universitaet Berlin, 10 587 Berlin (+49)30/31422662 ------------------------------------------------------------------------------ From ken at phy.ucsf.edu Wed Aug 7 20:05:23 1996 From: ken at phy.ucsf.edu (Ken Miller) Date: Wed, 7 Aug 1996 17:05:23 -0700 Subject: Paper available: Integration and ISI variability in Cortical Neurons Message-ID: <9608080005.AA20710@coltrane.ucsf.edu> FTP-host: ftp.keck.ucsf.edu FTP-filename: /pub/ken/integration.ps.gz 14 pages The following paper, to appear in Neural Computation, is now available by ftp. The ftp site is ftp://ftp.keck.ucsf.edu/pub/ken/integration.ps.gz This and other papers can also be obtained from our http site: http://www.keck.ucsf.edu/~ken PHYSIOLOGICAL GAIN LEADS TO HIGH ISI VARIABILITY IN A SIMPLE MODEL OF A CORTICAL REGULAR SPIKING CELL Todd W. Troyer and Kenneth D. 
Miller todd at phy.ucsf.edu, ken at phy.ucsf.edu Keck Center for Integrative Neuroscience Sloan Center for Theoretical Neurobiology Departments of Physiology and Otolaryngology University of California, San Francisco San Francisco, CA 94143 ABSTRACT: To understand the interspike interval (ISI) variability displayed by visual cortical neurons (Softky and Koch, 1993), it is critical to examine the dynamics of their neuronal integration as well as the variability in their synaptic input current. Most previous models have focused on the latter factor. We match a simple integrate-and-fire model to the experimentally measured integrative properties of cortical regular spiking cells (McCormick et al., 1985). After setting RC parameters, the post-spike voltage reset is set to match experimental measurements of neuronal gain (obtained from {\em in vitro} plots of firing frequency vs.\ injected current). Examination of the resulting model leads to an intuitive picture of neuronal integration that unifies the seemingly contradictory ``$1/\sqrt{N}$'' and ``random walk'' pictures that have previously been proposed. When ISI's are dominated by post-spike recovery, $1/\sqrt{N}$ arguments hold and spiking is regular; if recovery is negligible so that spiking is triggered by input variance around a steady state, spiking is Poisson. In integrate-and-fire neurons matched to cortical cell physiology, steady state behavior is predominant and ISI's are highly variable at all physiological firing rates and for a wide range of inhibitory and excitatory inputs. From simon.schultz at psy.ox.ac.uk Thu Aug 8 08:30:36 1996 From: simon.schultz at psy.ox.ac.uk (Simon Schultz) Date: Thu, 8 Aug 1996 13:30:36 +0100 (BST) Subject: ITB2 Call For Participation Message-ID: <199608081230.NAA15473@axp2.cns.ox.ac.uk> Call for Participation: Information Theory and the Brain II To be held on the 20-21st of September, Headland Hotel, Newquay, Cornwall, England. 
http://www.mrc-bbc.ox.ac.uk/~itb2/conference.html

This is the sequel to the conference held in Stirling, Scotland last year. This year the conference will be held in the Cornish town of Newquay. Newquay is one of the best areas for surfing in Europe, and the surrounding countryside is amongst the most beautiful in Britain. The conference will be held in the spectacular Headland Hotel right next to the famous Fistral Beach, and in mid-September the water is at its warmest, the surf is starting to get larger, and the summer holiday crowds have headed home. It is hoped that the pleasant surroundings will help to maintain an informal atmosphere.

Organising Committee: Roland Baddeley (Chair), Nick Chater, Peter Foldiak, Peter Hancock, Bruno Olshausen, Dan Ruderman, Simon Schultz, Guy Wallis

21 papers have been accepted for presentation at the conference. While places are limited, there is still room for a number of non-presenting attendees. Anyone who would like to attend the conference but who has not submitted a paper is now encouraged to contact the organisers, either by email at itb2 at psy.ox.ac.uk, or by surface mail to: ITB2, c/o Roland Baddeley, Dept of Psychology, University of Oxford, Oxford, England OX1 3UD

Registration will be 40 pounds (about $60 U.S.), with participants expected to find their own accommodation. This varies in price from as low as 5 pounds (for the most basic) upwards. Accommodation in the summer can be hard to find, but by the 20th most summer holidays have finished and the situation is much better. More information on accommodation, and on travel to Newquay, can be found via the above-mentioned web page.

A tentative conference program follows. More details of the conference, and abstracts of the papers, are available via the web page.

DAY ONE. Friday 20th September. 
Session 1: Applied physiology
  Adaptive search for most effective stimuli -- Maneesh Sahani
  Dynamics of receptive fields in the visual system: plasticity of intra-cortical connections -- G. Mato and N. Parga

Session 2: Psychological models
  Modelling the vowel space: Relating a statistical model to results obtained in experimental phonetics -- Matthew Aylett
  Who needs neural networks when we've got information theory? (or "The emperor's new neural network model") -- John A. Bullinaria
  Optimal resource allocation for novelty detection - the principle and some experimental MEG support related to a human auditory memory -- Janne Sinkkonen

Session 3: Formal analysis of the hippocampus
  A quantitative model of information processing in CA1 -- Carlo Fulvi Mari, Stefano Panzeri, Edmund Rolls and Alessandro Treves
  Information-theoretic analysis of hippocampal subfield CA1: Schaffer-collateral connectivity -- Simon Schultz, Stefano Panzeri, Edmund Rolls and Alessandro Treves

Session 4: Information, energy and the world
  The metabolic cost of sensory information -- Simon Laughlin, David O'Carroll and John Anderson
  Metabolically optimal rate codes and the time course of visual processing -- Roland Baddeley

DAY TWO. Saturday 21st September.

Session 5: Network models I
  Information Density and Cortical Magnification Factors -- M D Plumbley
  Time to learn about objects -- Guy Wallis
  Stochastic dynamics of a neuron with quantal synaptic noise -- Paul C. Bressloff and Peter Roper

Session 6: Sparse representations
  Experiments with Low Entropy Neural Networks -- George Harpur and Richard Prager
  Utilizing Sparse Coding and Metrical Organization of Features for Artificial Object Recognition -- Norbert Kr\"uger, Gabi Peters, Michael P\"otzsch
  The role of higher-order image statistics in human visual coding -- Mitchell Thomson

Session 7: Network models II
  Quantifying the level of distribution present in a feed-forward neural network -- Antony Browne
  The Emergence of Dominance Stripes in a Network of Firing Neurons -- S P Luttrell

Session 8: Vision
  The taming of natural intensities by the early visual system of the blowfly -- J.H. van Hateren
  Optimizing photoreceptor arrays in apposition compound eyes -- Daniel L. Ruderman and Simon B. Laughlin
  Image coding in primary visual cortex using long-range horizontal collaterals -- Darragh Smyth and W.A. Phillips

Session 9: Posters
  Predicting natural intensities viewed by photoreceptors -- A. van der Schaaf and J.H. van Hateren

From inns_www at cns.bu.edu Thu Aug 8 10:18:40 1996 From: inns_www at cns.bu.edu (INNS Web Responses) Date: Thu, 8 Aug 1996 10:18:40 -0400 Subject: WCNN'96 September 15-18 Message-ID: <199608081418.KAA01210@retina.bu.edu>

----------------------------------------------------------------------
NEW INFORMATION! UPDATED AUGUST 5, 1996

WORLD CONGRESS ON NEURAL NETWORKS
September 15-18, 1996
TOWN & COUNTRY HOTEL, SAN DIEGO, CA, U.S.A.
----------------------------------------------------------------------------

Highlights & Attractions:

* Guest speaker: Oliver Sacks, M.D., noted neurologist and author of Awakenings and The Man Who Mistook His Wife for a Hat, will give a Plenary Lecture in the Session on Consciousness and Intentionality, titled "Some Neurological Insights into the Nature of Consciousness," on Tuesday, September 17, 1996. 
* Plenary talks by Zeki, Van Essen, Hecht-Nielsen, and Merkle * 20 sessions, 5 special sessions, Special Interest Group Meetings * 9 Short Courses ---------------------------------------------------------------------------- HOW TO REGISTER FOR WCNN'96: WCNN'96 875 King's Highway, Suite 200 Woodbury, NJ 08096-3172 U.S.A. Fax: 609-853-0411 Forms available from the WEB site: http://cns-web.bu.edu/inns/wcnn ---------------------------------------------------------------------------- The following additional information is available on the WCNN'96 web site, http://cns-web.bu.edu/inns/wcnn. * Final Schedule * Short Courses Information and Schedule * A list of ALL INVITED AND SUBMITTED PRESENTATIONS for Monday, September 16; Tuesday, September 17; Wednesday, September 18; and all posters * Registration Information and Forms * Hotel Accommodations Information and Form * Airline Information * Information about San Diego * List of Contributors ---------------------------------------------------------------------------- FOR ADDITIONAL INFORMATION: Telephone: 609-845-5010 E-Mail: meetings at wcnn.ccmail.compuserve.com  From devries at sarnoff.com Thu Aug 8 15:05:49 1996 From: devries at sarnoff.com (Aalbert De Vries x2456) Date: Thu, 8 Aug 96 15:05:49 EDT Subject: FIRST Call for Papers: NNSP*97 Message-ID: <9608081905.AA19168@peanut.sarnoff.com> 1997 IEEE Workshop on Neural Networks for Signal Processing 24-26 September 1997 Amelia Island Plantation Amelia Island, Florida FIRST ANNOUNCEMENT AND CALL FOR PAPERS Thanks to the sponsorship of the IEEE Signal Processing Society and the co-sponsorship of the IEEE Neural Network Council, we are proud to announce the seventh of a series of IEEE Workshops on Neural Networks for Signal Processing. 
Papers are solicited for, but not limited to, the following topics:

* Paradigms: artificial neural networks, Markov models, fuzzy logic, inference nets, evolutionary computation, nonlinear signal processing, and wavelets
* Application areas: speech processing, image processing, OCR, robotics, adaptive filtering, communications, sensors, system identification, issues related to RWC, and other general signal processing and pattern recognition
* Theories: generalization, design algorithms, optimization, parameter estimation, and network architectures
* Implementations: parallel and distributed implementation, hardware design, and other general implementation technologies

Instructions for submitting papers

Prospective authors are invited to submit 5 copies of extended summaries of no more than 6 pages. The top of the first page of the summary should include a title, authors' names, affiliations, address, telephone and fax numbers, and email address, if any. Camera-ready full papers of accepted proposals will be published in a hard-bound volume by IEEE and distributed at the workshop. Submissions should be sent to:

Dr. Jose C. Principe
IEEE NNSP'97
444 CSE Bldg #42
P.O. Box 116130
University of Florida
Gainesville, FL 32611

Important Dates:

* Submission of extended summary: January 27, 1997
* Notification of acceptance: March 31, 1997
* Submission of photo-ready accepted paper: April 26, 1997
* Advanced registration: before July 1, 1997

Further Information

Local Organizer: Ms. Sharon Bosarge Telephone: 352-392-2585 Fax: 352-392-0044 e-mail: sharon at ee1.ee.ufl.edu

World Wide Web: http://www.cnel.ufl.edu/nnsp97/

Organization

General Chairs: Lee Giles (giles at research.nj.nec.com), NEC Research; Nelson Morgan (morgan at icsi.berkeley.edu), UC Berkeley

Proceeding Chair: Elizabeth J. Wilson (bwilson at ed.ray.com), Raytheon Co. 
Publicity Chair: Bert DeVries (bdevries at sarnoff.com), David Sarnoff Research Center

Program Chair: Jose Principe (principe at synapse.ee.ufl.edu), University of Florida

Program Committee: Les ATLAS, Andrew BACK, A. CONSTANTINIDES, Federico GIROSI, Lars Kai HANSEN, Allen GORIN, Yu-Hen HU, Jenq-Neng HWANG, Biing-Hwang JUANG, Shigeru KATAGIRI, Gary KUHN, Sun-Yuan KUNG, Richard LIPPMANN, John MAKHOUL, Elias MANOLAKOS, Erkki OJA, Tomaso POGGIO, Mahesan NIRANJAN, Volker TRESP, John SORENSEN, Takao WATANABE, Raymond WATROUS, Andreas WEIGEND, Christian WELLEKENS

About Amelia Island

Amelia Island is in extreme northeast Florida, across the St. Mary's River. The island is just 29 miles from Jacksonville International Airport, which is served by all major airlines.

About Amelia Island Plantation

Amelia Island Plantation is a 1,250-acre resort/paradise that offers something for every traveler. The Plantation offers 33,000 square feet of workable meeting space and a staff dedicated to providing an efficient, yet relaxed atmosphere. The many amenities of the Plantation include 45 holes of championship golf, 23 Har-Tru tennis courts, modern fitness facilities, an award-winning children's program, more than 7 miles of flora-filled bike and jogging trails, 21 swimming pools, diverse accommodations, exquisite dining opportunities, and of course, miles of glistening Atlantic beach front.

From jbower at bbb.caltech.edu Sun Aug 11 16:16:15 1996 From: jbower at bbb.caltech.edu (James M. Bower) Date: Sun, 11 Aug 1996 12:16:15 -0800 Subject: J. Comput. Neurosci. Message-ID:

Journal of Computational Neuroscience Volume 3, Number 2, July 1996

Paul Bush and Terrence Sejnowski, Inhibition Synchronizes Sparsely Connected Cortical Neurons Within and Between Columns in Realistic Network Models 91

Gary Strangman, Searching for Cell Assemblies: How Many Electrodes Do I Need? 111

Ursula Fuentes, Raphael Ritz, Wulfram Gerstner, and J. 
Leo VanHemmen, Vertical Signal Flow and Oscillations in a Three-Layer Model of the Cortex 125 Joshua W. Fost and Gregory A. Clark, Modeling Hermissenda: I. Differential Contributions of IA and IC to Type-B Cell Plasticity 137 Joshua W. Fost and Gregory A. Clark, Modeling Hermissenda: II. Effects of Variations in Type-B Cell Excitability, Synaptic Strength, and Network Architecture 155 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ Journal of Computational Neuroscience Volume 3, Number 3, September 1996 A.E. Sauer, R.B. Driesang, A. Buschges, and U. Bassler, Distributed Processing on the Basis of Parallel and Antagonistic Pathways Simulation of the Femur-Tibia Control System in the Stick Insect 179 R. J. Butera, Jr., J.W. Clark, Jr., and J.H. Byrne, Dissection and Reduction of a Modeled Bursting Neuron 199 Christiane Linster and Remi Gervais, Investigation of the Role of Interneurons and Their Modulation by Centrifugal Fibers in a Neural Model of the Olfactory Bulb 225 David J. Pinto, Joshua C. Brumberg, Daniel J. Simons, and G. Bard Ermentrout, A Quantitative Population Model of Whisker Barrels: Re-Examining the Wilson-Cowan Equations 247 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ Information about the Journal of Computational Neuroscience is available from: http://www.bbb.caltech.edu/JCNS *************************************** James M. 
Bower Division of Biology Mail code: 216-76 Caltech Pasadena, CA 91125 (818) 395-6817 (818) 795-2088 FAX NCSA Mosaic addresses for: laboratory http://www.bbb.caltech.edu/bowerlab GENESIS: http://www.bbb.caltech.edu/GENESIS science education reform http://www.caltech.edu/~capsi From geoff at salk.edu Sun Aug 11 20:25:53 1996 From: geoff at salk.edu (geoff@salk.edu) Date: Sun, 11 Aug 1996 17:25:53 -0700 (PDT) Subject: Postdoc position Message-ID: <199608120025.RAA00613@gauss.salk.edu> GEORGETOWN INSTITUTE FOR COGNITIVE AND COMPUTATIONAL SCIENCES Georgetown University, Washington DC Postdoctoral position in Theoretical / Computational Neuroscience Georgetown University has recently established an interdisciplinary research institute consisting of 16 full-time faculty. The major focus areas are neuroplasticity in development, higher auditory processing and language, and injury and aging. Experimental, computational and brain imaging approaches are all well represented. A postdoctoral position in theoretical / computational neuroscience is available from October 1996 in the lab of Dr Geoffrey J. Goodhill. The lab focuses on neural development and self-organization; particularly the development and plasticity of cortical mappings, areal specification of the cortex, and mechanisms of axon guidance (for more details see http://www.cnl.salk.edu/~geoff). The ideal candidate will have an initial training in a quantitative discipline plus knowledge and experience of neuroscience. Please send a CV, summary of relevant research experience and at least 2 letters of recommendation by post or email to: Dr Geoffrey J. Goodhill The Salk Institute 10010 North Torrey Pines Road La Jolla, CA 92037 Email: geoff at salk.edu From jhf at playfair.Stanford.EDU Mon Aug 12 16:00:26 1996 From: jhf at playfair.Stanford.EDU (Jerome H. Friedman) Date: Mon, 12 Aug 1996 13:00:26 -0700 Subject: Technical Report Available. 
Message-ID: <199608122000.NAA01577@tukey.Stanford.EDU>

*** Technical Report Available ***

LOCAL LEARNING BASED ON RECURSIVE COVERING

Jerome H. Friedman
Stanford University
(jhf at playfair.stanford.edu)

ABSTRACT

Local learning methods approximate a global relationship between an output (response) variable and a set of input (predictor) variables by establishing a set of "local" regions that collectively cover the input space, and modeling a different (usually simple) input-output relationship in each one. Predictions are made by using the model associated with the particular region in which the prediction point is most centered. Two widely applied local learning procedures are K-nearest-neighbor methods and decision tree induction algorithms (CART, C4.5). The former induce a large number of highly overlapping regions based only on the distribution of training input values. By contrast, the latter (recursively) partition the input space into a relatively small number of highly customized (disjoint) regions using the training output values as well. Recursive covering unifies these two approaches in an attempt to combine the strengths of both. A large number of highly customized overlapping regions are produced based on both the training input and output values. Moreover, the data structure representing this cover permits rapid search for the prediction region given a set of (future) input values.

Available by ftp from: "ftp://playfair.stanford.edu/pub/friedman/dart.ps.Z"

Note: this postscript does not view properly in some versions of ghostview. It seems to print OK on nearly all postscript printers. 
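The recursive covering procedure itself is described in the report; for orientation only, the simpler of the two local learning methods it unifies, K-nearest-neighbor prediction (many overlapping regions, a trivial constant model in each), can be sketched as follows (illustrative Python on made-up data, not the code from the report):

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=5):
    """K-nearest-neighbor regression: the 'local region' for the query
    point x is the neighborhood of its k nearest training inputs, and
    the local model is simply the average of their output values."""
    dists = np.linalg.norm(X_train - x, axis=1)  # distance to every training point
    nearest = np.argsort(dists)[:k]              # indices of the k closest
    return float(y_train[nearest].mean())        # trivial local model: the mean

# Toy 1-D example: learn y = x^2 from scattered samples.
rng = np.random.default_rng(1)
X_train = rng.uniform(-1.0, 1.0, size=(100, 1))
y_train = X_train[:, 0] ** 2

pred = knn_predict(X_train, y_train, np.array([0.5]), k=5)
print("prediction at x = 0.5:", pred)  # close to the true value 0.25
```

A decision tree replaces these overlapping neighborhoods with a small number of disjoint, output-aware regions; recursive covering, per the abstract, produces regions that are both overlapping and output-aware.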
From rsun at cs.ua.edu Tue Aug 13 11:58:11 1996 From: rsun at cs.ua.edu (Ron Sun) Date: Tue, 13 Aug 1996 10:58:11 -0500 Subject: IEEE TNN special issue on hybrid systems Message-ID: <9608131558.AA18781@athos.cs.ua.edu>

Call For Papers

Special issue of IEEE Transactions on Neural Networks on ``Neural Networks and Hybrid Intelligent Models: Foundations, Theory, and Applications''

Guest Editors: C. Lee Giles, Ron Sun, Jacek M. Zurada

Hybrid systems, the use of other intelligence paradigms with neural networks, are becoming more common and useful. In fact, it can be argued that the success of neural networks has come from their ready incorporation of other information processing approaches, including pattern recognition, statistical inference, as well as symbolic processing. Some systems (especially those incorporating symbolic processing) have been known to some segments of the scientific community as high-level connectionist models. Other systems have been referred to as knowledge insertion and extraction. However, for many applications, there exists little (1) theoretical foundation and (2) engineering methodology for effectively developing hybrid approaches. These two aspects are the topic of this special issue. Manuscripts are solicited in neural networks and hybrid models in the following areas:

- Theoretical foundations of hybrid models. Mathematical analysis, theories, critiques, case studies.
- Models incorporating other paradigms such as AI symbolic processing, machine learning, fuzzy systems, genetic algorithms, and other intelligent paradigms within neural networks. Techniques, methodologies, and analyses.
- Methodology of engineering design of hybrid systems.
- Innovative and non-trivial applications of hybrid models (for example, in natural language processing, signal and image processing, pattern recognition, and cognitive modeling).

Papers will undergo the standard review procedure of the IEEE Transactions on Neural Networks. 
The special issue will appear around November 1997. Prospective authors should submit six (6) copies of the completed manuscript, on or before February 28, 1997, to one of the following three guest editors: Dr. C. Lee Giles NEC Research Institute 4 Independence Way Princeton, NJ 08540, USA Phone: 609-951-2642 Fax 609-951-2482 Email: giles at research.nj.nec.com Web: http://www.neci.nj.nec.com/homepages/giles.html Prof. Ron Sun Department of Computer Science The University of Alabama Tuscaloosa, AL 35487 Phone: (205) 348-6363 Fax: (205) 348-0219 Email: rsun at cs.ua.edu Web: http://cs.ua.edu/faculty/sun/sun.html Prof. Jacek M. Zurada Electrical Engineering Department University of Louisville Louisville, KY 40292, USA Phone: (502) 852-6314 Fax: (502) 852-6807 Email: j.zurada at ieee.org Web: under construction From listerrj at helios.aston.ac.uk Tue Aug 13 12:00:31 1996 From: listerrj at helios.aston.ac.uk (Richard Lister) Date: Tue, 13 Aug 1996 17:00:31 +0100 Subject: Lectureships available Message-ID: <6363.199608131601@sun.aston.ac.uk> ---------------------------------------------------------------------- LECTURESHIPS ------------ Aston University, Birmingham, UK * Full details at http://www.ncrg.aston.ac.uk/ * We are seeking highly motivated academic staff to contribute to research in the general areas of neural computing, pattern recognition, time series analysis, image processing, machine vision or a closely related field. Candidates are expected to have excellent academic qualifications, a strong mathematical background and a proven record of research. Two posts are available. The successful candidates will also be expected to make innovative contributions to graduate-level and undergraduate teaching programmes, and ideally also contribute to industrially funded research programmes and industrial courses. 
They will join the Neural Computing Research Group which currently comprises the following members: Professors: Christopher Bishop David Lowe Visiting Professors: Geoffrey Hinton Edward Feigenbaum David Bounds Lecturers: Richard Rohwer Ian Nabney David Saad Chris Williams Postdoctoral Research Fellows: David Barber Paul Goldberg Neep Hazarika Alan McLachlan Mike Tipping Huaiyu Zhu (5 further posts currently being advertised) Personal Assistant: Hanni Sondermann System Administrator: Richard Lister Research Programmer: Andrew Weaver 20 Research Students Conditions of Service --------------------- The appointment will be for an initial period of three years, with the possibility of subsequent renewal or transfer to a continuing appointment. Initial salary will be within the lecturer A and B range 15,154 to 26,430, and exceptionally up to 29,532 (UK pounds; these salary scales are currently under review). How to Apply ------------ If you wish to be considered for one of these positions, please send a full CV and publications list, together with the names and addresses of 4 referees, to: Hanni Sondermann Neural Computing Research Group Aston University Birmingham B4 7ET, U.K. Tel: +44/0 121 333 4631 Fax: +44/0 121 333 4586 e-mail: H.E.Sondermann at aston.ac.uk Applications may be submitted as postscript files using e-mail, or as hard copy by post. Closing date: 30 August 1996. ---------------------------------------------------------------------- From dwang at cis.ohio-state.edu Wed Aug 14 14:59:00 1996 From: dwang at cis.ohio-state.edu (DeLiang Wang) Date: Wed, 14 Aug 1996 14:59:00 -0400 Subject: Neurocomputing Best Paper Award Message-ID: <199608141859.OAA20887@shirt.cis.ohio-state.edu> Presenting the Neurocomputing best paper award, Vols. 
7-9 (1995) To express our thanks for the support of all contributors to the journal, at the conclusion of each volume our Editorial Board selects one representative contribution and grants its author(s) the "Neurocomputing Best Paper Award". Originality, clarity of result presentation, depth, and novelty are some of the properties we are looking for. From 1995 (Vols. 7-9) on, the award has been granted on the basis of all contributions of one year. The winner(s) obtain(s) a corresponding certificate and a publication of their choice from our publisher, Elsevier Science B.V. M. Cannon and J.-J.E. Slotine were granted the Neurocomputing Best Paper Award for their contribution "Space-frequency localized basis function networks for nonlinear systems estimation and control" in Vol. 9(3) (1995), pp. 293-342. Again, thanks for your support. V. David Sanchez A. Editor-in-Chief From otavioc at cogs.susx.ac.uk Thu Aug 15 14:25:30 1996 From: otavioc at cogs.susx.ac.uk (Otavio Augusto Salgado Carpinteiro) Date: Thu, 15 Aug 1996 19:25:30 +0100 (BST) Subject: Thesis available Message-ID: FTP-host: ftp.cogs.susx.ac.uk FTP-filename: /pub/reports/csrp/csrp426.ps.Z The following thesis is available via anonymous ftp. A CONNECTIONIST APPROACH IN MUSIC PERCEPTION Otavio A. S. Carpinteiro email: otavioc at cogs.susx.ac.uk Cognitive Science Research Paper CSRP-426 School of Cognitive & Computing Sciences University of Sussex, Brighton, UK FTP instructions: unix> ftp ftp.cogs.susx.ac.uk [ or ftp 192.33.16.70] login: anonymous password: ftp> cd pub/reports/csrp ftp> binary ftp> get csrp426.ps.Z ftp> bye 117 pages. 422107 bytes compressed, 1195561 bytes uncompressed Paper copies can be ordered from: Celia McInnes (celiam at cogs.susx.ac.uk) School of Cognitive & Computing Sciences University of Sussex Falmer, Brighton, UK. 
------------------------------------------------------------------------ ABSTRACT: Little research has been carried out in order to understand the mechanisms underlying the perception of polyphonic music. Perception of polyphonic music involves thematic recognition, that is, recognition of instances of a theme across polyphonic voices, whether they appear unaccompanied, transposed, or altered. There are many questions still open to debate concerning thematic recognition in the polyphonic domain. One of them, in particular, is the question of whether or not cognitive mechanisms of segmentation and thematic reinforcement facilitate thematic recognition in polyphonic music. This dissertation proposes a connectionist model to investigate the role of segmentation and thematic reinforcement in thematic recognition in polyphonic music. The model comprises two stages. The first stage consists of a supervised artificial neural model to segment musical pieces in accordance with three cases of rhythmic segmentation. The supervised model is trained and tested on sets of contrived patterns, and successfully applied to six musical pieces by J. S. Bach. The second stage consists of an original unsupervised artificial neural model to perform thematic recognition. The unsupervised model is trained and assessed on a four-part fugue by J. S. Bach. The research carried out in this dissertation contributes to two distinct fields. Firstly, it contributes to the field of artificial neural networks. The original unsupervised model encodes and manipulates context information effectively, which enables it to perform sequence classification and discrimination efficiently. It has application in cognitive domains which demand classifying either a set of sequences of vectors in time or sub-sequences within a single large sequence of vectors in time. Secondly, the research contributes to the field of music perception. 
The results obtained by the connectionist model suggest, along with other important conclusions, that thematic recognition in polyphony is not facilitated by segmentation, but is facilitated by thematic reinforcement. -- Otavio. +===========================================================================+ | | | | Otavio Augusto Salgado Carpinteiro | Phone: +44 (0) 1273 606755 | | Postgraduate Pigeonholes | ext. 2385 | | School of Cognitive & Computing Sciences | | | University of Sussex | Fax: +44 (0) 1273 671320 | | FALMER - East Sussex | | | BN1 9QH | E-mail: | | England | otavioc at cogs.sussex.ac.uk | | | | +===========================================================================+ From giles at research.nj.nec.com Thu Aug 15 09:49:00 1996 From: giles at research.nj.nec.com (Lee Giles) Date: Thu, 15 Aug 96 09:49:00 EDT Subject: TR on recurrent networks and long-term dependencies Message-ID: <9608151349.AA03234@alta> The following Technical Report is available via the University of Maryland Department of Computer Science and the NEC Research Institute archives: ____________________________________________________________________ HOW EMBEDDED MEMORY IN RECURRENT NEURAL NETWORK ARCHITECTURES HELPS LEARNING LONG-TERM DEPENDENCIES Technical Report CS-TR-3626 and UMIACS-TR-96-28, Institute for Advanced Computer Studies, University of Maryland, College Park, MD 20742 Tsungnan Lin{1,2}, Bill G. Horne{1}, C. Lee Giles{1,3} {1}NEC Research Institute, 4 Independence Way, Princeton, NJ 08540 {2}Department of Electrical Engineering, Princeton University, Princeton, NJ 08540 {3}UMIACS, University of Maryland, College Park, MD 20742 ABSTRACT Learning long-term temporal dependencies with recurrent neural networks can be a difficult problem. It has recently been shown that a class of recurrent neural networks called NARX networks performs much better than conventional recurrent neural networks at learning certain simple long-term dependency problems. 
The intuitive explanation for this behavior is that the output memories of a NARX network are manifested as jump-ahead connections in the time-unfolded network. These jump-ahead connections can propagate gradient information more efficiently, thus reducing the sensitivity of the network to long-term dependencies. This work gives empirical justification for our hypothesis that similar improvements in learning long-term dependencies can be achieved with other classes of recurrent neural network architectures simply by increasing the order of the embedded memory. In particular we explore the impact of embedded memory on learning simple long-term dependency problems in three classes of recurrent neural network architectures: globally recurrent networks, locally recurrent networks, and NARX (output feedback) networks. Comparing the performance of these architectures with different orders of embedded memory on two simple long-term dependency problems shows that all of these classes of network architectures demonstrate significant improvement in learning long-term dependencies when the order of embedded memory is increased. These results can be important to a user comfortable with a specific recurrent neural network architecture, because simply increasing the embedded memory order will make the architecture more robust to the problem of long-term dependency learning. ------------------------------------------------------------------- KEYWORDS: discrete-time, memory, long-term dependencies, recurrent neural networks, training, gradient-descent PAGES: 15 FIGURES: 7 TABLES: 2 ------------------------------------------------------------------- http://www.neci.nj.nec.com/homepages/giles.html http://www.cs.umd.edu/TRs/TR-no-abs.html or ftp://ftp.nj.nec.com/pub/giles/papers/UMD-CS-TR-3626.recurrent.arch.long.term.ps.Z ------------------------------------------------------------------------------------ -- C. 
Lee Giles / Computer Sciences / NEC Research Institute / 4 Independence Way / Princeton, NJ 08540, USA / 609-951-2642 / Fax 2482 www.neci.nj.nec.com/homepages/giles.html == From twilson at afit.af.mil Fri Aug 16 13:15:25 1996 From: twilson at afit.af.mil (Terry Wilson) Date: Fri, 16 Aug 96 13:15:25 -0400 Subject: call for papers Message-ID: <9608161715.AA05694@euclid.afit.af.mil> Applications and Science of Artificial Neural Networks ****************************************************** Call for Papers and Announcement Applications and Science of Artificial Neural Networks Part of SPIE's 1997 International Symposium on Aerospace/Defense Sensing and Controls 21-25 April 1997 Marriott's Orlando World Center Resort and Convention Center (Orlando, Florida USA) The focus of this conference is on real-world applications of artificial neural networks and on recent theoretical developments applicable to current applications. The goal of this conference is to provide a forum for interaction between researchers and industrial/government agencies with information processing requirements. Papers that investigate advantages/disadvantages of artificial neural networks in specific real-world applications will be presented. Papers that clearly state existing problems in information processing that could potentially be solved by artificial neural networks will also be considered. 
Sessions will concentrate on: --- innovative applications of artificial neural networks to solve real-world problems --- comparative performance in applications of target recognition, object recognition, speech processing, speaker identification, speaker normalization, cochannel processing, signal processing in realistic environments, robotics, process control, and image processing --- demonstrations of properties and limitations of existing or new artificial neural networks as shown by or related to an application --- hardware implementation technologies that are either general purpose or application specific --- knowledge acquisition and representation --- biologically inspired visual representation techniques --- decision support systems --- artificial life --- cognitive science --- hybrid systems (fuzzy, neural, genetic) --- neurobiology --- optimization --- sensation and perception --- system identification --- financial applications --- time series analysis and prediction --- pattern recognition --- medical applications --- intelligent control --- robotics --- information warfare applications --- sensation, perception and cognitive neuropsychology. Conference Chair: - Steven K. Rogers, Air Force Institute of Technology Program Committee: - Stanley C. Ahalt, The Ohio State Univ.; - John Franco Basti, Pontifical Gregorian Univ.; - James C. Bezdek, Univ. of West Florida; - Joe R. Brown, Berkom USA; - John Colombi, Dept. of Defense; - Laurene V. Fausett, Florida Institute of Technology; - Michael Georgiopoulos, Univ. of Central Florida; - Joydeep Ghosh, Univ. of Texas/Austin; - Charles W. Glover, Oak Ridge National Lab.; - John B. Hampshire II, Jet Propulsion Lab.; - Richard P. Lippmann, MIT Lincoln Lab.; - Murali Menon, MIT/Lincoln Lab.; - Harley R. Myler, Univ. of Central Florida; - Mary Lou Padgett, Auburn Univ.; - Kevin L. Priddy, Accurate Automation Corp.; - Dennis W. Ruck, Information Warfare Ctr.; - Gregory L. 
Tarr, Air Force Phillips Lab.; - Gary Whittington, Global Web Ltd.; - Rodney G. Winter, Dept of Defense; - Yinglin Yu, South China Univ. IMPORTANT DATES: Abstract Due Date: 9 September 1996 Manuscript Due Date: 24 January 1997 Proceedings of this conference will be published and available at the symposium. ADDITIONAL INFORMATION: * Up-to-the-minute information about the conference is available on the World Wide Web (WWW) at http://www.afit.af.mil/Schools/EN/ENG/LABS/PatternRec/aero97.html * Questions can be sent by E-mail to rogers at afit.af.mil From andre at icmsc.sc.usp.br Mon Aug 19 19:23:02 1996 From: andre at icmsc.sc.usp.br ( Andre Carlos P. de Leon F. de Carvalho ) Date: Mon, 19 Aug 1996 20:23:02 -0300 Subject: SBRN 96 - LAST CALL Message-ID: <199608192323.UAA06533@taba> 3rd Brazilian Symposium on Neural Networks Recife, November 12 - 14, 1996 Sponsored by the Brazilian Computer Society (SBC) Second Call for Papers The Third Brazilian Symposium on Neural Networks will be held at the Federal University of Pernambuco, in Recife (Brazil), from the 12th to the 14th of November, 1996. The SBRN symposia, as they were initially named, have been organized by the Neural Networks interest group of the Brazilian Computer Society since 1994. This third meeting follows the very successful organization of the previous events, which brought together the main developments of the area in Brazil and included many national and international researchers, both as invited speakers and as authors of papers presented at the symposium. Recife is a very pleasant city in the northeast of Brazil, known for its good climate and beautiful beaches, with sunshine throughout almost the whole year. The city, whose name originated from the coral formations in the seaside port and beaches, is strategically located for tourism in the region and offers a good variety of hotels, both in the city's historic center and at the seaside resort. 
Scientific papers will be analyzed by the program committee. This analysis will take into account originality, significance to the area, and clarity. Accepted papers will be fully published in the conference proceedings. MAJOR TOPICS: The major topics of interest include, but are not limited to: * Biological Perspectives * Theoretical Models * Algorithms and Architectures * Learning Models * Hardware Implementation * Signal Processing * Robotics and Control * Parallel and Distributed Implementations * Pattern Recognition * Image Processing * Optimization * Cognitive Science * Hybrid Systems * Dynamic Systems * Genetic Algorithms * Fuzzy Logic * Applications INTERNATIONAL INVITED SPEAKERS: * "Adaptive Wavelets for Pattern Recognition" by Professor Harold Szu, Director of the Center for Advanced Computer Studies, University of Southwestern Louisiana * "Recurrent Neural Networks: El Dorado or Fort Knox?" by Professor C. Lee Giles, NEC Research Institute and University of Maryland, College Park * "Case-based Reasoning and Neural Networks - a Fruitful Breed?" by Professor Agnar Aamodt, Department of Informatics, University of Trondheim - Norway PROGRAM COMMITTEE: * Teresa Bernarda Ludermir - DI/UFPE * André C. P. L. F. de Carvalho - ICMSC/USP (Chair) * Germano C. Vasconcelos - DI/UFPE * Antônio de Pádua Braga - DELT/UFMG * Díbio Leandro Borges - CEFET/PR * Paulo Martins Engel - II/UFRGS * Ricardo Machado - PUC/Rio * Valmir Barbosa - COPPE/UFRJ * Weber Martins - EEE/UFG ORGANISING COMMITTEE: * Teresa Bernarda Ludermir - DI/UFPE (Chair) * Edson Costa de Barros Carvalho Filho - DI/UFPE * Germano C. Vasconcelos - DI/UFPE * Paulo Jorge Leitão Adeodato - DI/UFPE SUBMISSION PROCEDURE: The symposium seeks contributions to the state of the art and future perspectives of Neural Networks research. Submitted papers must be in Portuguese, English or Spanish. 
The submissions must include the original and three copies of the paper and must follow the format below (Electronic mail and FAX submissions are NOT accepted). The paper must be printed using a laser printer, in two-column format, not numbered, 8.5 X 11.0 inch (21.7 X 28.0 cm). It must not exceed eight pages, including all figures and diagrams. The font size should be 10 pts, such as Times-Roman or equivalent, with the following margins: right and left 2.5 cm, top 3.5 cm, and bottom 2.0 cm. The first page should contain the paper's title, the complete author(s) name(s), affiliation(s), and mailing address(es), followed by a short (150 words) abstract and a list of descriptive key words. The submission should also include an accompanying letter containing the following information: * Manuscript title * First author's name, mailing address and e-mail * Technical area of the paper Authors may use the LaTeX files sbrn.tex and sbrn.sty for preparing their manuscripts. The PostScript file sbrn.ps is also available. Alternatively, all those files, together with an equivalent file in WORD, can be retrieved by anonymous ftp following the instructions given below: ftp ftp.di.ufpe.br (LOGIN :) anonymous (PASSWORD :) (your email address) cd pub/events/IIISBRN bin get sbrn.tex (or sbrn.doc) get sbrn.sty bye SUBMISSION ADDRESS: Four copies (one original and three copies) must be submitted to: Prof. 
André Carlos Ponce de Leon Ferreira de Carvalho Coordenador do Comitê de Programa - III SBRN Departamento de Ciências de Computação e Estatística ICMSC - Universidade de São Paulo Caixa Postal 668 CEP 13560.070 São Carlos, SP Phone: +55 162 726222 FAX: +55 162 749150 E-mail: IIISBRN at di.ufpe.br IMPORTANT DATES: August 30, 1996 (mailing date): Deadline for paper submission September 30, 1996 : Notification of acceptance/rejection November, 12-14 1996 : III SBRN ADDITIONAL INFORMATION: * Up-to-the-minute information about the symposium is available on the World Wide Web (WWW) at http://www.di.ufpe.br/~IIISBRN/web_sbrn * Questions can be sent by E-mail to IIISBRN at di.ufpe.br Profa. Teresa Bernarda Ludermir Coordenadora Geral do III SBRN Laboratory of Intelligent Computing (LCI) Departamento de Informatica Universidade Federal de Pernambuco Caixa Postal 7851 CEP 50.732-970 Recife-PE Fone: +55 81 271-8430 FAX: +55 81 271-8438 E-mail: IIISBRN at di.ufpe.br We look forward to seeing you in Recife ! From robert at fit.qut.edu.au Wed Aug 21 01:42:18 1996 From: robert at fit.qut.edu.au (Robert Andrews) Date: Wed, 21 Aug 1996 15:42:18 +1000 (EST) Subject: NIPS*96 Rule Extraction W'shop Message-ID: ============================================================= FIRST CALL FOR PAPERS NIPS*96 POST-CONFERENCE WORKSHOP -------------------------------------------- RULE-EXTRACTION FROM TRAINED NEURAL NETWORKS -------------------------------------------- Snowmass (Aspen), Colorado, USA Fri December 6th, 1996 Robert Andrews & Joachim Diederich Neurocomputing Research Centre Queensland University of Technology Brisbane 4001 Queensland, Australia Fax: +61 7 864-1801 E-mail: robert at fit.qut.edu.au E-mail: joachim at fit.qut.edu.au Rule extraction can be defined as the process of deriving a symbolic description of a trained Artificial Neural Network (ANN). 
Ideally the rule extraction process results in a symbolic description which closely mimics the behaviour of the network in a concise and comprehensible form. The merits of including rule extraction techniques as an adjunct to conventional Artificial Neural Network techniques include: a) the provision of a 'User Explanation' capability; b) improvement of the generalisation capabilities of ANN solutions by allowing identification of regions of input space not adequately represented; c) data exploration and the induction of scientific theories by the discovery and explication of previously unknown dependencies and relationships in data sets; d) knowledge acquisition for symbolic AI systems by overcoming the knowledge engineering bottleneck; e) the potential to contribute to the understanding of how symbolic and connectionist approaches to AI can be profitably integrated. An ancillary problem to that of rule extraction from trained ANNs is that of using the ANN for the `refinement' of existing rules within symbolic knowledge bases. The goal in rule refinement is to use a combination of ANN learning and rule extraction techniques to produce a `better' (i.e. a `refined') set of symbolic rules which can then be applied back in the original problem domain. In the rule refinement process, the initial rule base (i.e. what may be termed `prior knowledge') is inserted into an ANN by programming some of the weights. The rule refinement process then proceeds in the same way as normal rule extraction, viz. (1) train the network on the available data set(s); and (2) extract (in this case the `refined') rules - with the proviso that the rule refinement process may involve a number of iterations of the training phase rather than a single pass. The objective of this workshop is to provide a discussion platform for researchers and practitioners interested in all aspects of rule extraction from trained artificial neural networks. 
The workshop will examine current techniques for providing an explanation component for ANNs including rule extraction, extraction of fuzzy rules, rule initialisation and rule refinement. Other topics for discussion include computational complexity of rule extraction algorithms, criteria for assessing rule quality, and issues relating to generalisation differences between the ANN and the extracted rule set. The workshop will also discuss ways in which ANNs and rule extraction techniques may be profitably employed in commercial, industrial, and scientific application areas. The one day workshop will be a mixture of position papers and panel discussions. Papers presented in the mini-conference sessions will be of 20 minutes duration with ample time for questions/discussions afterwards. DISCUSSION POINTS FOR WORKSHOP PARTICIPANTS 1. Decompositional vs. learning approaches to rule-extraction from ANNs - What are the advantages and disadvantages w.r.t. performance, solution time, computational complexity, problem domain, etc.? Are decompositional approaches always dependent on a certain ANN architecture? 2. Rule-extraction from trained neural networks vs. symbolic induction. What are the relative strengths and weaknesses? 3. What are the most important criteria for rule quality? 4. What are the most suitable representation languages for extracted rules? How does the extraction problem vary across different languages? 5. What is the relationship between rule-initialisation (insertion) and rule-extraction? For instance, are these equivalent or complementary processes? How important is rule-refinement by neural networks? 6. Rule-extraction from trained neural networks and computational learning theory. Is generating a minimal rule-set which mimics an ANN a hard problem? 7. Does rule-initialisation result in improved generalisation and faster learning? 8. To what extent are existing extraction algorithms limited in their applicability? How can these limitations be addressed? 9. 
Are there any interesting rule-extraction success stories? That is, problem domains in which the application of rule-extraction methods has resulted in an interesting or significant advance. SUBMISSION OF WORKSHOP EXTENDED ABSTRACTS/PAPERS Authors are invited to submit 3 copies of either an extended abstract or full paper relating to one of the topic areas listed above. Papers should be written in English in single column format and should be limited to no more than eight (8) sides of A4 paper including figures and references. NIPS style files are available at http://www.cs.cmu.edu/afs/cs/project/cnbc/nips/formatting/nips.sty http://www.cs.cmu.edu/afs/cs/project/cnbc/nips/formatting/nips.tex http://www.cs.cmu.edu/afs/cs/project/cnbc/nips/formatting/nips.ps Please include the following information in an accompanying cover letter: Full title of paper, presenting author's name, address, and telephone and fax numbers, author's e-mail address. Submission Deadline is October 7th, 1996, with notification to authors by 31st October, 1996. For further information, inquiries, and paper submissions please contact: Robert Andrews Queensland University of Technology GPO Box 2434 Brisbane Q. 4001. Australia. phone +61 7 864-1656 fax +61 7 864-1969 email robert at fit.qut.edu.au More information about the NIPS*96 workshop series is available from: WWW: http://www.fit.qut.edu.au/~robert/nips96.html From sml at esesparc2.essex.ac.uk Thu Aug 22 09:01:17 1996 From: sml at esesparc2.essex.ac.uk (Lucas S M) Date: Thu, 22 Aug 1996 14:01:17 +0100 (BST) Subject: structuring chromosomes for total neural network evolution Message-ID: subject: structuring chromosomes for total neural network evolution The following two papers discuss recent work on a simple unified approach to evolving ALL aspects of a neural network, including its learning algorithm (if any). The first uses a grammar-based chromosome, the second uses a set-based chromosome. 
The latter approach appears particularly promising as a method of part-designing/part-evolving neural networks. ----------------------------------------------------------------------- From bruce at bme1.image.uky.edu Thu Aug 22 14:45:12 1996 From: bruce at bme1.image.uky.edu (Eugene Bruce) Date: Thu, 22 Aug 96 14:45:12 EDT Subject: No subject Message-ID: <9608221845.AA01571@image.uky.edu> POSTDOCTORAL POSITION (SENSORIMOTOR INTEGRATION AND DYNAMICAL SYSTEMS) Respiratory Dynamics Lab, University of Kentucky Center for Biomedical Engineering This position is part of an NIH-funded project to identify causes of irregular breathing and apnea. The project involves experimental and computational studies aimed at understanding nonlinear modulation of respiratory rhythm by sensory afferents from the lungs and upper airway. Specific sub-projects include: (1) characterization of vagal deflation receptors in rats from single-unit recordings; (2) analysis of modulation of breathing pattern by upper airway afferents using techniques from nonlinear dynamics, including the development of new theoretical approaches to signal processing; (3) mathematical modelling of the integration of sensory afferents with neural circuits for respiratory pattern formation; (4) experimental analysis and modelling of responses of upper airway and chest wall muscles to transient respiratory stimuli using linear and nonlinear system identification methods. Future work will address the development of an in-vitro brainstem-spinal cord preparation for studying respiratory sensorimotor integration. The ideal applicant will be able to contribute to aspects of both the experimental studies and the modelling or signal analysis efforts. The position is available immediately. More information about the laboratory can be found at the URL http://www.uky.edu/RGS/CBME/bruce.html. Information about related neuroscience activities at the Center is available at http://www.uky.edu/RGS/CBME/CBMENeuralControl.html. 
Additional information about this position is available via email inquiries to bruce at bme1.image.uky.edu, or by telephone (606-257-3774). Applications (curriculum vitae and names of references) may be sent to Dr. Eugene Bruce by email, or by postal mail to Wenner Gren Research Laboratory, University of Kentucky, Rose Street, Lexington, KY 40506-0070. (Posted on 8/22/96.) Eugene Bruce, Ph. D. Center for Biomedical Engineering Univ. of Kentucky BRUCE at BME1.IMAGE.UKY.EDU From dhw at almaden.ibm.com Thu Aug 22 18:33:47 1996 From: dhw at almaden.ibm.com (dhw@almaden.ibm.com) Date: Thu, 22 Aug 1996 15:33:47 -0700 Subject: Paper announcements Message-ID: <9608222233.AA24700@buson.almaden.ibm.com> *** Paper Announcements *** ==================================================================== The following new paper is now available with anonymous ftp to ftp.santafe.edu, in the directory pub/dhw_ftp, under the names BS.ps.Z and BS.ps.Z.encoded. Any comments are welcomed. * COMBINING STACKING WITH BAGGING TO IMPROVE A LEARNING ALGORITHM by David H. Wolpert and William G. Macready Abstract: In bagging \cite{breiman:bagging} one uses bootstrap replicates of the training set \cite{efron:computers, efron.tibshirani:introduction} to improve a learning algorithm's performance, often by tens of percent. This paper presents several ways that stacking \cite{wolpert:stacked,breiman:stacked} can be used in concert with the bootstrap procedure to achieve a further improvement on the performance of bagging for some regression problems. In particular, in some of the work presented here, one first converts a single underlying learning algorithm into several learning algorithms. This is done by bootstrap resampling the training set, exactly as in bagging. The resultant algorithms are then combined via stacking. This procedure can be viewed as a variant of bagging, where stacking rather than uniform averaging is used to achieve the combining. 
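The bagging-with-stacking procedure described above (bootstrap-resample the training set, train one copy of the base learner per replicate, then fit the combining weights by stacking on held-out data rather than averaging uniformly) can be sketched as follows. This is an illustrative sketch only, not the paper's code: the toy sine data, the quadratic base learner, and all names here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 40)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(40)

def fit_base(xs, ys):
    """Base learner (illustrative): least-squares quadratic fit."""
    coeffs = np.polyfit(xs, ys, deg=2)
    return lambda t: np.polyval(coeffs, t)

# 1. Bootstrap-resample the training set, exactly as in bagging,
#    producing one trained base learner per replicate.
n_replicates = 10
models = []
for _ in range(n_replicates):
    idx = rng.integers(0, len(x), len(x))
    models.append(fit_base(x[idx], y[idx]))

# 2. Combine via stacking: fit linear combining weights on held-out
#    data instead of averaging the replicates uniformly.
x_val = np.linspace(0, 1, 20)
y_val = np.sin(2 * np.pi * x_val)
P = np.column_stack([m(x_val) for m in models])   # level-1 predictions
w, *_ = np.linalg.lstsq(P, y_val, rcond=None)     # stacking weights

def bagged_stacked(t):
    return np.column_stack([m(t) for m in models]) @ w

def bagged_uniform(t):
    # Plain bagging for comparison: uniform weights 1/n_replicates.
    return np.column_stack([m(t) for m in models]).mean(axis=1)
```

Plain bagging is the special case where every weight equals 1/n_replicates; stacking lets the held-out data choose non-uniform weights.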
The stacking improves performance over simple bagging by up to a factor of 2 on the tested problems, and never resulted in worse performance than simple bagging. In other work presented here, there is no step of converting the underlying learning algorithm into multiple algorithms, so it is the improve-a-single-algorithm variant of stacking that is relevant. The precise version of this scheme tested can be viewed as using the bootstrap and stacking to estimate the input-dependence of the statistical bias and then correct for it. The results are preliminary, but again indicate that combining stacking with the bootstrap can be helpful. ==================================================================== The following paper has been previously announced. A new version, incorporating major modifications of the original, is now available at ftp.santafe.edu, in pub/dhw_ftp, as estimating.baggings.error.ps.Z or estimating.baggings.error.ps.Z.encoded. The new version shows in particular how the generalization error of a bagged version of a learning algorithm can be estimated with more accuracy than that afforded by using cross-validation on the original algorithm. Any comments are welcomed. * AN EFFICIENT METHOD TO ESTIMATE BAGGING'S GENERALIZATION ERROR by David H. Wolpert and William G. Macready Abstract: In bagging \cite{Breiman:Bagging} one uses bootstrap replicates of the training set \cite{Efron:Stat,BootstrapIntro} to try to improve a learning algorithm's performance. The computational requirements for estimating the resultant generalization error on a test set by means of cross-validation are often prohibitive; for leave-one-out cross-validation one needs to train the underlying algorithm on the order of $m\nu$ times, where $m$ is the size of the training set and $\nu$ is the number of replicates. 
This paper presents several techniques for exploiting the bias-variance decomposition \cite{Geman:Bias, Wolpert:Bias} to estimate the generalization error of a bagged learning algorithm without invoking yet more training of the underlying learning algorithm. The best of our estimators exploits stacking \cite{Wolpert:Stack}. In a set of experiments reported here, it was found to be more accurate than both the alternative cross-validation-based estimator of the bagged algorithm's error and the cross-validation-based estimator of the underlying algorithm's error. This improvement was particularly pronounced for small test sets. This suggests a novel justification for using bagging: improved estimation of generalization error. ==================================================================== The following paper has been previously announced. A new version, incorporating major modifications of the original, is now available at ftp.santafe.edu, in pub/dhw_ftp, as bias.plus.ps.Z or bias.plus.ps.Z.encoded. The new version contains in particular an analysis of the Friedman effect, discussed in Jerry Friedman's recently announced paper on 0-1 loss. Any comments are welcomed. * ON BIAS PLUS VARIANCE by David H. Wolpert Abstract: This paper presents several additive "corrections" to the conventional quadratic loss bias-plus-variance formula. One of these corrections is appropriate when both the target is not fixed (as in Bayesian analysis) and also training sets are averaged over (as in the conventional bias-plus-variance formula). Another additive correction casts conventional fixed-training-set Bayesian analysis directly in terms of bias-plus-variance. Another correction is appropriate for measuring full generalization error over a test set rather than (as with conventional bias-plus-variance) error at a single point. Yet another correction can help explain the recent counter-intuitive bias-variance decomposition of Friedman for zero-one loss. 
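For reference, the conventional quadratic-loss bias-plus-variance formula that these corrections modify can be written (in the usual averaged-over-training-sets form; the notation here, with trained predictor $f_D$, training sets $D$, and conditional target mean $\bar{y}(x)$, is ours and not the paper's):

```latex
\mathbb{E}_{D}\!\left[(f_D(x)-\bar{y}(x))^2\right]
  = \underbrace{\left(\mathbb{E}_{D}[f_D(x)]-\bar{y}(x)\right)^2}_{\text{bias}^2}
  + \underbrace{\mathbb{E}_{D}\!\left[(f_D(x)-\mathbb{E}_{D}[f_D(x)])^2\right]}_{\text{variance}}
```

The corrections discussed in the abstract are additional terms appended to the right-hand side when the target is not fixed, the training set is fixed, or error is measured over a full test set.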
After presenting these corrections this paper then discusses some other loss-function-specific aspects of supervised learning. In particular, there is a discussion of the fact that if the loss function is a metric (e.g., zero-one loss), then there is a bound on the change in generalization error accompanying changing the algorithm's guess from h1 to h2 that depends only on h1 and h2 and not on the target. This paper ends by presenting versions of the bias-plus-variance formula appropriate for logarithmic and quadratic scoring, and then all the additive corrections appropriate to those formulas. All the correction terms presented in this paper are a covariance, between the learning algorithm and the posterior distribution over targets. Accordingly, in the (very common) contexts in which those terms apply, there is not a "bias-variance trade-off", or a "bias-variance dilemma", as one often hears. Rather there is a bias-variance-covariance trade-off. From nin at cns.brown.edu Fri Aug 23 12:07:32 1996 From: nin at cns.brown.edu (Nathan Intrator) Date: Fri, 23 Aug 96 12:07:32 EDT Subject: Paper announcements Message-ID: <9608231607.AA07816@cns.brown.edu> *** Paper Announcements *** The following papers are now available from my research page: http://www.physics.brown.edu/people/nin/research.html Comments are welcomed. ----------------------------------------------------------------------- Classifying Seismic Signals by Integrating Ensembles of Neural Networks Yair Shimshoni and Nathan Intrator ftp://cns.brown.edu/nin/papers/hong-kong.ps.Z This paper proposes a classification scheme based on the integration of multiple Ensembles of ANNs. It is demonstrated on a classification problem in which seismic recordings of natural earthquakes must be distinguished from recordings of artificial explosions.
A Redundant Classification Environment, consisting of several Ensembles of Neural Networks, is created and trained on Bootstrap Sample Sets, using various data representations and architectures. The ANNs within the Ensembles are aggregated (as in Bagging) while the Ensembles are integrated non-linearly, in a signal-adaptive manner, using a posterior confidence measure based on the agreement (variance) within the Ensembles. The proposed Integrated Classification Machine achieved 92.1\% correct classification on the seismic test data. Cross-validation evaluations and comparisons indicate that such integration of a collection of ANN Ensembles is a robust way of handling high-dimensional problems with a complex non-stationary signal space, as in the current seismic classification problem. To appear: Proceedings of ICONIP 96 ----------------------------------------------------------------------- Learning low dimensional representations of visual objects with extensive use of prior knowledge Nathan Intrator and Shimon Edelman ftp://cns.brown.edu/nin/papers/ml1.ps.Z Learning to recognize visual objects from examples requires the ability to find meaningful patterns in spaces of very high dimensionality. We present a method for dimensionality reduction which effectively biases the learning system by combining multiple constraints via an extensive use of class labels. The use of multiple class labels steers the resulting low-dimensional representation to become invariant to those directions of variation in the input space that are irrelevant to classification; this is done merely by making class labels independent of these directions. We also show that prior knowledge of the proper dimensionality of the target representation can be imposed by training a multiple-layer bottleneck network.
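The bottleneck idea can be sketched as follows. This is a toy two-layer numpy classifier, not the authors' network; all sizes and names are illustrative. The point is that the hidden width `k` fixes the dimensionality of the representation the network is forced to learn:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 200 points in 10 dimensions; the class depends on only two
# of the ten input directions, so a 2-unit bottleneck suffices.
m, d_in, k, n_cls = 200, 10, 2, 2
X = rng.normal(size=(m, d_in))
y = (X[:, 0] - X[:, 1] > 0).astype(int)
Y = np.eye(n_cls)[y]                    # one-hot class labels

W1 = 0.1 * rng.normal(size=(d_in, k))   # encoder into the k-dim bottleneck
W2 = 0.1 * rng.normal(size=(k, n_cls))  # bottleneck -> class labels

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

lr = 1.0
for _ in range(2000):
    H = np.tanh(X @ W1)                 # k-dimensional internal code
    P = softmax(H @ W2)
    G = (P - Y) / m                     # softmax cross-entropy gradient
    dW2 = H.T @ G
    dH = (G @ W2.T) * (1.0 - H ** 2)    # backprop through tanh
    dW1 = X.T @ dH
    W1 -= lr * dW1
    W2 -= lr * dW2

code = np.tanh(X @ W1)                  # learned low-dimensional representation
acc = (softmax(code @ W2).argmax(axis=1) == y).mean()
print(code.shape, float(acc))
```

Because the class labels supervise the training, directions of input variation irrelevant to the labels are squeezed out of the 2-dimensional `code`, which is the mechanism the abstract describes.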
A series of computational experiments involving parameterized fractal images and real human faces indicates that the low-dimensional representation extracted by our method leads to improved generalization in the learned tasks, and is likely to preserve the topology of the original space. To appear: EXPLANATION-BASED NEURAL NETWORK LEARNING: A LIFELONG LEARNING APPROACH. Editor: SEBASTIAN THRUN ----------------------------------------------------------------------- Bootstrapping with Noise: An Effective Regularization Technique Yuval Raviv and Nathan Intrator ftp://cns.brown.edu/nin/papers/spiral.ps.Z Bootstrap samples with noise are shown to be an effective smoothness and capacity control technique for training feed-forward networks and for other statistical methods such as generalized additive models. It is shown that noisy bootstrap performs best in conjunction with weight decay regularization and ensemble averaging. The two-spiral problem, a highly non-linear, noise-free dataset, is used to demonstrate these findings. The combination of noisy bootstrap and ensemble averaging is also shown to be useful for generalized additive modeling, as demonstrated on the well-known Cleveland Heart Data \cite{Detrano89}. To appear: Connection Science, Special issue on Combining Estimators.
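The noisy-bootstrap scheme described above can be sketched as follows. This is a hedged numpy illustration, not the authors' code: ridge regression stands in for a weight-decay-regularized network, and all names and scales are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

def ridge_fit(X, y, lam):
    """Least squares with a weight-decay (ridge) penalty lam."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def noisy_bootstrap_ensemble(X, y, n_models, noise_sd, lam, rng):
    """Train each ensemble member on a bootstrap resample with Gaussian
    input noise added -- the 'bootstrapping with noise' smoother."""
    m = len(y)
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, m, size=m)                  # bootstrap sample
        Xb = X[idx] + noise_sd * rng.normal(size=(m, X.shape[1]))
        models.append(ridge_fit(Xb, y[idx], lam))
    return models

def ensemble_predict(models, X):
    """Ensemble averaging of the members' predictions."""
    return np.mean([X @ w for w in models], axis=0)

# Toy regression problem
m, d = 100, 5
X = rng.normal(size=(m, d))
w_true = np.arange(1.0, d + 1.0)
y = X @ w_true + 0.1 * rng.normal(size=m)

models = noisy_bootstrap_ensemble(X, y, n_models=25, noise_sd=0.2, lam=1.0, rng=rng)
mse = float(np.mean((ensemble_predict(models, X) - y) ** 2))
print(len(models), round(mse, 3))
```

The three ingredients in the abstract appear explicitly: the bootstrap resample, the added input noise (which acts as a smoother), and the weight-decay penalty, with ensemble averaging combining the members.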
From jim at stats.gla.ac.uk Fri Aug 23 12:49:16 1996 From: jim at stats.gla.ac.uk (Jim Kay) Date: Fri, 23 Aug 1996 17:49:16 +0100 Subject: TR on Contextually Guided Unsupervised Learning/Multivariate Processors Message-ID: <17769.199608231649@pole.stats.gla.ac.uk> Technical Report Available CONTEXTUALLY GUIDED UNSUPERVISED LEARNING USING LOCAL MULTIVARIATE BINARY PROCESSORS Jim Kay Department of Statistics University of Glasgow Dario Floreano MicroComputing Laboratory Swiss Federal Institute of Technology Bill Phillips Centre for Cognitive and Computational Neuroscience University of Stirling We consider the role of contextual guidance in learning and processing within multi-stream neural networks. Earlier work (Kay \& Phillips, 1994, 1996; Phillips et al., 1995) showed how the goals of feature discovery and associative learning could be fused within a single objective, and made precise using information theory, in such a way that local binary processors could extract a single feature that is coherent across streams. In this paper we consider multi-unit local processors, with multivariate binary outputs, that enable a greater number of coherent features to be extracted. Using the Ising model, we define a class of information-theoretic objective functions and also local approximations, and derive the learning rules in both cases. These rules have similarities to, and differences from, the celebrated BCM rule. Local and global versions of Infomax appear as by-products of the general approach, as well as multivariate versions of Coherent Infomax. Focussing on the more biologically plausible local rules, we describe some computational experiments designed to investigate specific properties of the processors and the general approach. The main conclusions are: 1. The local methodology introduced in the paper has the required functionality. 2. Different units within the multi-unit processors learned to respond to different aspects of their receptive fields. 3. 
The units within each processor generally produced a distributed code in which the outputs were correlated, and which was robust to damage; in the special case where the number of units available was only just sufficient to transmit the relevant information, a form of competitive learning was produced. 4. The contextual connections enabled the information correlated across streams to be extracted, and, by improving feature detection with weak or noisy inputs, they played a useful role in short-term processing and in improving generalization. 5. The methodology allows the statistical associations between distributed self-organizing population codes to be learned. This technical report is available in compressed Postscript by anonymous ftp from: ftp.stats.gla.ac.uk or from the following URL: ftp://ftp.stats.gla.ac.uk/pub/jim/NNkfp.ps.Z Some earlier reports and general references are available from the URL: http://www.stats.gla.ac.uk/~jim/nn.html ----------------------------------------------------------------------- Jim Kay jim at stats.gla.ac.uk From nkasabov at commerce.otago.ac.nz Sat Aug 24 18:44:32 1996 From: nkasabov at commerce.otago.ac.nz (Nikola Kasabov) Date: Sat, 24 Aug 1996 10:44:32 -1200 Subject: ICONIP/ANZIIS/ANNES'97 CFP Message-ID: <23DE8694E89@jupiter.otago.ac.nz> ICONIP'97 jointly with ANZIIS'97 and ANNES'97 The Fourth International Conference on Neural Information Processing-- The Annual Conference of the Asian Pacific Neural Network Assembly, jointly with The Fifth Australian and New Zealand International Conference on Intelligent Information Processing Systems, and The Third New Zealand International Conference on Artificial Neural Networks and Expert Systems 24-28 November, 1997 Dunedin/Queenstown, New Zealand In 1997, the annual conference of the Asian Pacific Neural Network Assembly, ICONIP'97, will be held jointly with two other major international conferences in the Asian Pacific Region, the Fifth Australian and New Zealand International 
Conference on Intelligent Information Processing Systems (ANZIIS'97) and the Third New Zealand International Conference on Artificial Neural Networks and Expert Systems (ANNES'97), from 24 to 28 November 1997 in Dunedin and Queenstown, New Zealand. The joint conference will have three parallel streams: Stream1: Neural Information Processing Stream2: Computational Intelligence and Soft Computing Stream3: Intelligent Information Systems and their Applications TOPICS OF INTEREST Stream1: Neural Information Processing * Neurobiological systems * Cognition * Cognitive models of the brain * Dynamical modelling, chaotic processes in the brain * Brain computers, biological computers * Consciousness, awareness, attention * Adaptive biological systems * Modelling emotions * Perception, vision * Learning languages * Evolution Stream2: Computational Intelligence and Soft Computing * Artificial neural networks: models, architectures, algorithms * Fuzzy systems * Evolutionary programming and genetic algorithms * Artificial life * Distributed AI systems, agent-based systems * Soft computing--paradigms, methods, tools * Approximate reasoning * Probabilistic and statistical methods * Software tools, hardware implementation Stream3: Intelligent Information Systems and their Applications * Connectionist-based information systems * Hybrid systems * Expert systems * Adaptive systems * Machine learning, data mining and intelligent databases * Pattern recognition and image processing * Speech recognition and language processing * Intelligent information retrieval systems * Human-computer interfaces * Time-series prediction * Control * Diagnosis * Optimisation * Application of intelligent information technologies in: manufacturing, process control, quality testing, finance, economics, marketing, management, banking, agriculture, environment protection, medicine, geographic information systems, government, law, education, and sport * Intelligent information technologies on the global 
networks HONORARY CHAIR Shun-Ichi Amari, Tokyo University GENERAL CONFERENCE CHAIR Nikola Kasabov, University of Otago LOCAL ORGANIZING COMMITTEE CHAIR: Philip Sallis, University of Otago CONFERENCE ORGANISER Ms Kitty Ko Department of Information Science, University of Otago, PO Box 56, Dunedin, New Zealand phone: +64 3 479 8153, fax: +64 3 479 8311, email: kittyko at commerce.otago.ac.nz CALL FOR PAPERS Papers must be received by 30 May 1997. They will be reviewed by senior researchers in the field and the authors will be informed of the decision of the review process by 20 July 1997. Accepted papers must be submitted in camera-ready format by 20 August. All accepted papers will be published by IEEE Computer Society Press. As the conference is a multi-disciplinary meeting, papers are required to be comprehensible to a wide rather than a narrowly specialised audience. Papers will be presented at the conference either in an oral or in a poster session. Please submit three copies of the paper written in English on A4-format white paper with one inch margins on all four sides, in two column format, on not more than 4 pages, single-spaced, in Times or similar font of 10 points, and printed on one side of the page only. Centred at the top of the first page should be the complete title, author(s), mailing and e-mail addresses, followed by an abstract and the text. In the covering letter the stream and the topic of the paper according to the list above should be indicated. The IEEE Transactions LaTeX article style may be used. SPECIAL ISSUES OF JOURNALS AND EDITED VOLUMES Selected papers will be published in special issues of scientific journals. The organising committee also plans edited volumes with chapters, written by invited conference participants, covering the conference topics.
TUTORIALS (24 November) Conference tutorials will be organized to introduce the basics of cognitive modelling, dynamical systems, neural networks, fuzzy systems, evolutionary programming, soft computing, expert systems, hybrid systems, and adaptive systems. Proposals for tutorials are due on 30 May 1997. EXHIBITION Companies and university research laboratories are encouraged to exhibit the software and hardware systems they have developed or distribute. STUDENT SESSION Postgraduate students are encouraged to submit papers to this session following the same formal requirements for paper submission. The submitted papers will be published in a separate brochure. SPECIAL EVENTS FOR PRACTITIONERS The New Zealand Computer Society is organising special demonstrations, lectures and materials for practitioners working in the area of information technologies. VENUE (Dunedin/Queenstown) The Conference will be held at the University of Otago, Dunedin, New Zealand. The closing session will be held on Friday, 28 November, on a cruise on one of the most beautiful lakes in the world, Lake Wakatipu. The cruise departs from the famous tourist centre Queenstown, about 300 km from Dunedin. Transportation will be provided and there will be a separate discounted cost for the cruise. ACCOMMODATION Accommodation has been booked at St Margaret's College, located right on campus and 10 minutes from downtown Dunedin. The College offers well-equipped facilities, including a library, sports hall, music hall, and computers with e-mail access. Full board (NZ$50) is available during the conference days as well as two days before and after the conference. Accommodation is also available at a range of hotels in the city. TRAVELLING The Dunedin branch of House of Travel, a travel agency, is happy to assist with any domestic and international travel arrangements for the Conference delegates.
They can be contacted through email: travel at es.co.nz, fax: +64 3 477 3806, phone: +64 3 477 3464, or toll free number: 0800 735 737 (within NZ). POSTCONFERENCE EVENTS Following the closing conference cruise, delegates may like to experience the delights of Queenstown, Central Otago, and Fiordland. Travel plans can be coordinated by the Dunedin Visitor Centre (phone: +64 3 474 3300, fax: +64 3 474 3311). IMPORTANT DATES Papers due: 30 May 1997 Proposals for tutorials: 30 May 1997 Notification of acceptance: 20 July 1997 Final camera-ready papers due: 20 August 1997 Registration of at least one author of a paper: 20 August 1997 Early registration: 20 August 1997 CONFERENCE CONTACTS, PAPER SUBMISSIONS, CONFERENCE INFORMATION, REGISTRATION FORMS Conference Secretariat Department of Information Science, University of Otago, PO Box 56, Dunedin, New Zealand; phone: +64 3 479 8142; fax: +64 3 479 8311; email: iconip97 at otago.ac.nz Home page: http://divcom.otago.ac.nz:800/com/infosci/kel/conferen.htm ---------------------------------------------- ICONIP'97 jointly with ANZIIS'97 and ANNES'97 TENTATIVE REGISTRATION PLEASE PRINT Title:________________________________ Surname:______________________________ First Name:___________________________ Position:_____________________________ Organisation:_________________________ Department:___________________________ Address:______________________________ ______________________________________ City:_________________________________ Country:______________________________ Phone:________________________________ Fax:__________________________________ Email:________________________________ Yes/No. Would you attend the conference? Yes/No. Would you submit a paper? Yes/No. Would you attend the closing session on the cruise? Yes/No. Would you like any further information? Please mail a copy of this completed form to: Ms Kitty Ko Department of Information Science University of Otago PO Box 56 Dunedin New Zealand. 
-------------------------------------------------------------- -------------------------------------------------------------------------------- Assoc.Professor Dr Nikola Kasabov phone:+64 3 479 8319 Director of Graduate Studies fax:+64 3 479 8311 Department of Information Science nkasabov at otago.ac.nz University of Otago P.O. Box 56, Dunedin, New Zealand home page http://divcom.otago.ac.nz:800/COM/INFOSCI/KEL/home.htm ------------------------------------------------------------------------------- From josh at vlsia.uccs.edu Fri Aug 23 19:52:30 1996 From: josh at vlsia.uccs.edu (Alspector) Date: Fri, 23 Aug 96 17:52:30 MDT Subject: call for papers, IWANNT*97 Message-ID: <9608232352.AA02678@vlsia.uccs.edu> CALL FOR PAPERS International Workshop on Applications of Neural Networks (and other intelligent systems) to Telecommunications (IWANNT*97) Melbourne, Australia June 9-11, 1997 Organizing Committee General Chair: Josh Alspector, U. of Colorado Program Chair: Rod Goodman, Caltech Publications Chair: Timothy X Brown, U. of Colorado Treasurer: Anthony Jayakumar, Bellcore Publicity: Atul Chhabra, NYNEX Lee Giles, NEC Research Institute Local Arrangements: Adam Kowalczyk, Telstra, Chair Michael Dale, Telstra Andrew Jennings, RMIT Maributu Palaniswami, U. of Melbourne Robert Slaviero, Signal Proc. Ass. (& local IEEE liaison) Jacek Szymanski, Telstra Program Committee: Nader Azarmi, British Telecom Miklos Boda, Ericsson Radio Systems Harald Brandt, Ericsson Telecommunications Tzi-Dar Chiueh, National Taiwan U Bruce Denby, U of Versailles Simon Field, Nortel Francoise Fogelman, SLIGOS Marwan A. Jabri, Sydney Univ. Thomas John, SBC S Y Kung, Princeton University Tadashi Sone, ATR Scott Toborg, SBC TRI IEEE Liaison: Steve Weinstein, NEC Conference Administrator: Helen Alspector IWANNT Conference Administrator Univ. of Colorado at Col. Springs Dept. of Elec. & Comp. Eng. P.O.
Box 7150 Colorado Springs, CO 80933-7150 (719) 593-3351 (719) 593-3589 (fax) neuranet at mail.uccs.edu Dear Colleague: You are invited to an international workshop on applications of neural networks and other intelligent systems to problems in telecommunications and information networking. This is the third workshop in a series that began in Princeton, New Jersey, on October 18-20, 1993, and continued in Stockholm, Sweden, on May 22-24, 1995. This conference will be at the University of Melbourne on Monday through Wednesday (June 9 - 11, 1997), just before the Australian Conference on Neural Networks (ACNN), which will be at the same location on June 11 - 13 (Wednesday - Friday). A hardcover proceedings will be available at the workshop. There is further information on the IWANNT home page at: http://ece-www.colorado.edu/~timxb/iwannt.html Suggested topics include: Internet Services Intelligent Agents Database Mining Network Management ATM Networking Wireless Networks Modulation and Coding Techniques Congestion Control Adaptive Equalization Speech Recognition Security Verification Adaptive User Interfaces Language ID/Translation Multimedia Networking Information Filtering Dynamic Routing Propagation Path Loss Modeling Dynamic Frequency Allocation Software Engineering Telecom Market Prediction Fault Identification and Prediction Character Recognition Adaptive Control Data Compression Credit Management Customer Modeling Submissions: Please submit 6 copies of both a 50-word abstract and a 1000-word summary of your paper to arrive in Colorado, USA by Oct. 15, 1996. Mail papers to the conference administrator. Note the following dates: Tuesday, Oct. 15, 1996: Abstract, summary due. Monday, Nov. 25, 1996: Notification of acceptance Monday, Feb. 10, 1997: Camera Ready Copy Due I hope to see you at the workshop.
Sincerely, Josh Alspector, General Chair ----------------------------------------------------------- REGISTRATION FORM ___________________________________________________________ International Workshop on Applications of Neural Networks (and other intelligent systems) to Telecommunications (IWANNT*97) Melbourne, Australia June 9-11, 1997 Name: Institution: Mailing Address: Telephone: Fax: E-mail: Make check ($400; $500 after May 1, 1997; $200 students) out to IWANNT*97. Please make sure your name is on the check. Registration includes breaks and proceedings available at the conference. Mail to: Helen Alspector IWANNT Conference Administrator Univ. of Colorado at Col. Springs Dept. of Elec. & Comp. Eng. P.O. Box 7150 Colorado Springs, CO 80933-7150 (719) 593-3351 (719) 593-3589 (fax) neuranet at mail.uccs.edu Site The conference will be held at the University of Melbourne. There are several good hotels within walking distance of the university. More information will be sent to registrants or upon request. From dhw at almaden.ibm.com Fri Aug 23 19:54:39 1996 From: dhw at almaden.ibm.com (dhw@almaden.ibm.com) Date: Fri, 23 Aug 1996 16:54:39 -0700 Subject: Job openings Message-ID: <9608232354.AA22322@buson.almaden.ibm.com> *** Job Announcements. Please distribute. *** The Web is currently dumb. Join our team at IBM net.Mining; we are making the web intelligent. We currently have immediate need to fill positions at our Almaden Research Center facility in the south of Silicon Valley. net.Mining is a sub-organization of IBM Data Mining Solutions, a rapidly expanding group that also has openings (see recent postings). IBM is an equal opportunity employer. Scientific Programmers/ Responsibilities: Interact with the Machine Learning Researchers to implement new web-based algorithms as code, verify the code, and test the algorithms in real world environments. Must be able to work independently. 
Qualifications: Bachelor's degree or equivalent in computer science, statistics, mathematics, physics, or an equivalent field. Higher degree highly desirable. Extensive experience implementing numeric code, especially in machine learning, statistics, neural nets, or a similar field. Familiarity with college-level mathematics (multi-variable calculus, differential equations, linear algebra, etc.). Two or more years' experience with C/C++ in a research or commercial environment. Knowledge of Internet technologies highly desirable. Machine Learning Researchers/ Responsibilities: Develop new algorithms applying machine learning and associated technologies to the web, and develop new such technologies. Work with the Scientific Programmers to implement and investigate those algorithms and technologies in the real world. Qualifications include: PhD or equivalent in computer science, statistics, mathematics, physics, or an equivalent field, with an emphasis on machine learning, statistics, neural nets, or a similar field. Strong background in mathematics. Experience with C/C++ highly desirable. Knowledge of information retrieval and/or indexing systems, text mining, and/or knowledge of Internet technologies, all highly desirable. From sandro at parker.physio.nwu.edu Tue Aug 27 12:13:36 1996 From: sandro at parker.physio.nwu.edu (Sandro Mussa-Ivaldi) Date: Tue, 27 Aug 96 11:13:36 CDT Subject: Postdoctoral fellowship - Motor learning Message-ID: <9608271613.AA01631@parker.physio.nwu.edu> ****** POSTDOCTORAL FELLOWSHIP ON MOTOR LEARNING ****** A postdoctoral position is available at the Sensory Motor Performance Program of the Rehabilitation Institute of Chicago (RIC) to work on learning and adaptation of multi-joint arm movements. RIC is a Northwestern University-affiliated rehabilitation hospital, with close ties to Northwestern University schools of medicine and engineering. The research is to be carried out with Sandro Mussa-Ivaldi, Ph.D.
and involves both experimental and theoretical components. The experimental work will be based on the interaction of human subjects with a two-joint robot manipulandum that has recently been built in Mussa-Ivaldi's lab. The manipulandum is controlled by a PC programmed in C++. The theoretical work will involve the representation of the arm's adaptive controller as a combination of non-linear basis functions. The position, which is available immediately, is for one year and is expected to be extended for at least a second year. It requires a professional level of knowledge, acquired with a PhD in Engineering or Physics with an emphasis on biomedical research, and a substantial background in Classical Mechanics and Control Theory. Technical proficiency in C++, PC platforms (DOS, Windows 95), real-time programming, and Unix (X) is highly desirable. RIC offers competitive salary and benefits (EOE, M/F/D/V). Applicants should send a CV, a statement of their interests and professional goals (no longer than 1 page), and the names, addresses, and telephone numbers of at least two references to Domenica Pappas either via email (dgpappas at casbah.acns.nwu.edu) or via surface mail at the following address: Domenica Pappas Administrative Supervisor Rehabilitation Institute of Chicago 345 East Superior Street Room 1406 Chicago, Illinois 60611 ---------------------------------------------------------- Sandro Mussa-Ivaldi (sandro at nwu.edu) From meyer at wotan.ens.fr Wed Aug 28 05:52:32 1996 From: meyer at wotan.ens.fr (Jean-Arcady MEYER) Date: Wed, 28 Aug 1996 11:52:32 +0200 Subject: SAB96 Registration by August 30th Message-ID: <9608280952.AA12138@eole.ens.fr> !! Note: if you plan to attend SAB96 and have not yet registered, please do so ASAP. Your registration should arrive by August 30th. After that, mail will be delayed due to Labor Day (Sep 2) and your registration may not get processed in time for the conference!!
***********************CONFERENCE INFORMATION******************************* From Animals to Animats The Fourth International Conference on Simulation of Adaptive Behavior (SAB96) September 9th-13th, 1996 Sea Crest Resort & Conference Center North Falmouth, Massachusetts, USA FULL DETAILS ON THE WEB PAGE: http://www.cs.brandeis.edu/conferences/sab96 GENERAL CONTACT: sab96 at cs.brandeis.edu The objective of this conference is to bring together researchers in ethology, psychology, ecology, artificial intelligence, artificial life, robotics, and related fields so as to further our understanding of the behaviors and underlying mechanisms that allow natural and artificial animals to adapt and survive in uncertain environments. The conference starts with an opening reception on Sunday, September 8th. Technical sessions start in the morning on Monday, September 9th and run until early Friday afternoon. The conference banquet will be held on Thursday evening, September 12th, and tickets for it are sold separately and in advance. Wednesday afternoon is left free for sightseeing. **************************PROGRAM**************************************** The invited speakers are James Albus, Jelle Atema, Daniel Dennett, Randy Gallistel, Scott Kelso, and David Touretzky. The conference will consist of a single track of 35 papers, 30 posters, and several demonstrations. Full details of the schedule will be maintained on the web page. ***********************REGISTRATION IS OPEN************************* Included in full conference registration fees are: * Reception and breaks * Lunches (4 days) * Entry to all technical and poster sessions * Conference proceedings published by MIT Press Please purchase your banquet tickets separately and in advance, as they may not be available at the conference. Members of the International Society for Adaptive Behavior (ISAB) will save $50 on registration fees for this conference.
In addition, ISAB members receive an annual subscription to Adaptive Behavior, the premier journal of the field, as well as discounts to other ISAB-related meetings. To join ISAB, please visit http://netq.rowland.org/isab/isab.html Early registration (postmarked before June 30, 1996) is recommended as it helps us plan and saves you $50. For the convenience of overseas colleagues, we will process a limited number of MasterCard or VISA registrations. Full-time students, and ISAB members should be able to prove their status at the registration desk with an ID card or member number. Please fill out this form and send with your check (in US dollars, made out to "Brandeis University") to: SAB96 Registration c/o Ms. Myrna Fox Computer Science Department Brandeis University 415 South St Waltham, MA 02254 USA Dr__Mr__Ms__ Last Name First Name Middle Title Affiliation Address E-mail Phone Fax FEES (in US dollars) Category EARLY REGISTRATION Regular Total Postmarked by June 30th ISAB Member $250 $300 Non-Member $300 $350 Full-Time Student Include copy of ID $175 $225 Banquet Tickets $40 each Total fees Enclosed In addition, 6 foot tables for publishers and other vendors are available for $250 for the 5 days of the conference. If you register but must cancel, and you notify us before July 8th, you will receive your fees minus a $50 service charge. If you notify us before August 26th, you will receive a refund minus a $100 service charge. After August 26th your fees will not be refundable. ***************************HOTEL************************************ HOTEL DEADLINE FOR BLOCKED ROOMS IS AUGUST 8TH The entire conference will take place at the Sea Crest in North Falmouth on Cape Cod. Participants should fly into Boston Logan Airport and either rent a car or take the Bonanza Shuttle Bus ($30 r/t, see schedule on Web page) to Falmouth, and transfer to the hotel by the hotel shuttle. 
All participants are responsible for making their own reservations by contacting the group reservations office at (800) 225-3110, and must state they are attending the Simulation of Adaptive Behavior Conference in order to receive the special rate. 200 rooms have been set aside for our use, and are being held on a first-come first-served basis until August 8th. Single and double rooms are available for $90 plus tax. Children under 16 may stay in their parent's room for free. The rooms have two double beds; 3 or 4 persons may share a room for $100 and $110 plus tax, respectively. The conference rate also applies to the 3 days before and 3 days after the meeting dates. Each room reservation must be secured with a credit card deposit for one night. If a reservation is cancelled 8 days or more prior to arrival, the deposit is refunded, minus a $10 service charge. If a reservation is cancelled 7 days or less prior to arrival, or the individual does not show up for the specified dates, the reservation will be cancelled and the deposit forfeited. In case of overflow, the Quality Inn (508) 540-2000 and the Falmouth Inn (508) 540-2500 have rooms available at $65 and $60 per night plus tax. *************************AIRLINE********************************* Delta Airline is the official airline for SAB96, and is offering special discounted meeting fares from the US and Canada. 
To take advantage of these fares: 1) Please call or have your travel agent call 1-800-241-6760 between 8 am and 10 pm EST, 2) Refer to file number "XI304". From radford at cs.toronto.edu Wed Aug 28 20:12:47 1996 From: radford at cs.toronto.edu (Radford Neal) Date: Wed, 28 Aug 1996 20:12:47 -0400 Subject: New release of Bayesian NN software Message-ID: <96Aug28.201256edt.1290@neuron.ai.toronto.edu> New Release of Software for BAYESIAN LEARNING FOR NEURAL NETWORKS Radford Neal, University of Toronto A new version of my software for Bayesian learning of models based on multilayer perceptron networks, using Markov chain Monte Carlo methods, is now available on the Internet. This software implements the methods described in my Ph.D. thesis, "Bayesian Learning for Neural Networks", which is now available from Springer-Verlag (ISBN 0-387-94724-8). Use of the software is free for research and educational purposes. The software supports models for regression and classification problems based on networks with any number of hidden layers, using a wide variety of prior distributions for network parameters and hyperparameters. The advantages of Bayesian learning include the automatic determination of "regularization" parameters, without the need for a validation set, avoidance of overfitting when using large networks, and quantification of the uncertainty in predictions. The software implements the Automatic Relevance Determination (ARD) approach to handling inputs that may turn out to be irrelevant (developed with David MacKay). For problems and networks of moderate size (e.g., 200 training cases, 10 inputs, 20 hidden units), full training (to the point where one can be reasonably sure that the correct Bayesian answer has been found) typically takes several hours to a day on our SGI machine. However, quite good results, competitive with other methods, are often obtained after training for under an hour. (Of course, your machine may not be as fast as ours!)
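To convey the flavour of MCMC-based Bayesian learning, here is a toy sketch in Python. It is emphatically not Neal's implementation (his software uses far more efficient Markov chain samplers, in C); this is plain random-walk Metropolis over the weights of a one-hidden-layer network, with Gaussian priors, and the noise and prior scales are assumptions chosen for the toy problem:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy data: a noisy sine curve
m = 20
x = np.linspace(-2.0, 2.0, m)[:, None]
y = np.sin(x[:, 0]) + 0.1 * rng.normal(size=m)

H = 3                # hidden units
dim = 3 * H + 1      # weight vector packs W1, b1, W2, b2

def net(w, x):
    """One-hidden-layer tanh network, weights unpacked from a flat vector."""
    W1, b1, W2, b2 = w[:H], w[H:2*H], w[2*H:3*H], w[3*H]
    return np.tanh(x @ W1[None, :] + b1) @ W2 + b2

def log_post(w, sigma=0.1, tau=2.0):
    """Unnormalized log posterior: Gaussian likelihood times Gaussian prior."""
    resid = y - net(w, x)
    return (-0.5 * np.sum(resid ** 2) / sigma ** 2
            - 0.5 * np.sum(w ** 2) / tau ** 2)

# Random-walk Metropolis over the weight vector
w, lp = np.zeros(dim), log_post(np.zeros(dim))
samples, accepted = [], 0
for _ in range(5000):
    prop = w + 0.05 * rng.normal(size=dim)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept rule
        w, lp, accepted = prop, lp_prop, accepted + 1
    samples.append(w.copy())                  # current state is recorded either way

# Bayesian prediction: average the network's output over posterior samples
posterior = np.array(samples[2500:])          # discard the first half as burn-in
pred = np.mean([net(s, x) for s in posterior], axis=0)
print(posterior.shape, pred.shape, accepted)
```

Averaging predictions over posterior weight samples, rather than committing to one weight vector, is what yields the automatic regularization and the uncertainty quantification mentioned above.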
The software is written in ANSI C, and has been tested on SGI and Sun machines. Full source code is included. This new release is not radically different from the release of a year ago, but it does contain a number of enhancements to both the programs and the documentation, so it is probably worth your while to upgrade if you are using the old version. The new version is not quite upwardly compatible with the old, but converting old scripts should be very easy. You can obtain the software via my home page, at URL http://www.cs.toronto.edu/~radford/ If you have any problems obtaining the software, please contact me at one of the addresses below. --------------------------------------------------------------------------- Radford M. Neal radford at cs.toronto.edu Dept. of Statistics and Dept. of Computer Science radford at stat.toronto.edu University of Toronto http://www.cs.toronto.edu/~radford --------------------------------------------------------------------------- From esann at dice.ucl.ac.be Thu Aug 29 10:45:10 1996 From: esann at dice.ucl.ac.be (esann@dice.ucl.ac.be) Date: Thu, 29 Aug 1996 15:45:10 +0100 Subject: ESANN'97: Call for papers Message-ID: <199608291336.PAA14156@ns1.dice.ucl.ac.be> Dear colleagues, You will find enclosed a short version of the call for papers of ESANN'97, the European Symposium on Artificial Neural Networks, which will be held in Bruges (Belgium), on April 16-18, 1997. The full version can be viewed on the ESANN WWW server: http://www.dice.ucl.ac.be/neural-nets/esann For those of you who maintain WWW pages including lists of related ANN sites: we would appreciate it if you could add the above URL to your list; thank you very much! We try as much as possible to avoid multiple sendings of this call for papers; however, please accept our apologies if you receive this e-mail twice, despite our precautions. 
Sincerely yours, Michel Verleysen --------------------------------------------------- | European Symposium | | on Artificial Neural Networks | | | | Bruges - April 16-17-18, 1997 | | | | First announcement and call for papers | --------------------------------------------------- Scope and topics ---------------- Since its first edition in 1993, the European Symposium on Artificial Neural Networks has become a leading forum for researchers on fundamental and theoretical aspects of artificial neural networks. Each year, around 100 specialists attend ESANN to present their latest results and comprehensive surveys, and to discuss future developments and directions in this field. In 1997, the programme of the conference will be slightly modified. Besides the traditional oral sessions, there will be a few "invited" sessions organized by renowned scientists, and poster sessions. Poster authors will have the opportunity to present their poster orally, with one slide, in one or two minutes. It is important to note that posters will be considered at the same scientific level as oral presentations, the poster format being a more appropriate medium for presenting certain kinds of results. The fifth European Symposium on Artificial Neural Networks will be organized in Bruges, Belgium, in April 1997. The first four editions each gathered between 90 and 110 scientists, coming not only from Western and Eastern Europe, but also from the USA, Japan, Australia, New Zealand, South America... The fifth ESANN symposium will concentrate on fundamental and theoretical aspects of artificial neural networks, and on the links between neural networks and other domains of research, such as statistics, data analysis, biology, psychology, evolutive learning, bio-inspired systems,... 
The following is a non-exhaustive list of topics which will be covered during ESANN'97: * theory * models and architectures * mathematics * learning algorithms * statistical data analysis * self-organization * approximation of functions * Bayesian classification * time series forecasting * vector quantization * independent component analysis * bio-inspired systems * cognitive psychology * biologically plausible artificial networks * formal models of biological phenomena * neurobiological systems * identification of non-linear dynamic systems * adaptive behavior * adaptive control * signal processing * evolutive learning The ESANN'97 conference is organized with the support of the IEEE Region 8, the IEEE Benelux Section, and the Université Catholique de Louvain (UCL, Louvain-la-Neuve, Belgium). Location -------- The conference will be held in Bruges (also called "Venice of the North"), one of the most beautiful medieval towns in Europe. Bruges can be reached by train from Brussels in less than one hour (frequent trains). The town of Bruges is known worldwide for its architectural style, its canals, and its pleasant atmosphere. The conference will be held in a hotel within walking distance of the town center. There is no obligation for participants to stay in this hotel. Hotels of all levels of comfort and price are available in Bruges; it will be possible to book a 3-star or lower-category hotel at a preferential rate through the conference secretariat, and a list of other smaller hotels will be available. Deadlines --------- Submission of papers November 29, 1996 Notification of acceptance January 31, 1997 Symposium April 16-18, 1997 Call for contributions ---------------------- Prospective authors are invited to submit six original copies of their contribution before November 29, 1996. The working language of the conference (including proceedings) is English. 
Papers should not exceed six A4 pages (including figures and references). The printing area will be 12.2 x 19.3 cm (centered on the A4 page); the left, right, top and bottom margins will thus respectively be 4.4, 4.4, 5.2 and 5.2 cm. Complying with these margins and centering the text on the A4 sheets is mandatory: the manuscript will be reproduced in its original size in the proceedings, and the margins will be cut to the book format. Papers not respecting these margins will be photocopied before printing, which strongly reduces their quality in the proceedings. A 10-point Times font will be used for the core of the text; headings will be in bold characters (but not underlined), and will be separated from the main text by two blank lines before and one after. The manuscript will begin with a title (Times 14 point, bold, centered), two blank lines, the names of the authors (Times 10 point, centered), a blank line, their affiliation(s) (Times 9 point, centered), two blank lines, the abstract (Times 9 point, justified), and two blank lines. The maximum width of the header (title, authors, affiliations and abstract) will be 10.2 cm (i.e. the left and right margins will each be 1 cm larger than for the main text). Originals of the figures will be pasted into the manuscript and centered between the margins. The lettering of the figures should be in 10-point Times font size. Figures should be numbered. The legends should also be centered between the margins and be written in 9-point Times font size. The pages of the manuscript will not be numbered (numbering decided by the editor). 
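[Editor's illustration] Authors working in LaTeX can approximate the stated geometry with a minimal preamble; this is a sketch assuming the standard geometry and times packages, not the official ESANN template (the style files on the ESANN server should be preferred):

```latex
\documentclass[10pt]{article}
% A4 is 21.0 x 29.7 cm. A 12.2 x 19.3 cm text block, centered, gives
% (21.0 - 12.2)/2 = 4.4 cm left/right and (29.7 - 19.3)/2 = 5.2 cm
% top/bottom margins, matching the instructions above.
\usepackage[a4paper,textwidth=12.2cm,textheight=19.3cm,centering]{geometry}
\usepackage{times}   % 10-point Times for the body text
\pagestyle{empty}    % pages are not numbered by the authors
\begin{document}
% title, authors, affiliations, abstract, then the main text
\end{document}
```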
We strongly encourage authors to read the full instructions on the WWW server of the conference, and/or to use the LaTeX format available on this server: http://www.dice.ucl.ac.be/neural-nets/esann A separate page (not included in the manuscript) will indicate: * the title of the manuscript * author(s) name(s) * the complete address (including phone & fax numbers and E-mail) of the corresponding author * a list of five keywords or topics On the same page, the authors will copy and sign the following paragraph: "in case of acceptance of the paper for presentation at ESANN 97: - at least one of the authors will register for the conference and will present the paper - the author(s) transfer the copyright of the paper to the organizers of ESANN 97, for the proceedings and any publication that could directly be generated by the conference - if the paper does not match the format requirements for the proceedings, the author(s) will send a revised version within two weeks of the notification of acceptance." Contributions must be sent to the conference secretariat. Examples of camera-ready contributions can be obtained by writing to the same address. Registration fees ----------------- Universities: BEF 15500 (registration before February 1st, 1997), BEF 16500 (after February 1st, 1997). Industries: BEF 19500 (before February 1st, 1997), BEF 20500 (after February 1st, 1997). The registration fee includes attendance at all sessions, the lunches during the three days of the conference, the coffee breaks twice a day, the conference dinner, and the proceedings. To benefit from the reduced registration fee (before February 1st), please use the "advanced registration form" available on the ESANN WWW server or through the conference secretariat. Grants ------ A few grants, covering part of the registration fees, could be offered (depending on the acceptance of a project by the EC) to young scientists from the European Union, and/or from former Central and Eastern European countries. 
Please write to the conference secretariat or refer to the ESANN WWW server for details and availability. Deadline for applications: February 28, 1997. Warning: in no case may an author's participation be conditional on a grant; as indicated above, prospective authors must commit to registering for the conference, even if their application for a grant is not accepted. Conference secretariat ---------------------- Michel Verleysen D facto conference services phone: + 32 2 203 43 63 45 rue Masui Fax: + 32 2 203 42 94 B - 1000 Brussels (Belgium) E-mail: esann at dice.ucl.ac.be http://www.dice.ucl.ac.be/neural-nets/esann Reply form ---------- If you wish to receive the final program of ESANN'97, to notify an address change, or to add one of your colleagues to our database, please send this form to the conference secretariat. ------------------------ cut here ----------------------- ------------------ ESANN'97 reply form ------------------ Name: ................................................. First Name: ............................................ University or Company: ................................. ................................. Address: .............................................. .............................................. .............................................. ZIP: ........ Town: ................................ Country: ............................................... Tel: ................................................... Fax: ................................................... E-mail: ................................................ ------------------------ cut here ----------------------- Please send this form to: D facto conference services 45 rue Masui B - 1000 Brussels e-mail: esann at dice.ucl.ac.be Steering and local committee (to be confirmed) ---------------------------------------------- François Blayo Univ. Paris I (F) Hervé Bourlard FPMS Mons (B) Marie Cottrell Univ. 
Paris I (F) Jeanny Hérault INPG Grenoble (F) Bernard Manderick Vrije Univ. Brussel (B) Eric Noldus Univ. Gent (B) Joos Vandewalle KUL Leuven (B) Michel Verleysen UCL Louvain-la-Neuve (B) Scientific committee (to be confirmed) -------------------------------------- Edoardo Amaldi Cornell Univ. (USA) Agnès Babloyantz Univ. Libre Bruxelles (B) Joan Cabestany Univ. Polit. de Catalunya (E) Holk Cruse Universität Bielefeld (D) Eric de Bodt UCL Louvain-la-Neuve (B) Dante Del Corso Politecnico di Torino (I) Wlodek Duch Nicolaus Copernicus Univ. (PL) Marc Duranton Philips / LEP (F) Jean-Claude Fort Université Nancy I (F) Bernd Fritzke Ruhr-Universität Bochum (D) Karl Goser Universität Dortmund (D) Manuel Grana UPV San Sebastian (E) Martin Hasler EPFL Lausanne (CH) Kurt Hornik Technische Univ. Wien (A) Christian Jutten INPG Grenoble (F) Vera Kurkova Acad. of Science of the Czech Rep. (CZ) Petr Lansky Acad. of Science of the Czech Rep. (CZ) Hans-Peter Mallot Max-Planck Institut (D) Eddy Mayoraz IDIAP Martigny (CH) Jean Arcady Meyer Ecole Normale Supérieure Paris (F) José Mira Mira UNED (E) Pietro Morasso Univ. of Genoa (I) Jean-Pierre Nadal Ecole Normale Supérieure Paris (F) Erkki Oja Helsinki University of Technology (FIN) Gilles Pagès Université Paris VI (F) Hélène Paugam-Moisy Ecole Normale Supérieure Lyon (F) Alberto Prieto Universidad de Granada (E) Pierre Puget LETI Grenoble (F) Ronan Reilly University College Dublin (IRE) Tamas Roska Hungarian Academy of Science (H) Jean-Pierre Rospars INRA Versailles (F) Jean-Pierre Royet Université Lyon 1 (F) John Stonham Brunel University (UK) John Taylor King's College London (UK) Vincent Torre Universita di Genova (I) Claude Touzet IUSPIM Marseilles (F) Marc Van Hulle KUL Leuven (B) Christian Wellekens Eurecom Sophia-Antipolis (F) !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! WARNING: new postal code: 1000 Brussels instead of 1210 Brussels WARNING: new phone and fax numbers ! 
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! _____________________________ D facto publications - conference services 45 rue Masui 1000 Brussels Belgium tel: +32 2 203 43 63 fax: +32 2 203 42 94 _____________________________ From jb at uran.informatik.uni-bonn.de Thu Aug 29 06:03:58 1996 From: jb at uran.informatik.uni-bonn.de (Joachim Buhmann) Date: Thu, 29 Aug 96 11:03:58 +0100 Subject: Postdoc position in Neural Computing/Computer Vision Message-ID: <199608291004.LAA02307@retina> *************************************************************************** UNIVERSITY OF BONN, GERMANY COMPUTER SCIENCE INSTITUTE III Postdoctoral position in NEURAL NETWORKS for COMPUTER VISION / AUTONOMOUS ROBOTICS +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ Please forward this call to whoever might be interested in this job offer. Thanks in advance. +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ The COMPUTER SCIENCE INSTITUTE III, UNIVERSITY OF BONN, GERMANY established a research group for autonomous robotics in 1993. The RHINO group, named after the autonomous robot RHINO, addresses the following research questions: active vision; fast scene segmentation; statistical pattern recognition and learning methods; autonomous map building from sonar data and stereo; robot planning and control; and the design of a high-level robot control language, with special emphasis on adaptive neural network methods. We expect to have a postdoctoral position available for one year starting this fall with the possibility of renewal for a second year. Applicants should have research experience in several of the research areas listed above. Furthermore, the applicant must have development experience in a C++/UNIX environment. The position is associated with the research groups of Profs. Buhmann and Cremers. Funding is provided by the European Community under the Training and Mobility of Researchers (TMR) contract VIRGO. 
Applicants must have been residents of a European country (excluding Germany) for the last 18 months. The COMPUTER SCIENCE INSTITUTE III at Bonn has about 30 PhD students, 5 faculty members, and 10 research associates (Postdocs). 4 PhD students, 2 postdocs, and 3 faculty members are associated with the RHINO group. The Institute is headed by Prof. A. B. Cremers. Please send a CV, summary of relevant research interests and of recent publications by ordinary mail or email to: Joachim Buhmann University of Bonn Computer Science Institute III Roemerstr. 164 53117 Bonn, Germany Email: jb at cs.uni-bonn.de http://www-dbv.informatik.uni-bonn.de/ From mmisra at adastra.Mines.EDU Thu Aug 29 19:58:11 1996 From: mmisra at adastra.Mines.EDU (Manavendra Misra) Date: Thu, 29 Aug 1996 17:58:11 -0600 Subject: CS positions Message-ID: <9608291758.ZM3438@adastra.Mines.EDU> Interested connectionists are invited to apply for two new faculty positions at the Colorado School of Mines. One of these is at the Associate/Full professor level, and the other is an entry level Assistant Professor position. Announcements are enclosed below. Manav. ------- Applied Computer Science Associate/Full Professor Applications are invited for an anticipated tenured position in Applied Computer Science at the Associate or Full Professor level for fall 1997. Applicants should have a Ph.D. in Computer Science or a related field; excellence in teaching and research is essential. Preference will be given to candidates in the areas of artificial intelligence, scientific visualization, and high performance computing. Evidence of interest or successful involvement in interdisciplinary collaborative research projects is desirable. 
To apply, send: a) A curriculum vitae, b) Three letters of recommendation, at least one of which addresses teaching ability, and c) A one-page statement describing teaching experience and philosophy, and research interests and aspirations to: Colorado School of Mines Office of Human Resources Applied Computer Science Associate/Full Professor Search #96-081600 1500 Illinois Street Golden, CO 80401-1887 FAX: (303)273-3278 Applications will be considered beginning December 16, 1996, and thereafter until the position is filled. ------ Applied Computer Science Assistant Professor Applications are invited for a tenure-track position in Applied Computer Science at the Assistant Professor level for fall 1997. Applicants should have a Ph.D. in Computer Science or a related field, excellent research accomplishments/potential and a strong commitment to teaching. Preference will be given to candidates in the areas of artificial intelligence, scientific visualization, and high performance computing who have one or more years of postdoctoral experience. To apply, send: a) A curriculum vitae, b) Three letters of recommendation, at least one of which addresses teaching ability, and c) A one-page statement describing teaching experience and philosophy, and research interests and aspirations to: Colorado School of Mines Office of Human Resources Applied Computer Science Assistant Professor Search #96-081330 1500 Illinois Street Golden, CO 80401-1887 FAX: (303)273-3278 Applications will be considered beginning January 13, 1997, and thereafter until the position is filled. -- ***************************************************************************** Manavendra Misra Dept of Mathematical and Computer Sciences Colorado School of Mines, Golden, CO 80401 Ph. (303)-273-3873 Fax. 
(303)-273-3875 Home messages/fax : (303)-271-0775 email: mmisra at mines.edu WWW URL: http://www.mines.edu/fs_home/mmisra/ ***************************************************************************** From back at zoo.riken.go.jp Thu Aug 29 22:28:58 1996 From: back at zoo.riken.go.jp (Andrew Back) Date: Fri, 30 Aug 1996 11:28:58 +0900 (JST) Subject: NIPS'96 Workshop - Blind Signal Processing Message-ID: CALL FOR PAPERS NIPS'96 Postconference Workshop BLIND SIGNAL PROCESSING AND THEIR APPLICATIONS (Neural Information Processing Approaches) Snowmass (Aspen), Colorado USA Sat Dec 7th, 1996 A. Cichocki and A. Back Brain Information Processing Group Frontier Research Program RIKEN, Institute of Physical and Chemical Research, Hirosawa 2-1, Saitama 351-01, WAKO-Shi, JAPAN Email: cia at zoo.riken.go.jp, back at zoo.riken.go.jp Fax: (+81) 48 462 4633. URL: http://zoo.riken.go.jp/bip.html Blind Signal Processing is an emerging area of research in neural networks and image/signal processing with many potential applications. It originated in France in the late 1980s, and since then there has continued to be a strong and growing interest in the field. Blind signal processing problems can be classified into three areas: (1) blind signal separation of sources and/or independent component analysis (ICA), (2) blind channel identification and (3) blind deconvolution and blind equalization. OBJECTIVES The main objectives of this workshop are to: Give presentations by experts in the field on the state of the art in this exciting area of research. Compare the performance of recently developed adaptive unsupervised learning algorithms for neural networks. Discuss issues surrounding prospective applications and the suitability of current neural network models. Hence we seek to provide a forum for better understanding the current limitations of neural network models. 
Examine issues surrounding local, online adaptive learning algorithms and their robustness, biological plausibility, and justification. Discuss issues concerning effective computer simulation programs. Discuss open problems and perspectives for future research in this area. In particular, we intend to discuss the following items: 1. Criteria for blind separation and blind deconvolution problems (both for time and frequency domain approaches) 2. Natural (or relative) gradient approach to blind signal processing. 3. Neural networks for blind separation of time delayed and convolved signals. 4. On-line adaptive learning algorithms for blind signal processing with variable learning rate (learning of learning rate). 5. Open problems, e.g. dynamic on-line determination of number of sources (more sources than sensors), influence of noise, robustness of algorithms, stability, convergence, identifiability, non-causal, non-stationary dynamic systems. 6. Applications in different areas of science and engineering, e.g., non-invasive medical diagnosis (EEG, ECG), telecommunication, voice recognition problems, image processing and enhancement. WORKSHOP FORMAT The workshop will be one day in length, combining invited expert speakers with significant group discussion time. We will open up the workshop in a moderated way. The intent here is to permit a free-flowing, but productive discourse on the topics relevant to this area. Participants will be encouraged to consider the implications of the current findings in their own work, and to raise questions accordingly. We invite and encourage potential participants to "come prepared" for open discussions. 
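[Editor's illustration] The natural (relative) gradient approach in item 2 can be sketched on a toy two-source separation problem. The code below is a generic batch version of the Amari-style update W <- W + mu (I - E[g(y) y^T]) W with g = tanh (suitable for super-Gaussian sources), not code from the workshop; the mixing matrix and all settings are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two independent super-Gaussian (Laplace) sources, linearly mixed.
n = 5000
S = np.vstack([rng.laplace(size=n), rng.laplace(size=n)])
A = np.array([[1.0, 0.6], [0.4, 1.0]])   # unknown mixing matrix
X = A @ S                                 # observed mixtures only

# Natural-gradient ICA: W <- W + mu * (I - E[g(y) y^T]) W, y = W x.
W = np.eye(2)
mu = 0.01
for _ in range(2000):
    Y = W @ X
    G = np.tanh(Y)                        # score nonlinearity
    W += mu * (np.eye(2) - (G @ Y.T) / n) @ W

# At a separating solution, W @ A is close to a scaled permutation:
# each recovered component matches one source up to scale and sign.
P = W @ A
print(np.round(P, 2))
```

Multiplying the gradient by W on the right (the "natural"/"relative" form) makes the update equivariant: convergence behaviour does not depend on how ill-conditioned the unknown mixing matrix is, which is the main practical advantage over the plain stochastic gradient.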
SUBMISSION OF WORKSHOP EXTENDED ABSTRACTS If you would like to contribute, please send an abstract or extended summary as soon as possible to: Andrew Back Laboratory for Artificial Brain Systems, Frontier Research Program RIKEN, Institute of Physical and Chemical Research, Hirosawa 2-1, Saitama 351-01, WAKO-Shi, JAPAN Email: back at zoo.riken.go.jp Phone: (+81) 48 467 9629 Fax: (+81) 48 462 4633. Manuscripts may be sent by email (in PostScript format), air mail, or fax. Important Dates: Submission of abstract deadline: 16 September, 1996 Notification of acceptance: 1 October, 1996 Final paper to be sent by: 30 October, 1996 A set of workshop notes will be produced. Papers accepted for presentation must reach us by 30 October 1996 to be included in the notes. For the format of papers, the usual NIPS style file should be used, with up to 16 pages allowed. Please contact the workshop organizers for further information, or consult the NIPS WWW home page: http://www.cs.cmu.edu/afs/cs.cmu.edu/Web/Groups/NIPS/ From sml at esesparc2.essex.ac.uk Tue Aug 27 13:29:24 1996 From: sml at esesparc2.essex.ac.uk (Lucas S M) Date: Tue, 27 Aug 1996 18:29:24 +0100 (BST) Subject: papers available on high performance OCR Message-ID: Summary: The following two papers discuss recent work on applying scanning n-tuple classifiers to handwritten OCR. The first is a journal paper which gives some background and all the technical details. The second is a paper for a forthcoming conference which includes more up-to-date results and more detailed timing analysis. The main feature of the method is the incredible speed. If we ignore the pre-processing time, we can train the system at a rate of over 20,000 character images per second, and recognise about 1,200 characters per second, on a humble 66 MHz Pentium PC. If we include pre-processing time, then we can still train (recognise) 500 (200) chars per second. 
The fast training and recognition speeds allow the system parameters to be optimised very quickly. The best accuracy reported is 98.3% on the CEDAR hand-written digit test set. This is not quite as good as the best reported in the literature for this data (98.9%, to the best of our knowledge), but offers a significant speed advantage. The following two papers discuss recent work on a simple unified approach to evolving ALL aspects of a neural network, including its learning algorithm (if any). The first uses a grammar-based chromosome, the second uses a set-based chromosome. The latter approach appears particularly promising as a method of part-designing / part-evolving neural networks. ----------------------------------------------------------------------- From mo216 at newton.cam.ac.uk Sun Aug 18 00:38:11 1996 From: mo216 at newton.cam.ac.uk (M. Opper) Date: Sun, 18 Aug 1996 05:38:11 +0100 (BST) Subject: technical reports Message-ID: <199608180438.FAA17296@larmor.newton.cam.ac.uk> Two new papers are now available: Opper, M and D. Haussler (1997) "Worst case prediction over sequences under log loss" in {\em The Mathematics of Information Coding, Extraction and Distribution}, Springer Verlag, Edited by G. Cybenko, D. O'Leary and J. Rissanen. ftp://ftp.cse.ucsc.edu/pub/ml/OHWCpaper.ps (180K postscript) Abstract: We consider the game of sequentially assigning probabilities to future data based on past observations under logarithmic loss. We are not making probabilistic assumptions about the generation of the data, but consider a situation where a player tries to minimize his loss relative to the loss of the (with hindsight) best distribution from a target class for the worst sequence of data. We give bounds on the minimax regret in terms of the metric entropies of the target class with respect to suitable distances between distributions. D. Haussler and M. 
Opper (1997) "Metric Entropy and Minimax Risk in Classification" {\em Lecture Notes in Computer Science: Studies in Logic and Computer Science, a selection of essays in honor of Andrzej Ehrenfeucht} Vol. 1261, 212-235 (1997) Eds. J. Mycielski, G. Rozenberg and A. Salomaa ftp://ftp.cse.ucsc.edu/pub/ml/Andrzejpaper.ps (245k postscript) Abstract: We apply recent results on the minimax risk in density estimation to the related problem of pattern classification. The notion of loss we seek to minimize is an information theoretic measure of how well we can predict the classification of future examples, given the classification of previously seen examples. We give an asymptotic characterization of the minimax risk in terms of the metric entropy properties of the class of distributions that might be generating the examples. We then use these results to characterize the minimax risk in the special case of noisy two-valued classification problems in terms of the Assouad density and the Vapnik-Chervonenkis dimension. From cas-cns at cns.bu.edu Thu Aug 1 09:29:15 1996 From: cas-cns at cns.bu.edu (Boston University - CAS/CNS) Date: Thu, 01 Aug 1996 09:29:15 -0400 Subject: CALL FOR PAPERS: Vision, Recognition, Action Message-ID: <199608011329.JAA05650@cns.bu.edu> CALL FOR PAPERS International Conference on VISION, RECOGNITION, ACTION: NEURAL MODELS OF MIND AND MACHINE May 29--31, 1997 Sponsored by the Center for Adaptive Systems and the Department of Cognitive and Neural Systems Boston University with financial support from the Defense Advanced Research Projects Agency and the Office of Naval Research This conference will include 21 invited lectures and contributed lectures and posters by experts on the biology and technology of how the brain and other intelligent systems see, understand, and act upon a changing world. 
CALL FOR ABSTRACTS: Contributed abstracts by active modelers of vision, recognition, or action in cognitive science, computational neuroscience, artificial neural networks, artificial intelligence, and neuromorphic engineering are welcome. They must be received, in English, by January 31, 1997. Notification of acceptance will be given by February 28, 1997. A meeting registration fee of $35 for regular attendees and $25 for students must accompany each Abstract. See Registration Information below for details. The fee will be returned if the Abstract is not accepted for presentation and publication in the meeting proceedings. Each Abstract should fit on one 8 x 11" white page with 1" margins on all sides, single-column format, single-spaced, Times Roman or similar font of 10 points or larger, printed on one side of the page only. Fax submissions will not be accepted. Abstract title, author name(s), affiliation(s), mailing, and email address(es) should begin each Abstract. An accompanying cover letter should include: Full title of Abstract, corresponding author and presenting author name, address, telephone, fax, and email address. Preference for oral or poster presentation should be noted. (Talks will be 15 minutes long. Posters will be up for a full day. Overhead, slide, and VCR facilities will be available for talks.) Abstracts which do not meet these requirements or which are submitted with insufficient funds will be returned. The original and 3 copies of each Abstract should be sent to: Neural Models of Mind and Machine, c/o Cynthia Bradford, Boston University, Department of Cognitive and Neural Systems, 677 Beacon Street, Boston, MA 02215. The program committee will determine whether papers will be accepted in an oral or poster presentation, or rejected. REGISTRATION INFORMATION: Since seating at the meeting is limited, early registration is recommended. To register, please fill out the registration form below. 
Student registrations must be accompanied by a letter of verification from a department chairperson or faculty/research advisor. If accompanied by an Abstract or if paying by check, mail to: Neural Models of Mind and Machine, c/o Cynthia Bradford, Boston University, Department of Cognitive and Neural Systems, 677 Beacon Street, Boston, MA 02215. If paying by credit card, mail as above, or fax to (617) 353-7755, or email to cindy at cns.bu.edu. The registration fee will help to pay for a reception, 6 coffee breaks, and the meeting proceedings. STUDENT FELLOWSHIPS: A limited number of fellowships for PhD candidates and postdoctoral fellows are available to at least partially defray meeting travel and living costs. The deadline for applying for fellowship support is January 31, 1997. Applicants will be notified by February 28, 1997. Each application should include the applicant's CV, including name; mailing address; email address; current student status; faculty or PhD research advisor's name, address, and email address; relevant courses and other educational data; and a list of research articles. A letter from the listed faculty or PhD advisor on official institutional stationery should accompany the application and summarize how the candidate may benefit from the meeting. Students who also submit an Abstract need to include the registration fee with their Abstract. Reimbursement checks will be distributed after the meeting. Their size will be determined by student need and the availability of funds. TUTORIALS: A day of tutorials will be held on May 28. Details will follow soon. MEETING INFORMATION: For meeting updates, see the web site at http://cns-web.bu.edu/cns-meeting/. 
REGISTRATION FORM (Please Type or Print) Vision, Recognition, Action: Neural Models of Mind and Machine Boston University Boston, Massachusetts May 29--31, 1997 Mr/Ms/Dr/Prof: Name: Affiliation: Address: City, State, Postal Code: Phone and Fax: Email: The registration fee includes the meeting program, reception, coffee breaks, and meeting proceedings. [ ] $35 Regular [ ] $25 Student Method of Payment: Enclosed is a check made payable to "Boston University". Checks must be made payable in US dollars and issued by a US correspondent bank. Each registrant is responsible for any and all bank charges. I wish to pay my fees by credit card (MasterCard, Visa, or Discover Card only). Type of card: Name as it appears on the card: Account number: Expiration date: Signature and date: From sajda at peanut.sarnoff.com Thu Aug 1 14:54:46 1996 From: sajda at peanut.sarnoff.com (Paul Sajda x2961) Date: Thu, 1 Aug 1996 14:54:46 -0400 Subject: Positions in Adaptive Information and Signal Processing Message-ID: <9608011854.AA00702@ironman.sarnoff.com> David Sarnoff Research Center The Computing Sciences Research Group of The David Sarnoff Research Center currently has openings for two technical positions in Adaptive Information and Signal Processing. 1. Member Technical Staff PhD. in Electrical Engineering, Computer Engineering or Computer Science or related discipline. Experience in image and/or signal processing (preferably medical image or signal processing), neural networks, pattern recognition and computer vision. Will conduct research and build and explore commercial opportunities in areas of medical imaging, object recognition/detection, speech recognition, and/or datamining. Special Requirements: Interest in applying neural networks and image understanding to applications in medical image and speech processing. Ambition for helping to develop new commercial neural network/vision business. 
Excellent communication skills and willingness to bring in new business and extend the application domain of the group. Experience in MATLAB, C and C++ programming and the UNIX operating system. 2. Associate Member Technical Staff Bachelor of Science in Electrical Engineering or Computer Engineering or Computer Science. Experience in image and/or signal processing. Responsibilities will include developing and prototyping software for applications in medical imaging, signal processing and datamining. Special Requirements: Experience in MATLAB, C and C++ programming and the UNIX operating system. Background in image and/or signal processing. Experience in neural networks and datamining preferred. Comfortable with building X-based graphical user interfaces. Please send resumes and correspondence via mail to: Dr. Paul Sajda David Sarnoff Research Center CN5300 Princeton, NJ 08543-5300 or preferably by e-mail to: csrg at sarnoff.com Postscript is preferred for e-mail submissions. From becker at curie.psychology.mcmaster.ca Thu Aug 1 22:28:19 1996 From: becker at curie.psychology.mcmaster.ca (Sue Becker) Date: Thu, 1 Aug 1996 22:28:19 -0400 (EDT) Subject: REGISTRATION FOR NIPS*96 Message-ID: REGISTRATION FOR NIPS*96 Neural Information Processing Systems Tenth Annual Conference Monday December 2 - Saturday December 7, 1996 Denver, Colorado The NIPS*96 registration brochure is now available online. NIPS*96 is the tenth meeting of an interdisciplinary conference which brings together cognitive scientists, computer scientists, engineers, neuroscientists, physicists, and mathematicians interested in all aspects of neural processing and computation. The conference will include invited talks and oral and poster presentations of refereed papers. The conference is single track and is highly selective. Preceding the main session (Dec. 3-5), there will be one day of tutorial presentations (Dec. 2), both in Denver, Colorado. 
Following will be two days of focused workshops on topical issues at Snowmass, Colorado, a world class ski resort (Dec. 6-7). The registration brochure and other conference information may be retrieved via the World Wide Web at http://www.cs.cmu.edu/Web/Groups/NIPS We expect to offer online registration soon from the NIPS web site. Registration material and other information may also be obtained by writing to: NIPS*96 Registration Conference Consulting Associates 451 N. Sycamore Monticello, IA 52310 fax: (319) 465-6709 (attn: Denise Prull) e-mail: nipsinfo at salk.edu REGISTRATION FEES: Conference (includes Proceedings, Reception, Banquet and 3 Continental Breakfasts) Regular $285.00 ($360.00 after Oct. 31, 1996) Full-time students, with I.D. $100.00 ($150.00 after Oct. 31, 1996) Workshops Regular $150.00 ($200.00 after Oct. 31, 1996) Full-time students, with I.D. $75.00 ($125.00 after Oct. 31, 1996) Tutorials Regular $150.00 Full-time students, with I.D. $50.00 TUTORIAL PROGRAM December 2, 1996 Session I: 09:30-11:30 Mostly statistical methods for language processing Dan Jurafsky, University of Colorado at Boulder From postma at cs.rulimburg.nl Fri Aug 2 09:48:43 1996 From: postma at cs.rulimburg.nl (Eric Postma) Date: Fri, 2 Aug 96 15:48:43 +0200 Subject: geometric phase shift Message-ID: <9608021348.AA02835@bommel.cs.rulimburg.nl> The following paper (draft version) is available from our archive: GEOMETRIC PHASE SHIFT IN A NEURAL OSCILLATOR Eric Postma, Jaap van den Herik and Patrick Hudson Computer Science Department University of Maastricht P.O. Box 616, 6200 MD Maastricht The Netherlands ABSTRACT This paper studies the effect of slow cyclic variation of parameters on the phase of the oscillating Morris-Lecar model. In chemical oscillators it is known that a phase shift, called the geometric phase shift, is observed upon return to an initial point in parameter space. We find geometric phase shifts for a two-parameter variation in the Morris-Lecar model. 
As with the chemical oscillator, the magnitude of the shift is proportional to the area enclosed by the path traced through the parameter space. It is argued that the geometric phase shift may subserve many biological functions. We conclude that the geometric phase shift may be functionally relevant for neural computation. ftp://ftp.cs.rulimburg.nl/pub/papers/postma/gps.ps.Z (or gps.ps) We welcome your comments and suggestions. Eric Postma From mm at santafe.edu Fri Aug 2 18:06:16 1996 From: mm at santafe.edu (Melanie Mitchell) Date: Fri, 2 Aug 1996 16:06:16 -0600 (MDT) Subject: book announcement Message-ID: <199608022206.QAA26640@shabikeschee.santafe.edu> Announcing a new book: ADAPTIVE INDIVIDUALS IN EVOLVING POPULATIONS: MODELS AND ALGORITHMS edited by Richard K. Belew and Melanie Mitchell Proceedings Volume XXVI, Santa Fe Institute Studies in the Sciences of Complexity Addison-Wesley, Reading, MA, 1996 ABOUT THE BOOK The theory of evolution has been most successful explaining the emergence of new species in terms of their morphological traits. Ethologists teach that behaviors, too, qualify as first-class phenotypic features, but evolutionary accounts of behaviors have been much less satisfactory. In part this is because maturational "programs" transforming genotype to phenotype are "open" to environmental influences affected by behaviors. Further, many organisms are able to continue to modify their behavior, i.e., learn, even when fully mature. This creates an even more complex relationship between the genotypic features underlying the mechanisms of maturation and learning and the adapted behaviors ultimately selected. A meeting held at the Santa Fe Institute during the summer of 1993 brought together a small group of biologists, psychologists, and computer scientists with shared interests in questions such as these. This volume consists of approximately two dozen papers that explore interacting adaptive systems from a range of interdisciplinary perspectives. 
About half the articles are classic, seminal references on the subject, ranging from biologists like Lamarck and Waddington to psychologists like Piaget and Skinner. The other papers represent new work by the workshop participants. The role played by mathematical and computational tools, both as models of natural phenomena and as algorithms useful in their own right, is particularly emphasized in these new papers. In all cases the chapters have been augmented by specially written prefaces. In the case of the reprinted classics, the prefaces help to put the older papers in a modern context. For the new papers, the prefaces have been written by colleagues from a discipline other than that of the paper's authors, and highlight, for example, what a computer scientist can learn from a biologist's model, or vice versa. Through these cross-disciplinary "dialogues" and a glossary collecting multidisciplinary connotations of pivotal terms, the process of interdisciplinary investigation itself becomes a central theme. ORDERING INFORMATION This series is published by The Advanced Book Program, Addison-Wesley Publishing Company, One Jacob Way, Reading, MA 01867. Please contact your local bookstore or, for credit card orders, call Addison-Wesley Publishing Company at (800)447-2226. For more information on this book, visit the web page: http://www.santafe.edu/sfi/publications/Bookinfo/aiineptofc.html ------------------------------------------------------------------ ADAPTIVE INDIVIDUALS IN EVOLVING POPULATIONS: MODELS AND ALGORITHMS TABLE OF CONTENTS Chapter 1: Introduction - R. K. Belew & M. Mitchell BIOLOGY OVERVIEW Chapter 2: Adaptive Computation in Ecology and Evolution: A Guide to Future Research - J. Roughgarden, A. Bergman, S. Shafir, and C. Taylor REPRINTED CLASSICS Chapter 3: The Classics in Their Context, and in Ours - J. 
Schull Chapter 4: Of the Influence of the Environment on the Activities and Habits of Animals, and the Influence of the Activities and Habits of These Living Bodies in Modifying Their Organisation and Structure - J. B. Lamarck Chapter 5: A New Factor in Evolution - J. M. Baldwin Chapter 6: On Modification and Variation - C. Lloyd Morgan Chapter 7: Canalization of Development and the Inheritance of Acquired Characters - C. H. Waddington Chapter 8: The Baldwin Effect - G. G. Simpson Chapter 9: The Role of Somatic Change in Evolution - G. Bateson NEW WORK Chapter 10: A Model of Individual Adaptive Behavior in a Fluctuating Environment - L. A. Zhivotovsky, A. Bergman, and M. W. Feldman (Preface by R. K. Belew) Chapter 11: The Baldwin Effect in the Immune System: Learning by Somatic Hypermutation - R. Hightower, S. Forrest, and A. S. Perelson (Preface by W. Hart) Chapter 12: The Effect of Memory Length on Individual Fitness in a Lizard - S. Shafir and J. Roughgarden (Preface by M. L. Littman and F. Menczer; Appendix by F. Menczer, W. E. Hart, and M. L. Littman) Chapter 13: Latent Energy Environments - F. Menczer and R. K. Belew (Preface by J. Roughgarden) PSYCHOLOGY OVERVIEW Chapter 14: The Causes and Effects of Evolutionary Simulation in the Behavioral Sciences - P. M. Todd REPRINTED CLASSICS Chapter 15: Excerpts from "Principles of Biology" - H. Spencer (Preface by P. G. Godfrey-Smith) Chapter 16: Excerpts from "Principles of Psychology" - H. Spencer (Preface by P. G. Godfrey-Smith) Chapter 17: William James and the Broader Implications of a Multilevel Selectionism - J. Schull Chapter 18: Excerpts from "The Phylogeny and Ontogeny of Behavior" - B. F. Skinner Chapter 19: Excerpts from "Adaptation and Intelligence: Organic Selection and Phenocopy" - J. Piaget (Preface by O. Miglino & R. K. Belew) Chapter 20: Selective Costs and Benefits of Learning - T. D. Johnston (Preface by P. M. Todd) NEW WORK Chapter 21: Sexual Selection and the Evolution of Learning - P. M. 
Todd (Preface by S. Shafir) Chapter 22: Discontinuity in Evolution: How Different Levels of Organization Imply Preadaptation - O. Miglino, S. Nolfi, and D. Parisi (Preface by M. Mitchell) Chapter 23: The Influence of Learning on Evolution - D. Parisi and S. Nolfi (Preface by W. Hart) COMPUTER SCIENCE OVERVIEW Chapter 24: Computation and the Natural Sciences - R. K. Belew, M. Mitchell, and D. Ackley REPRINTED CLASSICS Chapter 25: How Learning Can Guide Evolution - G. Hinton & S. Nowlan (Preface by M. Mitchell and R. K. Belew) Natural Selection: When Learning Guides Evolution - J. Maynard Smith NEW WORK Chapter 26: Simulations Combining Evolution and Learning - M. L. Littman (Preface by M. Mitchell) Chapter 27: Optimization with Genetic Algorithm Hybrids that Use Local Search - W. Hart & R. K. Belew (Preface by C. Taylor) GLOSSARY INDEX From scott at cpl_mmag.nhrc.navy.mil Thu Aug 1 18:27:33 1996 From: scott at cpl_mmag.nhrc.navy.mil (Scott Makeig) Date: Thu, 1 Aug 1996 15:27:33 -0700 (PDT) Subject: possible postdoctoral positions in biomedical signal processing Message-ID: <199608012227.PAA09986@cpl_mmag.nhrc.navy.mil> POSSIBLE NATIONAL RESEARCH COUNCIL POSTDOCTORAL OPPORTUNITY I anticipate possible funding this autumn of one or two National Research Council postdoctoral Research Associate positions in a Neural Human-System Interface Development project for Online Alertness Monitoring using EEG and video signals. One position would involve design and testing of experiments in use of EEG-based alertness information to improve human-computer interfaces in military and transportation-related settings. An ideal candidate would have a strong background in engineering and signal processing, with interests in cognitive neuroscience and applications to human-computer interface design. A second position would focus on combining time-series analysis, neural networks, and information theory to study applications of independent component analysis to brain imaging and EEG data. 
The ideal candidate would have a strong background in time-series analysis, information theory, and neural networks, with interests in blind separation and biomedical information processing. Interested persons can learn more via www at: http://128.49.52.9. As there is an Aug. 12 deadline for application to the NRC program, immediate inquiries via email are advisable. Scott Makeig makeig at nhrc.navy.mil From cristina at idsia.ch Mon Aug 5 06:08:29 1996 From: cristina at idsia.ch (Cristina Versino) Date: Mon, 5 Aug 96 12:08:29 +0200 Subject: Paper available: Learning Fine Motion by Using the Hierarchical Extended Kohonen Map Message-ID: <9608051008.AA04293@fava.idsia.ch> ``Learning Fine Motion by Using the Hierarchical Extended Kohonen Map'' Cristina Versino and Luca Maria Gambardella IDSIA, Corso Elvezia 36, 6900 Lugano, Switzerland cristina at idsia.ch, luca at idsia.ch A Hierarchical Extended Kohonen Map (HEKM) learns to associate actions to perceptions under the supervision of a planner: they cooperate to solve path finding problems. We argue for the utility of using the hierarchical version of the KM instead of the ``flat'' KM. We measure the benefits of cooperative learning due to the interaction of neighboring neurons in the HEKM. We highlight a beneficial side-effect obtained by transferring motion skill from the planner to the HEKM, namely, smoothness of motion. In Proc. ICANN96, International Conference on Artificial Neural Networks, Bochum, Germany, 17--19 July, pp. 221--226. You can obtain a copy (6 pages, 135K in compressed form) via: 1) netscape ftp://ftp.idsia.ch/pub/cristina/icann96.ps.gz 2) ftp 3) visiting recent IDSIA papers page: http://www.idsia.ch/reports.html From davec at cogs.susx.ac.uk Mon Aug 5 05:19:23 1996 From: davec at cogs.susx.ac.uk (Dave Cliff) Date: Mon, 05 Aug 1996 10:19:23 +0100 Subject: Scholarships for EASy MSc at Sussex, U.K. 
Message-ID: <3205BC9B.6957@cogs.susx.ac.uk> The University of Sussex at Brighton School of Cognitive and Computing Sciences MSc in Evolutionary and Adaptive Systems CYBERLIFE(tm) SCHOLARSHIPS The University of Sussex School of Cognitive and Computing Sciences is pleased to announce the availability of up to three scholarships for the Master of Science in Evolutionary and Adaptive Systems (EASy MSc). The scholarships are funded by CyberLife Ltd, Cambridge, UK. CyberLife Ltd is a subsidiary of Millennium Interactive Ltd, a leading producer of games and entertainment software. CyberLife Ltd conducts research and development of software based on technologies from Artificial Life and Evolutionary and Adaptive Systems. Each scholarship will provide funding for fees and maintenance for the one-year full-time course at a level equivalent to the standard EPSRC postgraduate studentship rate (approx 4800 pounds). Candidates wishing to apply for a CyberLife Scholarship should already hold an offer of a place on the EASy MSc. Candidates who have not yet applied for admission to the EASy MSc should request an application form by contacting either the Postgraduate Office of the University of Sussex or the COGS Graduate Secretary, Linda Thompson. Contact addresses are given at the end of this e-mail. Scholarship candidates will be shortlisted and interviewed at CyberLife's offices in Cambridge, UK. Successful applicants awarded a CyberLife Scholarship will study in Brighton at the University of Sussex from October until April, attending the EASy MSc taught courses. After successful completion of the taught courses, CyberLife scholars will transfer to Cambridge, working on their summer research project at CyberLife's offices until the completion of their Master's thesis at the end of August. 
The scholars remain students of the University throughout this period, and a member of Sussex faculty will make visits to Cambridge for supervisory meetings in addition to maintaining contact via the internet. The Master's thesis should address a topic of interest to CyberLife Ltd's current research directions. Sample topic areas are listed below. A copy of this e-mail, with a bibliography of relevant papers, is available on the world wide web at: http://www.cogs.susx.ac.uk/lab/adapt/msc_schol.html RESEARCH TOPICS (1) Group Behaviour Application of Evolutionary and Adaptive Systems techniques to the genre of games and entertainment software where groups of agents react or interact with other agents, possibly adapting over time. Including cooperative and communicative behavior. (2) Artificial Biochemistry CyberLife Ltd have developed techniques for sensory-motor control in autonomous software agents based on neural networks which are affected by diffuse "hormones" in the agent's "biochemistry". Research projects exploring artificial biochemistry (e.g. autocatalytic sets) or diffuse modulation of activity in neural networks, or both, are possible. (3) Adaptive Architectures for Real-Time Control In many real-time games and entertainment software systems there is a significant need for control architectures which respond, react, and adapt to the environment (including the playing style of any human users) in real-time. Such architectures could be applied to most game software with real-time interactive elements. Entertainment examples include artificial motor-racing drivers, aeroplane pilots, etc. However, such systems could also possibly be applied to industrial tasks such as traffic-light scheduling for optimal flow-control on roads. (4) Speculator Agents CyberLife Ltd have an interest in developing autonomous software agents that profitably speculate on movements in the prices of stocks and shares, or financial derivatives such as warrants, options, and futures. 
There is particular interest in the development of agents with adaptive risk/reward profiles. (5) Navigation in Arbitrary Domains The task of providing software agents with robust, efficient, and plausible navigation mechanisms for negotiating virtual environments of two or more dimensions remains an open research issue. Research in robot guidance and studies of navigation in animals could be adapted and extended to deliver techniques appropriate for interactive software applications. (6) Strategic Planning Research in reactive planning has been applied to simple video game environments. Further work could be undertaken to extend such techniques to provide advanced techniques for adaptively generating strategies and tactics in a variety of entertainment software applications. The techniques would not necessarily apply to in-game player agents; rather, to the provision of an overall intelligence in strategy games. (7) Procedural Learning Mechanisms for Motor Control The development of techniques for adaptive motor control in software agents which interact with a simulated physical environment, e.g. learning to walk. Possibly adapting existing research on walking robots. (8) Genetic Encoding and Morphogenesis Using genetic algorithms to develop artificial neural networks requires that the multi-dimensional network architecture is encoded as a linear string of characters (the "genome"). The process which maps from a genome to a network architecture is often referred to as "morphogenesis". Recently, a number of researchers have addressed the problem of developing encodings and associated morphogenesis techniques which are robust with respect to genetic operators such as mutation and crossover, and expressive (e.g. allowing for repeated structures and modular designs). CONTACTS -------- Application forms for entry to the EASy MSc in October 1996 are available from: Postgraduate Admissions Office Sussex House University of Sussex Brighton BN1 9RH England, U.K. 
Tel: +44 (0)1273 678412 Email: PG.Admissions at admin.susx.ac.uk Applicants for the CyberLife Scholarships should register their interest in writing (letter or email) to: Linda Thompson (COGS Graduate Admissions Secretary) Cognitive and Computing Sciences University of Sussex Brighton BN1 9QH England, U.K. Tel: +44 (0)1273 678754 Fax: +44 (0)1273 671320 E-mail: lindat at cogs.susx.ac.uk CyberLife is a trademark of CyberLife Ltd, Quern House, Mill Court, Great Shelford, Cambridge, UK. From bogus@does.not.exist.com Mon Aug 5 08:23:47 1996 From: bogus@does.not.exist.com () Date: Mon, 5 Aug 1996 13:23:47 +0100 Subject: EMMCVPR'97 -- Final Call for Papers Message-ID: <9608051323.ZM18689@minster.york.ac.uk> I apologize if you receive multiple copies of this. Please note the following: 1) Deadline for submission of papers moved to SEPTEMBER 23, 1996. 2) Selection of papers published in a SPECIAL ISSUE of the journal PATTERN RECOGNITION. _____________________________________________________________________________ FINAL CALL FOR PAPERS International Workshop on ENERGY MINIMIZATION METHODS IN COMPUTER VISION AND PATTERN RECOGNITION Venice, Italy, May 21-23, 1997 Energy minimization methods represent a fundamental methodology in computer vision and pattern recognition, with roots in such diverse disciplines as Physics, Psychology, and Statistics. Recent manifestations of the idea include Markov random fields, relaxation labeling, various types of neural networks, etc. These techniques are finding application in areas such as early vision, graph matching, motion analysis, visual reconstruction, etc. The aim of this workshop is to consolidate research efforts in this area, and to provide a discussion forum for researchers and practitioners interested in this important yet diverse subject. The scientific program of the workshop will include the presentation of invited talks and contributed research papers. 
The workshop is sponsored by the International Association for Pattern Recognition (IAPR) and organized by the Department of Applied Mathematics and Computer Science of the University of Venice "Ca' Foscari." Topics Papers covering (but not limited to) the following topics are solicited: Theory: (e.g., Bayesian contextual methods, biology-inspired methods, discrete optimization, information theory and statistics, learning and parameter estimation, Markov random fields, neural networks, relaxation processes, statistical mechanics approaches, stochastic methods, variational methods) Methodology: (e.g., deformable models, early vision, matching, motion, object recognition, shape, stereo, texture, visual organization) Applications: (e.g., character and text recognition, face processing, handwriting, medical imaging, remote sensing) Program co-chairs Marcello Pelillo, University of Venice, Italy Edwin R. Hancock, University of York, UK Program committee Davi Geiger, New York University, USA Anil K. Jain, Michigan State University, USA Josef Kittler, University of Surrey, UK Stan Z. Li, Nanyang Technological University, Singapore Jean-Michel Morel, Universite' Paris Dauphine, France Maria Petrou, University of Surrey, UK Anand Rangarajan, Yale University, USA Sergio Solimini, Polytechnic of Bari, Italy Alan L. Yuille, Harvard University, USA Josiane Zerubia, INRIA, France Steven W. Zucker, McGill University, Canada Invited speakers Anil K. Jain, Michigan State University, USA Josef Kittler, University of Surrey, UK Alan L. Yuille, Harvard University, USA Steven W. Zucker, McGill University, Canada Venue The workshop will be held at the University of Venice "Ca' Foscari." The lecture theater will be in the historic center of Venice, and accommodation will be provided in nearby hotels. 
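As orientation for readers new to the workshop's theme, here is a minimal sketch of the energy-minimization idea using Iterated Conditional Modes (ICM) on a one-dimensional Ising-style labeling problem. The function name, energy form, and parameter choices are our own illustrative assumptions, not taken from any workshop paper:

```python
def icm_denoise(y, beta=1.0, lam=2.0, n_iter=10):
    """Iterated Conditional Modes on a 1-D chain of +/-1 labels x given
    noisy +/-1 observations y, greedily minimizing the energy
    E(x) = -lam * sum_i x_i y_i - beta * sum_i x_i x_{i+1}."""
    x = list(y)            # initialize labels from the observations
    n = len(x)
    for _ in range(n_iter):
        changed = False
        for i in range(n):
            # Each unit takes the sign of its local field (data term plus
            # smoothness terms from the two chain neighbours).
            h = lam * y[i]
            if i > 0:
                h += beta * x[i - 1]
            if i < n - 1:
                h += beta * x[i + 1]
            new = 1 if h >= 0 else -1
            if new != x[i]:
                x[i], changed = new, True
        if not changed:    # no update possible: a local minimum of E
            break
    return x

# An isolated flipped label is restored by its neighbours:
# icm_denoise([1, 1, -1, 1, 1], beta=1.0, lam=0.5) -> [1, 1, 1, 1, 1]
```

ICM is the simplest, purely greedy member of the family; the stochastic and relaxation methods solicited above trade its speed for better escape from local minima.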
Submission procedure Prospective authors should submit four copies of their contribution(s) by September 23, 1996 to: Marcello Pelillo (EMMCVPR'97) Dipartimento di Matematica Applicata e Informatica Universita' "Ca' Foscari" di Venezia Via Torino 155, 30173 Venezia Mestre, Italy E-mail: pelillo at dsi.unive.it The manuscripts submitted should be no longer than 15 pages, and the cover page should contain: title, author's name, affiliation and address, e-mail address, fax and telephone number, and an abstract no longer than 200 words. In case of joint authorship, the first name will be used for correspondence unless otherwise requested. All manuscripts will be reviewed by at least two members of the program committee. Accepted papers will appear in the proceedings, which are expected to be published in the series Lecture Notes in Computer Science by Springer-Verlag and will be distributed to all participants at the workshop. In order to get a high-quality book with a uniform and professional appearance, prospective authors are strongly encouraged to use the LaTeX style file available at the WWW site indicated below. Important dates Extended paper submission deadline: September 23, 1996 Notification of acceptance: December 1996 Camera-ready paper due: February 1997 Homepage Information on the workshop is maintained at http://Dcpu1.cs.york.ac.uk:6666/~adjc/EMMCVPR97.html This page will be updated continuously and will include information on accepted papers and the final program. Special issue of Pattern Recognition Selected papers from the workshop will be published in a special issue of the journal Pattern Recognition scheduled for July 1998. Prospective authors will be asked to submit enhanced papers for review at the workshop. First reviews will be returned by October 1997. The final decisions concerning inclusion will be made in January 1998. 
Concomitant events During the week following EMMCVPR'97, participants will have the opportunity to attend the 3rd International Workshop on Visual Form (IWVF3) to be held in Capri, May 28-30. For additional information please contact any of the co-chairmen Carlo Arcelli (car at imagm.na.cnr.it), Luigi Cordella (cordel at nadis.dis.unina.it), and Gabriella Sanniti di Baja (gsdb at imagm.na.cnr.it), or see http://amalfi.dis.unina.it/IWF3/iwvf3cfp.html From listerrj at helios.aston.ac.uk Wed Aug 7 06:56:32 1996 From: listerrj at helios.aston.ac.uk (Richard Lister) Date: Wed, 07 Aug 1996 11:56:32 +0100 Subject: Course: Neural Computing for Industrial Applications Message-ID: <18020.199608071056@sun.aston.ac.uk> ---------------------------------------------------------------------- Neural Computing for Industrial Applications: An Intensive Hands-on Course 23-25 September 1996 Aston University Birmingham The Neural Computing Research Group at Aston University will be running the course "Neural Computing for Industrial Applications - An Intensive Hands-on Course" at Aston University between 23-25 September 1996. The course is aimed at applications developers as well as technical managers in industry and commerce. It will also be of direct relevance to practitioners in universities and research laboratories. The course will focus on a principled, rather than ad-hoc, approach to neural networks, providing the main tools to enable their successful application in real-world problems. It combines lectures with supervised laboratory sessions and aims to provide participants with a coherent picture of the foundations of neural computing, as well as a deep understanding of many practical issues arising in their application to commercial tasks. 
The lectures will take the student, step by step, through the process of applying neural networks to commercial tasks, including: data preparation, choice of adequate configuration and cost function, training, methods of performance improvement, and validation. The various development steps will be demonstrated on representative regression and classification commercial tasks, emphasising their relevance to real-world problems. Lectures will cover both basic and advanced material, ranging from neural network architectures and training methods to advanced Bayesian methods and stochastic Monte Carlo techniques for tackling the difficulties of missing data, definition of error bars and model selection. Small-group laboratory sessions will follow the lectures, providing a demonstration of methods and techniques taught in class and first-hand experience of their advantages and drawbacks for commercial applications. In addition, the course will provide hands-on experience in developing effective solutions to complex and challenging problems using the Netlab software developed at Aston. Who should attend ----------------- This course is aimed at applications developers as well as technical managers in industry and commerce. It will also be of direct relevance to practitioners in universities and research laboratories. Benefits -------- The course will provide hands-on experience in developing effective solutions to complex and challenging problems using the Netlab software developed at Aston. Participants will receive a complimentary copy of the Netlab software together with the Matlab simulation environment. They will also receive lecture notes, laboratory manuals, and a complimentary copy of the new textbook "Neural Networks for Pattern Recognition". Laboratory sessions ------------------- The course includes four practical sessions designed to complement and reinforce the material presented during the lectures. 
These will make use of commercial and industrial data sets and will be based on the Netlab neural network simulation system running on modern Pentium PCs under Microsoft Windows. Course summary --------------- The course begins with registration and a course dinner on Sunday 22 September and ends at 5.00pm on Wednesday 25 September. Day 1 ----- The first day will include a general introduction to Neural Computing from a statistical viewpoint, an introduction to the example data sets used as case studies, data processing, the methodology of developing an application, multi-layer perceptrons and training algorithms. Some of the issues to be examined are * Data preparation Conventional techniques, feature extraction, dealing with missing data, linear regression, PCA and visualisation. * The multi-layer perceptron Basic architecture, using MLP for regression problems. * Training algorithms On-line and batch learning, gradient descent and conjugate gradient techniques, line search and other advanced techniques. A laboratory session for demonstrating and practising data processing techniques introduced in the lectures will also be held in the afternoon, making use of the example data sets introduced earlier. Day 2 ----- After introducing the architecture and training algorithms for Radial Basis Function networks, we will examine methods for monitoring and controlling network performance including various validation and regularisation techniques. The main topics include: * Radial Basis Function Networks Basic architecture, relation to conventional methods and training paradigms. * Generalisation Training, validation and test sets, how to monitor training success. * Model complexity and regularisation The Bayesian approach for controlling model complexity, incorporating prior knowledge, error bars, the evidence procedure and Monte Carlo methods. 
Following the lectures, two laboratory sessions will be held during the second day, demonstrating and practising training of MLP and RBF networks as well as regularisation and validation methods. Day 3 ----- The last day of the course will concentrate on extending the neural networks framework presented for regression tasks to accommodate classification problems. In addition we will discuss practical issues related to using neural networks for commercial problems. * Classification problems Network predictions as probabilities and the Bayesian approach, choice of error functions and activation functions, minimising risk, reject option and imbalanced priors. * Practicalities and diagnostics Measures of performance assessment, error bars and input data distribution, non-stationarity. One laboratory session will be held on the last day, demonstrating the use of MLP and RBF networks in classification tasks as well as practising the use of practical diagnostic methods. Course tutors ------------- Professor Christopher Bishop was formerly the Head of the Applied Neurocomputing Centre at AEA Technology and has developed many successful applications of neural networks in a wide range of domains. He is Chairman of the Neural Computing Applications Forum. Professor David Lowe was previously Leader of the Pattern Processing Group at DRA Malvern, and is currently applying neural networks to problems in electricity load demand forecasting, portfolio optimisation, chemical vapour analysis and the control of internal combustion engines. Dr Ian Nabney worked on applications of neural computing for Logica and is currently Programme Chair of the Neural Computing Applications Forum. He has worked on applications of neural networks to jet engine diagnostics, analysis of satellite radar signals, and control of distillation columns. 
Dr Richard Rohwer has research interests which include the Bayesian and differential geometry views of machine learning, ultra-fast memory-based algorithms, and practical methods for specification of prior knowledge. He works with applications ranging from speech processing to pipeline inspection. Dr David Saad works on the foundations of neural computing from a statistical mechanics perspective with emphasis on learning and model selection, and has developed applications to problems in bar code location and identification. Dr Christopher Williams has developed novel approaches to pattern recognition which extend conventional neural network methods, and also has strong interests in applications to machine vision. Enrolment Details ----------------- Please send your booking form to the address below to reserve a place on the course. Alternatively, you can reserve a place on the course by accessing the enrolment form on our World Wide Web page at http://www.ncrg.aston.ac.uk/. An invoice will be issued upon receipt of this form, and payment should be received by Friday 6th September 1996. Since the course involves laboratory classes, places are strictly limited, so early booking is strongly advised. Please complete one form per delegate. A receipt will be issued upon payment and will be sent together with an acknowledgement. Preparatory course notes will be sent four weeks before the course date. Cancellations ------------- All cancellations must be received in writing. Cancellations made before Friday 6th September 1996 will be subject to an administration fee of UKP50, and cancellations made after this date will be subject to the full amount of the course fee. Should a delegate become unable to attend, a substitution may be made; this must be confirmed in writing. What Payment Includes --------------------- * Three days' attendance on the course including laboratory sessions and lectures. * Free copy of Aston's Netlab neural network software (with documentation).
* Full set of course notes and laboratory manuals. * Free copy of the text book "Neural Networks for Pattern Recognition" by Professor Christopher M Bishop. * Attendance at the Course Dinner on Sunday 22nd September 1996. * Buffet Lunches and refreshments on 23, 24, 25 September. * Three nights Bed and Breakfast Accommodation at the Aston Business School (delegates are free to make their own arrangements and a reduced course fee is available). Evening Meals ------------- Evening meals on 23rd and 24th September can be taken at the Aston Business School at a cost of UKP15 each. Please indicate on the booking form if you would like either of these meals, and include payment with your registration fee. Software -------- Participants will receive a complimentary copy of the Netlab software and will be able to purchase Matlab (which is required to run Netlab) at a special discounted rate. Matlab software is available on PC/MS- Windows, Macintosh and UNIX platforms. Please indicate on the booking form if you are interested in receiving further details about the software. Please complete and return this form to: Miss H E Sondermann Neural Computing for Industrial Applications Neural Computing Research Group Aston University Birmingham B4 7ET ---------------------------------------------------------------------- From async at cs.tu-berlin.de Wed Aug 7 07:30:14 1996 From: async at cs.tu-berlin.de (Stefan M. Rueger) Date: Wed, 7 Aug 1996 13:30:14 +0200 (MET DST) Subject: TR on Decimatable Boltzmann Machines vs. Gibbs Sampling Message-ID: <199608071130.NAA24872@deneb.cs.tu-berlin.de> Technical Report Available DECIMATABLE BOLTZMANN MACHINES VS. GIBBS SAMPLING Stefan M. Rueger, Anton Weinberger, and Sebastian Wittchen Fachbereich Informatik Technische Universitaet Berlin July 1996 Exact Boltzmann learning can be done in certain restricted networks by the technique of decimation. We have enlarged the set of decimatable Boltzmann machines by introducing a new decimation rule. 
We have compared solutions of a probability density estimation problem with decimatable Boltzmann machines to the results obtained by Gibbs sampling in unrestricted (non-decimatable) Boltzmann machines. This technical report is available in compressed Postscript by the following URLs: http://www.cs.tu-berlin.de/~async/www-pub/TR96-29.ps.gz http://www.cs.tu-berlin.de/~async/www-pub/TR96-29-ds.ps.gz (2-on-1-page vers.) Bibtex entry: @TECHREPORT{TU-Berlin-Informatik-96-29, AUTHOR = {Stefan M. R\"uger and Anton Weinberger and Sebastian Wittchen}, TITLE = {Decimatable {B}oltzmann Machines vs. {G}ibbs Sampling}, INSTITUTION={Fachbereich Informatik der Technischen Universit\"at Berlin}, YEAR = {1996}, NUMBER = {96-29} } ------------------------------------------------------------------------------ Stefan M. Rueger http://www.cs.tu-berlin.de/~async Sekr. FR 5-9, Franklinstr. 28/29 async at cs.tu-berlin.de Technische Universitaet Berlin, 10 587 Berlin (+49)30/31422662 ------------------------------------------------------------------------------ From ken at phy.ucsf.edu Wed Aug 7 20:05:23 1996 From: ken at phy.ucsf.edu (Ken Miller) Date: Wed, 7 Aug 1996 17:05:23 -0700 Subject: Paper available: Integration and ISI variability in Cortical Neurons Message-ID: <9608080005.AA20710@coltrane.ucsf.edu> FTP-host: ftp.keck.ucsf.edu FTP-filename: /pub/ken/integration.ps.gz 14 pages The following paper, to appear in Neural Computation, is now available by ftp. The ftp site is ftp://ftp.keck.ucsf.edu/pub/ken/integration.ps.gz This and other papers can also be obtained from our http site: http://www.keck.ucsf.edu/~ken PHYSIOLOGICAL GAIN LEADS TO HIGH ISI VARIABILITY IN A SIMPLE MODEL OF A CORTICAL REGULAR SPIKING CELL Todd W. Troyer and Kenneth D. 
Miller todd at phy.ucsf.edu, ken at phy.ucsf.edu Keck Center for Integrative Neuroscience Sloan Center for Theoretical Neurobiology Departments of Physiology and Otolaryngology University of California, San Francisco San Francisco, CA 94143 ABSTRACT: To understand the interspike interval (ISI) variability displayed by visual cortical neurons (Softky and Koch, 1993), it is critical to examine the dynamics of their neuronal integration as well as the variability in their synaptic input current. Most previous models have focused on the latter factor. We match a simple integrate-and-fire model to the experimentally measured integrative properties of cortical regular spiking cells (McCormick et al., 1985). After setting RC parameters, the post-spike voltage reset is set to match experimental measurements of neuronal gain (obtained from {\em in vitro} plots of firing frequency vs.\ injected current). Examination of the resulting model leads to an intuitive picture of neuronal integration that unifies the seemingly contradictory ``$1/\sqrt{N}$'' and ``random walk'' pictures that have previously been proposed. When ISI's are dominated by post-spike recovery, $1/\sqrt{N}$ arguments hold and spiking is regular; if recovery is negligible so that spiking is triggered by input variance around a steady state, spiking is Poisson. In integrate-and-fire neurons matched to cortical cell physiology, steady state behavior is predominant and ISI's are highly variable at all physiological firing rates and for a wide range of inhibitory and excitatory inputs. From simon.schultz at psy.ox.ac.uk Thu Aug 8 08:30:36 1996 From: simon.schultz at psy.ox.ac.uk (Simon Schultz) Date: Thu, 8 Aug 1996 13:30:36 +0100 (BST) Subject: ITB2 Call For Participation Message-ID: <199608081230.NAA15473@axp2.cns.ox.ac.uk> Call for Participation: Information Theory and the Brain II To be held on the 20-21st of September, Headland Hotel, Newquay, Cornwall, England. 
http://www.mrc-bbc.ox.ac.uk/~itb2/conference.html This is the sequel to the conference held in Stirling, Scotland last year. This year the conference will be held in the Cornish town of Newquay. Newquay is one of the best areas for surfing in Europe, and the surrounding countryside is amongst the most beautiful in Britain. The conference will be held in the spectacular Headland Hotel right next to the famous Fistral Beach, and in mid-September the water is at its warmest, the surf is starting to get larger, and the summer holiday crowds have headed home. It is hoped that pleasant surroundings will help to maintain an informal atmosphere. Organising Committee: Roland Baddeley (Chair) Nick Chater Peter Foldiak Peter Hancock Bruno Olshausen Dan Ruderman Simon Schultz Guy Wallis 21 papers have been accepted for presentation at the conference. While places are limited, there is still room for a number of non-presenting attendees. Anyone who would like to attend the conference but who has not submitted a paper is now encouraged to contact the organisers, either by email at itb2 at psy.ox.ac.uk, or by surface mail to: ITB2 c/o Roland Baddeley, Dept of Psychology, University of Oxford, Oxford, England OX1 3UD Registration will be 40 pounds (about $60 U.S.) with the participants expected to find their own accommodation. This varies in price from as low as 5 pounds (for the most basic) upwards. Accommodation in the summer can be hard to find, but by the 20th most summer holidays have finished and the situation is much better. More information on accommodation, and on travel to Newquay, can be found via the above-mentioned web page. A tentative conference program follows. More details of the conference, and abstracts of the papers, are available via the web page. DAY ONE. Friday 20th September.
Session 1: Applied physiology Adaptive search for most effective stimuli -- Maneesh Sahani Dynamics of receptive fields in the visual system: plasticity of intra-cortical connections -- G. Mato and N. Parga Session 2: Psychological models Modelling the vowel space: Relating a statistical model to results obtained in experimental phonetics -- Matthew Aylett Who needs neural networks when we've got information theory? (or "The emperor's new neural network model") -- John A. Bullinaria Optimal resource allocation for novelty detection - the principle and some experimental MEG support related to a human auditory memory -- Janne Sinkkonen Session 3: Formal analysis of the hippocampus A quantitative model of information processing in CA1 -- Carlo Fulvi Mari, Stefano Panzeri, Edmund Rolls and Alessandro Treves Information-theoretic analysis of hippocampal subfield CA1: Schaffer-collateral connectivity -- Simon Schultz, Stefano Panzeri, Edmund Rolls and Alessandro Treves Session 4: Information, energy and the world The metabolic cost of sensory information -- Simon Laughlin, David O'Carroll and John Anderson Metabolically optimal rate codes and the time course of visual processing -- Roland Baddeley DAY TWO. Saturday 21st September. Session 5: Network models I Information Density and Cortical Magnification Factors -- M D Plumbley Time to learn about objects -- Guy Wallis Stochastic dynamics of a neuron with quantal synaptic noise -- Paul C.
Bressloff and Peter Roper Session 6: Sparse representations Experiments with Low Entropy Neural Networks -- George Harpur and Richard Prager Utilizing Sparse Coding and Metrical Organization of Features for Artificial Object Recognition -- Norbert Kr\"uger, Gabi Peters, Michael P\"otzsch The role of higher-order image statistics in human visual coding -- Mitchell Thomson Session 7: Network models II Quantifying the level of distribution present in a feed-forward neural network -- Antony Browne The Emergence of Dominance Stripes in a Network of Firing Neurons -- S P Luttrell Session 8: Vision The taming of natural intensities by the early visual system of the blowfly -- J.H. van Hateren Optimizing photoreceptor arrays in apposition compound eyes -- Daniel L. Ruderman and Simon B. Laughlin Image coding in primary visual cortex using long-range horizontal collaterals -- Darragh Smyth and W. A. Phillips Session 9: Posters Predicting natural intensities viewed by photoreceptors -- A. van der Schaaf and J.H. van Hateren From inns_www at cns.bu.edu Thu Aug 8 10:18:40 1996 From: inns_www at cns.bu.edu (INNS Web Responses) Date: Thu, 8 Aug 1996 10:18:40 -0400 Subject: WCNN'96 September 15-18 Message-ID: <199608081418.KAA01210@retina.bu.edu> ---------------------------------------------------------------------- NEW INFORMATION! UPDATED AUGUST 5, 1996 WORLD CONGRESS ON NEURAL NETWORKS September 15-18, 1996 TOWN & COUNTRY HOTEL SAN DIEGO, CA, U.S.A. ---------------------------------------------------------------------------- Highlights & Attractions: * Guest speaker: Oliver Sacks, M.D., noted neurologist and author of Awakenings and The Man Who Mistook His Wife for a Hat, will give a Plenary Lecture in the Session on Consciousness and Intentionality, titled "Some Neurological Insights into the Nature of Consciousness," on Tuesday, September 17, 1996.
* Plenary talks by Zeki, Van Essen, Hecht-Nielsen, and Merkle * 20 sessions, 5 special sessions, Special Interest Group Meetings * 9 Short Courses ---------------------------------------------------------------------------- HOW TO REGISTER FOR WCNN'96: WCNN'96 875 King's Highway, Suite 200 Woodbury, NJ 08096-3172 U.S.A. Fax: 609-853-0411 Forms available from the WEB site: http://cns-web.bu.edu/inns/wcnn ---------------------------------------------------------------------------- The following additional information is available on the WCNN'96 web site, http://cns-web.bu.edu/inns/wcnn. * Final Schedule * Short Courses Information and Schedule * A list of ALL INVITED AND SUBMITTED PRESENTATIONS for Monday, September 16; Tuesday, September 17; Wednesday, September 18; and all posters * Registration Information and Forms * Hotel Accommodations Information and Form * Airline Information * Information about San Diego * List of Contributors ---------------------------------------------------------------------------- FOR ADDITIONAL INFORMATION: Telephone: 609-845-5010 E-Mail: meetings at wcnn.ccmail.compuserve.com  From devries at sarnoff.com Thu Aug 8 15:05:49 1996 From: devries at sarnoff.com (Aalbert De Vries x2456) Date: Thu, 8 Aug 96 15:05:49 EDT Subject: FIRST Call for Papers: NNSP*97 Message-ID: <9608081905.AA19168@peanut.sarnoff.com> 1997 IEEE Workshop on Neural Networks for Signal Processing 24-26 September 1997 Amelia Island Plantation Amelia Island, Florida FIRST ANNOUNCEMENT AND CALL FOR PAPERS Thanks to the sponsorship of the IEEE Signal Processing Society and the co-sponsorship of the IEEE Neural Network Council, we are proud to announce the seventh of a series of IEEE Workshops on Neural Networks for Signal Processing. 
Papers are solicited for, but not limited to, the following topics: * Paradigms artificial neural networks, Markov models, fuzzy logic, inference nets, evolutionary computation, nonlinear signal processing, and wavelets * Application areas speech processing, image processing, OCR, robotics, adaptive filtering, communications, sensors, system identification, issues related to RWC, and other general signal processing and pattern recognition * Theories generalization, design algorithms, optimization, parameter estimation, and network architectures * Implementations parallel and distributed implementation, hardware design, and other general implementation technologies Instructions for submitting papers Prospective authors are invited to submit 5 copies of extended summaries of no more than 6 pages. The top of the first page of the summary should include a title, authors' names, affiliations, address, telephone and fax numbers, and email address, if any. Camera-ready full papers of accepted proposals will be published in a hard-bound volume by IEEE and distributed at the workshop. Submissions should be sent to: Dr. Jose C. Principe IEEE NNSP'97 444 CSE Bldg #42 P.O. Box 116130 University of Florida Gainesville, FL 32611 Important Dates: * Submission of extended summary: January 27, 1997 * Notification of acceptance: March 31, 1997 * Submission of photo-ready accepted paper: April 26, 1997 * Advanced registration: before July 1, 1997 Further Information Local Organizer Ms. Sharon Bosarge Telephone: 352-392-2585 Fax: 352-392-0044 e-mail: sharon at ee1.ee.ufl.edu World Wide Web http://www.cnel.ufl.edu/nnsp97/ Organization General Chairs Lee Giles (giles at research.nj.nec.com), NEC Research Nelson Morgan (morgan at icsi.berkeley.edu), UC Berkeley Proceedings Chair Elizabeth J. Wilson (bwilson at ed.ray.com), Raytheon Co.
Publicity Chair Bert DeVries (bdevries at sarnoff.com), David Sarnoff Research Center Program Chair Jose Principe (principe at synapse.ee.ufl.edu), University of Florida Program Committee Les ATLAS Andrew BACK A. CONSTANTINIDES Federico GIROSI Lars Kai HANSEN Allen GORIN Yu-Hen HU Jenq-Neng HWANG Biing-Hwang JUANG Shigeru KATAGIRI Gary KUHN Sun-Yuan KUNG Richard LIPPMANN John MAKHOUL Elias MANOLAKOS Erkki OJA Tomaso POGGIO Mahesan NIRANJAN Volker TRESP John SORENSEN Takao WATANABE Raymond WATROUS Andreas WEIGEND Christian WELLEKENS About Amelia Island Amelia Island is in the extreme northeast of Florida, across the St. Marys River. The island is just 29 miles from Jacksonville International Airport, which is served by all major airlines. About Amelia Island Plantation Amelia Island Plantation is a 1,250-acre resort/paradise that offers something for every traveler. The Plantation offers 33,000 square feet of workable meeting space and a staff dedicated to providing an efficient, yet relaxed atmosphere. The many amenities of the Plantation include 45 holes of championship golf, 23 Har-Tru tennis courts, modern fitness facilities, an award-winning children's program, more than 7 miles of flora-filled bike and jogging trails, 21 swimming pools, diverse accommodations, exquisite dining opportunities, and of course, miles of glistening Atlantic beach front. From jbower at bbb.caltech.edu Sun Aug 11 16:16:15 1996 From: jbower at bbb.caltech.edu (James M. Bower) Date: Sun, 11 Aug 1996 12:16:15 -0800 Subject: J. Comput. Neurosci. Message-ID: Journal of Computational Neuroscience Volume 3, Number 2, July 1996 Paul Bush and Terrence Sejnowski, Inhibition Synchronizes Sparsely Connected Cortical Neurons Within and Between Columns in Realistic Network Models 91 Gary Strangman, Searching for Cell Assemblies: How Many Electrodes Do I Need? 111 Ursula Fuentes, Raphael Ritz, Wulfram Gerstner, and J.
Leo VanHemmen, Vertical Signal Flow and Oscillations in a Three-Layer Model of the Cortex 125 Joshua W. Fost and Gregory A. Clark, Modeling Hermissenda: I. Differential Contributions of IA and IC to Type-B Cell Plasticity 137 Joshua W. Fost and Gregory A. Clark, Modeling Hermissenda: II. Effects of Variations in Type-B Cell Excitability, Synaptic Strength, and Network Architecture 155 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ Journal of Computational Neuroscience Volume 3, Number 3, September 1996 A.E. Sauer, R.B. Driesang, A. Buschges, and U. Bassler, Distributed Processing on the Basis of Parallel and Antagonistic Pathways Simulation of the Femur-Tibia Control System in the Stick Insect 179 R. J. Butera, Jr., J.W. Clark, Jr., and J.H. Byrne, Dissection and Reduction of a Modeled Bursting Neuron 199 Christiane Linster and Remi Gervais, Investigation of the Role of Interneurons and Their Modulation by Centrifugal Fibers in a Neural Model of the Olfactory Bulb 225 David J. Pinto, Joshua C. Brumberg, Daniel J. Simons, and G. Bard Ermentrout, A Quantitative Population Model of Whisker Barrels: Re-Examining the Wilson-Cowan Equations 247 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ Information about the Journal of Computational Neuroscience is available from: http://www.bbb.caltech.edu/JCNS *************************************** James M. 
Bower Division of Biology Mail code: 216-76 Caltech Pasadena, CA 91125 (818) 395-6817 (818) 795-2088 FAX NCSA Mosaic addresses for: laboratory http://www.bbb.caltech.edu/bowerlab GENESIS: http://www.bbb.caltech.edu/GENESIS science education reform http://www.caltech.edu/~capsi From geoff at salk.edu Sun Aug 11 20:25:53 1996 From: geoff at salk.edu (geoff@salk.edu) Date: Sun, 11 Aug 1996 17:25:53 -0700 (PDT) Subject: Postdoc position Message-ID: <199608120025.RAA00613@gauss.salk.edu> GEORGETOWN INSTITUTE FOR COGNITIVE AND COMPUTATIONAL SCIENCES Georgetown University, Washington DC Postdoctoral position in Theoretical / Computational Neuroscience Georgetown University has recently established an interdisciplinary research institute consisting of 16 full-time faculty. The major focus areas are neuroplasticity in development, higher auditory processing and language, and injury and aging. Experimental, computational and brain imaging approaches are all well represented. A postdoctoral position in theoretical / computational neuroscience is available from October 1996 in the lab of Dr Geoffrey J. Goodhill. The lab focuses on neural development and self-organization; particularly the development and plasticity of cortical mappings, areal specification of the cortex, and mechanisms of axon guidance (for more details see http://www.cnl.salk.edu/~geoff). The ideal candidate will have an initial training in a quantitative discipline plus knowledge and experience of neuroscience. Please send a CV, summary of relevant research experience and at least 2 letters of recommendation by post or email to: Dr Geoffrey J. Goodhill The Salk Institute 10010 North Torrey Pines Road La Jolla, CA 92037 Email: geoff at salk.edu From jhf at playfair.Stanford.EDU Mon Aug 12 16:00:26 1996 From: jhf at playfair.Stanford.EDU (Jerome H. Friedman) Date: Mon, 12 Aug 1996 13:00:26 -0700 Subject: Technical Report Available. 
Message-ID: <199608122000.NAA01577@tukey.Stanford.EDU> *** Technical Report Available *** LOCAL LEARNING BASED ON RECURSIVE COVERING Jerome H. Friedman Stanford University (jhf at playfair.stanford.edu) ABSTRACT Local learning methods approximate a global relationship between an output (response) variable and a set of input (predictor) variables by establishing a set of "local" regions that collectively cover the input space, and modeling a different (usually simple) input-output relationship in each one. Predictions are made by using the model associated with the particular region in which the prediction point is most centered. Two widely applied local learning procedures are K-nearest neighbor methods and decision tree induction algorithms (CART, C4.5). The former induce a large number of highly overlapping regions based only on the distribution of training input values. By contrast, the latter (recursively) partition the input space into a relatively small number of highly customized (disjoint) regions using the training output values as well. Recursive covering unifies these two approaches in an attempt to combine the strengths of both. A large number of highly customized overlapping regions are produced based on both the training input and output values. Moreover, the data structure representing this cover permits rapid search for the prediction region given a set of (future) input values. Available by ftp from: "ftp://playfair.stanford.edu/pub/friedman/dart.ps.Z" Note: this postscript does not view properly in some versions of ghostview. It seems to print OK on nearly all postscript printers.
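As background for the abstract's description, the sketch below illustrates the first of the two local learning procedures it names, K-nearest-neighbor regression, in NumPy. The function name `knn_predict` and the toy data are invented for the example; recursive covering itself (the report's contribution) is not reproduced here.

```python
import numpy as np

# Baseline illustration of local learning (not Friedman's algorithm):
# K-nearest-neighbor regression fits a trivial local model -- the mean
# response -- over the region formed by the k training points nearest
# the prediction point.
def knn_predict(X_train, y_train, x_query, k=5):
    dist = np.linalg.norm(X_train - x_query, axis=1)  # distance to each training input
    region = np.argsort(dist)[:k]                     # indices of the k nearest neighbors
    return float(y_train[region].mean())              # simple local model: their mean output

rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(200, 2))  # toy predictor values
y = X[:, 0] ** 2 + X[:, 1]                 # smooth toy response

pred = knn_predict(X, y, np.array([0.0, 0.0]), k=5)  # true response at the origin is 0.0
```

As the abstract notes, these regions depend only on the distribution of training inputs; a tree method like CART would instead carve disjoint regions using the outputs as well.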
From rsun at cs.ua.edu Tue Aug 13 11:58:11 1996 From: rsun at cs.ua.edu (Ron Sun) Date: Tue, 13 Aug 1996 10:58:11 -0500 Subject: IEEE TNN special issue on hybrid systems Message-ID: <9608131558.AA18781@athos.cs.ua.edu> Call For Papers special issue of IEEE Transactions on Neural Networks on ``Neural Networks and Hybrid Intelligent Models: Foundations, Theory, and Applications'' Guest Editors: C. Lee Giles, Ron Sun, Jacek M. Zurada Hybrid systems, the use of other intelligence paradigms with neural networks, are becoming more common and useful. In fact, it can be argued that the success of neural networks has come from their ready incorporation of other information processing approaches, including pattern recognition and statistical inference, as well as symbolic processing. Some systems (especially those incorporating symbolic processing) have been known to some segments of the scientific community as high-level connectionist models. Other systems have been referred to as knowledge insertion and extraction. However, for many applications, there exists little (1) theoretical foundation and (2) engineering methodology for effectively developing hybrid approaches. These two aspects are the topic of this special issue. Manuscripts are solicited in neural networks and hybrid models in the following areas: - Theoretical foundations of hybrid models. Mathematical analysis, theories, critiques, case studies. - Models incorporating other paradigms such as AI symbolic processing, machine learning, fuzzy systems, genetic algorithms, and other intelligent paradigms within neural networks. Techniques, methodologies, and analyses. - Methodology of engineering design of hybrid systems. - Innovative and non-trivial applications of hybrid models (for example, in natural language processing, signal and image processing, pattern recognition, and cognitive modeling). Papers will undergo the standard review procedure of the IEEE Transactions on Neural Networks.
The special issue will appear around November 1997. Prospective authors should submit six (6) copies of the completed manuscript, on or before February 28, 1997, to one of the following three guest editors: Dr. C. Lee Giles NEC Research Institute 4 Independence Way Princeton, NJ 08540, USA Phone: 609-951-2642 Fax 609-951-2482 Email: giles at research.nj.nec.com Web: http://www.neci.nj.nec.com/homepages/giles.html Prof. Ron Sun Department of Computer Science The University of Alabama Tuscaloosa, AL 35487 Phone: (205) 348-6363 Fax: (205) 348-0219 Email: rsun at cs.ua.edu Web: http://cs.ua.edu/faculty/sun/sun.html Prof. Jacek M. Zurada Electrical Engineering Department University of Louisville Louisville, KY 40292, USA Phone: (502) 852-6314 Fax: (502) 852-6807 Email: j.zurada at ieee.org Web: under construction From listerrj at helios.aston.ac.uk Tue Aug 13 12:00:31 1996 From: listerrj at helios.aston.ac.uk (Richard Lister) Date: Tue, 13 Aug 1996 17:00:31 +0100 Subject: Lectureships available Message-ID: <6363.199608131601@sun.aston.ac.uk> ---------------------------------------------------------------------- LECTURESHIPS ------------ Aston University, Birmingham, UK * Full details at http://www.ncrg.aston.ac.uk/ * We are seeking highly motivated academic staff to contribute to research in the general areas of neural computing, pattern recognition, time series analysis, image processing, machine vision or a closely related field. Candidates are expected to have excellent academic qualifications, a strong mathematical background and a proven record of research. Two posts are available. The successful candidates will also be expected to make innovative contributions to graduate-level and undergraduate teaching programmes, and ideally also contribute to industrially funded research programmes and industrial courses. 
They will join the Neural Computing Research Group which currently comprises the following members: Professors: Christopher Bishop David Lowe Visiting Professors: Geoffrey Hinton Edward Feigenbaum David Bounds Lecturers: Richard Rohwer Ian Nabney David Saad Chris Williams Postdoctoral Research Fellows: David Barber Paul Goldberg Neep Hazarika Alan McLachlan Mike Tipping Huaiyu Zhu (5 further posts currently being advertised) Personal Assistant: Hanni Sondermann System Administrator: Richard Lister Research Programmer: Andrew Weaver 20 Research Students Conditions of Service --------------------- The appointment will be for an initial period of three years, with the possibility of subsequent renewal or transfer to a continuing appointment. Initial salary will be within the lecturer A and B range 15,154 to 26,430, and exceptionally up to 29,532 (UK pounds; these salary scales are currently under review). How to Apply ------------ If you wish to be considered for one of these positions, please send a full CV and publications list, together with the names and addresses of 4 referees, to: Hanni Sondermann Neural Computing Research Group Aston University Birmingham B4 7ET, U.K. Tel: +44/0 121 333 4631 Fax: +44/0 121 333 4586 e-mail: H.E.Sondermann at aston.ac.uk Applications may be submitted as postscript files using e-mail, or as hard copy by post. Closing date: 30 August 1996. ---------------------------------------------------------------------- From dwang at cis.ohio-state.edu Wed Aug 14 14:59:00 1996 From: dwang at cis.ohio-state.edu (DeLiang Wang) Date: Wed, 14 Aug 1996 14:59:00 -0400 Subject: Neurocomputing Best Paper Award Message-ID: <199608141859.OAA20887@shirt.cis.ohio-state.edu> Presenting the Neurocomputing best paper award, Vols. 
7-9 (1995) To express our thanks for the support of all contributors to the journal over a concluded volume, our Editorial Board elects one representative, outstanding contribution and grants its author(s) the "Neurocomputing Best Paper Award". Originality, clarity of result presentation, depth, and novelty are some of the properties we are looking for. From 1995 (Vols. 7-9) on, the award has been granted on the basis of all contributions of one year. The winner(s) obtain(s) a corresponding certificate and a publication of free choice from our publisher, Elsevier Science B.V. M. Cannon and J.-J.E. Slotine were granted the Neurocomputing Best Paper Award for their contribution "Space-frequency localized basis function networks for nonlinear systems estimation and control" in Vol. 9(3) (1995), pp. 293-342. Again, thanks for your support. V. David Sanchez A. Editor-in-Chief From otavioc at cogs.susx.ac.uk Thu Aug 15 14:25:30 1996 From: otavioc at cogs.susx.ac.uk (Otavio Augusto Salgado Carpinteiro) Date: Thu, 15 Aug 1996 19:25:30 +0100 (BST) Subject: Thesis available Message-ID: FTP-host: ftp.cogs.susx.ac.uk FTP-filename: /pub/reports/csrp/csrp426.ps.Z The following thesis is available via anonymous ftp. A CONNECTIONIST APPROACH IN MUSIC PERCEPTION Otavio A. S. Carpinteiro email: otavioc at cogs.susx.ac.uk Cognitive Science Research Paper CSRP-426 School of Cognitive & Computing Sciences University of Sussex, Brighton, UK FTP instructions: unix> ftp ftp.cogs.susx.ac.uk [ or ftp 192.33.16.70] login: anonymous password: ftp> cd pub/reports/csrp ftp> binary ftp> get csrp426.ps.Z ftp> bye 117 pages. 422107 bytes compressed, 1195561 bytes uncompressed Paper copies can be ordered from: Celia McInnes (celiam at cogs.susx.ac.uk) School of Cognitive & Computing Sciences University of Sussex Falmer, Brighton, UK.
------------------------------------------------------------------------ ABSTRACT: Little research has been carried out in order to understand the mechanisms underlying the perception of polyphonic music. Perception of polyphonic music involves thematic recognition, that is, recognition of instances of a theme across polyphonic voices, whether they appear unaccompanied, transposed, altered or not. There are many questions still open to debate concerning thematic recognition in the polyphonic domain. One of them, in particular, is the question of whether or not cognitive mechanisms of segmentation and thematic reinforcement facilitate thematic recognition in polyphonic music. This dissertation proposes a connectionist model to investigate the role of segmentation and thematic reinforcement in thematic recognition in polyphonic music. The model comprises two stages. The first stage consists of a supervised artificial neural model to segment musical pieces in accordance with three cases of rhythmic segmentation. The supervised model is trained and tested on sets of contrived patterns, and successfully applied to six musical pieces by J. S. Bach. The second stage consists of an original unsupervised artificial neural model to perform thematic recognition. The unsupervised model is trained and assessed on a four-part fugue by J. S. Bach. The research carried out in this dissertation contributes to two distinct fields. Firstly, it contributes to the field of artificial neural networks. The original unsupervised model encodes and manipulates context information effectively, which enables it to perform sequence classification and discrimination efficiently. It has application in cognitive domains which demand classifying either a set of sequences of vectors in time or sub-sequences within a single large sequence of vectors in time. Secondly, the research contributes to the field of music perception.
The results obtained by the connectionist model suggest, along with other important conclusions, that thematic recognition in polyphony is not facilitated by segmentation, but is facilitated by thematic reinforcement. -- Otavio. +===========================================================================+ | | | | Otavio Augusto Salgado Carpinteiro | Phone: +44 (0) 1273 606755 | | Postgraduate Pigeonholes | ext. 2385 | | School of Cognitive & Computing Sciences | | | University of Sussex | Fax: +44 (0) 1273 671320 | | FALMER - East Sussex | | | BN1 9QH | E-mail: | | England | otavioc at cogs.sussex.ac.uk | | | | +===========================================================================+ From giles at research.nj.nec.com Thu Aug 15 09:49:00 1996 From: giles at research.nj.nec.com (Lee Giles) Date: Thu, 15 Aug 96 09:49:00 EDT Subject: TR on recurrent networks and long-term dependencies Message-ID: <9608151349.AA03234@alta> The following Technical Report is available via the University of Maryland Department of Computer Science and the NEC Research Institute archives: ____________________________________________________________________ HOW EMBEDDED MEMORY IN RECURRENT NEURAL NETWORK ARCHITECTURES HELPS LEARNING LONG-TERM DEPENDENCIES Technical Report CS-TR-3626 and UMIACS-TR-96-28, Institute for Advanced Computer Studies, University of Maryland, College Park, MD 20742 Tsungnan Lin{1,2}, Bill G. Horne{1}, C. Lee Giles{1,3} {1}NEC Research Institute, 4 Independence Way, Princeton, NJ 08540 {2}Department of Electrical Engineering, Princeton University, Princeton, NJ 08540 {3}UMIACS, University of Maryland, College Park, MD 20742 ABSTRACT Learning long-term temporal dependencies with recurrent neural networks can be a difficult problem. It has recently been shown that a class of recurrent neural networks called NARX networks performs much better than conventional recurrent neural networks at learning certain simple long-term dependency problems. 
The intuitive explanation for this behavior is that the output memories of a NARX network can be manifested as jump-ahead connections in the time-unfolded network. These jump-ahead connections can propagate gradient information more efficiently, thus reducing the network's sensitivity to the problem of long-term dependencies. This work gives empirical justification to our hypothesis that similar improvements in learning long-term dependencies can be achieved with other classes of recurrent neural network architectures simply by increasing the order of the embedded memory. In particular, we explore the impact of embedded memory order on learning simple long-term dependency problems in three classes of recurrent neural network architectures: globally recurrent networks, locally recurrent networks, and NARX (output feedback) networks. Comparing the performance of these architectures with different orders of embedded memory on two simple long-term dependency problems shows that all of these classes of network architectures demonstrate significant improvement in learning long-term dependencies when the order of embedded memory is increased. These results can be important to a user comfortable with a specific recurrent neural network architecture, because simply increasing the embedded memory order will make the architecture more robust to the problem of long-term dependency learning. ------------------------------------------------------------------- KEYWORDS: discrete-time, memory, long-term dependencies, recurrent neural networks, training, gradient-descent PAGES: 15 FIGURES: 7 TABLES: 2 ------------------------------------------------------------------- http://www.neci.nj.nec.com/homepages/giles.html http://www.cs.umd.edu/TRs/TR-no-abs.html or ftp://ftp.nj.nec.com/pub/giles/papers/UMD-CS-TR-3626.recurrent.arch.long.term.ps.Z ------------------------------------------------------------------------------------ -- C. 
Lee Giles / Computer Sciences / NEC Research Institute / 4 Independence Way / Princeton, NJ 08540, USA / 609-951-2642 / Fax 2482 www.neci.nj.nec.com/homepages/giles.html == From twilson at afit.af.mil Fri Aug 16 13:15:25 1996 From: twilson at afit.af.mil (Terry Wilson) Date: Fri, 16 Aug 96 13:15:25 -0400 Subject: call for papers Message-ID: <9608161715.AA05694@euclid.afit.af.mil> Applications and Science of Artificial Neural Networks ****************************************************** Call for Papers and Announcement Applications and Science of Artificial Neural Networks Part of SPIE's 1997 International Symposium on Aerospace/Defense Sensing and Controls 21-25 April 1997 Marriott's Orlando World Center Resort and Convention Center (Orlando, Florida USA) The focus of this conference is on real-world applications of artificial neural networks and on recent theoretical developments applicable to current applications. The goal of this conference is to provide a forum for interaction between researchers and industrial/government agencies with information processing requirements. Papers that investigate advantages/disadvantages of artificial neural networks in specific real-world applications will be presented. Papers that clearly state existing problems in information processing that could potentially be solved by artificial neural networks will also be considered. 
Sessions will concentrate on: --- innovative applications of artificial neural networks to solve real-world problems --- comparative performance in applications of target recognition, object recognition, speech processing, speaker identification, speaker normalization, cochannel processing, signal processing in realistic environments, robotics, process control, and image processing --- demonstrations of properties and limitations of existing or new artificial neural networks as shown by or related to an application --- hardware implementation technologies that are either general purpose or application specific --- knowledge acquisition and representation --- biologically inspired visual representation techniques --- decision support systems --- artificial life --- cognitive science --- hybrid systems (fuzzy, neural, genetic) --- neurobiology --- optimization --- sensation and perception --- system identification --- financial applications --- time series analysis and prediction --- pattern recognition --- medical applications --- intelligent control --- robotics --- information warfare applications --- sensation, perception and cognitive neuropsychology. Conference Chair: - Steven K. Rogers, Air Force Institute of Technology Program Committee: - Stanley C. Ahalt, The Ohio State Univ.; - John Franco Basti, Pontifical Gregorian Univ.; - James C. Bezdek, Univ. of West Florida; - Joe R. Brown, Berkom USA; - John Colombi, Dept. of Defense; - Laurene V. Fausett, Florida Institute of Technology; - Michael Georgiopoulos, Univ. of Central Florida; - Joydeep Ghosh, Univ. of Texas/Austin; - Charles W. Glover, Oak Ridge National Lab.; - John B. Hampshire II, Jet Propulsion Lab.; - Richard P. Lippmann, MIT Lincoln Lab.; - Murali Menon, MIT/Lincoln Lab.; - Harley R. Myler, Univ. of Central Florida; - Mary Lou Padgett, Auburn Univ.; - Kevin L. Priddy, Accurate Automation Corp.; - Dennis W. Ruck, Information Warfare Ctr.; - Gregory L. 
Tarr, Air Force Phillips Lab.; - Gary Whittington, Global Web Ltd.; - Rodney G. Winter, Dept of Defense; - Yinglin Yu, South China Univ. IMPORTANT DATES: Abstract Due Date: 9 September 1996 Manuscript Due Date: 24 January 1997 Proceedings of this conference will be published and available at the symposium. ADDITIONAL INFORMATION: * Up-to-the-minute information about the conference is available on the World Wide Web (WWW) at http://www.afit.af.mil/Schools/EN/ENG/LABS/PatternRec/aero97.html * Questions can be sent by E-mail to rogers at afit.af.mil From andre at icmsc.sc.usp.br Mon Aug 19 19:23:02 1996 From: andre at icmsc.sc.usp.br ( Andre Carlos P. de Leon F. de Carvalho ) Date: Mon, 19 Aug 1996 20:23:02 -0300 Subject: SBRN 96 - LAST CALL Message-ID: <199608192323.UAA06533@taba> 3rd Brazilian Symposium on Neural Networks Recife, November 12 - 14, 1996 Sponsored by the Brazilian Computer Society (SBC) Second Call for Papers The Third Brazilian Symposium on Neural Networks will be held at the Federal University of Pernambuco, in Recife (Brazil), from the 12th to the 14th of November, 1996. The SBRN symposia, as they were initially named, have been organized by the interest group in Neural Networks of the Brazilian Computer Society since 1994. The third edition of the meeting follows the very successful organization of the previous events, which brought together the main developments of the area in Brazil and had the participation of many national and international researchers, both as invited speakers and as authors of papers presented at the symposium. Recife is a very pleasant city in the northeast of Brazil, known for its good climate and beautiful beaches, with sunshine throughout almost the whole year. The city, whose name originated from the coral formations in the seaside port and beaches, is in a strategic tourist location in the region and offers a good variety of hotels, both in the city's historic center and at the seaside resort. 
Scientific papers will be analyzed by the program committee. This analysis will take into account originality, significance to the area, and clarity. Accepted papers will be fully published in the conference proceedings. MAJOR TOPICS: The major topics of interest include, but are not limited to: * Biological Perspectives * Theoretical Models * Algorithms and Architectures * Learning Models * Hardware Implementation * Signal Processing * Robotics and Control * Parallel and Distributed Implementations * Pattern Recognition * Image Processing * Optimization * Cognitive Science * Hybrid Systems * Dynamic Systems * Genetic Algorithms * Fuzzy Logic * Applications INTERNATIONAL INVITED SPEAKERS: * "Adaptive Wavelets for Pattern Recognition" by Professor Harold Szu, Director of the Center for Advanced Computer Studies, University of Southwestern Louisiana * "Recurrent Neural Networks: El Dorado or Fort Knox?" by Professor C. Lee Giles, NEC Research Institute and University of Maryland, College Park * "Case-based Reasoning and Neural Networks - a Fruitful Breed?" by Professor Agnar Aamodt, Department of Informatics, University of Trondheim - Norway PROGRAM COMMITTEE: * Teresa Bernarda Ludermir - DI/UFPE * André C. P. L. F. de Carvalho - ICMSC/USP (Chair) * Germano C. Vasconcelos - DI/UFPE * Antônio de Pádua Braga - DELT/UFMG * Díbio Leandro Borges - CEFET/PR * Paulo Martins Engel - II/UFRGS * Ricardo Machado - PUC/Rio * Valmir Barbosa - COPPE/UFRJ * Weber Martins - EEE/UFG ORGANISING COMMITTEE: * Teresa Bernarda Ludermir - DI/UFPE (Chair) * Edson Costa de Barros Carvalho Filho - DI/UFPE * Germano C. Vasconcelos - DI/UFPE * Paulo Jorge Leitão Adeodato - DI/UFPE SUBMISSION PROCEDURE: The symposium seeks contributions to the state of the art and future perspectives of Neural Networks research. Submitted papers must be in Portuguese, English or Spanish. 
The submissions must include the original and three copies of the paper and must follow the format below (Electronic mail and FAX submissions are NOT accepted). The paper must be printed using a laser printer, in two-column format, not numbered, on 8.5 X 11.0 inch (21.7 X 28.0 cm) paper. It must not exceed eight pages, including all figures and diagrams. The font size should be 10 pts, such as Times-Roman or equivalent, with the following margins: right and left 2.5 cm, top 3.5 cm, and bottom 2.0 cm. The first page should contain the paper's title, the complete author(s) name(s), affiliation(s), and mailing address(es), followed by a short (150 words) abstract and a list of descriptive key words. The submission should also include an accompanying letter containing the following information: * Manuscript title * First author's name, mailing address and e-mail * Technical area of the paper Authors may use the Latex files sbrn.tex and sbrn.sty for preparing their manuscripts. The postscript file sbrn.ps is also available. Alternatively, all those files, together with an equivalent file in WORD, can be retrieved by anonymous ftp following the instructions given below: ftp ftp.di.ufpe.br (LOGIN :) anonymous (PASSWORD :) (your email address) cd pub/events/IIISBRN bin get sbrn.tex (or sbrn.doc) get sbrn.sty bye SUBMISSION ADDRESS: Four copies (one original and three copies) must be submitted to: Prof. 
André Carlos Ponce de Leon Ferreira de Carvalho Coordenador do Comitê de Programa - III SBRN Departamento de Ciências de Computação e Estatística ICMSC - Universidade de São Paulo Caixa Postal 668 CEP 13560.070 São Carlos, SP Phone: +55 162 726222 FAX: +55 162 749150 E-mail: IIISBRN at di.ufpe.br IMPORTANT DATES: August 30, 1996 (mailing date): Deadline for paper submission September 30, 1996: Notification of acceptance/rejection November 12-14, 1996: III SBRN ADDITIONAL INFORMATION: * Up-to-the-minute information about the symposium is available on the World Wide Web (WWW) at http://www.di.ufpe.br/~IIISBRN/web_sbrn * Questions can be sent by E-mail to IIISBRN at di.ufpe.br Profa. Teresa Bernarda Ludermir Coordenadora Geral do III SBRN Laboratory of Intelligent Computing (LCI) Departamento de Informatica Universidade Federal de Pernambuco Caixa Postal 7851 CEP 50.732-970 Recife-PE Fone: +55 81 271-8430 FAX: +55 81 271-8438 E-mail: IIISBRN at di.ufpe.br We look forward to seeing you in Recife! From robert at fit.qut.edu.au Wed Aug 21 01:42:18 1996 From: robert at fit.qut.edu.au (Robert Andrews) Date: Wed, 21 Aug 1996 15:42:18 +1000 (EST) Subject: NIPS*96 Rule Extraction W'shop Message-ID: ============================================================= FIRST CALL FOR PAPERS NIPS*96 POST-CONFERENCE WORKSHOP -------------------------------------------- RULE-EXTRACTION FROM TRAINED NEURAL NETWORKS -------------------------------------------- Snowmass (Aspen), Colorado, USA Fri December 6th, 1996 Robert Andrews & Joachim Diederich Neurocomputing Research Centre Queensland University of Technology Brisbane 4001 Queensland, Australia Fax: +61 7 864-1801 E-mail: robert at fit.qut.edu.au E-mail: joachim at fit.qut.edu.au Rule extraction can be defined as the process of deriving a symbolic description of a trained Artificial Neural Network (ANN). 
Ideally the rule extraction process results in a symbolic description which closely mimics the behaviour of the network in a concise and comprehensible form. The merits of including rule extraction techniques as an adjunct to conventional Artificial Neural Network techniques include: a) the provision of a 'User Explanation' capability; b) improvement of the generalisation capabilities of ANN solutions by allowing identification of regions of input space not adequately represented; c) data exploration and the induction of scientific theories by the discovery and explicit statement of previously unknown dependencies and relationships in data sets; d) knowledge acquisition for symbolic AI systems by overcoming the knowledge engineering bottleneck; e) the potential to contribute to the understanding of how symbolic and connectionist approaches to AI can be profitably integrated. An ancillary problem to that of rule extraction from trained ANNs is that of using the ANN for the `refinement' of existing rules within symbolic knowledge bases. The goal in rule refinement is to use a combination of ANN learning and rule extraction techniques to produce a `better' (ie a `refined') set of symbolic rules which can then be applied back in the original problem domain. In the rule refinement process, the initial rule base (ie what may be termed `prior knowledge') is inserted into an ANN by programming some of the weights. The rule refinement process then proceeds in the same way as normal rule extraction viz (1) train the network on the available data set(s); and (2) extract (in this case the `refined') rules - with the proviso that the rule refinement process may involve a number of iterations of the training phase rather than a single pass. The objective of this workshop is to provide a discussion platform for researchers and practitioners interested in all aspects of rule extraction from trained artificial neural networks. 
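As a toy illustration of the basic idea (this sketch is not from the workshop or from any technique named above; the network, its weights, and the extraction routine are all invented), a "pedagogical" black-box approach treats the trained net purely as an oracle and queries it over its binary input space to derive an equivalent rule set:

```python
# Toy "pedagogical" (black-box) rule-extraction sketch: the trained
# network is only queried, never opened up. All weights here are
# hypothetical; a fixed perceptron stands in for a trained ANN.
from itertools import product

def trained_net(x):
    """Stand-in for a trained ANN: a perceptron computing
    'at least two of the three inputs are on'."""
    w, b = [1.0, 1.0, 1.0], -1.5
    return sum(wi * xi for wi, xi in zip(w, x)) + b > 0

def extract_dnf(net, n_inputs):
    """Query the net on all 2^n binary inputs and return the positive
    cases: a (maximally specific) DNF description of its behaviour."""
    return [x for x in product([0, 1], repeat=n_inputs) if net(x)]

rules = extract_dnf(trained_net, 3)
# Each tuple is one conjunctive rule, e.g. (1, 1, 0) reads
# "IF x1 AND x2 AND NOT x3 THEN class = positive".
```

By construction the extracted rules exactly mimic the net on binary inputs, but the exhaustive 2^n querying is precisely the kind of computational-complexity issue raised in the discussion points below; decompositional approaches try to avoid it by reading rules off the weights.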
The workshop will examine current techniques for providing an explanation component for ANNs including rule extraction, extraction of fuzzy rules, rule initialisation and rule refinement. Other topics for discussion include computational complexity of rule extraction algorithms, criteria for assessing rule quality, and issues relating to generalisation differences between the ANN and the extracted rule set. The workshop will also discuss ways in which ANNs and rule extraction techniques may be profitably employed in commercial, industrial, and scientific application areas. The one day workshop will be a mixture of position papers and panel discussions. Papers presented in the mini-conference sessions will be of 20 minutes duration with ample time for questions/discussions afterwards. DISCUSSION POINTS FOR WORKSHOP PARTICIPANTS 1. Decompositional vs. learning approaches to rule-extraction from ANNs - What are the advantages and disadvantages w.r.t. performance, solution time, computational complexity, problem domain, etc.? Are decompositional approaches always dependent on a certain ANN architecture? 2. Rule-extraction from trained neural networks vs. symbolic induction. What are the relative strengths and weaknesses? 3. What are the most important criteria for rule quality? 4. What are the most suitable representation languages for extracted rules? How does the extraction problem vary across different languages? 5. What is the relationship between rule-initialisation (insertion) and rule-extraction? For instance, are these equivalent or complementary processes? How important is rule-refinement by neural networks? 6. Rule-extraction from trained neural networks and computational learning theory. Is generating a minimal rule-set which mimics an ANN a hard problem? 7. Does rule-initialisation result in improved generalisation and faster learning? 8. To what extent are existing extraction algorithms limited in their applicability? How can these limitations be addressed? 9. 
Are there any interesting rule-extraction success stories? That is, problem domains in which the application of rule-extraction methods has resulted in an interesting or significant advance. SUBMISSION OF WORKSHOP EXTENDED ABSTRACTS/PAPERS Authors are invited to submit 3 copies of either an extended abstract or full paper relating to one of the topic areas listed above. Papers should be written in English in single column format and should be limited to no more than eight (8) sides of A4 paper including figures and references. NIPS style files are available at http://www.cs.cmu.edu/afs/cs/project/cnbc/nips/formatting/nips.sty http://www.cs.cmu.edu/afs/cs/project/cnbc/nips/formatting/nips.tex http://www.cs.cmu.edu/afs/cs/project/cnbc/nips/formatting/nips.ps Please include the following information in an accompanying cover letter: Full title of paper, presenting author's name, address, telephone and fax numbers, and e-mail address. Submission deadline is October 7th, 1996, with notification to authors by 31st October, 1996. For further information, inquiries, and paper submissions please contact: Robert Andrews Queensland University of Technology GPO Box 2434 Brisbane Q. 4001. Australia. phone +61 7 864-1656 fax +61 7 864-1969 email robert at fit.qut.edu.au More information about the NIPS*96 workshop series is available from: WWW: http://www.fit.qut.edu.au/~robert/nips96.html From sml at esesparc2.essex.ac.uk Thu Aug 22 09:01:17 1996 From: sml at esesparc2.essex.ac.uk (Lucas S M) Date: Thu, 22 Aug 1996 14:01:17 +0100 (BST) Subject: structuring chromosomes for total neural network evolution Message-ID: subject: structuring chromosomes for total neural network evolution The following two papers discuss recent work on a simple unified approach to evolving ALL aspects of a neural network, including its learning algorithm (if any). The first uses a grammar-based chromosome, the second a set-based chromosome. 
The latter approach appears particularly promising as a method of part-designing, part-evolving neural networks. ----------------------------------------------------------------------- From bruce at bme1.image.uky.edu Thu Aug 22 14:45:12 1996 From: bruce at bme1.image.uky.edu (Eugene Bruce) Date: Thu, 22 Aug 96 14:45:12 EDT Subject: No subject Message-ID: <9608221845.AA01571@image.uky.edu> POSTDOCTORAL POSITION (SENSORIMOTOR INTEGRATION AND DYNAMICAL SYSTEMS) Respiratory Dynamics Lab, University of Kentucky Center for Biomedical Engineering This position is part of an NIH-funded project to identify causes of irregular breathing and apnea. The project involves experimental and computational studies aimed at understanding nonlinear modulation of respiratory rhythm by sensory afferents from the lungs and upper airway. Specific sub-projects include: (1) characterization of vagal deflation receptors in rats from single-unit recordings; (2) analysis of modulation of breathing pattern by upper airway afferents using techniques from nonlinear dynamics, including the development of new theoretical approaches to signal processing; (3) mathematical modelling of the integration of sensory afferents with neural circuits for respiratory pattern formation; (4) experimental analysis and modelling of responses of upper airway and chest wall muscles to transient respiratory stimuli using linear and nonlinear system identification methods. Future work will address the development of an in-vitro brainstem-spinal cord preparation for studying respiratory sensorimotor integration. The ideal applicant will be able to contribute to aspects of both the experimental studies and the modelling or signal analysis efforts. The position is available immediately. More information about the laboratory can be found at the URL http://www.uky.edu/RGS/CBME/bruce.html. Information about related neuroscience activities at the Center is available at http://www.uky.edu/RGS/CBME/CBMENeuralControl.html. 
Additional information about this position is available via email inquiries to bruce at bme1.image.uky.edu, or by telephone (606-257-3774). Applications (curriculum vitae and names of references) may be sent to Dr. Eugene Bruce by email, or by postal mail to Wenner Gren Research Laboratory, University of Kentucky, Rose Street, Lexington, KY 40506-0070. (Posted on 8/22/96.) Eugene Bruce, Ph. D. Center for Biomedical Engineering Univ. of Kentucky BRUCE at BME1.IMAGE.UKY.EDU From dhw at almaden.ibm.com Thu Aug 22 18:33:47 1996 From: dhw at almaden.ibm.com (dhw@almaden.ibm.com) Date: Thu, 22 Aug 1996 15:33:47 -0700 Subject: Paper announcements Message-ID: <9608222233.AA24700@buson.almaden.ibm.com> *** Paper Announcements *** ==================================================================== The following new paper is now available with anonymous ftp to ftp.santafe.edu, in the directory pub/dhw_ftp, under the names BS.ps.Z and BS.ps.Z.encoded. Any comments are welcomed. * COMBINING STACKING WITH BAGGING TO IMPROVE A LEARNING ALGORITHM by David H. Wolpert and William G. Macready Abstract: In bagging \cite{breiman:bagging} one uses bootstrap replicates of the training set \cite{efron:computers, efron.tibshirani:introduction} to improve a learning algorithm's performance, often by tens of percent. This paper presents several ways that stacking \cite{wolpert:stacked,breiman:stacked} can be used in concert with the bootstrap procedure to achieve a further improvement on the performance of bagging for some regression problems. In particular, in some of the work presented here, one first converts a single underlying learning algorithm into several learning algorithms. This is done by bootstrap resampling the training set, exactly as in bagging. The resultant algorithms are then combined via stacking. This procedure can be viewed as a variant of bagging, where stacking rather than uniform averaging is used to achieve the combining. 
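As a minimal sketch of the procedure just described — bootstrap replicates turned into multiple learned models, then combined by stacking instead of uniform averaging. This is not the authors' code: the toy data, the linear level-0 learners, and the held-out stacking set are all assumptions made for illustration.

```python
# Toy bagging-plus-stacking sketch (all data and model choices invented).
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(60, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.standard_normal(60)
X_tr, y_tr = X[:40], y[:40]          # training set (bootstrapped below)
X_st, y_st = X[40:], y[40:]          # held-out set for the stacking fit

def fit_linear(X, y):
    """Least-squares linear fit with intercept: the 'underlying algorithm'."""
    A = np.c_[X, np.ones(len(X))]
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, X):
    return np.c_[X, np.ones(len(X))] @ coef

# Level 0: one model per bootstrap replicate of the training set (bagging).
B = 10
models = []
for _ in range(B):
    idx = rng.integers(0, len(X_tr), size=len(X_tr))   # bootstrap resample
    models.append(fit_linear(X_tr[idx], y_tr[idx]))

# Level 1: stack the level-0 predictions with a least-squares combiner
# fit on held-out data; plain bagging would just average the columns.
P_st = np.column_stack([predict(m, X_st) for m in models])
w, *_ = np.linalg.lstsq(P_st, y_st, rcond=None)

bagged = P_st.mean(axis=1)            # uniform averaging (bagging)
stacked = P_st @ w                    # stacked combination
mse_bagged = float(np.mean((bagged - y_st) ** 2))
mse_stacked = float(np.mean((stacked - y_st) ** 2))
```

Fitting and scoring the combiner on the same held-out set keeps the sketch short but flatters stacking; the paper's scheme is more careful about how the combiner's training data are obtained.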
The stacking improves performance over simple bagging by up to a factor of 2 on the tested problems, and never resulted in worse performance than simple bagging. In other work presented here, there is no step of converting the underlying learning algorithm into multiple algorithms, so it is the improve-a-single-algorithm variant of stacking that is relevant. The precise version of this scheme tested can be viewed as using the bootstrap and stacking to estimate the input-dependence of the statistical bias and then correct for it. The results are preliminary, but again indicate that combining stacking with the bootstrap can be helpful. ==================================================================== The following paper has been previously announced. A new version, incorporating major modifications of the original, is now available at ftp.santafe.edu, in pub/dhw_ftp, as estimating.baggings.error.ps.Z or estimating.baggings.error.ps.Z.encoded. The new version shows in particular how the generalization error of a bagged version of a learning algorithm can be estimated with more accuracy than that afforded by using cross-validation on the original algorithm. Any comments are welcomed. * AN EFFICIENT METHOD TO ESTIMATE BAGGING'S GENERALIZATION ERROR by David H. Wolpert and William G. Macready Abstract: In bagging \cite{Breiman:Bagging} one uses bootstrap replicates of the training set \cite{Efron:Stat,BootstrapIntro} to try to improve a learning algorithm's performance. The computational requirements for estimating the resultant generalization error on a test set by means of cross-validation are often prohibitive; for leave-one-out cross-validation one needs to train the underlying algorithm on the order of $m\nu$ times, where $m$ is the size of the training set and $\nu$ is the number of replicates. 
This paper presents several techniques for exploiting the bias-variance decomposition \cite{Geman:Bias, Wolpert:Bias} to estimate the generalization error of a bagged learning algorithm without invoking yet more training of the underlying learning algorithm. The best of our estimators exploits stacking \cite{Wolpert:Stack}. In a set of experiments reported here, it was found to be more accurate than both the alternative cross-validation-based estimator of the bagged algorithm's error and the cross-validation-based estimator of the underlying algorithm's error. This improvement was particularly pronounced for small test sets. This suggests a novel justification for using bagging--- improved estimation of generalization error. ==================================================================== The following paper has been previously announced. A new version, incorporating major modifications of the original, is now available at ftp.santafe.edu, in pub/dhw_ftp, as bias.plus.ps.Z or bias.plus.ps.Z.encoded. The new version contains in particular an analysis of the Friedman effect, discussed in Jerry Friedman's recently announced paper on 0-1 loss. Any comments are welcomed. * ON BIAS PLUS VARIANCE by David H. Wolpert Abstract: This paper presents several additive "corrections" to the conventional quadratic loss bias- plus-variance formula. One of these corrections is appropriate when both the target is not fixed (as in Bayesian analysis) and also training sets are averaged over (as in the conventional bias-plus- variance formula). Another additive correction casts conventional fixed-training-set Bayesian analysis directly in terms of bias-plus-variance. Another correction is appropriate for measuring full generalization error over a test set rather than (as with conventional bias-plus-variance) error at a single point. Yet another correction can help explain the recent counter-intuitive bias-variance decomposition of Friedman for zero-one loss. 
After presenting these corrections this paper then discusses some other loss-function-specific aspects of supervised learning. In particular, there is a discussion of the fact that if the loss function is a metric (e.g., zero-one loss), then there is a bound on the change in generalization error accompanying changing the algorithm's guess from h1 to h2 that depends only on h1 and h2 and not on the target. This paper ends by presenting versions of the bias-plus-variance formula appropriate for logarithmic and quadratic scoring, and then all the additive corrections appropriate to those formulas. All the correction terms presented in this paper are a covariance, between the learning algorithm and the posterior distribution over targets. Accordingly, in the (very common) contexts in which those terms apply, there is not a "bias-variance trade-off", or a "bias-variance dilemma", as one often hears. Rather there is a bias-variance-covariance trade-off. From nin at cns.brown.edu Fri Aug 23 12:07:32 1996 From: nin at cns.brown.edu (Nathan Intrator) Date: Fri, 23 Aug 96 12:07:32 EDT Subject: Paper announcements Message-ID: <9608231607.AA07816@cns.brown.edu> *** Paper Announcements *** The following papers are now available from my research page: http://www.physics.brown.edu/people/nin/research.html Comments are welcomed. ----------------------------------------------------------------------- Classifying Seismic Signals by Integrating Ensembles of Neural Networks Yair Shimshoni and Nathan Intrator ftp://cns.brown.edu/nin/papers/hong-kong.ps.Z This paper proposes a classification scheme based on the integration of multiple Ensembles of ANNs. It is demonstrated on a classification problem in which Seismic recordings of Natural Earthquakes must be distinguished from the recordings of Artificial Explosions. 
A Redundant Classification Environment consisting of several Ensembles of Neural Networks is created and trained on Bootstrap Sample Sets, using various data representations and architectures. The ANNs within the Ensembles are aggregated (as in Bagging) while the Ensembles are integrated non-linearly, in a signal-adaptive manner, using a posterior confidence measure based on the agreement (variance) within the Ensembles. The proposed Integrated Classification Machine achieved 92.1\% correct classification on the seismic test data. Cross Validation evaluations and comparisons indicate that such integration of a collection of ANN Ensembles is a robust way of handling high-dimensional problems with a complex non-stationary signal space, as in the current Seismic Classification problem. To appear: Proceedings of ICONIP 96 ----------------------------------------------------------------------- Learning low dimensional representations of visual objects with extensive use of prior knowledge Nathan Intrator and Shimon Edelman ftp://cns.brown.edu/nin/papers/ml1.ps.Z Learning to recognize visual objects from examples requires the ability to find meaningful patterns in spaces of very high dimensionality. We present a method for dimensionality reduction which effectively biases the learning system by combining multiple constraints via an extensive use of class labels. The use of multiple class labels steers the resulting low-dimensional representation to become invariant to those directions of variation in the input space that are irrelevant to classification; this is done merely by making class labels independent of these directions. We also show that prior knowledge of the proper dimensionality of the target representation can be imposed by training a multiple-layer bottleneck network. 
A series of computational experiments involving parameterized fractal images and real human faces indicates that the low-dimensional representation extracted by our method leads to improved generalization in the learned tasks, and is likely to preserve the topology of the original space. To appear: EXPLANATION-BASED NEURAL NETWORK LEARNING: A LIFELONG LEARNING APPROACH. Editor: SEBASTIAN THRUN ----------------------------------------------------------------------- Bootstrapping with Noise: An Effective Regularization Technique Yuval Raviv and Nathan Intrator ftp://cns.brown.edu/nin/papers/spiral.ps.Z Bootstrap samples with noise are shown to be an effective smoothness and capacity control technique for training feed-forward networks and for other statistical methods such as generalized additive models. It is shown that noisy bootstrap performs best in conjunction with weight decay regularization and ensemble averaging. The two-spiral problem, a highly non-linear, noise-free data set, is used to demonstrate these findings. The combination of noisy bootstrap and ensemble averaging is also shown to be useful for generalized additive modeling, and is demonstrated on the well-known Cleveland Heart Data \cite{Detrano89}. To appear: Connection Science, Special issue on Combining Estimators. 
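The noisy-bootstrap idea in the Raviv and Intrator abstract above can be sketched in a few lines. This is not the paper's code: a small polynomial fit stands in for a feed-forward network, and the data, sizes, and noise level are all invented for the sketch.

```python
# Toy "bootstrapping with noise" sketch: each ensemble member is trained
# on a bootstrap resample whose inputs are jittered with Gaussian noise,
# and the members' predictions are ensemble-averaged.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=80)                  # toy 1-d inputs
y = np.sin(2 * X) + 0.05 * rng.standard_normal(80)

def noisy_bootstrap_ensemble(X, y, n_members=20, noise=0.1, deg=5):
    """Train one member per noisy bootstrap resample; a degree-`deg`
    polynomial fit stands in for a feed-forward network."""
    members = []
    for _ in range(n_members):
        idx = rng.integers(0, len(X), size=len(X))          # resample
        Xb = X[idx] + noise * rng.standard_normal(len(X))   # jitter inputs
        members.append(np.polyfit(Xb, y[idx], deg))
    return members

members = noisy_bootstrap_ensemble(X, y)

# Ensemble averaging of the members' predictions on a test grid.
X_test = np.linspace(-2, 2, 50)
pred = np.mean([np.polyval(m, X_test) for m in members], axis=0)
```

The input jitter acts as a smoothness control (each member sees a slightly blurred training set), and averaging the members reduces the variance the jitter introduces, which is the combination the abstract reports working best alongside weight decay.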
From jim at stats.gla.ac.uk Fri Aug 23 12:49:16 1996 From: jim at stats.gla.ac.uk (Jim Kay) Date: Fri, 23 Aug 1996 17:49:16 +0100 Subject: TR on Contextually Guided Unsupervised Learning/Multivariate Processors Message-ID: <17769.199608231649@pole.stats.gla.ac.uk> Technical Report Available CONTEXTUALLY GUIDED UNSUPERVISED LEARNING USING LOCAL MULTIVARIATE BINARY PROCESSORS Jim Kay Department of Statistics University of Glasgow Dario Floreano MicroComputing Laboratory Swiss Federal Institute of Technology Bill Phillips Centre for Cognitive and Computational Neuroscience University of Stirling We consider the role of contextual guidance in learning and processing within multi-stream neural networks. Earlier work (Kay & Phillips, 1994, 1996; Phillips et al., 1995) showed how the goals of feature discovery and associative learning could be fused within a single objective, and made precise using information theory, in such a way that local binary processors could extract a single feature that is coherent across streams. In this paper we consider multi-unit local processors, with multivariate binary outputs, that enable a greater number of coherent features to be extracted. Using the Ising model, we define a class of information-theoretic objective functions and also local approximations, and derive the learning rules in both cases. These rules have similarities to, and differences from, the celebrated BCM rule. Local and global versions of Infomax appear as by-products of the general approach, as well as multivariate versions of Coherent Infomax. Focussing on the more biologically plausible local rules, we describe some computational experiments designed to investigate specific properties of the processors and the general approach. The main conclusions are: 1. The local methodology introduced in the paper has the required functionality. 2. Different units within the multi-unit processors learned to respond to different aspects of their receptive fields. 3. 
The units within each processor generally produced a distributed code in which the outputs were correlated, and which was robust to damage; in the special case where the number of units available was only just sufficient to transmit the relevant information, a form of competitive learning was produced. 4. The contextual connections enabled the information correlated across streams to be extracted, and, by improving feature detection with weak or noisy inputs, they played a useful role in short-term processing and in improving generalization. 5. The methodology allows the statistical associations between distributed self-organizing population codes to be learned. This technical report is available in compressed Postscript by anonymous ftp from: ftp.stats.gla.ac.uk or from the following URL: ftp://ftp.stats.gla.ac.uk/pub/jim/NNkfp.ps.Z Some earlier reports and general references are available from the URL: http://www.stats.gla.ac.uk/~jim/nn.html ----------------------------------------------------------------------- Jim Kay jim at stats.gla.ac.uk From nkasabov at commerce.otago.ac.nz Sat Aug 24 18:44:32 1996 From: nkasabov at commerce.otago.ac.nz (Nikola Kasabov) Date: Sat, 24 Aug 1996 10:44:32 -1200 Subject: ICONIP/ANZIIS/ANNES'97 CFP Message-ID: <23DE8694E89@jupiter.otago.ac.nz> ICONIP'97 jointly with ANZIIS'97 and ANNES'97 The Fourth International Conference on Neural Information Processing-- The Annual Conference of the Asian Pacific Neural Network Assembly, jointly with The Fifth Australian and New Zealand International Conference on Intelligent Information Processing Systems, and The Third New Zealand International Conference on Artificial Neural Networks and Expert Systems 24-28 November, 1997 Dunedin/Queenstown, New Zealand In 1997, the annual conference of the Asian Pacific Neural Network Assembly, ICONIP'97, will be held jointly with two other major international conferences in the Asian Pacific Region, the Fifth Australian and New Zealand International 
Conference on Intelligent Information Processing Systems (ANZIIS'97) and the Third New Zealand International Conference on Artificial Neural Networks and Expert Systems (ANNES'97), from 24 to 28 November 1997 in Dunedin and Queenstown, New Zealand. The joint conference will have three parallel streams: Stream1: Neural Information Processing Stream2: Computational Intelligence and Soft Computing Stream3: Intelligent Information Systems and their Applications TOPICS OF INTEREST Stream1: Neural Information Processing * Neurobiological systems * Cognition * Cognitive models of the brain * Dynamical modelling, chaotic processes in the brain * Brain computers, biological computers * Consciousness, awareness, attention * Adaptive biological systems * Modelling emotions * Perception, vision * Learning languages * Evolution Stream2: Computational Intelligence and Soft Computing * Artificial neural networks: models, architectures, algorithms * Fuzzy systems * Evolutionary programming and genetic algorithms * Artificial life * Distributed AI systems, agent-based systems * Soft computing--paradigms, methods, tools * Approximate reasoning * Probabilistic and statistical methods * Software tools, hardware implementation Stream3: Intelligent Information Systems and their Applications * Connectionist-based information systems * Hybrid systems * Expert systems * Adaptive systems * Machine learning, data mining and intelligent databases * Pattern recognition and image processing * Speech recognition and language processing * Intelligent information retrieval systems * Human-computer interfaces * Time-series prediction * Control * Diagnosis * Optimisation * Application of intelligent information technologies in: manufacturing, process control, quality testing, finance, economics, marketing, management, banking, agriculture, environment protection, medicine, geographic information systems, government, law, education, and sport * Intelligent information technologies on the global 
networks HONORARY CHAIR Shun-Ichi Amari, Tokyo University GENERAL CONFERENCE CHAIR Nikola Kasabov, University of Otago LOCAL ORGANIZING COMMITTEE CHAIR: Philip Sallis, University of Otago CONFERENCE ORGANISER Ms Kitty Ko Department of Information Science, University of Otago, PO Box 56, Dunedin, New Zealand phone: +64 3 479 8153, fax: +64 3 479 8311, email: kittyko at commerce.otago.ac.nz CALL FOR PAPERS Papers must be received by 30 May 1997. They will be reviewed by senior researchers in the field, and the authors will be informed of the decision of the review process by 20 July 1997. Accepted papers must be submitted in camera-ready format by 20 August 1997. All accepted papers will be published by IEEE Computer Society Press. As the conference is a multi-disciplinary meeting, papers are required to be comprehensible to a wide rather than a very specialised audience. Papers will be presented at the conference either in an oral or in a poster session. Please submit three copies of the paper written in English on A4-format white paper with one inch margins on all four sides, in two-column format, on not more than 4 pages, single-spaced, in Times or similar font of 10 points, and printed on one side of the page only. Centred at the top of the first page should be the complete title, author(s), mailing and e-mail addresses, followed by an abstract and the text. The covering letter should indicate the stream and the topic of the paper according to the list above. The IEEE Transactions journal LaTeX article style can be used. SPECIAL ISSUES OF JOURNALS AND EDITED VOLUMES Selected papers will be published in special issues of scientific journals. The organising committee is also planning edited volumes containing chapters on the conference topics written by invited conference participants. 
TUTORIALS (24 November) Conference tutorials will be organized to introduce the basics of cognitive modelling, dynamical systems, neural networks, fuzzy systems, evolutionary programming, soft computing, expert systems, hybrid systems, and adaptive systems. Proposals for tutorials are due on 30 May 1997. EXHIBITION Companies and university research laboratories are encouraged to exhibit software and hardware systems they have developed or distribute. STUDENT SESSION Postgraduate students are encouraged to submit papers to this session following the same formal requirements for paper submission. The submitted papers will be published in a separate brochure. SPECIAL EVENTS FOR PRACTITIONERS The New Zealand Computer Society is organising special demonstrations, lectures and materials for practitioners working in the area of information technologies. VENUE (Dunedin/Queenstown) The Conference will be held at the University of Otago, Dunedin, New Zealand. The closing session will be held on Friday, 28 November on a cruise on one of the most beautiful lakes in the world, Lake Wakatipu. The cruise departs from the famous tourist centre Queenstown, about 300 km from Dunedin. Transportation will be provided and there will be a separate discounted price for the cruise. ACCOMMODATION Accommodation has been booked at St Margaret's College, located right on the Campus and 10 minutes from downtown Dunedin. The College offers well-equipped facilities including a library, sports hall, music hall and computers with e-mail connection. Full board (NZ$50) is available during the conference days as well as two days before and after the conference. Accommodation is also available at a range of hotels in the city. TRAVELLING The Dunedin branch of House of Travel, a travel company, is happy to assist with any domestic and international travel arrangements for the Conference delegates. 
They can be contacted through email: travel at es.co.nz, fax: +64 3 477 3806, phone: +64 3 477 3464, or toll free number: 0800 735 737 (within NZ). POSTCONFERENCE EVENTS Following the closing conference cruise, delegates may like to experience the delights of Queenstown, Central Otago, and Fiordland. Travel plans can be coordinated by the Dunedin Visitor Centre (phone: +64 3 474 3300, fax: +64 3 474 3311). IMPORTANT DATES Papers due: 30 May 1997 Proposals for tutorials: 30 May 1997 Notification of acceptance: 20 July 1997 Final camera-ready papers due: 20 August 1997 Registration of at least one author of a paper: 20 August 1997 Early registration: 20 August 1997 CONFERENCE CONTACTS, PAPER SUBMISSIONS, CONFERENCE INFORMATION, REGISTRATION FORMS Conference Secretariat Department of Information Science, University of Otago, PO Box 56, Dunedin, New Zealand; phone: +64 3 479 8142; fax: +64 3 479 8311; email: iconip97 at otago.ac.nz Home page: http://divcom.otago.ac.nz:800/com/infosci/kel/conferen.htm ---------------------------------------------- ICONIP'97 jointly with ANZIIS'97 and ANNES'97 TENTATIVE REGISTRATION PLEASE PRINT Title:________________________________ Surname:______________________________ First Name:___________________________ Position:_____________________________ Organisation:_________________________ Department:___________________________ Address:______________________________ ______________________________________ City:_________________________________ Country:______________________________ Phone:________________________________ Fax:__________________________________ Email:________________________________ Yes/No. Would you attend the conference? Yes/No. Would you submit a paper? Yes/No. Would you attend the closing session on the cruise? Yes/No. Would you like any further information? Please mail a copy of this completed form to: Ms Kitty Ko Department of Information Science University of Otago PO Box 56 Dunedin New Zealand. 
-------------------------------------------------------------- -------------------------------------------------------------------------------- Assoc.Professor Dr Nikola Kasabov phone:+64 3 479 8319 Director of Graduate Studies fax:+64 3 479 8311 Department of Information Science nkasabov at otago.ac.nz University of Otago P.O. Box 56, Dunedin, New Zealand home page http://divcom.otago.ac.nz:800/COM/INFOSCI/KEL/home.htm ------------------------------------------------------------------------------- From josh at vlsia.uccs.edu Fri Aug 23 19:52:30 1996 From: josh at vlsia.uccs.edu (Alspector) Date: Fri, 23 Aug 96 17:52:30 MDT Subject: call for papers, IWANNT*97 Message-ID: <9608232352.AA02678@vlsia.uccs.edu> CALL FOR PAPERS International Workshop on Applications of Neural Networks (and other intelligent systems) to Telecommunications (IWANNT*97) Melbourne, Australia June 9-11, 1997 Organizing Committee General Chair: Josh Alspector, U. of Colorado Program Chair: Rod Goodman, Caltech Publications Chair: Timothy X Brown, U. of Colorado Treasurer: Anthony Jayakumar, Bellcore Publicity: Atul Chhabra, NYNEX Lee Giles, NEC Research Institute Local Arrangements: Adam Kowalczyk, Telstra, Chair Michael Dale, Telstra Andrew Jennings, RMIT Marimuthu Palaniswami, U. of Melbourne Robert Slaviero, Signal Proc. Ass. (& local IEEE liaison) Jacek Szymanski, Telstra Program Committee: Nader Azarmi, British Telecom Miklos Boda, Ericsson Radio Systems Harald Brandt, Ericsson Telecommunications Tzi-Dar Chiueh, National Taiwan U Bruce Denby, U of Versailles Simon Field, Nortel Francoise Fogelman, SLIGOS Marwan A. Jabri, Sydney Univ. Thomas John, SBC S Y Kung, Princeton University Tadashi Sone, ATR Scott Toborg, SBC TRI IEEE Liaison: Steve Weinstein, NEC Conference Administrator: Helen Alspector IWANNT Conference Administrator Univ. of Colorado at Col. Springs Dept. of Elec. & Comp. Eng. P.O. 
Box 7150 Colorado Springs, CO 80933-7150 (719) 593-3351 (719) 593-3589 (fax) neuranet at mail.uccs.edu Dear Colleague: You are invited to an international workshop on applications of neural networks and other intelligent systems to problems in telecommunications and information networking. This is the third workshop in a series that began in Princeton, New Jersey on October 18-20, 1993, and continued in Stockholm, Sweden on May 22-24, 1995. This conference will be at the University of Melbourne on the Monday through Wednesday (June 9 - 11, 1997) just before the Australian Conference on Neural Networks (ACNN) which will be at the same location on June 11 - 13 (Wednesday - Friday). Hardcover proceedings will be available at the workshop. There is further information on the IWANNT home page at: http://ece-www.colorado.edu/~timxb/iwannt.html Suggested topics include: Internet Services Intelligent Agents Database Mining Network Management ATM Networking Wireless Networks Modulation and Coding Techniques Congestion Control Adaptive Equalization Speech Recognition Security Verification Adaptive User Interfaces Language ID/Translation Multimedia Networking Information Filtering Dynamic Routing Propagation Path Loss Modeling Dynamic Frequency Allocation Software Engineering Telecom Market Prediction Fault Identification and Prediction Character Recognition Adaptive Control Data Compression Credit Management Customer Modeling Submissions: Please submit 6 copies of both a 50-word abstract and a 1000-word summary of your paper to arrive in Colorado, USA by Oct. 15, 1996. Mail papers to the conference administrator. Note the following dates: Tuesday, Oct. 15, 1996: Abstract, summary due. Monday, Nov. 25, 1996: Notification of acceptance Monday, Feb. 10, 1997: Camera-ready copy due I hope to see you at the workshop. 
Sincerely, Josh Alspector, General Chair ----------------------------------------------------------- REGISTRATION FORM ___________________________________________________________ International Workshop on Applications of Neural Networks (and other intelligent systems) to Telecommunications (IWANNT*97) Melbourne, Australia June 9-11, 1997 Name: Institution: Mailing Address: Telephone: Fax: E-mail: Make check ($400; $500 after May 1, 1997; $200 students) out to IWANNT*97. Please make sure your name is on the check. Registration includes breaks and proceedings available at the conference. Mail to: Helen Alspector IWANNT Conference Administrator Univ. of Colorado at Col. Springs Dept. of Elec. & Comp. Eng. P.O. Box 7150 Colorado Springs, CO 80933-7150 (719) 593-3351 (719) 593-3589 (fax) neuranet at mail.uccs.edu Site The conference will be held at the University of Melbourne. There are several good hotels within walking distance of the university. More information will be sent to registrants or upon request. From dhw at almaden.ibm.com Fri Aug 23 19:54:39 1996 From: dhw at almaden.ibm.com (dhw@almaden.ibm.com) Date: Fri, 23 Aug 1996 16:54:39 -0700 Subject: Job openings Message-ID: <9608232354.AA22322@buson.almaden.ibm.com> *** Job Announcements. Please distribute. *** The Web is currently dumb. Join our team at IBM net.Mining; we are making the web intelligent. We currently have immediate need to fill positions at our Almaden Research Center facility in the south of Silicon Valley. net.Mining is a sub-organization of IBM Data Mining Solutions, a rapidly expanding group that also has openings (see recent postings). IBM is an equal opportunity employer. Scientific Programmers/ Responsibilities: Interact with the Machine Learning Researchers to implement new web-based algorithms as code, verify the code, and test the algorithms in real world environments. Must be able to work independently. 
Qualifications: Bachelors degree or equivalent in computer science, statistics, mathematics, physics, or an equivalent field. Higher degree highly desirable. Extensive experience implementing numeric code, especially in machine learning, statistics, neural nets, or a similar field. Familiarity with college-level mathematics (multi-variable calculus, differential equations, linear algebra, etc.). Two or more years' experience with C/C++ in a research or commercial environment. Knowledge of Internet technologies highly desirable. Machine Learning Researchers/ Responsibilities: Develop new algorithms applying machine learning and associated technologies to the web. Develop such new technologies. Work with the Scientific Programmers to implement and investigate those algorithms and technologies in the real world. Qualifications include: PhD or equivalent in computer science, statistics, mathematics, physics, or an equivalent field, with an emphasis on machine learning, statistics, neural nets, or a similar field. Strong background in mathematics. Experience with C/C++ highly desirable. Knowledge of information retrieval and/or indexing systems, text mining, and/or knowledge of Internet technologies, all highly desirable. From sandro at parker.physio.nwu.edu Tue Aug 27 12:13:36 1996 From: sandro at parker.physio.nwu.edu (Sandro Mussa-Ivaldi) Date: Tue, 27 Aug 96 11:13:36 CDT Subject: Postdoctoral fellowship - Motor learning Message-ID: <9608271613.AA01631@parker.physio.nwu.edu> ****** POSTDOCTORAL FELLOWSHIP ON MOTOR LEARNING ****** A postdoctoral position is available at the Sensory Motor Performance Program of the Rehabilitation Institute of Chicago (RIC) to work on learning and adaptation of multi-joint arm movements. RIC is a Northwestern University affiliated rehabilitation hospital, with close ties to Northwestern University schools of medicine and engineering. The research is to be carried out with Sandro Mussa-Ivaldi, Ph.D. 
and involves both experimental and theoretical components. The experimental work will be based on the interaction of human subjects with a two-joint robot manipulandum that has recently been built in Mussa-Ivaldi's lab. The manipulandum is controlled by a PC programmed in C++. The theoretical work will involve the representation of the arm's adaptive controller as a combination of non-linear basis functions. The position, which is available immediately, is for one year, but it is expected to be extended for at least a second year. It requires a professional level of knowledge, acquired with a PhD in Engineering or Physics with an emphasis on biomedical research, and a substantial background in Classical Mechanics and Control Theory. Technical proficiency in C++, PC platforms (DOS, Windows 95), real-time programming, and Unix (X) is highly desirable. RIC offers competitive salary and benefits (EOE, M/F/D/V). Applicants should send a CV, a statement of their interests and professional goals (not longer than 1 page), and the names, addresses and telephone numbers of at least two references to Domenica Pappas either via email (dgpappas at casbah.acns.nwu.edu) or via surface mail at the following address: Domenica Pappas Administrative Supervisor Rehabilitation Institute of Chicago 345 East Superior Street Room 1406 Chicago, Illinois 60611 ---------------------------------------------------------- Sandro Mussa-Ivaldi (sandro at nwu.edu) From meyer at wotan.ens.fr Wed Aug 28 05:52:32 1996 From: meyer at wotan.ens.fr (Jean-Arcady MEYER) Date: Wed, 28 Aug 1996 11:52:32 +0200 Subject: SAB96 Registration by August 30th Message-ID: <9608280952.AA12138@eole.ens.fr> !! Note: if you plan to attend SAB96 and have not yet registered, please do so ASAP. Your registration should arrive by August 30th. After that, mail will be delayed due to Labor Day (Sep 2) and your registration may not get processed in time for the conference!! 
***********************CONFERENCE INFORMATION******************************* From Animals to Animats The Fourth International Conference on Simulation of Adaptive Behavior (SAB96) September 9th-13th, 1996 Sea Crest Resort & Conference Center North Falmouth, Massachusetts, USA FULL DETAILS ON THE WEB PAGE: http://www.cs.brandeis.edu/conferences/sab96 GENERAL CONTACT: sab96 at cs.brandeis.edu The objective of this conference is to bring together researchers in ethology, psychology, ecology, artificial intelligence, artificial life, robotics, and related fields so as to further our understanding of the behaviors and underlying mechanisms that allow natural and artificial animals to adapt and survive in uncertain environments. The conference starts with an opening reception on Sunday, September 8th. Technical sessions start in the morning on Monday, September 9th and run until early Friday afternoon. The conference banquet will be held on Thursday evening, September 12th, and tickets for it are sold separately and in advance. Wednesday afternoon is left free for sightseeing. **************************PROGRAM**************************************** The invited speakers are James Albus, Jelle Atema, Daniel Dennett, Randy Gallistel, Scott Kelso, and David Touretzky. The conference will consist of a single track of 35 papers, 30 posters, and several demonstrations. Full details of the schedule will be maintained on the web page. ***********************REGISTRATION IS OPEN************************* Included in full conference registration fees are: * Reception and breaks * Lunches (4 days) * Entry to all technical and poster sessions * Conference proceedings published by MIT Press Please purchase your banquet tickets separately and in advance, as they may not be available at the conference. Members of the International Society for Adaptive Behavior (ISAB) will save $50 on registration fees for this conference. 
In addition, ISAB members receive an annual subscription to Adaptive Behavior, the premier journal of the field, as well as discounts to other ISAB-related meetings. To join ISAB, please visit http://netq.rowland.org/isab/isab.html Early registration (postmarked before June 30, 1996) is recommended as it helps us plan and saves you $50. For the convenience of overseas colleagues, we will process a limited number of MasterCard or VISA registrations. Full-time students and ISAB members should be able to prove their status at the registration desk with an ID card or member number. Please fill out this form and send with your check (in US dollars, made out to "Brandeis University") to: SAB96 Registration c/o Ms. Myrna Fox Computer Science Department Brandeis University 415 South St Waltham, MA 02254 USA Dr__Mr__Ms__ Last Name First Name Middle Title Affiliation Address E-mail Phone Fax FEES (in US dollars) Category EARLY REGISTRATION Regular Total Postmarked by June 30th ISAB Member $250 $300 Non-Member $300 $350 Full-Time Student Include copy of ID $175 $225 Banquet Tickets $40 each Total fees Enclosed In addition, 6-foot tables for publishers and other vendors are available for $250 for the 5 days of the conference. If you register but must cancel, and you notify us before July 8th, you will receive your fees minus a $50 service charge. If you notify us before August 26th, you will receive a refund minus a $100 service charge. After August 26th your fees will not be refundable. ***************************HOTEL************************************ HOTEL DEADLINE FOR BLOCKED ROOMS IS AUGUST 8TH The entire conference will take place at the Sea Crest in North Falmouth on Cape Cod. Participants should fly into Boston Logan Airport and either rent a car or take the Bonanza Shuttle Bus ($30 r/t, see schedule on Web page) to Falmouth, and transfer to the hotel by the hotel shuttle. 
All participants are responsible for making their own reservations by contacting the group reservations office at (800) 225-3110, and must state they are attending the Simulation of Adaptive Behavior Conference in order to receive the special rate. 200 rooms have been set aside for our use, and are being held on a first-come first-served basis until August 8th. Single and double rooms are available for $90 plus tax. Children under 16 may stay in their parents' room for free. The rooms have two double beds; 3 or 4 persons may share a room for $100 and $110 plus tax, respectively. The conference rate also applies to the 3 days before and 3 days after the meeting dates. Each room reservation must be secured with a credit card deposit for one night. If a reservation is cancelled 8 days or more prior to arrival, the deposit is refunded, minus a $10 service charge. If a reservation is cancelled 7 days or less prior to arrival, or the individual does not show up for the specified dates, the reservation will be cancelled and the deposit forfeited. In case of overflow, the Quality Inn (508) 540-2000 and the Falmouth Inn (508) 540-2500 have rooms available at $65 and $60 per night plus tax. *************************AIRLINE********************************* Delta Air Lines is the official airline for SAB96, and is offering special discounted meeting fares from the US and Canada. 
To take advantage of these fares: 1) Please call or have your travel agent call 1-800-241-6760 between 8 am and 10 pm EST, 2) Refer to file number "XI304" From radford at cs.toronto.edu Wed Aug 28 20:12:47 1996 From: radford at cs.toronto.edu (Radford Neal) Date: Wed, 28 Aug 1996 20:12:47 -0400 Subject: New release of Bayesian NN software Message-ID: <96Aug28.201256edt.1290@neuron.ai.toronto.edu> New Release of Software for BAYESIAN LEARNING FOR NEURAL NETWORKS Radford Neal, University of Toronto A new version of my software for Bayesian learning of models based on multilayer perceptron networks, using Markov chain Monte Carlo methods, is now available on the Internet. This software implements the methods described in my Ph.D. thesis, "Bayesian Learning for Neural Networks", which is now available from Springer-Verlag (ISBN 0-387-94724-8). Use of the software is free for research and educational purposes. The software supports models for regression and classification problems based on networks with any number of hidden layers, using a wide variety of prior distributions for network parameters and hyperparameters. The advantages of Bayesian learning include the automatic determination of "regularization" parameters, without the need for a validation set, avoidance of overfitting when using large networks, and quantification of the uncertainty in predictions. The software implements the Automatic Relevance Determination (ARD) approach to handling inputs that may turn out to be irrelevant (developed with David MacKay). For problems and networks of moderate size (e.g., 200 training cases, 10 inputs, 20 hidden units), full training (to the point where one can be reasonably sure that the correct Bayesian answer has been found) typically takes several hours to a day on our SGI machine. However, quite good results, competitive with other methods, are often obtained after training for under an hour. (Of course, your machine may not be as fast as ours!) 
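The key practical idea behind this kind of Bayesian learning is to predict by averaging the network's output over many weight vectors drawn from the posterior, which yields both a prediction and an uncertainty estimate. The sketch below is a hypothetical illustration only: the tiny tanh network is made up for the example, and real posterior draws would come from Neal's Markov chain Monte Carlo methods, not the Gaussian perturbations used here.

```python
import numpy as np

rng = np.random.default_rng(1)

def mlp_predict(x, w):
    """One-hidden-layer net: w packs weights and biases for 3 tanh units."""
    w1, b1, w2, b2 = w[:3], w[3:6], w[6:9], w[9]
    return np.tanh(np.outer(x, w1) + b1) @ w2 + b2

# Stand-in for MCMC draws from the posterior over weights: perturbations
# of one weight vector.  In real Bayesian learning these would be states
# visited by a Markov chain whose stationary distribution is the posterior.
w_center = rng.normal(size=10)
posterior_draws = [w_center + 0.05 * rng.normal(size=10) for _ in range(100)]

# Posterior predictive: average the predictions, not the weights
x = np.linspace(-2, 2, 9)
preds = np.array([mlp_predict(x, w) for w in posterior_draws])
mean = preds.mean(axis=0)   # prediction
std = preds.std(axis=0)     # uncertainty in the prediction
print(mean.shape, std.shape)
```

Averaging predictions over posterior draws, rather than committing to one trained network, is what gives the automatic regularization and uncertainty quantification described in the announcement.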
The software is written in ANSI C, and has been tested on SGI and Sun machines. Full source code is included. This new release is not radically different from the release of a year ago, but it does contain a number of enhancements to both the programs and the documentation, so it is probably worth your while to upgrade if you are using the old version. The new version is not quite upwardly compatible with the old, but converting old scripts should be very easy. You can obtain the software via my home page, at URL http://www.cs.toronto.edu/~radford/ If you have any problems obtaining the software, please contact me at one of the addresses below. --------------------------------------------------------------------------- Radford M. Neal radford at cs.toronto.edu Dept. of Statistics and Dept. of Computer Science radford at stat.toronto.edu University of Toronto http://www.cs.toronto.edu/~radford --------------------------------------------------------------------------- From esann at dice.ucl.ac.be Thu Aug 29 10:45:10 1996 From: esann at dice.ucl.ac.be (esann@dice.ucl.ac.be) Date: Thu, 29 Aug 1996 15:45:10 +0100 Subject: ESANN'97: Call for papers Message-ID: <199608291336.PAA14156@ns1.dice.ucl.ac.be> Dear colleagues, You will find enclosed a short version of the call for papers of ESANN'97, the European Symposium on Artificial Neural Networks, which will be held in Bruges (Belgium), on April 16-18, 1997. The full version can be viewed on the ESANN WWW server: http://www.dice.ucl.ac.be/neural-nets/esann For those of you who maintain WWW pages including lists of related ANN sites: we would appreciate it if you could add the above URL to your list; thank you very much! We try as much as possible to avoid multiple sendings of this call for papers; however, please accept our apologies if you receive this e-mail twice, despite our precautions. 
Sincerely yours, Michel Verleysen --------------------------------------------------- | European Symposium | | on Artificial Neural Networks | | | | Bruges - April 16-17-18, 1997 | | | | First announcement and call for papers | --------------------------------------------------- Scope and topics ---------------- Since its first edition in 1993, the European Symposium on Artificial Neural Networks has become the reference for researchers on fundamental and theoretical aspects of artificial neural networks. Each year, around 100 specialists attend ESANN, in order to present their latest results and comprehensive surveys, and to discuss the future developments and directions in this field. In 1997, the programme of the conference will be slightly modified. Besides the traditional oral sessions, there will be a few "invited" sessions organized by renowned scientists, and poster sessions. Poster authors will have the opportunity to present their poster orally, with one slide, in one or two minutes. It is important to note that posters will be considered at the same scientific level as oral presentations, the poster format being a more appropriate medium for presenting certain kinds of results. The fifth European Symposium on Artificial Neural Networks will be organized in Bruges, Belgium, in April 1997. The first four successful editions each gathered between 90 and 110 scientists, coming not only from Western and Eastern Europe, but also from the USA, Japan, Australia, New Zealand, South America... The fifth ESANN symposium will concentrate on fundamental and theoretical aspects of artificial neural networks, and on the links between neural networks and other domains of research, such as statistics, data analysis, biology, psychology, evolutive learning, bio-inspired systems,... 
The following is a non-exhaustive list of topics which will be covered during ESANN'97: * theory * models and architectures * mathematics * learning algorithms * statistical data analysis * self-organization * approximation of functions * Bayesian classification * time series forecasting * vector quantization * independent component analysis * bio-inspired systems * cognitive psychology * biologically plausible artificial networks * formal models of biological phenomena * neurobiological systems * identification of non-linear dynamic systems * adaptive behavior * adaptive control * signal processing * evolutive learning The ESANN'97 conference is organized with the support of the IEEE Region 8, the IEEE Benelux Section, and the Université Catholique de Louvain (UCL, Louvain-la-Neuve, Belgium). Location -------- The conference will be held in Bruges (also called "Venice of the North"), one of the most beautiful medieval towns in Europe. Bruges can be reached by train from Brussels in less than one hour (frequent trains). The town of Bruges is known worldwide for its architectural style, its canals, and its pleasant atmosphere. The conference will be organized in a hotel located near the center (walking distance) of the town. There is no obligation for the participants to stay in this hotel. Hotels of all levels of comfort and price are available in Bruges; it will be possible to book a 3-star or lower-category hotel at a preferential rate through the conference secretariat, and a list of other smaller hotels will be available. Deadlines --------- Submission of papers November 29, 1996 Notification of acceptance January 31, 1997 Symposium April 16-18, 1997 Call for contributions ---------------------- Prospective authors are invited to submit six original copies of their contribution before November 29, 1996. Working language of the conference (including proceedings) is English. 
Papers should not exceed six A4 pages (including figures and references). Printing area will be 12.2 x 19.3 cm (centered on the A4 page); left, right, top and bottom margins will thus respectively be 4.4, 4.4, 5.2 and 5.2 cm. Complying with these margins and centering the text on the A4 sheets is mandatory: the manuscript will be reproduced in its original size in the proceedings, and margins will be cut to the book format. Papers not respecting these margins will be photocopied before printing, which strongly reduces their quality in the proceedings. 10-point Times font will be used for the core of the text; headings will be in bold characters (but not underlined), and will be separated from the main text by two blank lines before and one after. The manuscript will begin with a title (Times 14 point, bold, centered), two blank lines, the names of the authors (Times 10 point, centered), a blank line, their affiliation(s) (Times 9 point, centered), two blank lines, the abstract (Times 9 point, justified), and two blank lines. The maximum width of the header (title, authors, affiliations and abstract) will be 10.2 cm (i.e. left and right margins will be each 1 cm larger than for the main text). Originals of the figures will be pasted into the manuscript and centered between the margins. The lettering of the figures should be in 10-point Times font size. Figures should be numbered. The legends also should be centered between the margins and be written in 9-point Times font size. The pages of the manuscript will not be numbered (numbering decided by the editor). 
We strongly encourage authors to read the full instructions on the WWW server of the conference, and/or to use the LaTeX format available on this server: http://www.dice.ucl.ac.be/neural-nets/esann A separate page (not included in the manuscript) will indicate: * the title of the manuscript * author(s) name(s) * the complete address (including phone & fax numbers and E-mail) of the corresponding author * a list of five keywords or topics On the same page, the authors will copy and sign the following paragraph: "in case of acceptance of the paper for presentation at ESANN 97: - at least one of the authors will register for the conference and will present the paper - the author(s) transfer the copyright of the paper to the organizers of ESANN 97, for the proceedings and any publication that could directly be generated by the conference - if the paper does not match the format requirements for the proceedings, the author(s) will send a revised version within two weeks of the notification of acceptance." Contributions must be sent to the conference secretariat. Examples of camera-ready contributions can be obtained by writing to the same address. Registration fees ----------------- registration before registration after February 1st, 1997 February 1st, 1997 Universities BEF 15500 BEF 16500 Industries BEF 19500 BEF 20500 The registration fee includes attendance at all sessions, the lunches during the three days of the conference, the coffee breaks twice a day, the conference dinner, and the proceedings. To benefit from the reduced registration fee (before February 1st), please use the "advanced registration form" available on the ESANN WWW server or through the conference secretariat. Grants ------ A few grants, covering part of the registration fees, may be offered (depending on the acceptance of a project by the EC) to young scientists from the European Union, and/or from former Central and Eastern European countries. 
Please write to the conference secretariat or refer to the ESANN WWW server for details and availability. Deadline for applications: February 28, 1997. Warning: in no case may the participation of an author be made conditional on a grant; as indicated above, prospective authors must commit themselves to register for the conference, even if their application for a grant is not accepted. Conference secretariat ---------------------- Michel Verleysen D facto conference services phone: + 32 2 203 43 63 45 rue Masui Fax: + 32 2 203 42 94 B - 1000 Brussels (Belgium) E-mail: esann at dice.ucl.ac.be http://www.dice.ucl.ac.be/neural-nets/esann Reply form ---------- If you wish to receive the final program of ESANN'97, for any address change, or to add one of your colleagues to our database, please send this form to the conference secretariat. ------------------------ cut here ----------------------- ------------------ ESANN'97 reply form ------------------ Name: ................................................. First Name: ............................................ University or Company: ................................. ................................. Address: .............................................. .............................................. .............................................. ZIP: ........ Town: ................................ Country: ............................................... Tel: ................................................... Fax: ................................................... E-mail: ................................................ ------------------------ cut here ----------------------- Please send this form to: D facto conference services 45 rue Masui B - 1000 Brussels e-mail: esann at dice.ucl.ac.be Steering and local committee (to be confirmed) ---------------------------------------------- François Blayo Univ. Paris I (F) Hervé Bourlard FPMS Mons (B) Marie Cottrell Univ. 
Paris I (F) Jeanny Hérault INPG Grenoble (F) Bernard Manderick Vrije Univ. Brussel (B) Eric Noldus Univ. Gent (B) Joos Vandewalle KUL Leuven (B) Michel Verleysen UCL Louvain-la-Neuve (B) Scientific committee (to be confirmed) -------------------------------------- Edoardo Amaldi Cornell Univ. (USA) Agnès Babloyantz Univ. Libre Bruxelles (B) Joan Cabestany Univ. Polit. de Catalunya (E) Holk Cruse Universität Bielefeld (D) Eric de Bodt UCL Louvain-la-Neuve (B) Dante Del Corso Politecnico di Torino (I) Wlodek Duch Nicholas Copernicus Univ. (PL) Marc Duranton Philips / LEP (F) Jean-Claude Fort Université Nancy I (F) Bernd Fritzke Ruhr-Universität Bochum (D) Karl Goser Universität Dortmund (D) Manuel Grana UPV San Sebastian (E) Martin Hasler EPFL Lausanne (CH) Kurt Hornik Technische Univ. Wien (A) Christian Jutten INPG Grenoble (F) Vera Kurkova Acad. of Science of the Czech Rep. (CZ) Petr Lansky Acad. of Science of the Czech Rep. (CZ) Hans-Peter Mallot Max-Planck Institut (D) Eddy Mayoraz IDIAP Martigny (CH) Jean Arcady Meyer Ecole Normale Supérieure Paris (F) José Mira Mira UNED (E) Pietro Morasso Univ. of Genoa (I) Jean-Pierre Nadal Ecole Normale Supérieure Paris (F) Erkki Oja Helsinki University of Technology (FIN) Gilles Pagès Université Paris VI (F) Hélène Paugam-Moisy Ecole Normale Supérieure Lyon (F) Alberto Prieto Universidad de Granada (E) Pierre Puget LETI Grenoble (F) Ronan Reilly University College Dublin (IRE) Tamas Roska Hungarian Academy of Science (H) Jean-Pierre Rospars INRA Versailles (F) Jean-Pierre Royet Université Lyon 1 (F) John Stonham Brunel University (UK) John Taylor King's College London (UK) Vincent Torre Universita di Genova (I) Claude Touzet IUSPIM Marseilles (F) Marc Van Hulle KUL Leuven (B) Christian Wellekens Eurecom Sophia-Antipolis (F) !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! WARNING: new postal code: 1000 Brussels instead of 1210 Brussels WARNING: new phone and fax numbers ! 
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! _____________________________ D facto publications - conference services 45 rue Masui 1000 Brussels Belgium tel: +32 2 203 43 63 fax: +32 2 203 42 94 _____________________________ From jb at uran.informatik.uni-bonn.de Thu Aug 29 06:03:58 1996 From: jb at uran.informatik.uni-bonn.de (Joachim Buhmann) Date: Thu, 29 Aug 96 11:03:58 +0100 Subject: Postdoc position in Neural Computing/Computer Vision Message-ID: <199608291004.LAA02307@retina> *************************************************************************** UNIVERSITY OF BONN, GERMANY COMPUTER SCIENCE INSTITUTE III Postdoctoral position in NEURAL NETWORKS for COMPUTER VISION / AUTONOMOUS ROBOTICS +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ Please forward this call to anyone who might be interested in this job offer. Thanks in advance. +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ The COMPUTER SCIENCE INSTITUTE III, UNIVERSITY OF BONN, GERMANY established a research group for autonomous robotics in 1993. The RHINO group, named after the autonomous robot RHINO, addresses the following research questions: active vision; fast scene segmentation; statistical pattern recognition and learning methods; autonomous map building from sonar data and stereo; robot planning and control; and the design of a high-level robot control language, with special emphasis on adaptive neural network methods. We expect to have a postdoctoral position available for one year starting this fall, with the possibility of renewal for a second year. Applicants should have research experience in several of the research areas listed above. Furthermore, the applicant must have development experience in a C++/UNIX environment. The position is associated with the research groups of Profs. Buhmann and Cremers. Funding is provided by the European Community under the Training and Mobility of Researchers (TMR) contract VIRGO. 
Applicants must have been residents of a European country (excluding Germany) for the last 18 months. The COMPUTER SCIENCE INSTITUTE III at Bonn has about 30 PhD students, 5 faculty members, and 10 research associates (Postdocs). 4 PhD students, 2 postdocs, and 3 faculty members are associated with the RHINO group. The Institute is headed by Prof. A. B. Cremers. Please send a CV, summary of relevant research interests and of recent publications by ordinary mail or email to: Joachim Buhmann University of Bonn Computer Science Institute III Roemerstr. 164 53117 Bonn, Germany Email: jb at cs.uni-bonn.de http://www-dbv.informatik.uni-bonn.de/ From mmisra at adastra.Mines.EDU Thu Aug 29 19:58:11 1996 From: mmisra at adastra.Mines.EDU (Manavendra Misra) Date: Thu, 29 Aug 1996 17:58:11 -0600 Subject: CS positions Message-ID: <9608291758.ZM3438@adastra.Mines.EDU> Interested connectionists are invited to apply for two new faculty positions at the Colorado School of Mines. One of these is at the Associate/Full professor level, and the other is an entry level Assistant Professor position. Announcements are enclosed below. Manav. ------- Applied Computer Science Associate/Full Professor Applications are invited for an anticipated tenured position in Applied Computer Science at the Associate or Full Professor level for fall 1997. Applicants should have a Ph.D. in Computer Science or a related field; excellence in teaching and research is essential. Preference will be given to candidates in the areas of artificial intelligence, scientific visualization, and high performance computing. Evidence of interest or successful involvement in interdisciplinary collaborative research projects is desirable. 
To apply, send: a) A curriculum vitae, b) Three letters of recommendation, at least one of which addresses teaching ability, and c) A one-page statement describing teaching experience and philosophy, and research interests and aspirations to: Colorado School of Mines Office of Human Resources Applied Computer Science Associate/Full Professor Search #96-081600 1500 Illinois Street Golden, CO 80401-1887 FAX: (303)273-3278 Applications will be considered beginning December 16, 1996, and thereafter until the position is filled. ------ Applied Computer Science Assistant Professor Applications are invited for a tenure-track position in Applied Computer Science at the Assistant Professor level for fall 1997. Applicants should have a Ph.D. in Computer Science or a related field, excellent research accomplishments/potential and a strong commitment to teaching. Preference will be given to candidates in the areas of artificial intelligence, scientific visualization, and high performance computing who have one or more years of postdoctoral experience. To apply, send: a) A curriculum vitae, b) Three letters of recommendation, at least one of which addresses teaching ability, and c) A one-page statement describing teaching experience and philosophy, and research interests and aspirations to: Colorado School of Mines Office of Human Resources Applied Computer Science Assistant Professor Search #96-081330 1500 Illinois Street Golden, CO 80401-1887 FAX: (303)273-3278 Applications will be considered beginning January 13, 1997, and thereafter until the position is filled. -- ***************************************************************************** Manavendra Misra Dept of Mathematical and Computer Sciences Colorado School of Mines, Golden, CO 80401 Ph. (303)-273-3873 Fax. 
(303)-273-3875 Home messages/fax : (303)-271-0775 email: mmisra at mines.edu WWW URL: http://www.mines.edu/fs_home/mmisra/ ***************************************************************************** From back at zoo.riken.go.jp Thu Aug 29 22:28:58 1996 From: back at zoo.riken.go.jp (Andrew Back) Date: Fri, 30 Aug 1996 11:28:58 +0900 (JST) Subject: NIPS'96 Workshop - Blind Signal Processing Message-ID: CALL FOR PAPERS NIPS'96 Postconference Workshop BLIND SIGNAL PROCESSING AND THEIR APPLICATIONS (Neural Information Processing Approaches) Snowmass (Aspen), Colorado USA Sat Dec 7th, 1996 A. Cichocki and A. Back Brain Information Processing Group Frontier Research Program RIKEN, Institute of Physical and Chemical Research, Hirosawa 2-1, Saitama 351-01, WAKO-Shi, JAPAN Email: cia at zoo.riken.go.jp, back at zoo.riken.go.jp Fax: (+81) 48 462 4633. URL: http://zoo.riken.go.jp/bip.html Blind Signal Processing is an emerging area of research in neural networks and image/signal processing with many potential applications. It originated in France in the late 80's and since then there has continued to be a strong and growing interest in the field. Blind signal processing problems can be classified into three areas: (1) blind signal separation of sources and/or independent component analysis (ICA), (2) blind channel identification and (3) blind deconvolution and blind equalization. OBJECTIVES The main objectives of this workshop are to: Give presentations by experts in the field on the state of the art in this exciting area of research. Compare the performance of recently developed adaptive unsupervised learning algorithms for neural networks. Discuss issues surrounding prospective applications and the suitability of current neural network models. Hence we seek to provide a forum for better understanding the current limitations of neural network models. 
Examine issues surrounding local, online adaptive learning algorithms and their robustness and biological plausibility or justification. Discuss issues concerning effective computer simulation programs. Discuss open problems and perspectives for future research in this area. In particular, we intend to discuss the following items: 1. Criteria for blind separation and blind deconvolution problems (both for time and frequency domain approaches) 2. Natural (or relative) gradient approach to blind signal processing. 3. Neural networks for blind separation of time delayed and convolved signals. 4. On-line adaptive learning algorithms for blind signal processing with variable learning rate (learning of the learning rate). 5. Open problems, e.g. dynamic on-line determination of the number of sources (more sources than sensors), influence of noise, robustness of algorithms, stability, convergence, identifiability, non-causal, non-stationary dynamic systems. 6. Applications in different areas of science and engineering, e.g., non-invasive medical diagnosis (EEG, ECG), telecommunication, voice recognition problems, image processing and enhancement. WORKSHOP FORMAT The workshop will be one day in length, combining invited expert speakers and significant group discussion time. We will open up the workshop in a moderated way. The intent here is to permit a free-flowing, but productive discourse on the topics relevant to this area. Participants will be encouraged to consider the implications of the current findings in their own work, and to raise questions accordingly. We invite and encourage potential participants to "come prepared" for open discussions. 
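For readers unfamiliar with problem class (1) above, a toy blind source separation can be sketched as follows. This is an illustrative sketch only (whitening followed by a kurtosis-based rotation search), not any of the workshop algorithms; the two synthetic signals and the mixing matrix are invented for the example.

```python
import numpy as np

# Two independent, non-Gaussian (sub-Gaussian) sources.
t = np.linspace(0, 8 * np.pi, 2000)
s1 = np.sign(np.sin(1.3 * t))          # square-wave source
s2 = np.sin(2.7 * t)                   # sinusoidal source
S = np.vstack([s1, s2])                # true sources, shape (2, n)

A = np.array([[1.0, 0.6],              # hypothetical mixing matrix
              [0.4, 1.0]])
X = A @ S                              # observed mixtures only

# Whiten the mixtures: zero mean, identity covariance.
X = X - X.mean(axis=1, keepdims=True)
eigval, eigvec = np.linalg.eigh(np.cov(X))
Z = (eigvec @ np.diag(eigval ** -0.5) @ eigvec.T) @ X

def kurt(y):
    """Excess kurtosis, a simple measure of non-Gaussianity."""
    return np.mean(y ** 4) - 3 * np.mean(y ** 2) ** 2

def rot(a):
    return np.array([[np.cos(a), -np.sin(a)],
                     [np.sin(a),  np.cos(a)]])

# After whitening, the only ambiguity left is a rotation; choose the
# angle that maximizes total |kurtosis| of the rotated components.
best = max(np.linspace(0, np.pi / 2, 500),
           key=lambda a: sum(abs(kurt(r)) for r in rot(a) @ Z))
Y = rot(best) @ Z                      # estimated sources

# Each estimate should match one true source, up to sign/permutation.
corr = np.abs(np.corrcoef(np.vstack([S, Y]))[:2, 2:])
print(corr.max(axis=1))                # both entries should be near 1
```

The kurtosis criterion here is only one of the separation criteria that item 1 of the discussion list refers to; gradient-based algorithms (item 2) optimize related contrasts adaptively rather than by grid search.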
SUBMISSION OF WORKSHOP EXTENDED ABSTRACTS If you would like to contribute, please send an abstract or extended summary as soon as possible to: Andrew Back Laboratory for Artificial Brain Systems, Frontier Research Program RIKEN, Institute of Physical and Chemical Research, Hirosawa 2-1, Saitama 351-01, WAKO-Shi, JAPAN Email: back at zoo.riken.go.jp Phone: (+81) 48 467 9629 Fax: (+81) 48 462 4633. Manuscripts may be sent by email (in PostScript format), air mail, or fax. Important Dates: Submission of abstract deadline: 16 September, 1996 Notification of acceptance: 1 October, 1996 Final paper to be sent by: 30 October, 1996 A set of workshop notes will be produced. For accepted papers to be included in the notes, papers accepted for presentation will need to be supplied to us by the due date of 30 Oct, 1996. For the format of papers, the usual NIPS style file should be used, with up to 16 pages allowed. Please contact the workshop organizers for further information, or consult the NIPS WWW home page: http://www.cs.cmu.edu/afs/cs.cmu.edu/Web/Groups/NIPS/ From sml at esesparc2.essex.ac.uk Tue Aug 27 13:29:24 1996 From: sml at esesparc2.essex.ac.uk (Lucas S M) Date: Tue, 27 Aug 1996 18:29:24 +0100 (BST) Subject: papers available on high performance OCR Message-ID: Summary: The following two papers discuss recent work on applying scanning n-tuple classifiers to handwritten OCR. The first is a journal paper which gives some background and all the technical details. The second is a paper for a forthcoming conference which includes more up-to-date results and more detailed timing analysis. The main feature of the method is its remarkable speed. If we ignore the pre-processing time, we can train the system at a rate of over 20,000 character images per second, and recognise about 1,200 characters per second, on a humble 66 MHz Pentium PC. If we include pre-processing time, then we can still train (recognise) 500 (200) chars per second. 
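The speed comes from the table-lookup nature of n-tuple methods: training only sets bits in lookup tables, and recognition is a handful of lookups plus a vote. A minimal sketch of the plain (WISARD-style) n-tuple classifier is below; it is not the scanning n-tuple of the papers above, the pixel groupings are chosen as image rows purely for readability (real systems draw them at random), and the bar-shaped images are synthetic toy data.

```python
import numpy as np

# Each tuple groups 4 pixels of a flattened 4x4 binary image.
TUPLES = [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11], [12, 13, 14, 15]]

def address(img, tup):
    """Pack the tuple's pixel values into a lookup-table index."""
    return sum(int(img[p]) << i for i, p in enumerate(tup))

class NTupleClassifier:
    def __init__(self, n_classes):
        # One boolean table per (class, tuple): "was this pattern seen?"
        self.tables = np.zeros((n_classes, len(TUPLES), 16), dtype=bool)

    def train(self, img, label):          # one table write per tuple
        for k, tup in enumerate(TUPLES):
            self.tables[label, k, address(img, tup)] = True

    def classify(self, img):              # one lookup per (class, tuple)
        scores = [sum(self.tables[c, k, address(img, tup)]
                      for k, tup in enumerate(TUPLES))
                  for c in range(len(self.tables))]
        return int(np.argmax(scores))

def vbar(col):                            # class 0: vertical bars
    img = np.zeros((4, 4), dtype=int); img[:, col] = 1; return img.ravel()

def hbar(row):                            # class 1: horizontal bars
    img = np.zeros((4, 4), dtype=int); img[row, :] = 1; return img.ravel()

clf = NTupleClassifier(n_classes=2)
for i in range(4):
    clf.train(vbar(i), 0)
    clf.train(hbar(i), 1)

noisy = vbar(2); noisy[2] = 0             # one pixel of a vertical bar flipped
print(clf.classify(vbar(2)), clf.classify(hbar(1)), clf.classify(noisy))
# -> 0 1 0
```

Note that both training and classification touch only a constant number of table entries per image, which is why throughput on 1990s hardware could reach thousands of characters per second.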
The fast training and recognition speeds allow the system parameters to be optimised very quickly. The best accuracy reported is 98.3% on the CEDAR hand-written digit test set. This is not quite as good as the best reported in the literature for this data (98.9%, to the best of our knowledge), but offers a significant speed advantage. The following two papers discuss recent work on a simple unified approach to evolving ALL aspects of a neural network, including its learning algorithm (if any). The first uses a grammar-based chromosome, the second uses a set-based chromosome. The latter approach appears particularly promising as a method of part-designing, part-evolving neural networks. ----------------------------------------------------------------------- From mo216 at newton.cam.ac.uk Sun Aug 18 00:38:11 1996 From: mo216 at newton.cam.ac.uk (M. Opper) Date: Sun, 18 Aug 1996 05:38:11 +0100 (BST) Subject: technical reports Message-ID: <199608180438.FAA17296@larmor.newton.cam.ac.uk> Two new papers are now available: Opper, M. and D. Haussler (1997) "Worst case prediction over sequences under log loss" in {\em The Mathematics of Information Coding, Extraction and Distribution}, Springer Verlag, edited by G. Cybenko, D. O'Leary and J. Rissanen. ftp://ftp.cse.ucsc.edu/pub/ml/OHWCpaper.ps (180K postscript) Abstract: We consider the game of sequentially assigning probabilities to future data based on past observations under logarithmic loss. We do not make probabilistic assumptions about the generation of the data, but consider a situation where a player tries to minimize his loss relative to the loss of the (with hindsight) best distribution from a target class for the worst sequence of data. We give bounds on the minimax regret in terms of the metric entropies of the target class with respect to suitable distances between distributions. D. Haussler and M. 
Opper (1997) "Metric Entropy and Minimax Risk in Classification" {\em Lecture Notes in Computer Science: Studies in Logic and Computer Science, a selection of essays in honor of Andrzej Ehrenfeucht} Vol. 1261, 212-235 (1997) Eds. J. Mycielski, G. Rozenberg and A. Salomaa ftp://ftp.cse.ucsc.edu/pub/ml/Andrzejpaper.ps (245k postscript) Abstract: We apply recent results on the minimax risk in density estimation to the related problem of pattern classification. The notion of loss we seek to minimize is an information theoretic measure of how well we can predict the classification of future examples, given the classification of previously seen examples. We give an asymptotic characterization of the minimax risk in terms of the metric entropy properties of the class of distributions that might be generating the examples. We then use these results to characterize the minimax risk in the special case of noisy two-valued classification problems in terms of the Assouad density and the Vapnik-Chervonenkis dimension.
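For concreteness, the minimax regret studied in the first abstract can be written, in a standard formulation (the notation below is ours, not taken verbatim from the paper), as

$$
R_n(\mathcal{F}) \;=\; \inf_{p}\; \sup_{x_1,\dots,x_n} \left( \sum_{t=1}^{n} -\log p(x_t \mid x_1,\dots,x_{t-1}) \;-\; \inf_{q \in \mathcal{F}} \sum_{t=1}^{n} -\log q(x_t) \right),
$$

where the outer infimum ranges over the player's sequential prediction strategies $p$ and $\mathcal{F}$ is the target class of distributions. The bounds announced above control $R_n(\mathcal{F})$ in terms of the metric entropy of $\mathcal{F}$ under suitable distances between distributions.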