From lfausett at work2.cc.nps.navy.mil Thu May 1 18:44:02 1997 From: lfausett at work2.cc.nps.navy.mil (Laurene Fausett) Date: Thu, 1 May 1997 14:44:02 -0800 Subject: CFP-abstracts due by 5/15 Message-ID:

SIAN KA'AN 97
The Second Joint Mexico-US International Workshop on Neural Networks and Neurocontrol
August 19-29, 1997
Playa del Carmen, Quintana Roo, Mexico

GOAL

Sian Ka'an 97, The Second Joint Mexico-US International Workshop on Neural Networks and Neurocontrol, will provide an opportunity for academic and industrial researchers of the United States and Mexico, and other participants including graduate students, to exchange ideas and research results with the aim of enhancing the overall research effort, reducing duplication, and encouraging cooperation among the participants.

FORMAT

Approximately twenty speakers have been selected to present two-hour keynote talks describing the state of the art in topics related to the development of the theoretical and applied aspects of neural networks and neurocontrol. Poster sessions of invited and contributed papers will allow for discussion of particular works. A hands-on workshop for students and other interested participants will provide some practice and more basic information about the topics addressed in the conference presentations and poster sessions.

LOCATION

The conference will be held in Playa del Carmen, in the state of Quintana Roo, Mexico. The hotel is a five-star luxury facility, with many legendary Mayan ruins in the vicinity of the hotel. The facilities provide a relaxed and informal atmosphere for the exchange of ideas. A program of cultural, educational, and social events will enhance the understanding, and help to establish solid cooperative efforts, among conference participants.
CALL FOR PAPERS

The Organizing Committee invites all persons interested in Artificial Neural Networks, Fuzzy Sets, Evolutionary Programming, Control Theory, and Smart Engineering Systems Theory to submit papers for presentation (poster sessions) at the conference. All papers accepted for presentation will be published in the conference proceedings. To ensure a high-quality conference and proceedings, all paper submissions will be reviewed for technical merit and content by three senior researchers in the field. Best Paper Awards will be presented in the areas of "Novel Applications", "Theoretical Development", and "Electronic Implementations".

Authors are requested to submit a letter of intent, an abstract (up to 250 words), and an information sheet (full name(s) of author(s), title of paper, phone/fax numbers, e-mail address).

Important Dates:
Letter of intent: by May 15, 1997
Notice of acceptance of papers: sent by May 30, 1997
Camera-ready laser manuscripts: by June 30, 1997

Authors should forward the letter of intent, information sheet, and abstract to:

Nydia Lara Zavala
Laboratorio de Neurocomputacion
Centro de Instrumentos
Universidad Nacional Autonoma de Mexico
Circuito Exterior, Ciudad Universitaria
A.P. 70-186
Mexico, 04510, D.F.
Phone: (525) 652 5920
Fax: 622 8620
e-mail: nydia at aleph.cinstrum.unam.mx

or

Laurene Fausett
450 Larkin Street
Monterey, CA 93940
Phone: (408) 656-2714
Fax: (408) 656-2355
e-mail: lfausett at nps.navy.mil

COST

Conference Registration (until June 30): $300 ($350 after June 30)
Hotel (per person/per night), including breakfast:
Single Room $60
Double Room $40
Triple Room $30
Lunch and Dinner are available at the Hotel for approximately $15 each.

Fellowships will be available for students on a competitive basis. Please request application forms from the Organizing Committee Chairs (before May 15).
KEYNOTE SPEAKERS (preliminary list, alphabetical order)

James Bezdek, Donald Fausett, Laurene Fausett, Benito Fernandez, Joydeep Ghosh, Robert Hecht-Nielsen, John Koza, George Lendaris, Ken Marko, Thomas J. McAvoy, Kumpati Narendra, Jose Principe, Dejan J. Sobajic, Eduardo Sontag, Jean Jacques E. Slotine, Paul Werbos, Bernard Widrow, Lotfi A. Zadeh

SIAN KA'AN 97 SPONSORS

National Science Foundation
Academia de la Investigacion Cientifica
Sociedad Mexicana de Instrumentacion
Florida Institute of Technology
University of Texas at Austin
Consejo Nacional de Ciencia y Tecnologia
Universidad Nacional Autonoma de Mexico
Universidad de Quintana Roo
Gobierno del Estado de Quintana Roo
IBM de Mexico, S.A. de C.V.
Sun de Mexico
Apple Computer de Mexico, S.A. de C.V.

HONORARY CONFERENCE CHAIRS
Claudio Firmani Clementi, Paul Werbos, Bernard Widrow, Felipe Lara

ORGANIZING COMMITTEE CHAIRS
Nydia Lara (Mexico), Laurene Fausett (USA)

PROGRAM COMMITTEE CHAIR
Bernard Widrow, Felipe Lara, Jose Principe, Dejan Sobajic, Benito Fernandez

WORKSHOP COMMITTEE CHAIR
Nydia Lara, Laurene Fausett, Benito Fernandez, George Lendaris, Guillermo Morales

TOPICS OF INTEREST

Architecture - ANN Paradigms - Associative Memories - Hybrid Systems
Learning - Gradient-Based Learning - Stochastic Learning - Adaptive Methods - Supervised Learning - Reinforcement Learning
Systems Analysis - Time Series - Signal Processing - Systems Modeling - Process Monitoring - Fuzzy Models
Control & Design - Optimization - Neurocontrol - Adaptive Control - Learning Control - Fuzzy Control - Intelligent Control

From moody at chianti.cse.ogi.edu Thu May 1 19:14:41 1997 From: moody at chianti.cse.ogi.edu (John Moody) Date: Thu, 1 May 97 16:14:41 -0700 Subject: Research Position in Statistical Learning Algorithms at OGI Message-ID: <9705012314.AA20039@chianti.cse.ogi.edu>

Research Position in Nonparametric Statistics, Neural Networks and Machine Learning at Department of Computer Science & Engineering, Oregon Graduate Institute of Science & Technology

I am
seeking a highly qualified researcher to take a leading role on a project involving the development and testing of new model selection and input variable subset selection algorithms for classification, regression, and time series prediction applications. Candidates should have a PhD in Statistics, EE, CS, or a related field, have experience in neural network modeling, nonparametric statistics or machine learning, have strong C programming skills, and preferably have experience with S-Plus and Matlab.

The compensation and level of appointment (Postdoctoral Research Associate or Senior Research Associate) will depend upon experience. The initial appointment will be for one year, but may be extended depending upon the availability of funding. Candidates who can start by July 1, 1997 or before will be given preference, although an extremely qualified candidate who is available by September 1 may also be considered.

If you are interested in applying for this position, please mail, fax, or email your CV (ascii text or postscript only), a letter of application, and a list of at least three references (names, addresses, emails, phone numbers) to:

Ms. Sheri Dhuyvetter
Computer Science & Engineering
Oregon Graduate Institute
PO Box 91000
Portland, OR 97291-1000
Phone: (503) 690-1476
FAX: (503) 690-1548
Email: sherid at cse.ogi.edu

Please do not send applications to me directly. I will consider all applications received by Sheri on or before June 1.

OGI (Oregon Graduate Institute of Science and Technology) has over a dozen faculty, senior research staff, and postdocs doing research in Neural Networks, Machine Learning, Signal Processing, Time Series, Control, Speech, Language, Vision, and Computational Finance. Short descriptions of our research interests are appended below. Additional information is available on the Web at http://www.cse.ogi.edu/Neural/ and http://www.cse.ogi.edu/CompFin/ .
OGI is a young, but rapidly growing, private research institute located in the Silicon Forest area west of downtown Portland, Oregon. OGI offers Masters and PhD programs in Computer Science and Engineering, Electrical Engineering, Applied Physics, Materials Science and Engineering, Environmental Science and Engineering, Biochemistry, Molecular Biology, Management, and Computational Finance. The Portland area has a high concentration of high tech companies that includes major firms like Intel, Hewlett Packard, Tektronix, Sequent Computer, Mentor Graphics, Wacker Siltronics, and numerous smaller companies like Planar Systems, FLIR Systems, Flight Dynamics, and Adaptive Solutions (an OGI spin-off that manufactures high performance parallel computers for neural network and signal processing applications).

John Moody
Professor, Computer Science and Electrical Engineering
Director, Computational Finance Program

+++++++++++++++++++++++++++++++++++++++++++++++++++++++

Oregon Graduate Institute of Science & Technology
Department of Computer Science & Engineering
Department of Electrical Engineering

Research Interests of Faculty, Research Staff, and Postdocs in Neural Networks, Machine Learning, Signal Processing, Control, Speech, Language, Vision, Time Series, and Computational Finance

Etienne Barnard (Associate Professor, EE): Etienne Barnard is interested in the theory, design and implementation of pattern-recognition systems, classifiers, and neural networks. He is also interested in adaptive control systems -- specifically, the design of near-optimal controllers for real-world problems such as robotics.

Ron Cole (Professor, CSE): Ron Cole is director of the Center for Spoken Language Understanding at OGI. Research in the Center currently focuses on speaker-independent recognition of continuous speech over the telephone and automatic language identification for English and ten other languages.
The approach combines knowledge of hearing, speech perception, acoustic phonetics, prosody and linguistics with neural networks to produce systems that work in the real world.

Mark Fanty (Research Associate Professor, CSE): Mark Fanty's research interests include continuous speech recognition for the telephone; natural language and dialog for spoken language systems; neural networks for speech recognition; and voice control of computers.

Dan Hammerstrom (Associate Professor, CSE): Based on research performed at the Institute, Dan Hammerstrom and several of his students have spun out a company, Adaptive Solutions Inc., which is creating massively parallel computer hardware for the acceleration of neural network and pattern recognition applications. There are close ties between OGI and Adaptive Solutions. Dan is still on the faculty of the Oregon Graduate Institute and continues to study next generation VLSI neurocomputer architectures.

Hynek Hermansky (Associate Professor, EE): Hynek Hermansky is interested in speech processing by humans and machines with engineering applications in speech and speaker recognition, speech coding, enhancement, and synthesis. His main research interest is in practical engineering models of human information processing.

Todd K. Leen (Associate Professor, CSE): Todd Leen's research spans theory of neural network models, architecture and algorithm design, and applications to speech recognition. His theoretical work is currently focused on the foundations of stochastic learning, while his work on algorithm design is focused on fast algorithms for non-linear data modeling.

John Moody (Professor, CSE and EE): John Moody does research on the design and analysis of learning algorithms, statistical learning theory (including generalization and model selection), optimization methods (both deterministic and stochastic), and applications to signal processing, time series, economics, and computational finance.
He is the Director of OGI's Computational Finance Program.

David Novick (Associate Professor, CSE): David Novick conducts research in interactive systems, including computational models of conversation, technologically mediated communication, and human-computer interaction. A central theme of this research is the role of meta-acts in the control of interaction. Current projects include dialogue models for telephone-based information systems.

Misha Pavel (Associate Professor, EE): Misha Pavel does mathematical and neural modeling of adaptive behaviors including visual processing, pattern recognition, visually guided motor control, categorization, and decision making. He is also interested in the application of these models to sensor fusion, visually guided vehicular control, and human-computer interfaces. He is the Director of OGI's Center for Information Technology.

Hong Pi (Senior Research Associate, CSE): Hong Pi's research interests include neural network models, time series analysis, and dynamical systems theory. He currently works on the applications of nonlinear modeling and analysis techniques to time series prediction problems and financial market analysis.

Pieter Vermeulen (Research Associate Professor, EE): Pieter Vermeulen is interested in the theory, design and implementation of pattern-recognition systems, neural networks and telephone-based speech systems. He currently works on the realization of speaker-independent, small-vocabulary interfaces to the public telephone network. Current projects include voice dialing, a system to collect the year 2000 census information, and the rapid prototyping of such systems. He is also a cofounder of Livingston Legend, a company specializing in neural network based intelligent sensors.

Eric A. Wan (Assistant Professor, EE): Eric Wan's research interests include learning algorithms and architectures for neural networks and adaptive signal processing.
He is particularly interested in neural applications to time series prediction, adaptive control, active noise cancellation, and telecommunications.

Lizhong Wu (Senior Research Associate, CSE): Lizhong Wu's research interests include neural network theory and modeling, time series analysis and prediction, pattern classification and recognition, signal processing, vector quantization, source coding and data compression. He is now working on the application of neural networks and nonparametric statistical paradigms to finance.

From sml%essex.ac.uk at seralph21.essex.ac.uk Fri May 2 04:53:55 1997 From: sml%essex.ac.uk at seralph21.essex.ac.uk (Simon Lucas) Date: Fri, 02 May 1997 09:53:55 +0100 Subject: Rapid best-first retrieval from massive dictionaries (paper available) Message-ID: <3369ABA3.D8A@essex.ac.uk>

The following paper has recently been published in Pattern Recognition Letters (vol 17; 1507-1512), and may be of interest to people on this list.

------------------------------------------------
Rapid Best-First Retrieval from Massive Dictionaries by Lazy Evaluation of a Syntactic Neural Network
S.M. Lucas

A new method of searching large dictionaries given uncertain inputs is described, based on the lazy evaluation of a syntactic neural network (SNN). The new method is shown to significantly outperform a conventional trie-based method for large dictionaries (e.g. in excess of 100,000 entries). Results are presented for the problem of recognising UK postcodes using dictionary sizes of up to 1 million entries. Most significantly, it is demonstrated that the SNN actually gets *faster* as more data is loaded into it.
------------------------------------------------

Sorry, but no electronic version available due to copyright. Paper offprints available on request.

Regards,
Simon Lucas

------------------------------------------------
Dr.
Simon Lucas
Department of Electronic Systems Engineering
University of Essex
Colchester CO4 3SQ
United Kingdom
Tel: (+44) 1206 872935
Fax: (+44) 1206 872900
Email: sml at essex.ac.uk
http://esewww.essex.ac.uk/~sml
secretary: Mrs Janet George (+44) 1206 872438
-------------------------------------------------

From harnad at cogsci.soton.ac.uk Mon May 5 10:51:08 1997 From: harnad at cogsci.soton.ac.uk (Stevan Harnad) Date: Mon, 5 May 97 15:51:08 +0100 Subject: Call for Papers: Psycoloquy Message-ID: <2758.9705051451@cogsci.ecs.soton.ac.uk>

PSYCOLOQUY CALL FOR PAPERS

PSYCOLOQUY is a refereed electronic journal (ISSN 1055-0143) now in its 8th year of publication. PSYCOLOQUY is sponsored on an experimental basis by the American Psychological Association and is currently estimated to reach a readership of over 50,000. PSYCOLOQUY publishes reports of new ideas and findings on which the author wishes to solicit rapid peer feedback, international and interdisciplinary ("Scholarly Skywriting"), in all areas of psychology and its related fields (biobehavioral science, cognitive science, neuroscience, social science, etc.). All contributions are refereed.

All target articles, commentaries and responses must have (1) a short abstract (up to 100 words for target articles, shorter for commentaries and responses), (2) an indexable title, and (3) the authors' full name(s), institutional address(es) and URL(s). In addition, for target articles only: (4) 6-8 indexable keywords, (5) a separate statement of the authors' rationale for soliciting commentary (e.g., why would commentary be useful and of interest to the field? what kind of commentary do you expect to elicit?) and (6) a list of potential commentators (with their email addresses). All paragraphs should be numbered in articles, commentaries and responses (see the format of already published articles in the PSYCOLOQUY archive; line length should be < 80 characters, no hyphenation).
Two versions of the figures would be helpful: one as screen-readable ascii, the other as .gif, .jpeg, .tiff or (least preferred) postscript files (or in some other universally available format), to be printed out locally by readers to supplement the screen-readable text of the article.

PSYCOLOQUY also publishes multiple reviews of books in any of the above fields; these should normally be the same length as commentaries, but longer reviews will be considered as well. Book authors should submit a 500-line self-contained Precis of their book, in the format of a target article; if accepted, this will be published in PSYCOLOQUY together with a formal Call for Reviews (of the book, not the Precis). The author's publisher must agree in advance to furnish review copies to the reviewers selected.

Authors of accepted manuscripts assign to PSYCOLOQUY the right to publish and distribute their text electronically and to archive and make it permanently retrievable electronically, but they retain the copyright, and after it has appeared in PSYCOLOQUY authors may republish their text in any way they wish -- electronic or print -- as long as they clearly acknowledge PSYCOLOQUY as its original locus of publication.
However, except in very special cases, agreed upon in advance, contributions that have already been published or are being considered for publication elsewhere are not eligible to be considered for publication in PSYCOLOQUY.

Please submit all material to psyc at pucc.princeton.edu

http://www.princeton.edu/~harnad/psyc.html
http://cogsci.soton.ac.uk/psyc
ftp://ftp.princeton.edu/pub/harnad/Psycoloquy
ftp://cogsci.soton.ac.uk/pub/harnad/Psycoloquy
gopher://gopher.princeton.edu/11/.libraries/.pujournals
news:sci.psychology.journals.psycoloquy

----------------------------------------------------------------------

CRITERIA FOR ACCEPTANCE: To be eligible for publication, a PSYCOLOQUY target article should not only have sufficient conceptual rigor, empirical grounding, and clarity of style, but should also offer a clear rationale for soliciting Commentary. That rationale should be provided in the author's covering letter, together with a list of suggested commentators. A target article can be (i) the report and discussion of empirical research; (ii) a theoretical article that formally models or systematizes a body of research; or (iii) a novel interpretation, synthesis, or critique of existing experimental or theoretical work. Articles dealing with social or philosophical aspects of the behavioral and brain sciences are also eligible.

The service of Open Peer Commentary will be primarily devoted to original unpublished manuscripts. However, a recently published book whose contents meet the standards outlined above may also be eligible for Commentary. In such a Multiple Book Review, a comprehensive, 500-line precis by the author is published in advance of the commentaries and the author's response. In rare special cases, Commentary will also be extended to a position paper or an already published article dealing with particularly influential or controversial research.
Submission of an article implies that it has not been published or is not being considered for publication elsewhere. Multiple book reviews and previously published articles appear by invitation only. The Associateship and professional readership of PSYCOLOQUY are encouraged to nominate current topics and authors for Commentary.

In all the categories described, the decisive consideration for eligibility will be the desirability of Commentary for the submitted material. Controversiality simpliciter is not a sufficient criterion for soliciting Commentary: a paper may be controversial simply because it is wrong or weak. Nor is the mere presence of interdisciplinary aspects sufficient: general cybernetic and "organismic" disquisitions are not appropriate for PSYCOLOQUY.

Some appropriate rationales for seeking Open Peer Commentary would be that: (1) the material bears in a significant way on some current controversial issues in behavioral and brain sciences; (2) its findings substantively contradict some well-established aspects of current research and theory; (3) it criticizes the findings, practices, or principles of an accepted or influential line of work; (4) it unifies a substantial amount of disparate research; (5) it has important cross-disciplinary ramifications; (6) it introduces an innovative methodology or formalism for consideration by proponents of the established forms; (7) it meaningfully integrates a body of brain and behavioral data; (8) it places a hitherto dissociated area of research into an evolutionary or ecological perspective; etc.

In order to assure communication with potential commentators (and readers) from other PSYCOLOQUY specialty areas, all technical terminology must be clearly defined or simplified, and specialized concepts must be fully described.
NOTE TO COMMENTATORS: The purpose of the Open Peer Commentary service is to provide a concentrated constructive interaction between author and commentators on a topic judged to be of broad significance to the biobehavioral science community. Commentators should provide substantive criticism, interpretation, and elaboration as well as any pertinent complementary or supplementary material, such as illustrations; all original data will be refereed in order to assure the archival validity of PSYCOLOQUY commentaries. Commentaries and articles should be free of hyperbole and remarks ad hominem.

STYLE AND FORMAT FOR ARTICLES AND COMMENTARIES

TARGET ARTICLES: should not exceed 500 lines (~4500 words); commentaries should not exceed 200 lines (1800 words), including references. Spelling, capitalization, and punctuation should be consistent within each article and commentary and should follow the style recommended in the latest edition of A Manual of Style, The University of Chicago Press. It may be helpful to examine a recent issue of PSYCOLOQUY. All submissions must include an indexable title, followed by the authors' names in the form preferred for publication, full institutional addresses and electronic mail addresses, a 100-word abstract, and 6-12 keywords. Tables and diagrams should be made screen-readable wherever possible (if unavoidable, printable postscript files may contain the graphics separately). All paragraphs should be numbered consecutively. No line should exceed 72 characters, and a blank line should separate paragraphs.

REFERENCES: Bibliographic citations in the text must include the author's last name and the date of publication and may include page references. Complete bibliographic information for each citation should be included in the list of references. Examples of correct style are: Brown (1973); (Brown 1973); (Brown 1973; 1978); (Brown 1973; Jones 1976); (Brown & Jones 1978); (Brown et al. 1978).
References should be typed on a separate sheet in alphabetical order in the style of the following examples. Do not abbreviate journal titles.

Kupfermann, I. & Weiss, K. (1978) The command neuron concept. Behavioral and Brain Sciences 1:3-39.
Dunn, J. (1976) How far do early differences in mother-child relations affect later developments? In: Growing points in ethology, ed. P. P. G. Bateson & R. A. Hinde, Cambridge University Press.
Bateson, P. P. G. & Hinde, R. A., eds. (1978) Growing points in ethology, Cambridge University Press.

EDITING: PSYCOLOQUY reserves the right to edit and proof all articles and commentaries accepted for publication. Authors of articles will be given the opportunity to review the copy-edited draft. Commentators will be asked to review copy-editing only when changes have been substantial.

---------------------------------------------------------------------------

Prof. Stevan Harnad psyc at pucc.princeton.edu
Editor, Psycoloquy phone: +44 1703 594-583 fax: +44 1703 593-281
Department of Psychology http://cogsci.soton.ac.uk/psyc
University of Southampton http://www.princeton.edu/~harnad/psyc.html
Highfield, Southampton ftp://ftp.princeton.edu/pub/harnad/Psycoloquy
SO17 1BJ UNITED KINGDOM ftp://cogsci.soton.ac.uk/pub/harnad/Psycoloquy
news:sci.psychology.journals.psycoloquy
gopher://gopher.princeton.edu/11/.libraries/.pujournals
Sponsored by the American Psychological Association (APA)

From tony at salk.edu Tue May 6 21:40:53 1997 From: tony at salk.edu (Tony Bell) Date: Tue, 6 May 1997 18:40:53 -0700 (PDT) Subject: NIPS 97 deadlines Message-ID: <199705070140.SAA11760@curie.salk.edu>

************** NIPS*97 DEADLINES APPROACHING WARNING *************

Just to remind you all that the deadline for NIPS submissions is MAY 23 and for workshop proposals it is MAY 20, only "a few short weeks away".
All information regarding the conference and submissions (including NIPS LaTeX style files) is on the NIPS web page: http://www.cs.cmu.edu/Groups/NIPS/ So you still have time to hone your algorithms and write your 7 pages!

- Tony Bell (NIPS Publicity)

************** NIPS*97 DEADLINES APPROACHING WARNING *************

From adilson at uenf.br Tue May 6 15:19:44 1997 From: adilson at uenf.br (Adilson GONCALVES) Date: Tue, 6 May 1997 17:19:44 -0200 Subject: No subject Message-ID: <9705061919.AA13655@uenf.br>

Please Distribute. Thank you. Cabral LIMA.

ANNOUNCING InterSymp' 97
9th INTERNATIONAL CONFERENCE on SYSTEMS RESEARCH, INFORMATICS AND CYBERNETICS
to be held August 18-23, 1997 at the Markgraf-Ludwig-Gymnasium in Baden-Baden, Germany

Sponsored by: The International Institute for Advanced Studies in System Research and Cybernetics and Society for Applied Systems Research

The Conference will provide a forum for the presentation and discussion of short reports on current systems research in humanities, sciences and engineering. A number of specialized symposia are being organized to focus on research in computer science, synergetics, cognitive science, psychocybernetics, sociocybernetics, logic, philosophy, management, ecology, health care, education and other related areas. The aim of the Conference is to encourage and facilitate the interdisciplinary and transdisciplinary communication and cooperation amongst scientists, engineers and professionals working in different fields, and to identify and develop those areas of research that will most benefit from such cooperation.

Participants who wish to present a paper are requested to submit two copies of an Abstract (up to 200 words) as soon as possible but not later than May 20, 1997. All submitted papers will be refereed. Those selected will be scheduled for presentation and published in the Conference Proceedings. Notification of acceptance will be sent to authors by July 5, 1997.
The full papers, not exceeding 5 single-spaced typed pages, with photoready copies of artwork, should be submitted by July 25, 1997.

Important Dates
Abstracts/Summary: by May 20, 1997
Notification of Acceptance: by July 5, 1997

InterSymp' 97
9th INTERNATIONAL CONFERENCE on SYSTEMS RESEARCH, INFORMATICS AND CYBERNETICS
PRELIMINARY PROGRAM

Friday - August 15, 1997
10:00-24:00 City Festival (StadtFest)
Saturday - August 16, 1997
10:00-24:00 City Festival (StadtFest)
Sunday - August 17, 1997
10:00-24:00 City Festival (StadtFest)
Monday - August 18, 1997
09:00-12:00 Registration of Participants
14:00-17:30 Opening Session
18:00-20:00 Presidential Reception
Tuesday - August 19, 1997
08:30-12:30 Plenary Session & Symposia
12:30-14:00 Lunch Break
14:00-18:00 Plenary Session & Symposia
Wednesday - August 20, 1997
08:30-12:30 Plenary Session & Symposia
12:30-14:00 Lunch Break
14:00-18:00 Plenary Session & Symposia
Thursday - August 21, 1997
08:30-12:30 Plenary Session & Symposia
12:30-14:00 Lunch Break
14:00-18:00 Plenary Session & Symposia
17:00-18:00 Award Ceremony
Friday - August 22, 1997
10:00-12:00 Board of Directors Meeting
12:00-14:00 Lunch Break
14:00-17:00 General Assembly I
Saturday - August 23, 1997
10:00-12:00 General Assembly II
12:00-14:00 Lunch Break
14:00-16:00 Closing Session

TRAVEL INFORMATION: Baden-Baden is a beautiful spa-resort town and convention center located in the middle of the Black Forest in the western part of Germany. It can be reached in two hours by train from Frankfurt or Stuttgart. The best way to travel to the conference site is to fly first to Frankfurt (or Stuttgart) and then to take an express train to Baden-Baden. Those travelling by car can reach Baden-Baden through Hwy (Autobahn) A5 (Frankfurt-Basel) or through Hwy A8 (Stuttgart-Karlsruhe). The conference will be held at the Markgraf-Ludwig-Gymnasium, located at Hardstr. 2 in the center of Baden-Baden.
This conference site can be easily reached from Baden-Baden railway station by city bus travelling to Augustaplatz, or by taxi.

ACCOMMODATION: All conference participants are responsible for making their own travel arrangements and hotel reservations. Convenient and reasonable accommodation is available in various Baden-Baden hotels, some of which are indicated on the overleaf. Prices for accommodation range between $40.00 and $90.00 (U.S. $) per day, depending on hotel category and the type of occupancy (single, double, etc.). Participants or their travel agencies should make a reservation in a hotel of their choice in writing, indicating preferred price range, type of occupancy and the length of intended stay. This reservation should be made as soon as possible. August is a very popular vacation month in Europe and preferred flights and hotel accommodation may not be easily available unless booked well in advance. Further tourist information and help with hotel reservations in Baden-Baden is provided by:

TOURIST-INFO BUREAU
Augustaplatz 8
76530 Baden-Baden
Germany

CUT HERE

NAME: ____________________________________________ TITLE: _____________________
Institution/Organization: ____________________________________________________________
Mailing Address: __________________________________________________________________
Home Phone: ___________ Office Phone: ___________ Fax: ___________ Email: _____________
I am an: Author / Presenter / Session Organizer / Participant
Tentative Title of My Presentation: ___________________________________________________
Attached is my cheque/money order for Conference Registration (US$300.00 [if paid before May 5, 1997]; US$350.00 [if paid after May 5, 1997]) payable to Intersymp'97.
Please mail your cheque with this form:

before June 12, 1997 to:            after June 12, 1997 to:
Dr. George E. Lasker                Dr. George E. Lasker
School of Computer Science          Hauptpostlagernd
University of Windsor               7001 Stuttgart
Windsor, Ontario, Canada N9B 3P4    Germany

InterSymp' 97 Hotels in Standard Category

Hotel Roemerhof, Sophienstr. 25, 76530 Baden-Baden, Tel.: 07221-23415, Fax: 07221-391707
Hotel Schweizer Hof, Lange Str. 73, 76530 Baden-Baden, Tel.: 07221-24231, Fax: 07221-24069
Hotel Schuetzenhof, Baldreitstr. 1, 76530 Baden-Baden, Tel.: 07221-24088, Fax: 07221-390674
Hotel Deutscher Kaiser, Hauptstr. 35, 76530 Baden-Baden, Tel.: 07221-72152, Fax: 07221-72154
Hotel Pension Schuler (WC & shower on the floor), Lichtentaler Str. 29, 76530 Baden-Baden, Tel.: 07221-23619, Fax: 07221-82639
Hotel Tanneck, Werderstr. 14, 76530 Baden-Baden, Tel.: 07221-23035, Fax: 07221-38327
Hotel Bischoff, Roemerplatz 2, 76530 Baden-Baden, Tel.: 07221-22373, Fax: 07221-38308

Prof. Adilson GONCALVES
Chefe do Laboratorio de Ciencias Matematicas
Centro de Ciencias e Tecnologia
Universidade Estadual do Norte Fluminense
Av. Alberto LAMEGO 2000
Campos, RJ, BRAZIL
t: 0247 263731 f: 0247 263730

****************************
*"Scientia non facit saltum"*
****************************

From info at cogsci.ed.ac.uk Wed May 7 11:52:42 1997 From: info at cogsci.ed.ac.uk (Centre for Cognitive Science) Date: Wed, 7 May 1997 16:52:42 +0100 Subject: MSc in Cognitive Science, Edinburgh Message-ID: <9284.199705071552@lindsay.cogsci.ed.ac.uk>

POSTGRADUATE STUDY IN THE CENTRE FOR COGNITIVE SCIENCE AT THE UNIVERSITY OF EDINBURGH

Cognitive Psychology
Neural Computation
Computational Linguistics
Formal Logic
Data Intensive Linguistics
Logic Programming
Theoretical Linguistics & Knowledge Representation

The Centre for Cognitive Science (CCS) offers a programme of postgraduate study in cognitive science, centred on language and cognition. The programme leads to the degrees of MSc in Cognitive Science and Natural Language, MPhil or PhD.
Some MSc places are still available for the year starting October 1997. CCS is committed to research and postgraduate teaching in cognitive science at international level. The work of the Centre is at the heart of Edinburgh's view of *informatics* -- the study of the structure, behaviour, and design of computational systems, both natural and artificial. CCS has a well-developed system of collaboration with departments within Informatics (Artificial Intelligence, Computer Science) and beyond (Linguistics, Philosophy, Psychology). The Centre's lecturers and research fellows work with over 60 postgraduates in a rich and varied intellectual and social environment. Regular interdisciplinary research workshops, in which students actively participate, focus on current problems in cognitive science. Visiting researchers contribute to a lively seminar series. Research projects, many of them collaborative with other European centres of excellence, have been funded by the UK research councils ESRC, EPSRC and MRC as well as by the European Union LRE and ESPRIT programmes in such areas as natural language understanding and computational neuroscience. 
Teaching staff [with associated departments]:

Ewan Klein (Head of Department) -- linguistic theory, phonology
Chris Brew [HCRC] -- corpora, data intensive linguistics, language technology
Jo Calder [HCRC] -- grammar formalisms, computational linguistics
Matthew Crocker [ESRC Fellow] -- statistical language processing, computational psycholinguistics
Mark Ellison -- computational phonology and morphology, natural computation
Bruce Graham -- computational neuroscience, neural networks
Alexander Holt -- natural language semantics, computational linguistics
Alex Lascarides [HCRC] -- lexical and discourse processing, semantics, pragmatics
Paul Schweizer (PhD Organiser) -- philosophical logic, philosophy of mind, philosophy of language
Richard Shillcock (MSc Course Organiser) -- psycholinguistics, cognitive modelling, cognitive neuropsychology
Keith Stenning [HCRC] -- human memory, inference, connectionism

Associates and Fellows:

Sheila Glasbey (EPSRC Fellow)
M. Louise Kelly [Linguistics]
Robert Ladd [Linguistics]
John Lee [HCRC]
Chris Mellish [Artificial Intelligence]
Jon Oberlander [HCRC]
Massimo Poesio (EPSRC Fellow)
David Willshaw [MRC]

Human Communication Research Centre: The HCRC is a centre of excellence in the interdisciplinary study of cognition and computation in human communication, funded by the Economic and Social Research Council (UK). Drawing together researchers from Edinburgh, Glasgow and Durham, HCRC focuses on the psychological aspects of real language processing. HCRC shares a site with CCS, and the two contribute towards a joint research environment.

Studying in Edinburgh: Edinburgh contains the largest concentration of expertise in Artificial Intelligence and Natural Language Processing in Europe. Students have access to that expertise, to Edinburgh's large copyright libraries, and within Cognitive Science, to a substantial offprint library.
The department possesses extensive computing facilities based on a network of Sun workstations and Apple Macintoshes; access to Edinburgh's concurrent supercomputer and other central computing services is easily arranged.

Requirements: Applicants typically have a first degree in one of the participating areas or an appropriate joint honours degree.

Funding: UK and EU students following the MSc and PhD courses are eligible to apply for studentships. CCS will advise all students concerning funding possibilities. CCS attracts studentships from a variety of UK and non-UK funding bodies. Non-UK applicants with sufficient background may enroll as non-graduating students.

If you would like more information about the Postgraduate Programme in Cognitive Science at the University of Edinburgh, please contact:

Admissions
Centre for Cognitive Science
University of Edinburgh
2 Buccleuch Place
Edinburgh EH8 9LW UK
Telephone: +44 131 650 4667
Fax: +44 131 650 6626
Email: info at cogsci.ed.ac.uk
WWW: http://www.cogsci.ed.ac.uk/

From beigi at watson.ibm.com Wed May 7 13:09:21 1997
From: beigi at watson.ibm.com (Homayoon S.M. Beigi)
Date: Wed, 7 May 1997 13:09:21 -0400 (EDT)
Subject: ISSCI Pattern Recognition Section
Message-ID:

CALL FOR PAPERS
PATTERN RECOGNITION SECTION of ISSCI'98 (WAC'98)
http://ace.unm.edu/wac98/issci.html

Hi, I am putting together a few sessions in the ISSCI'98 conference of WAC'98 revolving around pattern recognition. I am soliciting a 2-page abstract by the END of MAY. The papers may be on any Pattern Recognition related area, including but not limited to Speaker Recognition (Verification and Identification), Speech Recognition, On-Line Handwriting Recognition, Optical Character Recognition (Handwritten and text), and image recognition and classification problems. For details of the conference, please see our web page at http://ace.unm.edu/wac98/issci.html.

Address to send the papers:
Homayoon Beigi
IBM TJ Watson Research Center
P.O.
Box 218 -- Room 36-219
Yorktown Heights, NY 10598 USA

Alternate street address for express mail:
Homayoon Beigi
IBM TJ Watson Research Center
Room 36-219, Route 134
Yorktown Heights, NY 10598 USA
Tel. (914) 945-1894
Fax. (914) 243-4965
EMail: beigi at watson.ibm.com

From avm at CS.ColoState.EDU Wed May 7 13:51:30 1997
From: avm at CS.ColoState.EDU (anneliese von mayrhauser)
Date: Wed, 7 May 1997 11:51:30 -0600 (MDT)
Subject: ICCIMA'98 Second Call for Papers
Message-ID: <171E4667236@gscit-1.fcit.monash.edu.au>

5th International Workshop on Program Comprehension
The Dearborn Inn, Dearborn, Michigan
May 28-30, 1997

=========================
Preliminary Program
=========================

Theme: Comprehending programs written by others is at the heart of various software engineering activities. Program comprehension is performed when one reuses, reengineers, or enhances existing (or legacy) programs. It is also performed during review or code walk-through of new programs. The goal of this workshop is to bring together practitioners and researchers from government, academia, and industry to review the current state of the art and explore solutions to the program comprehension problem.

On-line Web sites for detailed information:
http://www.dis.unina.it/~iwpc97
http://www.cacs.usl.edu/~iwpc97

IWPC'97 Chairs
General Chair: Anneliese von Mayrhauser, Colorado State University, USA
Program Co-Chairs: Gerardo Canfora, University of Salerno, Italy; Arun Lakhotia, University of Southwestern Louisiana, USA
Local Arrangements Chair: Vaclav Rajlich, Wayne State University, USA

Sponsored by: The Institute of Electrical & Electronics Engineers Inc.
IEEE Computer Society Technical Council on Software Engineering
In cooperation with: Wayne State University, Detroit, Michigan; Louisiana Board of Regents

Preliminary Program:

DAY 1: Wednesday, May 28, 1997
==============================

SESSION 1: Intro and Keynote, 8:45-10:00am
Keynote Address: Problems versus Solutions: The Role of the Domain in Software Comprehension. Iris Vessey, Indiana University, USA

SESSION 2: Program understanding-in-the-large, 10:30am-12:00pm
Session Chair: M. Vans, Hewlett-Packard Co., USA
- Relationships between Comprehension and Maintenance Activities -- G. Visaggio, University of Bari, Bari, Italy
- Cognitive design elements to support the construction of a mental model during software visualization -- M.A.D. Storey, F.D. Fracchia, H.A. Muller, University of Victoria, Victoria, Canada
- Understanding-in-the-large -- J.M. Favre, IMAG Institute, Grenoble, France

SESSION 3: Automated Program Understanding, 1:30-3:00pm
Session Chair: Hongji Yang, De Montfort University, UK
- Automated chunking to support program comprehension -- I.J. Burnstein, K. Roberson, Illinois Institute of Technology, Chicago, USA
- Semi-automatic generation of parallelizable patterns from source code examples -- D. Markovic, J.R. Hagemeister, C.S. Raghavendra, S. Bhansali, Washington State University, Pullman, USA
- Using knowledge representation to understand interactive systems -- M. Moore, S. Rugaber, Georgia Institute of Technology, Atlanta, USA

SESSION 4: Program Analysis, 3:30-5:00pm
Session Chair: A. De Lucia, University of Salerno, Italy
- Amorphous Program Slicing -- M. Harman, S. Danicic, University of North London, London, UK
- Dynamic program slicing in understanding of program execution -- B. Korel, J. Rilling, Illinois Institute of Technology, Chicago, USA
- Points-to Analysis for Program Understanding -- P. Tonella, G. Antoniol, R. Fiutem, IRST, Povo (Trento), Italy; E.
Merlo, Ecole Polytechnique, Montreal, Quebec, Canada

DAY 2: Thursday, May 29, 1997
=============================

SESSION 5: Program Comprehension, 8:30-10:00am
Session Chair: J. Q. Ning, Andersen Consulting, USA
- A case study of domain-based program understanding -- R. Clayton, S. Rugaber, L. Taylor, L. Wills, Georgia Institute of Technology, Atlanta, USA
- A little knowledge can go a long way towards program understanding -- J. Sayyad-Shirabad, T.C. Lethbridge, University of Ottawa, Ottawa, Canada; S. Lyon, Mitel Corporation, Kanata, Canada
- Facilitating Program Comprehension via Generic Components for State Machines -- J. Weidl, R. Klosch, G. Trausmuth, H. Gall, Technical University of Vienna, Vienna, Austria

SESSION 6: Finding Reusable Assets, 10:30am-12:00pm
Session Chair: A. Cahill, University of Limerick, Ireland
- Enriching Program Comprehension for Software Reuse -- E.L. Burd, M. Munro, University of Durham, Durham, UK
- Identifying Objects in Legacy Systems -- A. Cimitile, A. De Lucia, University of Salerno, Benevento, Italy; G.A. Di Lucca, A.R. Fasolino, University of Naples, Naples, Italy
- Code Understanding through Program Transformation for Reusable Component Identification -- W.C. Chu, Feng Chia University, Taiwan; P. Luker, Hongji Yang, De Montfort University, Leicester, UK

SESSION 7: Panel, 1:30-3:00pm
Session Chair: V. Rajlich, Wayne State University, USA
Panel: Infrastructure for Software Comprehension and Reengineering
Panellists:
- S. Rugaber, Georgia Institute of Technology, USA
- S. Tilley, Software Engineering Institute, USA
- A. Von Mayrhauser, Colorado State University, USA

SESSION 8: Tools, 3:30-5:00pm
Session Chair: P. Linos, Tennessee Technological University, USA
- Evaluation of the ITOC information system design recovery tool -- A. Berglas, J. Harrison, University of Queensland, Australia
- Glyphs for software visualization -- M.C. Chuah, Carnegie Mellon University, USA; S.G. Eick, Bell Laboratories, USA
- PUI: A Tool to Support Program Understanding -- P.S.
Chan, M. Munro, University of Durham, Durham, UK

DAY 3: Friday, May 30, 1997
===========================

TUTORIAL: 8:30am-4:00pm
Empirical techniques: putting empirical evidence into perspective -- Marian Petre, Open University, UK

Workshop location
The 5th International Workshop on Program Comprehension will be held at the Dearborn Inn, Dearborn, Michigan. Dearborn is a suburb of Detroit and is home to the Henry Ford Museum and historic Greenfield Village, both star attractions. Dearborn is easily accessible from Detroit airport.

Hotel Information
The Dearborn Inn
20301 Oakwood Blvd
Dearborn, MI 48124 USA
Phone: 313-271-2700
Fax: 313-271-7464
Cost: US$119 per night (Single/Double)
Deadline for hotel reservations: May 6, 1997
Airport to hotel: Taxi, $15; Shuttle, $10

============================
IWPC'97 Registration Form
============================

Return this registration form to:
IEEE Computer Society, IWPC'97 Registration
1730 Massachusetts Ave., N.W.
Washington, DC 20036-1992
Fax: 202-728-0884
Phone: 202-371-1013 (sorry, no phone registrations)

Name__________________________________________________________________
Affiliation___________________________________________________________
Mailing Address_______________________________________________________________
______________________________________________________________________
Daytime Phone Number_________________________________________________________
Fax Number__________________________________________________________
E-mail address_______________________________________________________________
IEEE/CS Membership Number:_________________________________

Do not include my mailing address on: __Non-society mailing lists __Meeting attendee lists

Workshop Registration Fees (please check the appropriate fee)
              Advance (received by May 6)   On-Site (received after May 6)
Member        __US $250                     __US $290
Nonmember     __US $320                     __US $370
Student       __US $100                     __US $100

Workshop registration fee includes admission to the workshop, refreshment
breaks, the workshop luncheon, and one copy of the workshop proceedings. Student fee does not include the luncheon. Total Enclosed (in US dollars): $_________ Method of Payment (All payments must be in US dollars, drawn on US banks.) __Personal Check __Company Check __Traveler's Check Please make checks payable to IEEE Computer Society. __Purchase Order (U.S. organizations only) __VISA __Mastercard __American Express __Diners Club Card Number:_________________________________ Expiration Date:_____________________________ Cardholder Name:_____________________________ Signature:___________________________________ Written requests for refunds must be received in the registration office no later than 5/9/97. Refunds are subject to a $50 processing fee. All no-show registrations will be billed in full. Students are required to show current picture ID cards at the time of registration. Registrations after 5/9/97 will be accepted on-site only. ======================= Program Committee ======================= Paul Bailes, University of Queensland, Australia Paolo Benedusi, CRIAI, Italy Keith Bennett, University of Durham, UK Anthony Cahill, University of Limerick, Ireland Doris Carver, Louisiana State University, USA Aniello Cimitile, University of Benevento, Italy Robin Chen, AT&T Research, USA Ugo De Carlini, University of Naples, Italy Prem Devanbu, AT&T Research, USA Stephen G. Eick, AT&T Research, USA Philippe Facon, IEE-CNAM, France Harald Gall, Vienna University of Technology, Austria Ric Holt, University of Toronto, Canada Daniel Jackson, Carnegie Mellon University, USA Robin Jeffries, SunSoft, Inc., USA Rene Kloesch, Vienna University of Technology, Austria Panos Linos, Tennessee Technological University, USA Ettore Merlo, Ecole Polytechnique, Canada Hausi A. Muller, University of Victoria, Canada Malcolm Munro, University of Durham, UK Jim Q. 
Ning, Andersen Consulting, USA Alex Quilici, University of Hawaii, USA Vaclav Rajlich, Wayne State University, USA Spencer Rugaber, Georgia Institute of Technology, USA Dennis Smith, Software Engineering Institute, USA Harry M. Sneed, SES, Germany Jorma Taramaa, VTT Electronics, Finland Scott Tilley, Software Engineering Institute, USA Maria Tortorella, University of Naples, Italy Marie Vans, Hewlett-Packard Co., USA Giuseppe Visaggio, University of Bari, Italy Norman Wilde, University of Western Florida, USA Linda Wills, Georgia Institute of Technology, USA Hongji Yang, De Montfort University, UK From honavar at cs.iastate.edu Thu May 8 11:48:27 1997 From: honavar at cs.iastate.edu (Vasant Honavar) Date: Thu, 8 May 1997 10:48:27 -0500 (CDT) Subject: Call for Participation: Workshop on Automata Induction, Grammatical Inference, and Language Acquisition Message-ID: <199705081548.KAA05472@ren.cs.iastate.edu> A non-text attachment was scrubbed... Name: not available Type: text Size: 6576 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/caec0088/attachment.ksh From berthold at ira.uka.de Thu May 8 11:54:21 1997 From: berthold at ira.uka.de (Michael Berthold) Date: Thu, 8 May 1997 17:54:21 +0200 Subject: IDA-97 Call for Participation Message-ID: <"i80fs1.ira.061:08.05.97.15.55.21"@ira.uka.de> CALL FOR PARTICIPATION The Second International Symposium on Intelligent Data Analysis (IDA-97) Birkbeck College, University of London 4th-6th August 1997 In Cooperation with AAAI, ACM SIGART, BCS SGES, IEEE SMC, and SSAISB [ http://web.dcs.bbk.ac.uk/ida97.html ] You are invited to participate in IDA-97, to be held in the heart of London. IDA-97 will be a single-track conference consisting of oral and poster presentations, invited speakers, demonstrations and exhibitions. 
The conference Call for Papers introduced a theme, "Reasoning About Data", and many papers complement this theme, but other exciting topics have emerged, including exploratory data analysis, data quality, knowledge discovery and data-analysis tools, as well as the perennial technologies of classification and soft computing. A new and exciting theme involves analyzing time-series data from physical systems, such as medical instruments, environmental data and industrial processes.

Information regarding registration can be found on the IDA-97 web page (address listed above). Please note that there are reduced rates for early registration (before 2nd June). Also, there are still a limited number of spaces available for exhibition, and potential exhibitors are encouraged to book early (the application deadline is 2nd June).

Provisional Technical Program Schedule: Intelligent Data Analysis 97

Monday 4 August

10:00 to 10:30 Opening Ceremony
10:30 to 11:45 Invited Presentation I
Intelligent Data Analysis: Issues and Opportunities .... David J Hand (UK)
(The abstract of Professor Hand's talk, and a brief biographical sketch, can be found at the end of this document)
12:00 to 1:30pm LUNCH
1:30 to 2:45 PAPER SESSION 1: Exploratory Data Analysis and Preprocessing
Decomposition of heterogeneous classification problems .... C Apte, S J Hong, J Hosking, J Lepre, E Pednault & B Rosen (USA)
Managing Dialogue in a Statistical Expert Assistant with a Cluster-based User Model .... M Muller (South Africa)
How to Find Big-Oh in Your Data Set (and How Not To) .... C McGeoch, D Precup and P R Cohen (USA)
2:45 to 3:00 COFFEE BREAK
3:00 to 4:45 PAPER SESSION 2: Classification and Feature Selection
A Connectionist Approach to Structural Similarity Determination as a Basis of Clustering, Classification and Feature Detection .... K Schadler and F Wysotzki (Germany)
Efficient GA Based Techniques for Automating the Design of Classification Models ....
R Glover and P Sharpe (UK) Data Representation and ML Techniques .... C Lam, G West and T Caelli (Australia) Development of a Knowledge-Driven Constructive Induction Mechanism .... S Lo and A Famili (Canada) 4:45 to 5:00 COFFEE BREAK 5:00 to 5:45 POSTER SESSION I: Introduction TOPIC ONE: Exploratory Data Analysis, Preprocessing and Tools Data Classification Using a W.I.S.E. Toolbox .... I Berry and P Gough (UK) Mill's Methods for Complete Intelligent Data Analysis .... T Cornish (UK) Integrating Many Techniques for Discovering Structure in Data .... D Gregory and P Cohen (USA) Meta-Reasoning for Data Analysis Tool Allocation .... R Levinson and J Wilkinson (USA) Navigation for Data Analysis Systems .... R St Amant (USA) An Annotated Data Collection System to Support Intelligent Analysis of Intensive Care Unit Data .... C Tsien and J Fackler (USA) A Combined Approach to Uncertain Data Manipulation .... H Yu and A Ramer (Australia) TOPIC TWO: Classification and Feature Selection Oblique Linear Tree .... J Gama (Portugal) Feature selection for Neural Networks through Functional Links found by Evolutionary Computation .... S Haring, J Kok and M van Wezel (The Netherlands) Overfitting Explained: a Case Study .... D Jensen, T Oates and P Cohen (USA) Exploiting Symbolic Learning in Visual Inspection .... M Piccardi, R Cucchiara, M Bariani and P Mello (Italy) Forming Categories in Exploratory Data Analysis and Data Mining .... P Scott, H Williams and K Ho, (UK) A systematic description of greedy optimisation algorithms for cost sensitive generalisation .... M van Someren, C Torres and F Verdenius (The Netherlands) Automatic classification within object knowledge bases .... P Valtchev and J Euzenat (France) 5:45 to 7:00 POSTER SESSION I: Posters 7:00 to 9:00 Conference Reception Tuesday 5 August 9:15 to 10:30 Invited Presentation II Given 3,000,000,000 Nucleotides, Induce a Person or Intelligent Data Analysis for Molecular Biology .... 
Lawrence Hunter (USA) (The abstract of Dr Hunter's talk, and a brief biographical sketch, can be found at the end of this document) 10:30 to 10:45 COFFEE BREAK 10:45 to 12:00 PAPER SESSION 3: Medical Applications ECG Segmentation using Time-Warping .... H Vullings, M Verhaegen and H Vergruggen (The Netherlands) Interpreting longitudinal data through temporal abstractions: an application to diabetic patients monitoring .... R Bellazzi and C Larizza (Italy) Intelligent Support for Multidimensional Data Analysis in Environmental Epidemiology .... V Kamp and F Wietek (Germany) 12:00 to 1:30pm LUNCH 1:30 to 2:45 PAPER SESSION 4: Soft Computing Network Performance Assessment for Neurofuzzy Data Modelling .... S Gunn, M Brown and K Bossley (UK) A Genetic Approach to Fuzzy Clustering with a Validity Measure Fitness Function .... S Nascimento and F Moura-Pires (Portugal) The Analysis of Artificial Neural Network Data Models ....C Roadknight, D Palmer-Brown and G Mills (UK) 2:45 to 3:00 COFFEE BREAK 3:00 to 4:15 PAPER SESSION 5: Knowledge Discovery A Strategy for Increasing the Efficiency of Rule Discovery in Data Mining .... D McSherry (UK) Knowledge-Based Concept Discovery from Textual Data .... U Hahn and K Schnattinger (Germany) Knowledge Discovery in Endgame Databases .... M Schlosser (Germany) 4:15 to 4:30 COFFEE BREAK 4:30 to 5:15 POSTER SESSION II: Introduction TOPIC ONE: Fuzzy and Soft Computing Simulation Data Analysis Using Fuzzy Graphs .... K-P Huber and M Berthold (Germany) Mathematical Analysis of Fuzzy Classifiers .... F Klawonn and E-P Klement (Germany; Austria) Neuro-Fuzzy Diagnosis System with a Rated Diagnosis Reliability and Visual Data Analysis .... A Lapp and H Kranz (Germany) Genetic Fuzzy Clustering by means of Discovering Membership Functions .... M Turhan (Turkey) TOPIC TWO: Data Mining Parallelising Induction Algorithms for Data Mining .... J Darlington, Y Guo, J Sutiwaraphun, H To (UK) Data Analysis for Query Processing .... 
J Robinson (UK) Datum Discovery .... L Siklossy and M Ayel (France) Using neural network to extract knowledge from database .... Y Zhou, Y Lu and C Shi (China) TOPIC THREE: Estimation, Clustering A Modulated parzen-Windows Approach for Probability Density Estimation .... G van den Eijkel, J van der Lubbe and E Backer (The Netherlands) Improvement on Estimating Quantiles in Finite Population Using Indirect Methods of Estimation .... M Garcia, E Rodriguez and A Cebrian (Spain) Robustness of Clustering under Outliers .... Y Kharin (Belarus) The BANG-Clustering System: Grid-Based Data Analysis .... E Schikuta and M Erhart (Austria) TOPIC FOUR: Qualitative Models Diagnosis of Tank Ballast Systems .... B Schieffer and G Hotz (Germany) Qualitative Uncertainty Models from Random Set Theory .... O Wolkenhauer (UK) 5:15 to 6:30 POSTER SESSION II: Posters 7:30 - Conference Dinner Wednesday 6 August 9:15 to 10:30 PAPER SESSION 6: Data Quality Techniques for Dealing with Missing Values in Classification .... W Liu, A White, S Thompson and M Bramer (UK) The Use of Exogenous Knowledge to Learn Bayesian networks from Incomplete Databases .... M Ramoni and P Sebastiani (UK) Reasoning about Outliers in Visual Field Data .... J Wu, G Cheng and X Liu (UK) 10:30 to 10:45 COFFEE BREAK 10:45 to 12:00 PAPER SESSION 7: Qualitative Models Reasoning about sensor data for automated system identification .... E Bradley and M Easley (USA) Modeling Discrete Event Sequences as State Transition Diagrams .... A E Howe and G Somlo (USA) Detecting and Describing Patterns in Time-Varying Data Using Wavelets .... S Boyd (Australia) 12:00 to 12:30 Closing Ceremony 12:30 to 2:00pm LUNCH 2:00 to 4:00 IDA Open Business Meeting -------------------------------------------------------------------------- Intelligent data analysis: issues and opportunities David J. 
Hand Modern data analysis is the product of the union of several disciplines: statistics, computer science, pattern recognition, machine learning, and others. Perhaps the oldest parent is statistics, being driven by the demands of the different areas to which it has been applied. More recently, however, the possibilities arising from powerful and available computers have stimulated a revolution. Data of new kinds and in unimaginable quantities now occur; they bring with them entirely new classes of problems, problems to which the classical statistical solutions are not always well-matched; these problems in turn require novel and original solutions. In this talk I look at some of these new kinds of data, and the associated problems and solutions. The data include data sets which are large in dimensionality or number of records, data which are dependent on each other, and that special kind of qualitative data known as metadata. New problems arising from these data include straightforward mechanical issues of how to handle them, how to estimate descriptors and parameters (adaptive and sequential methods are obviously more important than in classical statistics), the (ir)relevance of significance tests, and automatic data analysis (as in anomaly detection in large data sets or, quite differently, in automatic model fitting). Some of the new types of model which are becoming so important nicely illustrate the interdisciplinary nature of modern data analysis: rule-based systems, hidden Markov models, neural networks, genetic algorithms, and so on. These are briefly discussed. All of this leads us to consider more carefully the link between data and information and to recognise the complementary data analytic abilities and powers of humans and computers. But we can go too far. If there is 'intelligent data analysis' there is also 'unintelligent data analysis'. Two different manifestations of the latter are examined, and a cautionary note sounded. -------------- David J. 
Hand is Professor of Statistics at the Open University in the UK. He has published over 150 papers and fourteen books, including Artificial Intelligence Frontiers in Statistics, Practical Longitudinal Data Analysis, and, most recently, Construction and Assessment of Classification Rules. He is founding editor and editor-in-chief of the journal Statistics and Computing. His research interests include developments at the interface between statistics and computing, multivariate statistics, the foundations of statistics, and applications in medicine, psychology, and finance. ------------------------------------------------------------------------ Given 3,000,000,000 Nucleotides, Induce a Person or Intelligent Data Analysis for Molecular Biology Lawrence Hunter In the last decade or so, large scale gene sequencing, combinatorial biochemistry, DNA PCR and many other innovations in molecular biotechnology have transformed biology from a data-poor science to a data-rich one. This data is a harbinger of great change in medicine, in agriculture, and in our fundamental understanding of life. However, the availability of an exponentially growing onslaught of relevant data is only the first step toward understanding. There are many scientifically and economically significant opportunities (and challenges) for intelligent data analysis in exploiting this information. In this talk, I will give a brief overview of the kinds of data available and the open problems in the field, describe a few successes, and speculate about the future. ------------------- Dr. Lawrence Hunter is the director of the Machine Learning Project at the (U.S.) National Library of Medicine, and a Fellow of the Krasnow Institute of Advanced Study in Cognition at George Mason University. He received his Ph.D. in Computer Science from Yale University in 1989. 
He edited the MIT Press book "Artificial Intelligence and Molecular Biology," and was recently elected the founding president of the International Society for Computational Biology. His research contributions span the range from basic contributions to machine learning methodology to the development of IDA technology for clinical and pharmaceutical industry applications.

From ingber at ingber.com Thu May 8 17:41:20 1997
From: ingber at ingber.com (Lester Ingber)
Date: Thu, 8 May 1997 17:41:20 -0400
Subject: EEG data now publicly available
Message-ID: <19970508174120.21441@ingber.com>

EEG data now publicly available

It is extremely difficult for modelers of nonlinear time series, and EEG systems in particular, to get access to large sets of raw clean data. Such a set of data was acquired and used for the study in

%A L. Ingber
%T Statistical mechanics of neocortical interactions: Canonical momenta indicators of electroencephalography
%J Physical Review E
%V 55
%N 4
%P 4578-4593
%D 1997
%O URL http://www.ingber.com/smni97_cmi.ps.Z

The above adaptive simulated annealing (ASA) application to EEG analysis is one of several ASA applications being prepared for the SPEC (Standard Performance Evaluation Corporation) CPU98 suite. Eventually the code used to perform these calculations will be published on a CDROM by SPEC.

Raw EEG data is now publicly available, as described in
http://www.ingber.com/smni_eeg_data.html
ftp://ftp.ingber.com/MISC.DIR/smni_eeg_data.txt

The ASA code is publicly available at no charge from
http://www.ingber.com/
ftp://ftp.ingber.com

A complete homepage is mirrored on http://www.alumni.caltech.edu/~ingber/

Lester
--
/* RESEARCH ingber at ingber.com * * INGBER ftp://ftp.ingber.com * * LESTER http://www.ingber.com/ * * Prof. Lester Ingber __ PO Box 857 __ McLean, VA 22101-0857 __ USA */

From bruno at redwood.ucdavis.edu Thu May 8 14:44:26 1997
From: bruno at redwood.ucdavis.edu (Bruno A.
Olshausen) Date: Thu, 8 May 1997 11:44:26 -0700 Subject: postdoctoral openings Message-ID: <199705081844.LAA17724@redwood.ucdavis.edu> MULTIDISCIPLINARY POSTDOCTORAL TRAINING A new National Science Foundation Biology Research Training Group at the University of California, Davis, invites postdoctoral applications from United States citizens and permanent residents. This multidisciplinary program is designed to provide biologists with sufficient mathematical skills and applied mathematicians with sufficient biological knowledge to solve problems in cell physiology, neurobiology, biofluiddynamics, ecology, and population biology. Relevant training faculty within neurobiology include Charles Gray (visual cortex physiology), Joel Keizer (modeling of intracellular dynamics), Bruno Olshausen (computational models of vision), and Mitch Sutter (auditory cortex physiology). Applicants should see the Research Training Group webpage (http://www.itd.ucdavis.edu/rtg) for information about applications, which are due by June 1, 1997. From pci-inc at aub.mindspring.com Tue May 6 14:52:37 1997 From: pci-inc at aub.mindspring.com (Mary Lou Padgett) Date: Tue, 06 May 1997 14:52:37 -0400 Subject: ICNN97 FINAL PROGRAM & REGISTRATION Message-ID: <2.2.16.19970506185237.1d37b920@pop.aub.mindspring.com> PLEASE CIRCULATE WIDELY (APOLOGIES IF YOU RECEIVE DUPLICATE COPIES) ICNN'97 FINAL PROGRAM Schedule and Registration (Conference, Tutorials, Tours, Hotel) *** Check our website: http://www.mindspring.com/ICNN97/ for details, final paper abstracts *** I. SCHEDULE _______________________________________________ SUNDAY, June 8, 1997 TUTORIALS _______________________________________________ 9:00 AM - 12:00 Noon T1: Neural Networks for Consciousness. J. G. Taylor: King's College, London T2: Network ensembles and hybrid systems. Joydeep Ghosh: Univ. of Texas 13:30 - 16:30 PM T3: Neuro-Fuzzy Recognition System: Concepts, Features and Feasibility. Sankar K. 
Pal: Indian Statistical Institute
T4: Learning from Examples: from theory to practice. Don R. Hush: University of New Mexico
18:00 - 21:00 PM
T5: Principles of Neurobiological Information Processing for Biology-Inspired Neural Computers. Rolf Eckmiller: University of Bonn
T6: Hybrid Intelligent Information Systems - Models, Tools, Applications. Nik Kasabov: University of Otago; Robert Kozma: Tohoku University
_______________________________________________
MONDAY, June 9, 1997
_______________________________________________
8:30 - 10:30 Opening Remarks and Plenary Session
Plenary talk (David Waltz: Neural Nets and AI: Time for a Synthesis)
Plenary talk (Jean-Jacques Slotine: Adaptive approximation networks for stable learning and control)
10:30 - 10:50 Coffee Break
10:50 - 12:30 8 parallel sessions of 5 papers each: AP1, SU1, LM1, PR1, TS1, AR1, CI1, SS6
12:30 - 13:50 Lunch Break
13:50 - 15:50 8 parallel sessions of 6 papers each: AP2, SU2, LM2, PR2, TS2, AR2, CI2, EC1
15:50 - 16:10 Tea Break
16:10 - 18:10 4 parallel sessions of 6 papers each: SS1, SS3, SS4, SS5; also Panel Session: Classical Connectionist Learning
18:30 - 20:30 Opening Reception
_______________________________________________
TUESDAY, June 10, 1997
_______________________________________________
8:30 - 9:40 Parallel Plenary Talks
Plenary talk (James Bezdek: A geometric approach to edge detection)
Plenary talk (Teuvo Kohonen: Exploration of very large databases by self-organizing maps)
9:40 - 10:00 Coffee Break
10:00 - 12:00 8 parallel sessions of 6 papers each: AP3, SU3, LM3, PR3, TS3, OA1, CI3, EC2
12:00 - 13:20 Lunch Break
13:20 - 14:30 Plenary talk (Peter Fox: Functional volume models: System level models for functional neuroimaging); Plenary talk (Kaoru Hirota: Research and application aspects in soft computing: History and recent trends in Japan)
14:40 - 16:00 8 parallel sessions of 4 papers each: AP4, AR3, LM4, EC3, TS4, OA2, SS2.1, SS9.1
16:00 - 16:20 Tea Break; also Poster I (starts at 16:00): ARP1,
CIP1, ECP1, OAP1, RVP1, TSP1 16:20 - 18:20 Panel Session: Brain Imaging Poster Session I (ends at 18:20) 20:00 - 22:00 INNS / SIG Meetings _______________________________________________ WEDNESDAY, June 11, 1997 _______________________________________________ 8:30 - 9:40 Parallel Plenary Sessions Plenary talk (Joaquin Fuster: Structure and dynamics of network memory) Plenary talk (Geoffrey Hinton: Towards neurally plausible Bayesian networks) 9:40 - 10:00 Coffee Break 10:00 - 12:00 8 Parallel sessions of 6 papers each TS5, SU4, LM5, PR4, BI1, EC4, RV1, EO1 12:00 - 13:20 Lunch Break 13:20 - 14:30 Parallel Plenary Sessions Plenary talk (Karl Pribram: The deep and surface structure of memory) Plenary talk (Eric Baum: Reinforcement learning by an economy of agents) 14:40 - 16:00 8 Parallel sessions of 4 papers each AR4, AP5, CS1, EO2, OA3, SS8, SS2.2, SS9.2 16:00 - 16:20 Tea Break Poster II (starts at 16:00) APP2, BIP2, CSP2, EOP2, LMP2, PRP2, SUP2 16:20 - 18:20 Panel Session: Creativity Poster Session II (ends at 18:20) 19:00 - 22:00 BANQUET Presentation of IEEE Fellowship Awards by the President of the IEEE Neural Networks Council, Dr. James C. Bezdek Banquet talk (Robert J. 
Marks II: Neural Networks, reduction to practice) _______________________________________________ THURSDAY, June 12, 1997 _______________________________________________ 8:30 - 9:40 Plenary Session Plenary talk (Paul Werbos: From neuro-control to brain-like intelligence) 9:40 - 10:00 Coffee Break 10:00 - 12:00 8 Parallel sessions of 6 papers each BI2, SU5, TS6, EO3, PR5, EC5, OA4, SS7.1 also 12:00 - 13:20 Lunch Break 13:20 - 15:20 8 Parallel sessions of 6 papers each CS2, SU6, AP6, RV2, PR6, EC6, OA5, SS7.2 also Tours ADJOURN _______________________________________________ ICNN'97 PAPER LISTS are on the web site * Note Two Letter Abbreviations for Sessions: AP Applications SU Supervised/Unsupervised Learning LM Learning and Memory BI Biological Neural Nets CS Cognitive Science and Cognitive Neuroscience EO Electronics and Optical Implementation PR Pattern Recognition and Image Processing RV Robotics and Vision OA Optimization and Associative Memory TS Speech Processing, Time Series and Filtering AR Architectures CI Computational Intelligence SS Special Sessions SS1: Adaptive Critic Designs SS2: Visual System Models & Prostheses SS3: Adaptive Applications SS4: Linguistic Rule Extraction SS5: Intelligent Control Theory & Applications SS6: NN Appl. for Monitoring of Complex Systems SS7: Biomedical Applications SS8: Sensors and Biosensors SS9: Knowledge-based Methods in NN ______________________________________________________ ______________________________________________________ ICNN97 CONFERENCE REGISTRATION ______________________________________________________ 1997 IEEE International Conference on Neural Networks June 9-12, 1997 Westin Galleria Hotel, Houston, Texas, USA Circle One: Dr. Mr. Ms. Last Name:____________________________________ First Name: ______________________________________ IEEE or INNS Membership Number: _______________________________ Affiliation: _____________________ (Must be current to qualify for discount. 
Email questions to: pci-inc at mindspring.com ) Mailing Address: ________________________________________________________________________________ City: _____________________________State: ______________________ Zip: ___________ Country: _________ Phone: __________________________ Fax: __________________________ Email: _________________________ Information to Appear on Badge: (First Name or "Nickname") ________________________________________ ______________________________________________________ Conference Registration Fees: Early Rate Late Rate (Before May 9, 1997) (After May 9, 1997) IEEE or INNS Members $375 $450 Non-Members $425 $550 Students* $110 $150 * A letter from the Department Head to verify full-time student status at the time of registration is required. At the conference, all students must present a current student ID with picture. Student registration does not include the banquet. ______________________________________________________ ______________________________________________________ TUTORIALS ______________________________________________________ Tutorial Registration Fees: (Tutorials June 8, 1997) One Tutorial $300 Two Tutorials $450 Three Tutorials $550 Student - One Tutorial $300 Student - Two Tutorials $450 Student - Three Tutorials $550 Tutorial Selection: (Circle desired tutorials) Morning Tutorials: 9:00 am - 12:00 noon T1 "Neural Networks for Consciousness" T2 "Network Ensembles and Hybrid Systems" Afternoon Tutorials: 1:30 pm - 4:30 pm T3 "Neuro-Fuzzy Recognition System: Concepts, Features and Feasibility" T4 "Learning from Examples: From Theory to Practice" Evening Tutorials: 6:00 pm - 9:00 pm T5 "Principles of Neurobiological Information Processing for Biology-Inspired Neural Computers" T6 "Hybrid Intelligent Information Systems - Models, Tools, Applications" Tutorial Registration is on a first-come, first-served basis. 
______________________________________________________ Please make check payable to: ICNN 97 Or indicate credit card payment(s) enclosed ______ Mastercard ______ Visa ______ Amex Credit Card No.: _____________________________________________ Exp. Date: ________________________ Signature: ___________________________________________________ For Credit Card Registration Only, Fax to: (714) 752-7444 Registrations by check/money order must be mailed. Payments Enclosed: Registration Fees: U.S. $ ________________________ Tutorial Fees: U.S. $ ________________________ Additional Proceedings: U.S. $ ________________________ GRAND TOTAL: U.S. $ ________________________ IMPORTANT NOTE: All registrations are fully processed only after payment is received. Payments made for registrations after the early deadline (May 9, 1997) are subject to the late registration fee. Cancellations received in writing by Meeting Management by May 1, 1997 will receive a refund, minus a $100 administrative charge. No refunds will be issued after that date, although substitutions may be made any time before June 5, 1997 by faxing/mailing the substitute registrant's name to Meeting Management. All other substitutions and registrations after June 5, 1997 must be handled on-site. 
______________________________________________________ Please mail or fax your completed Conference Registration form, along with your payment to: ICNN 97 Meeting Management 2603 Main Street, Suite 690 Irvine, CA 92614, USA Phone: (714) 712-8205; FAX: (714) 752-7444; Email: MeetingMgt at aol.com ______________________________________________________ ______________________________________________________ ICNN97 TOURS ______________________________________________________ TOUR #1: Space Center Houston TOUR #2: Moody Gardens TOUR #3: Theater Museum Districts TOUR #4: Virtual Environment Technology Laboratory ______________________________________________________ TOUR #1: Space Center Houston A state-of-the-art education and entertainment complex designed by Walt Disney Imagineering. The magical adventure begins in Space Center Plaza where a space shuttle mock-up welcomes you. Exhibits and demonstrations let you land a space shuttle by computer simulation, touch a real moon rock, and listen to communications between Mission Control and astronaut crews on board the space shuttle. In Destiny Theater, relive the great moments of the space program in the film "On Human Destiny." Adjacent to the complex is Johnson Space Center (JSC), home to the famed Mission Control Center and Rocket Park, the outdoor home of retired flight hardware too huge to house indoors. There will be a technical tour of JSC led by NASA engineers. For the technical tour, attendees must be either US citizens or permanent residents. ______________________________________________________ TOUR #2: Moody Gardens Traveling to the Rainforest Pyramid at Moody Gardens in Galveston is the ultimate treat. Housing a tropical rainforest with waterfalls, cliffs and caverns, the 10-story glass pyramid, open daily, holds more than 2,000 species of exotic plants, animals, tropical fish and butterflies. 
Also on the grounds are the 3-D IMAX Theater, the white sand Palm Beach, the Colonel Paddlewheel Boat, the fascinating Bat Cave, and acres of lush gardens to explore. ______________________________________________________ TOUR #3: Theater Museum Districts Flanked by the striking, contemporary architecture of downtown Houston, the Theater District covers ten square blocks of the city's central business district and is home to five of the country's most innovative performing arts companies. The Museum District, a lovely oak-lined area near Hermann Park and the Texas Medical Center, is anchored by the Museum of Fine Arts, the Contemporary Arts Museum, the Children's Museum of Houston, the nearby Menil Collection and the Museum of Natural Science. Two intriguing new museums in the area are the Holocaust Museum and the Museum of Health and Medical Science. ______________________________________________________ TOUR #4: Virtual Environment Technology Laboratory The Virtual Environment Technology Laboratory (VETL) is a joint enterprise of the University of Houston and NASA/Johnson Space Center. The laboratory's objectives include research and development activities utilizing virtual environments in (1) scientific/engineering data visualization, (2) training, and (3) education. With a complement of over three million dollars in high performance computing and display equipment, and this region's only CAVE (a cube, ten feet on a side, with four display surfaces for total immersion), the VETL is advancing the state-of-the-art in virtual environment technology. The VETL is capable of displaying the visual components of virtual environments via monitors, stereoscopic head-mounted displays (HMDs), and projection displays. Demonstrations will include interactive immersive virtual environments in the CAVE and Flogiston "flostation" as well as 3-D modeling. 
______________________________________________________ TOUR REGISTRATION FORM ______________________________________________________ I am interested in attending: (circle one) Tour #1 Tour #2 Tour #3 Tour #4 Name:______________________________________________________ (First) (Last) Address:______________________________________________________ (Street Address) (City) (State) (Zip) Phone: ________________________________ Fax: ____________________________ Social Security Number (Space Center attendees only): _________________________________ IMPORTANT NOTE: Prices for tours have not been finalized. Please fill out this form if you are interested in attending a tour. Tour fee information will be included on your registration confirmation letter. ______________________________________________________ Please mail your completed Tour form to: ICNN 97 Meeting Management 2603 Main Street, Suite 690 Irvine, CA 92614, USA Phone: (714) 712-8205; FAX: (714) 752-7444; Email: MeetingMgt at aol.com ______________________________________________________ ICNN97 HOTEL REGISTRATION FORM ______________________________________________________ ABOUT THE CONFERENCE VENUE The Conference will be held in the Westin Galleria Hotel, located in The Galleria complex, a glass-enclosed entertainment/shopping center with over 350 retail stores in the heart of uptown Houston. Houston, Texas is America's fourth largest city and is also the capital of the international energy industry, the largest international port in North America, and the headquarters for America's Manned Space Flight Effort. Houston is also home to the world's largest medical center and is a burgeoning mecca of multicultural arts, which helps define the colorful nature of this city. ______________________________________________________ TRAVEL INFORMATION Houston boasts two major international airports: Houston Intercontinental and Hobby. 
Houston Intercontinental Airport is the eighth largest airport system in the United States for international travel. It has three modern terminals connected by a state-of-the-art subway system. Hobby Airport offers full US and regional service with easy access to all parts of the city. Travel is very convenient, with many non-stop flights and easy connections from other major cities, as Houston is a hub for many airlines. ICNN attendees will find that flexible flight schedules and low fares await them in Houston. For special discounted rates, contact Nationwide Travel at (714) 847-1788. Be sure to identify yourself as an IEEE ICNN 97 attendee. Round-trip shuttle bus service to both airports is available through Airport Express, which provides passenger pick-up and drop-off at both the Westin Galleria and The Westin Oaks Hotels. The cost of a one-way fare to Houston Intercontinental Airport is $16.00, and the airport is approximately 25 miles from the Galleria area. A flat rate taxi fare is $35.00 each way to Houston Intercontinental Airport. Approximate travel time is one hour. The one-way shuttle bus fare to Hobby Airport is $11.00. Hobby Airport is approximately 20 miles from the Galleria area and taxi fare is $26.00 each way. Approximate travel time is 45 minutes. Service to both airports runs every hour on the hour, from 5:30am to 7pm from the hotel. Reservations are required only for your return trip to the airport from the hotel. ______________________________________________________ HOTEL RESERVATION FORM ______________________________________________________ 1997 IEEE INTERNATIONAL CONFERENCE ON NEURAL NETWORKS June 8-12, 1997 Westin Galleria Hotel 5060 West Alabama Houston, Texas 77056 Phone: (713) 960-8100 Fax: (713) 960-6549 ***** FLASH: PLENTY OF ROOM LEFT IN HOTEL AS OF MAY 6 -- YOU MAY HAVE TO MENTION "IEEE CONFERENCE" -- IF YOU HAVE TROUBLE RESERVING A ROOM, CALL MEETING MANAGEMENT AT (714) 752-8205 OR DR. KARAYIANNIS (713) 743-4436 ! 
***** Rates: $119.00 Single or Double Occupancy Note: All guest rooms have one king-size bed or two double beds. There is an additional charge for roll-aways. There will be an additional charge of $15.00 for each additional person in the room. Discounted rates are available to attendees three days prior and three days after the dates of the conference. Reservations must be received by May 18, 1997 to ensure availability and special rates. Reservations and deposits received after this date will be accepted on a space-available basis at the hotel's published rates. Please MAIL or FAX this form to: Westin Galleria Hotel at the above address. RESERVATIONS MUST BE ACCOMPANIED BY A DEPOSIT FOR THE FIRST NIGHT PLUS 15% TAX. (Check Appropriate Blanks) Bed Type Request: ______ One King Bed ______Two Double Beds Smoking Room: ______ No ______Yes Arrival Date/Time: ____________________________ (Check in time is 3:00 pm) Departure Date: ______________________________ (Check out time is 12 noon) Name: _________________________________________________ Organization/Firm: ________________________ Address: ______________________________________________________ City: ______________________________________ State: _______________________ Zip: ___________________ Sharing Room with: _________________________________________ Special Requests: ______________________ Reservations at the Westin Galleria Hotel require one night's deposit or credit card guarantee (including 15% tax plus $1.50 occupancy tax). Please complete the following: ______ Enclosed is a check or money order for $__________________ OR ______ Enclosed is my credit card information authorizing my reservation to be guaranteed in the amount of $ _______ Reservations must be canceled 72 hours prior to arrival date for deposit refund. A $35.00 charge will be assessed if your departure is changed to an earlier date after check-in. Reservations are subject to cancellation at 4pm if not guaranteed. 
______American Express ______Mastercard ______Diners Club ______Visa ______Carte Blanche ______Discover Credit Card #: ________________________________________________ Exp. Date: _______________________ Please print name as it appears on the card: __________________________________________________________ Signature: ____________________________________________________________________________ ________ NOTE: DO NOT EMAIL YOUR CREDIT INFORMATION. PRINT AND FAX OR MAIL THESE FORMS Mary Lou Padgett m.padgett at ieee.org http://www.mindspring.com/~pci-inc/ICNN97/ Mary Lou Padgett 1165 Owens Road Auburn, AL 36830 P: (334) 821-2472 F: (334) 821-3488 m.padgett at ieee.org Auburn University, EE Dept. Padgett Computer Innovations, Inc. (PCI) Simulation, VI, Seminars IEEE Standards Board -- Virtual Intelligence ( VI): NN, FZ, EC, VR http://www.mindspring.com/~pci-inc/ From henrys at gscit-1.fcit.monash.edu.au Tue May 6 16:08:06 1997 From: henrys at gscit-1.fcit.monash.edu.au (Henry Selvaraj) Date: Tue, 6 May 1997 16:08:06 EST-10 Subject: ICCIMA'98 Second Call for Papers Message-ID: <14E329D54CB@gscit-1.fcit.monash.edu.au> ICCIMA'98 International Conference on Computational Intelligence and Multimedia Applications 9-11 February 1998 Monash University, Gippsland Campus, Churchill, Australia S E C O N D C A L L F O R P A P E R S The International Conference on Computational Intelligence and Multimedia Applications will be held at Monash University on 9-11 February 1998. The conference will provide an international forum for discussion on issues in the areas of Computational Intelligence and Multimedia for scientists, engineers, researchers and practitioners. The conference will include sessions on theory, implementation and applications, as well as the non-technical areas of challenges in education and technology transfer to industry. There will be both oral and poster sessions. Accepted full papers will be included in the proceedings to be published by World Scientific. 
Several well-known keynote speakers will address the conference. Conference Topics Include (but are not limited to): Artificial Intelligence Artificial Neural Networks Artificial Intelligence and Logic Synthesis Functional decomposition Pattern Recognition Fuzzy Systems Genetic Algorithms Intelligent Control Intelligent Databases Knowledge-based Engineering Learning Algorithms Memory, Storage and Retrieval Multimedia Systems Formal Models for Multimedia Interactive Multimedia Multimedia and Virtual Reality Multimedia and Telecommunications Multimedia Information Retrieval Special Sessions: Artificial Intelligence and Logic Synthesis: intelligent algorithms for logic synthesis; functional decomposition in machine learning, pattern recognition, knowledge discovery and logic synthesis; evolutionary and reconfigurable computing with FPGAs. Chair: Lech Jozwiak, Eindhoven University, Netherlands. Multimedia Information Retrieval: segmentation of audio, image and video; feature extraction and representation; semi-automatic text annotation techniques; indexing structure; query model and retrieval methods; feature similarity measurement; system integration issues; prototype systems and applications. Chair: Guojun Lu, Monash University, Australia. Pre-Conference Workshops and Tutorials: Proposals for pre-conference workshops and tutorials relevant to the conference topics are invited. These are to be held on Saturday 7th February and Sunday 8th February at the conference venue. People wishing to organise such workshops or tutorials are invited to submit a proposal by the paper submission deadline. The accepted proposals will be advertised. Special Poster Session: ICCIMA'98 will include a special poster session devoted to recent work and work-in-progress. Abstracts are solicited for this session (2 page limit) in camera-ready form, and may be submitted up to 30 days before the conference date. 
They will not be refereed and will not be included in the proceedings, but will be distributed to attendees upon arrival. Students are especially encouraged to submit abstracts for this session. Invited Sessions Keynote speakers (key industrialists, chief research scientists and leading academics) will be addressing the main issues of the conference. Important Dates: Submission of papers received no later than: 7 July 97 Notification of acceptance: 19 September 97 Camera-ready papers & registration received by: 24 October 97 Submission of Papers Papers in English reporting original and unpublished research results and experience are solicited. Electronic submission of papers via e-mail in postscript or Microsoft Word for Windows format directly to the General Chair is acceptable and encouraged for the refereeing process. If not submitting an electronic version, please submit three hard copy originals to the General Chair. Papers for refereeing purposes must be received at the ICCIMA 98 secretariat no later than 7 July 1997. Notification of acceptance will be mailed by 19 September 1997. Page Limits Papers for refereeing should be double-spaced and must include an abstract of 100-150 words with up to six keywords. The accepted papers will need to be received at the ICCIMA 98 secretariat by 24 October 1997 in camera-ready format. A final preparation format for the camera-ready papers will be provided upon notification of acceptance. Camera-ready papers exceeding 6 pages (including abstract, all text, figures, tables and references etc.) will be charged an extra fee per page in excess of the normal registration. Evaluation Process All submissions will be refereed by two reviewers with appropriate backgrounds, based on the following criteria: originality; significance; contribution to the area of research; technical quality; relevance to ICCIMA 98 topics; clarity of presentation. Referees' reports will be provided to all authors. 
Check List Prospective authors should check that the following items are attached and that the guidelines are followed when submitting papers for refereeing purposes. * The paper and its title page should not contain the name(s) of the author(s) or their affiliation * The paper should have attached a covering page containing the following information: -title of the paper -author name(s), affiliation, mail and e-mail addresses, phone and fax numbers -Conference topic area -up to six keywords * The name, e-mail, phone, fax and postal address of the contact person should be attached to the submission Visits and Social Events Industrial and sightseeing visits will be arranged for the delegates and guests. A separate program will be arranged for companions during the conference. General Chair: Henry Selvaraj Gippsland School of Computing & Information Technology Monash University, Churchill, VIC, Australia 3842 Henry.Selvaraj at fcit.monash.edu.au Phone: +61 3 9902 6665 Fax: +61 3 9902 6842 International Programme Committee: Abdul Sattar, Griffith University, Australia Andre de Carvalho, University of Sao Paulo, Brazil Bob Bignall, Monash University, Australia Brijesh Verma, Griffith University, Australia (Programme Chair) Dinesh Patel, Surrey University, UK Henry Selvaraj, Monash University, Australia Hyunsoo Lee, University of Yonsei, Korea Jan Mulawka, Warsaw University of Technology, Poland Jong-Hwan Kim, Korea Advanced Institute of Science & Technology, Korea Lech Jozwiak, Eindhoven Univ. of Tech, Netherlands Margaret Marek-Sadowska, University of California, USA Marek Perkowski, Portland State University, USA Michael Bove, MIT Media Laboratory, USA Mikio Takagi, University of Tokyo, Japan Nagarajan Ramesh, Tencor Instruments, USA Ramana Reddy, West Virginia University, USA Regu Subramanian, Nanyang Tech University, Singapore Sargur Srihari, State University of New York, USA Shyam Kapur, James Cook University, Australia Sourav Kundu, Kanazawa University, Japan S. 
Srinivasan, IIT, Madras, India Subhash Wadhwa, IIT, Delhi, India Tadeusz Luba, Warsaw University of Technology, Poland Vishy Karri, University of Tasmania, Australia Xin Yao, University of New South Wales, Australia International Liaison Asian Liaison: Regu Subramanian, Network Technology Research Centre, Nanyang Technological University, Singapore U.S. Liaison: Marek Perkowski, Portland State University, USA European Liaison: Tadeusz Luba, Warsaw University of Technology, Poland Organising Committee: Bob Bignall, Monash University, Australia Baikunth Nath, Monash University, Australia Vishy Karri, University of Tasmania, Australia Syed M. Rahman, Monash University, Australia Bala Srinivasan, Monash University, Australia Cheryl Brickell, Monash University, Australia Andy Flitman, Monash University, Australia Lindsay Smith, Monash University, Australia Further Information: Conference Email: iccima98 at fcit.monash.edu.au Conference WWW Page: http://www-gscit.fcit.monash.edu.au/~iccima98 From esann at dice.ucl.ac.be Tue May 6 07:43:02 1997 From: esann at dice.ucl.ac.be (esann@dice.ucl.ac.be) Date: Tue, 6 May 1997 13:43:02 +0200 Subject: ESANN'97 proceedings available Message-ID: <199705061136.NAA09184@ns1.dice.ucl.ac.be> The following proceedings are available: --------------------------------------------------- | ESANN'97 | | European Symposium | | on Artificial Neural Networks | | | | Bruges - April 16-17-18, 1997 | --------------------------------------------------- ESANN'97 proceedings D facto publications (Belgium) ISBN 2-9600049-7-3, 362 pages Price: BEF 2000 Instructions to obtain these proceedings and the table of contents are available on the ESANN Web server: http://www.dice.ucl.ac.be/neural-nets/esann/ You may also contact the publisher directly: D facto publications 45 rue Masui B-1000 Brussels Belgium Tel: +32 2 203 43 63 Fax: +32 2 203 42 94 The previous ESANN proceedings are also available: - ESANN'96 proceedings ISBN 2-9600049-6-5, 340 pages Price: 
BEF 2000 - ESANN'95 proceedings ISBN 2-9600049-3-0, 382 pages Price: BEF 2000 - ESANN'94 proceedings ISBN 2-9600049-1-4, 287 pages Price: BEF 1500 - ESANN'93 proceedings ISBN 2-9600049-0-6, 243 pages Price: BEF 1500 Please add BEF 500 to any order for p. & p. _____________________________ _____________________________ D facto publications - Michel Verleysen conference services Univ. Cath. de Louvain - DICE 45 rue Masui 3, pl. du Levant 1000 Brussels B-1348 Louvain-la-Neuve Belgium Belgium tel: +32 2 203 43 63 tel: +32 10 47 25 51 fax: +32 2 203 42 94 fax: +32 10 47 25 98 esann at dice.ucl.ac.be verleysen at dice.ucl.ac.be http://www.dice.ucl.ac.be/neural-nets/esann _____________________________ _____________________________ From khosla at latcs1.cs.latrobe.edu.au Fri May 9 21:31:04 1997 From: khosla at latcs1.cs.latrobe.edu.au (khosla@latcs1.cs.latrobe.edu.au) Date: Sat, 10 May 1997 11:31:04 +1000 (AEST) Subject: Book on Engineering Intelligent Hybrid Multi-Agent Systems Message-ID: <199705100131.LAA09241@ipc6.cs.latrobe.edu.au> Please accept my sincere apologies if you receive multiple copies of this posting. *** BOOK ANNOUNCEMENT *** ENGINEERING INTELLIGENT HYBRID MULTI-AGENT SYSTEMS by Rajiv Khosla and Tharam Dillon This book is about building intelligent hybrid systems, problem solving, and software modeling. It is relevant to practitioners and researchers in the areas of intelligent hybrid systems, control systems, multi-agent systems, knowledge discovery and data mining, software engineering, and enterprise-wide systems modeling. The book in many ways is a synergy of all these areas. The book can also be used as a text or reference book for postgraduate students in intelligent hybrid systems, software engineering, and system modeling. On the intelligent hybrid systems front, the book covers applications and design concepts related to fusion systems, transformation systems and combination systems. 
It describes industrial applications in these areas involving hybrid configurations of knowledge-based systems, case-based reasoning, fuzzy systems, artificial neural networks, and genetic algorithms. On the problem solving front, the book describes an architectural theory for engineering intelligent associative hybrid multi-agent systems. The architectural theory is described at the task structure level and the computational level. From an organizational context the problem solving architecture is not only relevant at the knowledge engineering layer for developing knowledge agents but also at the information engineering layer for developing information agents. On the software modeling front, the book describes the role of objects, agents and problem solving in knowledge engineering, information engineering, data engineering, and software modeling of intelligent hybrid systems. Based on the concepts developed in the book, an enterprise-wide systems modeling framework is described to facilitate forward and backward integration of systems developed in the knowledge, information, and data engineering layers of an organization. In the modeling process, agent oriented analysis, design, and reuse aspects of software engineering are also discussed. The book consists of four parts: Part I: introduces various methodologies and their hybrid applications in the industry. Part II: describes a multi-agent architectural theory of associative intelligent hybrid systems at the task structure level and the computational level. It covers various aspects related to knowledge modeling of hybrid systems. Part III: describes the software engineering aspects of the architecture. It does so by describing a real-time alarm processing application of the architecture. 
Part IV: takes a broader view of the various concepts and theories developed in Parts II and III of the book in terms of enterprise-wide systems modeling, multi-agent systems, control systems, and software engineering and reuse. Part I is described through chapters 1, 2, 3, 4 and 5. Part II is described through chapters 6, 7, and 8. Part III is described through chapters 9, 10, 11, and 12. Part IV is described through chapters 13 and 14. ------------------------------------------------------------------------------- Summary of Table of Contents PART I: Methodologies and Applications Chapter 1. Why Intelligent Hybrid Systems 1.1 Introduction 1.2 Evolution of Hybrid Systems 1.3 Classes of Hybrid Systems 1.4 Summary Chapter References Chapter 2. Methodologies 2.1 Introduction 2.2 Expert Systems 2.3 Artificial Neural Networks 2.4 Fuzzy Systems 2.5 Genetic Algorithms 2.6 Knowledge Discovery and Data Mining 2.7 Object-Oriented Methodology 2.8 Agents and Agent Architectures 2.9 Summary Chapter References Chapter 3. Intelligent Fusion and Transformation Systems 3.1 Introduction 3.2 Fusion and Transformation 3.3 Neural Network Based Neuro-Symbolic Fusion and Transformation Systems 3.4 Neural Network Based Neuro-Fuzzy Fusion and Transformation Systems 3.5 Recapitulation 3.6 Genetic Algorithms Based Fusion and Transformation Systems 3.7 Summary Chapter References Chapter 4. Intelligent Combination Systems 4.1 Introduction 4.2 Intelligent Combination Approaches 4.3 Neuro-Symbolic Combination Systems 4.4 Symbolic-Genetic Combination Scheduling System 4.5 Neuro-Fuzzy Combination Systems 4.6 Neuro-Fuzzy-Case Combination System 4.7 Combination Approach Based Intelligent Hybrid Control Applications 4.8 Summary Chapter References Chapter 5. 
Knowledge Discovery, Data Mining and Hybrid Systems 5.1 Introduction 5.2 KDD Process 5.3 KDD Application in Forecasting 5.4 Financial Trading Application 5.5 Learning Rules & Knowledge Hierarchies in the LED Digit Domain 5.6 Rule Extraction in Computer Network Diagnosis Application 5.7 Summary Chapter References ------------------------------------------------------------------------------ PART II: Problem Solving and Architectural Theory Chapter 6. Association Systems - Task Structure Level Associative Hybrid Architecture 6.1 Introduction 6.2 Various Perspectives Characterizing Problem Solving 6.3 Task Structure Level Architecture 6.4 Some Observations 6.5 Summary Chapter References Chapter 7. Intelligent Multi-Agent Hybrid Computational Architecture - Part I 7.1 Introduction 7.2 Object-Oriented Model 7.3 Agent Model 7.4 Distributed Operating System Process Model 7.5 Computational Level Intelligent Multi-Agent Hybrid Distributed Architecture (IMAHDA) 7.6 Agent Building Blocks of IMAHDA 7.7 Summary Chapter References Chapter 8. Intelligent Multi-Agent Hybrid Computational Architecture - Part II 8.1 Introduction 8.2 Communication in IMAHDA 8.3 Concept Learning in IMAHDA 8.4 Underlying Training Problems with Neural Networks 8.5 Learning and IMAHDA 8.6 Learning Knowledge in IMAHDA 8.7 Learning Strategy in IMAHDA 8.8 Dynamic Analysis of IMAHDA 8.9 Comprehensive View of IMAHDA's Agents 8.10 Emergent Characteristics of IMAHDA 8.11 Summary Chapter References ----------------------------------------------------------------------- PART III: Software Engineering Aspects Chapter 9. Alarm Processing - An Application of IMAHDA 9.1 Introduction 9.2 Characteristics of the Problem 9.3 Survey of Existing Methods 9.4 IMAHDA and Alarm Processing 9.5 Application of IMAHDA 9.6 Summary Chapter References Chapter 10. 
Agent Oriented Analysis and Design of the RTAPS - Part I
10.1 Introduction 10.2 Agent Oriented Analysis (AOA) 10.3 AOA of the RTAPS 10.4 Summary
Chapter References

Chapter 11. Agent Oriented Analysis and Design of the RTAPS - Part II
11.1 Introduction 11.2 Agent Oriented Analysis Continued 11.3 Agent Oriented Design of the RTAPS 11.4 Emergent Characteristics of the RTAPS Agents 11.5 Summary
Chapter References

Chapter 12. RTAPS Implementation
12.1 Introduction 12.2 IMAHDA Related Issues 12.3 Training of Neural Networks in RTAPS 12.4 Power System Aspects of the RTAPS 12.5 Scalability and Cost Effectiveness 12.6 Summary
Chapter References
-----------------------------------------------------------------------
PART IV: Software Modeling

Chapter 13. From Data Repositories to Knowledge Repositories - Intelligent Organizations
13.1 Introduction 13.2 Information Systems and Organizational Levels 13.3 Characteristics of Information Systems 13.4 Information Systems and Knowledge Systems 13.5 IMAHDA and Organizational Knowledge Systems 13.6 Application of IMAHDA in Sales & Marketing Function 13.7 Unified Approach to Enterprise-Wide System Modeling 13.8 Summary
Chapter References

Appendices
Index

The book consists of 412 pages and is being published in the USA by Kluwer Academic Publishers.
For ordering and other information, please contact:

Alexander Greene
Publisher
Kluwer Academic Publishers
101 Philip Drive
Assinippi Park
Norwell, MA 02061 U.S.A.
Phone: +1.617.871.6600
Fax: +1.617.871.6528
E-Mail: agreene at wkap.com
--------------------------------------------------------------------
Dr Rajiv Khosla
School of Computer Science and Computer Engineering
La Trobe University
Melbourne, Victoria - 3083
Australia
Phone: +61.3.94793034
Fax: +61.3.94793060
E-Mail: khosla at cs.latrobe.edu.au

From imlm at tuck.cs.fit.edu Sat May 10 16:29:41 1997
From: imlm at tuck.cs.fit.edu (IMLM Workshop (pkc))
Date: Sat, 10 May 1997 15:29:41 -0500
Subject: CFP: MLJ special issue on IMLM
Message-ID: <199705102029.PAA01437@tuck.cs.fit.edu>

Dear colleagues,

Here is a CFP for the Machine Learning Journal special issue on IMLM. Submission is due on Oct 1st, 97. Hope you can submit. Thanks.

Phil, Sal, and Dave
------
CALL FOR PAPERS

Machine Learning Journal
Special Issue on Integrating Multiple Learned Models for Improving and Scaling Machine Learning Algorithms

Most modern Machine Learning, Statistics, and KDD techniques use a single model or learning algorithm at a time, or at most select one model from a set of candidate models. Recently, however, there has been considerable interest in techniques that integrate the collective predictions of a set of models in some principled fashion. Such techniques can often improve the predictive accuracy and/or the training efficiency of the overall system, since one can "mix and match" among the relative strengths of the models being combined.

Any aspect of integrating multiple models is appropriate for the special issue. However, we intend the focus of the special issue to be on the issues of improving prediction accuracy and improving training efficiency in the context of large databases. Submissions are sought in, but not limited to, the following topics:

1) Techniques that generate and/or integrate multiple learned models.
Examples are schemes that generate and combine models by
* using different training data distributions (in particular by training over different partitions of the data)
* using different sampling techniques to generate different partitions
* using different output classification schemes (for example, using output codes)
* using different hyperparameters or training heuristics (primarily as a tool for generating multiple models)

2) Systems and architectures to implement such strategies. For example,
* parallel and distributed multiple learning systems
* multi-agent learning over inherently distributed data

3) Techniques that analyze the integration of multiple learned models for
* selecting/pruning models
* estimating the overall accuracy
* comparing different integration methods
* tradeoff of accuracy and simplicity/comprehensibility

Schedule:
October 1: Deadline for submissions
December 15: Deadline for getting decisions back to authors
March 15: Deadline for authors to submit final versions
August 1998: Publication

Submission Guidelines:

1) Manuscripts should conform to the formatting instructions in:
http://www.cs.orst.edu/~tgd/mlj/info-for-authors.html
The first author will be the primary contact unless otherwise stated.

2) Authors should send 5 copies of the manuscript to:

Karen Cullen
Machine Learning Editorial Office
Attn: Special Issue on IMLM
Kluwer Academic Press
101 Philip Drive
Assinippi Park
Norwell, MA 02061
617-871-6300
617-871-6528 (fax)
kcullen at wkap.com

and one copy to:

Philip Chan
MLJ Special Issue on IMLM
Computer Science
Florida Institute of Technology
150 W. University Blvd.
Melbourne, FL 32901
407-768-8000 x7280 (x8062)
(407-674-7280/8062 after 6/1/97)
407-984-8461 (fax)

3) Please also send an ASCII title page (title, authors, email, abstract, and keywords) and a postscript version of the manuscript to imlm at cs.fit.edu.
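As a minimal sketch of the first topic above (combining models trained on different partitions of the data), here is a toy majority-voting ensemble in Python. The trivial 1-nearest-neighbor "model", the data, and all names are illustrative assumptions, not part of the CFP:

```python
from collections import Counter

# Toy illustration of integrating multiple learned models: train one
# 1-nearest-neighbor "model" per disjoint partition of the data, then
# combine their predictions by majority vote.

def predict_1nn(model, x):
    # A "model" here is just its training sample: (features, label) pairs.
    # Classify x by the label of its nearest training point.
    nearest = min(model, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))
    return nearest[1]

def train_ensemble(data, n_models=3):
    # Each model sees a different partition (round-robin split) of the data.
    return [data[i::n_models] for i in range(n_models)]

def predict_ensemble(models, x):
    votes = Counter(predict_1nn(m, x) for m in models)
    return votes.most_common(1)[0][0]

# Toy data: two clusters labeled "a" and "b".
data = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"), ((1.0, 1.0), "b"), ((0.9, 1.1), "b")]
models = train_ensemble(data)
print(predict_ensemble(models, (0.05, 0.1)))  # prints: a
```

A model trained on a partition missing one class can vote wrongly; the vote across partitions still recovers the right label, which is the "mix and match" effect described above.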
General Inquiries:

Please address general inquiries to: imlm at cs.fit.edu
Up-to-date information is maintained on WWW at: http://www.cs.fit.edu/~imlm/

Co-Editors:
Philip Chan, Florida Institute of Technology (pkc at cs.fit.edu)
Salvatore Stolfo, Columbia University (sal at cs.columbia.edu)
David Wolpert, IBM Almaden Research Center (dhw at almaden.ibm.com)

From school at cogs.nbu.acad.bg Sun May 11 09:22:52 1997
From: school at cogs.nbu.acad.bg (CogSci Summer School)
Date: Sun, 11 May 1997 16:22:52 +0300
Subject: CogSci97 deadline approaches
Message-ID:

4th International Summer School in Cognitive Science
Sofia, July 14 - 26, 1997
Call for Papers and School Brochure

The Summer School features introductory and advanced courses in Cognitive Science, participant symposia, panel discussions, student sessions, and intensive informal discussions. Participants will include university teachers and researchers, graduate and senior undergraduate students.

International Advisory Board
Elizabeth BATES (University of California at San Diego, USA), Amedeo CAPPELLI (CNR, Pisa, Italy), Cristiano CASTELFRANCHI (CNR, Roma, Italy), Daniel DENNETT (Tufts University, Medford, Massachusetts, USA), Ennio De RENZI (University of Modena, Italy), Charles DE WEERT (University of Nijmegen, Holland), Christian FREKSA (Hamburg University, Germany), Dedre GENTNER (Northwestern University, Evanston, Illinois, USA), Christopher HABEL (Hamburg University, Germany), Joachim HOHNSBEIN (Dortmund University, Germany), Douglas HOFSTADTER (Indiana University, Bloomington, Indiana, USA), Keith HOLYOAK (University of California at Los Angeles, USA), Mark KEANE (Trinity College, Dublin, Ireland), Alan LESGOLD (University of Pittsburgh, Pennsylvania, USA), Willem LEVELT (Max-Planck Institute of Psycholinguistics, Nijmegen, Holland), David RUMELHART (Stanford University, California, USA), Richard SHIFFRIN (Indiana University, Bloomington, Indiana, USA), Paul SMOLENSKY (University of Colorado, Boulder, USA), Chris THORNTON
(University of Sussex, Brighton, England), Carlo UMILTA' (University of Padova, Italy), Eran ZAIDEL (University of California at Los Angeles, USA)

Courses
Dynamics of Change: Lessons from Human Development - Linda Smith (Indiana University, USA)
Ecological Approaches to Human Memory - William Hirst (New School for Social Research, USA)
Cognitive Approaches to Syntax - Robert Van Valin (State University of New York at Buffalo, USA)
Culture and Cognition - Naomi Quinn (Duke University, USA)
Spatial Attention - Carlo Umilta' (University of Padova, Italy)
Cognitive Modeling in ACT-R - Werner Tack (University of the Saarland, Germany)
Spatial Concepts and Spatial Representation - Emile van der Zee (Hamburg University)
Brain Imaging Techniques for Cognitive Neurosciences - Joachim Hohnsbein (University of Dortmund)

Participant Symposia
Participants are invited to submit papers reporting completed research, which will be presented (30 min) at the participant symposia. Authors should send full papers (8 single-spaced pages) in triplicate or electronically (postscript, RTF, MS Word, or plain ASCII) by May 15. Selected papers will be published in the School's Proceedings. Only papers presented at the School will be eligible for publication.

Student Session
Graduate students in Cognitive Science are invited to present their work at the student session. Research in progress as well as research plans and proposals for M.Sc. and Ph.D. theses will be discussed at the student session. Papers will not be published in the School's Proceedings.
Panel Discussions
Cognitive Science in the 21st Century
Cognition in Context: Social, Cultural, Physical, Developmental
Brain, Body, Environment, and Cognition
Dynamics of Cognition: Short-Term and Long-Term Dynamics

Local Organizers
New Bulgarian University, Bulgarian Academy of Sciences, Bulgarian Cognitive Science Society

Sponsors
TEMPUS SJEP 07272/94

Local Organizing Committee
Boicho Kokinov - School Director, Elena Andonova, Gergana Yancheva, Iliana Haralanova

Timetable
Registration Form: as soon as possible
Deadline for paper submission: May 15
Notification of acceptance: June 1
Early registration: June 5
Arrival date and on-site registration: July 13
Summer School: July 14-25
Excursion: July 20
Departure date: July 26

Paper submission to:
Boicho Kokinov
Cognitive Science Department
New Bulgarian University
21, Montevideo Str.
Sofia 1635, Bulgaria
e-mail: school at cogs.nbu.acad.bg

Send your Registration Form to:
e-mail: school at cogs.nbu.acad.bg
(If you don't receive an acknowledgement within 3 days, send a message to kokinov at bgearn.acad.bg)

From koza at CS.Stanford.EDU Sat May 10 16:09:18 1997
From: koza at CS.Stanford.EDU (John R.
Koza)
Date: Sat, 10 May 1997 13:09:18 -0700 (PDT)
Subject: GP-97 Revised Call for Participation
Message-ID: <199705102009.NAA26523@Sunburn.Stanford.EDU>

CALL FOR PARTICIPATION

Genetic Programming 1997 Conference (GP-97)
July 13 - 16 (Sunday - Wednesday), 1997
Fairchild Auditorium - Stanford University - Stanford, California
-----------------------------------------------------------------------
In cooperation with the American Association for Artificial Intelligence (AAAI), the Association for Computing Machinery (ACM), SIGART, and the Society for Industrial and Applied Mathematics (SIAM)
-----------------------------------------------------------------------
WWW FOR GP-97: http://www-cs-faculty.stanford.edu/~koza/gp97.html
-----------------------------------------------------------------------
NOTE: You are urged to make your housing arrangements as early as possible since convenient hotel locations are limited. Also, if you are driving to the Stanford campus, please be aware of parking lot construction in the area of Fairchild Auditorium and allow a little extra time (particularly for the first Monday session) to find a parking place.
-----------------------------------------------------------------------
Genetic programming is an automatic programming technique for evolving computer programs that solve (or approximately solve) problems. Starting with a primordial ooze of thousands of randomly created computer programs, a population of programs is progressively evolved over many generations using the Darwinian principle of survival of the fittest, a sexual recombination operation, and occasional mutation.

The first annual genetic programming conference in 1996 featured 15 tutorials, 2 invited speakers, 3 parallel tracks, 73 papers and 17 poster papers in the proceedings book, 27 late-breaking papers in a separate book distributed to conference attendees, and 288 attendees.
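The evolutionary loop described above, a random initial population, fitness-based survival, recombination, and occasional mutation, can be sketched in miniature as follows. This is an illustrative toy (symbolic regression toward a hypothetical target x*x + 1), not any system from the conference, and all names are assumptions:

```python
import random

# Toy genetic programming sketch: evolve expression trees built from the
# operators {+, *} and the terminals {x, 1} toward the target x*x + 1.
# Trees are nested tuples ("+", left, right) or a terminal string.

rng = random.Random(42)
OPS, TERMS = ["+", "*"], ["x", "1"]

def random_tree(depth=2):
    if depth == 0 or rng.random() < 0.3:
        return rng.choice(TERMS)
    return (rng.choice(OPS), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == "x":
        return x
    if tree == "1":
        return 1
    op, left, right = tree
    a, b = evaluate(left, x), evaluate(right, x)
    return a + b if op == "+" else a * b

def fitness(tree):
    # Sum of squared errors over a few sample points (lower is better).
    return sum((evaluate(tree, x) - (x * x + 1)) ** 2 for x in range(-3, 4))

def random_subtree(tree):
    while isinstance(tree, tuple) and rng.random() < 0.7:
        tree = rng.choice(tree[1:])
    return tree

def crossover(t1, t2):
    # Sexual recombination: splice a random subtree of t2 into t1.
    if not isinstance(t1, tuple) or rng.random() < 0.3:
        return random_subtree(t2)
    op, left, right = t1
    if rng.random() < 0.5:
        return (op, crossover(left, t2), right)
    return (op, left, crossover(right, t2))

def mutate(tree):
    # Occasional mutation: replace the tree with a fresh random one.
    return random_tree() if rng.random() < 0.1 else tree

population = [random_tree() for _ in range(50)]
initial_best_fit = min(fitness(t) for t in population)
for generation in range(30):
    population.sort(key=fitness)
    survivors = population[:10]          # survival of the fittest (elitism)
    offspring = [mutate(crossover(rng.choice(survivors), rng.choice(survivors)))
                 for _ in range(40)]
    population = survivors + offspring

best = min(population, key=fitness)
print(fitness(best))
```

Because the ten fittest programs survive each generation unchanged, the best fitness in the population can never get worse from one generation to the next.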
A description of GP-96 appears in the October 1996 issue of Scientific American (http://www.sciam.com/WEB/1096issue/1096techbus3.html). This second annual conference in 1997 reflects the rapid growth of this field, in which over 600 technical papers have been published since 1992. For the August 5, 1996 E. E. Times article on the GP-96 conference and the August 12, 1996 E. E. Times article on John Holland's invited speech at GP-96, go to http://www.techweb.com/search/search.html

There will be 36 long, 33 short, and 15 poster papers at the Second Annual Genetic Programming Conference, to be held on July 13-16 (Sunday - Wednesday), 1997 at Stanford University. In addition, there will be late-breaking papers (published in a separate book in mid-June, after the June 11 deadline for late-breaking papers).

Topics include, but are not limited to: applications of genetic programming, theoretical foundations of genetic programming, implementation issues, technique extensions, cellular encoding, evolvable hardware, evolvable machine language programs, automated evolution of program architecture, evolution and use of mental models, automatic programming of multi-agent strategies, distributed artificial intelligence, auto-parallelization of algorithms, automated circuit synthesis, automatic programming of cellular automata, induction, system identification, control, automated design, data and image compression, image analysis, pattern recognition, molecular biology applications, grammar induction, and parallelization. Papers describing recent developments are also solicited in the following additional areas: genetic algorithms, classifier systems, evolutionary programming and evolution strategies, artificial life and evolutionary robotics, DNA computing, and evolvable hardware.
----------------------------------------------------------------------- INVITED SPEAKERS: - Ellen Goldberg, President, Santa Fe Institute - Susumu Ohno, Ben Horowitz Chair of Distinguished Scientist in Theoretical Biology, Beckman Research Institute - David B. Fogel, Natural Selection Inc. and Editor-In-Chief of the IEEE Transactions on Evolutionary Computation ----------------------------------------------------------------------- SPECIAL PROGRAM CHAIRS The main focus of the conference (and most of the papers) will be on genetic programming. In addition, papers describing recent developments in the closely related areas will be reviewed and selected by special program committees appointed and supervised by the following special program chairs. --- Genetic Algorithms: Kalyanmoy Deb, Indian Inst of Tech - Kanpur, India --- Classifier Systems: Rick L. Riolo, University of Michigan --- Evolutionary Programming and Evolution Strategies: David B. Fogel, Natural Selection Inc, San Diego --- Artificial Life and Evolutionary Robotics: Marco Dorigo, Universite Libre de Bruxelles --- DNA Computing: Max Garzon, University of Memphis --- Evolvable Hardware: Hitoshi Iba, Electrotechnical Laboratory, Japan ----------------------------------------------------------------------- 20 TUTORIALS AT GP-97 (Note: Slight Revisions from earlier listing) Sunday July 13 - 9:15 AM - 11:30 AM --- Genetic Algorithms - David E. 
Goldberg, University of Illinois at Urbana-Champaign
--- Evolvable Hardware - Tetsuya Higuchi - Electrotechnical Laboratory, Tsukuba, Japan
--- Program Growth Control in Genetic Programming - Byoung-Tak Zhang, Konkuk University, Seoul, South Korea and Hitoshi Iba, Electrotechnical Laboratory, Tsukuba, Japan
--- Introduction to Genetic Programming - John Koza, Stanford University
-----------------------------------------------------------------------
Sunday July 13 - 1:00 PM - 3:15 PM
--- Evolutionary Algorithms for Computer-Aided Design of Integrated Circuits - Rolf Drechsler - Albert-Ludwigs-University, Freiburg, Germany
--- Self-Replicating Systems in Cellular Space Models - Jason Lohn - Stanford University
--- Neural Networks - Bernard Widrow - Stanford University
--- Advanced Genetic Programming - John Koza, Stanford University
-----------------------------------------------------------------------
Sunday July 13 - 3:45 PM - 6 PM
--- Evolutionary Programming and Evolution Strategies - David Fogel, University of California, San Diego
--- Genetic Programming Representations - Astro Teller - Carnegie Mellon University
--- Design of Electrical Circuits using Genetic Programming - David Andre, University of California - Berkeley, and Forrest H Bennett III - Stanford University
--- Genetic Programming with Linear Genomes - Wolfgang Banzhaf, University of Dortmund, Germany
-----------------------------------------------------------------------
Tuesday July 15 - 3:25 PM - 5:40 PM
--- Computational Learning Theory - Vasant Honavar - Iowa State University
--- Machine Learning - Pat Langley, Institute for the Study of Learning and Expertise
--- Molecular Biology for Computer Scientists - Russ B. Altman, Stanford University
--- Simulated Evolution of Models - Janine Graf - Inquire America Corp
-----------------------------------------------------------------------
Tuesday July 15 - 7:30 PM - 9:30 PM
--- DNA Computing - Russell Deaton and Randy C.
Murphy - University of Memphis
--- Evolutionary Algorithms with Mathematica - Christian Jacobs
--- Cellular Programming: Evolution Of Parallel Cellular Machines - Moshe Sipper - Swiss Federal Institute of Technology, Lausanne
--- Machine Language Genetic Programming - Peter Nordin, DaCapo AB, Sweden
-----------------------------------------------------------------------
GENERAL CHAIR: John Koza, Stanford University
PUBLICITY CHAIR: Patrick Tufts, Brandeis University
EXECUTIVE COMMITTEE: David Andre, Forrest H Bennett III, Jason Lohn
-----------------------------------------------------------------------
FOR MORE INFORMATION ABOUT THE GP-97 CONFERENCE: See the GP-97 home page on the World Wide Web: http://www-cs-faculty.stanford.edu/~koza/gp97.html
E-MAIL: gp at aaai.org. PHONE: 415-328-3123. FAX: 415-321-4457.
The conference is operated by Genetic Programming Conferences, Inc. (a California not-for-profit corporation).
-----------------------------------------------------------------------
FOR MORE INFORMATION ABOUT GENETIC PROGRAMMING IN GENERAL: http://www-cs-faculty.stanford.edu/~koza/
-----------------------------------------------------------------------
Hotel information: Numerous local hotels within a short distance of Stanford University are listed at the GP-97 home page. Because of other events held in the area during the summer, attendees are urged to make their arrangements for accommodations early.

For your convenience, AAAI has reserved a block of rooms at the Holiday Inn-Palo Alto Hotel, 625 El Camino Real, Palo Alto, CA 94301, Phone: 800-874-3516 or 415-328-2800, FAX: 415-327-7362. Make your reservations directly with the Holiday Inn before June 28, 1997 for the GP-97 rate of $99 single and $109 double.

In addition, AAAI has reserved a block of rooms at the Stanford Terrace Inn, 531 Stanford Avenue, Palo Alto, CA 94306, Phone: 800-729-0332 or 415-857-0333, FAX: 415-857-0343.
Make your reservations directly with the Stanford Terrace Inn before June 11, 1997. There is a free Stanford University shuttle (called the Marguerite) that stops near both of these hotels (and various other hotels, the train station, and Palo Alto locations).
-----------------------------------------------------------------------
University Housing information: A limited number of spaces are available in Stanford University housing on a first-come-first-served basis. The final deadline for University housing applications is June 13, 1997. See the GP-97 WWW home page for a university housing application form.
-----------------------------------------------------------------------
TRAVEL INFORMATION: Stanford University is near Palo Alto in Northern California, about 40 miles south of San Francisco. Stanford is about 25 miles south of San Francisco International Airport and about 25 miles north of San Jose International Airport. Oakland airport is about 45 miles away.

Conventions in America has arranged special GP-97 airline and car rental discounts. For travel between July 10 - 20, 1997, American Airlines can save you 5% on the lowest applicable fares or 10% off the lowest unrestricted coach fares, with 7-day advance purchase. Some restrictions apply. Hertz is offering special low conference rates with unlimited free mileage. Please contact Conventions in America concerning "Group #428" at 1-800-929-4242; or phone 619-678-3600; or FAX 619-678-3699; or e-mail scltravel at cgl.com. If you call American Airlines direct at 800-433-1790, ask for "Index #S9485." If you call Hertz direct at 800-654-2240, ask for "CV #24250." See the GP-97 WWW home page for additional details.
------------------------------------------------------------------------ SAN FRANCISCO BAY AND SILICON VALLEY TOURIST INFORMATION: Try the Stanford University home page at http://www.stanford.edu/, the Hyperion Guide at http://www.hyperion.com/ba/sfbay.html; the Palo Alto weekly at http://www.service.com/PAW/home.html; the California Virtual Tourist at http://www.research.digital.com/SRC/virtual-tourist/California.html; and the Yahoo Guide of San Francisco at http://www.yahoo.com/Regional_Information/States/California/San_Francisco. ----------------------------------------------------------------------- CONTEMPORANEOUS CONFERENCES IN CALIFORNIA AND ELSEWHERE: GP-97 is concurrent with the 45th Anniversary meeting of the Society for Industrial and Applied Mathematics (SIAM) on July 14-18, 1997 at Stanford University (http://www.siam.org). GP-97 comes just after the IEEE International Symposium on Computational Intelligence in Robotics and Automation (CIRA-97) on July 10 - 11, 1997 in Monterey, California (90 miles from Stanford University) and the IEEE 8th International Conference on Advanced Robotics (ICAR-97) on July 5 - 9, 1997 in Monterey http://www.cs.cmu.edu/afs/cs/project/space/www/cira97/conference.html. Other non-California conferences of interest include AAAI-97 on July 27-31, 1997 in Providence, Rhode Island (http://www.aaai.org/); ICGA-97 on July 20-23, 1997 in East Lansing, Michigan (http://isl.cps.msu.edu/GA/icga97); European Artificial Life Conference on July 28-31, 1997 in Brighton, England (http://www.cogs.susx.ac.uk/ecal97/); and IJCAI-97 on August 26-29, 1997 in Nagoya, Japan (http://www.aaai.org/). ----------------------------------------------------------------------- MEMBERSHIP IN THE ACM, AAAI, or SIAM: For information about ACM membership, go to http://www.acm.org/; for SIGART, http://sigart.acm.org/; for AAAI http://www.aaai.org/; and for SIAM, http://www.siam.org. 
There is a discount on GP-97 registration fees for members of ACM, SIGART, AAAI, and SIAM.
-----------------------------------------------------------------------
ADDRESSES FOR GP-97: GP-97 Conference, c/o American Association for Artificial Intelligence, 445 Burgess Drive, Menlo Park, CA 94025. PHONE: 415-328-3123. FAX: 415-321-4457. E-MAIL: gp at aaai.org. WWW FOR AAAI: http://www.aaai.org/. WWW FOR GP-97: http://www-cs-faculty.stanford.edu/~koza/gp97.html
-----------------------------------------------------------------------
REGISTRATION FORM FOR GENETIC PROGRAMMING 1997 CONFERENCE
July 13 - 16 (Sunday - Wednesday), 1997 at Stanford University

First Name ________________ Last Name _____________
Affiliation _________________________________________
Address ____________________________________________
__________________________________________________
City _______________________ State/Province _________
Zip/Postal Code ______________ Country _______________
Daytime telephone __________________________________
E-Mail address _____________________________________

The conference registration fee includes admission to all conference sessions and events, one copy of the conference proceedings book, attendance at 5 tutorials of your choice, syllabus books for your 5 tutorials, the Sunday night welcoming wine and cheese reception, the Monday night conference dinner reception, one copy of a book of late-breaking papers, the conference T-shirt, 4 box lunches, and coffee breaks.

Conference proceedings will be mailed to registered attendees with U.S. mailing addresses via 2-day U.S. priority mail about 1 - 2 weeks prior to the conference at no extra charge (at addressee's risk). If you are uncertain as to whether you will be at the above address at that time or DO NOT WANT your proceedings mailed to you at the above address for any other reason, your copy of the proceedings will be held for you at the conference registration desk if you check here ___.
-------------------------------------
REGISTER BY June 19 FOR LOWER REGISTRATION FEES
-------------------------------------
Postmarked by June 19
Student - ACM, SIAM or AAAI Member - $245
Regular - ACM, SIAM, or AAAI Member - $445
Student - Non-member - $265
Regular - Non-member - $465
-------------------------------------
Postmarked after June 19, 1997 or on-site - Add $50 to June 19 rates
-------------------------------------
Member Number: ACM # ___________ SIAM # _________ AAAI # _________
Students must send legible proof of full-time student status.
-------------------------------------
Stanford Parking Permits ($6 per day - C). Number of days ___ Total $_____
-------------------------------------
Grand Total (enter appropriate amount) $ _____________
-------------------------------------
___ Check or money order made payable to "AAAI" (in U.S. funds)
___ Mastercard ___ Visa ___ American Express
Credit card number __________________________________________
Expiration Date _________
Signature ____________________________________________
-------------------------------------
T-Shirt Size: ___ small ___ medium ___ large ___ extra-large
-------------------------------------
TUTORIALS: Check off a box for one tutorial from each of the 6 rows:

Sunday July 13 - 9:15 AM - 11:30 AM
--- Genetic Algorithms
--- Evolvable Hardware
--- Program Growth Control in Genetic Programming
--- Introduction to Genetic Programming
-------------------------------------
Sunday July 13 - 1:00 PM - 3:15 PM
--- Evolutionary Algorithms for Computer-Aided Design of Integrated Circuits
--- Self-Replicating Systems in Cellular Space Models
--- Neural Networks
--- Advanced Genetic Programming
-------------------------------------
Sunday July 13 - 3:45 PM - 6 PM
--- Evolutionary Programming and Evolution Strategies
--- Genetic Programming Representations
--- Design of Electrical Circuits using Genetic Programming
--- Genetic Programming with Linear Genomes
-------------------------------------
Tuesday July 15 - 3:25 PM -
5:40 PM
--- Computational Learning Theory
--- Simulated Evolution of Models
--- Machine Learning
--- Molecular Biology for Computer Scientists
-------------------------------------
Tuesday July 15 - 7:30 PM - 9:30 PM
--- DNA Computing
--- Evolutionary Algorithms with Mathematica
--- Cellular Programming: Evolution Of Parallel Cellular Machines
--- Machine Language Genetic Programming
-------------------------------------
No refunds will be made; however, we will transfer your registration to a person you designate upon notification.
-------------------------------------
SEND TO: GP-97 Conference, c/o American Association for Artificial Intelligence, 445 Burgess Drive, Menlo Park, CA 94025. PHONE: 415-328-3123. FAX: 415-321-4457. E-MAIL: gp at aaai.org. WWW FOR AAAI: http://www.aaai.org/. WWW FOR GP-97: http://www-cs-faculty.stanford.edu/~koza/gp97.html
-----------------------------------------------------------------------
List of 84 Papers for the Second Annual Genetic Programming Conference (GP-97), July 13-16, 1997, Stanford University
-----------------------------------------------------------------
GENETIC PROGRAMMING
Ahluwalia, Manu, Larry Bell, and Terence C. Fogarty Co-evolving Functions in Genetic Programming: A Comparison in ADF Selection Strategies
Angeline, Peter J. Subtree Crossover: Building Block Engine or Macromutation?
Ashlock, Dan GP-Automata for Dividing the Dollar
Ashlock, Dan, and Charles Richter The Effect of Splitting Populations on Bidding Strategies
Banzhaf, Wolfgang, Peter Nordin, and Markus Olmer Generating Adaptive Behavior for a Real Robot using Function Regression within Genetic Programming
Bennett III, Forrest H A Multi-Skilled Robot that Recognizes and Responds to Different Problem Environments
Bruce, Wilker Shane The Lawnmower Problem Revisited: Stack-Based Genetic Programming and Automatically Defined Functions
Chen, Shu-Heng, and Chia-Hsuan Yeh Using Genetic Programming to Model Volatility in Financial Time Series
Daida, Jason, Steven Ross, Jeffrey McClain, Derrick Ampy, and Michael Holczer Challenges with Verification, Repeatability, and Meaningful Comparisons in Genetic Programming
Dain, Robert A. Genetic Programming For Mobile Robot Wall-Following Algorithms
Deakin, Anthony G., and Derek F. Yates Economical Solutions with Genetic Programming: the Non-Hamstrung Squadcar Problem, FvM and EHP
Dracopoulos, Dimitris C. Evolutionary Control of a Satellite
Droste, Stefan Efficient Genetic Programming for Finding Good Generalizing Boolean Functions
Eberbach, Eugene Enhancing Genetic Programming by $-calculus
Esparcia-Alcazar, Anna J., and Ken Sharman Evolving Recurrent Neural Network Architectures by Genetic Programming
Fernandez, Thomas, and Matthew Evett Training Period Size and Evolved Trading Systems
Freitas, Alex A. A Genetic Programming Framework for Two Data Mining Tasks: Classification and Generalized Rule Induction
Fuchs, Matthias, Dirk Fuchs, and Marc Fuchs Solving Problems of Combinatory Logic with Genetic Programming
Gathercole, Chris, and Peter Ross Small Populations over Many Generations can beat Large Populations over Few Generations in Genetic Programming
Gathercole, Chris, and Peter Ross Tackling the Boolean Even N Parity Problem with Genetic Programming and Limited-Error Fitness
Geyer-Schulz, Andreas The Next 700 Programming Languages for Genetic Programming
Gray, H. F., and R. J. Maxwell Genetic Programming for Multi-class Classification of Magnetic Resonance Spectroscopy Data
Greeff, D. J., and C. Aldrich Evolution of Empirical Models for Metallurgical Process Systems
Gritz, Larry, and James K. Hahn Genetic Programming Evolution of Controllers for 3-D Character Animation
Harries, Kim, and Peter Smith Exploring Alternative Operators and Search Strategies in Genetic Programming
Haynes, Thomas On-line Adaptation of Search via Knowledge Reuse
Haynes, Thomas, and Sandip Sen Crossover Operators for Evolving A Team
Hiden, Hugo, Mark Willis, Ben McKay, and Gary Montague Non-Linear And Direction Dependent Dynamic Modelling Using Genetic Programming
Hooper, Dale C., Nicholas S. Flann, and Stephanie R. Fuller Recombinative Hill-Climbing: A Stronger Search Method for Genetic Programming
Howley, Brian Genetic Programming and Parametric Sensitivity: a Case Study In Dynamic Control of a Two Link Manipulator
Huelsbergen, Lorenz Learning Recursive Sequences via Evolution of Machine-Language Programs
Iba, Hitoshi Multiple-Agent Learning for a Robot Navigation Task by Genetic Programming
Jaske, Harri On code reuse in genetic programming
Koza, John R., Forrest H. Bennett III, Martin A. Keane, and David Andre Evolution of a Time-Optimal Fly-To Controller Circuit using Genetic Programming
Koza, John R., Forrest Bennett III, Jason Lohn, Frank Dunlap, Martin A. Keane, and David Andre Use of Architecture-Altering Operations to Dynamically Adapt a Three-Way Analog Source Identification Circuit to Accommodate a New Source
Langdon, W. B., and R. Poli An Analysis of the MAX Problem in Genetic Programming
Lensberg, Terje A Genetic Programming Experiment on Investment Behavior under Knightian Uncertainty
Luke, Sean, and Lee Spector A Comparison of Crossover and Mutation in Genetic Programming
Moore, Frank W., and Oscar N. Garcia A Genetic Programming Approach to Strategy Optimization in the Extended Two-Dimensional Pursuer/Evader Problem
Nordin, Peter, and Wolfgang Banzhaf Genetic Reasoning: Evolving Proofs with Genetic Search
Park, YoungJa, and ManSuk Song Genetic Programming Approach to Sense Clustering in Natural Language Processing
Paterson, Norman, and Mike Livesey Evolving caching algorithms in C by genetic programming
Pelikan, Martin, Vladimir Kvasnicka, and Jiri Pospichal Read's linear codes and genetic programming
Poli, Riccardo, and Stefano Cagnoni Genetic Programming with User-Driven Selection: Experiments on the Evolution of Algorithms for Image Enhancement
Poli, R., and W. B. Langdon A New Schema Theory for Genetic Programming with One-point Crossover and Point Mutation
Rosca, Justinian P. Analysis of Complexity Drift in Genetic Programming
Ryan, Conor, and Paul Walsh The Evolution of Provable Parallel Programs
Segovia, Javier, and Pedro Isasi Genetic Programming For Designing Ad Hoc Neural Network Learning Rules
Sherrah, Jamie R., Robert E. Bogner, and Abdesselam Bouzerdoum The Evolutionary Pre-Processor: Automatic Feature Extraction for Supervised Classification using Genetic Programming
Soule, Terence, and James A. Foster Code Size and Depth Flows in Genetic Programming
Teller, Astro, and David Andre Automatically Choosing the Number of Fitness Cases: The Rational Allocation of Trials
Watson, Andrew H., and Ian C. Parmee Steady State Genetic Programming With Constrained Complexity Crossover
Winkeler, Jay F., and B. S. Manjunath Genetic Programming for Object Detection
Zhang, Byoung-Tak, and Je-Gun Joung Enhancing Robustness of Genetic Programming at the Species Level
Zhao, Kai, and Jue Wang "Chromosome-Protein": A Representation Scheme
-----------------------------------------------------------------
GENETIC ALGORITHMS
Bull, Larry, and Owen Holland Evolutionary Computing in Multi-Agent Environments: Eusociality
Cantu-Paz, Erick, and David E. Goldberg Modeling Idealized Bounding Cases of Parallel Genetic Algorithms
Dill, Karen M., and Marek A. Perkowski Minimization of GRM Forms with a Genetic Algorithm
Gockel, Nicole, Martin Keim, Rolf Drechsler, and Bernd Becker A Genetic Algorithm for Sequential Circuit Test Generation based on Symbolic Fault Simulation
Kargupta, Hillol, David E. Goldberg, and Liwei Wang Extending The Class of Order-k Delineable Problems For The Gene Expression Messy Genetic Algorithm
Lathrop, James I. Compression Depth and Genetic Programs
Mullen, David S., and Ralph M. Butler Genetic Algorithms In Optimization of Adjacency Constrained Timber Harvest Scheduling Problems
Yang, Jihoon, and Vasant Honavar Feature Subset Selection Using A Genetic Algorithm
-----------------------------------------------------------------
ARTIFICIAL LIFE AND EVOLUTIONARY ROBOTICS
Balakrishnan, Karthik, and Vasant Honavar Spatial Learning for Robot Localization
Floreano, Dario, and Stefano Nolfi God Save the Red Queen! Competition in Co-Evolutionary Robotics
Hasegawa, Yasuhisa, and Toshio Fukuda Motion Generation of Two-link Brachiation Robot
Maeshiro, Tetsuya, and Masayuki Kimura Genetic Code as an Evolving Organism
Ray, Thomas S. Selecting Naturally for Differentiation
-----------------------------------------------------------------
EVOLUTIONARY PROGRAMMING AND EVOLUTION STRATEGIES
Angeline, Peter J. An Alternative to Indexed Memory for Evolving Programs with Explicit State Representations
Chellapilla, Kumar Evolutionary Programming with Tree Mutations: Evolving Computer Programs without Crossover
Greenwood, Garrison W. Experimental Observation of Chaos in Evolution Strategies
Longshaw, Tom Evolutionary learning of large Grammars
-----------------------------------------------------------------
DNA COMPUTING
Arita, Masanori, Akira Suyama, and Masami Hagiya A Heuristic Approach for Hamiltonian Path Problem with Molecules
Deaton, R., M. Garzon, R. C. Murphy, D. R. Franceschetti, J. A. Rose, and S. E. Stevens Jr. Information Transfer through Hybridization Reactions in DNA based Computing
Garzon, M., P. Neathery, R. Deaton, R. C. Murphy, D. R. Franceschetti, S. E. Stevens Jr. A New Metric for DNA Computing
Rose, J. A., Y. Gao, M. Garzon, and R. C. Murphy DNA Implementation of Finite-State Machines
-----------------------------------------------------------------
EVOLVABLE HARDWARE
Drechsler, Rolf, Nicole Gockel, Elke Mackensen, and Bernd Becker BEA: Specialized Hardware for Implementation of Evolutionary Algorithms
Kazimierczak, Jan An Approach to Evolvable Hardware representing the Knowledge Base in an Automatic Programming System
Korkin, Michael, Hugo de Garis, Felix Gers, and Hitoshi Hemmi "CBM (CAM-Brain Machine)": A Hardware Tool which Evolves a Neural Net Module in a Fraction of a Second and Runs a Million Neuron Artificial Brain in Real Time
Liu, Weixin, Masahiro Murakawa, and Tetsuya Higuchi Evolvable Hardware for On-line Adaptive Traffic Control in ATM Networks
Sipper, Moshe, Eduardo Sanchez, Daniel Mange, Marco Tomassini, Andres Perez-Uribe, and Andre Stauffer The POE Model of Bio-Inspired Hardware Systems: A Short Introduction
-----------------------------------------------------------------
CLASSIFIER SYSTEMS
Nagasaka, Ichiro, and Toshiharu Taura Geometric Representation for Shape Generation using Classifier System
Spohn, Bryan G., and Philip H.
Crowley Complexity of Strategies and the Evolution of Cooperation Westerdale, T. H. Classifier Systems--No Wonder They Don't Work ------------------------- CITATION FOR GP-97 PROCEEDINGS: Koza, John R., Deb, Kalyanmoy, Dorigo, Marco, Fogel, David B., Garzon, Max, Iba, Hitoshi, and Riolo, Rick L. (editors). 1997. Genetic Programming 1997: Proceedings of the Second Annual Conference, July 13P16, 1997, Stanford University. San Francisco, CA: Morgan Kaufmann. From jagota at cse.ucsc.edu Mon May 12 12:27:00 1997 From: jagota at cse.ucsc.edu (Arun Jagota) Date: Mon, 12 May 1997 09:27:00 -0700 Subject: PAC learning in NNs survey Message-ID: <199705121627.JAA02458@bristlecone.cse.ucsc.edu> The following refereed paper (47 pages, 118 references) is now available, in postscript form, from the Neural Computing Surveys web site: http://www.icsi.berkeley.edu/~jagota/NCS Probabilistic Analysis of Learning in Artificial Neural Networks: The PAC Model and its Variants Martin Anthony Department of Mathematics, The London School of Economics and Political Science, There are a number of mathematical approaches to the study of learning and generalization in artificial neural networks. Here we survey the `probably approximately correct' (PAC) model of learning and some of its variants. These models provide a probabilistic framework for the discussion of generalization and learning. This survey concentrates on the sample complexity questions in these models; that is, the emphasis is on how many examples should be used for training. Computational complexity considerations are briefly discussed for the basic PAC model. Throughout, the importance of the Vapnik-Chervonenkis dimension is highlighted. Particular attention is devoted to describing how the probabilistic models apply in the context of neural network learning, both for networks with binary-valued output and for networks with real-valued output. 
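As a pointer for readers new to the model: the survey's central question (how many training examples suffice) is captured by the classical PAC sample-complexity bound in terms of the VC dimension. The bound below is a standard textbook result, stated here for orientation rather than quoted from the paper:

```latex
% Classical PAC sample-complexity bound: a consistent learner over a
% hypothesis class H of VC dimension d, given m i.i.d. training examples,
% outputs with probability at least 1-\delta a hypothesis of error at
% most \epsilon, provided
m \;=\; O\!\left(\frac{1}{\epsilon}\left(d \ln\frac{1}{\epsilon} \;+\; \ln\frac{1}{\delta}\right)\right).
```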
From sml%essex.ac.uk at seralph21.essex.ac.uk Mon May 12 12:04:08 1997 From: sml%essex.ac.uk at seralph21.essex.ac.uk (Simon Lucas) Date: Mon, 12 May 1997 17:04:08 +0100 Subject: Face Recognition with the continuous n-tuple classifier (paper available) Message-ID: <33773F78.1464@essex.ac.uk> The following paper is available: Face Recognition with the continuous n-tuple classifier S.M. Lucas Submitted to the British Machine Vision Conference-97 Face recognition is an important field of research with many potential applications for suitably efficient systems, including biometric security and searching large face databases. This paper describes a new approach to the problem based on a new type of n-tuple classifier: the continuous n-tuple system. Results indicate that the new method is faster and more accurate than previous methods reported in the literature on the widely used Olivetti Research Laboratories face database. This paper is available via my web page: http://esewww.essex.ac.uk/~sml Comments welcome, as always. Best Regards, Simon Lucas ------------------------------------------------ Dr. Simon Lucas Department of Electronic Systems Engineering University of Essex Colchester CO4 3SQ United Kingdom Tel: (+44) 1206 872935 Fax: (+44) 1206 872900 Email: sml at essex.ac.uk http://esewww.essex.ac.uk/~sml secretary: Mrs Janet George (+44) 1206 872438 ------------------------------------------------- From rod at imm.dtu.dk Mon May 12 12:13:25 1997 From: rod at imm.dtu.dk (R. Murray-Smith) Date: Mon, 12 May 1997 18:13:25 +0200 Subject: New Book: Multiple Model Approaches to Nonlinear Modelling and Control Message-ID: <337741A5.1C4D@imm.dtu.dk> New Book. Full details available at http://www.itk.ntnu.no/SINTEF/ansatte/Johansen_Tor.Arne/mmamc/mmamc_book.html Multiple Model Approaches to Modelling and Control Roderick Murray-Smith and Tor Arne Johansen (Eds.) 
---------------------------------------------------------------------- This book presents a variety of approaches which produce complex models or controllers by piecing together a number of simpler subsystems. This divide-and-conquer strategy is a long-standing and general way of coping with complexity in engineering systems, nature and human problem solving. More complex plants, advances in information technology, and tighter economic and environmental constraints in recent years have led to practising engineers being faced with modelling and control problems of increasing complexity. When confronted with such problems, there is a strong intuitive appeal in building systems which operate robustly over a wide range of operating conditions by decomposing the problem, even a nonlinear one, into a number of simpler linear modelling or control problems. This appeal has been a factor in the development of increasingly popular `local' and multiple-model approaches to coping with strongly nonlinear and time-varying systems. Such local approaches are directly based on the divide-and-conquer strategy, in the sense that the core of the representation of the model or controller is a partitioning of the system's full range of operation into multiple smaller operating regimes, each of which is associated with a locally valid model or controller. This can often give a simplified and transparent nonlinear model or control representation. In addition, the local approach has computational advantages: it lends itself to adaptation and learning algorithms, and allows direct incorporation of high-level and qualitative plant knowledge into the model. These advantages have proven to be very appealing for industrial applications, and the practical, intuitively appealing nature of the framework is demonstrated in chapters describing applications of local methods to problems in the process industries, biomedical applications and autonomous systems. 
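The operating-regime idea described above can be sketched in a few lines: partition the operating range with normalised Gaussian validity functions, fit one local affine model per regime by weighted least squares, and interpolate the local predictions. The sketch below is illustrative only; the regime centres, widths and fitting scheme are our assumptions, not taken from the book.

```python
# Illustrative sketch of a "local model network": local affine models
# blended by normalised Gaussian validity functions over the operating
# point. All parameters here are assumptions for the sketch.
import numpy as np

def validity(x, centres, width):
    # Gaussian validity of each regime at each operating point x,
    # normalised so the weights sum to one across regimes.
    rho = np.exp(-0.5 * ((x[:, None] - centres[None, :]) / width) ** 2)
    return rho / rho.sum(axis=1, keepdims=True)

def fit_local_models(x, y, centres, width):
    # Weighted least squares: one affine model (slope, intercept)
    # per operating regime, weighted by that regime's validity.
    w = validity(x, centres, width)
    A = np.stack([x, np.ones_like(x)], axis=1)
    models = []
    for j in range(len(centres)):
        W = w[:, j]
        theta, *_ = np.linalg.lstsq(A * W[:, None], y * W, rcond=None)
        models.append(theta)
    return np.array(models)  # shape (n_regimes, 2)

def predict(x, models, centres, width):
    # Interpolate the local affine predictions with the validity weights.
    w = validity(x, centres, width)
    local = models[:, 0][None, :] * x[:, None] + models[:, 1][None, :]
    return (w * local).sum(axis=1)
```

For example, ten local affine models blended this way approximate y = sin(x) on [0, 2*pi] closely, even though each submodel is linear.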
The successful application of the ideas to demanding problems is already encouraging, but creative development of the basic framework is needed to better allow the integration of human knowledge with automated learning. The underlying question is `How should we partition the system - what is `local'?'. This book presents alternative ways of bringing submodels together, which lead to varying levels of performance and insight. Some are further developed for autonomous learning of parameters from data, while others have focused on the ease with which prior knowledge can be incorporated. It is interesting to note that researchers in Control Theory, Neural Networks, Statistics, Artificial Intelligence and Fuzzy Logic have more or less independently developed very similar modelling methods, calling them Local Model Networks, Operating Regime based Models, Multiple Model Estimation and Adaptive Control, Gain Scheduled Controllers, Heterogeneous Control, Mixtures of Experts, Piecewise Models, Local Regression techniques, or Takagi-Sugeno Fuzzy Models, among other names. Each of these approaches has different merits, varying in the ease of introduction of existing knowledge, as well as the ease of model interpretation. This book attempts to outline much of the common ground between the various approaches, encouraging the transfer of ideas. Recent progress in algorithms and analysis is presented, with constructive algorithms for automated model development and control design, as well as techniques for stability analysis, model interpretation and model validation.
----------------------------------------------------------------------
Table Of Contents
Preface - the book outline
The Operating Regime Approach to Nonlinear Modelling and Control
   Tor Arne Johansen, SINTEF, and Roderick Murray-Smith, Daimler-Benz AG
Fuzzy Set Methods for Local Modelling and Identification
   R. Babuska and H.B. Verbruggen, Delft University of Technology
Modelling of Electrically Stimulated Muscle
   H. Gollee, University of Glasgow, K.J. Hunt, Daimler-Benz AG, N. Donaldson, University College London, and J. Jarvis, University of Liverpool
Process Modelling Using the Functional State Approach
   Aarne Halme, Arto Visala and Xia-Chang Zhang, Helsinki University of Technology
Markov Mixtures of Experts
   Marina Meila and Michael Jordan, Massachusetts Institute of Technology
Active Learning with Mixture Models
   David Cohn, Zoubin Ghahramani and Michael Jordan, Massachusetts Institute of Technology
Local Learning in Local Model Networks
   Roderick Murray-Smith, Daimler-Benz AG, and Tor Arne Johansen, SINTEF
Side-Effects of Normalising Basis Functions in Local Model Networks
   Robert Shorten and Roderick Murray-Smith, Daimler-Benz AG
The Composition and Validation of Heterogeneous Control Laws
   B. Kuipers, University of Texas at Austin, and K. Astrom, Lund Institute of Technology
Local Laguerre Models
   Daniel Sbarbaro, University of Concepcion
Multiple Model Adaptive Control
   Kevin D. Schott and B. Wayne Bequette, Rensselaer Polytechnic Institute
H-infinity Control of Nonlinear Processes Using Multiple Linear Models
   A. Banerjee, Y. Arkun, Georgia Institute of Technology, and R. Pearson and B. Ogunnaike, DuPont
Synthesis of Fuzzy Control Systems Based on Linear Takagi-Sugeno Fuzzy Models
   J. Zhao, R. Gorez and V. Wertz, Catholic University of Louvain
-------------------------------------------------------------------- Ordering Information ISBN Number 07484 0595 X The book is hardback, 350 pages, published by Taylor and Francis and costs 55.00 pounds sterling. You can order over the web (http://www.tandf.co.uk/books/borders.htm), order by E-mail (from UK, Europe and Asia use book.orders at tandf.co.uk. For USA use bkorders at tandfpa.com) or write, phone or fax to Taylor and Francis: The Book Ordering Department, Taylor and Francis, Rankine Road, Basingstoke, Hants RG24 8PR, UK Telephone: +44 (0) 1256 813000 Ext. 
236, Fax: +44 (0) 1256 479438 --------------------------------------------------------------------- From jbower at bbb.caltech.edu Tue May 13 22:03:32 1997 From: jbower at bbb.caltech.edu (James M. Bower) Date: Tue, 13 May 1997 18:03:32 -0800 Subject: Registration open for CNS*97 Message-ID: From bishopc at helios.aston.ac.uk Wed May 14 03:10:53 1997 From: bishopc at helios.aston.ac.uk (Prof. Chris Bishop) Date: Wed, 14 May 1997 08:10:53 +0100 Subject: Summer School: Probabilistic Graphical Models Message-ID: <18477.199705140710@sun.aston.ac.uk> --------------------------------------------------------------------------- A Newton Institute EC Summer School PROBABILISTIC GRAPHICAL MODELS 1 - 5 September 1997 Isaac Newton Institute, Cambridge, U.K. Organisers: C M Bishop (Aston) and J Whittaker (Lancaster) Probabilistic graphical models provide a very general framework for representing complex probability distributions over sets of variables. A powerful feature of the graphical model viewpoint is that it unifies many of the common techniques used in pattern recognition and machine learning including neural networks, latent variable models, probabilistic expert systems, Boltzmann machines and Bayesian belief networks. Indeed, the increasing interactions between the neural computing and graphical modelling communities have resulted in a number of powerful new ideas and techniques. The conference will include several tutorial presentations on key topics as well as advanced research talks. 
Provisional themes: Conditional independence; Bayesian belief networks; message propagation; latent variable models; variational techniques; mean field theory; learning and estimation; model search; EM and MCMC algorithms; axiomatic approaches; causality; decision theory; neural networks; information and coding theory; scientific applications and examples. Provisional list of speakers: C M Bishop (Aston) D J C MacKay (Cambridge) R Cowell (City) J Pearl (UCLA) A P Dawid (UCL) M D Perlman (Washington) D Geiger (Technion) M Piccioni (Aquila) E George (Texas) R Shachter (Stanford) W Gilks (Cambridge) J Q Smith (Warwick) D Heckerman (Microsoft) M Studeny (Prague) G E Hinton (Toronto) M Titterington (Glasgow) T Jaakkola (UCSC) J Whittaker (Lancaster) M I Jordan (MIT) S Lauritzen (Aalborg) B Kappen (Nijmegen) D Spiegelhalter (Cambridge) M Kearns (AT&T) S Russell (Berkeley) This instructional conference will form a component of the Newton Institute programme on Neural Networks and Machine Learning, organised by C M Bishop, D Haussler, G E Hinton, M Niranjan and L G Valiant. Further information about the programme is available via the WWW at http://www.newton.cam.ac.uk/programs/nnm.html Location and Costs: The conference will take place in the Isaac Newton Institute and accommodation for participants will be provided at Wolfson Court, adjacent to the Institute. The conference package costs 270 UK pounds which includes accommodation from Sunday 31 August to Friday 5 September, together with breakfast, lunch during the days that the lectures take place and evening meals. Applications: To participate in the conference, please complete and return an application form and, for students and postdoctoral fellows, arrange for a letter of reference from a senior scientist. Limited financial support is available for participants from appropriate countries. 
Application forms are available from the conference Web Page at http://www.newton.cam.ac.uk/programs/nnmec.html Completed forms and letters of recommendation should be sent to Heather Dawson at the Newton Institute, or by e-mail to h.dawson at newton.cam.ac.uk *Closing Date for the receipt of applications and letters of recommendation is 16 June 1997* --------------------------------------------------------------------------- From moatl at cs.tu-berlin.de Wed May 14 11:33:02 1997 From: moatl at cs.tu-berlin.de (Martin Stetter) Date: Wed, 14 May 1997 17:33:02 +0200 Subject: PhD-fellowship in Computer Science Message-ID: <3379DB2D.313E@cs.tu-berlin.de> PhD-fellowship in Computer Science, Technische Universitaet Berlin, Germany Acquisition and Analysis of Optical Imaging Data from Primate Visual Cortex The Neural Information Processing Group (Department of Computer Science, Technische Universitaet Berlin, Germany) solicits applications for a predoctoral fellowship. The applicant is expected to join an international collaborative project which aims at the development of new methodologies for image acquisition, image analysis, and for the separation of different superimposed signal components from optical imaging data, which are recorded from primate visual cortex. Although the focus will be on statistical signal processing, the applicant is also expected to participate in the setup of a time-resolved optical imaging system as well as in its modification for depth-resolved measurements of cortical activity. Applicants should have a strong background in a theoretical discipline, such as physics, mathematics, electrical engineering or computer science, should have gained experience with both hardware (optics, electronics) and programming (C, C++), and should be familiar with signal processing techniques. In addition, applicants should be willing to stay abroad for parts of the project. 
Prior biological or neuroscience training is not required but the applicant is expected to acquire the relevant expertise during the project. The position is initially for one year with possible extension of up to five years. Salary is commensurate with BAT IIa/2. Please send applications including copies of certificates, CV, list of publications, statement of research interests and list of skills relevant to the project to: Prof. Klaus Obermayer, FR2-1, Informatik, Technische Universitaet Berlin, Franklinstrasse 28/29, 10587 Berlin, Germany, phone: ++49-30-314-73120, fax: ++49-30-314-73121, email: oby at cs.tu-berlin.de preferably by fax or email. From rojas at inf.fu-berlin.de Thu May 15 21:44:00 1997 From: rojas at inf.fu-berlin.de (Raul Rojas) Date: Thu, 15 May 97 21:44 MET DST Subject: New Book: "Neural Networks" Message-ID: This e-mail is to announce a new book on neural networks: "Neural Networks - A Systematic Introduction" by Raul Rojas, With a Foreword by Jerome Feldman, Springer-Verlag, Berlin-New York, 1996 (502 pp., 350 illustrations). The book has a homepage with a sample chapter ("The Backpropagation Algorithm", 33 pp.) that you are invited to download. The address of the homepage is http://www.inf.fu-berlin.de/~rojas/neural This is the Review which appeared in April in "Computing Reviews": Connectionism and neural nets Rojas, Raul (Univ. Halle, Halle, Germany) 9704-0262 Neural networks: a systematic introduction. Springer-Verlag New York, Inc., New York, NY, 1996, 502 pp., $39.95, ISBN 3-540-60505-3. If you want a systematic and thorough overview of neural networks, need a good reference book on this subject, or are giving or taking a course on neural networks, this book is for you. More generally, the book is of value for anyone interested in understanding artificial neural networks or in learning more about them. It attempts to solve the puzzle of artificial neural models and proposals. 
Rojas systematically introduces and discusses each of the neural network models in relation to the others. The book is divided into 18 chapters, each designed to be taught in about one week. The only mathematical tools needed to understand the text are those learned during the first two years at university. The first eight chapters form a logical sequence, and later ones can be covered in a variety of orders. The first eight chapters are "The Biological Paradigm"; "Threshold Logic"; "Weighted Networks - The Perceptron"; "Perceptron Learning"; "Unsupervised Learning and Clustering Algorithms"; "One and Two Layered Networks"; "The Backpropagation Algorithm"; and "Fast Learning Algorithms." The later chapters cover "Statistics and Neural Networks"; "The Complexity of Learning"; "Fuzzy Logic"; "Associative Networks"; "The Hopfield Model"; "Stochastic Networks"; "Kohonen Networks"; "Modular Neural Networks"; "Genetic Algorithms"; and "Hardware for Neural Networks." Proofs are rigorous, but not overly formal, and the author makes extensive use of geometric intuition and diagrams. There are a modest number of exercises at the end of each chapter. Material from the book has been used successfully for courses in Germany, Austria and the United States. It seems quite extensive for a one-semester course. Neural network applications are discussed, with the emphasis on computational rather than engineering issues. Those who want to expend a minimum amount of time and effort on a first overview of neural networks and those who need to apply neural network technology in the most cost-effective way for a specific task, should consult another reference as well. On the whole, though, the author has done excellent work. The book includes an index and an up-to-date and useful list of 473 references. The German edition has been quite successful and has been through five printings in three years. The English version has been radically rewritten and deserves the same success. -J. 
Tepnadi, Tallinn, Estonia --------------------- From smagt at dlr.de Fri May 16 01:38:00 1997 From: smagt at dlr.de (Patrick van der Smagt) Date: Fri, 16 May 1997 07:38:00 +0200 Subject: BOOK ANNOUNCEMENT: Neural Systems for Robotics Message-ID: <337BF2B8.400B@dlr.de> BOOK ANNOUNCEMENT ================= Neural Systems for Robotics ed. by Omid Omidvar and Patrick van der Smagt Academic Press, 1997 ISBN 0125262809 http://www.apcatalog.com/cgi-bin/AP?ISBN=0125262809&LOCATION=US&FORM=FORM2 http://www.amazon.com/exec/obidos/ISBN%3D0125262809/6344-0055321-349380 In this book we attempt to give an overview of state-of-the-art applications of neural methodologies for robot control. This is done via in-depth and summarizing studies.
Table of contents:
==================
Neural Network Sonar as a Perceptual Modality for Robotics
   Itiel E. Dror, Mark Zagaeski, Damien Rios, Cynthia F. Moss
Dynamic Balance of a Biped Walking Robot
   W. Thomas Miller III, Andrew L. Kun
Visual Feedback in Motion
   Patrick van der Smagt, Frans Groen
Inverse Kinematics of Dextrous Manipulators
   David DeMers, Kenneth Kreutz-Delgado
Stable Manipulator Trajectory Control Using Neural Networks
   Yichuang Jin, Tony Pipe, Alan Winfield
The Neural Dynamics Approach to Sensory-Motor Control
   Paolo Gaudiano, Frank H. Guenther, Eduardo Zalama
Operant Conditioning in Robots
   Andreas Bühlmeier, Gerhard Manteuffel
A Dynamic Net for Robot Control
   Bridget Hallam, John Hallam, Gillian Hayes
Neural Vehicles
   Ben Kröse, Joris van Dam
Self-Organization and Autonomous Robots
   Jukka Heikkonen, Pasi Koikkalainen
From the preface
================
The chapters in this book are logically selected and grouped. The path that is followed goes through four stages:
* Research inspired by biological systems at the behavioral level
* Control of robot arms using artificial neural networks
* Simulation of and inspiration by biological neural systems
* Control and navigation of mobile robots using artificial neural networks. 
The first three chapters describe neural networks which simulate biological systems at the behavioral level. The third chapter ends with neural control of a robot arm; this topic is picked up by the subsequent---overview---chapter, followed by an in-depth study in this field. The next three chapters are focused on biological neural systems, and describe applications in the navigation of mobile robots. This theme is covered in detail in the final two chapters. Evaluating a biological system at the behavioral level, Chapter 1, ``Neural Network Sonar as a Perceptual Modality for Robotics,'' by Itiel Dror, Mark Zagaeski, Damien Rios, and Cynthia Moss, describes a neural network which approximates echo-locating behavior of the big brown bat, Eptesicus fuscus. Using previous studies of this bat, a neural system is introduced which can determine speed of movement using a single echolocation only, referring back to studies which show that bats differentiate between different wingbeat rates of insects. The results presented in this chapter provide a good basis for the use of echolocation in robotic systems. In Chapter 2, ``Dynamic Balance of a Biped Walking Robot,'' by Thomas Miller III and Andrew Kun, a neural system is used to have a robot learn to walk. The approach is unique: Instead of using analyses of walking behavior of biological systems, the neural network-driven robot uses feedback from force sensors mounted on the undersides of the feet, as well as from accelerometers mounted on the body. The learning behavior that is exhibited typically resembles that of biological systems which learn to walk. A technique for the control of robot manipulators is introduced in Chapter 3, ``Visual Feedback in Motion,'' by Patrick van der Smagt and Frans Groen. This research is also inspired by a biological system at the behavioral level. 
Using studies of the gannet from the family of Sulidae, sequences of two-dimensional visual signals are interpreted to guide a monocular robot arm in three-dimensional space without using models of either the visual sensor or the robot arm. Exploration of the control of robot arms is continued in Chapter 4, ``Inverse Kinematics of Dextrous Manipulators,'' by David DeMers and Kenneth Kreutz-Delgado. The chapter gives an overview of neural and non-neural methods to solve the inverse kinematics problem: Given an end-effector position and orientation, how should one move a robot arm (in a most efficient way) to reach that position/orientation? The theoretically inclined Chapter 5, ``Stable Manipulator Trajectory Control Using Neural Networks,'' by Yichuang Jin, Tony Pipe, and Alan Winfield, describes neural network approaches for trajectory following of a robot arm. The key issue here is how to improve the accuracy of the followed trajectory when the dynamic model of the robot arm is inaccurate. Studies of sensory motor control in biological organisms and robots are presented in Chapter 6, ``The Neural Dynamics Approach to Sensory-Motor Control,'' by Paolo Gaudiano, Frank Guenther, and Eduardo Zalama. It extensively discusses neural network models developed at Boston University's Center for Adaptive Systems. The neural models are used in two applications: trajectory following of a mobile robot, and controlling the motor skills required for speech reproduction using auditory-orosensory feedback. Biomorphic robots are discussed in Chapter 7, ``Operant Conditioning in Robots,'' by Andreas Bühlmeier and Gerhard Manteuffel. In their overview chapter, they discuss neural systems which maintain homeostasis for (mobile) robot systems. After discussing neural learning systems with neurophysiological backgrounds, a survey of several implementations on mobile robots, which have to learn to navigate between obstacles, is given. 
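As a concrete instance of the inverse kinematics question posed for Chapter 4: for a planar two-link arm the problem has a well-known closed-form geometric solution. The sketch below is purely illustrative (the function names and elbow convention are our choices, not code from the book):

```python
# Closed-form inverse kinematics for a planar two-link arm with link
# lengths l1, l2: find joint angles placing the end effector at (x, y).
import math

def two_link_ik(x, y, l1, l2, elbow_up=True):
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle.
    c2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2) * (1.0 if elbow_up else -1.0)
    # Shoulder angle: target direction minus the wrist offset angle.
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

def forward(theta1, theta2, l1, l2):
    # Forward kinematics, used here to verify the inverse solution.
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y
```

Neural approaches such as those surveyed in Chapter 4 become attractive precisely where such closed forms stop existing: redundant arms, extra joints, or unknown kinematic parameters.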
In Chapter 8, ``A Dynamic Net for Robot Control,'' by Bridget Hallam, John Hallam, and Gillian Hayes, a neural model, designed for explaining various learning phenomena from animal literature, is used to control a mobile robot. The navigation of mobile robots using artificial neural networks is covered in Chapter 9, ``Neural Vehicles,'' by Ben Kröse and Joris van Dam. The authors make the distinction between reactive navigation, planned navigation in known environments, and map building from sensor signals. In the final chapter, ``Self-Organization and Autonomous Robots,'' Jukka Heikkonen and Pasi Koikkalainen describe the use of self-organizing maps for reactive control of mobile robots. -- dr Patrick van der Smagt phone +49 8153 281152 DLR/Institute of Robotics and Systems Dynamics fax +49 8153 281134 P.O. Box 1116, 82230 Wessling, Germany email From Paul.Vitanyi at cwi.nl Tue May 13 08:30:45 1997 From: Paul.Vitanyi at cwi.nl (Paul.Vitanyi@cwi.nl) Date: Tue, 13 May 1997 14:30:45 +0200 Subject: Book Announcement: 2nd Edition Li-Vitanyi on Kolmogorov Complexity & Appl. Message-ID: <9705131230.AA18126=paulv@gnoe.cwi.nl> Ming Li and Paul Vitanyi, AN INTRODUCTION TO KOLMOGOROV COMPLEXITY AND ITS APPLICATIONS, REVISED AND EXPANDED SECOND EDITION, Springer-Verlag, New York, 1997, xx+637 pp, 41 illus. Hardcover $49.95/ISBN 0-387-94868-6 (Graduate Texts in Computer Science Series) See the web pages "http://www.cwi.nl/~paulv/kolmogorov.html" and "http://www.springer-ny.com/catalog/np/nov96np/DATA/0-387-94868-6.html" a subpage of "http://www.springer-ny.com/". The first edition appeared late in 1993. The second edition is revised and expanded by about 90 pages. The price is reduced by $9.05. From jose.millan at jrc.it Mon May 19 06:47:11 1997 From: jose.millan at jrc.it (jose.millan) Date: Mon, 19 May 97 12:47:11 +0200 Subject: Job opening Message-ID: <9705191047.AA01752@ jrc.it> I apologize if you receive this announcement multiple times. 
Best regards, Jose **************************************************************************** POSTDOCTORAL RESEARCH FELLOWSHIP available at the Institute for Systems, Informatics and Safety Joint Research Centre of the European Commission 21020 Ispra (VA) Italy Applications are invited for a one-year postdoctoral research position in the area of "Robot Learning". The candidate will carry out applied research on the use of reinforcement learning and other neural network paradigms for (1) the acquisition of efficient reactive navigation strategies, (2) map building, and (3) the integration of both---i.e., topological reasoning and reactive control. The algorithms will be implemented on physical mobile robots equipped with range sensors (sonar, infrared and/or laser range finder) and devoted to surveillance and safeguards applications. The ideal candidate should have experience with neural networks and mobile robots, good programming skills in C/C++, ability to communicate research results, and be willing to build on previous work. The position is available immediately. It is funded by the European Commission in the framework of the SMART-II Network. Thus, only citizens of the European Union or associated countries are eligible. The gross salary is about 2400 ECU/month. There is travel funding in case of papers accepted at important conferences. Interested candidates should send a full CV and the names of 2 referees by email to jose.millan at jrc.it ------------------------------- Jose del R. Millan, Ph.D. 
Institute for Systems, Informatics and Safety Joint Research Centre of the European Commission 21020 Ispra (VA) Italy e-mail: jose.millan at jrc.it Phone: +39 - 332 - 78 5751 Fax: +39 - 332 - 78 9185 From drl at eng.cam.ac.uk Mon May 19 13:50:32 1997 From: drl at eng.cam.ac.uk (drl@eng.cam.ac.uk) Date: Mon, 19 May 97 13:50:32 BST Subject: Cambridge Neural Networks Summer School '97 Message-ID: <9705191250.6232@ganesh.eng.cam.ac.uk> +-----------------------------------------------------+ | THE SEVENTH CAMBRIDGE NEURAL NETWORKS SUMMER SCHOOL | +-----------------------------------------------------+ Neural computation, network design and industrial applications September 22-24, 1997 Emmanuel College, Cambridge, UK. Course director: David Lovell. This three-day school provides an introduction to, and an overview of, the field of neural computation. The course is aimed at a broad range of participants, from those needing to assess the potential of neural networks for their own business to those wishing to keep up to date with recent developments. As well as 23 presentations from international experts in the field, the course offers a hands-on session, a laboratory tour and sessions devoted to neural network applications. Discounts are available for academics, and there are fully-funded places available for EPSRC students. The deadline for applications for EPSRC funding is Friday June 13, 1997. Full details of the course, registration and EPSRC funding application forms are available via: http://svr-www.eng.cam.ac.uk/~drl/cnnss97/brochure.html For enquiries or reservations please contact Lynda Bryers: by 'phone on: +44 (0)1223 302233 by fax on: +44 (0)1223 301122 by email on: CPI at hermes.cam.ac.uk by post to: University of Cambridge Programme for Industry 1 Trumpington Street, Cambridge CB2 1QA, UK List of speakers and presentation titles Chris BISHOP 1.Regularization and model complexity. 2.Density estimation, mixture models and the EM algorithm.
3.(ADV) Latent variables, topographic mappings and data visualization. Herve BOURLARD 1.Statistics, neural nets and parallels with conventional algorithms. 2.Speech recognition. 3.(ADV) Applications of neural nets to speech recognition. George HARPUR 1.An introduction to unsupervised learning. 2.ICA and information theoretic approaches to unsupervised learning. David LOVELL 1.Neural computing in perspective (course framework). 2.(APP) Predicting risk in pregnancy using neural networks. John MOODY 1.Time series prediction: classical and nonlinear approaches. 2.Neural networks for time series analysis. 3.(APP) Models for economic and financial time series. Mahesan NIRANJAN 1.Neural Networks in Signal Processing. Richard PRAGER 1.Classification Trees and the CMAC. Rich SUTTON 1.Reinforcement learning I: learning to act. 2.Reinforcement learning II: temporal-difference learning. 3.(APP) Reinforcement learning III: generalization and cognition. Volker TRESP 1.Introduction to supervised learning in neural networks. 2.Combining neural networks: stacking, arcing, boosting, bagging, bragging and all that. 3.(APP) Does it all work? Successful industrial applications of neural networks. Chris WILLIAMS 1.Gaussian processes for regression. 2.(APP) Estimating wind-fields from satellite data with neural networks and Gaussian processes. From kruschke at croton.psych.indiana.edu Mon May 19 13:34:28 1997 From: kruschke at croton.psych.indiana.edu (John Kruschke) Date: Mon, 19 May 1997 12:34:28 -0500 (EST) Subject: 30th Annual Math Psych Conference Message-ID: A preliminary version of the program for the Thirtieth Annual Meeting of the Society for Mathematical Psychology (SMP), to be held July 31-Aug 3, 1997, is now available on the World Wide Web at the address: http://www.indiana.edu/~mathpsy/ Registration forms and hotel and travel information will be added in the near future. 
A complete regular mailing of this information for all SMP members and conference participants will take place next week. Some highlights for this year's conference include a symposium on formal models of face recognition, a symposium on the use of selective influence in the analysis of mental architectures, and a satellite conference on methods for model selection (to be held Aug 3-4; the program is available at http://www.cwi.nl/~pdg/modsel.html). Invited addresses are being given by Thomas Landauer (Latent Semantic Analysis), Roger Ratcliff (Diffusion Model for Reaction Time), and Jerald Balakrishnan (Misrepresentations of Signal Detection Theory). Multiple sessions are planned in areas of learning and memory, judgment and decision making, categorization, information processing, sensation and perception, measurement and scaling, and methodology and statistics. The activities include an opening reception Thursday evening, a banquet Friday evening, and an opportunity to attend an opera on Saturday evening. We hope you can attend. 
Sincerely, Robert Nosofsky Richard Shiffrin From ericr at mech.gla.ac.uk Mon May 19 06:26:28 1997 From: ericr at mech.gla.ac.uk (Eric Ronco) Date: Mon, 19 May 1997 11:26:28 +0100 (BST) Subject: No subject Message-ID: <19265.199705191026@googie.mech.gla.ac.uk> From sbcho at csai.yonsei.ac.kr Tue May 20 08:28:58 1997 From: sbcho at csai.yonsei.ac.kr (Sung-Bae Cho) Date: Tue, 20 May 1997 21:28:58 +0900 (KST) Subject: CFP: Hybrid Evolutionary Learning Systems Message-ID: <9705201228.AA03396@csai.yonsei.ac.kr> ------------------------- CALL FOR PAPERS ------------------------- Special Session on "Hybrid Evolutionary Learning Systems" at ICONIP'97 The Fourth International Conference on Neural Information Processing November 24~28, 1997 Dunedin/Queenstown, New Zealand -------------------------------------------------------------------- As part of the International Conference on Neural Information Processing, a special session is planned on Hybrid Evolutionary Learning Systems. The session will be devoted to exploring different hybrid approaches combining Neural Networks, Fuzzy Logic and Evolutionary Computation for achieving better learning systems. The scope of the special session will include any topic related to hybrid learning systems based not only on the above soft-computing techniques but also on any biologically inspired methodology. Prospective authors are invited to submit three copies of the paper written in English on A4-format white paper with one inch margins on all four sides, in two column format, on not more than 4 pages, single-spaced, in a Times or similar font of 10 points, and printed on one side of the page only. Centred at the top of the first page should be the complete title, author(s), mailing and e-mail addresses, followed by an abstract and the text.
Those who are interested should send a title and an extended abstract (not more than 300 words) via email; the manuscripts should be sent to the following address no later than June 16: Session Chair: Prof. Sung-Bae Cho Dept. of Computer Science Yonsei University 134 Shinchon-dong, Sudaemoon-ku Seoul 120-749, Korea Tel: +82 2 361-2720 Fax: +82 2 365-2579 Email: sbcho at csai.yonsei.ac.kr Important Dates: June 16, 1997 Paper Submission Due July 20, 1997 Notification of Acceptance August 20, 1997 Final Submission For general information about ICONIP'97, please visit the web page of the conference at http://divcom.otago.ac.nz:800/com/infosci/kel/iconip97.htm. -------------------------------------------------------------------- From gordon at AIC.NRL.Navy.Mil Tue May 20 10:29:16 1997 From: gordon at AIC.NRL.Navy.Mil (gordon@AIC.NRL.Navy.Mil) Date: Tue, 20 May 97 10:29:16 EDT Subject: workshop Message-ID: <9705201429.AA10349@sun14.aic.nrl.navy.mil> ======= CALL FOR PARTICIPATION REINFORCEMENT LEARNING: TO MODEL OR NOT TO MODEL, THAT IS THE QUESTION Workshop at the Fourteenth International Conference on Machine Learning (ICML-97) Vanderbilt University, Nashville, TN July 12, 1997 www.cs.cmu.edu/~ggordon/ml97ws Recently there has been some disagreement in the reinforcement learning community about whether finding a good control policy is helped or hindered by learning a model of the system to be controlled. Recent reinforcement learning successes (Tesauro's TD-gammon, Crites' elevator control, Zhang and Dietterich's space-shuttle scheduling) have all been in domains where a human-specified model of the target system was known in advance, and have all made substantial use of the model. On the other hand, there have been real robot systems which learned tasks either by model-free methods or via learned models. The debate has been exacerbated by the lack of fully satisfactory algorithms on either side for comparison.
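The direct-versus-indirect distinction the workshop is organized around can be made concrete with a toy sketch. This is my own illustration, not code from any of the systems cited: a tabular Q-learning update is purely model-free, while a Dyna-style variant additionally learns a one-step model and replays simulated transitions from it, trading extra computation for better data efficiency.

```python
import random
from collections import defaultdict

def q_update(Q, s, a, r, s2, actions, alpha=0.1, gamma=0.9):
    """Model-free (direct) TD update: no model of the dynamics is kept."""
    best_next = max(Q[(s2, b)] for b in actions)
    Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])

def dyna_update(Q, model, s, a, r, s2, actions, n_planning=5, **kw):
    """Indirect (Dyna-style) update: apply the direct update, record the
    transition in a one-step model, then replay simulated experience
    drawn from the model."""
    q_update(Q, s, a, r, s2, actions, **kw)
    model[(s, a)] = (r, s2)  # deterministic one-step model, for simplicity
    for _ in range(n_planning):
        (ps, pa), (pr, ps2) = random.choice(list(model.items()))
        q_update(Q, ps, pa, pr, ps2, actions, **kw)
```

The two functions differ only in the planning loop, which is exactly where the workshop questions (data efficiency vs. time efficiency, what happens when goals change) come into play.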
Topics for discussion include (but are not limited to) o Case studies in which a learned model either contributed to or detracted from the solution of a control problem. In particular, does one method have better data efficiency? Time efficiency? Space requirements? Final control performance? Scaling behavior? o Computational techniques for finding a good policy, given a model from a particular class -- that is, what are good planning algorithms for each class of models? o Approximation results of the form: if the real system is in class A, and we approximate it by a model from class B, we are guaranteed to get "good" results as long as we have "sufficient" data. o Equivalences between techniques of the two sorts: for example, if we learn a policy of type A by direct method B, it is equivalent to learning a model of type C and computing its optimal controller. o How to take advantage of uncertainty estimates in a learned model. o Direct algorithms combine their knowledge of the dynamics and the goals into a single object, the policy. Thus, they may have more difficulty than indirect methods if the goals change (the "lifelong learning" question). Is this an essential difficulty? o Does the need for an online or incremental algorithm interact with the choice of direct or indirect methods? Preliminary schedule of talks: 9:00- 9:30 Chris Atkeson "Why Model-Based Learning Should Be Inconsistent With the Model" 9:30-10:15 Jeff Schneider "Exploiting Model Uncertainty Estimates for Safe Dynamic Control Learning" 10:15-10:45 Discussion break 10:45-11:15 David Andre, Nir Friedman, and Ronald Parr "Generalized Prioritized Sweeping" 11:15-12:00 Scott Davies, Andrew Y. 
Ng, and Andrew Moore "Applying Model-Based Search to Reinforcement Learning" 12:00- 1:00 LUNCH BREAK 1:00- 1:45 Rich Sutton "Multi-Time Models: A Unified View of Modeling and Not Modeling" 1:45- 2:15 Doina Precup and Rich Sutton "Multi-Time Models for Reinforcement Learning" 2:15- 2:45 Howell, Frost, Gordon, and Wu "Real-Time Learning of Vehicle Suspension Control Laws" 2:45- 3:15 Discussion break 3:15- 3:45 Leonid Kuvayev and Rich Sutton "Approximation in Model-Based Learning" 3:45-4:15 Geoff Gordon "Wrap-up" 4:15- 5:00 Discussion Organizers: Chris Atkeson (cga at cc.gatech.edu) College of Computing Georgia Institute of Technology 801 Atlantic Drive Atlanta, GA 30332-0280 Geoff Gordon (ggordon at cs.cmu.edu) Computer Science Department Carnegie Mellon University 5000 Forbes Ave Pittsburgh, PA 15213-3891 (412) 268-3613, (412) 361-2893 Contact: Geoff Gordon (ggordon at cs.cmu.edu) From pr at physik.uni-wuerzburg.de Tue May 20 15:32:57 1997 From: pr at physik.uni-wuerzburg.de (Peter Riegler) Date: Tue, 20 May 1997 21:32:57 +0200 (METDST) Subject: thesis available Message-ID: The following Ph.D. thesis is available via anonymous-ftp. FTP-host: ftp.uni-wuerzburg.de FTP-file: file: pub/dissertation/riegler/these.ps.gz Dynamics of On-line Learning in Neural Networks Peter Riegler Institut fuer Theoretische Physik Universitaet Wuerzburg Am Hubland D-97074 Wuerzburg, Germany Abstract: One of the most important features of natural as well as artificial neural networks is their ability to adjust to their environment by ``learning''. This results in the network's ability to ``generalize'', i.e. to generate with high probability the appropriate response to an unknown input. The theoretical description of generalization in artificial neural networks by means of statistical physics is the subject of this thesis. The focus is on {\em on-line learning}, where the presentation of examples used in the learning process occurs in a sequential manner. 
Hence, the systems investigated are dynamical in nature. They typically consist of a large number of degrees of freedom, requiring a description in terms of order parameters. In the first part of this work the most fundamental network, the perceptron, is investigated. Following a recent proposal by Kinouchi and Caticha, it will be shown how one can derive a learning dynamics starting from first principles that results in an optimal generalization ability. Results will be presented for learning processes where the training examples are corrupted by different types of noise. The resulting generalization ability will be shown to be comparable to the noiseless case. Furthermore, the results obtained reveal striking similarities to those obtained for batch learning. The optimal algorithms derived will be shown to depend on the characteristics of the particular learning task, including the type and strength of the corrupting noise. In general, this requires an additional estimation of such characteristic quantities. For the strength of the noise this estimation leads to interesting dynamical phase transitions. The second part deals with the dynamical properties of two-layer neural networks. This is of particular importance since these networks are known to represent universal approximators. Understanding the dynamical features will help to construct fast training algorithms that lead to best generalization. Specifically, an exact analysis of learning a rule by on-line gradient descent (backpropagation of error) in a two-layered neural network will be presented. Here the emphasis is on adjustable hidden-to-output weights, which have so far been left out of the analyses in the literature. Results are compared with the training of networks having the same architecture but fixed weights in the second layer. It will be shown that certain features of learning in a two-layered neural network are independent of the state of the second layer.
Motivated by this result, it will be argued that putting the dynamics of the hidden-to-output weights on a faster time scale will speed up the learning process. For all systems investigated, simulations confirm the results. ________________________________________________________________ _/_/_/_/_/_/_/_/_/_/_/_/ _/ _/ Peter Riegler _/ _/_/_/ _/ Institut fuer Theoretische Physik _/ _/ _/ _/ Universitaet Wuerzburg _/ _/_/_/ _/ Am Hubland _/ _/ _/ D-97074 Wuerzburg, Germany _/_/_/ _/_/_/ phone: (++49) (0)931 888-4908 fax: (++49) (0)931 888-5141 email: pr at physik.uni-wuerzburg.de www: http://www.physik.uni-wuerzburg.de/~pr ________________________________________________________________ From ken at nagano.is.hosei.ac.jp Wed May 21 01:55:26 1997 From: ken at nagano.is.hosei.ac.jp (Ken-ichiro Miura) Date: Wed, 21 May 1997 14:55:26 +0900 Subject: CFP:ICONIP`97 -New Deadline- Message-ID: <33828E4E.1208@nagano.is.hosei.ac.jp> NEW DEADLINE Call For Papers ICONIP'97, The Fourth International Conference on Neural Information Processing, November 24-28, 1997. Dunedin/Queenstown, New Zealand "Special Session on Spatio-temporal Information Processings in the Brain" Session Co-Organizers: Minoru Tsukada (Tamagawa University), Takashi Nagano (Hosei University) Recently, spatio-temporal aspects have been recognized as very important for understanding the neural information processing mechanisms in the brain. The importance lies not only in sensory systems, such as the visual and auditory systems, but also in higher-order systems like memory and learning. Much work on these aspects is now being done. A special session devoted to this work will be organized at ICONIP'97. The scope of the special session covers computational theories, neural network models, physiological studies and psychological studies related to spatio-temporal information processing in the brain. Prospective authors are invited to submit papers to the special session.
(Traveling expenses and the conference fee are not covered.) The submissions must be received by June 16, 1997 (new deadline). Please send five copies of your manuscript to Prof. Takashi Nagano, Special Session Co-Organizer Dept. of Industrial and Systems Engineering, College of Engineering, Hosei University 3-7-2 Kajino-cho, Koganei, Tokyo, 184, JAPAN For the most up-to-date information about ICONIP'97, please browse the conference home page: http://divcom.otago.ac.nz:800/com/infosci/kel/iconip97.htm Important dates: Paper due: June 16, 1997 (new deadline) Notification of acceptance: July 20, 1997 Final camera-ready papers due: August 20, 1997 Manuscript format: Papers must be written in English on A4-format white paper with one inch margins on all four sides, in two column format, on not more than 4 pages, single-spaced, in a Times or similar font of 10 points, and printed on one side of the page only. ------------------------------------------------ Takashi Nagano Nagano Labo., Dept. of Industrial and Systems Engineering, College of Engineering, Hosei University 3-7-2 Kajino-cho, Koganei, Tokyo JAPAN Tel +81-423-87-6350 Fax +81-423-87-6350 mailto:nagano at nagano.is.hosei.ac.jp ------------------------------------------------ From helnet97 at dds.nl Wed May 21 21:11:04 1997 From: helnet97 at dds.nl (HELNET 1997 Workshop on Neural Networks) Date: Thu, 22 May 1997 01:11:04 +0000 Subject: Call for papers: HELNET Workshop on Neural Networks Message-ID: <199705212308.BAA17935@k9.dds.nl> CALL FOR PAPERS HELNET 1997 International Workshop on Neural Networks October 3 - October 5, Montreux Announcing the HELNET 1997 Workshop on Neural Networks Montreux, Switzerland from October 3 - October 5, 1997 http://www.leidenuniv.nl/medfac/fff/groepc/chaos/helnet/index.html mailto:helnet97 at dds.nl The HELNET workshops are informal meetings primarily targeted towards young researchers from neural networks and related fields.
They are traditionally organised a few days prior to the ICANN conferences. Participants are offered the opportunity to present and extensively discuss their work as well as more general topics from the neural network field. One of the aims of the HELNET Workshops is to facilitate such exchange and enable (young) researchers, (PhD) students and postdocs in the field to learn more about the various disciplines in the neural networks field outside their own research program. That is why we encourage researchers from related fields to register and participate. Although the final workshop program has not been fixed yet, the following topics have been proposed for presentation and discussion. - Optimal complexity in reduced connectivity neural network paradigms - Speech recognition by neural networks - Neural networks and statistical inference - The emergence of consciousness in neural networks - Applications of differential geometric system theory in dynamic neural networks - Circuit and VLSI complexity issues - VLSI friendly learning - Neural network applications in control - Visualization by neural networks - Markov modelling of sensory neural spike trains - An application of neural networks in cellular wireless networks Please find more detailed information on the HELNET 1997 Workshop on Neural Networks and the registration form below.
=================================================================== GENERAL INFORMATION =================================================================== Important Dates and Deadlines Deadline paper submission July 15, 1997 Notification of acceptance August 1, 1997 Deadline registration August 15, 1997 Deadline revised papers September 15, 1997 Workshop start October 3, 1997 Workshop end October 5, 1997 Travel Directions Venue site: Hotel des Alpes Vaudoises Rue de Bugnon 1823 Glion (Montreux) Switzerland Tel: + 41 21 963 20 76 Fax: + 41 21 963 56 94 The workshop site is located at the foot of the Rochers-de-Naye at an altitude of approximately 670 meters above sea level in the village of Glion. It overlooks Montreux and Lac Leman and offers a fine view of the Alps. The hotel has private parking, a large garden and an outdoor swimming pool. There is a direct connection by local train from the venue site to the Montreux/Territet train station and vice versa. Participants will be provided with train tickets when needed. Getting there: The easiest connection by plane is via Geneva Airport. There are trains running directly from Geneva Airport to Montreux regularly in less than an hour. At Montreux you switch to a small local train which will take you from Montreux/Territet up the mountain toward the Hotel des Alpes Vaudoises. This train stops right in front of the hotel at the "Hotel des Alpes Vaudoises" train stop. To ICANN...: There is a direct connection from Montreux to Lausanne. More information can be found at the ICANN www-site. Paper Format The workshop participants are encouraged to submit their papers in LaTeX format. However, we can also process Word and WordPerfect documents. If you submit non-LaTeX formatted papers, please include plain text versions of your paper on PC disk as well as postscript versions of your figures. - Papers should be submitted camera-ready.
- The printable area should be 12 by 20 centimeters (including page numbers), with a font size of 10 points using a Times or similar font. - Titles and subtitles should be typeset using 12-point fonts. Footnotes and super/subscripts should be 9-point characters. - When you use standard LaTeX styles (e.g. article) your paper will have a default printable area of approximately 15 by 23 centimeters and will be reduced to 80% of its size for publication. Please take this into account when preparing your figures. The length of submitted papers should not exceed 8 pages including figures, tables and references. Centered at the top of the first page should be the title of the paper, author name(s), affiliation(s) and mailing address(es). Please submit 4 (four) copies of your paper as well as a version on disk (PC/DOS format only!) with the graphic files included on this disk. Paper versions of submitted manuscripts should not be stapled or folded. As on previous occasions, the proceedings of this year's workshop will be published by the VUU Publishers, Amsterdam. Workshop Fee -Approximate- exchange rates: 100 DFL = 30 BPS = 50 USD = 85 DM Check your local exchange office for the actual rates! The workshop fee is DFL. 600,- (Dutch guilders) and includes 4 nights in a shared double room (DFL 725,- if you prefer a single room), half-board, refreshments during the sessions, and welcome drinks on the night of arrival. Accompanying persons are welcome and charged DFL. 500 (DFL. 625 for single room). An optional outing will be organized and is included in the fee. The workshop proceedings will be handed out upon arrival. You can register by printing and filling out the registration form and sending it by fax or regular mail. If you do not intend to pay by credit card you can also email the filled-out registration form. You will receive confirmation of your registration and payment upon receipt. If you have any questions, please direct your queries to: HELNET 1997 P.O.
Box 2318 1000 CH Amsterdam Netherlands Fax: + 31 20 471 49 11 Email: helnet97 at dds.nl The workshop fee is payable in the following ways: - Bank transfer: D.S.R. ABN-AMRO Bank, Ceintuurbaan 89, Amsterdam, the Netherlands Account No. 43.55.28.521 Stating: HELNET97 - Credit card: We accept VISA and American Express credit cards. Please print and fill out the registration form and send it to us by fax or regular mail using the details stated above. =================================================================== REGISTRATION FORM =================================================================== In order to register as a participant in the HELNET Workshop on Neural Networks please print and fill out the form below completely and mail or fax it to the address indicated below: If you do NOT intend to pay using a credit card you can also email us the filled-out registration form. HELNET 1997 P.O. Box 2318 1000 CH Amsterdam Netherlands Fax: + 31 20 471 49 11 helnet97 at dds.nl Accommodation: [ ] Double room (DFL 600,-) Preference for sharing your room with: [ ] Single room (DFL 725,-) [ ] Accompanying person (DFL. 500,- when sharing a double room, DFL. 625,- otherwise) Your personal details: Name ----------------------------------------------------------- Position ------------------------------------------------------ Institution --------------------------------------------------- Address ------------------------------------------------------- P.O. Box ------------------------------------------------------ Zip Code ------------------------------------------------------ City ---------------------------------------------------------- Country ------------------------------------------------------ Tel ----------------------------------------------------------- Fax ----------------------------------------------------------- Email --------------------------------------------------------- Your registration: [ ] I cannot make a final registration yet.
Please keep me informed [ ] I register for HELNET97 but I will not submit a paper and would just like to attend and participate in the discussions. [ ] I would like to present the following paper: ----------------------------------------------------------------- ----------------------------------------------------------------- ----------------------------------------------------------------- ----------------------------------------------------------------- I would like to propose the following topics for discussion: 1. -------------------------------------------------------------- 2. -------------------------------------------------------------- 3. -------------------------------------------------------------- 4. -------------------------------------------------------------- I will make the workshop fee payable in the following way: [ ] Bank transfer: D.S.R. Account No. 43.55.28.521 Stating: HELNET97 ABN-AMRO Bank Ceintuurbaan 89, Amsterdam, the Netherlands [ ] Credit cards: We can accept the following cards: [ ] VISA [ ] American Express Card No. ------------------------------------------------------- Expiry Date: --------------------------------------------------- Signature: ----------------------------------------------------- Date: ---------------------------------------------------------- From lemmon at endeavor.ee.nd.edu Fri May 23 09:16:06 1997 From: lemmon at endeavor.ee.nd.edu (Michael Lemmon) Date: Fri, 23 May 1997 08:16:06 -0500 (EST) Subject: Final CFP - IEEE-TAC special issue Message-ID: <199705231316.IAA01592@endeavor.ee.nd.edu> Contributed by Michael D. Lemmon (lemmon at maddog.ee.nd.edu) FINAL CALL FOR PAPERS IEEE Transactions on Automatic Control announces a Special Issue on ARTIFICIAL NEURAL NETWORKS IN CONTROL, IDENTIFICATION, and DECISION MAKING Edited by Anthony N. Michel Michael Lemmon Dept of Electrical Engineering Dept. 
of Electrical Engineering, University of Notre Dame, Notre Dame, IN 46556, USA. Michel: (219)-631-5534 (voice), (219)-631-4393 (fax), Anthony.N.Michel.1 at nd.edu. Lemmon: (219)-631-8309 (voice), (219)-631-4393 (fax), lemmon at maddog.ee.nd.edu. Deadlines: Paper Submission: July 1, 1997 Acceptance Decisions: December 31, 1997 There is a growing body of experimental work suggesting that artificial neural networks can be very adept at solving pattern classification problems where there is significant real-world uncertainty. Neural networks also provide an analog method for quickly determining approximate solutions to complex optimization problems. Both of these capabilities can be of great use in solving various control problems, and in recent years there has been increased interest in the use of artificial neural networks in the control and supervision of complex dynamical systems. This announcement is a call for papers addressing the topic of neural networks in control, identification, and decision making. Accepted papers will be published in a special issue of the IEEE Transactions on Automatic Control. The special issue is seeking papers which use formal analysis to establish the role of neural networks in control, identification, and decision making. For this reason, papers consisting primarily of empirical simulation results will not be considered for publication. Before submitting, prospective authors should consult past issues of the IEEE Transactions on Automatic Control to identify the type of results and the level of mathematical rigor that are the norm in this journal. Submitted papers are due by July 1, 1997, and should be sent to Michael D. Lemmon or Anthony N. Michel. Notification of acceptance decisions will be sent by December 31, 1997. The special issue is targeted for publication in 1998 or early 1999. All papers will be refereed in accordance with IEEE guidelines.
Please consult the inside back cover of any recent issue of the Transactions on Automatic Control for style and length of the manuscript and the number of required copies (seven copies with cover letter) to be sent to one of the editors of this special issue. From bd1q at eagle.cnbc.cmu.edu Fri May 23 11:27:11 1997 From: bd1q at eagle.cnbc.cmu.edu (Barbara Dorney) Date: Fri, 23 May 1997 11:27:11 -0400 (EDT) Subject: Opening for a Scientist/Educator Message-ID: <199705231527.LAA27284@eagle.cnbc.cmu.edu> Opening for a Scientist/Educator at The Center for the Neural Basis of Cognition (CNBC) a Joint Center of Carnegie Mellon and the University of Pittsburgh A Ph. D. Scientist/Educator with a background in COGNITIVE NEUROSCIENCE is sought to take a central role in formulation and development of an interactive planetarium show, TRACKING THE HUMAN BRAIN, funded by the National Science Foundation. The show will describe how cognition and perception arise from neural activity. The show will use the dome of the planetarium as a metaphor for the cerebral cortex of the brain, and will also use a novel interactive technology to allow members of the audience to participate as neurons in the simulation of human cognitive and perceptual processes. The individual hired will be expected to work with a team comprising other CNBC scientists and artists at the Studio for Creative Inquiry at Carnegie Mellon University, and science museum professionals at the Carnegie Science Center in Pittsburgh, to ensure that the production, which is to be disseminated to science centers world-wide, communicates essential scientific content in a manner accessible to the general public. This is a two-year position, and employment would ideally begin as early as July 1, 1997, but excellent candidates who may not be available until 1998 are also encouraged to apply. Salary commensurate with experience. 
Send a resume' detailing experience in research and science education relevant to the Neural Basis of Cognition to J. L. McClelland, Co-Director Center for the Neural Basis of Cognition 115 Mellon Institute 4400 Fifth Avenue Pittsburgh, PA 15213-2683. Carnegie Mellon University is an Equal Opportunity /Affirmative Action Employer. From minton at ISI.EDU Fri May 23 13:09:06 1997 From: minton at ISI.EDU (Steve Minton) Date: Fri, 23 May 97 10:09:06 PDT Subject: JAIR article: Connectionist Theory Refinement: Genetically..." Message-ID: <9705231709.AA13087@sungod.isi.edu> Readers of this mailing list might be interested in the following article, which was just published by JAIR: Opitz, D.W. and Shavlik, J.W. (1997) "Connectionist Theory Refinement: Genetically Searching the Space of Network Topologies", Volume 6, pages 177-209. Available in HTML, Postscript (578K) and compressed Postscript (267K). For quick access via your WWW browser, use this URL: http://www.jair.org/abstracts/opitz97a.html More detailed instructions are below. Abstract: An algorithm that learns from a set of examples should ideally be able to exploit the available resources of (a) abundant computing power and (b) domain-specific knowledge to improve its ability to generalize. Connectionist theory-refinement systems, which use background knowledge to select a neural network's topology and initial weights, have proven to be effective at exploiting domain-specific knowledge; however, most do not exploit available computing power. This weakness occurs because they lack the ability to refine the topology of the neural networks they produce, thereby limiting generalization, especially when given impoverished domain theories. 
We present the REGENT algorithm which uses (a) domain-specific knowledge to help create an initial population of knowledge-based neural networks and (b) genetic operators of crossover and mutation (specifically designed for knowledge-based networks) to continually search for better network topologies. Experiments on three real-world domains indicate that our new algorithm is able to significantly increase generalization compared to a standard connectionist theory-refinement system, as well as our previous algorithm for growing knowledge-based networks. The article is available via: -- comp.ai.jair.papers (also see comp.ai.jair.announce) -- World Wide Web: The URL for our World Wide Web server is http://www.jair.org/ For direct access to this article and related files try: http://www.jair.org/abstracts/opitz97a.html -- Anonymous FTP from either of the two sites below. Carnegie-Mellon University (USA): ftp://ftp.cs.cmu.edu/project/jair/volume6/opitz97a.ps The University of Genoa (Italy): ftp://ftp.mrg.dist.unige.it/pub/jair/pub/volume6/opitz97a.ps The compressed PostScript file is named opitz97a.ps.Z (267K) -- automated email. Send mail to jair at cs.cmu.edu or jair at ftp.mrg.dist.unige.it with the subject AUTORESPOND and our automailer will respond. To get the Postscript file, use the message body GET volume6/opitz97a.ps (Note: Your mailer might find this file too large to handle.) Only one file can be requested in each message. For more information about JAIR, visit our WWW or FTP sites, or send electronic mail to jair at cs.cmu.edu with the subject AUTORESPOND and the message body HELP, or contact jair-ed at ptolemy.arc.nasa.gov. 
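[Editor's note: a minimal sketch of the genetic topology search described in the abstract above. This is NOT the authors' REGENT implementation: the genome encoding (a list of hidden-layer widths), the perturbation-based seeding, and the toy fitness function standing in for "train the network and measure generalization" are all assumptions made purely for illustration.]

```python
import random

random.seed(0)

# A topology "genome" is a list of hidden-layer widths, e.g. [8, 4].
# REGENT seeds its population from networks derived from a domain theory;
# here that is faked by perturbing one hand-chosen seed topology, and the
# train-and-validate step is replaced by a toy fitness function.

def seed_population(seed, n):
    """Initial population: the seed topology plus random perturbations."""
    pop = [list(seed)]
    while len(pop) < n:
        pop.append([max(1, w + random.randint(-2, 2)) for w in seed])
    return pop

def crossover(a, b):
    """Single-point crossover on the layer lists."""
    child = a[:random.randint(0, len(a))] + b[random.randint(0, len(b)):]
    return child or [1]                      # never return an empty genome

def mutate(g, rate=0.3):
    """Jitter layer widths; occasionally insert a whole new layer."""
    g = [max(1, w + random.randint(-1, 1)) if random.random() < rate else w
         for w in g]
    if random.random() < rate:
        g.insert(random.randrange(len(g) + 1), random.randint(1, 8))
    return g

def fitness(g):
    # Stand-in for training the network and measuring validation accuracy:
    # pretend the ideal network has ~12 units spread over ~2 hidden layers.
    return -abs(sum(g) - 12) - abs(len(g) - 2)

def evolve(seed, pop_size=20, generations=30):
    pop = seed_population(seed, pop_size)
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]      # keep the fitter half (elitism)
        pop = survivors + [
            mutate(crossover(random.choice(survivors),
                             random.choice(survivors)))
            for _ in range(pop_size - len(survivors))]
    return max(pop, key=fitness)

print(evolve([3]))   # starts from a deliberately poor one-layer seed
```

In REGENT proper, the population members are full knowledge-based networks and fitness comes from actual training runs; only the evolutionary skeleton is shown here.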
From edelman at ai.mit.edu Fri May 23 14:37:07 1997 From: edelman at ai.mit.edu (Shimon Edelman) Date: Fri, 23 May 1997 14:37:07 -0400 Subject: preprint - Complex Cells and Object Recognition Message-ID: <199705231837.OAA04793@it-cortex> Title: Complex Cells and Object Recognition Authors: Shimon Edelman, Nathan Intrator, Tomaso Poggio ftp URL: ftp://eris.wisdom.weizmann.ac.il/pub/edelman/nips97.ps.Z http URL: http://www.ai.mit.edu/~edelman/mirror/nips97.ps.Z Abstract: Nearest-neighbor correlation-based similarity computation in the space of outputs of complex-type receptive fields can support robust recognition of 3D objects. Our experiments with four collections of objects resulted in mean recognition rates between 84% (for subordinate-level discrimination among 15 quadruped animal shapes) and 94% (for basic-level recognition of 20 everyday objects), over a 40deg X 40deg range of viewpoints, centered on a stored canonical view and related to it by rotations in depth. This result has interesting implications for the design of a front end to an artificial object recognition system, and for the understanding of the faculty of object recognition in primate vision. ------------------------------------------------------------------------ Comments welcome. -Shimon Dr. Shimon Edelman, Center for Biol & Comp Learning, MIT DELENDA BIBI http://www.ai.mit.edu/~edelman fax: (+1) 617 253-2964 tel: 253-6357 edelman at ai.mit.edu From edelman at ai.mit.edu Sat May 24 09:06:16 1997 From: edelman at ai.mit.edu (Shimon Edelman) Date: Sat, 24 May 97 09:06:16 EDT Subject: preprint - Complex Cells and Object Recognition Message-ID: <9705241306.AA00563@peduncle.ai.mit.edu> A correction to the ftp URL: it should be ftp://eris.wisdom.weizmann.ac.il/pub/nips97.ps.Z Also, the server at www.ai.mit.edu seems to have slipped a disk, as a result of which the other URL I listed in the original posting will be unavailable until Tuesday or so. I apologize for these problems. 
-Shimon From rao at cs.rochester.edu Mon May 26 17:45:19 1997 From: rao at cs.rochester.edu (Rajesh Rao) Date: Mon, 26 May 1997 17:45:19 -0400 Subject: Tech Report: Shift Invariance and Local Receptive Fields Message-ID: <199705262145.RAA29053@skunk.cs.rochester.edu> The following technical report on learning localized receptive fields for transformation estimation is available on the WWW page: http://www.cs.rochester.edu/u/rao/ or via anonymous ftp (see instructions below). Comments and suggestions welcome (This message has been cross-posted - my apologies to those who receive it more than once). -- Rajesh Rao Internet: rao at cs.rochester.edu Dept. of Computer Science VOX: (716) 275-2527 University of Rochester FAX: (716) 461-2018 Rochester NY 14627-0226 WWW: http://www.cs.rochester.edu/u/rao/ =========================================================================== Localized Receptive Fields May Mediate Transformation-Invariant Recognition in the Visual Cortex Rajesh P.N. Rao and Dana H. Ballard Technical Report 97.2 National Resource Laboratory for the Study of Brain and Behavior Department of Computer Science, University of Rochester May 1997 Neurons in the visual cortex are known to possess localized, oriented receptive fields. It has previously been suggested that these distinctive properties may reflect an efficient image encoding strategy based on maximizing the sparseness of the distribution of output neuronal activities or alternately, extracting the independent components of natural image ensembles. Here, we show that a relatively simple neural solution to the problem of transformation-invariant visual recognition also causes localized, oriented receptive fields to be learned from natural images. 
These receptive fields, which code for various transformations in the image plane, allow a pair of cooperating neural networks, one estimating object identity (``what'') and the other estimating object transformations (``where''), to simultaneously recognize an object and estimate its pose by jointly maximizing the a posteriori probability of generating the observed visual data. We provide experimental results demonstrating the ability of these networks to factor retinal stimuli into object-centered features and object-invariant transformations. The resulting neuronal architecture suggests concrete computational roles for the neuroanatomical connections known to exist between the dorsal and ventral visual pathways. Retrieval information: FTP-host: ftp.cs.rochester.edu FTP-pathname: /pub/u/rao/papers/local.ps.Z WWW URL: http://www.cs.rochester.edu/u/rao/ 9 pages; 229K compressed. ========================================================================== Anonymous ftp instructions: >ftp ftp.cs.rochester.edu Connected to anon.cs.rochester.edu. 220 anon.cs.rochester.edu FTP server (Version wu-2.4(3)) ready. Name: [type 'anonymous' here] 331 Guest login ok, send your complete e-mail address as password. Password: [type your e-mail address here] ftp> cd /pub/u/rao/papers/ ftp> get local.ps ftp> bye From krose at wins.uva.nl Tue May 27 04:38:25 1997 From: krose at wins.uva.nl (Ben Krose) Date: Tue, 27 May 1997 10:38:25 +0200 (MET DST) Subject: Postdoc position available Message-ID: <199705270838.KAA19603@domedb.wins.uva.nl> I apologize if you receive this announcement multiple times. Ben Kr\"ose **************************************************************************** POSTDOCTORAL RESEARCH FELLOWSHIP available at the University of Amsterdam, Dept. 
of Computer Science Amsterdam, the Netherlands The Intelligent Autonomous Systems (IAS) Group at the University of Amsterdam is looking for a highly motivated research fellow for a 2 year postdoctoral position in the area of `Active Map Building and Sensoric Representations for Autonomous Learning Robot Systems'. With this project, the IAS group participates (with the Foundation for Neural Networks, SNN) in the Japanese `Real World Computing Partnership' (RWCP). We will develop methods for an intelligent, `hearing' and `seeing' robot, which has to act in an environment inhabited by humans. Perception and map building are essential tasks for the system. The emphasis of the research will be on the application of neural (or other statistical) methodologies for the projection of high-dimensional sensor data to low-dimensional `environment representations'. Applicants should have a PhD in computer science, physics or electronic engineering, must have a strong background in learning, neurocomputing or statistics and must see the challenge of dealing with real-world sensoric data. Job specification: The post-doc salary will be maximally Dfl. 6140 per month, depending on experience. The position is available for 2 years with a possible extension of one year. Applications: Interested candidates should send a letter with a CV and list of publications before June 15 1997 to dr. Ben J.A. Krose, Department of Computer Science University of Amsterdam Kruislaan 403 1098 SJ Amsterdam The Netherlands For information you can contact me: krose at wins.uva.nl Phone +31 20 525 7463 Fax +31 20 525 7490 A short description of the project can be found at http://www.wins.uva.nl/research/neuro/rwc.html From ataxr at IMAP1.ASU.EDU Mon May 26 13:21:26 1997 From: ataxr at IMAP1.ASU.EDU (Asim Roy) Date: Mon, 26 May 1997 13:21:26 -0400 (EDT) Subject: CONNECTIONIST LEARNING: IS IT TIME TO RECONSIDER THE FOUNDATIONS? Message-ID: Dave, I am posting the responses I have received so far. 
Some of the responses provide a great deal of insight on connectionist learning and neuroscience and their interactions (see, in particular, the second note by Dr. Peter Cariani). I have also tried to provide answers to two frequently asked questions. I hope all of this will generate more interest in the questions being raised about connectionist learning. As you can see below, perhaps other questions need to be raised. The original posting is attached below for reference. Asim Roy Arizona State University ============================ ANSWERS TO SOME FREQUENTLY ASKED QUESTIONS: a) Humans get stuck in local minima all the time. So what is wrong with algorithms getting stuck in local minima? RESPONSE: We can only claim that humans are sometimes "unable to learn." We cannot make any claim beyond that. And so this phenomenon of "unable to learn" does not necessarily imply "getting stuck in a local minimum." Inability to learn may be due to a number of reasons, including insufficient information, inability to extract the relevant features of a problem, insufficient reward or punishment, and so on. Again, to reiterate, "inability to learn" does not imply "getting stuck in a local minimum." Perhaps this misconception has been promoted in order to justify certain algorithms and their weak learning characteristics. b) Why do you say classical connectionist learning is memoryless? Isn't memory actually in the weights? RESPONSE: Memoryless learning implies there is no EXPLICIT storage of any learning example in the system in order to learn. In classical connectionist learning, the weights of the net are adjusted whenever a learning example is presented, but the example itself is promptly forgotten by the system. There is no EXPLICIT storage of any presented example in the system. That is the generally accepted view of "adaptive" or "on-line learning systems." Imagine such a system "planted" in some human brain. And suppose we want to train it to learn addition. 
So we provide the first example - say, 2 + 2 = 4. The system promptly adjusts the weights of the net and forgets the particular example. It has done what it is supposed to do - adjust the weights, given a learning example. Suppose you then ask this "human", fitted with this learning algorithm: "How much is 2 + 2?" Since it has only seen one example and has not yet fully grasped the rule for adding numbers, it probably would give a wrong answer. So you, as the teacher, perhaps might ask at that point: "I just told you 2 + 2 = 4, remember?" And this "human" might respond: "Very honestly, I don't recall you ever having said that! I am very sorry." And this would continue to happen after every example you present to this "human"!!! So do you think there is memory in those "weights"? Do you think humans are like that? Please send any comments on these issues directly to me (asim.roy at asu.edu). All comments/criticisms/suggestions are welcome. And all good science depends on vigorous debate. Asim Roy Arizona State University ============================ From weaveraj at helios.aston.ac.uk Wed May 28 08:49:53 1997 From: weaveraj at helios.aston.ac.uk (Andrew Weaver) Date: Wed, 28 May 1997 13:49:53 +0100 Subject: Postgraduate opportunities at Aston University Message-ID: <18503.199705281449@sun.aston.ac.uk> Research in Neural Computing PhD in Neural Computing at Aston University, Birmingham, UK PhD programmes are available on either a full-time or part-time basis. Most full-time students start in October each year and take the taught modules from the MSc in Pattern Analysis and Neural Networks during the first term. MSc by Research in Pattern Analysis and Neural Networks The MSc comprises a substantial 9 month research project, taught modules in Artificial Neural Networks, Statistical Pattern Analysis, Algorithms & Computational Mathematics, Time Series & Signal Processing, Data Visualisation & Density Modelling, & Research Methodology, computer lab. sessions & tutorials. 
The course emphasises the advantages of a principled approach to data analysis. There are strong commercial & industrial links through research projects & bursaries, & employment prospects for graduates are good. Funding Full studentships (for eligible students, please check EPS), paying tuition fees and living expenses, are available for both research programmes. Funding is provided by the EPSRC (http://www.epsrc.ac.uk/) and industrial sponsors and is awarded by the Research Group on a competitive basis. Applications from self-funded students are also very welcome. For further information, please contact: Hanni Sondermann, tel:0121 333 4631; email: ncrg at aston.ac.uk; www: http://www.ncrg.aston.ac.uk/ From fayyad at MICROSOFT.com Wed May 28 15:18:11 1997 From: fayyad at MICROSOFT.com (Usama Fayyad) Date: Wed, 28 May 1997 12:18:11 -0700 Subject: Data Mining and Knowledge Discovery: vol.1:2 & full contents 1:1 on web Message-ID: <28347281A2B5CF119AB000805FD4186602F15CF9@RED-77-MSG.dns.microsoft.com> Please post the following announcement: Issue 2 of the new journal: Data Mining and Knowledge Discovery has been finalized. You can access the abstracts and full text of the editorial at the journal's home page: http://www.research.microsoft.com/datamine Also, issue 1 is now available free on line from Kluwer's web server. Links to Kluwer's server are accessible via the above homepage or directly at: http://www.wkap.nl/kapis/CGI-BIN/WORLD/kaphtml.htm?DAMISAMPLE =================================== DATA MINING AND KNOWLEDGE DISCOVERY Volume 1, issue 2 =================================== CONTENTS: -------- Editorial Usama Fayyad, editor-in-chief ---------------------------------------------- PAPERS ------ BIRCH: A New Data Clustering Algorithm and Its Applications Tian Zhang, Raghu Ramakrishnan, Miron Livny Mathematical Programming in Data Mining O. L. 
Mangasarian A Simple Constraint-Based Algorithm for Efficiently Mining Observational Databases for Causal Relationships Gregory F. Cooper ---------------------------------------------- BRIEF APPLICATION SUMMARY ------------------------- Visual Data Mining: Recognizing Telephone Calling Fraud Kenneth C. Cox, Stephen G. Eick, Graham J. Wills, and Ronald J. Brachman ================================================ Usama Fayyad datamine at microsoft.com for more information on the journal, CFP, and to submit a paper, please see: http://www.research.microsoft.com/datamine From tibs at utstat.toronto.edu Thu May 29 16:16:00 1997 From: tibs at utstat.toronto.edu (tibs@utstat.toronto.edu) Date: Thu, 29 May 97 16:16 EDT Subject: new paper available Message-ID: Tech report available: The out-of-bootstrap method for model averaging and selection ...enjoying the Bayesian omelette without making a mess in the kitchen J. Sunil Rao and Robert Tibshirani We propose a bootstrap-based method for model averaging and selection that focuses on training points that are left out of individual bootstrap samples. This information can be used to estimate optimal weighting factors for combining estimates from different bootstrap samples, and also for finding the best subsets in the linear model setting. These proposals provide alternatives to Bayesian approaches to model averaging and selection, requiring less computation and fewer subjective choices. Comments welcome ftp://utstat.toronto.edu/pub/tibs/outofbootstrap.ps http://utstat.toronto.edu/tibs/research.html ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ Rob Tibshirani, Dept of Preventive Med & Biostats, and Dept of Statistics Univ of Toronto, Toronto, Canada M5S 1A8. Phone: 416-978-4642 (PMB), 416-978-0673 (stats). FAX: 416 978-8299 computer fax 416-978-1525 (please call or email me to inform) tibs at utstat.toronto.edu. 
ftp://utstat.toronto.edu/pub/tibs http://www.utstat.toronto.edu/~tibs +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ From gluck at pavlov.rutgers.edu Thu May 29 18:02:57 1997 From: gluck at pavlov.rutgers.edu (Mark A. Gluck) Date: Thu, 29 May 1997 18:02:57 -0400 Subject: Hippocampus Special Issue on Computational Models of Memory Message-ID: NEW SPECIAL ISSUE OF HIPPOCAMPUS ON COMPUTATIONAL MODELS: Now available for purchase as a single issue from Wiley-Liss publishers (see below for ordering information): Computational Models of Hippocampal Function in Memory A Special Issue of Hippocampus (V.6, No.6, 1996) Guest Edited by: Mark A. Gluck (Rutgers-Newark Neuroscience) PRECIS: This special issue of Hippocampus focuses on computational network models of hippocampal function, especially those that make substantive contact with data from behavioral studies of learning and memory. It provides the non-specialist reader with a general understanding of the aims, accomplishments and limitations of computational approaches to understanding hippocampal function. The articles in this issue are written so as to facilitate the comparison between different computational models, and to assist the non-mathematically inclined reader in understanding how and where these models can be used as tools for understanding and motivating empirical research, including physiological, anatomical, and behavioral studies. The articles included focus on describing the spirit and behavior of computational models, omitting most details on their exact mathematical underpinnings. CONTENTS: INTRODUCTION page 565 Mark A. Gluck Computational Models of Hippocampal Function in Memory CIRCUIT-LEVEL MODELS page 567 Richard Granger, Sherman P. Wiebe, Makoto Taketani, and Gary Lynch Distinct Memory Circuits Composing the Hippocampal Region page 579 William B. 
Levy A Sequence Predicting CA3 Is a Flexible Associator That Learns and Uses Context to Solve Hippocampal-Like Tasks page 591 Jim-Shih Liaw and Theodore W. Berger Dynamic Synapse: A New Concept of Neural Representation and Computation page 601 Edmund T. Rolls A Theory of Hippocampal Function in Memory CONDITIONING AND ANIMAL LEARNING page 621 Catalin V. Buhusi and Nestor A. Schmajuk Attention, Configuration, and Hippocampal Function page 643 Mark A. Gluck and Catherine E. Myers Integrating Behavioral and Physiological Models of Hippocampal Function EPISODIC MEMORY AND CONSOLIDATION page 654 James L. McClelland and Nigel H. Goddard Considerations Arising From a Complementary Learning Systems Perspective on Hippocampus and Neocortex page 666 Alessandro Treves, William E. Skaggs, and Carol A. Barnes How Much of the Hippocampus Can Be Explained by Functional Constraints? page 675 Jaap M.J. Murre TraceLink: A Model of Amnesia and Consolidation of Memory page 685 Bin Shen and Bruce L. McNaughton Modeling the Spontaneous Reactivation of Experience-Specific Hippocampal Cell Assemblies During Sleep page 693 Michael E. Hasselmo, Bradley P. Wyble, and Gene V. Wallenstein Encoding and Retrieval of Episodic Memories: Role of Cholinergic and GABAergic Modulation in the Hippocampus SPATIAL NAVIGATION page 709 Robert U. Muller and Matt Stead Hippocampal Place Cells Connected by Hebbian Synapses Can Solve Spatial Problems page 720 Patricia E. Sharp, Hugh T. Blair, and Michael Brown Neural Network Modeling of the Hippocampal Formation Spatial Signals and Their Possible Role in Navigation: A Modular Approach page 735 Michael Recce and Kenneth D. Harris Memory for Places: A Navigational Model in Support of Marr's Theory of Hippocampal Function page 749 Neil Burgess and John O'Keefe Neuronal Computations Underlying the Firing of Place Cells and Their Role in Navigation page 763 Index for Volume 6 ORDERING INFORMATION This special issue is available for $45.00 from the publisher. 
To order, contact: Stacey Lee, John Wiley & Sons, Inc. 605 Third Avenue, New York, NY 10158 (212) 850-8840 or slee at wiley.com. ____________________________________________________________________________ Dr. Mark A. Gluck, Associate Professor Center for Molecular & Behavioral Neuroscience Rutgers University 197 University Ave. Newark, New Jersey 07102 Phone: (201) 648-1080 (Ext. 3221) Fax: (201) 648-1272 Cellular: (917) 855-8906 Email: gluck at pavlov.rutgers.edu WWW Homepage: www.gluck.edu _____________________________________________________________________________ From schwenk at IRO.UMontreal.CA Thu May 29 17:52:48 1997 From: schwenk at IRO.UMontreal.CA (Holger Schwenk) Date: Thu, 29 May 1997 17:52:48 -0400 (EDT) Subject: techreport on application of AdaBoost to neural networks Message-ID: <199705292152.RAA29852@grosse.iro.umontreal.ca> Hello, The following technical report on the application of AdaBoost to neural networks is available on the WWW page: http://www.iro.umontreal.ca/~lisa/pointeurs/AdaBoostTR.ps or http://www.iro.umontreal.ca/~schwenk/papers/AdaBoostTR.ps.gz Comments and suggestions are welcome. Holger Schwenk ------------------------------------------------------------------------------- Holger Schwenk phone: (514) 343-6111 ext 1655 fax: (514) 343-5834 LISA, Dept. IRO University of Montreal email: schwenk at iro.umontreal.ca 2920 Chemin de la tour, CP 6128 http://www.iro.umontreal.ca/~schwenk Montreal, Quebec, H3C 3J7 CANADA ------------------------------------------------------------------------------- Adaptive Boosting of Neural Networks for Character Recognition Holger Schwenk and Yoshua Bengio Dept. Informatique et Recherche Operationnelle Universite de Montreal, Montreal, Qc H3C-3J7, Canada {schwenk,bengioy}@iro.umontreal.ca May 29, 1997 "Boosting" is a general method for improving the performance of any learning algorithm that consistently generates classifiers which need to perform only slightly better than random guessing. 
A recently proposed and very promising boosting algorithm is AdaBoost [5]. It has been applied with great success to several benchmark machine learning problems using rather simple learning algorithms [4], in particular decision trees [1,2,6]. In this paper we use AdaBoost to improve the performance of neural networks applied to character recognition tasks. We compare training methods based on sampling the training set and weighting the cost function. Our system achieves about 1.4% error on a database of online handwritten digits from more than 200 writers. Adaptive boosting of a multi-layer network achieved 2% error on the UCI Letters offline characters data set. From terry at salk.edu Sat May 31 01:00:23 1997 From: terry at salk.edu (Terry Sejnowski) Date: Fri, 30 May 1997 22:00:23 -0700 (PDT) Subject: 4th Annual Joint Symposium on Neural Computation Message-ID: <199705310500.WAA13660@helmholtz.salk.edu> Abstracts for the papers at this meeting can be found at: http://www.cnl.salk.edu/inc/JSNC97abstracts.html Proceedings can be obtained from the Institute for Neural Computation, UCSD 0523, La Jolla, CA 92093. --- 4th Annual Joint Symposium on Neural Computation --- Co-sponsored by Institute for Neural Computation University of California, San Diego and Biomedical Engineering Department and Neuroscience Program University of Southern California The University of Southern California University Park Campus Rm. 124, Seeley G. Mudd Building Saturday, May 17, 1997 8:00 a.m. to 5:30 p.m. 
Session 1: "VISION" - Bartlett Mel, Chair 9:00 am Peter Kalocsai, USC "Using Extension Fields to Improve Performance of a Biologically Inspired Recognition Model" 9:15 am Kechen Zhang, The Salk Institute "A Conjugate Neural Representation of Visual Objects in Three Dimensions" 9:30 am Alexander Grunewald, Caltech "Detection of First and Second Order Motion" 9:45 am Zhong-Lin Lu, USC "Extracting Characteristic Structures from Natural Images Through Statistically Certified Unsupervised Learning" 10:00 am Don MacLeod, UC San Diego "Optimal Nonlinear Codes" 10:15 am Lisa J. Croner, The Salk Institute "Segmentation by Color Influences Response of Motion-Sensitive Neurons in Cortical Area MT" 10:15 am - 10:30 am *** BREAK *** Session 2: "CODING in NEURAL SYSTEMS" - Christof Koch, Chair 10:30 am Dawei Dong, Caltech "How Efficient is Temporal Coding in the Early Visual System?" 10:45 am Martin Stemmler, Caltech "Entropy Maximization in Hodgkin-Huxley Models" 11:00 am Michael Wehr, Caltech "Temporal Coding with Oscillatory Sequences of Firing" 11:15 am Martin J. McKeown, The Salk Institute "Functional Magnetic Resonance Imaging Data Interpreted as Spatially Independent Mixtures" 11:30 am KEYNOTE SPEAKER: Prof. Irving Biederman, William M. Keck Professor of Cognitive Neuroscience Departments of Psychology and Computer Science and the Neuroscience Program, USC "Shape Representation in Mind and Brain" ------------- 12:30 pm - 2:30 pm *** LUNCH/POSTERS *** P1. Konstantinos Alataris, USC "Modeling of Neuronal Ensemble Dynamics" P2. George Barbastathis, Caltech "Awareness-Based Computation" P3. Marian Stewart Bartlett, UC San Diego "What are the Independent Components of Face Images?" P4. Maxim Bazhenov, The Salk Institute "A Computational Model of Intrathalamic Augmenting Responses" P5. Alan Bond, Caltech "A Computational Model for the Primate Brain Based on its Functional Architecture" P6. 
Glen Brown, The Salk Institute "Output Sign Switching by Neurons is Mediated by a Novel Voltage-Dependent Sodium Current" P7. Martin Chian, USC "Characterization of Unobservable Neural Circuitry in the Hippocampus with Nonlinear Systems Analysis" P8. Carl Chiang, The Neuroscience Institute "Visual and Sensorimotor Intra- and Intercolumnar Synchronization in Awake Behaving Cat" P9. Matthew Dailey, UC San Diego "Learning a Specialization for Face Recognition" P10. Emmanuel Gillissen, Caltech "Comparative Studies of Callosal Specification in Mammals" P11. Michael Gray, The Salk Institute "Informative Features for Visual Speechreading" P12. Alex Guazzelli, USC "A Taxon-Affordances Model of Rat Navigation" P13. Marwan Jabri, The Salk Institute/Sydney University "A Neural Network Model for Saccades and Fixation on Superior Colliculus" P14. Mathew Lamb, USC "Depth Based Prey Capture in Frogs and Salamanders" P15. Te-Won Lee, The Salk Institute "Independent Component Analysis for Mixed Sub-Gaussian and Super-Gaussian Sources" P16. George Marnellos, The Salk Institute "A Gene Network of Early Neurogenesis in Drosophila" P17. Steve Potter, Caltech "Animat in a Petri Dish: Cultured Neural Networks for Studying Neural Computation" P18. James Prechtl, UC San Diego "Visual Stimuli Induce Propagating Waves of Electrical Activity in Turtle Cortex" P19. Raphael Ritz, The Salk Institute "Multiple Synfire Chains in Simultaneous Action Lead to Poisson-Like Neuronal Firing" P20. Adrian Robert, UC San Diego "A Model of the Effects of Lamination and Celltype Specialization in the Neocortex" P21. Joseph Sirosh, HNC Software Inc. "Large-Scale Neural Network Simulations Suggest a Single Mechanism for the Self-Organization of Orientation Maps, Lateral Connections and Dynamic Receptive Fields in the Primary Visual Cortex" P22. George Sperling, UC Irvine "A Proposed Architecture for Visual Motion Perception" P23. 
Adam Taylor, UC San Diego "Dynamics of a Recurrent Network of Two Bipolar Units" P24. Laurenz Wiskott, The Salk Institute "Objective Functions for Neural Map Formation" ------------------------------------------- Session 3: "HARDWARE" - Michael Arbib, Chair 2:30 pm Christof Born, Caltech "Real Time Ego-Motion Estimation with Neuromorphic Analog VLSI Sensors" 2:45 pm Anil Thakoor, JPL "High Speed Image Computation with 3D Analog Neural Hardware" Session 4: "VISUOMOTOR COORDINATION" - Michael Arbib, Chair 3:00 pm Marwan Jabri, The Salk Institute/Sydney University "A Computational Model of Auditory Space Neural Coding in the Superior Colliculus" 3:15 pm Amanda Bischoff, USC "Modeling the Basal Ganglia in a Reciprocal Aiming Task" 3:30 pm Jacob Spoelstra, USC "A Computational Model of the Role of the Cerebellum in Adapting to Throwing While Wearing Wedge Prism Glasses" 3:45 pm - 4:00 pm *** BREAK *** Session 5: "CHANNELS, SYNAPSES, and DENDRITES" - Terry Sejnowski, Chair 4:00 pm Akaysha C. Tang, The Salk Institute "Modeling the Effect of Neuromodulation of Spike Timing in Neocortical Neurons" 4:15 pm Michael Eisele, The Salk Institute "Reinforcement Learning by Pyramidal Neurons" 4:30 pm Sunil S. Dalal, USC "A Nonlinear Positive Feedback Model of Glutamatergic Synaptic Transmission in Dentate Gyrus" 4:45 pm Venkatesh Murthy, The Salk Institute "Are Neighboring Synapses Independent?" 5:00 pm Gary Holt, Caltech "Shunting Inhibition Does Not Have a Divisive Effect on Firing Rates" 5:15 pm Kevin Archie, USC "Binocular Disparity Tuning in Cortical 'Complex' Cells: Yet Another Role for Intradendritic Computation?" 
KEYNOTE SPEAKERS (preliminary list, alphabetical order) James Bezdek Donald Fausett Laurene Fausett Benito Fernandez Joydeep Ghosh Robert Hecht-Nielsen John Koza George Lendaris Ken Marko Thomas J. McAvoy Kumpati Narendra Jose Principe Dejan J. Sobajic Eduardo Sontag Jean Jacques E. Slotine Paul Werbos Bernard Widrow Lotfi A. Zadeh SIAN KA'AN 97 SPONSORS National Science Foundation Academia de la Investigacion Cientifica Sociedad Mexicana de Instrumentacion Florida Institute of Technology University of Texas at Austin Consejo Nacional de Ciencia y Tecnologia Universidad Nacional Autonoma de Mexico Universidad de Quintana Roo Gobierno del Estado de Quintana Roo IBM de Mexico, S.A. de C.V. Sun de Mexico Apple Computer de Mexico, S.A. de C.V. HONORARY CONFERENCE CHAIRS Claudio Firmani Clementi Paul Werbos Bernard Widrow Felipe Lara ORGANIZING COMMITTEE CHAIRS Nydia Lara (Mexico) Laurene Fausett (USA) PROGRAM COMMITTEE CHAIR Bernard Widrow Felipe Lara Jose Principe Dejan Sobajic Benito Fernandez WORKSHOP COMMITTEE CHAIR Nydia Lara Laurene Fausett Benito Fernandez George Lendaris Guillermo Morales TOPICS OF INTEREST Architecture - ANN Paradigms - Associative Memories - Hybrid Systems Learning - Gradient-Based Learning - Stochastic Learning - Adaptive Methods - Supervised Learning - Reinforcement Learning Systems Analysis - Time Series - Signal Processing - Systems Modeling - Process Monitoring - Fuzzy Models Control & Design - Optimization - Neurocontrol - Adaptive Control - Learning Control - Fuzzy Control - Intelligent Control From moody at chianti.cse.ogi.edu Thu May 1 19:14:41 1997 From: moody at chianti.cse.ogi.edu (John Moody) Date: Thu, 1 May 97 16:14:41 -0700 Subject: Research Position in Statistical Learning Algorithms at OGI Message-ID: <9705012314.AA20039@chianti.cse.ogi.edu> Research Position in Nonparametric Statistics, Neural Networks and Machine Learning at Department of Computer Science & Engineering Oregon Graduate Institute of Science & Technology I am 
seeking a highly qualified researcher to take a leading role on a project involving the development and testing of new model selection and input variable subset selection algorithms for classification, regression, and time series prediction applications. Candidates should have a PhD in Statistics, EE, CS, or a related field, have experience in neural network modeling, nonparametric statistics or machine learning, have strong C programming skills, and preferably have experience with S-Plus and Matlab. The compensation and level of appointment (Postdoctoral Research Associate or Senior Research Associate) will depend upon experience. The initial appointment will be for one year, but may be extended depending upon the availability of funding. Candidates who can start by July 1, 1997 or before will be given preference, although an extremely qualified candidate who is available by September 1 may also be considered. If you are interested in applying for this position, please mail, fax, or email your CV (ascii text or postscript only), a letter of application, and a list of at least three references (names, addresses, emails, phone numbers) to: Ms. Sheri Dhuyvetter Computer Science & Engineering Oregon Graduate Institute PO Box 91000 Portland, OR 97291-1000 Phone: (503) 690-1476 FAX: (503) 690-1548 Email: sherid at cse.ogi.edu Please do not send applications to me directly. I will consider all applications received by Sheri on or before June 1. OGI (Oregon Graduate Institute of Science and Technology) has over a dozen faculty, senior research staff, and postdocs doing research in Neural Networks, Machine Learning, Signal Processing, Time Series, Control, Speech, Language, Vision, and Computational Finance. Short descriptions of our research interests are appended below. Additional information is available on the Web at http://www.cse.ogi.edu/Neural/ and http://www.cse.ogi.edu/CompFin/ . 
OGI is a young, but rapidly growing, private research institute located in the Silicon Forest area west of downtown Portland, Oregon. OGI offers Masters and PhD programs in Computer Science and Engineering, Electrical Engineering, Applied Physics, Materials Science and Engineering, Environmental Science and Engineering, Biochemistry, Molecular Biology, Management, and Computational Finance. The Portland area has a high concentration of high tech companies that includes major firms like Intel, Hewlett Packard, Tektronix, Sequent Computer, Mentor Graphics, Wacker Siltronics, and numerous smaller companies like Planar Systems, FLIR Systems, Flight Dynamics, and Adaptive Solutions (an OGI spin-off that manufactures high performance parallel computers for neural network and signal processing applications). John Moody Professor, Computer Science and Electrical Engineering Director, Computational Finance Program +++++++++++++++++++++++++++++++++++++++++++++++++++++++ Oregon Graduate Institute of Science & Technology Department of Computer Science & Engineering Department of Electrical Engineering Research Interests of Faculty, Research Staff, and Postdocs in Neural Networks, Machine Learning, Signal Processing, Control, Speech, Language, Vision, Time Series, and Computational Finance Etienne Barnard (Associate Professor, EE): Etienne Barnard is interested in the theory, design and implementation of pattern-recognition systems, classifiers, and neural networks. He is also interested in adaptive control systems -- specifically, the design of near-optimal controllers for real- world problems such as robotics. Ron Cole (Professor, CSE): Ron Cole is director of the Center for Spoken Language Understanding at OGI. Research in the Center currently focuses on speaker- independent recognition of continuous speech over the telephone and automatic language identification for English and ten other languages. 
The approach combines knowledge of hearing, speech perception, acoustic phonetics, prosody and linguistics with neural networks to produce systems that work in the real world. Mark Fanty (Research Associate Professor, CSE): Mark Fanty's research interests include continuous speech recognition for the telephone; natural language and dialog for spoken language systems; neural networks for speech recognition; and voice control of computers. Dan Hammerstrom (Associate Professor, CSE): Based on research performed at the Institute, Dan Hammerstrom and several of his students have spun out a company, Adaptive Solutions Inc., which is creating massively parallel computer hardware for the acceleration of neural network and pattern recognition applications. There are close ties between OGI and Adaptive Solutions. Dan is still on the faculty of the Oregon Graduate Institute and continues to study next-generation VLSI neurocomputer architectures. Hynek Hermansky (Associate Professor, EE): Hynek Hermansky is interested in speech processing by humans and machines with engineering applications in speech and speaker recognition, speech coding, enhancement, and synthesis. His main research interest is in practical engineering models of human information processing. Todd K. Leen (Associate Professor, CSE): Todd Leen's research spans theory of neural network models, architecture and algorithm design and applications to speech recognition. His theoretical work is currently focused on the foundations of stochastic learning, while his work on algorithm design is focused on fast algorithms for non-linear data modeling. John Moody (Professor, CSE and EE): John Moody does research on the design and analysis of learning algorithms, statistical learning theory (including generalization and model selection), optimization methods (both deterministic and stochastic), and applications to signal processing, time series, economics, and computational finance. 
He is the Director of OGI's Computational Finance Program. David Novick (Associate Professor, CSE): David Novick conducts research in interactive systems, including computational models of conversation, technologically mediated communication, and human-computer interaction. A central theme of this research is the role of meta-acts in the control of interaction. Current projects include dialogue models for telephone-based information systems. Misha Pavel (Associate Professor, EE): Misha Pavel does mathematical and neural modeling of adaptive behaviors including visual processing, pattern recognition, visually guided motor control, categorization, and decision making. He is also interested in the application of these models to sensor fusion, visually guided vehicular control, and human-computer interfaces. He is the Director of OGI's Center for Information Technology. Hong Pi (Senior Research Associate, CSE): Hong Pi's research interests include neural network models, time series analysis, and dynamical systems theory. He currently works on the applications of nonlinear modeling and analysis techniques to time series prediction problems and financial market analysis. Pieter Vermeulen (Research Associate Professor, EE): Pieter Vermeulen is interested in the theory, design and implementation of pattern-recognition systems, neural networks and telephone-based speech systems. He currently works on the realization of speaker-independent, small-vocabulary interfaces to the public telephone network. Current projects include voice dialing, a system to collect the year 2000 census information and the rapid prototyping of such systems. He is also a cofounder of Livingston Legend, a company specializing in neural network-based intelligent sensors. Eric A. Wan (Assistant Professor, EE): Eric Wan's research interests include learning algorithms and architectures for neural networks and adaptive signal processing. 
He is particularly interested in neural applications to time series prediction, adaptive control, active noise cancellation, and telecommunications. Lizhong Wu (Senior Research Associate, CSE): Lizhong Wu's research interests include neural network theory and modeling, time series analysis and prediction, pattern classification and recognition, signal processing, vector quantization, source coding and data compression. He is now working on the application of neural networks and nonparametric statistical paradigms to finance. From sml%essex.ac.uk at seralph21.essex.ac.uk Fri May 2 04:53:55 1997 From: sml%essex.ac.uk at seralph21.essex.ac.uk (Simon Lucas) Date: Fri, 02 May 1997 09:53:55 +0100 Subject: Rapid best-first retrieval from massive dictionaries (paper available) Message-ID: <3369ABA3.D8A@essex.ac.uk> The following paper has recently been published in Pattern Recognition Letters (vol. 17, pp. 1507-1512), and may be of interest to people on this list. ------------------------------------------------ Rapid Best-First Retrieval from Massive Dictionaries by Lazy Evaluation of a Syntactic Neural Network S.M. Lucas A new method of searching large dictionaries given uncertain inputs is described, based on the lazy evaluation of a syntactic neural network (SNN). The new method is shown to significantly outperform a conventional trie-based method for large dictionaries (e.g. in excess of 100,000 entries). Results are presented for the problem of recognising UK postcodes using dictionary sizes of up to 1 million entries. Most significantly, it is demonstrated that the SNN actually gets *faster* as more data is loaded into it. ------------------------------------------------ Sorry, but no electronic version available due to copyright. Paper offprints available on request. Regards, Simon Lucas ------------------------------------------------ Dr. 
Simon Lucas Department of Electronic Systems Engineering University of Essex Colchester CO4 3SQ United Kingdom Tel: (+44) 1206 872935 Fax: (+44) 1206 872900 Email: sml at essex.ac.uk http://esewww.essex.ac.uk/~sml secretary: Mrs Janet George (+44) 1206 872438 ------------------------------------------------- From harnad at cogsci.soton.ac.uk Mon May 5 10:51:08 1997 From: harnad at cogsci.soton.ac.uk (Stevan Harnad) Date: Mon, 5 May 97 15:51:08 +0100 Subject: Call for Papers: Psycoloquy Message-ID: <2758.9705051451@cogsci.ecs.soton.ac.uk> PSYCOLOQUY CALL FOR PAPERS PSYCOLOQUY is a refereed electronic journal (ISSN 1055-0143) now in its 8th year of publication. PSYCOLOQUY is sponsored on an experimental basis by the American Psychological Association and is currently estimated to reach a readership of over 50,000. PSYCOLOQUY publishes reports of new ideas and findings on which the author wishes to solicit rapid peer feedback, international and interdisciplinary ("Scholarly Skywriting"), in all areas of psychology and its related fields (biobehavioral science, cognitive science, neuroscience, social science, etc.). All contributions are refereed. All target articles, commentaries and responses must have (1) a short abstract (up to 100 words for target articles, shorter for commentaries and responses), (2) an indexable title, (3) the authors' full name(s), institutional address(es) and URL(s). In addition, for target articles only: (4) 6-8 indexable keywords, (5) a separate statement of the authors' rationale for soliciting commentary (e.g., why would commentary be useful and of interest to the field? what kind of commentary do you expect to elicit?) and (6) a list of potential commentators (with their email addresses). All paragraphs should be numbered in articles, commentaries and responses (see format of already published articles in the PSYCOLOQUY archive; line length should be < 80 characters, no hyphenation). 
Two versions of the figures would be helpful: one as screen-readable ascii, the other as .gif, .jpeg, .tiff, or (least preferred) postscript files (or some other universally available format) to be printed out locally by readers to supplement the screen-readable text of the article. PSYCOLOQUY also publishes multiple reviews of books in any of the above fields; these should normally be the same length as commentaries, but longer reviews will be considered as well. Book authors should submit a 500-line self-contained Precis of their book, in the format of a target article; if accepted, this will be published in PSYCOLOQUY together with a formal Call for Reviews (of the book, not the Precis). The author's publisher must agree in advance to furnish review copies to the reviewers selected. Authors of accepted manuscripts assign to PSYCOLOQUY the right to publish and distribute their text electronically and to archive and make it permanently retrievable electronically, but they retain the copyright, and after it has appeared in PSYCOLOQUY authors may republish their text in any way they wish -- electronic or print -- as long as they clearly acknowledge PSYCOLOQUY as its original locus of publication. 
However, except in very special cases, agreed upon in advance, contributions that have already been published or are being considered for publication elsewhere are not eligible to be considered for publication in PSYCOLOQUY. Please submit all material to psyc at pucc.princeton.edu http://www.princeton.edu/~harnad/psyc.html http://cogsci.soton.ac.uk/psyc ftp://ftp.princeton.edu/pub/harnad/Psycoloquy ftp://cogsci.soton.ac.uk/pub/harnad/Psycoloquy gopher://gopher.princeton.edu/11/.libraries/.pujournals news:sci.psychology.journals.psycoloquy ---------------------------------------------------------------------- CRITERIA FOR ACCEPTANCE: To be eligible for publication, a PSYCOLOQUY target article should not only have sufficient conceptual rigor, empirical grounding, and clarity of style, but should also offer a clear rationale for soliciting Commentary. That rationale should be provided in the author's covering letter, together with a list of suggested commentators. A target article can be (i) the report and discussion of empirical research; (ii) a theoretical article that formally models or systematizes a body of research; or (iii) a novel interpretation, synthesis, or critique of existing experimental or theoretical work. Articles dealing with social or philosophical aspects of the behavioral and brain sciences are also eligible. The service of Open Peer Commentary will be primarily devoted to original unpublished manuscripts. However, a recently published book whose contents meet the standards outlined above may also be eligible for Commentary. In such a Multiple Book Review, a comprehensive, 500-line precis by the author is published in advance of the commentaries and the author's response. In rare special cases, Commentary will also be extended to a position paper or an already published article dealing with particularly influential or controversial research. 
Submission of an article implies that it has not been published or is not being considered for publication elsewhere. Multiple book reviews and previously published articles appear by invitation only. The Associateship and professional readership of PSYCOLOQUY are encouraged to nominate current topics and authors for Commentary. In all the categories described, the decisive consideration for eligibility will be the desirability of Commentary for the submitted material. Controversiality simpliciter is not a sufficient criterion for soliciting Commentary: a paper may be controversial simply because it is wrong or weak. Nor is the mere presence of interdisciplinary aspects sufficient: general cybernetic and "organismic" disquisitions are not appropriate for PSYCOLOQUY. Some appropriate rationales for seeking Open Peer Commentary would be that: (1) the material bears in a significant way on some current controversial issues in behavioral and brain sciences; (2) its findings substantively contradict some well-established aspects of current research and theory; (3) it criticizes the findings, practices, or principles of an accepted or influential line of work; (4) it unifies a substantial amount of disparate research; (5) it has important cross-disciplinary ramifications; (6) it introduces an innovative methodology or formalism for consideration by proponents of the established forms; (7) it meaningfully integrates a body of brain and behavioral data; (8) it places a hitherto dissociated area of research into an evolutionary or ecological perspective; etc. In order to assure communication with potential commentators (and readers) from other PSYCOLOQUY specialty areas, all technical terminology must be clearly defined or simplified, and specialized concepts must be fully described. 
NOTE TO COMMENTATORS: The purpose of the Open Peer Commentary service is to provide a concentrated constructive interaction between author and commentators on a topic judged to be of broad significance to the biobehavioral science community. Commentators should provide substantive criticism, interpretation, and elaboration as well as any pertinent complementary or supplementary material, such as illustrations; all original data will be refereed in order to assure the archival validity of PSYCOLOQUY commentaries. Commentaries and articles should be free of hyperbole and remarks ad hominem. STYLE AND FORMAT FOR ARTICLES AND COMMENTARIES TARGET ARTICLES: should not exceed 500 lines (~4500 words); commentaries should not exceed 200 lines (1800 words), including references. Spelling, capitalization, and punctuation should be consistent within each article and commentary and should follow the style recommended in the latest edition of A Manual of Style, The University of Chicago Press. It may be helpful to examine a recent issue of PSYCOLOQUY. All submissions must include an indexable title, followed by the authors' names in the form preferred for publication, full institutional addresses and electronic mail addresses, a 100-word abstract, and 6-12 keywords. Tables and diagrams should be made screen-readable wherever possible (if unavoidable, printable postscript files may contain the graphics separately). All paragraphs should be numbered consecutively. No line should exceed 72 characters, and a blank line should separate paragraphs. REFERENCES: Bibliographic citations in the text must include the author's last name and the date of publication and may include page references. Complete bibliographic information for each citation should be included in the list of references. Examples of correct style are: Brown (1973); (Brown 1973); (Brown 1973; 1978); (Brown 1973; Jones 1976); (Brown & Jones 1978); (Brown et al. 1978). 
References should be typed on a separate sheet in alphabetical order in the style of the following examples. Do not abbreviate journal titles. Kupfermann, I. & Weiss, K. (1978) The command neuron concept. Behavioral and Brain Sciences 1:3-39. Dunn, J. (1976) How far do early differences in mother-child relations affect later developments? In: Growing points in ethology, ed. P. P. G. Bateson & R. A. Hinde, Cambridge University Press. Bateson, P. P. G. & Hinde, R. A., eds. (1978) Growing points in ethology, Cambridge University Press. EDITING: PSYCOLOQUY reserves the right to edit and proof all articles and commentaries accepted for publication. Authors of articles will be given the opportunity to review the copy-edited draft. Commentators will be asked to review copy-editing only when changes have been substantial. --------------------------------------------------------------------------- Prof. Stevan Harnad psyc at pucc.princeton.edu Editor, Psycoloquy phone: +44 1703 594-583 fax: +44 1703 593-281 Department of Psychology http://cogsci.soton.ac.uk/psyc University of Southampton http://www.princeton.edu/~harnad/psyc.html Highfield, Southampton ftp://ftp.princeton.edu/pub/harnad/Psycoloquy SO17 1BJ UNITED KINGDOM ftp://cogsci.soton.ac.uk/pub/harnad/Psycoloquy news:sci.psychology.journals.psycoloquy gopher://gopher.princeton.edu/11/.libraries/.pujournals Sponsored by the American Psychological Association (APA) From tony at salk.edu Tue May 6 21:40:53 1997 From: tony at salk.edu (Tony Bell) Date: Tue, 6 May 1997 18:40:53 -0700 (PDT) Subject: NIPS 97 deadlines Message-ID: <199705070140.SAA11760@curie.salk.edu> ************** NIPS*97 DEADLINES APPROACHING WARNING ************* Just to remind you all that the deadline for NIPS submissions is MAY 23 and for workshop proposals it is MAY 20, only "a few short weeks away". 
All information regarding the conference and submissions (including NIPS LaTeX style files) is on the NIPS web page: http://www.cs.cmu.edu/Groups/NIPS/ So you still have time to hone your algorithms and write your 7 pages! - Tony Bell (NIPS Publicity) ************** NIPS*97 DEADLINES APPROACHING WARNING ************* From adilson at uenf.br Tue May 6 15:19:44 1997 From: adilson at uenf.br (Adilson GONCALVES) Date: Tue, 6 May 1997 17:19:44 -0200 Subject: No subject Message-ID: <9705061919.AA13655@uenf.br> Please Distribute. Thank you. Cabral LIMA. ANNOUNCING InterSymp' 97 9th INTERNATIONAL CONFERENCE on SYSTEMS RESEARCH INFORMATICS AND CYBERNETICS to be held August 18-23, 1997 at the Markgraf-Ludwig-Gymnasium in Baden-Baden, Germany Sponsored by: The International Institute for Advanced Studies in System Research and Cybernetics and Society for Applied Systems Research The Conference will provide a forum for the presentation and discussion of short reports on current systems research in humanities, sciences and engineering. A number of specialized symposia are being organized to focus on research in computer science, synergetics, cognitive science, psychocybernetics, sociocybernetics, logic, philosophy, management, ecology, health care, education and other related areas. The aim of the Conference is to encourage and facilitate the interdisciplinary and transdisciplinary communication and cooperation amongst scientists, engineers and professionals working in different fields, and to identify and develop those areas of research that will most benefit from such a cooperation. Participants who wish to present a paper are requested to submit two copies of an Abstract (up to 200 words) as soon as possible but not later than May 20, 1997. All submitted papers will be refereed. Those selected will be scheduled for presentation and published in Conference Proceedings. Notification of acceptance will be sent to authors by July 5, 1997. 
The full papers, not exceeding 5 single-spaced typed pages with photo-ready copies of artwork, should be submitted by July 25, 1997. Important Dates Abstracts/Summary by May 20, 1997 Notification of Acceptance by July 5, 1997 InterSymp' 97 9th INTERNATIONAL CONFERENCE on SYSTEMS RESEARCH INFORMATICS AND CYBERNETICS PRELIMINARY PROGRAM Friday - August 15, 1997 10:00-24:00 City Festival (StadtFest) Saturday - August 16, 1997 10:00-24:00 City Festival (StadtFest) Sunday - August 17, 1997 10:00-24:00 City Festival (StadtFest) Monday - August 18, 1997 09:00-12:00 Registration of Participants 14:00-17:30 Opening Session 18:00-20:00 Presidential Reception Tuesday - August 19, 1997 08:30-12:30 Plenary Session & Symposia 12:30-14:00 Lunch Break 14:00-18:00 Plenary Session & Symposia Wednesday - August 20, 1997 08:30-12:30 Plenary Session & Symposia 12:30-14:00 Lunch Break 14:00-18:00 Plenary Session & Symposia Thursday - August 21, 1997 08:30-12:30 Plenary Session & Symposia 12:30-14:00 Lunch Break 14:00-18:00 Plenary Session & Symposia 17:00-18:00 Award Ceremony Friday - August 22, 1997 10:00-12:00 Board of Directors Meeting 12:00-14:00 Lunch Break 14:00-17:00 General Assembly I Saturday - August 23, 1997 10:00-12:00 General Assembly II 12:00-14:00 Lunch Break 14:00-16:00 Closing Session TRAVEL INFORMATION: Baden-Baden is a beautiful spa-resort town and convention center located in the middle of the Black Forest in the western part of Germany. It can be reached in two hours by train from Frankfurt or Stuttgart. The best way to travel to the conference site is to fly first to Frankfurt (or Stuttgart) and then to take an express train to Baden-Baden. Those travelling by car can reach Baden-Baden through Hwy (Autobahn) A5 (Frankfurt-Basel) or through Hwy A8 (Stuttgart-Karlsruhe). The conference will be held at the Markgraf-Ludwig-Gymnasium, located at Hardstr. 2 in the center of Baden-Baden. 
This conference site can be easily reached from Baden-Baden railway station by city bus travelling to Augustaplatz, or by taxi. ACCOMMODATION: All conference participants are responsible for making their own travel arrangements and hotel reservations. Convenient and reasonable accommodation is available in various Baden-Baden hotels, some of which are indicated on the overleaf. Prices for accommodations range between $40.00 and $90.00 (U.S. $) per day, depending on hotel category and the type of occupancy (single, double, etc.). Participants or their travel agencies should make a reservation in a hotel of their choice in writing, indicating preferred price range, type of occupancy and the length of intended stay. This reservation should be made as soon as possible. August is a very popular vacation month in Europe and preferred flights and hotel accommodation may not be easily available unless booked well in advance. Further tourist information and help with hotel reservations in Baden-Baden is provided by: TOURIST - INFO BUREAU Augustaplatz 8 76530 Baden-Baden Germany CUT HERE NAME:____________________________________________ TITLE: _____________________ Institution/Organization: ____________________________________________________________ Mailing Address: __________________________________________________________________ ____________________________________________________________________________ __________________________________________ Home Phone: ___________ Office Phone: ___________ Fax: ___________ Email: _____________ I am an Author Presenter Session Organizer Participant Tentative Title of My Presentation: ___________________________________________________ ________________________________________________________________________________ Attached is my cheque/money order for Conference Registration (US$300.00 [if paid before May 5, 1997]; US$350.00 [if paid after May 5, 1997]) payable to Intersymp'97. 
Please mail your cheque with this form to: before June 12, 1997 to: after June 12, 1997 to: Dr. George E. Lasker Dr. George E. Lasker School of Computer Science Hauptpostlagernd University of Windsor 7001 Stuttgart Windsor, Ontario, Canada N9B 3P4 Germany InterSymp' 97 Hotels in Standard Category Hotel Roemerhof Sophienstr. 25, 76530 Baden-Baden Tel.: 07221-23415, Fax: 07221-391707 Hotel Schweizer Hof Lange Str. 73, 76530 Baden-Baden Tel.: 07221-24231, Fax: 07221-24069 Hotel Shuetzenhof Baldreitstr. 1, 76530 Baden-Baden Tel.: 07221-24088, Fax: 07221-390674 Hotel Deutscher Kaiser Hauptstr. 35, 76530 Baden-Baden Tel.: 07221-72152, Fax: 07221-72154 Hotel Pension Schuler (WC & shower on the floor) Lichtentaler Str. 29, 76530 Baden-Baden Tel.: 07221-23619, Fax: 07221-82639 Hotel Tanneck Werderstr. 14, 76530 Baden-Baden Tel.: 07221-23035, Fax: 07221-38327 Hotel Bischoff Roemerplatz 2, 76530 Baden-Baden Tel.: 07221-22373, Fax: 07221-38308 Prof. Adilson GONCALVES Chefe do Laboratorio de Ciencias Matematicas Centro de Ciencias e Tecnologia Universidade Estadual do Norte Fluminense Av. Alberto LAMEGO 2000 Campos RJ BRAZIL t: 0247 263731 f: 0247 263730 **************************** *"Science non facit saltum"* **************************** From info at cogsci.ed.ac.uk Wed May 7 11:52:42 1997 From: info at cogsci.ed.ac.uk (Centre for Cognitive Science) Date: Wed, 7 May 1997 16:52:42 +0100 Subject: MSc in Cognitive Science, Edinburgh Message-ID: <9284.199705071552@lindsay.cogsci.ed.ac.uk> POSTGRADUATE STUDY IN THE CENTRE FOR COGNITIVE SCIENCE AT THE UNIVERSITY OF EDINBURGH Cognitive Psychology Neural Computation Computational Linguistics Formal Logic Data Intensive Linguistics Logic Programming Theoretical Linguistics & Knowledge Representation The Centre for Cognitive Science (CCS) offers a programme of postgraduate study in cognitive science, centred on language and cognition. The programme leads to the degrees of MSc in Cognitive Science and Natural Language, MPhil or PhD. 
Some MSc places are still available for the year starting October 1997. CCS is committed to research and postgraduate teaching in cognitive science at international level. The work of the Centre is at the heart of Edinburgh's view of *informatics* -- the study of the structure, behaviour, and design of computational systems, both natural and artificial. CCS has a well-developed system of collaboration with departments within Informatics (Artificial Intelligence, Computer Science) and beyond (Linguistics, Philosophy, Psychology). The Centre's lecturers and research fellows work with over 60 postgraduates in a rich and varied intellectual and social environment. Regular interdisciplinary research workshops, in which students actively participate, focus on current problems in cognitive science. Visiting researchers contribute to a lively seminar series. Research projects, many of them collaborative with other European centres of excellence, have been funded by the UK research councils ESRC, EPSRC and MRC as well as by the European Union LRE and ESPRIT programmes in such areas as natural language understanding and computational neuroscience. 
Teaching staff: [with associated departments] Ewan Klein Head of Department linguistic theory, phonology Chris Brew [HCRC] corpora, data intensive linguistics, language technology Jo Calder [HCRC] grammar formalisms, computational linguistics Matthew Crocker [ESRC Fellow] statistical language processing, computational psycholinguistics Mark Ellison computational phonology and morphology, natural computation Bruce Graham computational neuroscience, neural networks Alexander Holt natural language semantics, computational linguistics Alex Lascarides [HCRC] lexical and discourse processing, semantics, pragmatics Paul Schweizer PhD Organiser philosophical logic, philosophy of mind, philosophy of language Richard Shillcock MSc Course Organiser psycholinguistics, cognitive modelling, cognitive neuropsychology Keith Stenning [HCRC] human memory, inference, connectionism Associates and Fellows: Sheila Glasbey EPSRC Fellow M. Louise Kelly [Linguistics] Robert Ladd [Linguistics] John Lee [HCRC] Chris Mellish [Artificial Intelligence] Jon Oberlander [HCRC] Massimo Poesio EPSRC Fellow David Willshaw [MRC] Human Communication Research Centre: The HCRC is a centre of excellence in the interdisciplinary study of cognition and computation in human communication, funded by the Economic and Social Research Council (UK). Drawing together researchers from Edinburgh, Glasgow and Durham, HCRC focuses on the psychological aspects of real language processing. HCRC shares a site with CCS, and the two contribute towards a joint research environment. Studying in Edinburgh: Edinburgh contains the largest concentration of expertise in Artificial Intelligence and Natural Language Processing in Europe. Students have access to that expertise, to Edinburgh's large copyright libraries, and within Cognitive Science, to a substantial offprint library. 
The department possesses extensive computing facilities based on a network of Sun workstations and Apple Macintoshes; access to Edinburgh's concurrent supercomputer and other central computing services is easily arranged. Requirements: Applicants typically have a first degree in one of the participating areas or an appropriate joint honours degree. Funding: UK and EU students following the MSc and PhD courses are eligible to apply for studentships. CCS will advise all students concerning funding possibilities. CCS attracts studentships from a variety of UK and non-UK funding bodies. Non-UK applicants with sufficient background may enroll as non-graduating students. If you would like more information about the Postgraduate Programme in Cognitive Science at the University of Edinburgh, please contact: Admissions Centre for Cognitive Science University of Edinburgh 2 Buccleuch Place Edinburgh EH8 9LW UK Telephone: +44 131 650 4667 Fax: +44 131 650 6626 Email: info at cogsci.ed.ac.uk WWW: http://www.cogsci.ed.ac.uk/ From beigi at watson.ibm.com Wed May 7 13:09:21 1997 From: beigi at watson.ibm.com (Homayoon S.M. Beigi) Date: Wed, 7 May 1997 13:09:21 -0400 (EDT) Subject: ISSCI Pattern Recognition Section Message-ID: CALL FOR PAPERS PATTERN RECOGNITION SECTION of ISSCI'98 (WAC'98) http://ace.unm.edu/wac98/issci.html Hi, I am putting together a few sessions in the ISSCI'98 conference of WAC'98 revolving around pattern recognition. I am soliciting a 2-page abstract by the END of MAY. The papers may be on any Pattern Recognition-related area, including but not limited to Speaker Recognition (Verification and Identification), Speech Recognition, On-Line Handwriting Recognition, Optical Character Recognition (Handwritten and text), image recognition and classification problems. For details of the conference, please see our web page at http://ace.unm.edu/wac98/issci.html. Address to send the papers: Homayoon Beigi IBM TJ Watson Research Center P.O.
Box 218 -- Room 36-219 Yorktown Heights, NY 10598 USA Alternate Street address for Express Mail: Homayoon Beigi IBM TJ Watson Research Center Room 36-219 Route 134 Yorktown Heights, NY 10598 USA Tel. (914) 945-1894 Fax. (914) 243-4965 EMail: beigi at watson.ibm.com From avm at CS.ColoState.EDU Wed May 7 13:51:30 1997 From: avm at CS.ColoState.EDU (anneliese von mayrhauser) Date: Wed, 7 May 1997 11:51:30 -0600 (MDT) Subject: ICCIMA'98 Second Call for Papers Message-ID: <171E4667236@gscit-1.fcit.monash.edu.au> 5th International Workshop on Program Comprehension The Dearborn Inn Dearborn, Michigan May 28-30, 1997 ========================= Preliminary Program ========================= Theme Comprehending programs written by others is at the heart of various software engineering activities. Program comprehension is performed when one reuses, reengineers, or enhances existing (or legacy) programs. It is also performed during review or code walk-through of new programs. The goal of this workshop is to bring together practitioners and researchers from government, academia, and industry to review the current state of the art and explore solutions to the program comprehension problem. On-line Web sites for detailed information http://www.dis.unina.it/~iwpc97 http://www.cacs.usl.edu/~iwpc97 IWPC'97 Chairs General Chair: Anneliese von Mayrhauser, Colorado State University, USA Program Co-Chairs: Gerardo Canfora, University of Salerno, Italy Arun Lakhotia, University of Southwestern Louisiana. USA Local arrangements Chair: Vaclav Rajlich, Wayne State University, USA Sponsored by: The Institute of Electrical & Electronics Engineers Inc. 
IEEE Computer Society Technical Council on Software Engineering In cooperation with: Wayne State University, Detroit, Michigan Louisiana Board of Regents Preliminary Program: DAY 1: Wednesday, May 28, 1997 ============================== SESSION 1: Intro and Keynote 8:45-10:00am Keynote Address: Problems versus Solutions: The Role of the Domain in Software Comprehension. Iris Vessey, Indiana University, USA SESSION 2: Program understanding-in-the-large 10:30-12:00pm Session Chair: M. Vans, Hewlett-Packard Co., USA - Relationships between Comprehension and Maintenance Activities G. Visaggio University of Bari, Bari, Italy - Cognitive design elements to support the construction of a mental model during software visualization M.A.D. Storey, F.D. Fracchia, H.A. Muller University of Victoria, Victoria, Canada - Understanding-in-the-large J.M. Favre IMAG Institute, Grenoble, France SESSION 3: Automated Program Understanding 1:30-3:00pm Session Chair: Hongji Yang, De Montfort University, UK - Automated chunking to support program comprehension I.J. Burnstein, K. Roberson Illinois Institute of Technology, Chicago, USA - Semi-automatic generation of parallelizable patterns from source code examples D. Markovic, J.R. Hagemeister, C.S. Raghavendra, S. Bhansali Washington State University, Pullman, USA - Using knowledge representation to understand interactive systems M. Moore, S. Rugaber Georgia Institute of Technology, Atlanta, USA SESSION 4: Program Analysis 3:30-5:00pm Session Chair: A. De Lucia, University of Salerno, Italy - Amorphous Program Slicing M. Harman, S. Danicic University of North London, London, UK - Dynamic program slicing in understanding of program execution B. Korel, J. Rilling Illinois Institute of Technology, Chicago, USA - Points-to Analysis for Program Understanding P. Tonella, G. Antoniol, R. Fiutem IRST, Povo (Trento), Italy E.
Merlo Ecole Polytechnique, Montreal, Quebec, Canada DAY 2: Thursday, May 29, 1997 ============================= SESSION 5: Program Comprehension 8:30-10:00am Session Chair: J. Q. Ning, Andersen Consulting, USA - A case study of domain-based program understanding R. Clayton, S. Rugaber, L. Taylor, L. Wills Georgia Institute of Technology, Atlanta, USA - A little knowledge can go a long way towards program understanding J. Sayyad-Shirabad, T.C. Lethbridge University of Ottawa, Ottawa, Canada S. Lyon Mitel Corporation, Kanata, Canada - Facilitating Program Comprehension via Generic Components for State Machines J. Weidl, R. Klosch, G. Trausmuth, H. Gall Technical University of Vienna, Vienna, Austria SESSION 6: Finding Reusable Assets 10:30-12:00pm Session Chair: A. Cahill, University of Limerick, Ireland - Enriching Program Comprehension for Software Reuse E.L. Burd, M. Munro University of Durham, Durham, UK - Identifying Objects in Legacy Systems A. Cimitile, A. De Lucia University of Salerno, Benevento, Italy G.A. Di Lucca, A.R. Fasolino University of Naples, Naples, Italy - Code Understanding through Program Transformation for Reusable Component Identification W.C. Chu Feng Chia University, Taiwan P. Luker, Hongji Yang De Montfort University, Leicester, UK SESSION 7: Panel 1:30-3:00pm Session Chair: V. Rajlich, Wayne State University, USA Panellists: - S. Rugaber, Georgia Institute of Technology, USA - S. Tilley, Software Engineering Institute, USA - A. Von Mayrhauser, Colorado State University, USA Infrastructure for Software Comprehension and Reengineering SESSION 8: Tools 3:30-5:00pm Session Chair: P. Linos, Tennessee Technological University, USA - Evaluation of the ITOC information system design recovery tool A. Berglas, J. Harrison University of Queensland, Australia - Glyphs for software visualization M.C. Chuah Carnegie Mellon University, USA S.G. Eick, Bell Laboratories, USA - PUI: A Tool to Support Program Understanding P.S.
Chan, M. Munro University of Durham, Durham, UK DAY 3: Friday, May 30, 1997 =========================== TUTORIAL: 8:30-4:00pm Empirical techniques: putting empirical evidence into perspective Marian Petre, Open University, UK Workshop location The 5th International Workshop on Program Comprehension is held at the Dearborn Inn, Dearborn, Michigan. Dearborn is a suburb of Detroit and is home to the Henry Ford Museum and historic Greenfield Village, both star attractions. Dearborn is easily accessible from Detroit airport. Hotel Information The Dearborn Inn Phone: 313-271-2700 20301 Oakwood Blvd Fax: 313-271-7464 Dearborn, MI 48124 Cost: US$ 119 per night USA (Single/Double) Deadline for hotel reservations: May 6, 1997 Airport to Hotel: Taxi - $15, Shuttle - $10 ============================ IWPC'97 Registration form ============================ Return this registration form to: IEEE Computer Society IWPC'97 Registration 1730 Massachusetts Ave., N.W. Washington, DC 20036-1992 Fax: 202-728-0884 Phone: 202-371-1013 (sorry, no phone registrations) Name__________________________________________________________________ Affiliation___________________________________________________________ Mailing Address_______________________________________________________________ ______________________________________________________________________ Daytime Phone Number_________________________________________________________ Fax Number__________________________________________________________ E-mail address_______________________________________________________________ IEEE/CS Membership Number:_________________________________ Do not include my mailing address on: __Non-society mailing lists __Meeting attendee lists Workshop Registration Fees (Please check appropriate fee) Advance (received by May 6) On-Site (received after May 6) Member __US $250 __US $290 Nonmember __US $320 __US $370 Student __US $100 __US $100 Workshop registration fee includes admission to the workshop, refreshment
breaks, the workshop luncheon, and one copy of the workshop proceedings. Student fee does not include the luncheon. Total Enclosed (in US dollars): $_________ Method of Payment (All payments must be in US dollars, drawn on US banks.) __Personal Check __Company Check __Traveler's Check Please make checks payable to IEEE Computer Society. __Purchase Order (U.S. organizations only) __VISA __Mastercard __American Express __Diners Club Card Number:_________________________________ Expiration Date:_____________________________ Cardholder Name:_____________________________ Signature:___________________________________ Written requests for refunds must be received in the registration office no later than 5/9/97. Refunds are subject to a $50 processing fee. All no-show registrations will be billed in full. Students are required to show current picture ID cards at the time of registration. Registrations after 5/9/97 will be accepted on-site only. ======================= Program Committee ======================= Paul Bailes, University of Queensland, Australia Paolo Benedusi, CRIAI, Italy Keith Bennett, University of Durham, UK Anthony Cahill, University of Limerick, Ireland Doris Carver, Louisiana State University, USA Aniello Cimitile, University of Benevento, Italy Robin Chen, AT&T Research, USA Ugo De Carlini, University of Naples, Italy Prem Devanbu, AT&T Research, USA Stephen G. Eick, AT&T Research, USA Philippe Facon, IEE-CNAM, France Harald Gall, Vienna University of Technology, Austria Ric Holt, University of Toronto, Canada Daniel Jackson, Carnegie Mellon University, USA Robin Jeffries, SunSoft, Inc., USA Rene Kloesch, Vienna University of Technology, Austria Panos Linos, Tennessee Technological University, USA Ettore Merlo, Ecole Polytechnique, Canada Hausi A. Muller, University of Victoria, Canada Malcolm Munro, University of Durham, UK Jim Q. 
Ning, Andersen Consulting, USA Alex Quilici, University of Hawaii, USA Vaclav Rajlich, Wayne State University, USA Spencer Rugaber, Georgia Institute of Technology, USA Dennis Smith, Software Engineering Institute, USA Harry M. Sneed, SES, Germany Jorma Taramaa, VTT Electronics, Finland Scott Tilley, Software Engineering Institute, USA Maria Tortorella, University of Naples, Italy Marie Vans, Hewlett-Packard Co., USA Giuseppe Visaggio, University of Bari, Italy Norman Wilde, University of Western Florida, USA Linda Wills, Georgia Institute of Technology, USA Hongji Yang, De Montfort University, UK From honavar at cs.iastate.edu Thu May 8 11:48:27 1997 From: honavar at cs.iastate.edu (Vasant Honavar) Date: Thu, 8 May 1997 10:48:27 -0500 (CDT) Subject: Call for Participation: Workshop on Automata Induction, Grammatical Inference, and Language Acquisition Message-ID: <199705081548.KAA05472@ren.cs.iastate.edu> A non-text attachment was scrubbed... Name: not available Type: text Size: 6576 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/caec0088/attachment-0001.ksh From berthold at ira.uka.de Thu May 8 11:54:21 1997 From: berthold at ira.uka.de (Michael Berthold) Date: Thu, 8 May 1997 17:54:21 +0200 Subject: IDA-97 Call for Participation Message-ID: <"i80fs1.ira.061:08.05.97.15.55.21"@ira.uka.de> CALL FOR PARTICIPATION The Second International Symposium on Intelligent Data Analysis (IDA-97) Birkbeck College, University of London 4th-6th August 1997 In Cooperation with AAAI, ACM SIGART, BCS SGES, IEEE SMC, and SSAISB [ http://web.dcs.bbk.ac.uk/ida97.html ] You are invited to participate in IDA-97, to be held in the heart of London. IDA-97 will be a single-track conference consisting of oral and poster presentations, invited speakers, demonstrations and exhibitions. 
The conference Call for Papers introduced a theme, "Reasoning About Data", and many papers complement this theme, but other exciting topics have emerged, including exploratory data analysis, data quality, knowledge discovery and data-analysis tools, as well as the perennial technologies of classification and soft computing. A new and exciting theme involves analyzing time series data from physical systems, such as medical instruments, environmental data and industrial processes. Information regarding registration can be found on the IDA-97 web page (address listed above). Please note that there are reduced rates for early registration (before 2nd June). Also there are still a limited number of spaces available for exhibition, and potential exhibitors are encouraged to book early (the application deadline is 2nd June). Provisional Technical Program Schedule Intelligent Data Analysis 97 Monday 4 August 10:00 to 10:30 Opening Ceremony 10:30 to 11:45 Invited Presentation I Intelligent Data Analysis: Issues and Opportunities .... David J Hand (UK) (The abstract of Professor Hand's talk, and a brief biographical sketch, can be found at the end of this document) 12:00 to 1:30PM LUNCH 1:30 to 2:45 PAPER SESSION 1: Exploratory Data Analysis and Preprocessing Decomposition of heterogeneous classification problems .... C Apte, S J Hong, J Hosking, J Lepre, E Pednault & B Rosen (USA) Managing Dialogue in a Statistical Expert Assistant with a Cluster-based User Model .... M Muller (South Africa) How to Find Big-Oh in Your Data Set (and How Not To) .... C McGeoch, D Precup and P R Cohen (USA) 2:45 to 3:00 COFFEE BREAK 3:00 to 4:45 PAPER SESSION 2: Classification and Feature Selection A Connectionist Approach to Structural Similarity Determination as a Basis of Clustering, Classification and Feature Detection .... K Schadler and F Wysotzki (Germany) Efficient GA Based Techniques for Automating the Design of Classification Models ....
R Glover and P Sharpe (UK) Data Representation and ML Techniques .... C Lam, G West and T Caelli (Australia) Development of a Knowledge-Driven Constructive Induction Mechanism .... S Lo and A Famili (Canada) 4:45 to 5:00 COFFEE BREAK 5:00 to 5:45 POSTER SESSION I: Introduction TOPIC ONE: Exploratory Data Analysis, Preprocessing and Tools Data Classification Using a W.I.S.E. Toolbox .... I Berry and P Gough (UK) Mill's Methods for Complete Intelligent Data Analysis .... T Cornish (UK) Integrating Many Techniques for Discovering Structure in Data .... D Gregory and P Cohen (USA) Meta-Reasoning for Data Analysis Tool Allocation .... R Levinson and J Wilkinson (USA) Navigation for Data Analysis Systems .... R St Amant (USA) An Annotated Data Collection System to Support Intelligent Analysis of Intensive Care Unit Data .... C Tsien and J Fackler (USA) A Combined Approach to Uncertain Data Manipulation .... H Yu and A Ramer (Australia) TOPIC TWO: Classification and Feature Selection Oblique Linear Tree .... J Gama (Portugal) Feature selection for Neural Networks through Functional Links found by Evolutionary Computation .... S Haring, J Kok and M van Wezel (The Netherlands) Overfitting Explained: a Case Study .... D Jensen, T Oates and P Cohen (USA) Exploiting Symbolic Learning in Visual Inspection .... M Piccardi, R Cucchiara, M Bariani and P Mello (Italy) Forming Categories in Exploratory Data Analysis and Data Mining .... P Scott, H Williams and K Ho, (UK) A systematic description of greedy optimisation algorithms for cost sensitive generalisation .... M van Someren, C Torres and F Verdenius (The Netherlands) Automatic classification within object knowledge bases .... P Valtchev and J Euzenat (France) 5:45 to 7:00 POSTER SESSION I: Posters 7:00 to 9:00 Conference Reception Tuesday 5 August 9:15 to 10:30 Invited Presentation II Given 3,000,000,000 Nucleotides, Induce a Person or Intelligent Data Analysis for Molecular Biology .... 
Lawrence Hunter (USA) (The abstract of Dr Hunter's talk, and a brief biographical sketch, can be found at the end of this document) 10:30 to 10:45 COFFEE BREAK 10:45 to 12:00 PAPER SESSION 3: Medical Applications ECG Segmentation using Time-Warping .... H Vullings, M Verhaegen and H Vergruggen (The Netherlands) Interpreting longitudinal data through temporal abstractions: an application to diabetic patients monitoring .... R Bellazzi and C Larizza (Italy) Intelligent Support for Multidimensional Data Analysis in Environmental Epidemiology .... V Kamp and F Wietek (Germany) 12:00 to 1:30pm LUNCH 1:30 to 2:45 PAPER SESSION 4: Soft Computing Network Performance Assessment for Neurofuzzy Data Modelling .... S Gunn, M Brown and K Bossley (UK) A Genetic Approach to Fuzzy Clustering with a Validity Measure Fitness Function .... S Nascimento and F Moura-Pires (Portugal) The Analysis of Artificial Neural Network Data Models ....C Roadknight, D Palmer-Brown and G Mills (UK) 2:45 to 3:00 COFFEE BREAK 3:00 to 4:15 PAPER SESSION 5: Knowledge Discovery A Strategy for Increasing the Efficiency of Rule Discovery in Data Mining .... D McSherry (UK) Knowledge-Based Concept Discovery from Textual Data .... U Hahn and K Schnattinger (Germany) Knowledge Discovery in Endgame Databases .... M Schlosser (Germany) 4:15 to 4:30 COFFEE BREAK 4:30 to 5:15 POSTER SESSION II: Introduction TOPIC ONE: Fuzzy and Soft Computing Simulation Data Analysis Using Fuzzy Graphs .... K-P Huber and M Berthold (Germany) Mathematical Analysis of Fuzzy Classifiers .... F Klawonn and E-P Klement (Germany; Austria) Neuro-Fuzzy Diagnosis System with a Rated Diagnosis Reliability and Visual Data Analysis .... A Lapp and H Kranz (Germany) Genetic Fuzzy Clustering by means of Discovering Membership Functions .... M Turhan (Turkey) TOPIC TWO: Data Mining Parallelising Induction Algorithms for Data Mining .... J Darlington, Y Guo, J Sutiwaraphun, H To (UK) Data Analysis for Query Processing .... 
J Robinson (UK) Datum Discovery .... L Siklossy and M Ayel (France) Using neural network to extract knowledge from database .... Y Zhou, Y Lu and C Shi (China) TOPIC THREE: Estimation, Clustering A Modulated parzen-Windows Approach for Probability Density Estimation .... G van den Eijkel, J van der Lubbe and E Backer (The Netherlands) Improvement on Estimating Quantiles in Finite Population Using Indirect Methods of Estimation .... M Garcia, E Rodriguez and A Cebrian (Spain) Robustness of Clustering under Outliers .... Y Kharin (Belarus) The BANG-Clustering System: Grid-Based Data Analysis .... E Schikuta and M Erhart (Austria) TOPIC FOUR: Qualitative Models Diagnosis of Tank Ballast Systems .... B Schieffer and G Hotz (Germany) Qualitative Uncertainty Models from Random Set Theory .... O Wolkenhauer (UK) 5:15 to 6:30 POSTER SESSION II: Posters 7:30 - Conference Dinner Wednesday 6 August 9:15 to 10:30 PAPER SESSION 6: Data Quality Techniques for Dealing with Missing Values in Classification .... W Liu, A White, S Thompson and M Bramer (UK) The Use of Exogenous Knowledge to Learn Bayesian networks from Incomplete Databases .... M Ramoni and P Sebastiani (UK) Reasoning about Outliers in Visual Field Data .... J Wu, G Cheng and X Liu (UK) 10:30 to 10:45 COFFEE BREAK 10:45 to 12:00 PAPER SESSION 7: Qualitative Models Reasoning about sensor data for automated system identification .... E Bradley and M Easley (USA) Modeling Discrete Event Sequences as State Transition Diagrams .... A E Howe and G Somlo (USA) Detecting and Describing Patterns in Time-Varying Data Using Wavelets .... S Boyd (Australia) 12:00 to 12:30 Closing Ceremony 12:30 to 2:00pm LUNCH 2:00 to 4:00 IDA Open Business Meeting -------------------------------------------------------------------------- Intelligent data analysis: issues and opportunities David J. 
Hand Modern data analysis is the product of the union of several disciplines: statistics, computer science, pattern recognition, machine learning, and others. Perhaps the oldest parent is statistics, being driven by the demands of the different areas to which it has been applied. More recently, however, the possibilities arising from powerful and available computers have stimulated a revolution. Data of new kinds and in unimaginable quantities now occur; they bring with them entirely new classes of problems, problems to which the classical statistical solutions are not always well-matched; these problems in turn require novel and original solutions. In this talk I look at some of these new kinds of data, and the associated problems and solutions. The data include data sets which are large in dimensionality or number of records, data which are dependent on each other, and that special kind of qualitative data known as metadata. New problems arising from these data include straightforward mechanical issues of how to handle them, how to estimate descriptors and parameters (adaptive and sequential methods are obviously more important than in classical statistics), the (ir)relevance of significance tests, and automatic data analysis (as in anomaly detection in large data sets or, quite differently, in automatic model fitting). Some of the new types of model which are becoming so important nicely illustrate the interdisciplinary nature of modern data analysis: rule-based systems, hidden Markov models, neural networks, genetic algorithms, and so on. These are briefly discussed. All of this leads us to consider more carefully the link between data and information and to recognise the complementary data analytic abilities and powers of humans and computers. But we can go too far. If there is 'intelligent data analysis' there is also 'unintelligent data analysis'. Two different manifestations of the latter are examined, and a cautionary note sounded. -------------- David J. 
Hand is Professor of Statistics at the Open University in the UK. He has published over 150 papers and fourteen books, including Artificial Intelligence Frontiers in Statistics, Practical Longitudinal Data Analysis, and, most recently, Construction and Assessment of Classification Rules. He is founding editor and editor-in-chief of the journal Statistics and Computing. His research interests include developments at the interface between statistics and computing, multivariate statistics, the foundations of statistics, and applications in medicine, psychology, and finance. ------------------------------------------------------------------------ Given 3,000,000,000 Nucleotides, Induce a Person or Intelligent Data Analysis for Molecular Biology Lawrence Hunter In the last decade or so, large scale gene sequencing, combinatorial biochemistry, DNA PCR and many other innovations in molecular biotechnology have transformed biology from a data-poor science to a data-rich one. This data is a harbinger of great change in medicine, in agriculture, and in our fundamental understanding of life. However, the availability of an exponentially growing onslaught of relevant data is only the first step toward understanding. There are many scientifically and economically significant opportunities (and challenges) for intelligent data analysis in exploiting this information. In this talk, I will give a brief overview of the kinds of data available and the open problems in the field, describe a few successes, and speculate about the future. ------------------- Dr. Lawrence Hunter is the director of the Machine Learning Project at the (U.S.) National Library of Medicine, and a Fellow of the Krasnow Institute of Advanced Study in Cognition at George Mason University. He received his Ph.D. in Computer Science from Yale University in 1989. 
He edited the MIT Press book "Artificial Intelligence and Molecular Biology," and was recently elected the founding president of the International Society for Computational Biology. His research spans the range from basic contributions to machine learning methodology to the development of IDA technology for clinical and pharmaceutical industry applications. From ingber at ingber.com Thu May 8 17:41:20 1997 From: ingber at ingber.com (Lester Ingber) Date: Thu, 8 May 1997 17:41:20 -0400 Subject: EEG data now publicly available Message-ID: <19970508174120.21441@ingber.com> EEG data now publicly available It is extremely difficult for modelers of nonlinear time series, and EEG systems in particular, to get access to large sets of raw, clean data. Such a set of data was acquired and used for the study in %A L. Ingber %T Statistical mechanics of neocortical interactions: Canonical momenta indicators of electroencephalography %J Physical Review E %V 55 %N 4 %P 4578-4593 %D 1997 %O URL http://www.ingber.com/smni97_cmi.ps.Z The above adaptive simulated annealing (ASA) application to EEG analysis is one of several ASA applications being prepared for the SPEC (Standard Performance Evaluation Corporation) CPU98 suite. Eventually the code used to perform these calculations will be published on a CD-ROM by SPEC. Raw EEG data is now publicly available, as described in http://www.ingber.com/smni_eeg_data.html ftp://ftp.ingber.com/MISC.DIR/smni_eeg_data.txt The ASA code is publicly available at no charge from http://www.ingber.com/ ftp://ftp.ingber.com A complete homepage is mirrored at http://www.alumni.caltech.edu/~ingber/ Lester -- /* RESEARCH ingber at ingber.com * * INGBER ftp://ftp.ingber.com * * LESTER http://www.ingber.com/ * * Prof. Lester Ingber __ PO Box 857 __ McLean, VA 22101-0857 __ USA */ From bruno at redwood.ucdavis.edu Thu May 8 14:44:26 1997 From: bruno at redwood.ucdavis.edu (Bruno A.
Olshausen) Date: Thu, 8 May 1997 11:44:26 -0700 Subject: postdoctoral openings Message-ID: <199705081844.LAA17724@redwood.ucdavis.edu> MULTIDISCIPLINARY POSTDOCTORAL TRAINING A new National Science Foundation Biology Research Training Group at the University of California, Davis, invites postdoctoral applications from United States citizens and permanent residents. This multidisciplinary program is designed to provide biologists with sufficient mathematical skills and applied mathematicians with sufficient biological knowledge to solve problems in cell physiology, neurobiology, biofluiddynamics, ecology, and population biology. Relevant training faculty within neurobiology include Charles Gray (visual cortex physiology), Joel Keizer (modeling of intracellular dynamics), Bruno Olshausen (computational models of vision), and Mitch Sutter (auditory cortex physiology). Applicants should see the Research Training Group webpage (http://www.itd.ucdavis.edu/rtg) for information about applications, which are due by June 1, 1997. From pci-inc at aub.mindspring.com Tue May 6 14:52:37 1997 From: pci-inc at aub.mindspring.com (Mary Lou Padgett) Date: Tue, 06 May 1997 14:52:37 -0400 Subject: ICNN97 FINAL PROGRAM & REGISTRATION Message-ID: <2.2.16.19970506185237.1d37b920@pop.aub.mindspring.com> PLEASE CIRCULATE WIDELY (APOLOGIES IF YOU RECEIVE DUPLICATE COPIES) ICNN'97 FINAL PROGRAM Schedule and Registration (Conference, Tutorials, Tours, Hotel) *** Check our website: http://www.mindspring.com/ICNN97/ for details, final paper abstracts *** I. SCHEDULE _______________________________________________ SUNDAY, June 8, 1997 TUTORIALS _______________________________________________ 9:00 AM - 12:00 Noon T1: Neural Networks for Consciousness. J. G. Taylor: King's College, London T2: Network ensembles and hybrid systems. Joydeep Ghosh: Univ. of Texas 13:30 - 16:30 PM T3: Neuro-Fuzzy Recognition System: Concepts, Features and Feasibility. Sankar K. 
Pal: Indian Statistical Institute T4: Learning from Examples : from theory to practice. Don R. Hush: University of New Mexico 18:00 - 21:00 PM T5: Principles of Neurobiological Information Processing for Biology-Inspired Neural Computers. Rolf Eckmiller: University of Bonn T6: Hybrid Intelligent Information Systems - Models, Tools, Applications. Nik Kasabov: University of Otago Robert Kozma: Tohoku University _______________________________________________ MONDAY, June 9, 1997 _______________________________________________ 8:30 - 10:30 Opening Remarks and Plenary Session Plenary talk (David Waltz: Neural Nets and AI: Time for a Synthesis) Plenary talk (Jean-Jaques Slotine: Adaptive approximation networks for stable learning and control) 10:30 - 10:50 Coffee Break 10:50 - 12:30 8 Parallel sessions of 5 papers each AP1, SU1, LM1, PR1, TS1, AR1, CI1, SS6 12:30 - 13:50 Lunch Break 13:50 - 15:50 8 Parallel sessions of 6 papers each AP2, SU2, LM2, PR2, TS2, AR2, CI2, EC1 15:50 - 16:10 Tea Break 16:10 - 18:10 4 Parallel sessions of 6 papers each SS1, SS3, SS4, SS5 Also Panel Session: Classical Connectionist Learning 18:30 - 20:30 Opening Reception _______________________________________________ TUESDAY, June 10, 1997 _______________________________________________ 8:30 - 9:40 Parallel Plenary Talks Plenary talk (James Bezdek: A geometric approach to edge detection) Plenary talk (Teuvo Kohonen: Exploration of very large databases by self-organizing maps) 9:40 - 10:00 Coffee Break 10:00 - 12:00 8 Parallel sessions of 6 papers each AP3, SU3, LM3, PR3, TS3, OA1, CI3, EC2 12:00 - 13:20 Lunch Break 13:20 - 14:30 Plenary talk (Peter Fox: Functional volume models: System level models for functional neuroimaging) Plenary talk (Kaoru Hirota: Research and application aspects in soft computing: History and recent trends in Japan) 14:40 - 16:00 8 Parallel sessions of 4 papers each AP4, AR3, LM4, EC3, TS4, OA2, SS2.1, SS9.1 16:00 - 16:20 Tea Break Also Poster I (starts at 16:00) ARP1, 
CIP1, ECP1, OAP1, RVP1, TSP1 16:20 - 18:20 Panel Session: Brain Imaging Poster Session I (ends at 18:20) 20:00 - 22:00 INNS / SIG Meetings _______________________________________________ WEDNESDAY, June 11, 1997 _______________________________________________ 8:30 - 9:40 Parallel Plenary Sessions Plenary talk (Joaquim Fuster: Structure and dynamics of network memory) Plenary talk (Geoffrey Hinton: Towards neurally plausible Bayesian networks) 9:40 - 10:00 Coffee Break 10:00 - 12:00 8 Parallel sessions of 6 papers each TS5, SU4, LM5, PR4, BI1, EC4, RV1, EO1 12:00 - 13:20 Lunch Break 13:20 - 14:30 Parallel Plenary Sessions Plenary talk (Karl Pribram: The deep and surface structure of memory) Plenary talk (Eric Baum: Reinforcement learning by an economy of agents) 14:40 - 16:00 8 Parallel sessions of 4 papers each AR4, AP5, CS1, EO2, OA3, SS8, SS2.2, SS9.2 16:00 - 16:20 Tea Break Poster II (starts at 16:00) APP2, BIP2, CSP2, EOP2, LMP2, PRP2, SUP2 16:20 - 18:20 Panel Session: Creativity Poster Session II (ends at 18:20) 19:00 - 22:00 BANQUET Presentation of IEEE Fellowship Awards by the President of the IEEE Neural Networks Council, Dr. James C. Bezdek Banquet talk (Robert J. 
Marks II: Neural Networks, reduction to practice) _______________________________________________ THURSDAY, June 12, 1997 _______________________________________________ 8:30 - 9:40 Plenary Session Plenary talk (Paul Werbos: From neuro-control to brain-like intelligence) 9:40 - 10:00 Coffee Break 10:00 - 12:00 8 Parallel sessions of 6 papers each BI2, SU5, TS6, EO3, PR5, EC5, OA4, SS7.1 also 12:00 - 13:20 Lunch Break 13:20 - 15:20 8 Parallel sessions of 6 papers each CS2, SU6, AP6, RV2, PR6, EC6, OA5, SS7.2 also Tours ADJOURN _______________________________________________ ICNN'97 PAPER LISTS are on the web site * Note Two Letter Abbreviations for Sessions: AP Applications SU Supervised/Unsupervised Learning LM Learning and Memory BI Biological Neural Nets CS Cognitive Science and Cognitive Neuroscience EO Electronics and Optical Implementation PR Pattern Recognition and Image Processing RV Robotics and Vision OA Optimization and Associative Memory TS Speech Processing, Time Series and Filtering AR Architectures CI Computational Intelligence SS Special Sessions SS1: Adaptive Critic Designs SS2: Visual System Models & Prostheses SS3: Adaptive Applications SS4: Linguistic Rule Extraction SS5: Intelligent Control Theory & Applications SS6: NN Appl. for Monitoring of Complex Systems SS7: Biomedical Applications SS8: Sensors and Biosensors SS9: Knowledge-based Methods in NN ______________________________________________________ ______________________________________________________ ICNN97 CONFERENCE REGISTRATION ______________________________________________________ 1997 IEEE International Conference on Neural Networks June 9-12, 1997 Westin Galleria Hotel, Houston, Texas, USA Circle One: Dr. Mr. Ms. Last Name:____________________________________ First Name: ______________________________________ IEEE or INNS Membership Number: _______________________________ Affiliation: _____________________ (Must be current to qualify for discount. 
Email questions to: pci-inc at mindspring.com ) Mailing Address: ________________________________________________________________________________ City: _____________________________State: ______________________ Zip: ___________ Country: _________ Phone: __________________________ Fax: __________________________ Email: _________________________ Information to Appear on Badge: (First Name or "Nickname") ________________________________________ ______________________________________________________ Conference Registration Fees: Early Rate Late Rate (Before May 9, 1997) (After May 9, 1997) IEEE or INNS Members $375 $450 Non-Members $425 $550 Students* $110 $150 * A letter from the Department Head to verify full-time student status at the time of registration is required. At the conference, all students must present a current student ID with picture. Student registration does not include the banquet. ______________________________________________________ ______________________________________________________ TUTORIALS ______________________________________________________ Tutorial Registration Fees: (Tutorials June 8, 1997) One Tutorial $300 Two Tutorials $450 Three Tutorials $550 Student - One Tutorial $300 Student - Two Tutorials $450 Student - Three Tutorials $550 Tutorial Selection: (Circle desired tutorials) Morning Tutorials: 9:00 am - 12:00 noon T1 "Neural Networks for Consciousness" T2 "Network Ensembles and Hybrid Systems" Afternoon Tutorials: 1:30 pm - 4:30 pm T3 "Neuro-Fuzzy Recognition System: Concepts, Features and Feasibility" T4 "Learning from Examples: From Theory to Practice" Evening Tutorials: 6:00 pm - 9:00 pm T5 "Principles of Neurobiological Information Processing for Biology-Inspired Neural Computers" T6 "Hybrid Intelligent Information Systems - Models, Tools, Applications" Tutorial Registration is on a first-come, first-served basis.
______________________________________________________ Please make check payable to: ICNN 97 Or indicate credit card payment(s) enclosed ______ Mastercard ______ Visa ______ Amex Credit Card No.: _____________________________________________ Exp. Date: ________________________ Signature: ___________________________________________________ For Credit Card Registration Only, Fax to: (714) 752-7444 Registrations by check/money order must be mailed. Payments Enclosed: Registration Fees: U.S. $ ________________________ Tutorial Fees: U.S. $ ________________________ Additional Proceedings: U.S. $ ________________________ GRAND TOTAL: U.S. $ ________________________ IMPORTANT NOTE: All registrations are fully processed only after payment is received. Payments made for registrations after the early deadline (May 9, 1997) are subject to the late registration fee. Cancellations received in writing by Meeting Management by May 1, 1997 will receive a refund, minus a $100 administrative charge. No refunds will be issued after that date, although substitutions may be made any time before June 5, 1997 by faxing/mailing the substitute registrant's name to Meeting Management. All other substitutions and registrations after June 5, 1997 must be handled on-site. 
______________________________________________________ Please mail or fax your completed Conference Registration form, along with your payment to: ICNN 97 Meeting Management 2603 Main Street, Suite 690 Irvine, CA 92614, USA Phone: (714) 712-8205; FAX: (714) 752-7444; Email: MeetingMgt at aol.com ______________________________________________________ ______________________________________________________ ICNN97 TOURS ______________________________________________________ TOUR #1: Space Center Houston TOUR #2: Moody Gardens TOUR #3: Theater Museum Districts TOUR #4: Virtual Environment Technology Laboratory ______________________________________________________ TOUR #1: Space Center Houston A state-of-the-art education and entertainment complex designed by Walt Disney Imagineering. The magical adventure begins in Space Center Plaza, where a space shuttle mock-up welcomes you. Exhibits and demonstrations let you land a space shuttle by computer simulation, touch a real moon rock, and listen to communications between Mission Control and astronaut crews on board the space shuttle. In Destiny Theater, relive the great moments of the space program in the film "On Human Destiny." Adjacent to the complex is Johnson Space Center (JSC), home to the famed Mission Control Center and Rocket Park, the outdoor home of retired flight hardware too huge to house indoors. There will be a technical tour of JSC led by NASA engineers. For the technical tour, attendees must be US citizens or permanent residents. ______________________________________________________ TOUR #2: Moody Gardens Traveling to the Rainforest Pyramid at Moody Gardens in Galveston is an ultimate treat. Housing a tropical rainforest with waterfalls, cliffs and caverns, the 10-story glass pyramid, open daily, holds more than 2,000 species of exotic plants, animals, tropical fish and butterflies.
Also on the grounds are the 3-D IMAX Theater, the white sand Palm Beach, the Colonel Paddlewheel Boat, the fascinating Bat Cave, and acres of lush gardens to explore. ______________________________________________________ TOUR #3: Theater Museum Districts Flanked by the striking, contemporary architecture of downtown Houston, the Theater District covers ten square blocks of the city's central business district and is home to five of the country's most innovative performing arts companies. The Museum District, a lovely oak-lined area near Hermann Park and the Texas Medical Center, is anchored by the Museum of Fine Arts, the Contemporary Arts Museum, the Children's Museum of Houston, the nearby Menil Collection and the Museum of Natural Science. Two intriguing recent additions to the area are the Holocaust Museum and the Museum of Health and Medical Science. ______________________________________________________ TOUR #4: Virtual Environment Technology Laboratory The Virtual Environment Technology Laboratory (VETL) is a joint enterprise of the University of Houston and NASA/Johnson Space Center. The laboratory's objectives include research and development activities utilizing virtual environments in (1) scientific/engineering data visualization, (2) training, and (3) education. With a complement of over three million dollars in high performance computing and display equipment, and this region's only CAVE (a cube, ten feet on a side, with four display surfaces for total immersion), the VETL is advancing the state-of-the-art in virtual environment technology. The VETL is capable of displaying the visual components of virtual environments via monitors, stereoscopic head-mounted displays (HMDs), and projection displays. Demonstrations will include interactive immersive virtual environments in the CAVE and Flogiston "flostation" as well as 3-D modeling.
______________________________________________________ TOUR REGISTRATION FORM ______________________________________________________ I am interested in attending: (circle one) Tour #1 Tour #2 Tour #3 Tour #4 Name:______________________________________________________ (First) (Last) Address:______________________________________________________ (Street Address) (City) (State) (Zip) Phone: ________________________________ Fax: ____________________________ Social Security Number (Space Center attendees only): _________________________________ IMPORTANT NOTE: Prices for tours have not been finalized. Please fill out this form if you are interested in attending a tour. Tour fee information will be included on your registration confirmation letter. ______________________________________________________ Please mail your completed Tour form to: ICNN 97 Meeting Management 2603 Main Street, Suite 690 Irvine, CA 92614, USA Phone: (714) 712-8205; FAX: (714) 752-7444; Email: MeetingMgt at aol.com ______________________________________________________ ICNN97 HOTEL REGISTRATION FORM ______________________________________________________ ABOUT THE CONFERENCE VENUE The Conference will be held in the Westin Galleria Hotel, located in The Galleria complex, a glass-enclosed entertainment/shopping center with over 350 retail stores in the heart of uptown Houston. Houston, Texas is America's fourth largest city and is also the capital of the international energy industry, the largest international port in North America, and the headquarters for America's Manned Space Flight Effort. Houston is also home of the world's largest medical center and a burgeoning mecca of multicultural arts, which defines the colorful nature of this city. ______________________________________________________ TRAVEL INFORMATION Houston boasts two major international airports: Houston Intercontinental and Hobby.
Houston Intercontinental Airport is the eighth largest airport system in the United States for international travel. It has three modern terminals connected by a state-of-the-art subway system. Hobby Airport offers full US and regional service with easy access to all parts of the city. Travel is very convenient, with many non-stop flights and easy connections from other major cities, as Houston is a hub for many airlines. ICNN attendees will find that flexible flight schedules and low fares await them in Houston. For special discounted rates, contact Nationwide Travel at (714) 847-1788. Be sure to identify yourself as an IEEE ICNN 97 attendee. Round trip shuttle bus service to both airports is available through Airport Express, which provides passenger pick-up and drop-off at both the Westin Galleria and The Westin Oaks Hotels. The cost of a one-way fare to Houston Intercontinental Airport is $16.00, and the airport is approximately 25 miles from the Galleria area. A flat rate taxi fare is $35.00 each way to Houston Intercontinental Airport. Approximate travel time is one hour. The one-way shuttle bus fare to Hobby Airport is $11.00. Hobby Airport is approximately 20 miles from the Galleria area and taxi fare is $26.00 each way. Approximate travel time is 45 minutes. Service to both airports runs every hour on the hour, from 5:30am to 7pm from the hotel. Reservations are required only for your return trip to the airport from the hotel. ______________________________________________________ HOTEL RESERVATION FORM ______________________________________________________ 1997 IEEE INTERNATIONAL CONFERENCE ON NEURAL NETWORKS June 8-12, 1997 Westin Galleria Hotel 5060 West Alabama Houston, Texas 77056 Phone: (713) 960-8100 Fax: (713) 960-6549 ***** FLASH: PLENTY OF ROOM LEFT IN HOTEL AS OF MAY 6 -- YOU MAY HAVE TO MENTION "IEEE CONFERENCE" -- IF YOU HAVE TROUBLE RESERVING A ROOM, CALL MEETING MANAGEMENT AT (714) 752-8205 OR DR. KARAYIANNIS (713) 743-4436 !
***** Rates: $119.00 Single or Double Occupancy Note: All guest rooms have one king size bed or two double beds. There is an additional charge for roll-aways. There will be an additional charge of $15.00 for each additional person in the room. Discounted rates are available to attendees three days prior to and three days after the dates of the conference. Reservations must be received by May 18, 1997 to ensure availability and special rates. Reservations and deposits received after this date will be accepted on a space available basis at the hotel's published rates. Please MAIL or FAX this form to: Westin Galleria Hotel at the above address. RESERVATIONS MUST BE ACCOMPANIED BY A DEPOSIT FOR THE FIRST NIGHT PLUS 15% TAX. (Check Appropriate Blanks) Bed Type Request: ______ One King Bed ______Two Double Beds Smoking Room: ______ No ______Yes Arrival Date/Time: ____________________________ (Check in time is 3:00 pm) Departure Date: ______________________________ (Check out time is 12 noon) Name: _________________________________________________ Organization/Firm: ________________________ Address: ______________________________________________________ City: ______________________________________ State: _______________________ Zip: ___________________ Sharing Room with: _________________________________________ Special Requests: ______________________ Reservations at the Westin Galleria Hotel require one night's deposit or credit card guarantee (including 15% tax plus $1.50 occupancy tax); please complete the following: ______ Enclosed is a check or money order for $__________________ OR ______ Enclosed is my credit card information authorizing my reservation to be guaranteed in the amount of $ _______ Reservations must be canceled 72 hours prior to arrival date for deposit refund. A $35.00 charge will be assessed if your departure is changed to an earlier date after check-in. Reservations are subject to cancellation at 4pm if not guaranteed.
______American Express ______Mastercard ______Diners Club ______Visa ______Carte Blanche ______Discover Credit Card #: ________________________________________________ Exp. Date: _______________________ Please print name as it appears on the card: __________________________________________________________ Signature: ____________________________________________________________________________ ________ NOTE: DO NOT EMAIL YOUR CREDIT INFORMATION. PRINT AND FAX OR MAIL THESE FORMS Mary Lou Padgett m.padgett at ieee.org http://www.mindspring.com/~pci-inc/ICNN97/ Mary Lou Padgett 1165 Owens Road Auburn, AL 36830 P: (334) 821-2472 F: (334) 821-3488 m.padgett at ieee.org Auburn University, EE Dept. Padgett Computer Innovations, Inc. (PCI) Simulation, VI, Seminars IEEE Standards Board -- Virtual Intelligence ( VI): NN, FZ, EC, VR http://www.mindspring.com/~pci-inc/ From henrys at gscit-1.fcit.monash.edu.au Tue May 6 16:08:06 1997 From: henrys at gscit-1.fcit.monash.edu.au (Henry Selvaraj) Date: Tue, 6 May 1997 16:08:06 EST-10 Subject: ICCIMA'98 Second Call for Papers Message-ID: <14E329D54CB@gscit-1.fcit.monash.edu.au> ICCIMA'98 International Conference on Computational Intelligence and Multimedia Applications 9-11 February 1998 Monash University, Gippsland Campus, Churchill, Australia S E C O N D C A L L F O R P A P E R S The International Conference on Computational Intelligence and Multimedia Applications will be held at Monash University on 9-11 February 1998. The conference will provide an international forum for discussion on issues in the areas of Computational Intelligence and Multimedia for scientists, engineers, researchers and practitioners. The conference will include sessions on theory, implementation and applications, as well as the non-technical areas of challenges in education and technology transfer to industry. There will be both oral and poster sessions. Accepted full papers will be included in the proceedings to be published by World Scientific. 
Several well-known keynote speakers will address the conference. Conference Topics Include (but are not limited to): Artificial Intelligence Artificial Neural Networks Artificial Intelligence and Logic Synthesis Functional decomposition Pattern Recognition Fuzzy Systems Genetic Algorithms Intelligent Control Intelligent Databases Knowledge-based Engineering Learning Algorithms Memory, Storage and Retrieval Multimedia Systems Formal Models for Multimedia Interactive Multimedia Multimedia and Virtual Reality Multimedia and Telecommunications Multimedia Information Retrieval Special Sessions: Artificial Intelligence and Logic Synthesis: intelligent algorithms for logic synthesis; functional decomposition in machine learning, pattern recognition, knowledge discovery and logic synthesis; evolutionary and reconfigurable computing with FPGAs. Chair: Lech Jozwiak, Eindhoven University, Netherlands. Multimedia Information Retrieval: segmentation of audio, image and video; feature extraction and representation; semi-automatic text annotation techniques; indexing structure; query model and retrieval methods; feature similarity measurement; system integration issues; prototype systems and applications. Chair: Guojun Lu, Monash University, Australia. Pre-Conference Workshops and Tutorial: Proposals for pre-conference workshops and tutorials relevant to the conference topics are invited. These are to be held on Saturday 7th February and Sunday 8th February at the conference venue. People wishing to organise such workshops or tutorials are invited to submit a proposal by the paper submission deadline. The accepted proposals will be advertised. Special Poster Session: ICCIMA'98 will include a special poster session devoted to recent work and work-in-progress. Abstracts are solicited for this session (2 page limit) in camera ready form, and may be submitted up to 30 days before the conference date.
They will not be refereed and will not be included in the proceedings, but will be distributed to attendees upon arrival. Students are especially encouraged to submit abstracts for this session. Invited Sessions Keynote speakers (key industrialists, chief research scientists and leading academics) will be addressing the main issues of the conference. Important Dates: Submission of papers due by: 7 July 97 Notification of acceptance: 19 September 97 Camera ready papers & registration received by: 24 October 97 Submission of Papers Papers in English reporting original and unpublished research results and experience are solicited. Electronic submission of papers via e-mail in postscript or Microsoft Word for Windows format directly to the General Chair is acceptable and encouraged for the refereeing process. If not submitting an electronic version, please submit three hard copy originals to the General Chair. Papers for refereeing purposes must be received at the ICCIMA 98 secretariat no later than 7 July 1997. Notification of acceptance will be mailed by 19 September 1997. Page Limits Papers for refereeing should be double-spaced and must include an abstract of 100-150 words with up to six keywords. The accepted papers will need to be received at the ICCIMA 98 secretariat by 24 October 1997 in camera ready format. A final preparation format for the camera-ready papers will be provided upon notification of acceptance. Camera ready papers exceeding 6 pages (including abstract, all text, figures, tables and references etc.) will be charged an extra fee per page in excess of the normal registration fee. Evaluation Process All submissions will be refereed based on the following criteria by two reviewers with appropriate background: originality, significance, contribution to the area of research, technical quality, relevance to ICCIMA 98 topics, and clarity of presentation. Referees' reports will be provided to all authors.
Check List Prospective authors should check that the following items are attached and guidelines followed while submitting the papers for refereeing purposes. * The paper and its title page should not contain the name(s) of the author(s), or their affiliation * The paper should have attached a covering page containing the following information: -title of the paper -author name(s), affiliation, mail and e-mail addresses, phone and fax numbers -Conference topic area -up to six keywords * The name, e-mail, phone, fax and postal address of the contact person should be attached to the submission Visits and Social Events Industrial and sightseeing visits will be arranged for the delegates and guests. A separate program will be arranged for companions during the conference. General Chair: Henry Selvaraj Gippsland School of Computing & Information Technology Monash University, Churchill, VIC, Australia 3842 Henry.Selvaraj at fcit.monash.edu.au Phone: +61 3 9902 6665 Fax: +61 3 9902 6842 International Programme Committee: Abdul Sattar, Griffith University, Australia Andre de Carvalho, University of Sao Paulo, Brazil Bob Bignall, Monash University, Australia Brijesh Verma, Griffith University, Australia (Programme Chair) Dinesh Patel, Surrey University, UK Henry Selvaraj, Monash University, Australia Hyunsoo Lee, University of Yonsei, Korea Jan Mulawka, Warsaw University of Technology, Poland Jong-Hwan Kim, Korea Advanced Institute of Science & Technology, Korea Lech Jozwiak, Eindhoven Univ. of Tech, Netherlands Margaret Marek-Sadowska, University of California, USA Marek Perkowski, Portland State University, USA Michael Bove, MIT Media Laboratory, USA Mikio Takagi, University of Tokyo, Japan Nagarajan Ramesh, Tencor Instruments, USA Ramana Reddy, West Virginia University, USA Regu Subramanian, Nanyang Tech University, Singapore Sargur Srihari, State University of New York, USA Shyam Kapur, James Cook University, Australia Sourav Kundu, Kanazawa University, Japan S.
Srinivasan, IIT, Madras, India Subhash Wadhwa, IIT, Delhi, India Tadeusz Luba, Warsaw University of Technology, Poland Vishy Karri, University of Tasmania, Australia Xin Yao, University of New South Wales, Australia International Liaison Asian Liaison: Regu Subramanian, Network Technology Research Centre, Nanyang Technological University, Singapore U.S. Liaison: Marek Perkowski, Portland State University, USA European Liaison: Tadeusz Luba, Warsaw University of Technology, Poland Organising Committee: Bob Bignall, Monash University, Australia Baikunth Nath, Monash University, Australia Vishy Karri, University of Tasmania, Australia Syed M. Rahman, Monash University, Australia Bala Srinivasan, Monash University,Australia Cheryl Brickell, Monash University, Australia Andy Flitman, Monash University, Australia Lindsay Smith, Monash University, Australia Further Information: Conference Email : iccima98 at fcit.monash.edu.au Conference WWW Page: http://www-gscit.fcit.monash.edu.au/~iccima98 From esann at dice.ucl.ac.be Tue May 6 07:43:02 1997 From: esann at dice.ucl.ac.be (esann@dice.ucl.ac.be) Date: Tue, 6 May 1997 13:43:02 +0200 Subject: ESANN'97 proceedings available Message-ID: <199705061136.NAA09184@ns1.dice.ucl.ac.be> The following proceedings are available: --------------------------------------------------- | ESANN'97 | | European Symposium | | on Artificial Neural Networks | | | | Bruges - April 16-17-18, 1997 | --------------------------------------------------- ESANN'97 proceedings D facto publications (Belgium) ISBN 2-9600049-7-3, 362 pages Price: BEF 2000 Instructions to obtain these proceedings and the table of contents are available on the ESANN Web server: http://www.dice.ucl.ac.be/neural-nets/esann/ You may also contact directly the publisher: D facto publications 45 rue Masui B-1000 Brussels Belgium Tel: + 32 2 203 43 63 Fax: + 32 2 203 42 94 The previous ESANN proceedings are also available: - ESANN'96 proceedings ISBN 2-9600049-6-5, 340 pages Price: 
BEF 2000 - ESANN'95 proceedings ISBN 2-9600049-3-0, 382 pages Price: BEF 2000 - ESANN'94 proceedings ISBN 2-9600049-1-4, 287 pages Price: BEF 1500 - ESANN'93 proceedings ISBN 2-9600049-0-6, 243 pages Price: BEF 1500 Please add BEF 500 to any order for p. & p. _____________________________ _____________________________ D facto publications - Michel Verleysen conference services Univ. Cath. de Louvain - DICE 45 rue Masui 3, pl. du Levant 1000 Brussels B-1348 Louvain-la-Neuve Belgium Belgium tel: +32 2 203 43 63 tel: +32 10 47 25 51 fax: +32 2 203 42 94 fax: +32 10 47 25 98 esann at dice.ucl.ac.be verleysen at dice.ucl.ac.be http://www.dice.ucl.ac.be/neural-nets/esann _____________________________ _____________________________ From khosla at latcs1.cs.latrobe.edu.au Fri May 9 21:31:04 1997 From: khosla at latcs1.cs.latrobe.edu.au (khosla@latcs1.cs.latrobe.edu.au) Date: Sat, 10 May 1997 11:31:04 +1000 (AEST) Subject: Book on Engineering Intelligent Hybrid Multi-Agent Systems Message-ID: <199705100131.LAA09241@ipc6.cs.latrobe.edu.au> Please accept my sincere apologies if you receive multiple copies of this posting. *** BOOK ANNOUNCEMENT *** ENGINEERING INTELLIGENT HYBRID MULTI-AGENT SYSTEMS by Rajiv Khosla and Tharam Dillon This book is about building intelligent hybrid systems, problem solving, and software modeling. It is relevant to practitioners and researchers in the areas of intelligent hybrid systems, control systems, multi-agent systems, knowledge discovery and data mining, software engineering, and enterprise-wide systems modeling. The book in many ways is a synergy of all these areas. The book can also be used as a text or reference book for postgraduate students in intelligent hybrid systems, software engineering, and system modeling. On the intelligent hybrid systems front, the book covers applications and design concepts related to fusion systems, transformation systems and combination systems.
It describes industrial applications in these areas involving hybrid configurations of knowledge based systems, case-based reasoning, fuzzy systems, artificial neural networks, and genetic algorithms. On the problem solving front, the book describes an architectural theory for engineering intelligent associative hybrid multi-agent systems. The architectural theory is described at the task structure level and the computational level. From an organizational context the problem solving architecture is not only relevant at the knowledge engineering layer for developing knowledge agents but also at the information engineering layer for developing information agents. On the software modeling front, the book describes the role of objects, agents and problem solving in knowledge engineering, information engineering, data engineering, and software modeling of intelligent hybrid systems. Based on the concepts developed in the book, an enterprise-wide systems modeling framework is described to facilitate forward and backward integration of systems developed in the knowledge, information, and data engineering layers of an organization. In the modeling process, agent oriented analysis, design, and reuse aspects of software engineering are also discussed. The book consists of four parts: Part I: introduces various methodologies and their hybrid applications in the industry. Part II: describes a multi-agent architectural theory of associative intelligent hybrid systems at the task structure level and the computational level. It covers various aspects related to knowledge modeling of hybrid systems. Part III: describes the software engineering aspects of the architecture. It does that by describing a real-time alarm processing application of the architecture. 
Part IV: takes a broader view of the various concepts and theories developed in Parts II and III of the book in terms of enterprise-wide systems modeling, multi-agent systems, control systems, and software engineering and reuse. Part I is described through chapters 1, 2, 3, 4 and 5. Part II is described through chapters 6, 7, and 8. Part III is described through chapters 9, 10, 11, and 12. Part IV is described through chapters 13 and 14. ------------------------------------------------------------------------------- Summary of Table of Contents PART I: Methodologies and Applications Chapter 1. Why Intelligent Hybrid Systems 1.1 Introduction 1.2 Evolution of Hybrid Systems 1.3 Classes of Hybrid Systems 1.4 Summary Chapter References Chapter 2. Methodologies 2.1 Introduction 2.2 Expert Systems 2.3 Artificial Neural Networks 2.4 Fuzzy Systems 2.5 Genetic Algorithms 2.6 Knowledge Discovery and Data Mining 2.7 Object-Oriented Methodology 2.8 Agents and Agent Architectures 2.9 Summary Chapter References Chapter 3. Intelligent Fusion and Transformation Systems 3.1 Introduction 3.2 Fusion and Transformation 3.3 Neural Network Based Neuro-Symbolic Fusion and Transformation Systems 3.4 Neural Network Based Neuro-Fuzzy Fusion and Transformation Systems 3.5 Recapitulation 3.6 Genetic Algorithms Based Fusion and Transformation Systems 3.7 Summary Chapter References Chapter 4. Intelligent Combination Systems 4.1 Introduction 4.2 Intelligent Combination Approaches 4.3 Neuro-Symbolic Combination Systems 4.4 Symbolic-Genetic Combination Scheduling System 4.5 Neuro-Fuzzy Combination Systems 4.6 Neuro-Fuzzy-Case Combination System 4.7 Combination Approach Based Intelligent Hybrid Control Applications 4.8 Summary Chapter References Chapter 5.
Knowledge Discovery, Data Mining and Hybrid Systems 5.1 Introduction 5.2 KDD Process 5.3 KDD Application in Forecasting 5.4 Financial Trading Application 5.5 Learning Rules & Knowledge Hierarchies in the LED Digit Domain 5.6 Rule Extraction in Computer Network Diagnosis Application 5.7 Summary Chapter References ------------------------------------------------------------------------------ PART II: Problem Solving and Architectural Theory Chapter 6. Association Systems - Task Structure Level Associative Hybrid Architecture 6.1 Introduction 6.2 Various Perspectives Characterizing Problem Solving 6.3 Task Structure Level Architecture 6.4 Some Observations 6.5 Summary Chapter References Chapter 7. Intelligent Multi-Agent Hybrid Computational Architecture - Part I 7.1 Introduction 7.2 Object-Oriented Model 7.3 Agent Model 7.4 Distributed Operating System Process Model 7.5 Computational Level Intelligent Multi-Agent Hybrid Distributed Architecture (IMAHDA) 7.6 Agent Building Blocks of IMAHDA 7.7 Summary Chapter References Chapter 8. Intelligent Multi-Agent Hybrid Computational Architecture - Part II 8.1 Introduction 8.2 Communication in IMAHDA 8.3 Concept Learning in IMAHDA 8.4 Underlying Training Problems with Neural Networks 8.5 Learning and IMAHDA 8.6 Learning Knowledge in IMAHDA 8.7 Learning Strategy in IMAHDA 8.8 Dynamic Analysis of IMAHDA 8.9 Comprehensive View of IMAHDA's Agents 8.10 Emergent Characteristics of IMAHDA 8.11 Summary Chapter References ----------------------------------------------------------------------- PART III: Software Engineering Aspects Chapter 9. Alarm Processing - An Application of IMAHDA 9.1 Introduction 9.2 Characteristics of the Problem 9.3 Survey of Existing Methods 9.4 IMAHDA and Alarm Processing 9.5 Application of IMAHDA 9.6 Summary Chapter References Chapter 10.
Agent Oriented Analysis and Design of the RTAPS - Part I 10.1 Introduction 10.2 Agent Oriented Analysis (AOA) 10.3 AOA of the RTAPS 10.4 Summary Chapter References Chapter 11. Agent Oriented Analysis and Design of the RTAPS - Part II 11.1 Introduction 11.2 Agent Oriented Analysis Continued 11.3 Agent Oriented Design of the RTAPS 11.4 Emergent Characteristics of the RTAPS Agents 11.5 Summary Chapter References Chapter 12. RTAPS Implementation 12.1 Introduction 12.2 IMAHDA Related Issues 12.3 Training of Neural Networks in RTAPS 12.4 Power System Aspects of the RTAPS 12.5 Scalability and Cost Effectiveness 12.6 Summary Chapter References ----------------------------------------------------------------------- PART IV: Software Modeling Chapter 13. From Data Repositories to Knowledge Repositories - Intelligent Organizations 13.1 Introduction 13.2 Information Systems and Organizational Levels 13.3 Characteristics of Information Systems 13.4 Information Systems and Knowledge Systems 13.5 IMAHDA and Organizational Knowledge Systems 13.6 Application of IMAHDA in Sales & Marketing Function 13.7 Unified Approach to Enterprise-Wide System Modeling 13.8 Summary Chapter References Chapter 14. IMAHDA Revisited 14.1 Introduction 14.2 IMAHDA and Problem Solving 14.3 IMAHDA and Hybrid Systems 14.4 IMAHDA and Control Systems 14.5 IMAHDA and Multi-Agent Systems 14.6 IMAHDA and Software Engineering 14.7 IMAHDA and Enterprise-wide Systems Modeling Chapter References Appendices Index The book consists of 412 pages and is being published in the USA by Kluwer Academic Publishers.
For ordering and other information, please contact Alexander Greene Publisher Kluwer Academic Publishers 101 Philip Drive Assinippi Park Norwell, MA 02061 U.S.A Phone: +1.617.871.6600 Fax: +1.617.871.6528 E-Mail: agreene at wkap.com -------------------------------------------------------------------- Dr Rajiv Khosla School of Computer Science and Computer Engineering La Trobe University Melbourne, Victoria - 3083 Australia Phone: +61.3.94793034 Fax: +61.3.94793060 E-Mail: khosla at cs.latrobe.edu.au From imlm at tuck.cs.fit.edu Sat May 10 16:29:41 1997 From: imlm at tuck.cs.fit.edu (IMLM Workshop (pkc)) Date: Sat, 10 May 1997 15:29:41 -0500 Subject: CFP: MLJ special issue on IMLM Message-ID: <199705102029.PAA01437@tuck.cs.fit.edu> Dear colleagues, Here is a CFP for the Machine Learning Journal special issue on IMLM. Submission is due on Oct 1st, 97. Hope you can submit. Thanks. Phil, Sal, and Dave ------ CALL FOR PAPERS Machine Learning Journal Special Issue on Integrating Multiple Learned Models for Improving and Scaling Machine Learning Algorithms Most modern Machine Learning, Statistics and KDD techniques use a single model or learning algorithm at a time, or at most select one model from a set of candidate models. Recently, however, there has been considerable interest in techniques that integrate the collective predictions of a set of models in some principled fashion. With such techniques, the predictive accuracy and/or the training efficiency of the overall system can often be improved, since one can "mix and match" among the relative strengths of the models being combined. Any aspect of integrating multiple models is appropriate for the special issue. However, we intend the focus of the special issue to be on the issues of improving prediction accuracy and improving training efficiency in the context of large databases. Submissions are sought in, but not limited to, the following topics: 1) Techniques that generate and/or integrate multiple learned models.
Examples are schemes that generate and combine models by
* using different training data distributions (in particular by training over different partitions of the data)
* using different sampling techniques to generate different partitions
* using different output classification schemes (for example using output codes)
* using different hyperparameters or training heuristics (primarily as a tool for generating multiple models)
2) Systems and architectures to implement such strategies. For example,
* parallel and distributed multiple learning systems
* multi-agent learning over inherently distributed data
3) Techniques that analyze the integration of multiple learned models for
* selecting/pruning models
* estimating the overall accuracy
* comparing different integration methods
* tradeoff of accuracy and simplicity/comprehensibility
Schedule: October 1: Deadline for submissions December 15: Deadline for getting decisions back to authors March 15: Deadline for authors to submit final versions August 1998: Publication
Submission Guidelines:
1) Manuscripts should conform to the formatting instructions in: http://www.cs.orst.edu/~tgd/mlj/info-for-authors.html The first author will be the primary contact unless otherwise stated.
2) Authors should send 5 copies of the manuscript to: Karen Cullen Machine Learning Editorial Office Attn: Special Issue on IMLM Kluwer Academic Press 101 Philip Drive Assinippi Park Norwell, MA 02061 617-871-6300 617-871-6528 (fax) kcullen at wkap.com and one copy to: Philip Chan MLJ Special Issue on IMLM Computer Science Florida Institute of Technology 150 W. University Blvd. Melbourne, FL 32901 407-768-8000 x7280 (x8062) (407-674-7280/8062 after 6/1/97) 407-984-8461 (fax)
3) Please also send an ASCII title page (title, authors, email, abstract, and keywords) and a postscript version of the manuscript to imlm at cs.fit.edu. 
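As a concrete illustration of topic (1), the following minimal sketch trains one simple model per disjoint partition of the data and combines their predictions by unweighted majority vote. It is not from the call; the toy 1-D dataset, the nearest-centroid "model", and all names are invented for illustration.

```python
import random

random.seed(1)

# Toy 1-D two-class data: class 0 clustered near 0.0, class 1 near 1.0.
data = [(random.gauss(0.0, 0.2), 0) for _ in range(30)] + \
       [(random.gauss(1.0, 0.2), 1) for _ in range(30)]
random.shuffle(data)

# Split the data into three disjoint partitions; train one model per partition.
partitions = [data[i::3] for i in range(3)]

def fit_centroids(part):
    """Nearest-centroid 'model': the mean of each class's points."""
    return {label: sum(x for x, y in part if y == label) /
                   sum(1 for _, y in part if y == label)
            for label in (0, 1)}

def predict(means, x):
    """Predict the class whose centroid is closest to x."""
    return min(means, key=lambda label: abs(x - means[label]))

models = [fit_centroids(p) for p in partitions]

def vote(x):
    """Combine the partition-trained models by unweighted majority vote."""
    votes = [predict(m, x) for m in models]
    return max(set(votes), key=votes.count)

print(vote(-0.1), vote(1.1))  # a point near 0 and a point near 1
```

The same skeleton extends to the other schemes listed above: weighting the votes, stacking a combiner model on top of the base predictions, or varying the sampling rather than the partitioning.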
General Inquiries: Please address general inquiries to: imlm at cs.fit.edu Up-to-date information is maintained on WWW at: http://www.cs.fit.edu/~imlm/ Co-Editors: Philip Chan, Florida Institute of Technology pkc at cs.fit.edu Salvatore Stolfo, Columbia University sal at cs.columbia.edu David Wolpert, IBM Almaden Research Center dhw at almaden.ibm.com From school at cogs.nbu.acad.bg Sun May 11 09:22:52 1997 From: school at cogs.nbu.acad.bg (CogSci Summer School) Date: Sun, 11 May 1997 16:22:52 +0300 Subject: CogSci97 deadline approaches Message-ID: 4th International Summer School in Cognitive Science Sofia, July 14 - 26, 1997 Call for Papers and School Brochure The Summer School features introductory and advanced courses in Cognitive Science, participant symposia, panel discussions, student sessions, and intensive informal discussions. Participants will include university teachers and researchers, graduate and senior undergraduate students. International Advisory Board Elizabeth BATES (University of California at San Diego, USA) Amedeo CAPPELLI (CNR, Pisa, Italy) Cristiano CASTELFRANCHI (CNR, Roma, Italy) Daniel DENNETT (Tufts University, Medford, Massachusetts, USA) Ennio De RENZI (University of Modena, Italy) Charles DE WEERT (University of Nijmegen, Holland) Christian FREKSA (Hamburg University, Germany) Dedre GENTNER (Northwestern University, Evanston, Illinois, USA) Christopher HABEL (Hamburg University, Germany) Joachim HOHNSBEIN (Dortmund University, Germany) Douglas HOFSTADTER (Indiana University, Bloomington, Indiana, USA) Keith HOLYOAK (University of California at Los Angeles, USA) Mark KEANE (Trinity College, Dublin, Ireland) Alan LESGOLD (University of Pittsburgh, Pennsylvania, USA) Willem LEVELT (Max-Planck Institute of Psycholinguistics, Nijmegen, Holland) David RUMELHART (Stanford University, California, USA) Richard SHIFFRIN (Indiana University, Bloomington, Indiana, USA) Paul SMOLENSKY (University of Colorado, Boulder, USA) Chris THORNTON 
(University of Sussex, Brighton, England) Carlo UMILTA' (University of Padova, Italy) Eran ZAIDEL (University of California at Los Angeles, USA) Courses Dynamics of Change: Lessons from Human Development - Linda Smith (Indiana University, USA) Ecological Approaches to Human Memory - William Hirst (New School for Social Research, USA) Cognitive Approaches to Syntax - Robert Van Valin (State University of New York at Buffalo, USA) Culture and Cognition - Naomi Quinn (Duke University, USA) Spatial Attention - Carlo Umilta' (University of Padova, Italy) Cognitive Modeling in ACT-R - Werner Tack (University of the Saarland, Germany) Spatial Concepts and Spatial Representation - Emile van der Zee (Hamburg University) Brain Imaging Techniques for Cognitive Neurosciences - Joachim Hohnsbein (University of Dortmund) Participant Symposia Participants are invited to submit papers reporting completed research which will be presented (30 min) at the participant symposia. Authors should send full papers (8 single-spaced pages) in triplicate or electronically (postscript, RTF, MS Word or plain ASCII) by May 15. Selected papers will be published in the School's Proceedings. Only papers presented at the School will be eligible for publication. Student Session Graduate students in Cognitive Science are invited to present their work at the student session. Research in progress as well as research plans and proposals for M.Sc. and Ph.D. theses will be discussed at the student session. Papers will not be published in the School's Proceedings. 
Panel Discussions Cognitive Science in the 21st century Cognition in Context: Social, Cultural, Physical, Developmental Brain, Body, Environment, and Cognition Dynamics of Cognition: Short-Term and Long-Term Dynamics Local Organizers New Bulgarian University, Bulgarian Academy of Sciences, Bulgarian Cognitive Science Society Sponsors TEMPUS SJEP 07272/94 Local Organizing Committee Boicho Kokinov - School Director, Elena Andonova, Gergana Yancheva, Iliana Haralanova Timetable Registration Form: as soon as possible Deadline for paper submission: May 15 Notification of acceptance: June 1 Early registration: June 5 Arrival date and on-site registration July 13 Summer School July 14-25 Excursion July 20 Departure date July 26 Paper submission to: Boicho Kokinov Cognitive Science Department New Bulgarian University 21, Montevideo Str. Sofia 1635, Bulgaria e-mail: school at cogs.nbu.acad.bg Send your Registration Form to: e-mail: school at cogs.nbu.acad.bg (If you don't receive an acknowledgement within 3 days, send a message to kokinov at bgearn.acad.bg) From koza at CS.Stanford.EDU Sat May 10 16:09:18 1997 From: koza at CS.Stanford.EDU (John R. 
Koza) Date: Sat, 10 May 1997 13:09:18 -0700 (PDT) Subject: GP-97 Revised Call for Participation Message-ID: <199705102009.NAA26523@Sunburn.Stanford.EDU> CALL FOR PARTICIPATION Genetic Programming 1997 Conference (GP-97) July 13 - 16 (Sunday - Wednesday), 1997 Fairchild Auditorium - Stanford University - Stanford, California ----------------------------------------------------------------------- In cooperation with American Association for Artificial Intelligence (AAAI), Association for Computing Machinery (ACM), SIGART, and Society for Industrial and Applied Mathematics (SIAM) ----------------------------------------------------------------------- WWW FOR GP-97: http://www-cs-faculty.stanford.edu/~koza/gp97.html ----------------------------------------------------------------------- NOTE: You are urged to make your housing arrangements as early as possible since convenient hotel locations are limited. Also, if you are driving to the Stanford campus, please be aware of parking lot construction in the area of Fairchild Auditorium and allow a little extra time (particularly for the first Monday session) to find a parking place. ----------------------------------------------------------------------- Genetic programming is an automatic programming technique for evolving computer programs that solve (or approximately solve) problems. Starting with a primordial ooze of thousands of randomly created computer programs, a population of programs is progressively evolved over many generations using the Darwinian principle of survival of the fittest, a sexual recombination operation, and occasional mutation. The first annual genetic programming conference in 1996 featured 15 tutorials, 2 invited speakers, 3 parallel tracks, 73 papers and 17 poster papers in the proceedings book, 27 late-breaking papers in a separate book distributed to conference attendees, and 288 attendees. 
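The evolutionary loop described above (random initial programs, fitness-driven selection, subtree crossover, occasional mutation) can be sketched minimally with a symbolic-regression toy. This is an illustrative sketch only, not code from the conference; the target function, operator set, and elitism step are all invented for the example.

```python
import random

random.seed(0)

FUNCS = ('+', '*')                  # internal nodes of a program tree
TERMS = ('x', 1.0)                  # leaves
TARGET = lambda x: x * x + x        # function the programs must rediscover
CASES = [i / 2.0 for i in range(-4, 5)]

def rand_tree(depth=3):
    """A randomly created program: nested (op, left, right) tuples."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMS)
    return (random.choice(FUNCS), rand_tree(depth - 1), rand_tree(depth - 1))

def evaluate(t, x):
    if t == 'x':
        return x
    if not isinstance(t, tuple):
        return t
    op, a, b = t
    return evaluate(a, x) + evaluate(b, x) if op == '+' else evaluate(a, x) * evaluate(b, x)

def fitness(t):
    """Total absolute error over the fitness cases (0 is a perfect program)."""
    return sum(abs(evaluate(t, x) - TARGET(x)) for x in CASES)

def random_subtree(t):
    while isinstance(t, tuple) and random.random() < 0.7:
        t = random.choice(t[1:])
    return t

def crossover(a, b):
    """Sexual recombination: splice a random subtree of b into a."""
    if not isinstance(a, tuple) or random.random() < 0.3:
        return random_subtree(b)
    op, l, r = a
    return (op, crossover(l, b), r) if random.random() < 0.5 else (op, l, crossover(r, b))

def tournament(pop, k=3):
    """Survival of the fittest: best of k randomly drawn programs."""
    return min(random.sample(pop, k), key=fitness)

pop = [rand_tree() for _ in range(40)]
initial_best = min(fitness(t) for t in pop)
for gen in range(15):
    elite = min(pop, key=fitness)        # keep the best program unchanged
    children = []
    for _ in range(len(pop) - 1):
        child = crossover(tournament(pop), tournament(pop))
        if random.random() < 0.2:        # occasional mutation: cross with a fresh tree
            child = crossover(child, rand_tree(2))
        children.append(child)
    pop = [elite] + children
final_best = min(fitness(t) for t in pop)
print(initial_best, '->', final_best)
```

Because the elite program is carried into every generation, the best fitness in the population can never worsen, which keeps even this tiny sketch well-behaved.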
A description of GP-96 appears in the October 1996 issue of Scientific American (http://www.sciam.com/WEB/1096issue/1096techbus3.html). This second annual conference in 1997 reflects the rapid growth of this field, in which over 600 technical papers have been published since 1992. For the August 5, 1996 E. E. Times article on the GP-96 conference and the August 12, 1996 E. E. Times article on John Holland's invited speech at GP-96, go to http://www.techweb.com/search/search.html There will be 36 long, 33 short, and 15 poster papers at the Second Annual Genetic Programming Conference to be held on July 13-16 (Sunday - Wednesday), 1997 at Stanford University. In addition, there will be late-breaking papers (published in a separate book in mid-June after the June 11 deadline for late-breaking papers). Topics include, but are not limited to, applications of genetic programming, theoretical foundations of genetic programming, implementation issues, technique extensions, cellular encoding, evolvable hardware, evolvable machine language programs, automated evolution of program architecture, evolution and use of mental models, automatic programming of multi-agent strategies, distributed artificial intelligence, auto-parallelization of algorithms, automated circuit synthesis, automatic programming of cellular automata, induction, system identification, control, automated design, data and image compression, image analysis, pattern recognition, molecular biology applications, grammar induction, and parallelization. Papers describing recent developments are also solicited in the following additional areas: genetic algorithms, classifier systems, evolutionary programming and evolution strategies, artificial life and evolutionary robotics, DNA computing, and evolvable hardware. 
----------------------------------------------------------------------- INVITED SPEAKERS: - Ellen Goldberg, President, Santa Fe Institute - Susumu Ohno, Ben Horowitz Chair of Distinguished Scientist in Theoretical Biology, Beckman Research Institute - David B. Fogel, Natural Selection Inc. and Editor-In-Chief of the IEEE Transactions on Evolutionary Computation ----------------------------------------------------------------------- SPECIAL PROGRAM CHAIRS The main focus of the conference (and most of the papers) will be on genetic programming. In addition, papers describing recent developments in the closely related areas will be reviewed and selected by special program committees appointed and supervised by the following special program chairs. --- Genetic Algorithms: Kalyanmoy Deb, Indian Inst of Tech - Kanpur, India --- Classifier Systems: Rick L. Riolo, University of Michigan --- Evolutionary Programming and Evolution Strategies: David B. Fogel, Natural Selection Inc, San Diego --- Artificial Life and Evolutionary Robotics: Marco Dorigo, Universite Libre de Bruxelles --- DNA Computing: Max Garzon, University of Memphis --- Evolvable Hardware: Hitoshi Iba, Electrotechnical Laboratory, Japan ----------------------------------------------------------------------- 20 TUTORIALS AT GP-97 (Note: Slight Revisions from earlier listing) Sunday July 13 - 9:15 AM - 11:30 AM --- Genetic Algorithms - David E. 
Goldberg, University of Illinois at Urbana-Champaign --- Evolvable Hardware - Tetsuya Higuchi - Electrotechnical Laboratory, Tsukuba, Japan --- Program Growth Control in Genetic Programming - Byoung-Tak Zhang, Konkuk University, Seoul, South Korea and Hitoshi Iba, Electrotechnical Laboratory, Tsukuba, Japan --- Introduction to Genetic Programming - John Koza, Stanford University ----------------------------------------------------------------------- Sunday July 13 - 1:00 PM - 3:15 PM --- Evolutionary Algorithms for Computer-Aided Design of Integrated Circuits - Rolf Drechsler - Albert-Ludwigs-University, Freiburg, Germany --- Self-Replicating Systems in Cellular Space Models - Jason Lohn - Stanford University --- Neural Networks - Bernard Widrow - Stanford University --- Advanced Genetic Programming - John Koza, Stanford University ----------------------------------------------------------------------- Sunday July 13 - 3:45 PM - 6 PM --- Evolutionary Programming and Evolution Strategies - David Fogel, University of California, San Diego --- Genetic Programming Representations - Astro Teller - Carnegie Mellon University --- Design of Electrical Circuits using Genetic Programming - David Andre - University of California, Berkeley and Forrest H Bennett III - Stanford University --- Genetic Programming with Linear Genomes - Wolfgang Banzhaf, University of Dortmund, Germany ----------------------------------------------------------------------- Tuesday July 15 - 3:25 PM - 5:40 PM --- Computational Learning Theory - Vasant Honavar - Iowa State University --- Machine Learning - Pat Langley, Institute for the Study of Learning and Expertise --- Molecular Biology for Computer Scientists - Russ B. Altman, Stanford University --- Simulated Evolution of Models - Janine Graf - Inquire America Corp ----------------------------------------------------------------------- Tuesday July 15 - 7:30 PM - 9:30 PM --- DNA Computing - Russell Deaton and Randy C. 
Murphy - University of Memphis --- Evolutionary Algorithms with Mathematica - Christian Jacobs --- Cellular Programming: Evolution Of Parallel Cellular Machines - Moshe Sipper - Swiss Federal Institute of Technology, Lausanne --- Machine Language Genetic Programming - Peter Nordin - DaCapo AB, Sweden ----------------------------------------------------------------------- GENERAL CHAIR: John Koza, Stanford University PUBLICITY CHAIR: Patrick Tufts, Brandeis University EXECUTIVE COMMITTEE: David Andre, Forrest H Bennett III, Jason Lohn ----------------------------------------------------------------------- FOR MORE INFORMATION ABOUT THE GP-97 CONFERENCE: See the GP-97 home page on the World Wide Web: http://www-cs-faculty.stanford.edu/~koza/gp97.html E-MAIL: gp at aaai.org. PHONE: 415-328-3123. FAX: 415-321-4457. The conference is operated by Genetic Programming Conferences, Inc. (a California not-for-profit corporation). ----------------------------------------------------------------------- FOR MORE INFORMATION ABOUT GENETIC PROGRAMMING IN GENERAL: http://www-cs-faculty.stanford.edu/~koza/. ----------------------------------------------------------------------- Hotel information: Numerous local hotels within a short distance of Stanford University are listed at the GP-97 home page. Because of other events held in the area during the summer, attendees are urged to make their arrangements for accommodations early. For your convenience, AAAI has reserved a block of rooms at the Holiday Inn-Palo Alto Hotel, 625 El Camino Real, Palo Alto, CA 94301, Phone: 800-874-3516 or 415-328-2800, FAX: 415-327-7362. Make your reservations directly with the Holiday Inn before June 28, 1997 for the GP-97 rate of $99 single and $109 double. In addition, AAAI has reserved a block of rooms at the Stanford Terrace Inn, 531 Stanford Avenue, Palo Alto, CA 94306, Phone: 800-729-0332 or 415-857-0333, FAX: 415-857-0343. 
Make your reservations directly with the Stanford Terrace Inn before June 11, 1997. There is a free Stanford University shuttle (called Marguerite) that stops near both of these hotels (and various other hotels, the train station, and Palo Alto locations). ----------------------------------------------------------------------- University Housing information: A limited number of spaces are available at Stanford University housing on a first-come-first-served basis. The final deadline for University housing applications is June 13, 1997. See the GP-97 WWW home page for a university housing application form. ----------------------------------------------------------------------- TRAVEL INFORMATION: Stanford University is near Palo Alto in Northern California and is about 40 miles south of San Francisco. Stanford is about 25 miles south of the San Francisco International Airport and about 25 miles north of San Jose International Airport. Oakland airport is about 45 miles away. Conventions in America has arranged special GP-97 airline and car rental discounts. For travel between July 10 - 20, 1997, American Airlines can save you 5% on lowest applicable fares or 10% off lowest unrestricted coach fares, with 7-day advance purchases. Some restrictions apply. Hertz is offering special low conference rates with unlimited free mileage. Please contact Conventions in America concerning "Group #428" at 1-800-929-4242; or phone 619-678-3600; or FAX 619-678-3699; or e-mail scltravel at cgl.com. If you call American Airlines direct at 800-433-1790, ask for "Index #S9485." If you call Hertz direct at 800-654-2240, ask for "CV #24250." See the GP-97 WWW home page for additional details. 
------------------------------------------------------------------------ SAN FRANCISCO BAY AND SILICON VALLEY TOURIST INFORMATION: Try the Stanford University home page at http://www.stanford.edu/, the Hyperion Guide at http://www.hyperion.com/ba/sfbay.html; the Palo Alto weekly at http://www.service.com/PAW/home.html; the California Virtual Tourist at http://www.research.digital.com/SRC/virtual-tourist/California.html; and the Yahoo Guide of San Francisco at http://www.yahoo.com/Regional_Information/States/California/San_Francisco. ----------------------------------------------------------------------- CONTEMPORANEOUS CONFERENCES IN CALIFORNIA AND ELSEWHERE: GP-97 is concurrent with the 45th Anniversary meeting of the Society for Industrial and Applied Mathematics (SIAM) on July 14-18, 1997 at Stanford University (http://www.siam.org). GP-97 comes just after the IEEE International Symposium on Computational Intelligence in Robotics and Automation (CIRA-97) on July 10 - 11, 1997 in Monterey, California (90 miles from Stanford University) and the IEEE 8th International Conference on Advanced Robotics (ICAR-97) on July 5 - 9, 1997 in Monterey http://www.cs.cmu.edu/afs/cs/project/space/www/cira97/conference.html. Other non-California conferences of interest include AAAI-97 on July 27-31, 1997 in Providence, Rhode Island (http://www.aaai.org/); ICGA-97 on July 20-23, 1997 in East Lansing, Michigan (http://isl.cps.msu.edu/GA/icga97); European Artificial Life Conference on July 28-31, 1997 in Brighton, England (http://www.cogs.susx.ac.uk/ecal97/); and IJCAI-97 on August 26-29, 1997 in Nagoya, Japan (http://www.aaai.org/). ----------------------------------------------------------------------- MEMBERSHIP IN THE ACM, AAAI, or SIAM: For information about ACM membership, go to http://www.acm.org/; for SIGART, http://sigart.acm.org/; for AAAI http://www.aaai.org/; and for SIAM, http://www.siam.org. 
There is a discount on GP-97 registration fees for members of ACM, SIGART, AAAI, and SIAM. ----------------------------------------------------------------------- ADDRESSES FOR GP-97: GP-97 Conference, c/o American Association for Artificial Intelligence, 445 Burgess Drive, Menlo Park, CA 94025. PHONE: 415-328-3123. FAX: 415-321-4457. E-MAIL: gp at aaai.org. WWW FOR AAAI: http://www.aaai.org/. WWW FOR GP-97: http://www-cs-faculty.stanford.edu/~koza/gp97.html ----------------------------------------------------------------------- REGISTRATION FORM FOR GENETIC PROGRAMMING 1997 CONFERENCE July 13 - 16 (Sunday - Wednesday), 1997 at Stanford University First Name ________________ Last Name _____________ Affiliation _________________________________________ Address ____________________________________________ __________________________________________________ City _______________________ State/Province _________ Zip/Postal Code ______________ Country _______________ Daytime telephone __________________________________ E-Mail address _____________________________________ Conference registration fee includes admission to all conference sessions and events, one copy of the conference proceedings book, attendance at 5 tutorials of your choice, syllabus books for your 5 tutorials, Sunday night welcoming wine and cheese reception, Monday night conference dinner reception, one copy of a book of late-breaking papers, the conference T-shirt, 4 box lunches, and coffee breaks. Conference proceedings will be mailed to registered attendees with U.S. mailing addresses via 2-day U.S. priority mail about 1 - 2 weeks prior to the conference at no extra charge (at addressee's risk). If you are uncertain as to whether you will be at the above address at that time or DO NOT WANT your proceedings mailed to you at the above address for any other reason, your copy of the proceedings will be held for you at the conference registration desk if you check here ___. 
------------------------------------- REGISTER BY June 19 FOR LOWER REGISTRATION FEES ------------------------------------- Postmarked by June 19 Student - ACM, SIAM or AAAI Member - $245 Regular - ACM, SIAM, or AAAI Member - $445 Student - Non-member - $265 Regular - Non-member - $465 ------------------------------------- Postmarked after June 19, 1997 or on-site - Add $50 to June 19 rates ------------------------------------- Member Number: ACM # ___________ SIAM # _________ AAAI # _________ Students must send legible proof of full-time student status. ------------------------------------- Stanford Parking Permits ($6 per day - C). Number of days ___ Total $_____ ------------------------------------- Grand Total (enter appropriate amount) $ _____________ ------------------------------------- ___ Check or money order made payable to "AAAI" (in U.S. funds) ___ Mastercard ___ Visa ___ American Express Credit card number __________________________________________ Expiration Date _________ Signature ____________________________________________ ------------------------------------- T-Shirt Size: ___ small ___ medium ___ large ___ extra-large ------------------------------------- TUTORIALS: Check off a box for one tutorial from each of the 5 rows: Sunday July 13 - 9:15 AM - 11:30 AM --- Genetic Algorithms --- Evolvable Hardware --- Program Growth Control in Genetic Programming --- Introduction to Genetic Programming ------------------------------------- Sunday July 13 - 1:00 PM - 3:15 PM --- Evolutionary Algorithms for Computer-Aided Design of Integrated Circuits --- Self-Replicating Systems in Cellular Space Models --- Neural Networks --- Advanced Genetic Programming ------------------------------------- Sunday July 13 - 3:45 PM - 6 PM --- Evolutionary Programming and Evolution Strategies --- Genetic Programming Representations --- Design of Electrical Circuits using Genetic Programming --- Genetic Programming with Linear Genomes ------------------------------------- Tuesday July 15 - 3:25 PM - 
5:40 PM --- Computational Learning Theory --- Simulated Evolution of Models --- Machine Learning --- Molecular Biology for Computer Scientists ------------------------------------- Tuesday July 15 - 7:30 PM - 9:30 PM --- DNA Computing --- Evolutionary Algorithms with Mathematica --- Cellular Programming: Evolution Of Parallel Cellular Machines --- Machine Language Genetic Programming ------------------------------------- No refunds will be made; however, we will transfer your registration to a person you designate upon notification. ------------------------------------- SEND TO: GP-97 Conference, c/o American Association for Artificial Intelligence, 445 Burgess Drive, Menlo Park, CA 94025. PHONE: 415-328-3123. FAX: 415-321-4457. E-MAIL: gp at aaai.org. WWW FOR AAAI: http://www.aaai.org/. WWW FOR GP-97: http://www-cs-faculty.stanford.edu/~koza/gp97.html ----------------------------------------------------------------------- List of 84 Papers for Second Annual Genetic Programming Conference (GP-97), July 13-16, 1997, Stanford University ----------------------------------------------------------------- GENETIC PROGRAMMING Ahluwalia, Manu, Larry Bell, and Terence C. Fogarty Co-evolving Functions in Genetic Programming: A Comparison in ADF Selection Strategies Angeline, Peter J. Subtree Crossover: Building Block Engine or Macromutation? 
Ashlock, Dan GP-Automata for Dividing the Dollar Ashlock, Dan, and Charles Richter The Effect of Splitting Populations on Bidding Strategies Banzhaf, Wolfgang, Peter Nordin, and Markus Olmer Generating Adaptive Behavior for a Real Robot using Function Regression within Genetic Programming Bennett III, Forrest H A Multi-Skilled Robot that Recognizes and Responds to Different Problem Environments Bruce, Wilker Shane The Lawnmower Problem Revisited: Stack-Based Genetic Programming and Automatically Defined Functions Chen, Shu-Heng, and Chia-Hsuan Yeh Using Genetic Programming to Model Volatility in Financial Time Series Daida, Jason, Steven Ross, Jeffrey McClain, Derrick Ampy, and Michael Holczer Challenges with Verification, Repeatability, and Meaningful Comparisons in Genetic Programming Dain, Robert A. Genetic Programming For Mobile Robot Wall-Following Algorithms Deakin, Anthony G., and Derek F. Yates Economical Solutions with Genetic Programming: the Non-Hamstrung Squadcar Problem, FvM and EHP Dracopoulos, Dimitris C. Evolutionary Control of a Satellite Droste, Stefan Efficient Genetic Programming for Finding Good Generalizing Boolean Functions Eberbach, Eugene Enhancing Genetic Programming by $-calculus Esparcia-Alcazar, Anna J., and Ken Sharman Evolving Recurrent Neural Network Architectures by Genetic Programming Fernandez, Thomas, and Matthew Evett Training Period Size and Evolved Trading Systems Freitas, Alex A. 
A Genetic Programming Framework for Two Data Mining Tasks: Classification and Generalized Rule Induction Fuchs, Matthias, Dirk Fuchs, and Marc Fuchs Solving Problems of Combinatory Logic with Genetic Programming Gathercole, Chris, and Peter Ross Small Populations over Many Generations can beat Large Populations over Few Generations in Genetic Programming Gathercole, Chris, and Peter Ross Tackling the Boolean Even N Parity Problem with Genetic Programming and Limited-Error Fitness Geyer-Schulz, Andreas The Next 700 Programming Languages for Genetic Programming Gray, H. F., and R. J. Maxwell Genetic Programming for Multi-class Classification of Magnetic Resonance Spectroscopy Data Greeff, D. J., and C. Aldrich Evolution of Empirical Models for Metallurgical Process Systems Gritz, Larry, and James K. Hahn Genetic Programming Evolution of Controllers for 3-D Character Animation Harries, Kim, and Peter Smith Exploring Alternative Operators and Search Strategies in Genetic Programming Haynes, Thomas On-line Adaptation of Search via Knowledge Reuse Haynes, Thomas, and Sandip Sen Crossover Operators for Evolving A Team Hiden, Hugo, Mark Willis, Ben McKay, and Gary Montague Non-Linear And Direction Dependent Dynamic Modelling Using Genetic Programming Hooper, Dale C., Nicholas S. Flann, and Stephanie R. Fuller Recombinative Hill-Climbing: A Stronger Search Method for Genetic Programming Howley, Brian Genetic Programming and Parametric Sensitivity: a Case Study In Dynamic Control of a Two Link Manipulator Huelsbergen, Lorenz Learning Recursive Sequences via Evolution of Machine-Language Programs Iba, Hitoshi Multiple-Agent Learning for a Robot Navigation Task by Genetic Programming Jaske, Harri On code reuse in genetic programming Koza, John R., Forrest H. Bennett III, Martin A. Keane, and David Andre Evolution of a Time-Optimal Fly-To Controller Circuit using Genetic Programming Koza, John R., Forrest Bennett III, Jason Lohn, Frank Dunlap, Martin A. 
Keane, and David Andre Use of Architecture-Altering Operations to Dynamically Adapt a Three-Way Analog Source Identification Circuit to Accommodate a New Source Langdon, W. B., and R. Poli An Analysis of the MAX Problem in Genetic Programming Lensberg, Terje A Genetic Programming Experiment on Investment Behavior under Knightian Uncertainty Luke, Sean, and Lee Spector A Comparison of Crossover and Mutation in Genetic Programming Moore, Frank W., and Dr. Oscar N. Garcia A Genetic Programming Approach to Strategy Optimization in the Extended Two-Dimensional Pursuer/Evader Problem Nordin, Peter, and Wolfgang Banzhaf Genetic Reasoning Evolving Proofs with Genetic Search Park, YoungJa, and ManSuk Song Genetic Programming Approach to Sense Clustering in Natural Language Processing Paterson, Norman, and Mike Livesey Evolving caching algorithms in C by genetic programming Pelikan, Martin, Vladimir Kvasnicka, and Jiri Pospichal Read's linear codes and genetic programming Poli, Riccardo, and Stefano Cagnoni Genetic Programming with User-Driven Selection: Experiments on the Evolution of Algorithms for Image Enhancement Poli, R., and W. B. Langdon A New Schema Theory for Genetic Programming with One-point Crossover and Point Mutation Rosca, Justinian P. Analysis of Complexity Drift in Genetic Programming Ryan, Conor, and Paul Walsh The Evolution of Provable Parallel Programs Segovia, Javier, and Pedro Isasi Genetic Programming For Designing Ad Hoc Neural Network Learning Rules Sherrah, Jamie R., Robert E. Bogner, and Abdesselam Bouzerdoum The Evolutionary Pre-Processor: Automatic Feature Extraction for Supervised Classification using Genetic Programming Soule, Terence, and James A. Foster Code Size and Depth Flows in Genetic Programming Teller, Astro, and David Andre Automatically Choosing the Number of Fitness Cases: The Rational Allocation of Trials Watson, Andrew H., and Ian C. 
Parmee Steady State Genetic Programming With Constrained Complexity Crossover Winkeler, Jay F., and B. S. Manjunath Genetic Programming for Object Detection Zhang, Byoung-Tak, and Je-Gun Joung Enhancing Robustness of Genetic Programming at the Species Level Zhao, Kai and Jue Wang "Chromosome-Protein": A Representation Scheme ----------------------------------------------------------------- GENETIC ALGORITHMS Bull, Larry, and Owen Holland Evolutionary Computing in Multi-Agent Environments: Eusociality Cantu-Paz, Erick, and David E. Goldberg Modeling Idealized Bounding Cases of Parallel Genetic Algorithms Dill, Karen M., and Marek A. Perkowski Minimization of GRM Forms with a Genetic Algorithm Gockel, Nicole, Martin Keim, Rolf Drechsler, and Bernd Becker A Genetic Algorithm for Sequential Circuit Test Generation based on Symbolic Fault Simulation Kargupta, Hillol, David E. Goldberg, and Liwei Wang Extending The Class of Order-k Delineable Problems For The Gene Expression Messy Genetic Algorithm Lathrop, James I. Compression Depth and Genetic Programs Mullen, David S., and Ralph M. Butler Genetic Algorithms In Optimization of Adjacency Constrained Timber Harvest Scheduling Problems Yang, Jihoon, and Vasant Honavar Feature Subset Selection Using A Genetic Algorithm ----------------------------------------------------------------- ARTIFICIAL LIFE AND EVOLUTIONARY ROBOTICS Balakrishnan, Karthik, and Vasant Honavar Spatial Learning for Robot Localization Floreano, Dario, and Stefano Nolfi God Save the Red Queen! Competition in Co-Evolutionary Robotics Hasegawa, Yasuhisa and Toshio Fukuda Motion Generation of Two-link Brachiation Robot Maeshiro, Tetsuya, and Masayuki Kimura Genetic Code as an Evolving Organism Ray, Thomas S. Selecting Naturally for Differentiation ----------------------------------------------------------------- EVOLUTIONARY PROGRAMMING AND EVOLUTIONARY STRATEGIES Angeline, Peter J. 
An Alternative to Indexed Memory for Evolving Programs with Explicit State Representations Chellapilla, Kumar Evolutionary Programming with Tree Mutations: Evolving Computer Programs without Crossover Greenwood, Garrison W. Experimental Observation of Chaos in Evolution Strategies Longshaw, Tom Evolutionary learning of large Grammars ----------------------------------------------------------------- DNA COMPUTING Arita, Masanori, Akira Suyama, and Masami Hagiya A Heuristic Approach for Hamiltonian Path Problem with Molecules Deaton, R., M. Garzon, R. C. Murphy, D. R. Franceschetti, J. A. Rose, and S. E. Stevens Jr. Information Transfer through Hybridization Reactions in DNA based Computing Garzon, M., P. Neathery, R. Deaton, R. C. Murphy, D. R. Franceschetti, S. E. Stevens Jr. A New Metric for DNA Computing Rose, J. A., Y. Gao, M. Garzon, and R. C. Murphy DNA Implementation of Finite-State Machines ----------------------------------------------------------------- EVOLVABLE HARDWARE Drechsler, Rolf, Nicole Gockel, Elke Mackensen, and Bernd Becker BEA: Specialized Hardware for Implementation of Evolutionary Algorithms Kazimierczak, Jan An Approach to Evolvable Hardware representing the Knowledge Base in an Automatic Programming System Korkin, Michael, Hugo de Garis, Felix Gers, and Hitoshi Hemmi "CBM (CAM-Brain Machine)": A Hardware Tool which Evolves a Neural Net Module in a Fraction of a Second and Runs a Million Neuron Artificial Brain in Real Time Liu, Weixin, Masahiro Murakawa, and Tetsuya Higuchi Evolvable Hardware for On-line Adaptive Traffic Control in ATM Networks Sipper, Moshe, Eduardo Sanchez, Daniel Mange, Marco Tomassini, Andres Perez-Uribe, and Andre Stauffer The POE Model of Bio-Inspired Hardware Systems: A Short Introduction ----------------------------------------------------------------- CLASSIFIER SYSTEMS Nagasaka, Ichiro, and Toshiharu Taura Geometric Representation for Shape Generation using Classifier System Spohn, Bryan G., and Philip H. 
Crowley Complexity of Strategies and the Evolution of Cooperation Westerdale, T. H. Classifier Systems--No Wonder They Don't Work ------------------------- CITATION FOR GP-97 PROCEEDINGS: Koza, John R., Deb, Kalyanmoy, Dorigo, Marco, Fogel, David B., Garzon, Max, Iba, Hitoshi, and Riolo, Rick L. (editors). 1997. Genetic Programming 1997: Proceedings of the Second Annual Conference, July 13-16, 1997, Stanford University. San Francisco, CA: Morgan Kaufmann. From jagota at cse.ucsc.edu Mon May 12 12:27:00 1997 From: jagota at cse.ucsc.edu (Arun Jagota) Date: Mon, 12 May 1997 09:27:00 -0700 Subject: PAC learning in NNs survey Message-ID: <199705121627.JAA02458@bristlecone.cse.ucsc.edu> The following refereed paper (47 pages, 118 references) is now available, in postscript form, from the Neural Computing Surveys web site: http://www.icsi.berkeley.edu/~jagota/NCS Probabilistic Analysis of Learning in Artificial Neural Networks: The PAC Model and its Variants Martin Anthony Department of Mathematics, The London School of Economics and Political Science There are a number of mathematical approaches to the study of learning and generalization in artificial neural networks. Here we survey the `probably approximately correct' (PAC) model of learning and some of its variants. These models provide a probabilistic framework for the discussion of generalization and learning. This survey concentrates on the sample complexity questions in these models; that is, the emphasis is on how many examples should be used for training. Computational complexity considerations are briefly discussed for the basic PAC model. Throughout, the importance of the Vapnik-Chervonenkis dimension is highlighted. Particular attention is devoted to describing how the probabilistic models apply in the context of neural network learning, both for networks with binary-valued output and for networks with real-valued output. 
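[Editor's note: as a rough illustration of the sample-complexity questions the survey addresses, the sketch below computes one classical sufficient training-set size for a consistent learner with a given VC dimension (the bound of Blumer et al., 1989). The function name and the perceptron example are our own illustrative choices, not taken from the paper.]

```python
import math

def pac_sample_bound(vc_dim, epsilon, delta):
    """A sufficient number of training examples so that any hypothesis
    consistent with the sample has error at most epsilon with
    probability at least 1 - delta (Blumer et al., 1989 bound)."""
    a = (4.0 / epsilon) * math.log2(2.0 / delta)
    b = (8.0 * vc_dim / epsilon) * math.log2(13.0 / epsilon)
    return math.ceil(max(a, b))

# Example: a perceptron on 10 real inputs has VC dimension 11.
m = pac_sample_bound(11, epsilon=0.1, delta=0.05)
```

Note how the bound grows linearly in the VC dimension and roughly as 1/epsilon, which is the kind of dependence the survey makes precise.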
From sml%essex.ac.uk at seralph21.essex.ac.uk Mon May 12 12:04:08 1997 From: sml%essex.ac.uk at seralph21.essex.ac.uk (Simon Lucas) Date: Mon, 12 May 1997 17:04:08 +0100 Subject: Face Recognition with the continuous n-tuple classifier (paper available) Message-ID: <33773F78.1464@essex.ac.uk> The following paper is available: Face Recognition with the continuous n-tuple classifier S.M. Lucas Submitted to the British Machine Vision Conference-97 Face recognition is an important field of research with many potential applications for suitably efficient systems, including biometric security and searching large face databases. This paper describes a new approach to the problem based on a new type of n-tuple classifier: the continuous n-tuple system. Results indicate that the new method is faster and more accurate than previous methods reported in the literature on the widely used Olivetti Research Laboratories face database. This paper is available via my web page: http://esewww.essex.ac.uk/~sml Comments welcome, as always. Best Regards, Simon Lucas ------------------------------------------------ Dr. Simon Lucas Department of Electronic Systems Engineering University of Essex Colchester CO4 3SQ United Kingdom Tel: (+44) 1206 872935 Fax: (+44) 1206 872900 Email: sml at essex.ac.uk http://esewww.essex.ac.uk/~sml secretary: Mrs Janet George (+44) 1206 872438 ------------------------------------------------- From rod at imm.dtu.dk Mon May 12 12:13:25 1997 From: rod at imm.dtu.dk (R. Murray-Smith) Date: Mon, 12 May 1997 18:13:25 +0200 Subject: New Book: Multiple Model Approaches to Nonlinear Modelling and Control Message-ID: <337741A5.1C4D@imm.dtu.dk> New Book. Full details available at http://www.itk.ntnu.no/SINTEF/ansatte/Johansen_Tor.Arne/mmamc/mmamc_book.html Multiple Model Approaches to Modelling and Control Roderick Murray-Smith and Tor Arne Johansen (Eds.) 
---------------------------------------------------------------------- This book presents a variety of approaches which produce complex models or controllers by piecing together a number of simpler subsystems. This divide-and-conquer strategy is a long-standing and general way of coping with complexity in engineering systems, nature and human problem solving. More complex plants, advances in information technology, and tightened economic and environmental constraints in recent years have led to practising engineers being faced with modelling and control problems of increasing complexity. When confronted with such problems, there is a strong intuitive appeal in building systems which operate robustly over a wide range of operating conditions by decomposing them into a number of simpler linear modelling or control problems, even for nonlinear modelling or control problems. This appeal has been a factor in the development of increasingly popular `local' and multiple-model approaches to coping with strongly nonlinear and time-varying systems. Such local approaches are directly based on the divide-and-conquer strategy, in the sense that the core of the representation of the model or controller is a partitioning of the system's full range of operation into multiple smaller operating regimes, each of which is associated with a locally valid model or controller. This can often give a simplified and transparent nonlinear model or control representation. In addition, the local approach has computational advantages; it lends itself to adaptation and learning algorithms, and allows direct incorporation of high-level and qualitative plant knowledge into the model. These advantages have proven to be very appealing for industrial applications, and the practical, intuitively appealing nature of the framework is demonstrated in chapters describing applications of local methods to problems in the process industries, biomedical applications and autonomous systems. 
The successful application of the ideas to demanding problems is already encouraging, but creative development of the basic framework is needed to better allow the integration of human knowledge with automated learning. The underlying question is `How should we partition the system - what is `local'?'. This book presents alternative ways of bringing submodels together, which lead to varying levels of performance and insight. Some are further developed for autonomous learning of parameters from data, while others have focused on the ease with which prior knowledge can be incorporated. It is interesting to note that researchers in Control Theory, Neural Networks, Statistics, Artificial Intelligence and Fuzzy Logic have more or less independently developed very similar modelling methods, calling them Local Model Networks, Operating Regime based Models, Multiple Model Estimation and Adaptive Control, Gain Scheduled Controllers, Heterogeneous Control, Mixtures of Experts, Piecewise Models, Local Regression techniques, or Takagi-Sugeno Fuzzy Models, among other names. Each of these approaches has different merits, varying in the ease of introduction of existing knowledge, as well as the ease of model interpretation. This book attempts to outline much of the common ground between the various approaches, encouraging the transfer of ideas. Recent progress in algorithms and analysis is presented, with constructive algorithms for automated model development and control design, as well as techniques for stability analysis, model interpretation and model validation. ---------------------------------------------------------------------- Table Of Contents Preface - the book outline. The Operating Regime Approach to Nonlinear Modelling and Control Tor Arne Johansen, SINTEF, and Roderick Murray-Smith, Daimler-Benz AG Fuzzy Set Methods for Local Modelling and Identification R. Babuska and H.B. Verbruggen, Delft University of Technology Modelling of Electrically Stimulated Muscle H. 
Gollee, University of Glasgow, K.J. Hunt, Daimler-Benz AG, N. Donaldson, University College London and J. Jarvis, University of Liverpool Process Modelling Using the Functional State Approach Aarne Halme, Arto Visala and Xia-Chang Zhang, Helsinki University of Technology Markov Mixtures of Experts Marina Meila, Michael Jordan, Massachusetts Institute of Technology Active Learning with Mixture Models David Cohn, Zoubin Ghahramani and Michael Jordan, Massachusetts Institute of Technology Local Learning in Local Model Networks Roderick Murray-Smith, Daimler-Benz AG and Tor Arne Johansen, SINTEF Side-Effects of Normalising Basis Functions in Local Model Networks Robert Shorten and Roderick Murray-Smith, Daimler-Benz AG The Composition and Validation of Heterogeneous Control Laws B. Kuipers, University of Texas at Austin and K. Astrom, Lund Institute of Technology Local Laguerre Models Daniel Sbarbaro, University of Concepción Multiple Model Adaptive Control Kevin D. Schott, B. Wayne Bequette, Rensselaer Polytechnic Institute H-infinity Control of Nonlinear Processes Using Multiple Linear Models A. Banerjee, Y. Arkun, Georgia Institute of Technology, and R. Pearson and B. Ogunnaike, DuPont Synthesis of Fuzzy Control Systems Based on Linear Takagi-Sugeno Fuzzy Models J. Zhao, R. Gorez and V. Wertz, Catholic University of Louvain -------------------------------------------------------------------- Ordering Information ISBN 0-7484-0595-X The book is hardback, 350 pages, published by Taylor and Francis and costs 55.00 pounds sterling. You can order over the web (http://www.tandf.co.uk/books/borders.htm), order by e-mail (from the UK, Europe and Asia use book.orders at tandf.co.uk; for the USA use bkorders at tandfpa.com) or write, phone or fax to Taylor and Francis: The Book Ordering Department, Taylor and Francis, Rankine Road, Basingstoke, Hants RG24 8PR, UK Telephone: +44 (0) 1256 813000 Ext. 
236, Fax: +44 (0) 1256 479438 --------------------------------------------------------------------- From jbower at bbb.caltech.edu Tue May 13 22:03:32 1997 From: jbower at bbb.caltech.edu (James M. Bower) Date: Tue, 13 May 1997 18:03:32 -0800 Subject: Registration open for CNS*97 Message-ID: [A non-text attachment was scrubbed from this message by the list archive.] From bishopc at helios.aston.ac.uk Wed May 14 03:10:53 1997 From: bishopc at helios.aston.ac.uk (Prof. Chris Bishop) Date: Wed, 14 May 1997 08:10:53 +0100 Subject: Summer School: Probabilistic Graphical Models Message-ID: <18477.199705140710@sun.aston.ac.uk> --------------------------------------------------------------------------- A Newton Institute EC Summer School PROBABILISTIC GRAPHICAL MODELS 1 - 5 September 1997 Isaac Newton Institute, Cambridge, U.K. Organisers: C M Bishop (Aston) and J Whittaker (Lancaster) Probabilistic graphical models provide a very general framework for representing complex probability distributions over sets of variables. A powerful feature of the graphical model viewpoint is that it unifies many of the common techniques used in pattern recognition and machine learning including neural networks, latent variable models, probabilistic expert systems, Boltzmann machines and Bayesian belief networks. Indeed, the increasing interactions between the neural computing and graphical modelling communities have resulted in a number of powerful new ideas and techniques. The conference will include several tutorial presentations on key topics as well as advanced research talks. 
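[Editor's note: as a minimal illustration of what a probabilistic graphical model encodes, the sketch below evaluates the joint distribution of a three-variable Bayesian belief network A -> B, A -> C, in which B and C are conditionally independent given A. The graph and all probability values are invented for illustration only.]

```python
def joint(a, b, c, p_a, p_b_given_a, p_c_given_a):
    """Joint probability P(A=a, B=b, C=c) under the factorization
    P(A) * P(B|A) * P(C|A) encoded by the graph A -> B, A -> C."""
    pa = p_a if a else 1 - p_a
    pb = p_b_given_a[a] if b else 1 - p_b_given_a[a]
    pc = p_c_given_a[a] if c else 1 - p_c_given_a[a]
    return pa * pb * pc

# Illustrative conditional probability tables.
p_a = 0.3
p_b_given_a = {True: 0.9, False: 0.2}
p_c_given_a = {True: 0.7, False: 0.1}
```

The factorization is the whole point: eight joint probabilities are determined by five numbers, and the conditional independence of B and C given A is read directly off the graph.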
Provisional themes: Conditional independence; Bayesian belief networks; message propagation; latent variable models; variational techniques; mean field theory; learning and estimation; model search; EM and MCMC algorithms; axiomatic approaches; causality; decision theory; neural networks; information and coding theory; scientific applications and examples. Provisional list of speakers: C M Bishop (Aston) D J C MacKay (Cambridge) R Cowell (City) J Pearl (UCLA) A P Dawid (UCL) M D Perlman (Washington) D Geiger (Technion) M Piccioni (Aquila) E George (Texas) R Shachter (Stanford) W Gilks (Cambridge) J Q Smith (Warwick) D Heckerman (Microsoft) M Studeny (Prague) G E Hinton (Toronto) M Titterington (Glasgow) T Jaakkola (UCSC) J Whittaker (Lancaster) M I Jordan (MIT) S Lauritzen (Aalborg) B Kappen (Nijmegen) D Spiegelhalter (Cambridge) M Kearns (AT&T) S Russell (Berkeley) This instructional conference will form a component of the Newton Institute programme on Neural Networks and Machine Learning, organised by C M Bishop, D Haussler, G E Hinton, M Niranjan and L G Valiant. Further information about the programme is available via the WWW at http://www.newton.cam.ac.uk/programs/nnm.html Location and Costs: The conference will take place in the Isaac Newton Institute and accommodation for participants will be provided at Wolfson Court, adjacent to the Institute. The conference package costs 270 UK pounds, which includes accommodation from Sunday 31 August to Friday 5 September, together with breakfast, lunch during the days that the lectures take place, and evening meals. Applications: To participate in the conference, please complete and return an application form and, for students and postdoctoral fellows, arrange for a letter of reference from a senior scientist. Limited financial support is available for participants from appropriate countries. 
Application forms are available from the conference Web Page at http://www.newton.cam.ac.uk/programs/nnmec.html Completed forms and letters of recommendation should be sent to Heather Dawson at the Newton Institute, or by e-mail to h.dawson at newton.cam.ac.uk *Closing Date for the receipt of applications and letters of recommendation is 16 June 1997* --------------------------------------------------------------------------- From moatl at cs.tu-berlin.de Wed May 14 11:33:02 1997 From: moatl at cs.tu-berlin.de (Martin Stetter) Date: Wed, 14 May 1997 17:33:02 +0200 Subject: PhD-fellowship in Computer Science Message-ID: <3379DB2D.313E@cs.tu-berlin.de> PhD-fellowship in Computer Science, Technische Universitaet Berlin, Germany Acquisition and Analysis of Optical Imaging Data from Primate Visual Cortex The Neural Information Processing Group (Department of Computer Science, Technische Universitaet Berlin, Germany) solicits applications for a predoctoral fellowship. The applicant is expected to join an international collaborative project which aims at the development of new methodologies for image acquisition, image analysis, and the separation of different superimposed signal components from optical imaging data recorded from primate visual cortex. Although the focus will be on statistical signal processing, the applicant is also expected to participate in the setup of a time-resolved optical imaging system as well as in its modification for depth-resolved measurements of cortical activity. Applicants should have a strong background in a theoretical discipline, such as physics, mathematics, electrical engineering or computer science, should have gained experience with both hardware (optics, electronics) and programming (C, C++), and should be familiar with signal processing techniques. In addition, applicants should be willing to stay abroad for parts of the project. 
Prior biological or neuroscience training is not required, but the applicant is expected to acquire the relevant expertise during the project. The position is initially for one year with possible extension of up to five years. Salary is commensurate with BAT IIa / 2. Please send applications including copies of certificates, CV, list of publications, statement of research interests and list of skills relevant to the project to: Prof. Klaus Obermayer, FR2-1, Informatik, Technische Universitaet Berlin, Franklinstrasse 28/29, 10587 Berlin, Germany, phone: ++49-30-314-73120, fax: ++49-30-314-73121, email: oby at cs.tu-berlin.de preferably by fax or email. From rojas at inf.fu-berlin.de Thu May 15 21:44:00 1997 From: rojas at inf.fu-berlin.de (Raul Rojas) Date: Thu, 15 May 97 21:44 MET DST Subject: New Book: "Neural Networks" Message-ID: This e-mail is to announce a new book on neural networks: "Neural Networks - A Systematic Introduction" by Raul Rojas, with a Foreword by Jerome Feldman, Springer-Verlag, Berlin-New York, 1996 (502 pp., 350 illustrations). The book has a homepage with a sample chapter ("The Backpropagation Algorithm", 33 pp.) that you are invited to download. The address of the homepage is http://www.inf.fu-berlin.de/~rojas/neural This is the review which appeared in April in "Computing Reviews": Connectionism and neural nets Rojas, Raul (Univ. Halle, Halle, Germany) 9704-0262 Neural networks: a systematic introduction. Springer-Verlag New York, Inc., New York, NY, 1996, 502 pp., $39.95, ISBN 3-540-60505-3. If you want a systematic and thorough overview of neural networks, need a good reference book on this subject, or are giving or taking a course on neural networks, this book is for you. More generally, the book is of value for anyone interested in understanding artificial neural networks or in learning more about them. It attempts to solve the puzzle of artificial neural models and proposals. 
Rojas systematically introduces and discusses each of the neural network models in relation to the others. The book is divided into 18 chapters, each designed to be taught in about one week. The only mathematical tools needed to understand the text are those learned during the first two years at university. The first eight chapters form a logical sequence, and later ones can be covered in a variety of orders. The first eight chapters are "The Biological Paradigm"; "Threshold Logic"; "Weighted Networks - The Perceptron"; "Perceptron Learning"; "Unsupervised Learning and Clustering Algorithms"; "One and Two Layered Networks"; "The Backpropagation Algorithm"; and "Fast Learning Algorithms." The later chapters cover "Statistics and Neural Networks"; "The Complexity of Learning"; "Fuzzy Logic"; "Associative Networks"; "The Hopfield Model"; "Stochastic Networks"; "Kohonen Networks"; "Modular Neural Networks"; "Genetic Algorithms"; and "Hardware for Neural Networks." Proofs are rigorous, but not overly formal, and the author makes extensive use of geometric intuition and diagrams. There are a modest number of exercises at the end of each chapter. Material from the book has been used successfully for courses in Germany, Austria and the United States. It seems quite extensive for a one-semester course. Neural network applications are discussed, with the emphasis on computational rather than engineering issues. Those who want to expend a minimum amount of time and effort on a first overview of neural networks and those who need to apply neural network technology in the most cost-effective way for a specific task, should consult another reference as well. On the whole, though, the author has done excellent work. The book includes an index and an up-to-date and useful list of 473 references. The German edition has been quite successful and has been through five printings in three years. The English version has been radically rewritten and deserves the same success. -J. 
Tepandi, Tallinn, Estonia --------------------- From smagt at dlr.de Fri May 16 01:38:00 1997 From: smagt at dlr.de (Patrick van der Smagt) Date: Fri, 16 May 1997 07:38:00 +0200 Subject: BOOK ANNOUNCEMENT: Neural Systems for Robotics Message-ID: <337BF2B8.400B@dlr.de> BOOK ANNOUNCEMENT ================= Neural Systems for Robotics ed. by Omid Omidvar and Patrick van der Smagt Academic Press, 1997 ISBN 0125262809 http://www.apcatalog.com/cgi-bin/AP?ISBN=0125262809&LOCATION=US&FORM=FORM2 http://www.amazon.com/exec/obidos/ISBN%3D0125262809/6344-0055321-349380 In this book we attempt to give an overview of state-of-the-art applications of neural methodologies for robot control. This is done via in-depth and summarizing studies. Table of contents: ================== Neural Network Sonar as a Perceptual Modality for Robotics Itiel E. Dror, Mark Zagaeski, Damien Rios, Cynthia F. Moss Dynamic Balance of a Biped Walking Robot W. Thomas Miller III, Andrew L. Kun Visual Feedback in Motion Patrick van der Smagt, Frans Groen Inverse Kinematics of Dextrous Manipulators David DeMers, Kenneth Kreutz-Delgado Stable Manipulator Trajectory Control Using Neural Networks Yichuang Jin, Tony Pipe, Alan Winfield The Neural Dynamics Approach to Sensory-Motor Control Paolo Gaudiano, Frank H. Guenther, Eduardo Zalama Operant Conditioning in Robots Andreas B\"uhlmeier, Gerhard Manteuffel A Dynamic Net for Robot Control Bridget Hallam, John Hallam, Gillian Hayes Neural Vehicles Ben Kr\"ose, Joris van Dam Self-Organization and Autonomous Robots Jukka Heikkonen, Pasi Koikkalainen From the preface ================ The chapters in this book are logically selected and grouped. The path that is followed goes through four stages: * Research inspired by biological systems at the behavioral level * Control of robot arms using artificial neural networks * Simulation of and inspiration by biological neural systems * Control and navigation of mobile robots using artificial neural networks. 
The first three chapters describe neural networks which simulate biological systems at the behavioral level. The third chapter ends with neural control of a robot arm; this topic is picked up by the subsequent---overview---chapter, followed by an in-depth study in this field. The next three chapters are focused on biological neural systems, and describe applications in the navigation of mobile robots. This theme is covered in detail in the final two chapters. Evaluating a biological system at the behavioral level, Chapter 1, ``Neural Network Sonar as a Perceptual Modality for Robotics,'' by Itiel Dror, Mark Zagaeski, Damien Rios, and Cynthia Moss, describes a neural network which approximates echo-locating behavior of the big brown bat, Eptesicus fuscus. Using previous studies of this bat, a neural system is introduced which can determine speed of movement using a single echolocation only, referring back to studies which show that bats differentiate between different wingbeat rates of insects. The results presented in this chapter provide a good basis for the use of echolocation in robotic systems. In Chapter 2, ``Dynamic Balance of a Biped Walking Robot,'' by Thomas Miller III and Andrew Kun, a neural system is used to have a robot learn to walk. The approach is unique: Instead of using analyses of walking behavior of biological systems, the neural network-driven robot uses feedback from force sensors mounted on the undersides of the feet, as well as from accelerometers mounted on the body. The learning behavior that is exhibited typically resembles that of biological systems which learn to walk. A technique for the control of robot manipulators is introduced in Chapter 3, ``Visual Feedback in Motion,'' by Patrick van der Smagt and Frans Groen. This research is also inspired by a biological system at the behavioral level. 
Using studies of the gannet from the family of Sulidae, sequences of two-dimensional visual signals are interpreted to guide a monocular robot arm in three-dimensional space without using models of either the visual sensor or the robot arm. Exploration of the control of robot arms is continued in Chapter 4, ``Inverse Kinematics of Dextrous Manipulators,'' by David DeMers and Kenneth Kreutz-Delgado. The chapter gives an overview of neural and non-neural methods to solve the inverse kinematics problem: Given an end-effector position and orientation, how should one move a robot arm (in a most efficient way) to reach that position/orientation? The theoretically inclined Chapter 5, ``Stable Manipulator Trajectory Control Using Neural Networks,'' by Yichuang Jin, Tony Pipe, and Alan Winfield, describes neural network approaches for trajectory following of a robot arm. The key issue here is how to improve the accuracy of the followed trajectory when the dynamic model of the robot arm is inaccurate. Studies of sensory motor control in biological organisms and robots are presented in Chapter 6, ``The Neural Dynamics Approach to Sensory-Motor Control,'' by Paolo Gaudiano, Frank Guenther, and Eduardo Zalama. It extensively discusses neural network models developed at Boston University's Center for Adaptive Systems. The neural models are used in two applications: trajectory following of a mobile robot, and controlling the motor skills required for speech reproduction using auditory-orosensory feedback. Biomorphic robots are discussed in Chapter 7, ``Operant Conditioning in Robots,'' by Andreas B\"uhlmeier and Gerhard Manteuffel. In their overview chapter, they discuss neural systems which maintain homeostasis for (mobile) robot systems. After discussing neural learning systems with neurophysiological backgrounds, a survey of several implementations on mobile robots, which have to learn to navigate between obstacles, is given. 
In Chapter 8, ``A Dynamic Net for Robot Control,'' by Bridget Hallam, John Hallam, and Gillian Hayes, a neural model, designed for explaining various learning phenomena from animal literature, is used to control a mobile robot. The navigation of mobile robots using artificial neural networks is covered in Chapter 9, ``Neural Vehicles,'' by Ben Kr\"ose and Joris van Dam. The authors make the distinction between reactive navigation, planned navigation in known environments, and map building from sensor signals. In the final chapter, ``Self-Organization and Autonomous Robots,'' Jukka Heikkonen and Pasi Koikkalainen describe the use of self-organizing maps for reactive control of mobile robots. -- dr Patrick van der Smagt phone +49 8153 281152 DLR/Institute of Robotics and Systems Dynamics fax +49 8153 281134 P.O. Box 1116, 82230 Wessling, Germany email From Paul.Vitanyi at cwi.nl Tue May 13 08:30:45 1997 From: Paul.Vitanyi at cwi.nl (Paul.Vitanyi@cwi.nl) Date: Tue, 13 May 1997 14:30:45 +0200 Subject: Book Announcement: 2nd Edition Li-Vitanyi on Kolmogorov Complexity & Appl. Message-ID: <9705131230.AA18126=paulv@gnoe.cwi.nl> Ming Li and Paul Vitanyi, AN INTRODUCTION TO KOLMOGOROV COMPLEXITY AND ITS APPLICATIONS, REVISED AND EXPANDED SECOND EDITION, Springer-Verlag, New York, 1997, xx+637 pp, 41 illus. Hardcover \$49.95/ISBN 0-387-94868-6 (Graduate Texts in Computer Science Series) See the web pages "http://www.cwi.nl/~paulv/kolmogorov.html" and "http://www.springer-ny.com/catalog/np/nov96np/DATA/0-387-94868-6.html" a subpage of "http://www.springer-ny.com/". The first edition appeared late in 1993. The second edition is revised and expanded by about 90 pages. The price is reduced by $9.05. From jose.millan at jrc.it Mon May 19 06:47:11 1997 From: jose.millan at jrc.it (jose.millan) Date: Mon, 19 May 97 12:47:11 +0200 Subject: Job opening Message-ID: <9705191047.AA01752@ jrc.it> I apologize if you receive this announcement multiple times. 
Best regards, Jose **************************************************************************** POSTDOCTORAL RESEARCH FELLOWSHIP available at the Institute for Systems, Informatics and Safety Joint Research Centre of the European Commission 21020 Ispra (VA) Italy Applications are invited for a one-year postdoctoral research position in the area of "Robot Learning". The candidate will carry out applied research on the use of reinforcement learning and other neural network paradigms for (1) the acquisition of efficient reactive navigation strategies, (2) map building, and (3) the integration of both---i.e., topological reasoning and reactive control. The algorithms will be implemented on physical mobile robots equipped with range sensors (sonar, infrared and/or laser range finder) and devoted to surveillance and safeguards applications. The ideal candidate should have experience with neural networks and mobile robots, good programming skills in C/C++, ability to communicate research results, and be willing to build on previous work. The position is available immediately. It is funded by the European Commission in the framework of the SMART-II Network. Thus, only citizens of the European Union or associated countries are eligible. The gross salary is about 2400 ECU/month. There is travel funding in case of papers accepted at important conferences. Interested candidates should send a full CV and the names of 2 referees by email to jose.millan at jrc.it ------------------------------- Jose del R. Millan, Ph.D. 
Institute for Systems, Informatics and Safety Joint Research Centre of the European Commission 21020 Ispra (VA) Italy e-mail: jose.millan at jrc.it Phone: +39 - 332 - 78 5751 Fax: +39 - 332 - 78 9185 From drl at eng.cam.ac.uk Mon May 19 13:50:32 1997 From: drl at eng.cam.ac.uk (drl@eng.cam.ac.uk) Date: Mon, 19 May 97 13:50:32 BST Subject: Cambridge Neural Networks Summer School '97 Message-ID: <9705191250.6232@ganesh.eng.cam.ac.uk> +-----------------------------------------------------+ | THE SEVENTH CAMBRIDGE NEURAL NETWORKS SUMMER SCHOOL | +-----------------------------------------------------+ Neural computation, network design and industrial applications September 22-24, 1997 Emmanuel College, Cambridge, UK. Course director: David Lovell. This three day school provides an introduction to, and an overview of the field of neural computation. The course is aimed at a broad range of participants, including those needing to assess the potential of neural networks for their own business, to those wishing to keep up to date with recent developments. As well as 23 presentations from international experts in the field, the course offers a hands-on session, laboratory tour and sessions devoted to neural network applications. Discounts are available for academics and there are fully-funded places available for EPSRC students. The deadline for applications for EPSRC funding is Friday June 13, 1997. Full details of the course, registration and EPSRC funding application forms are available via: http://svr-www.eng.cam.ac.uk/~drl/cnnss97/brochure.html For enquiries or reservation please contact Lynda Bryers: by 'phone on: +44 (0)1223 302233 by fax on: +44 (0)1223 301122 by email on: CPI at hermes.cam.ac.uk by post to: University of Cambridge Programme for Industry 1 Trumpington Street, Cambridge CB2 1QA, UK List of speakers and presentation titles Chris BISHOP 1.Regularization and model complexity. 2.Density estimation, mixture models and the EM algorithm. 
3.(ADV) Latent variables, topographic mappings and data visualization. Herve BOURLARD 1.Statistics, neural nets and parallels with conventional algorithms. 2.Speech recognition. 3.(ADV) Applications of neural nets to speech recognition. George HARPUR 1.An introduction to unsupervised learning. 2.ICA and information theoretic approaches to unsupervised learning. David LOVELL 1.Neural computing in perspective (course framework). 2.(APP) Predicting risk in pregnancy using neural networks. John MOODY 1.Time series prediction: classical and nonlinear approaches. 2.Neural networks for time series analysis. 3.(APP) Models for economic and financial time series. Mahesan NIRANJAN 1.Neural Networks in Signal Processing. Richard PRAGER 1.Classification Trees and the CMAC. Rich SUTTON 1.Reinforcement learning I: learning to act. 2.Reinforcement learning II: temporal-difference learning. 3.(APP) Reinforcement learning III: generalization and cognition. Volker TRESP 1.Introduction to supervised learning in neural networks. 2.Combining neural networks: stacking, arcing, boosting, bagging, bragging and all that. 3.(APP) Does it all work? Successful industrial applications of neural networks. Chris WILLIAMS 1.Gaussian processes for regression. 2.(APP) Estimating wind-fields from satellite data with neural networks and Gaussian processes. From kruschke at croton.psych.indiana.edu Mon May 19 13:34:28 1997 From: kruschke at croton.psych.indiana.edu (John Kruschke) Date: Mon, 19 May 1997 12:34:28 -0500 (EST) Subject: 30th Annual Math Psych Conference Message-ID: A preliminary version of the program for the Thirtieth Annual Meeting of the Society for Mathematical Psychology (SMP), to be held July 31-Aug 3, 1997, is now available on the World Wide Web at the address: http://www.indiana.edu/~mathpsy/ Registration forms and hotel and travel information will be added in the near future. 
A complete regular mailing of this information for all SMP members and conference participants will take place next week. Some highlights for this year's conference include a symposium on formal models of face recognition, a symposium on the use of selective influence in the analysis of mental architectures, and a satellite conference on methods for model selection (to be held Aug 3-4; the program is available at http://www.cwi.nl/~pdg/modsel.html). Invited addresses are being given by Thomas Landauer (Latent Semantic Analysis), Roger Ratcliff (Diffusion Model for Reaction Time), and Jerald Balakrishnan (Misrepresentations of Signal Detection Theory). Multiple sessions are planned in areas of learning and memory, judgment and decision making, categorization, information processing, sensation and perception, measurement and scaling, and methodology and statistics. The activities include an opening reception Thursday evening, a banquet Friday evening, and an opportunity to attend an opera on Saturday evening. We hope you can attend. 
Sincerely, Robert Nosofsky Richard Shiffrin From ericr at mech.gla.ac.uk Mon May 19 06:26:28 1997 From: ericr at mech.gla.ac.uk (Eric Ronco) Date: Mon, 19 May 1997 11:26:28 +0100 (BST) Subject: No subject Message-ID: <19265.199705191026@googie.mech.gla.ac.uk> From sbcho at csai.yonsei.ac.kr Tue May 20 08:28:58 1997 From: sbcho at csai.yonsei.ac.kr (Sung-Bae Cho) Date: Tue, 20 May 1997 21:28:58 +0900 (KST) Subject: CFP: Hybrid Evolutionary Learning Systems Message-ID: <9705201228.AA03396@csai.yonsei.ac.kr> ------------------------- CALL FOR PAPERS ------------------------- Special Session on "Hybrid Evolutionary Learning Systems" at ICONIP'97 The Fourth International Conference on Neural Information Processing November 24~28, 1997 Dunedin/Queenstown, New Zealand -------------------------------------------------------------------- As part of the International Conference on Neural Information Processing, a special session is planned on Hybrid Evolutionary Learning Systems. The session will be devoted to exploring different hybrid approaches combining Neural Networks, Fuzzy Logic and Evolutionary Computation to achieve better learning systems. The scope of the special session includes any topic related to hybrid learning systems, based not only on the above soft-computing techniques but also on any biologically inspired methodology. Prospective authors are invited to submit three copies of the paper written in English on A4-format white paper with one inch margins on all four sides, in two column format, on not more than 4 pages, single-spaced, in Times or similar font of 10 points, and printed on one side of the page only. Centred at the top of the first page should be the complete title, author(s), mailing and e-mail addresses, followed by an abstract and the text.
Those who are interested should send a title and an extended abstract (not more than 300 words) via email, and the manuscripts should be sent to the following address no later than June 16: Session Chair: Prof. Sung-Bae Cho Dept. of Computer Science Yonsei University 134 Shinchon-dong, Sudaemoon-ku Seoul 120-749, Korea Tel: +82 2 361-2720 Fax: +82 2 365-2579 Email: sbcho at csai.yonsei.ac.kr Important Dates: June 16, 1997 Paper Submission Due July 20, 1997 Notification of Acceptance August 20, 1997 Final Submission For general information about ICONIP'97, please visit the conference web page at http://divcom.otago.ac.nz:800/com/infosci/kel/iconip97.htm. -------------------------------------------------------------------- From gordon at AIC.NRL.Navy.Mil Tue May 20 10:29:16 1997 From: gordon at AIC.NRL.Navy.Mil (gordon@AIC.NRL.Navy.Mil) Date: Tue, 20 May 97 10:29:16 EDT Subject: workshop Message-ID: <9705201429.AA10349@sun14.aic.nrl.navy.mil> ======= CALL FOR PARTICIPATION REINFORCEMENT LEARNING: TO MODEL OR NOT TO MODEL, THAT IS THE QUESTION Workshop at the Fourteenth International Conference on Machine Learning (ICML-97) Vanderbilt University, Nashville, TN July 12, 1997 www.cs.cmu.edu/~ggordon/ml97ws Recently, there has been some disagreement in the reinforcement learning community about whether finding a good control policy is helped or hindered by learning a model of the system to be controlled. Recent reinforcement learning successes (Tesauro's TD-gammon, Crites' elevator control, Zhang and Dietterich's space-shuttle scheduling) have all been in domains where a human-specified model of the target system was known in advance, and have all made substantial use of the model. On the other hand, there have been real robot systems which learned tasks either by model-free methods or via learned models. The debate has been exacerbated by the lack of fully satisfactory algorithms on either side for comparison.
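[Editorial illustration] The direct-versus-indirect question debated at this workshop can be made concrete with a toy sketch. The code below is purely illustrative and invented for this archive: a 5-state chain MDP on which a "direct" learner (tabular Q-learning) and an "indirect" learner (estimate the model from samples, then plan by value iteration) both recover the same policy. None of the environment, parameters, or function names come from the workshop papers.

```python
# A toy contrast between a "direct" (model-free) and an "indirect"
# (model-based) reinforcement learner on a 5-state chain MDP.
import random

N_STATES = 5          # states 0..4; entering state 4 yields reward 1 and ends the episode
ACTIONS = (0, 1)      # 0 = step left, 1 = step right
GAMMA = 0.9

def step(s, a):
    """Deterministic chain dynamics: moving right leads toward the goal."""
    s2 = min(s + 1, N_STATES - 1) if a == 1 else max(s - 1, 0)
    return s2, (1.0 if s2 == N_STATES - 1 else 0.0)

def q_learning(episodes=300, alpha=0.5, eps=0.3, seed=0):
    """Direct method: learn Q(s, a) from sampled transitions, with no model."""
    rng = random.Random(seed)
    Q = [[0.0, 0.0] for _ in range(N_STATES)]
    for _ in range(episodes):
        s = 0
        while s != N_STATES - 1:
            if rng.random() < eps:
                a = rng.choice(ACTIONS)                   # explore
            else:
                a = max(ACTIONS, key=lambda a: Q[s][a])   # exploit
            s2, r = step(s, a)
            Q[s][a] += alpha * (r + GAMMA * max(Q[s2]) - Q[s][a])
            s = s2
    return [max(ACTIONS, key=lambda a: Q[s][a]) for s in range(N_STATES)]

def model_based(sweeps=50):
    """Indirect method: first identify a model from experience, then plan in
    it by value iteration. The dynamics here are deterministic, so a single
    sample per state-action pair suffices to identify the model."""
    model = {(s, a): step(s, a) for s in range(N_STATES) for a in ACTIONS}
    V = [0.0] * N_STATES
    for _ in range(sweeps):
        for s in range(N_STATES - 1):   # state 4 is terminal
            V[s] = max(model[(s, a)][1] + GAMMA * V[model[(s, a)][0]]
                       for a in ACTIONS)
    return [max(ACTIONS, key=lambda a: model[(s, a)][1] + GAMMA * V[model[(s, a)][0]])
            for s in range(N_STATES)]

# Both learners should settle on "move right" in the non-terminal states 0..3.
print("direct:  ", q_learning())
print("indirect:", model_based())
```

Even this toy shows the data-efficiency question raised below: the indirect learner needs one sample per state-action pair before planning takes over, while the direct learner needs hundreds of episodes to propagate the reward back along the chain.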
Topics for discussion include (but are not limited to) o Case studies in which a learned model either contributed to or detracted from the solution of a control problem. In particular, does one method have better data efficiency? Time efficiency? Space requirements? Final control performance? Scaling behavior? o Computational techniques for finding a good policy, given a model from a particular class -- that is, what are good planning algorithms for each class of models? o Approximation results of the form: if the real system is in class A, and we approximate it by a model from class B, we are guaranteed to get "good" results as long as we have "sufficient" data. o Equivalences between techniques of the two sorts: for example, if we learn a policy of type A by direct method B, it is equivalent to learning a model of type C and computing its optimal controller. o How to take advantage of uncertainty estimates in a learned model. o Direct algorithms combine their knowledge of the dynamics and the goals into a single object, the policy. Thus, they may have more difficulty than indirect methods if the goals change (the "lifelong learning" question). Is this an essential difficulty? o Does the need for an online or incremental algorithm interact with the choice of direct or indirect methods? Preliminary schedule of talks: 9:00- 9:30 Chris Atkeson "Why Model-Based Learning Should Be Inconsistent With the Model" 9:30-10:15 Jeff Schneider "Exploiting Model Uncertainty Estimates for Safe Dynamic Control Learning" 10:15-10:45 Discussion break 10:45-11:15 David Andre, Nir Friedman, and Ronald Parr "Generalized Prioritized Sweeping" 11:15-12:00 Scott Davies, Andrew Y. 
Ng, and Andrew Moore "Applying Model-Based Search to Reinforcement Learning" 12:00- 1:00 LUNCH BREAK 1:00- 1:45 Rich Sutton "Multi-Time Models: A Unified View of Modeling and Not Modeling" 1:45- 2:15 Doina Precup and Rich Sutton "Multi-Time Models for Reinforcement Learning" 2:15- 2:45 Howell, Frost, Gordon, and Wu "Real-Time Learning of Vehicle Suspension Control Laws" 2:45- 3:15 Discussion break 3:15- 3:45 Leonid Kuvayev and Rich Sutton "Approximation in Model-Based Learning" 3:45- 4:15 Geoff Gordon "Wrap-up" 4:15- 5:00 Discussion Organizers: Chris Atkeson (cga at cc.gatech.edu) College of Computing Georgia Institute of Technology 801 Atlantic Drive Atlanta, GA 30332-0280 Geoff Gordon (ggordon at cs.cmu.edu) Computer Science Department Carnegie Mellon University 5000 Forbes Ave Pittsburgh, PA 15213-3891 (412) 268-3613, (412) 361-2893 Contact: Geoff Gordon (ggordon at cs.cmu.edu) From pr at physik.uni-wuerzburg.de Tue May 20 15:32:57 1997 From: pr at physik.uni-wuerzburg.de (Peter Riegler) Date: Tue, 20 May 1997 21:32:57 +0200 (METDST) Subject: thesis available Message-ID: The following Ph.D. thesis is available via anonymous FTP. FTP-host: ftp.uni-wuerzburg.de FTP-file: pub/dissertation/riegler/these.ps.gz Dynamics of On-line Learning in Neural Networks Peter Riegler Institut fuer Theoretische Physik Universitaet Wuerzburg Am Hubland D-97074 Wuerzburg, Germany Abstract: One of the most important features of natural as well as artificial neural networks is their ability to adjust to their environment by ``learning''. This results in the network's ability to ``generalize'', i.e. to generate with high probability the appropriate response to an unknown input. The theoretical description of generalization in artificial neural networks by means of statistical physics is the subject of this thesis. The focus is on {\em on-line learning}, where the presentation of examples used in the learning process occurs in a sequential manner.
Hence, the systems investigated are dynamical in nature. They typically consist of a large number of degrees of freedom, requiring a description in terms of order parameters. In the first part of this work the most fundamental network, the perceptron, is investigated. Following a recent proposal by Kinouchi and Caticha, it will be shown how one can derive a learning dynamics, starting from first principles, that results in an optimal generalization ability. Results will be presented for learning processes where the training examples are corrupted by different types of noise. The resulting generalization ability will be shown to be comparable to that in the noiseless case. Furthermore, the results obtained reveal striking similarities to those obtained for batch learning. The optimal algorithms derived will be shown to depend on the characteristics of the particular learning task, including the type and strength of the corrupting noise. In general this requires an additional estimation of such characteristic quantities. For the strength of the noise this estimation leads to interesting dynamical phase transitions. The second part deals with the dynamical properties of two-layer neural networks. This is of particular importance since these networks are known to be universal approximators. Understanding the dynamical features will help in constructing fast training algorithms that lead to the best generalization. Specifically, an exact analysis of learning a rule by on-line gradient descent (backpropagation of error) in a two-layered neural network will be presented. Here the emphasis is on adjustable hidden-to-output weights, which have so far been left out of analyses in the literature. Results are compared with the training of networks having the same architecture but fixed weights in the second layer. It will be shown that certain features of learning in a two-layered neural network are independent of the state of the second layer.
Motivated by this result, it will be argued that putting the dynamics of the hidden-to-output weights on a faster time scale will speed up the learning process. For all systems investigated, simulations confirm the results. ________________________________________________________________ _/_/_/_/_/_/_/_/_/_/_/_/ _/ _/ Peter Riegler _/ _/_/_/ _/ Institut fuer Theoretische Physik _/ _/ _/ _/ Universitaet Wuerzburg _/ _/_/_/ _/ Am Hubland _/ _/ _/ D-97074 Wuerzburg, Germany _/_/_/ _/_/_/ phone: (++49) (0)931 888-4908 fax: (++49) (0)931 888-5141 email: pr at physik.uni-wuerzburg.de www: http://www.physik.uni-wuerzburg.de/~pr ________________________________________________________________ From ken at nagano.is.hosei.ac.jp Wed May 21 01:55:26 1997 From: ken at nagano.is.hosei.ac.jp (Ken-ichiro Miura) Date: Wed, 21 May 1997 14:55:26 +0900 Subject: CFP:ICONIP`97 -New Deadline- Message-ID: <33828E4E.1208@nagano.is.hosei.ac.jp> NEW DEADLINE Call For Papers ICONIP'97, The Fourth International Conference on Neural Information Processing, November 24-28, 1997. Dunedin/Queenstown, New Zealand "Special Session on Spatio-temporal Information Processings in the Brain" Session Co-Organizers : Minoru Tsukada(Tamagawa University) Takashi Nagano(Hosei University) Recently, spatio-temporal aspects have been recognized as very important for understanding the neural information processing mechanisms in the brain. The importance lies not only in the sensory systems, such as the visual and auditory systems, but also in higher-order systems like memory and learning. Much work on these aspects is now being done. A special session devoted to this work will be organized at ICONIP'97. The scope of the special session covers computational theories, neural network models, physiological studies and psychological studies related to spatio-temporal information processing in the brain. Prospective authors are invited to submit papers to the special session.
(Traveling expenses and the conference fee are not supplied.) Submissions must be received by June 16, 1997 (new deadline). Please send five copies of your manuscript to Prof. Takashi Nagano, Special Session Co-Organizer Dept. Industrial and Systems Engineering, College of Engineering, Hosei University 3-7-2 Kajino-cho, Koganei, Tokyo, 184, JAPAN For the most up-to-date information about ICONIP'97, please browse the conference home page: http://divcom.otago.ac.nz:800/com/infosci/kel/iconip97.htm Important dates: Paper due: June 16, 1997 (new deadline) Notification of acceptance: July 20, 1997 Final camera-ready papers due: August 20, 1997 Manuscript format: Papers must be written in English on A4-format white paper with one inch margins on all four sides, in two column format, on not more than 4 pages, single-spaced, in Times or similar font of 10 points, and printed on one side of the page only. ------------------------------------------------ Takashi Nagano Nagano Labo., Dept. of Industrial and Systems Engineering, College of Engineering, Hosei University 3-7-2 Kajino-cho, Koganei, Tokyo JAPAN Tel +81-423-87-6350 Fax +81-423-87-6350 mailto:nagano at nagano.is.hosei.ac.jp ------------------------------------------------ From helnet97 at dds.nl Wed May 21 21:11:04 1997 From: helnet97 at dds.nl (HELNET 1997 Workshop on Neural Networks) Date: Thu, 22 May 1997 01:11:04 +0000 Subject: Call for papers: HELNET Workshop on Neural Networks Message-ID: <199705212308.BAA17935@k9.dds.nl> CALL FOR PAPERS HELNET 1997 International Workshop on Neural Networks October 3 - October 5, Montreux Announcing the HELNET 1997 Workshop on Neural Networks Montreux, Switzerland from October 3 - October 5, 1997 http://www.leidenuniv.nl/medfac/fff/groepc/chaos/helnet/index.html mailto:helnet97 at dds.nl The HELNET workshops are informal meetings primarily targeted towards young researchers from neural networks and related fields.
They are traditionally organised a few days prior to the ICANN conferences. Participants are offered the opportunity to present and extensively discuss their work as well as more general topics from the neural network field. One of the aims of the HELNET Workshops is to facilitate such exchange and enable (young) researchers, (PhD) students and postdocs in the field to learn more about the various disciplines of the neural networks field outside their own research program. That is why we encourage researchers from related fields to register and participate. Although the final workshop program has not yet been fixed, the following topics have been proposed for presentation and discussion. - Optimal complexity in reduced connectivity neural network paradigms - Speech recognition by neural networks - Neural networks and statistical inference - The emergence of consciousness in neural networks - Applications of differential geometric system theory in dynamic neural networks - Circuit and VLSI complexity issues - VLSI friendly learning - Neural network applications in control - Visualization by neural networks - Markov modelling of sensory neural spike trains - An application of neural networks in cellular wireless networks Please find more detailed information on the HELNET 1997 Workshop on Neural Networks and the registration form below.
=================================================================== GENERAL INFORMATION =================================================================== Important Dates and Deadlines Deadline paper submission July 15, 1997 Notification of acceptance August 1, 1997 Deadline registration August 15, 1997 Deadline revised papers September 15, 1997 Workshop start October 3, 1997 Workshop end October 5, 1997 Travel Directions Venue site: Hotel des Alpes Vaudoises Rue de Bugnon 1823 Glion (Montreux) Switzerland Tel: + 41 21 963 20 76 Fax: + 41 21 963 56 94 The workshop site is located at the foot of the Rochers-de-Naye at an altitude of approximately 670 meters above sea level in the village of Glion. It overlooks Montreux and Lac Leman and offers a splendid view of the Alps. The hotel has private parking, a large garden and an outdoor swimming pool. There is a direct connection by local train from the venue site to the Montreux/Territet train station and vice versa. Participants will be provided with train tickets when needed. Getting there: The easiest connection by plane is via Geneva Airport. There are trains running directly from Geneva Airport to Montreux regularly in less than an hour. At Montreux you switch to a small local train which will take you from Montreux/Territet up the mountain toward the Hotel des Alpes Vaudoises. This train stops right in front of the hotel at the "Hotel des Alpes Vaudoises" train stop. To ICANN...: There is a direct connection from Montreux to Lausanne. More information can be found at the ICANN www-site. Paper Format The workshop participants are encouraged to submit their papers in LaTeX format. However, we can also process Word and WordPerfect documents. If you submit non-LaTeX-formatted papers, please include plain text versions of your paper on PC disk as well as postscript versions of your figures. - Papers should be submitted camera-ready.
- The printable area should be 12 by 20 centimeters (including page numbers), set in a 10-point Times or similar font. - Titles and subtitles should be typeset using 12-point fonts. Footnotes and super/subscripts should be 9-point characters. - When you use standard LaTeX styles (e.g. article) your paper will have a default printable area of approximately 15 by 23 centimeters and will be reduced to 80% of its size for publication. Please take this into account when preparing your figures. The length of submitted papers should not exceed 8 pages including figures, tables and references. Centered at the top of the first page should be the title of the paper, author name(s), affiliation(s) and mailing address(es). Please submit 4 (four) copies of your paper as well as a version on disk (PC/DOS format only!) with the graphic files included on this disk. Paper versions of submitted manuscripts should not be stapled or folded. As on previous occasions, the proceedings of this year's workshop will be published by the VUU Publishers, Amsterdam. Workshop Fee -Approximate- exchange rates: 100 DFL = 30 BPS = 50 USD = 85 DM Check your local exchange office for the actual rates! The workshop fee is DFL. 600,- (Dutch guilders) and includes 4 nights in a shared double room (DFL 725,- if you prefer a single room), half-board, refreshments during the sessions, and welcome drinks on the night of arrival. Accompanying persons are welcome and charged DFL. 500 (DFL. 625 for single room). An optional outing is to be organized and included in the fee. The workshop proceedings will be handed out upon arrival. You can register by printing and filling out the registration form and sending it by fax or regular mail. If you do not intend to pay by credit card you can also email the filled-out registration form. You will receive confirmation of your registration and payment upon receipt. If you have any questions, please direct your queries to: HELNET 1997 P.O.
Box 2318 1000 CH Amsterdam Netherlands Fax: + 31 20 471 49 11 Email: helnet97 at dds.nl The workshop fee is payable in the following ways: - Bank transfer: D.S.R. ABN-AMRO Bank, Ceintuurbaan 89, Amsterdam, the Netherlands Account No. 43.55.28.521 Stating: HELNET97 - Credit card: We accept VISA and American Express credit cards. Please print and fill out the registration form and send it to us by fax or regular mail using the details stated above. =================================================================== REGISTRATION FORM =================================================================== In order to register as a participant in the HELNET Workshop on Neural Networks, please print and fill out the form below completely and mail or fax it to the address indicated below: If you do NOT intend to pay using a credit card you can also email us the filled-out registration form. HELNET 1997 P.O. Box 2318 1000 CH Amsterdam Netherlands Fax: + 31 20 471 49 11 helnet97 at dds.nl Accommodation: [ ] Double room (DFL 600,-) Preference for sharing your room with: [ ] Single room (DFL 725,-) [ ] Accompanying person (DFL. 500,- when sharing a double room, DFL. 625,- otherwise) Your personal details: Name ----------------------------------------------------------- Position ------------------------------------------------------ Institution --------------------------------------------------- Address ------------------------------------------------------- P.O. Box ------------------------------------------------------ Zip Code ------------------------------------------------------ City ---------------------------------------------------------- Country ------------------------------------------------------ Tel ----------------------------------------------------------- Fax ----------------------------------------------------------- Email --------------------------------------------------------- Your registration: [ ] I cannot make a final registration yet.
Please keep me informed [ ] I register for HELNET97 but I will not submit a paper and would just like to attend and participate in the discussions. [ ] I would like to present the following paper: ----------------------------------------------------------------- ----------------------------------------------------------------- ----------------------------------------------------------------- ----------------------------------------------------------------- I would like to propose the following topics for discussion: 1. -------------------------------------------------------------- 2. -------------------------------------------------------------- 3. -------------------------------------------------------------- 4. -------------------------------------------------------------- I will make the workshop fee payable in the following way: [ ] Bank transfer: D.S.R. Account No. 43.55.28.521 Stating: HELNET97 ABN-AMRO Bank Ceintuurbaan 89, Amsterdam, the Netherlands [ ] Credit cards: We can accept the following cards: [ ] VISA [ ] American Express Card No. ------------------------------------------------------- Expiry Date: --------------------------------------------------- Signature: ----------------------------------------------------- Date: ---------------------------------------------------------- From lemmon at endeavor.ee.nd.edu Fri May 23 09:16:06 1997 From: lemmon at endeavor.ee.nd.edu (Michael Lemmon) Date: Fri, 23 May 1997 08:16:06 -0500 (EST) Subject: Final CFP - IEEE-TAC special issue Message-ID: <199705231316.IAA01592@endeavor.ee.nd.edu> Contributed by Michael D. Lemmon (lemmon at maddog.ee.nd.edu) FINAL CALL FOR PAPERS IEEE Transactions on Automatic Control announces a Special Issue on ARTIFICIAL NEURAL NETWORKS IN CONTROL, IDENTIFICATION, and DECISION MAKING Edited by Anthony N. Michel, Dept. of Electrical Engineering, University of Notre Dame, Notre Dame, IN 46556, USA, (219)-631-5534 (voice), (219)-631-4393 (fax), Anthony.N.Michel.1 at nd.edu, and Michael Lemmon, Dept. of Electrical Engineering, University of Notre Dame, Notre Dame, IN 46556, USA, (219)-631-8309 (voice), (219)-631-4393 (fax), lemmon at maddog.ee.nd.edu Deadlines: Paper Submission: July 1, 1997 Acceptance Decisions: December 31, 1997 There is a growing body of experimental work suggesting that artificial neural networks can be very adept at solving pattern classification problems where there is significant real-world uncertainty. Neural networks also provide an analog method for quickly determining approximate solutions to complex optimization problems. Both of these capabilities can be of great use in solving various control problems, and in recent years there has been increased interest in the use of artificial neural networks in the control and supervision of complex dynamical systems. This announcement is a call for papers addressing the topic of neural networks in control, identification, and decision making. Accepted papers will be published in a special issue of the IEEE Transactions on Automatic Control. The special issue is seeking papers which use formal analysis to establish the role of neural networks in control, identification, and decision making. For this reason, papers consisting primarily of empirical simulation results will not be considered for publication. Before submitting, prospective authors should consult past issues of the IEEE Transactions on Automatic Control to identify the type of results and the level of mathematical rigor that are the norm in this journal. Submitted papers are due by July 1, 1997, and should be sent to Michael D. Lemmon or Anthony N. Michel. Notification of acceptance decisions will be sent by December 31, 1997. The special issue is targeted for publication in 1998 or early 1999. All papers will be refereed in accordance with IEEE guidelines.
Please consult the inside back cover of any recent issue of the Transactions on Automatic Control for style and length of the manuscript and the number of required copies (seven copies with cover letter) to be sent to one of the editors of this special issue. From bd1q at eagle.cnbc.cmu.edu Fri May 23 11:27:11 1997 From: bd1q at eagle.cnbc.cmu.edu (Barbara Dorney) Date: Fri, 23 May 1997 11:27:11 -0400 (EDT) Subject: Opening for a Scientist/Educator Message-ID: <199705231527.LAA27284@eagle.cnbc.cmu.edu> Opening for a Scientist/Educator at The Center for the Neural Basis of Cognition (CNBC) a Joint Center of Carnegie Mellon and the University of Pittsburgh A Ph. D. Scientist/Educator with a background in COGNITIVE NEUROSCIENCE is sought to take a central role in formulation and development of an interactive planetarium show, TRACKING THE HUMAN BRAIN, funded by the National Science Foundation. The show will describe how cognition and perception arise from neural activity. The show will use the dome of the planetarium as a metaphor for the cerebral cortex of the brain, and will also use a novel interactive technology to allow members of the audience to participate as neurons in the simulation of human cognitive and perceptual processes. The individual hired will be expected to work with a team comprising other CNBC scientists and artists at the Studio for Creative Inquiry at Carnegie Mellon University, and science museum professionals at the Carnegie Science Center in Pittsburgh, to ensure that the production, which is to be disseminated to science centers world-wide, communicates essential scientific content in a manner accessible to the general public. This is a two-year position, and employment would ideally begin as early as July 1, 1997, but excellent candidates who may not be available until 1998 are also encouraged to apply. Salary commensurate with experience. 
Send a resume' detailing experience in research and science education relevant to the Neural Basis of Cognition to J. L. McClelland, Co-Director Center for the Neural Basis of Cognition 115 Mellon Institute 4400 Fifth Avenue Pittsburgh, PA 15213-2683. Carnegie Mellon University is an Equal Opportunity /Affirmative Action Employer. From minton at ISI.EDU Fri May 23 13:09:06 1997 From: minton at ISI.EDU (Steve Minton) Date: Fri, 23 May 97 10:09:06 PDT Subject: JAIR article: Connectionist Theory Refinement: Genetically..." Message-ID: <9705231709.AA13087@sungod.isi.edu> Readers of this mailing list might be interested in the following article, which was just published by JAIR: Opitz, D.W. and Shavlik, J.W. (1997) "Connectionist Theory Refinement: Genetically Searching the Space of Network Topologies", Volume 6, pages 177-209. Available in HTML, Postscript (578K) and compressed Postscript (267K). For quick access via your WWW browser, use this URL: http://www.jair.org/abstracts/opitz97a.html More detailed instructions are below. Abstract: An algorithm that learns from a set of examples should ideally be able to exploit the available resources of (a) abundant computing power and (b) domain-specific knowledge to improve its ability to generalize. Connectionist theory-refinement systems, which use background knowledge to select a neural network's topology and initial weights, have proven to be effective at exploiting domain-specific knowledge; however, most do not exploit available computing power. This weakness occurs because they lack the ability to refine the topology of the neural networks they produce, thereby limiting generalization, especially when given impoverished domain theories. 
We present the REGENT algorithm, which uses (a) domain-specific knowledge to help create an initial population of knowledge-based neural networks and (b) genetic operators of crossover and mutation (specifically designed for knowledge-based networks) to continually search for better network topologies. Experiments on three real-world domains indicate that our new algorithm is able to significantly increase generalization compared to a standard connectionist theory-refinement system, as well as our previous algorithm for growing knowledge-based networks. The article is available via: -- comp.ai.jair.papers (also see comp.ai.jair.announce) -- World Wide Web: The URL for our World Wide Web server is http://www.jair.org/ For direct access to this article and related files try: http://www.jair.org/abstracts/opitz97a.html -- Anonymous FTP from either of the two sites below. Carnegie-Mellon University (USA): ftp://ftp.cs.cmu.edu/project/jair/volume6/opitz97a.ps The University of Genoa (Italy): ftp://ftp.mrg.dist.unige.it/pub/jair/pub/volume6/opitz97a.ps The compressed PostScript file is named opitz97a.ps.Z (267K) -- automated email. Send mail to jair at cs.cmu.edu or jair at ftp.mrg.dist.unige.it with the subject AUTORESPOND and our automailer will respond. To get the Postscript file, use the message body GET volume6/opitz97a.ps (Note: Your mailer might find this file too large to handle.) Only one file can be requested in each message. For more information about JAIR, visit our WWW or FTP sites, or send electronic mail to jair at cs.cmu.edu with the subject AUTORESPOND and the message body HELP, or contact jair-ed at ptolemy.arc.nasa.gov.
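[Editorial illustration] The general pattern described in the REGENT abstract above, namely a population of network topologies evolved by crossover and mutation, can be sketched as follows. This is a hypothetical illustration, not the REGENT algorithm itself: training a real knowledge-based network is replaced by an invented stand-in fitness function, and all names and parameters here are our own, so only the shape of the search loop is being shown.

```python
# Sketch of genetic search over network topologies: a topology is a tuple of
# hidden-layer widths; "fitness" stands in for training + validation accuracy.
import random

rng = random.Random(42)

def fitness(hidden_sizes):
    """Stand-in for 'train the network, return validation score'.
    Arbitrarily peaks at two hidden layers of width 8."""
    target = (8, 8)
    penalty = abs(len(hidden_sizes) - len(target))
    penalty += sum(abs(h - t) for h, t in zip(hidden_sizes, target))
    return -penalty

def crossover(a, b):
    """Combine a random prefix of one parent with a random suffix of the other."""
    cut_a = rng.randint(0, len(a))
    cut_b = rng.randint(0, len(b))
    return (a[:cut_a] + b[cut_b:]) or (4,)   # never produce an empty network

def mutate(topo):
    """Randomly add a layer, remove a layer, or resize one layer."""
    topo = list(topo)
    if rng.random() < 0.3 and len(topo) < 4:
        topo.insert(rng.randrange(len(topo) + 1), rng.randint(1, 16))
    elif rng.random() < 0.3 and len(topo) > 1:
        topo.pop(rng.randrange(len(topo)))
    else:
        i = rng.randrange(len(topo))
        topo[i] = max(1, topo[i] + rng.choice((-2, -1, 1, 2)))
    return tuple(topo)

def evolve(pop_size=20, generations=40):
    """Truncation selection: keep the best half, refill with mutated offspring."""
    pop = [tuple(rng.randint(1, 16) for _ in range(rng.randint(1, 3)))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            children.append(mutate(crossover(p1, p2)))
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print("best topology:", best, "fitness:", fitness(best))
```

Because the best topology always survives into the next generation, the search is a monotone improvement over the initial random population; in REGENT the expensive part would be the fitness evaluation, i.e. training each candidate network.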
From edelman at ai.mit.edu Fri May 23 14:37:07 1997 From: edelman at ai.mit.edu (Shimon Edelman) Date: Fri, 23 May 1997 14:37:07 -0400 Subject: preprint - Complex Cells and Object Recognition Message-ID: <199705231837.OAA04793@it-cortex> Title: Complex Cells and Object Recognition Authors: Shimon Edelman, Nathan Intrator, Tomaso Poggio ftp URL: ftp://eris.wisdom.weizmann.ac.il/pub/edelman/nips97.ps.Z http URL: http://www.ai.mit.edu/~edelman/mirror/nips97.ps.Z Abstract: Nearest-neighbor correlation-based similarity computation in the space of outputs of complex-type receptive fields can support robust recognition of 3D objects. Our experiments with four collections of objects resulted in mean recognition rates between 84% (for subordinate-level discrimination among 15 quadruped animal shapes) and 94% (for basic-level recognition of 20 everyday objects), over a 40deg X 40deg range of viewpoints, centered on a stored canonical view and related to it by rotations in depth. This result has interesting implications for the design of a front end to an artificial object recognition system, and for the understanding of the faculty of object recognition in primate vision. ------------------------------------------------------------------------ Comments welcome. -Shimon Dr. Shimon Edelman, Center for Biol & Comp Learning, MIT DELENDA BIBI http://www.ai.mit.edu/~edelman fax: (+1) 617 253-2964 tel: 253-6357 edelman at ai.mit.edu From edelman at ai.mit.edu Sat May 24 09:06:16 1997 From: edelman at ai.mit.edu (Shimon Edelman) Date: Sat, 24 May 97 09:06:16 EDT Subject: preprint - Complex Cells and Object Recognition Message-ID: <9705241306.AA00563@peduncle.ai.mit.edu> A correction to the ftp URL: it should be ftp://eris.wisdom.weizmann.ac.il/pub/nips97.ps.Z Also, the server at www.ai.mit.edu seems to have slipped a disk, as a result of which the other URL I listed in the original posting will be unavailable until Tuesday or so. I apologize for these problems. 
-Shimon From rao at cs.rochester.edu Mon May 26 17:45:19 1997 From: rao at cs.rochester.edu (Rajesh Rao) Date: Mon, 26 May 1997 17:45:19 -0400 Subject: Tech Report: Shift Invariance and Local Receptive Fields Message-ID: <199705262145.RAA29053@skunk.cs.rochester.edu> The following technical report on learning localized receptive fields for transformation estimation is available on the WWW page: http://www.cs.rochester.edu/u/rao/ or via anonymous ftp (see instructions below). Comments and suggestions welcome (This message has been cross-posted - my apologies to those who receive it more than once). -- Rajesh Rao Internet: rao at cs.rochester.edu Dept. of Computer Science VOX: (716) 275-2527 University of Rochester FAX: (716) 461-2018 Rochester NY 14627-0226 WWW: http://www.cs.rochester.edu/u/rao/ =========================================================================== Localized Receptive Fields May Mediate Transformation-Invariant Recognition in the Visual Cortex Rajesh P.N. Rao and Dana H. Ballard Technical Report 97.2 National Resource Laboratory for the Study of Brain and Behavior Department of Computer Science, University of Rochester May 1997 Neurons in the visual cortex are known to possess localized, oriented receptive fields. It has previously been suggested that these distinctive properties may reflect an efficient image encoding strategy based on maximizing the sparseness of the distribution of output neuronal activities or, alternatively, extracting the independent components of natural image ensembles. Here, we show that a relatively simple neural solution to the problem of transformation-invariant visual recognition also causes localized, oriented receptive fields to be learned from natural images.
These receptive fields, which code for various transformations in the image plane, allow a pair of cooperating neural networks, one estimating object identity (``what'') and the other estimating object transformations (``where''), to simultaneously recognize an object and estimate its pose by jointly maximizing the a posteriori probability of generating the observed visual data. We provide experimental results demonstrating the ability of these networks to factor retinal stimuli into object-centered features and object-invariant transformations. The resulting neuronal architecture suggests concrete computational roles for the neuroanatomical connections known to exist between the dorsal and ventral visual pathways.

Retrieval information: FTP-host: ftp.cs.rochester.edu FTP-pathname: /pub/u/rao/papers/local.ps.Z WWW URL: http://www.cs.rochester.edu/u/rao/ 9 pages; 229K compressed.
==========================================================================
Anonymous ftp instructions:
>ftp ftp.cs.rochester.edu
Connected to anon.cs.rochester.edu.
220 anon.cs.rochester.edu FTP server (Version wu-2.4(3)) ready.
Name: [type 'anonymous' here]
331 Guest login ok, send your complete e-mail address as password.
Password: [type your e-mail address here]
ftp> cd /pub/u/rao/papers/
ftp> binary
ftp> get local.ps.Z
ftp> bye

From krose at wins.uva.nl Tue May 27 04:38:25 1997 From: krose at wins.uva.nl (Ben Krose) Date: Tue, 27 May 1997 10:38:25 +0200 (MET DST) Subject: Postdoc position available Message-ID: <199705270838.KAA19603@domedb.wins.uva.nl> I apologize if you receive this announcement multiple times. Ben Kr\"ose **************************************************************************** POSTDOCTORAL RESEARCH FELLOWSHIP available at the University of Amsterdam, Dept.
of Computer Science Amsterdam, the Netherlands The Intelligent Autonomous Systems (IAS) Group at the University of Amsterdam is looking for a highly motivated research fellow for a 2-year postdoctoral position in the area of `Active Map Building and Sensoric Representations for Autonomous Learning Robot Systems'. With this project, the IAS group participates (with the Foundation for Neural Networks, SNN) in the Japanese `Real World Computing Partnership' (RWCP). We will develop methods for an intelligent, `hearing' and `seeing' robot, which has to act in an environment inhabited by humans. Perception and map building are essential tasks for the system. The emphasis of the research will be on the application of neural (or other statistical) methodologies for the projection of high-dimensional sensor data to low-dimensional `environment representations'. Applicants should have a PhD in computer science, physics or electronic engineering, must have a strong background in learning, neurocomputing or statistics, and must see the challenge of dealing with real-world sensor data. Job specification: The post-doc salary will be at most Dfl. 6140 per month, depending on experience. The position is available for 2 years, with a possible extension of one year. Applications: Interested candidates should send a letter with a CV and list of publications before June 15, 1997 to Dr. Ben J.A. Krose, Department of Computer Science University of Amsterdam Kruislaan 403 1098 SJ Amsterdam The Netherlands For information you can contact me: krose at wins.uva.nl Phone +31 20 525 7463 Fax +31 20 525 7490 A short description of the project can be found at http://www.wins.uva.nl/research/neuro/rwc.html From ataxr at IMAP1.ASU.EDU Mon May 26 13:21:26 1997 From: ataxr at IMAP1.ASU.EDU (Asim Roy) Date: Mon, 26 May 1997 13:21:26 -0400 (EDT) Subject: CONNECTIONIST LEARNING: IS IT TIME TO RECONSIDER THE FOUNDATIONS? Message-ID: Dave, I am posting the responses I have received so far.
Some of the responses provide a great deal of insight into connectionist learning and neuroscience and their interactions (see, in particular, the second note by Dr. Peter Cariani). I have also tried to provide answers to two frequently asked questions. I hope all of this will generate more interest in the questions being raised about connectionist learning. As you can see below, perhaps other questions need to be raised. The original posting is attached below for reference. Asim Roy Arizona State University ============================ ANSWERS TO SOME FREQUENTLY ASKED QUESTIONS:

a) Humans get stuck in local minima all the time. So what is wrong with algorithms getting stuck in local minima?

RESPONSE: We can only claim that humans are sometimes "unable to learn." We cannot make any claim beyond that. And so this phenomenon of "unable to learn" does not necessarily imply "getting stuck in a local minimum." Inability to learn may be due to a number of reasons, including insufficient information, inability to extract the relevant features of a problem, insufficient reward or punishment, and so on. Again, to reiterate, "inability to learn" does not imply "getting stuck in a local minimum." Perhaps this misconception has been promoted in order to justify certain algorithms and their weak learning characteristics.

b) Why do you say classical connectionist learning is memoryless? Isn't memory actually in the weights?

RESPONSE: Memoryless learning implies there is no EXPLICIT storage of any learning example in the system in order to learn. In classical connectionist learning, the weights of the net are adjusted whenever a learning example is presented, but the example itself is promptly forgotten by the system. There is no EXPLICIT storage of any presented example in the system. That is the generally accepted view of "adaptive" or "on-line learning systems." Imagine such a system "planted" in some human brain. And suppose we want to train it to learn addition.
So we provide the first example - say, 2 + 2 = 4. The system promptly adjusts the weights of the net and forgets the particular example. It has done what it is supposed to do - adjust the weights, given a learning example. Suppose you then ask this "human", fitted with this learning algorithm: "How much is 2 + 2?" Since it has only seen one example and has not yet fully grasped the rule for adding numbers, it probably would give a wrong answer. So you, as the teacher, might ask at that point: "I just told you 2 + 2 = 4, remember?" And this "human" might respond: "Very honestly, I don't recall you ever having said that! I am very sorry." And this would continue to happen after every example you present to this "human"!!! So do you think there is memory in those "weights"? Do you think humans are like that? Please send any comments on these issues directly to me (asim.roy at asu.edu). All comments/criticisms/suggestions are welcome. And all good science depends on vigorous debate. Asim Roy Arizona State University ============================ From weaveraj at helios.aston.ac.uk Wed May 28 08:49:53 1997 From: weaveraj at helios.aston.ac.uk (Andrew Weaver) Date: Wed, 28 May 1997 13:49:53 +0100 Subject: Postgraduate opportunities at Aston University Message-ID: <18503.199705281449@sun.aston.ac.uk> Research in Neural Computing PhD in Neural Computing at Aston University, Birmingham, UK PhD programmes are available on either a full-time or part-time basis. Most full-time students start in October each year and take the taught modules from the MSc in Pattern Analysis and Neural Networks during the first term. MSc by Research in Pattern Analysis and Neural Networks The MSc comprises a substantial 9-month research project, taught modules in Artificial Neural Networks, Statistical Pattern Analysis, Algorithms & Computational Mathematics, Time Series & Signal Processing, Data Visualisation & Density Modelling, & Research Methodology, computer lab sessions & tutorials.
The course emphasises the advantages of a principled approach to data analysis. There are strong commercial & industrial links through research projects & bursaries, & employment prospects for graduates are good. Funding Full studentships (for eligible students, please check EPS), paying tuition fees and living expenses, are available for both research programmes. Funding is provided by the EPSRC (http://www.epsrc.ac.uk/) and industrial sponsors and is awarded by the Research Group on a competitive basis. Applications from self-funded students are also very welcome. For further information, please contact: Hanni Sondermann, tel:0121 333 4631; email: ncrg at aston.ac.uk; www: http://www.ncrg.aston.ac.uk/ From fayyad at MICROSOFT.com Wed May 28 15:18:11 1997 From: fayyad at MICROSOFT.com (Usama Fayyad) Date: Wed, 28 May 1997 12:18:11 -0700 Subject: Data Mining and Knowledge Discovery: vol.1:2 & full contents 1:1 on web Message-ID: <28347281A2B5CF119AB000805FD4186602F15CF9@RED-77-MSG.dns.microsoft.com> Please post the following announcement: Issue 2 of the new journal: Data Mining and Knowledge Discovery has been finalized. You can access the abstracts and full text of the editorial at the journal's home page: http://www.research.microsoft.com/datamine Also, issue 1 is now available free on line from Kluwer's web server. Links to Kluwer's server are accessible via the above homepage or directly at: http://www.wkap.nl/kapis/CGI-BIN/WORLD/kaphtml.htm?DAMISAMPLE =================================== DATA MINING AND KNOWLEDGE DISCOVERY Volume 1, issue 2 =================================== CONTENTS: -------- Editorial Usama Fayyad, editor-in-chief ---------------------------------------------- PAPERS ------ BIRCH: A New Data Clustering Algorithm and Its Applications Tian Zhang, Raghu Ramakrishnan, Miron Livny Mathematical Programming in Data Mining O. L. 
Mangasarian A Simple Constraint-Based Algorithm for Efficiently Mining Observational Databases for Causal Relationships Gregory F. Cooper ---------------------------------------------- BRIEF APPLICATION SUMMARY ------------------------- Visual Data Mining: Recognizing Telephone Calling Fraud Kenneth C. Cox, Stephen G. Eick, Graham J. Wills, and Ronald J. Brachman ================================================ Usama Fayyad datamine at microsoft.com for more information on the journal, CFP, and to submit a paper, please see: http://www.research.microsoft.com/datamine From tibs at utstat.toronto.edu Thu May 29 16:16:00 1997 From: tibs at utstat.toronto.edu (tibs@utstat.toronto.edu) Date: Thu, 29 May 97 16:16 EDT Subject: new paper available Message-ID: Tech report available: The out-of-bootstrap method for model averaging and selection ...enjoying the Bayesian omelette without making a mess in the kitchen J. Sunil Rao and Robert Tibshirani We propose a bootstrap-based method for model averaging and selection that focuses on training points that are left out of individual bootstrap samples. This information can be used to estimate optimal weighting factors for combining estimates from different bootstrap samples, and also for finding the best subsets in the linear model setting. These proposals provide alternatives to Bayesian approaches to model averaging and selection, requiring less computation and fewer subjective choices. Comments welcome ftp://utstat.toronto.edu/pub/tibs/outofbootstrap.ps http://utstat.toronto.edu/tibs/research.html ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ Rob Tibshirani, Dept of Preventive Med & Biostats, and Dept of Statistics Univ of Toronto, Toronto, Canada M5S 1A8. Phone: 416-978-4642 (PMB), 416-978-0673 (stats). FAX: 416 978-8299 computer fax 416-978-1525 (please call or email to notify me) tibs at utstat.toronto.edu.
ftp://utstat.toronto.edu/pub/tibs http://www.utstat.toronto.edu/~tibs +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ From gluck at pavlov.rutgers.edu Thu May 29 18:02:57 1997 From: gluck at pavlov.rutgers.edu (Mark A. Gluck) Date: Thu, 29 May 1997 18:02:57 -0400 Subject: Hippocampus Special Issue on Computational Models of Memory Message-ID: NEW SPECIAL ISSUE OF HIPPOCAMPUS ON COMPUTATIONAL MODELS: Now available for purchase as a single issue from Wiley-Liss publishers (see below for ordering information): Computational Models of Hippocampal Function in Memory A Special Issue of Hippocampus (V.6, No.6, 1996) Guest Edited by: Mark A. Gluck (Rutgers-Newark Neuroscience) PRECIS: This special issue of Hippocampus focuses on computational network models of hippocampal function, especially those that make substantive contact with data from behavioral studies of learning and memory. It provides the non-specialist reader with a general understanding of the aims, accomplishments and limitations of computational approaches to understanding hippocampal function. The articles in this issue are written so as to facilitate the comparison between different computational models, and to assist the non-mathematically inclined reader in understanding how and where these models can be used as tools for understanding and motivating empirical research, including physiological, anatomical, and behavioral studies. The articles included focus on describing the spirit and behavior of computational models, omitting most details on their exact mathematical underpinnings. CONTENTS: INTRODUCTION page 565 Mark A. Gluck Computational Models of Hippocampal Function in Memory CIRCUIT-LEVEL MODELS page 567 Richard Granger, Sherman P. Wiebe, Makoto Taketani, and Gary Lynch Distinct Memory Circuits Composing the Hippocampal Region page 579 William B.
Levy A Sequence Predicting CA3 Is a Flexible Associator That Learns and Uses Context to Solve Hippocampal-Like Tasks page 591 Jim-Shih Liaw and Theodore W. Berger Dynamic Synapse: A New Concept of Neural Representation and Computation page 601 Edmund T. Rolls A Theory of Hippocampal Function in Memory CONDITIONING AND ANIMAL LEARNING page 621 Catalin V. Buhusi and Nestor A. Schmajuk Attention, Configuration, and Hippocampal Function page 643 Mark A. Gluck and Catherine E. Myers Integrating Behavioral and Physiological Models of Hippocampal Function EPISODIC MEMORY AND CONSOLIDATION page 654 James L. McClelland and Nigel H. Goddard Considerations Arising From a Complementary Learning Systems Perspective on Hippocampus and Neocortex page 666 Alessandro Treves, William E. Skaggs, and Carol A. Barnes How Much of the Hippocampus Can Be Explained by Functional Constraints? page 675 Jaap M.J. Murre TraceLink: A Model of Amnesia and Consolidation of Memory page 685 Bin Shen and Bruce L. McNaughton Modeling the Spontaneous Reactivation of Experience-Specific Hippocampal Cell Assemblies During Sleep page 693 Michael E. Hasselmo, Bradley P. Wyble, and Gene V. Wallenstein Encoding and Retrieval of Episodic Memories: Role of Cholinergic and GABAergic Modulation in the Hippocampus SPATIAL NAVIGATION page 709 Robert U. Muller and Matt Stead Hippocampal Place Cells Connected by Hebbian Synapses Can Solve Spatial Problems page 720 Patricia E. Sharp, Hugh T. Blair, and Michael Brown Neural Network Modeling of the Hippocampal Formation Spatial Signals and Their Possible Role in Navigation: A Modular Approach page 735 Michael Recce and Kenneth D. Harris Memory for Places: A Navigational Model in Support of Marr's Theory of Hippocampal Function page 749 Neil Burgess and John O'Keefe Neuronal Computations Underlying the Firing of Place Cells and Their Role in Navigation page 763 Index for Volume 6 ORDERING INFORMATION This special issue is available for $45.00 from the publisher.
To order, contact: Stacey Lee, John Wiley & Sons, Inc. 605 Third Avenue, New York, NY 10158 (212) 850-8840 or slee at wiley.com. ____________________________________________________________________________ Dr. Mark A. Gluck, Associate Professor Center for Molecular & Behavioral Neuroscience Rutgers University 197 University Ave. Newark, New Jersey 07102 Phone: (201) 648-1080 (Ext. 3221) Fax: (201) 648-1272 Cellular: (917) 855-8906 Email: gluck at pavlov.rutgers.edu WWW Homepage: www.gluck.edu _____________________________________________________________________________ From schwenk at IRO.UMontreal.CA Thu May 29 17:52:48 1997 From: schwenk at IRO.UMontreal.CA (Holger Schwenk) Date: Thu, 29 May 1997 17:52:48 -0400 (EDT) Subject: techreport on application of AdaBoost to neural networks Message-ID: <199705292152.RAA29852@grosse.iro.umontreal.ca> Hello, The following technical report on the application of AdaBoost to neural networks is available on the WWW page: http://www.iro.umontreal.ca/~lisa/pointeurs/AdaBoostTR.ps or http://www.iro.umontreal.ca/~schwenk/papers/AdaBoostTR.ps.gz Comments and suggestions are welcome. Holger Schwenk ------------------------------------------------------------------------------- Holger Schwenk phone: (514) 343-6111 ext 1655 fax: (514) 343-5834 LISA, Dept. IRO University of Montreal email: schwenk at iro.umontreal.ca 2920 Chemin de la tour, CP 6128 http://www.iro.umontreal.ca/~schwenk Montreal, Quebec, H3C 3J7 CANADA ------------------------------------------------------------------------------- Adaptive Boosting of Neural Networks for Character Recognition Holger Schwenk and Yoshua Bengio Dept. Informatique et Recherche Operationnelle Universite de Montreal, Montreal, Qc H3C-3J7, Canada {schwenk,bengioy}@iro.umontreal.ca May 29, 1997 "Boosting" is a general method for improving the performance of any learning algorithm that consistently generates classifiers which need to perform only slightly better than random guessing.
A recently proposed and very promising boosting algorithm is AdaBoost [5]. It has been applied with great success to several benchmark machine learning problems using rather simple learning algorithms [4], in particular decision trees [1,2,6]. In this paper we use AdaBoost to improve the performance of neural networks applied to character recognition tasks. We compare training methods based on sampling the training set and weighting the cost function. Our system achieves about 1.4% error on a database of online handwritten digits from more than 200 writers. Adaptive boosting of a multi-layer network achieved 2% error on the UCI Letters offline character data set. From terry at salk.edu Sat May 31 01:00:23 1997 From: terry at salk.edu (Terry Sejnowski) Date: Fri, 30 May 1997 22:00:23 -0700 (PDT) Subject: 4th Annual Joint Symposium on Neural Computation Message-ID: <199705310500.WAA13660@helmholtz.salk.edu> Abstracts for the papers at this meeting can be found at: http://www.cnl.salk.edu/inc/JSNC97abstracts.html Proceedings can be obtained from the Institute for Neural Computation, UCSD 0523, La Jolla, CA 92093. --- 4th Annual Joint Symposium on Neural Computation --- Co-sponsored by the Institute for Neural Computation, University of California, San Diego, and the Biomedical Engineering Department and Neuroscience Program, University of Southern California The University of Southern California University Park Campus Rm. 124, Seeley G. Mudd Building Saturday, May 17, 1997 8:00 a.m. to 5:30 p.m.
Session 1: "VISION" - Bartlett Mel, Chair
9:00 am Peter Kalocsai, USC "Using Extension Fields to Improve Performance of a Biologically Inspired Recognition Model"
9:15 am Kechen Zhang, The Salk Institute "A Conjugate Neural Representation of Visual Objects in Three Dimensions"
9:30 am Alexander Grunewald, Caltech "Detection of First and Second Order Motion"
9:45 am Zhong-Lin Lu, USC "Extracting Characteristic Structures from Natural Images Through Statistically Certified Unsupervised Learning"
10:00 am Don MacLeod, UC San Diego "Optimal Nonlinear Codes"
10:15 am Lisa J. Croner, The Salk Institute "Segmentation by Color Influences Response of Motion-Sensitive Neurons in Cortical Area MT"
10:15 am - 10:30 am *** BREAK ***

Session 2: "CODING in NEURAL SYSTEMS" - Christof Koch, Chair
10:30 am Dawei Dong, Caltech "How Efficient is Temporal Coding in the Early Visual System?"
10:45 am Martin Stemmler, Caltech "Entropy Maximization in Hodgkin-Huxley Models"
11:00 am Michael Wehr, Caltech "Temporal Coding with Oscillatory Sequences of Firing"
11:15 am Martin J. McKeown, The Salk Institute "Functional Magnetic Resonance Imaging Data Interpreted as Spatially Independent Mixtures"
11:30 am KEYNOTE SPEAKER: Prof. Irving Biederman, William M. Keck Professor of Cognitive Neuroscience, Departments of Psychology and Computer Science and the Neuroscience Program, USC "Shape Representation in Mind and Brain"
-------------
12:30 pm - 2:30 pm *** LUNCH/POSTERS ***
P1. Konstantinos Alataris, USC "Modeling of Neuronal Ensemble Dynamics"
P2. George Barbastathis, Caltech "Awareness-Based Computation"
P3. Marian Stewart Bartlett, UC San Diego "What are the Independent Components of Face Images?"
P4. Maxim Bazhenov, The Salk Institute "A Computational Model of Intrathalamic Augmenting Responses"
P5. Alan Bond, Caltech "A Computational Model for the Primate Brain Based on its Functional Architecture"
P6.
Glen Brown, The Salk Institute "Output Sign Switching by Neurons is Mediated by a Novel Voltage-Dependent Sodium Current"
P7. Martin Chian, USC "Characterization of Unobservable Neural Circuitry in the Hippocampus with Nonlinear Systems Analysis"
P8. Carl Chiang, The Neuroscience Institute "Visual and Sensorimotor Intra- and Intercolumnar Synchronization in Awake Behaving Cat"
P9. Matthew Dailey, UC San Diego "Learning a Specialization for Face Recognition"
P10. Emmanuel Gillissen, Caltech "Comparative Studies of Callosal Specification in Mammals"
P11. Michael Gray, The Salk Institute "Informative Features for Visual Speechreading"
P12. Alex Guazzelli, USC "A Taxon-Affordances Model of Rat Navigation"
P13. Marwan Jabri, The Salk Institute/Sydney University "A Neural Network Model for Saccades and Fixation on Superior Colliculus"
P14. Mathew Lamb, USC "Depth Based Prey Capture in Frogs and Salamanders"
P15. Te-Won Lee, The Salk Institute "Independent Component Analysis for Mixed Sub-Gaussian and Super-Gaussian Sources"
P16. George Marnellos, The Salk Institute "A Gene Network of Early Neurogenesis in Drosophila"
P17. Steve Potter, Caltech "Animat in a Petri Dish: Cultured Neural Networks for Studying Neural Computation"
P18. James Prechtl, UC San Diego "Visual Stimuli Induce Propagating Waves of Electrical Activity in Turtle Cortex"
P19. Raphael Ritz, The Salk Institute "Multiple Synfire Chains in Simultaneous Action Lead to Poisson-Like Neuronal Firing"
P20. Adrian Robert, UC San Diego "A Model of the Effects of Lamination and Celltype Specialization in the Neocortex"
P21. Joseph Sirosh, HNC Software Inc. "Large-Scale Neural Network Simulations Suggest a Single Mechanism for the Self-Organization of Orientation Maps, Lateral Connections and Dynamic Receptive Fields in the Primary Visual Cortex"
P22. George Sperling, UC Irvine "A Proposed Architecture for Visual Motion Perception"
P23.
Adam Taylor, UC San Diego "Dynamics of a Recurrent Network of Two Bipolar Units"
P24. Laurenz Wiskott, The Salk Institute "Objective Functions for Neural Map Formation"
-------------------------------------------
Session 3: "HARDWARE" - Michael Arbib, Chair
2:30 pm Christof Born, Caltech "Real Time Ego-Motion Estimation with Neuromorphic Analog VLSI Sensors"
2:45 pm Anil Thakoor, JPL "High Speed Image Computation with 3D Analog Neural Hardware"

Session 4: "VISUOMOTOR COORDINATION" - Michael Arbib, Chair
3:00 pm Marwan Jabri, The Salk Institute/Sydney University "A Computational Model of Auditory Space Neural Coding in the Superior Colliculus"
3:15 pm Amanda Bischoff, USC "Modeling the Basal Ganglia in a Reciprocal Aiming Task"
3:30 pm Jacob Spoelstra, USC "A Computational Model of the Role of the Cerebellum in Adapting to Throwing While Wearing Wedge Prism Glasses"
3:45 pm - 4:00 pm *** BREAK ***

Session 5: "CHANNELS, SYNAPSES, and DENDRITES" - Terry Sejnowski, Chair
4:00 pm Akaysha C. Tang, The Salk Institute "Modeling the Effect of Neuromodulation of Spike Timing in Neocortical Neurons"
4:15 pm Michael Eisele, The Salk Institute "Reinforcement Learning by Pyramidal Neurons"
4:30 pm Sunil S. Dalal, USC "A Nonlinear Positive Feedback Model of Glutamatergic Synaptic Transmission in Dentate Gyrus"
4:45 pm Venkatesh Murthy, The Salk Institute "Are Neighboring Synapses Independent?"
5:00 pm Gary Holt, Caltech "Shunting Inhibition Does Not Have a Divisive Effect on Firing Rates"
5:15 pm Kevin Archie, USC "Binocular Disparity Tuning in Cortical 'Complex' Cells: Yet Another Role for Intradendritic Computation?"