From Connectionists-Request at cs.cmu.edu Fri Jul 1 00:05:15 1994
From: Connectionists-Request at cs.cmu.edu (Connectionists-Request@cs.cmu.edu)
Date: Fri, 01 Jul 94 00:05:15 -0400
Subject: Bi-monthly Reminder
Message-ID: <672.773035515@B.GP.CS.CMU.EDU>

*** DO NOT FORWARD TO ANY OTHER LISTS ***

This note was last updated May 5, 1994. This is an automatically posted bi-monthly reminder about how the CONNECTIONISTS list works and how to access various online resources.

CONNECTIONISTS is a moderated forum for enlightened technical discussions and professional announcements. It is not a random free-for-all like comp.ai.neural-nets. Membership in CONNECTIONISTS is restricted to persons actively involved in neural net research. The following posting guidelines are designed to reduce the number of irrelevant messages sent to the list. Before you post, please remember that this list is distributed to thousands of busy people who don't want their time wasted on trivia. Also, many subscribers pay cash for each kbyte; they shouldn't be forced to pay for junk mail.

-- Dave Touretzky & David Redish

---------------------------------------------------------------------
What to post to CONNECTIONISTS
------------------------------

- The list is primarily intended to support the discussion of technical issues relating to neural computation.

- We encourage people to post the abstracts of their latest papers and tech reports.

- Conferences and workshops may be announced on this list AT MOST twice: once to send out a call for papers, and once to remind non-authors about the registration deadline. A flood of repetitive announcements about the same conference is not welcome here.

- Requests for ADDITIONAL references. This has been a particularly sensitive subject. Please try to (a) demonstrate that you have already pursued the quick, obvious routes to finding the information you desire, and (b) give people something back in return for bothering them.
The easiest way to do both these things is to FIRST do the library work to find the basic references, then POST these as part of your query. Here's an example:

WRONG WAY: "Can someone please mail me all references to cascade correlation?"

RIGHT WAY: "I'm looking for references to work on cascade correlation. I've already read Fahlman's paper in NIPS 2, his NIPS 3 abstract, corresponded with him directly and retrieved the code in the nn-bench archive. Is anyone aware of additional work with this algorithm? I'll summarize and post results to the list."

- Announcements of job openings related to neural computation.

- Short reviews of new textbooks related to neural computation.

To send mail to everyone on the list, address it to Connectionists at CS.CMU.EDU

-------------------------------------------------------------------
What NOT to post to CONNECTIONISTS:
-----------------------------------

- Requests for addition to the list, change of address and other administrative matters should be sent to: "Connectionists-Request at cs.cmu.edu" (note the exact spelling: many "connectionists", one "request"). If you mention our mailing list to someone who may apply to be added to it, please make sure they use the above and NOT "Connectionists at cs.cmu.edu".

- Requests for e-mail addresses of people who are believed to subscribe to CONNECTIONISTS should be sent to postmaster at appropriate-site. If the site address is unknown, send your request to Connectionists-Request at cs.cmu.edu and we'll do our best to help. A phone call to the appropriate institution may sometimes be simpler and faster.

- Note that in many mail programs a reply to a message is automatically "CC"-ed to all the addresses on the "To" and "CC" lines of the original message. If the mailer you use has this property, please make sure your personal response (request for a Tech Report etc.) is NOT broadcast over the net.
-------------------------------------------------------------------------------
The CONNECTIONISTS Archive:
---------------------------

All e-mail messages sent to "Connectionists at cs.cmu.edu" starting 27-Feb-88 are now available for public perusal. A separate file exists for each month. The files' names are:

  arch.yymm

where yymm stands for the obvious thing. Thus the earliest available data are in the file:

  arch.8802

Files ending with .Z are compressed using the standard unix compress program. To browse through these files (as well as through other files, see below) you must FTP them to your local machine. The file "current" in the same directory contains the archives for the current month.

-------------------------------------------------------------------------------
How to FTP Files from the CONNECTIONISTS Archive
------------------------------------------------

1. Open an FTP connection to host B.GP.CS.CMU.EDU
2. Login as user anonymous with password your username.
3. 'cd' directly to the following directory:

   /afs/cs/project/connect/connect-archives

The archive directory is the ONLY one you can access. You can't even find out whether any other directories exist. If you are using the 'cd' command you must cd DIRECTLY into this directory.

Problems? - contact us at "Connectionists-Request at cs.cmu.edu".
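To make the arch.yymm convention concrete, here is a small shell helper that computes the archive file name for a given year and month. This is only an illustrative sketch; arch_name is not a tool provided by the archive itself.

```shell
# Sketch: map a year and month to the archive file name, following
# the arch.yymm convention described above. (arch_name is an
# illustrative helper name, not part of the archive.)
arch_name() {
    year=$1    # four-digit year, e.g. 1988
    month=$2   # month number, 1-12
    printf 'arch.%02d%02d\n' "$((year % 100))" "$month"
}

arch_name 1988 2    # -> arch.8802 (the earliest archive)
arch_name 1994 7    # -> arch.9407
```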
-------------------------------------------------------------------------------
Using Mosaic and the World Wide Web
-----------------------------------

You can also access these files using the following url:

  ftp://b.gp.cs.cmu.edu/afs/cs/project/connect/connect-archives

----------------------------------------------------------------------
The NEUROPROSE Archive
----------------------

Anonymous FTP on archive.cis.ohio-state.edu (128.146.8.52), pub/neuroprose directory

This directory contains technical reports as a public service to the connectionist and neural network scientific community, which has an organized mailing list (for info: connectionists-request at cs.cmu.edu)

Researchers may place electronic versions of their preprints in this directory, announce availability, and other interested researchers can rapidly retrieve and print the postscripts. This saves copying, postage and handling, by having the interested reader supply the paper. We strongly discourage the merger into the repository of existing bodies of work or the use of this medium as a vanity press for papers which are not of publication quality.

PLACING A FILE

To place a file, put it in the Inbox subdirectory, and send mail to pollack at cis.ohio-state.edu. Within a couple of days, I will move and protect it, and suggest a different name if necessary. The current naming convention is author.title.filetype.Z where title is just enough to discriminate among the files of the same author. The filetype is usually "ps" for postscript, our desired universal printing format, but may be tex, which requires more local software than a spooler. The Z indicates that the file has been compressed by the standard unix "compress" utility, which results in the .Z affix. To place or retrieve .Z files, make sure to issue the FTP command "BINARY" before transferring files. After retrieval, call the standard unix "uncompress" utility, which removes the .Z affix. An example of placing a file is in the appendix.
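The author.title.filetype.Z convention can be checked mechanically before uploading. The helper below is only a sketch: check_np_name is an illustrative name, not part of the repository, and it accepts just the two filetypes mentioned above (ps and tex).

```shell
# Sketch: verify that a file name matches the Neuroprose convention
# author.title.filetype.Z, with filetype "ps" or "tex".
# (check_np_name is an illustrative helper, not part of the archive.)
check_np_name() {
    case $1 in
        *.*.ps.Z|*.*.tex.Z) echo "ok" ;;
        *) echo "does not match author.title.{ps,tex}.Z" ;;
    esac
}

check_np_name rosenblatt.reborn.ps.Z    # -> ok
check_np_name mythesis.ps               # rejected: no title part, not compressed
```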
Make sure your paper is single-spaced, so as to save paper, and include an INDEX entry, consisting of 1) the filename, 2) the email contact for problems, 3) the number of pages and 4) a one sentence description. See the INDEX file for examples.

ANNOUNCING YOUR PAPER

It is the author's responsibility to invite other researchers to make copies of their paper. Before announcing, have a friend at another institution retrieve and print the file, so as to catch dependencies on local PostScript libraries that would cause errors elsewhere. And let the community know how many pages to expect on their printer. Finally, information about where the paper will/might appear is appropriate inside the paper as well as in the announcement.

Please add two lines to your mail header, or the top of your message, so as to facilitate the development of mailer scripts and macros which can automatically retrieve files from both NEUROPROSE and other lab-specific repositories:

FTP-host: archive.cis.ohio-state.edu
FTP-filename: /pub/neuroprose/filename.ps.Z

When you announce a paper, you should consider whether (A) you want it automatically forwarded to other groups, like NEURON-DIGEST (which gets posted to comp.ai.neural-nets), and whether you want to provide (B) free or (C) prepaid hard copies for those unable to use FTP. To prevent forwarding, place a "**DO NOT FORWARD TO OTHER GROUPS**" at the top of your file. If you do offer hard copies, be prepared for a high cost. One author reported that when they allowed combination AB, the rattling around of their "free paper offer" on the worldwide data net generated over 2000 hardcopy requests!

A shell script called Getps, written by Tony Plate, is in the directory, and can perform the necessary retrieval operations, given the file name. Functions for GNU Emacs RMAIL and other mailing systems will also be posted as debugged and available. At any time, for any reason, the author may request that their paper be updated or removed.
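The point of the two header lines is that retrieval can be scripted. A script in the spirit of Getps might start by pulling the host and file name out of an announcement, as in the sketch below (ftp_url_from_msg is an illustrative name; it assumes the headers appear at the start of a line, and a real script would go on to fetch the file in BINARY mode and uncompress it).

```shell
# Sketch: extract the FTP-host and FTP-filename headers from an
# announcement read on stdin and print the corresponding ftp:// URL.
# (ftp_url_from_msg is an illustrative name, not an existing tool.)
ftp_url_from_msg() {
    awk '
        /^FTP-host:/     { host = $2 }
        /^FTP-filename:/ { file = $2 }
        END { if (host != "" && file != "") print "ftp://" host file }
    '
}

ftp_url_from_msg <<'EOF'
FTP-host: archive.cis.ohio-state.edu
FTP-filename: /pub/neuroprose/filename.ps.Z
EOF
# -> ftp://archive.cis.ohio-state.edu/pub/neuroprose/filename.ps.Z
```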
For further questions contact:

Jordan Pollack                    Assistant Professor
CIS Dept/OSU                      Laboratory for AI Research
2036 Neil Ave                     Email: pollack at cis.ohio-state.edu
Columbus, OH 43210                Phone: (614) 292-4890

APPENDIX: Here is an example of naming and placing a file:

gvax> cp i-was-right.txt.ps rosenblatt.reborn.ps
gvax> compress rosenblatt.reborn.ps
gvax> ftp cheops.cis.ohio-state.edu
Connected to cheops.cis.ohio-state.edu.
220 cheops.cis.ohio-state.edu FTP server ready.
Name: anonymous
331 Guest login ok, send ident as password.
Password: neuron
230 Guest login ok, access restrictions apply.
ftp> binary
200 Type set to I.
ftp> cd pub/neuroprose/Inbox
250 CWD command successful.
ftp> put rosenblatt.reborn.ps.Z
200 PORT command successful.
150 Opening BINARY mode data connection for rosenblatt.reborn.ps.Z
226 Transfer complete.
100000 bytes sent in 3.14159 seconds
ftp> quit
221 Goodbye.
gvax> mail pollack at cis.ohio-state.edu
Subject: file in Inbox.

Jordan, I just placed the file rosenblatt.reborn.ps.Z in the Inbox. Here is the INDEX entry:

rosenblatt.reborn.ps.Z
rosenblatt at gvax.cs.cornell.edu
17 pages.
Boastful statements by the deceased leader of the neurocomputing field.

Let me know when it is in place so I can announce it to Connectionists at cmu.
Frank
^D

AFTER FRANK RECEIVES THE GO-AHEAD, AND HAS A FRIEND TEST RETRIEVE THE FILE, HE DOES THE FOLLOWING:

gvax> mail connectionists
Subject: TR announcement: Born Again Perceptrons

FTP-host: archive.cis.ohio-state.edu
FTP-file: pub/neuroprose/rosenblatt.reborn.ps.Z

The file rosenblatt.reborn.ps.Z is now available for copying from the Neuroprose repository:

Born Again Perceptrons (17 pages)
Frank Rosenblatt
Cornell University

ABSTRACT: In this unpublished paper, I review the historical facts regarding my death at sea: Was it an accident or suicide? Moreover, I look over the past 23 years of work and find that I was right in my initial overblown assessments of the field of neural networks.
~r.signature
^D

------------------------------------------------------------------------
How to FTP Files from the NN-Bench Collection
---------------------------------------------

1. Create an FTP connection from wherever you are to machine "pt.cs.cmu.edu" (128.2.254.155).
2. Log in as user "anonymous" with password your username.
3. Change remote directory to "/afs/cs/project/connect/bench". Any subdirectories of this one should also be accessible. Parent directories should not be.
4. At this point FTP should be able to get a listing of files in this directory and fetch the ones you want.

Problems? - contact us at "nn-bench-request at cs.cmu.edu".

From pitt at cs.uiuc.edu Fri Jul 1 15:21:49 1994
From: pitt at cs.uiuc.edu (Lenny Pitt)
Date: Fri, 01 Jul 94 15:21:49 EDT
Subject: ML/COLT tutorial -- Computational Learning Theory Intro & Survey
Message-ID: <199407012021.AA06535@pitt.cs.uiuc.edu>

=========================================================
Computational Learning Theory: Introduction and Survey
=========================================================

Sunday, July 10, 1994, 8:45 am to 12:15 pm
Rutgers University
New Brunswick, New Jersey

Tutorial conducted by

Lenny Pitt
University of Illinois
Urbana, IL 61801
pitt at cs.uiuc.edu

Held in conjunction with the Eleventh International Conference on Machine Learning (ML94, July 11-13, 1994) and the Seventh Annual Conference on Computational Learning Theory (COLT94, July 12-15, 1994).

This tutorial will introduce the different formal learning models (eg, ``pac'' learning, mistake-bounded learning, learning with queries), present basic techniques for proving learnability and nonlearnability (eg, the VC-dimension, Occam algorithms, reductions between learning problems), and survey many of the central results in the area. The tutorial is designed to give ML attendees and those with a general interest in machine learning sufficient background to appreciate past and recent results in computational learning theory.
It should also help attendees appreciate the significance and contributions of the papers that will be presented at the COLT94 conference that follows. No prior knowledge of learning theory is assumed. The tutorial is one of a set of DIMACS-sponsored tutorials that are free and open to the general public.

Directions to Rutgers can be found in the ML/COLT announcement, which is available via anonymous ftp from www.cs.rutgers.edu in the directory "/pub/learning94". The specific location of the tutorial will be posted, and available with conference materials. Users of www information servers such as mosaic can find the information at "http://www.cs.rutgers.edu/pub/learning94/learning94.html". Other available information includes a campus map, and abstracts of all workshops/tutorials. Questions can be directed to ml94 at cs.rutgers.edu, or to colt94 at research.att.com.

From zoon at zoon.postech.ac.kr Wed Jul 6 08:29:09 1994
From: zoon at zoon.postech.ac.kr (Prof. Cho Sungzoon)
Date: Wed, 6 Jul 94 08:29:09 KDT
Subject: NN-SMP-95 Call for Papers
Message-ID: <9407052229.AA28058@zoon.postech.ac.kr.noname>

===========================================================================
---------------------------------------------------------------------------
--- CALL FOR PAPERS ---

Workshop on the Theory of Neural Networks: The Statistical Mechanics Perspective

Sponsored by the Center for Theoretical Physics at Seoul National University and the Basic Science Research Institute at POSTECH

February 2-4, 1995
Pohang University of Science and Technology, Pohang, Korea
===========================================================================

During the last decade, methods of statistical mechanics were successfully applied to the theory of neural networks. The study of neural networks became an important part of statistical physics, and the results influenced neighboring fields.
In this workshop, we will review the status of the statistical physics of neural networks and discuss future directions. We invite papers on the theory of neural networks both from the statistical physics community and outside. We look forward to active interdisciplinary discussions, and encourage participation from related fields such as non-linear dynamics, computer science, mathematics, statistics, information theory and neurobiology.

Invited speakers

S. Amari (Tokyo Univ.)
H. Sompolinsky (Hebrew Univ.)
D. Haussler (UCSC)
I. Kanter (Bar Ilan Univ.)
M. Kearns (AT&T)
M. Opper (U. Wuerzburg)
G. M. Shim (K.U. Leuven)
H. S. Seung (AT&T)
K. Y. M. Wong (HKUST)
and more.

Abstract Submission

Authors should submit six copies of an abstract, to be received by Tuesday, November 15, 1994, to

Jong-Hoon Oh - NNSMP
Department of Physics
Pohang University of Science and Technology (POSTECH)
Hyoja San 31, Pohang, Kyongbuk 790-784, Korea
nnsmp at galaxy.postech.ac.kr

An e-mail submission of the abstract to the above address is also possible. The abstract should include the title, the authors' names, affiliations, postal and e-mail addresses, and telephone and fax numbers if any. The body of the abstract should be no longer than 300 words. A full paper should be submitted at the venue to be included in the proceedings.

Program Format

We encourage informal discussions among small groups of participants during the workshop. Invited talks and a limited number of contributed talks will be presented in the oral session. Most of the contributed works will be presented in a poster session. A tour of Kyoung-Ju (a 2,000-year-old city) is part of the workshop.

Registration

To preserve the advantages of a small workshop, we would like to keep the number of participants at an appropriate size. If you are interested in participating, please inform us of your intention as early as possible. Detailed registration information will be distributed via e-mail.
Requests for future announcements can be sent to nnsmp-info at galaxy.postech.ac.kr.

Advisory Committee
S. Amari (Tokyo Univ.), S. Y. Bang (POSTECH), S. I. Choi (POSTECH), K. C. Lee (SNU), H. Sompolinsky (Hebrew Univ.).

Local Organizing Committee
S. Cho (POSTECH), M. Y. Choi (SNU), D. Kim (SNU), S. Kim (POSTECH), C. Kwon (Myoung-Ji U.), J.-H. Oh (POSTECH).

Program Committee
I. Kanter (Bar Ilan), J.-H. Oh (POSTECH), H. S. Seung (AT&T).

From mccallum at cs.rochester.edu Wed Jul 6 15:54:40 1994
From: mccallum at cs.rochester.edu (mccallum@cs.rochester.edu)
Date: Wed, 06 Jul 94 15:54:40 -0400
Subject: paper available by ftp
Message-ID: <199407061954.PAA04928@slate.cs.rochester.edu>

------- Blind-Carbon-Copy

FTP-host: ftp.cs.rochester.edu
FTP-file: pub/papers/robotics/94.mccallum-tr502.ps.Z
27 pages.

"First Results with Instance-Based State Identification for Reinforcement Learning"

R. Andrew McCallum
Department of Computer Science
University of Rochester
Technical Report 502

When a reinforcement learning agent's next course of action depends on information that is hidden from the sensors because of problems such as occlusion, restricted range, bounded field of view and limited attention, we say the agent suffers from the Hidden State Problem. State identification techniques use history information to uncover hidden state. Previous approaches to encoding history include: finite state machines [Chrisman 1992; McCallum 1992], recurrent neural networks [Lin 1992] and genetic programming with indexed memory [Teller 1994]. A chief disadvantage of all these techniques is their long training time. This report presents Instance-Based State Identification, a new approach to reinforcement learning with state identification that learns with far fewer training steps.
Noting that learning with history and learning in continuous spaces both share the property that they begin without knowing the granularity of the state space, the approach applies instance-based (or ``memory-based'') learning to history sequences---instead of recording instances in a continuous geometrical space, we record instances in action-perception-reward sequence space. The first implementation of this approach, called Nearest Sequence Memory, learns with an order of magnitude fewer steps than several previous approaches.

The paper is also available through the http URL below:

R. Andrew McCallum                EBOX: mccallum at cs.rochester.edu
Computer Science Dept             VOX: (716) 275-2527, (716) 275-1372 (lab)
University of Rochester           FAX: (716) 461-2018
Rochester, NY 14627-0226          http://www.cs.rochester.edu/u/mccallum

------- End of Blind-Carbon-Copy

From holm at thep.lu.se Thu Jul 7 08:18:43 1994
From: holm at thep.lu.se (Holm Schwarze)
Date: Thu, 7 Jul 1994 14:18:43 +0200 (MET DST)
Subject: Preprint announcement: Learning by Online Gradient Descent
Message-ID: <9407071218.AA23942@dacke.thep.lu.se>

A non-text attachment was scrubbed...
Name: not available
Type: text
Size: 1888 bytes
Desc: not available
Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/4306561b/attachment.ksh

From dlovell at elec.uq.oz.au Fri Jul 8 15:03:14 1994
From: dlovell at elec.uq.oz.au (David Lovell)
Date: Fri, 8 Jul 94 14:03:14 EST
Subject: Thesis available: The Neocognitron...Limitations and Improvements
Message-ID: <9407080403.AA11615@s1.elec.uq.oz.au>

FTP-host: archive.cis.ohio-state.edu
FTP-filename: /pub/neuroprose/Thesis/lovell.thesis.tar

Title: The Neocognitron as a System for Handwritten Character Recognition: Limitations and Improvements

Size: 3080192 bytes (compressed), 234 pages (10 point, single spaced, double-sided format), 100 figures.

No hardcopies available (sorry!)
Hi folks,

This is just to let you know that my doctoral dissertation can be retrieved from the Neuroprose archive. I know 3Mb is pretty hefty for a compressed file but there are a lot of detailed PostScript figures and bitmaps in the document. Here's how to print it out (once you have retrieved it):

tar xf lovell.thesis.tar
cd dlovell-thesis
zcat ch0.ps.Z | lpr -P(your PostScript printer)
zcat ch1.ps.Z | lpr -P(your PostScript printer)
zcat ch2.ps.Z | lpr -P(your PostScript printer)
etc...

I have written a C-shell script called "print-thesis.csh" which will automate the uncompressing and printing process for you. The README file contains an explanation of how to use "print-thesis.csh".

I hope that the thesis will be useful (or at least interesting) to anyone working in the area of off-line character recognition with hierarchical neural networks.

Best regards,

David

-------------------------------------------------------------------------------

The Neocognitron as a System for Handwritten Character Recognition: Limitations and Improvements

by David R. Lovell

A thesis submitted for the degree of Doctor of Philosophy
Department of Electrical and Computer Engineering, University of Queensland.

ABSTRACT

This thesis is about the neocognitron, a neural network that was proposed by Fukushima in 1979. Inspired by Hubel and Wiesel's serial model of processing in the visual cortex, the neocognitron was initially intended as a self-organizing model of vision; however, we are concerned with the supervised version of the network, put forward by Fukushima in 1983. Through "training with a teacher", Fukushima hoped to obtain a character recognition system that was tolerant of shifts and deformations in input images. Until now, though, it has not been clear whether Fukushima's approach has resulted in a network that can rival the performance of other recognition systems.
In the first three chapters of this thesis, the biological basis, operational principles and mathematical implementation of the supervised neocognitron are presented in detail. At the end of this thorough introduction, we consider a number of important issues that have not previously been addressed (at least not with any proven degree of success). How should S-cell selectivity and other parameters be chosen so as to maximize the network's performance? How sensitive is the network's classification ability to the supervisor's choice of training patterns? Can the neocognitron achieve state-of-the-art recognition rates and, if not, what is preventing it from doing so?

Chapter 4 looks at the Optimal Closed-Form Training (OCFT) algorithm, a method for adjusting S-cell selectivity, suggested by Hildebrandt in 1991. Experiments reveal flaws in the assumptions behind OCFT and provide motivation for the development and testing (in Chapter 5) of three new algorithms for selectivity adjustment: SOFT, SLOG and SHOP. Of these methods, SHOP is shown to be the most effective, determining appropriate selectivity values through the use of a validation set of handwritten characters. SHOP serves as a method for probing the behaviour of the neocognitron and is used to investigate the effect of cell masks, skeletonization of input data and choice of training patterns on the network's performance.

Even though SHOP is the best selectivity adjustment algorithm to be described to date, the system's peak correct recognition rate (for isolated ZIP code digits from the CEDAR database) is around 75% (with 75% reliability) after SHOP training. It is clear that the neocognitron, as originally described by Fukushima, is unable to match the performance of today's most accurate digit recognition systems, which typically achieve 90% correct recognition with near 100% reliability.
After observing the neocognitron's failure to exploit the distinguishing features of different kinds of digits in its classification of images, Chapter 6 proposes modifications to enhance the network's ability in this regard. Using this new architecture, a correct classification rate of 84.62% (with 96.36% reliability) was obtained on CEDAR ZIP codes, a substantial improvement but still a level of performance that is somewhat less than state-of-the-art recognition rates. Chapter 6 concludes with a critical review of the hierarchical feature extraction paradigm.

The final chapter summarizes the material presented in this thesis and draws the significant findings together in a series of conclusions. In addition to the investigation of the neocognitron, this thesis also contains a derivation of statistical bounds on the errors that arise in multilayer feedforward networks as a result of weight perturbation (Appendix E).

------------------------------------------------------------------------------

David Lovell - dlovell at elec.uq.oz.au     |
                                           |
Dept. Electrical and Computer Engineering  | "Oh bother! The pudding is ruined
University of Queensland                   | completely now!" said Marjory, as
BRISBANE 4072                              | Henry the dachshund leapt up and
Australia                                  | into the lemon surprise.
                                           |
tel: (07) 365 3770                         |

From pjh at compsci.stirling.ac.uk Fri Jul 8 12:50:59 1994
From: pjh at compsci.stirling.ac.uk (Peter J.B. Hancock)
Date: 8 Jul 94 12:50:59 BST (Fri)
Subject: NCPW3: Programme & registration (176 lines)
Message-ID: <9407081250.AA21688@uk.ac.stir.cs.nevis>

CALL FOR PARTICIPATION
**********************

Third Neural Computation and Psychology Workshop.
Wednesday Aug 31 - Fri Sept 2 1994.
Location: University of Stirling, Scotland, UK.

Provisional Programme:

Tue Aug 30th
14:00-19:00 registration
19:30 reception

Wed Aug 31st
09:00 Introduction.

Session 1: Cognition. Chair: Prof Vicky Bruce.
09:10 David Willshaw (Centre for Cognitive Science, University of Edinburgh), title tba
09:50 Dienes Z., Altmann G., Gao S-J., Goode A. (Experimental Psychology, University of Sussex), Mapping across domains without feedback: a neural network model of transfer of information.
10:25 Coffee
10:50 Bullinaria J. (Dept of Psychology, University of Edinburgh), Modelling reaction times
11:25 Glasspool D., Houghton G., Shallice T. (Dept of Psychology, University College, London), Interactions between knowledge sources in a dual-route connectionist model of spelling
12:00 Slack J.M. (Institute of Social and Applied Psychology, University of Kent), Distributed representations: a capacity constraint.
12:35 Lunch

Session 2: Low-Level Perception. Chair: Peter Hancock

14:00 Stone J.V. (Cognitive and Computing Sciences, University of Sussex), Learning spatio-temporal visual invariances using a self-organising neural network model.
14:35 Baddeley R. (University of Oxford), A Bayesian framework for understanding topographic map formation.
15:10 Tea
15:40 Fyfe C., Baddeley R. (University of Oxford), Edge sensitivity from exploratory projection pursuit.
16:15 Herrmann M., Bauer H.-U., Der R. (Nordita, Copenhagen, Inst. f. Theor. Physik, Universitaet Frankfurt, and Inst. f. Informatik, Universitaet Leipzig), The "perceptual magnet" effect: a model based on self-organizing maps.
16:40 Smyth D., Phillips W.A., Kay J.W. (Dept of Psychology, University of Stirling, and SASS, Aberdeen), Discovery of high-order functions in multi-stream, multi-stage nets without external supervision.
17:15 Session ends

Thur Sept 1st

Session 3: Audition. Chair: Prof David Willshaw.

09:00 Meddis R. (Dept of Human Sciences, Loughborough University of Technology), The conceptual basis of modelling auditory processing in the brainstem.
09:45 Scott S. (MRC APU, Cambridge), `Beats' in speech and music: a model of the perceptual centres of acoustic signals.
10:20 Coffee
10:45 Smith L.
(Dept of Computing Science, University of Stirling), Data-driven Sound Segmentation.
11:20 Beauvois M.W., Meddis R. (IRCAM, Paris, and Dept of Human Sciences, Loughborough University of Technology), Computer simulation of auditory stream segregation in pure-tone sequences.
11:55 Wiles J., Stevens C. (Depts of Computer Science and Psychology, University of Queensland), Music as a Componential Code: Acquisition and Representation of Temporal Relationships in a Simple Recurrent Network
12:30 Session ends.
12:40 Poster introductions (3-5 mins per poster).
13:10 Lunch
14:10 Poster session (CCCN).
15:45 Tea

Session 4: Sequence Learning. Chair: Leslie Smith.

16:15 Lovatt P.J., Bairaktaris D. (Dept of Computing Science, University of Stirling), A computational account of phonologically mediated free recall.
16:50 Harris K.D., Sanford A.J. (Dept of Psychology, University of Glasgow), Connectionist and process modelling of long-term sequence: the integration of relative judgements, representation and learning.
17:25 Bradbury D. (Human Cognition Research Lab, Open University), A model of aspects of visual perception that uses temporal neural network techniques.
18:00 Session ends
19:00 Conference Dinner. To be arranged.

Fri Sept 2nd

Session 5: Vision 2. Chair: Prof Roger Watt

09:30 Tao L-M. (IIASS, Vietri sul Mare, Italy), Computational color vision: the force of combining computational theory with ecological optics.
10:05 Shillcock R., Cairns P. (Centre for Cognitive Science, University of Edinburgh), Connectionist modelling of the neglect syndrome.
10:40 Coffee
11:20 Smith K.J., Humphreys G.W. (School of Psychology, University of Birmingham), Mechanisms of visual search: an implementation of guided search.
11:55 Burton M. (Dept of Psychology, University of Stirling), title tba
12:30 Lunch

End of Conference

***************************************

NCPW3: Neural Computing And Psychology Workshop
Aug 31 - Sept 2 1994
Cottrell Building, University of Stirling.
Registration Form:

Please return this form (by post) to Dr. Leslie Smith, Department of Computing Science, University of Stirling, Stirling FK9 4LA, Scotland.

Conference fee: Before 1 August #60
                After 1 August  #80                      ----------

This includes coffees and lunches on all three days.

      | Dinner (#6.72)          | B&B (#16.08)
------+-------------------------+--------------
Tues  |                         |
------+-------------------------+--------------
Wed   |                         |
------+-------------------------+--------------
Thur  | Conference Dinner (#20) |
------+-------------------------+--------------
Fri   |                         |

Dinner and B&B subtotal:                                 ----------

Total                                                    __________

Payment may be made by
* Cheque in UK# drawn on a UK bank
* Eurocheque in UK#
* Bank transfer (in UK#) to Account: University of Stirling No 1, Bank of Scotland, Craigs House, 78 Upper Craigs, Stirling FK8 2DE, Scotland. Sort code: 80 91 29. Account number: 00891500. Mark the transfer: NCPW3. Charges must be paid by sender. Please enclose a copy of the proof of transfer with the registration form.

(The University of Stirling does not accept credit cards.)

Name
Address/Affiliation
Email
Telephone
Fax
Will you be bringing a poster? ____________________

Note all prices are in UK pounds.

From john at dcs.rhbnc.ac.uk Fri Jul 8 10:37:02 1994
From: john at dcs.rhbnc.ac.uk (john@dcs.rhbnc.ac.uk)
Date: Fri, 08 Jul 94 15:37:02 +0100
Subject: Technical Report Series in Neural and Computational Learning
Message-ID: <14468.9407081437@platon.cs.rhbnc.ac.uk>

The European Community ESPRIT Basic Research Action is funding a Working Group in the area of Neural and Computational Learning Theory involving 10 European sites. As part of its activities, the NeuroCOLT Working Group is maintaining a Technical Report Series of the ongoing work of its members at the coordinating site of Royal Holloway, University of London. This message is to announce the instalment of the first two reports and to describe how they can be accessed.
--------------------------------------
NeuroCOLT Technical Report NC-TR-94-1:
--------------------------------------
Computing over the Reals with Addition and Order: Higher Complexity Classes
by Felipe Cucker and Pascal Koiran

Abstract:
This paper deals with issues of structural complexity in a linear version of the Blum-Shub-Smale model of computation over the real numbers. Real versions of $\mathrm{PSPACE}$ and of the polynomial time hierarchy are defined, and their properties are investigated. Two main types of results are presented:
  - equivalence between quantification over the real numbers and over $\{0,1\}$;
  - characterizations of recognizable subsets of $\{0,1\}^*$ in terms of familiar discrete complexity classes.
The complexity of the decision and quantifier elimination problems in the theory of the reals with addition and order is also studied.

--------------------------------------
NeuroCOLT Technical Report NC-TR-94-3:
--------------------------------------
Probabilistic Analysis of Learning in Artificial Neural Networks: The PAC Model and its Variants
by Martin Anthony

Abstract:
This report (72 pages) surveys the probably approximately correct (PAC) model of machine learning, with emphasis on the sample complexity of learning. Applications to the theory of learning in artificial neural networks are discussed. The survey should be accessible to those unfamiliar with computational learning theory. It is assumed that the reader has some familiarity with neural networks, but otherwise the survey is largely self-contained. The basic PAC model of concept learning is discussed and the key results involving the Vapnik-Chervonenkis (VC) dimension are derived. Implications for the theory of artificial neural networks are discussed through a survey of known results on the VC-dimension of neural nets. A brief discussion of the computational complexity of PAC learning follows.
We then discuss generalisations and extensions of the PAC model: stochastic concepts, learning with respect to particular distributions, and the learnability of functions and p-concepts. (We do not discuss computational complexity in these contexts.)

Contents:
 1. Introduction
 2. The Basic PAC Model of Learning
 3. VC-Dimension and Growth Function
 4. VC-Dimension and Linear Dimension
 5. A Useful Probability Theorem
 6. PAC Learning and the VC-Dimension
 7. VC-Dimension of Binary-Output Networks
    (introduction; linearly weighted neural networks; linear threshold
    networks; other activation functions; the effect of weight restrictions)
 8. Computational Complexity of Learning
 9. Stochastic Concepts
10. Distribution-Specific Learning
11. Graph Dimension and Multiple-Output Nets
    (the graph dimension; multiple-output feedforward threshold networks)
12. Pseudo-Dimension and Function Learning
    (the pseudo-dimension; learning real-valued functions)
13. Capacity of a Function Space
    (capacity and learning; applications to sigmoid neural networks)
14. Scale-Sensitive Dimensions
    (learnability of p-concepts; learnability of functions)
15. Conclusions

-----------------------
The Report NC-TR-94-1 can be accessed and printed as follows:

  % ftp cscx.cs.rhbnc.ac.uk  (134.219.200.45)
  Name: anonymous
  password: your full email address
  ftp> cd pub/neurocolt/tech_reports
  ftp> binary
  ftp> get nc-tr-94-1.ps.Z
  ftp> bye
  % zcat nc-tr-94-1.ps.Z | lpr -l

Similarly for the Report NC-TR-94-3. Uncompressed versions of the postscript files have also been left for anyone not having an uncompress facility.

Best wishes
John

From C.Campbell at bristol.ac.uk Mon Jul 11 05:30:33 1994
From: C.Campbell at bristol.ac.uk (I C G Campbell)
Date: Mon, 11 Jul 1994 09:30:33 +0000 (GMT)
Subject: Postdoc position
Message-ID: <24600.9407110930@irix.bris.ac.uk>

UNIVERSITY OF BRISTOL
POSTDOCTORAL POSITION IN NEURAL NETWORKS

Applications are invited for a postdoctoral position at the Advanced Computing Research Centre, University of Bristol.
We are looking for a researcher with an interest in theoretical work, particularly the development of new learning algorithms suited to hardware (VLSI) implementations. Applicants should have a good background in mathematics and computing. Apart from theoretical work and computer simulations, the project will also involve collaboration with several groups with an interest in applications involving dedicated neural hardware. The position is initially for a two-year period.

Applications should be supported by a Curriculum Vitae, a list of publications and a brief outline of research interests. Some sample papers would also be helpful. Further details about this position may be obtained from Dr. C. Campbell at the address below.

Applications should be sent to:
  Dr. C. Campbell
  Advanced Computing Research Centre
  Queen's Building, Bristol University
  Bristol BS8 1TR
  United Kingdom
  E-mail: C.Campbell at bristol.ac.uk
  Tel: 0272 303030 X3382 (secretary X3246)

Candidates should also arrange to have 3 letters of recommendation sent to Dr. Campbell at the address above.

***Closing date: 15th September 1994***

From mbrown at aero.soton.ac.uk Mon Jul 11 16:31:25 1994
From: mbrown at aero.soton.ac.uk (Martin Brown)
Date: Mon, 11 Jul 94 16:31:25 BST
Subject: New Neurofuzzy Book
Message-ID: <24756.9407111531@aero.soton.ac.uk>

Could you please post this announcement of our book, which may be of interest to workers in the neurofuzzy field.

NEUROFUZZY ADAPTIVE MODELLING AND CONTROL
Martin Brown and Chris Harris (University of Southampton, UK)
Prentice Hall, Hemel Hempstead, UK, 1994. 13-134453-6
Price: 29.95 UK pounds or 49.95 US dollars (Hardback).

This book provides a unified description of several adaptive neural and fuzzy networks and introduces the associative memory class of systems, which makes explicit the similarities and differences between fuzzy and neural algorithms.
Three networks are described in detail - the Albus CMAC, the B-spline network and a class of fuzzy systems - and then analysed; their desirable features (local learning, linear dependence on the parameter set, fuzzy interpretation) are emphasised, and the algorithms are all evaluated on a common time series problem and applied to a common ship control benchmark.

Contents:
 1 An Introduction to Learning Modelling and Control
   1.1 Preliminaries
   1.2 Intelligent Control
   1.3 Learning Modelling and Control
   1.4 Artificial Neural Networks
   1.5 Fuzzy Control Systems
   1.6 Book Description
 2 Neural Networks for Modelling and Control
   2.1 Introduction
   2.2 Neuromodelling and Control Architectures
   2.3 Neural Network Structure
   2.4 Training Algorithms
   2.5 Validation of a Neural Model
   2.6 Discussion
 3 Associative Memory Networks
   3.1 Introduction
   3.2 A Common Description
   3.3 Five Associative Memory Networks
   3.4 Summary
 4 Adaptive Linear Modelling
   4.1 Introduction
   4.2 Linear Models
   4.3 Performance of the Model
   4.4 Gradient Descent
   4.5 Multi-Layer Perceptrons and Back Propagation
   4.6 Network Stability
   4.7 Conclusion
 5 Instantaneous Learning Algorithms
   5.1 Introduction
   5.2 Instantaneous Learning Rules
   5.3 Parameter Convergence
   5.4 The Effects of Instantaneous Estimates
   5.5 Learning Interference in Associative Memory Networks
   5.6 Higher Order Learning Rules
   5.7 Discussion
 6 The CMAC Algorithm
   6.1 Introduction
   6.2 The Basic Algorithm
   6.3 Adaptation Strategies
   6.4 Higher Order Basis Functions
   6.5 Computational Requirements
   6.6 Nonlinear Time Series Modelling
   6.7 Modelling and Control Applications
   6.8 Conclusions
 7 The Modelling Capabilities of the Binary CMAC
   7.1 Modelling and Generalisation in the Binary CMAC
   7.2 Measuring the Flexibility of the Binary CMAC
   7.3 Consistency Equations
   7.4 Orthogonal Functions
   7.5 Bounding the Modelling Error
   7.6 Investigating the CMAC's Coarse Coding Map
   7.7 Conclusion
 8 Adaptive B-spline Networks
   8.1 Introduction
   8.2 Basic Algorithm
   8.3 B-spline Learning Rules
   8.4 B-spline Time Series Modelling
   8.5 Model Adaptation Rules
   8.6 ASMOD Time Series Modelling
   8.7 Discussion
 9 B-spline Guidance Algorithms
   9.1 Introduction
   9.2 Autonomous Docking
   9.3 Constrained Trajectory Generation
   9.4 B-spline Interpolants
   9.5 Boundary and Kinematic Constraints
   9.6 Example: A Quadratic Velocity Interpolant
   9.7 Discussion
10 The Representation of Fuzzy Algorithms
   10.1 Introduction: How Fuzzy is a Fuzzy Model?
   10.2 Fuzzy Algorithms
   10.3 Fuzzy Sets
   10.4 Logical Operators
   10.5 Compositional Rule of Inference
   10.6 Defuzzification
   10.7 Conclusions
11 Adaptive Fuzzy Modelling and Control
   11.1 Introduction
   11.2 Learning Algorithms
   11.3 Plant Modelling
   11.4 Indirect Fuzzy Control
   11.5 Direct Fuzzy Control
References
Appendix A Modified Error Correction Rule
Appendix B Improved CMAC Displacement Tables
Appendix C Associative Memory Network Software Structure
   C.1 Data Structures
   C.2 Interface Functions
   C.3 Sample C Code
Appendix D Fuzzy Intersection
Appendix E Weight to Rule Confidence Vector Map

For further information about this book (mailing/shipping costs etc.) and other neurofuzzy titles in the Prentice Hall series please contact:
  Liz Dickinson, Prentice Hall, Paramount Publishing International,
  Campus 400, Maylands Avenue, Hemel Hempstead, HP2 7EZ, United Kingdom.
Tel: 0442 881900   Fax: 0442 257115

From swe at unix.brighton.ac.uk Tue Jul 12 11:17:14 1994
From: swe at unix.brighton.ac.uk (ellacott)
Date: Tue, 12 Jul 94 11:17:14 BST
Subject: No subject
Message-ID: <1924.9407121017@unix.bton.ac.uk>

************************* MAIL FROM STEVE ELLACOTT **************************

1st Announcement and CALL FOR PAPERS

MATHEMATICS of NEURAL NETWORKS and APPLICATIONS (MANNA 1995)
International Conference at Lady Margaret Hall, Oxford, July 3-7, 1995
run by the University of Huddersfield in association with the University of Brighton

We are delighted to announce the first conference on the Mathematics of Neural Networks and Applications (MANNA), in which we aim to provide both top-class research and a friendly, motivating atmosphere. The venue, Lady Margaret Hall, is an Oxford college, set in an attractive and quiet location adjacent to the University Parks and River Cherwell.

Applications of neural networks (NNs) have often been carried out with a limited understanding of the underlying mathematics, but it is now essential that fuller account should be taken of the many topics that contribute to NNs: approximation theory, control theory, genetic algorithms, dynamical systems, numerical analysis, optimisation, statistical decision theory, statistical mechanics, computability and information theory, etc. We aim to consider the links between these topics and the insights they offer, and to identify mathematical tools and techniques for analysing and developing NN theories, algorithms and applications. Working sessions and panel discussions are planned.
Keynote speakers who have provisionally accepted invitations include:
  N M Allinson (York University, UK)
  S-i Amari (Tokyo)
  N Biggs (LSE, London)
  G Cybenko (Dartmouth, USA)
  J G Taylor (King's College, London)
  S Grossberg (Boston, USA)
  M Hirsch (Berkeley, USA)
  T Poggio (MIT, USA)
  H Ritter (Bielefeld, Germany)
  P C Parks (Oxford)

It is anticipated that about 40 contributed papers and posters will be presented. The proceedings will be published, probably as a volume of an international journal, and contributed papers will be considered for inclusion. The deadline for submission of abstracts is 17 February 1995.

Accommodation will be available at Lady Margaret Hall (LMH), where many rooms have en-suite facilities - early booking is recommended. The conference will start with Monday lunch and end with Friday lunch, and there will be a full-board charge (including conference dinner) of about #235 for this period as well as a modest conference fee (to be fixed later). We hope to be able to offer a reduction in fees to those who give submitted papers and to students. There will be a supporting social programme, including a reception, outing(s) and the conference dinner, and family accommodation may be arranged in local guest houses.

Please indicate your interest by returning the form below. A booking form will be sent to you with the 2nd announcement. Thanking you in anticipation.

Committee: S W Ellacott (Brighton) and J C Mason (Huddersfield)
Co-organisers: I Aleksander, N M Allinson, N Biggs, C M Bishop, D Lowe, P C Parks, J G Taylor, K Warwick

______________________________________________________________________________
To: Ros Hawkins, School of Computing and Mathematics, University of Huddersfield, Queensgate, Huddersfield, West Yorkshire, HD1 3DH, England. (Email: j.c.mason at hud.ac.uk)

Please send further information on MANNA, July 3 - 7, 1995

Name .......................Address ..........................................
.............................................................................
.............................................................................
Telephone ............................. Fax ..................................
E Mail ................................
I intend/do not intend to submit a paper
Area of proposed contribution ................................................
*****************************************************************************

From hszu%ulysses at relay.nswc.navy.mil Wed Jul 13 12:11:22 1994
From: hszu%ulysses at relay.nswc.navy.mil (Harold Szu)
Date: Wed, 13 Jul 94 12:11:22 EDT
Subject: UCLA Short Course on Wavelets announcement (September 12-16 1994)
Message-ID: <9407131611.AA05944@ulysses.nswc.navy.mil>

ANNOUNCEMENT

UCLA Extension Short Course
The Wavelet Transform: Techniques and Applications

Overview
For many years, the Fourier Transform (FT) has been used in a wide variety of application areas: multimedia compression of wideband ISDN for telecommunications; lossless transforms for fingerprint storage, identification, and retrieval; increasing the signal-to-noise ratio (SNR) for target discrimination in oil-prospecting seismic imaging; scale- and rotation-invariant pattern recognition for automatic target recognition; and heart, tumor, and biomedical research. This course describes a newer technique, the Wavelet Transform (WT), which is replacing the windowed FT in the applications mentioned above; embedded in a neural network, it forms a WAVENET. The WT uses appropriately matched bandpass kernels, called "mother" wavelets, thereby enabling improved representation and analysis of wideband, transient, and noisy signals. The principal advantages of the WT are 1) its localized nature, which admits less noise and enhances the SNR, and 2) the new problem-solving paradigm it offers in the treatment of nonlinear problems.
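The localization advantage claimed above is easiest to see with the simplest wavelet, the Haar wavelet (a minimal sketch for orientation only; it is not drawn from the course materials). One analysis level splits a signal into coarse averages and localized differences, so a short transient shows up only in the few coefficients whose support covers it, unlike the FT, where it spreads across all frequencies:

```python
from math import sqrt

def haar_step(x):
    # One level of the orthonormal Haar wavelet transform:
    # pairwise averages (approximation) and differences (detail).
    approx = [(x[i] + x[i + 1]) / sqrt(2.0) for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / sqrt(2.0) for i in range(0, len(x), 2)]
    return approx, detail

# A signal that is zero except for one short transient at index 4.
signal = [0.0] * 8
signal[4] = 1.0
approx, detail = haar_step(signal)

# Only the coefficient pair covering index 4 is nonzero: the event
# stays localized, and the transform preserves the signal's energy.
print(approx)  # [0.0, 0.0, 0.7071..., 0.0]
print(detail)  # [0.0, 0.0, 0.7071..., 0.0]
```

Repeating the step on the approximation coefficients gives the multi-resolution decomposition the course builds on.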
The course covers WT principles as well as adaptive techniques, describing how WTs mimic human ears and eyes by tuning up "best mother" wavelets to spawn "daughter" wavelets that capture multi-resolution components, whose expansion coefficients are fed through an artificial neural network, called a "wavenet". This, in turn, provides the automation required in multiple application areas, a powerful tool when the inputs are constrained to real-time sparse data (for example, the "cocktail party" effect, where you perceive a desired message amid the cacophony of a noisy party). Another advance discussed in the course is the theory and experiment for solving nonlinear dynamics for information processing, e.g. environmental simulation as non-real-time virtual reality. In other words, real-time virtual reality can be achieved by the wavelet compression technique, followed by an optical flow technique to acquire the wavelet transform coefficients, then applying the inverse WT to retrieve the virtual-reality dynamical evolution. (For example, an ocean wave is analyzed by soliton envelope wavelets.) Finally, implementation techniques in optics and digital electronics are presented, including optical wavelet transforms and wavelet chips.

Course Materials
Course notes and relevant software are distributed on the first day of the course. The notes are for participants only, and are not for sale.

Coordinator and Lecturer
Harold Szu, Ph.D.
Research physicist, Washington, D.C. Dr. Szu's current research involves wavelet transforms, character recognition, and constrained optimization implemented on a neural network computer. He has edited two special issues of Optical Engineering on wavelets (September 1992 and July 1994). He has chaired the SPIE Orlando Wavelet Applications Conference every year since 1992. He is also involved with the design of a next-generation computer based on the confluence of neural networks and optical database machines. Dr.
Szu is also a technical representative to ARPA and a consultant to the Office of Naval Research, and has been engaged in plasma physics, optical engineering, and electronic warfare research for the past 16 years. He holds six patents, has published about 200 technical papers, and has edited several textbooks. Dr. Szu is the editor-in-chief for the INNS Press, and currently serves as the Immediate Past President of the International Neural Network Society.

Lecturer and UCLA Faculty Representative
John D. Villasenor, Ph.D.
Assistant Professor, Department of Electrical Engineering, University of California, Los Angeles. Dr. Villasenor has been instrumental in the development of a number of efficient algorithms for a wide range of signal and image processing tasks. His contributions include application-specific optimal compression techniques for tomographic medical images, temporal change measures using synthetic aperture radar, and motion estimation and image modeling for angiogram video compression. Prior to joining UCLA, Dr. Villasenor was with the Radar Science and Engineering section of the Jet Propulsion Laboratory, where he applied synthetic aperture radar to interferometric mapping, classification, and temporal change measurement. He has also studied parallelization of spectral analysis algorithms and multidimensional data visualization strategies. Dr. Villasenor's research activities at UCLA include still-frame and video medical image compression, processing and interpretation of satellite remote sensing images, development of fast algorithms for one- and two-dimensional spectral analysis, and studies of JPEG-based hybrid video coding techniques.

For more information, call the Short Course Program Office at (310) 825-3344; Facsimile (213) 206-2815.

Date: September 12-16 (Monday through Friday)
Time: 8am - 5pm (subject to adjustment after the first class meeting).
Location: Room G-33 West, UCLA Extension Building, 10995 Le Conte Avenue (adjacent to the UCLA campus), Los Angeles, California.
Reg# E0153M   Course No. Engineering 867.121
3.0 CEU (30 hours of instruction)
Fee: $1495, includes course materials

From henders at linc.cis.upenn.edu Thu Jul 14 10:41:17 1994
From: henders at linc.cis.upenn.edu (Jamie Henderson)
Date: Thu, 14 Jul 1994 10:41:17 -0400
Subject: paper available on connectionist NLP/temporal synchrony
Message-ID: <199407141441.KAA04931@linc.cis.upenn.edu>

FTP-host: linc.cis.upenn.edu
FTP-filename: pub/henderson/jpr94.ps.Z

The following paper on the feasibility and implications of using temporal synchrony variable binding to do syntactic parsing is available by anonymous ftp from linc.cis.upenn.edu. It's in directory pub/henderson, and is called "jpr94.ps.Z". It's 20 pages long. This paper will appear in the Journal of Psycholinguistic Research, probably volume 23, number 6, 1994.

- Jamie Henderson
  University of Pennsylvania
--------
Connectionist Syntactic Parsing Using Temporal Variable Binding

James Henderson
Computer and Information Science
University of Pennsylvania

Recent developments in connectionist architectures for symbolic computation have made it possible to investigate parsing in a connectionist network while still taking advantage of the large body of work on parsing in symbolic frameworks. The work discussed here investigates syntactic parsing in the temporal synchrony variable binding model of symbolic computation in a connectionist network. This computational architecture solves the basic problem with previous connectionist architectures, while keeping their advantages. However, the architecture does have some limitations, which impose constraints on parsing in this architecture. Despite these constraints, the architecture is computationally adequate for syntactic parsing. In addition, the constraints make some significant linguistic predictions.
These arguments are made using a specific parsing model. The extensive use of partial descriptions of phrase structure trees is crucial to the ability of this model to recover the syntactic structure of sentences within the constraints imposed by the architecture.

From biehl at connect.nbi.dk Fri Jul 15 17:56:26 1994
From: biehl at connect.nbi.dk (Michael Biehl)
Date: Fri, 15 Jul 94 17:56:26 METDST
Subject: paper available
Message-ID: 

FTP-host: archive.cis.ohio-state.edu
FTP-file: pub/neuroprose/biehl.online-perceptron.ps.Z

The following paper has been placed in the Neuroprose archive in file biehl.online-perceptron.ps.Z (8 pages). Hardcopies are not available.

-------------------------------------------------------------------------
ON-LINE LEARNING WITH A PERCEPTRON

Michael Biehl
CONNECT, The Niels Bohr Institute
Blegdamsvej 17, 2100 Copenhagen, Denmark
email: biehl at physik.uni-wuerzburg.de

and

Peter Riegler
Institut fuer theoretische Physik
Julius-Maximilians-Universitaet Wuerzburg
Am Hubland, D-97074 Wuerzburg, Germany

submitted to Europhysics Letters

ABSTRACT
We study on-line learning of a linearly separable rule with a simple perceptron. Training utilizes a sequence of uncorrelated, randomly drawn N-dimensional input examples. In the thermodynamic limit the generalization error after training with P such examples can be calculated exactly. For the standard perceptron algorithm it decreases like (N/P)^(1/3) for large (P/N), in contrast to the faster (N/P)^(1/2) behavior of so-called Hebbian learning. Furthermore, we show that a specific parameter-free on-line scheme, the AdaTron algorithm, gives an asymptotic (N/P)-decay of the generalization error. This coincides (up to a constant factor) with the bound for any training process based on random examples, including off-line learning. Simulations confirm our results.
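The "standard perceptron algorithm" of the abstract is the classic mistake-driven rule. A toy simulation in the spirit of (but far smaller than) the paper's setting can be sketched as follows; the teacher vector, dimension N, and example count are arbitrary illustrative choices, and the teacher-student overlap is tracked only as a proxy for the generalization error (which equals arccos(overlap)/pi for this setup):

```python
import random
from math import sqrt

random.seed(0)
N = 50                     # input dimension (illustrative choice)
teacher = [1.0] * N        # the unknown linearly separable rule

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def sign(z):
    return 1 if z >= 0 else -1

# On-line training: one pass over P = 20N uncorrelated random examples,
# updating only on mistakes (the standard perceptron rule).
w = [0.0] * N
for _ in range(20 * N):
    x = [random.gauss(0.0, 1.0) for _ in range(N)]
    label = sign(dot(teacher, x))
    if sign(dot(w, x)) != label:
        w = [wi + label * xi / sqrt(N) for wi, xi in zip(w, x)]

# Normalized overlap with the teacher grows toward 1 as P/N increases.
overlap = dot(w, teacher) / (sqrt(dot(w, w)) * sqrt(float(N)))
```

Running variants of the update (Hebbian: update on every example; AdaTron: step size proportional to the margin of the error) and plotting the error against P/N would reproduce the qualitative decay laws compared in the paper.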
-----------------------------------------------------------------------
--- Michael Biehl
biehl at physik.uni-wuerzburg.de

From biehl at connect.nbi.dk Fri Jul 15 17:57:47 1994
From: biehl at connect.nbi.dk (Michael Biehl)
Date: Fri, 15 Jul 94 17:57:47 METDST
Subject: paper available
Message-ID: 

FTP-host: archive.cis.ohio-state.edu
FTP-file: pub/neuroprose/marangi.clusters.ps.Z

The following paper has been placed in the Neuroprose archive in file marangi.clusters.ps.Z (8 pages). Hardcopies are not available.

-------------------------------------------------------------------------
SUPERVISED LEARNING FROM CLUSTERED INPUT EXAMPLES

Carmela Marangi
Dipartimento di Fisica dell' Universita' di Bari
and I.N.F.N., Sez. di Bari
Via Orabona 4, 70126 Bari, Italy

Michael Biehl ^ and Sara Solla
CONNECT, The Niels Bohr Institute
Blegdamsvej 17, 2100 Copenhagen, Denmark
^ email: biehl at physik.uni-wuerzburg.de

submitted to Europhysics Letters

ABSTRACT
In this paper we analyse the effect of introducing structure in the input distribution on the generalization ability of a simple perceptron. The simple case of two clusters of input data and a linearly separable rule is considered. We find that the generalization ability improves with the separation between the clusters, and is bounded from below by the result for the unstructured case. The asymptotic behavior for large training sets, however, is the same for structured and unstructured input distributions. For small training sets, the dependence of the generalization error on the number of examples is observed to be nonmonotonic for certain values of the model parameters.
-----------------------------------------------------------------------
--- Michael Biehl
biehl at physik.uni-wuerzburg.de

From tesauro at watson.ibm.com Fri Jul 15 11:03:36 1994
From: tesauro at watson.ibm.com (tesauro@watson.ibm.com)
Date: Fri, 15 Jul 94 11:03:36 EDT
Subject: Neural nets in commercial anti-virus software
Message-ID: 

IBM BRINGS NEW TECHNOLOGY TO VIRUS PROTECTION

IBM's investment in leading-edge research is paying off in unexpected ways. The latest release of IBM AntiVirus uses sophisticated "neural network" technology to help detect new, previously unknown viruses.

"Detecting viruses that people have never seen before, while simultaneously preventing false alarms, is a difficult balancing act," said Jeffrey O. Kephart, a manager in the High Integrity Computing Laboratory, the group at the Watson Research Center that develops IBM AntiVirus. "But with several new viruses being written every day, this has become an essential requirement for any anti-virus program."

"Traditionally, virus detection heuristics have been developed by trial and error. Our neural-net detector was produced completely automatically, according to sound statistical principles. The anti-virus technical community had been hoping for such a breakthrough, but was pessimistic. We invented several new techniques that overcame previous limitations."

By showing a neural network a large number of infected and uninfected files, Kephart and his colleagues trained it to discriminate between viruses and uninfected programs. After the training had taken place, they found that the neural network was able to recognize a very high percentage of previously unknown viruses.

"We've been quite successful in bringing leading-edge research into the IBM AntiVirus products very quickly," explained Kephart. "In this case, just a few months after our initial conception of the idea, we are delivering novel but well-tested technology to our customers around the world."
IBM AntiVirus version 1.06 provides comprehensive "install-and-forget" automatic protection against computer virus attacks in DOS, Windows*, OS/2** and Novell NetWare*** computing environments. In addition to its patent-pending neural network technology, it can detect viruses inside of files compressed with PKZIP****, ZIP2EXE and LZEXE. It can even detect viruses inside of compressed files that themselves contain compressed files. Common viruses can be detected automatically when infected files are copied from a diskette or downloaded from a computer bulletin board system. New installation programs support automated installation from LAN servers. IBM AntiVirus for NetWare can check NetWare 3.1x and 4.0x servers for viruses in real time, as users add or modify files on the server.

IBM AntiVirus protects against thousands of known viruses, including viruses that are said to be impossible to detect. "There's a lot of hype out there about 'killer' viruses," said Steve R. White, Senior Manager of the High Integrity Computing Laboratory. "Here are the facts. Many viruses are silly, badly written programs. A few viruses try to hide by changing their appearance when they spread - 'polymorphic' viruses - or by trying to prevent anti-virus software from seeing them at all - 'stealth' viruses."

"People have said these viruses are impossible to detect. They are wrong. We have had no trouble analyzing new viruses and adding protection against them to IBM AntiVirus. The latest version of IBM AntiVirus detects lots of 'difficult' viruses, including Queeg, Pathogen and Junkie-1027. Keeping up with these new viruses does require a lot of expertise and technology, but that's what IBM Research is famous for. People who say that their anti-virus products can't keep up are using the wrong products."

    * Windows is a trademark of Microsoft Corp.
   ** OS/2 is a trademark of IBM Corp.
  *** Novell and NetWare are trademarks of Novell Corp.
 **** PKZIP is a trademark of PKWARE, Inc.
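The release gives no implementation details, so the following is only a toy illustration of the train-on-labelled-files idea described above, not IBM's detector: the byte trigram features, the sample "files", and the single threshold unit are all invented for the example.

```python
# Toy sketch only -- the feature trigrams and the sample "files" below
# are made up; IBM's actual features and network are not described in
# the press release.
FEATURES = [b"\xcd\x13\x00", b"\xb4\x4c\xcd", b"\x90\x90\x90"]

def featurize(blob):
    # Count occurrences of each (hypothetical) suspicious byte trigram.
    return [blob.count(f) for f in FEATURES]

def train(samples, labels, epochs=20, eta=0.1):
    # A single linear threshold unit trained with the perceptron rule
    # on labelled infected (1) / uninfected (0) examples.
    w, b = [0.0] * len(FEATURES), 0.0
    for _ in range(epochs):
        for blob, target in zip(samples, labels):
            x = featurize(blob)
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            if y != target:
                w = [wi + eta * (target - y) * xi for wi, xi in zip(w, x)]
                b += eta * (target - y)
    return w, b

def classify(w, b, blob):
    x = featurize(blob)
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

infected = [b"\x90\x90\x90\xb4\x4c\xcd\x21", b"\xcd\x13\x00\x90\x90\x90"]
clean = [b"hello world", b"\x00\x01\x02\x03"]
w, b = train(infected + clean, [1, 1, 0, 0])
```

A learned weighting of many such features, rather than hand-tuned rules, is the "produced completely automatically" aspect the release emphasizes; generalization to unseen viruses depends on the features, not on this particular classifier.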
From gem at cogsci.indiana.edu Fri Jul 15 12:14:40 1994 From: gem at cogsci.indiana.edu (Gary McGraw) Date: Fri, 15 Jul 94 11:14:40 EST Subject: Letter Perception paper available Message-ID: The following paper (available by anonymous ftp) may be of interest to some on this list: Roles in Letter Perception: Human data and computer models CRCC-TR 90 Gary McGraw*, John Rehling*, and Robert Goldstone# * Center for Research on Concepts and Cognition Indiana University, Bloomington, Indiana 47405 & Istituto per la Ricerca Scientifica e Tecnologica Loc. Pante di Povo, I-38100 Trento, Italia gem at irst.it rehling at irst.it # Department of Psychology Indiana University, Bloomington, Indiana 47405 rgoldsto at ucs.indiana.edu Submitted to Cognitive Science We present the results of an experiment in letter recognition. Unlike most psychological studies of letter recognition, we include in our data set letters at the fringes of their categories and investigate the recognition of letters in diverse styles. We are interested in the relationship between the recognition of prototypical letters and the recognition of eccentric, highly-stylized letters. Our results provide empirical evidence for conceptual constituents of letter categories, called roles, which exert clear top-down influence on the segmentation of letterforms into structural components. The human data are analyzed in light of two computational models of letter perception --- one connectionist and the other symbolic. Performance of the models is compared and contrasted with human performance using theoretical tools that shed light on processing. Results point in the direction of a model using a role-based approach to letter perception. To obtain an electronic copy of this paper: Note that the paper (41 pages with many figures) comes in a rather large file of 455116 bytes (compressed). 
  ftp ftp.cogsci.indiana.edu
  login: anonymous
  password:
  cd /pub/
  binary
  get mcgraw+rehling+goldstone.roles_letter_perception.ps.Z
  quit

Then at your system:
  uncompress mcgraw+rehling+goldstone.roles_letter_perception.ps.Z
  lpr -s mcgraw+rehling+goldstone.roles_letter_perception.ps

If you cannot obtain an electronic copy, send a request for a hard copy to helga at cogsci.indiana.edu

You may also retrieve the paper via the web. Open the URL http://www.cogsci.indiana.edu and follow the "papers" pointer.

Gary McGraw (gem at cogsci.indiana.edu)

From hornik at ci.tuwien.ac.at Mon Jul 18 09:10:00 1994
From: hornik at ci.tuwien.ac.at (Kurt Hornik)
Date: Mon, 18 Jul 94 09:10 MET DST
Subject: Paper available in Neuroprose
Message-ID: 

FTP-host: archive.cis.ohio-state.edu
FTP-file: pub/neuroprose/baldi.linear.ps.Z

*** DO NOT FORWARD TO OTHER GROUPS ***

The file baldi.linear.ps.Z is now available for copying from the Neuroprose repository:

A survey of learning in linear neural networks (24 pages)
Pierre Baldi (Caltech) and Kurt Hornik (TU Wien, Austria)

ABSTRACT: Networks of linear units are the simplest kind of networks, where the basic questions related to learning, generalization, and self-organisation can sometimes be answered analytically. We survey most of the known results on linear networks, including: (1) back-propagation learning and the structure of the error function landscape; (2) the temporal evolution of generalization; (3) unsupervised learning algorithms and their properties. The connections to classical statistical ideas, such as principal component analysis (PCA), are emphasized, as well as several simple but challenging open questions. A few new results are also spread across the paper, including an analysis of the effect of noise on back-propagation networks and a unified view of all unsupervised algorithms.
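As a pointer to the kind of analysis surveyed, the best-known link between linear unsupervised learning and PCA is Oja's rule, under which a single linear unit's weight vector converges (up to sign) to the top principal component of its inputs. A minimal sketch, with synthetic 2-D data and a step size chosen arbitrarily for illustration (this shows the general phenomenon, not a result from the paper):

```python
import random

random.seed(1)

# Synthetic 2-D inputs whose dominant variance lies along (1, 1)/sqrt(2).
data = []
for _ in range(500):
    s = random.gauss(0.0, 3.0)   # strong component
    n = random.gauss(0.0, 0.5)   # weak component
    data.append((s + n, s - n))

# Oja's rule for a single linear unit y = w . x:
#   w <- w + eta * y * (x - y * w)
# The -y*w term keeps |w| near 1, so w settles on the direction of
# maximum variance, i.e. the leading principal component.
w = [1.0, 0.0]
eta = 0.002
for _ in range(50):
    for x in data:
        y = w[0] * x[0] + w[1] * x[1]
        w = [w[i] + eta * y * (x[i] - y * w[i]) for i in range(2)]

# w is now approximately (1, 1)/sqrt(2), up to sign.
```

A comparison with the eigenvectors of the sample covariance matrix makes the correspondence explicit; generalizations to multi-unit networks extracting the full principal subspace are among the results the survey covers.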
-Kurt Hornik (Kurt.Hornik at ci.tuwien.ac.at)

From esann at dice.ucl.ac.be Mon Jul 18 13:49:59 1994
From: esann at dice.ucl.ac.be (esann@dice.ucl.ac.be)
Date: Mon, 18 Jul 1994 19:49:59 +0200
Subject: Neural Processing Letters - announcement and call for papers
Message-ID: <9407181745.AA03058@ns1.dice.ucl.ac.be>

Dear Colleagues,

We are pleased to announce the creation of a new publication in the field of neural networks, "Neural Processing Letters". Neural Processing Letters is intended to provide the research community with a FAST publication medium, in order to rapidly publish new ideas, original developments, and work in progress in all aspects of the neural networks field. Papers will be published as letters (short papers of about 4 published pages), and the maximum delay between submission and publication will be about 3 months.

You will find below some information about this new publication. If you are interested, please don't hesitate to contact us, preferably by fax, to ask for more information. The first issue of the journal will be published in September 1994; if you are interested in submitting a paper, please ask as soon as possible for the instructions for authors. We hope that this new journal will become a standard for the publication of original ideas in the field, and that you will contribute to it by submitting your work and/or by subscribing to it!

All correspondence should be addressed to the publisher:
  Neural Processing Letters
  F. Blayo and M. Verleysen, editors
  D facto publications
  45 rue Masui
  B-1210 Brussels, Belgium
  Tel: + 32 2 245 43 63
  Fax: + 32 2 245 46 94

_______________________________________________
!                                             !
!          Neural Processing Letters          !
!                                             !
_______________________________________________

A fast publication medium
-------------------------
Neural Processing Letters is a rapid publication journal intended to disseminate the latest results in the field of neural processing.
The aim of the journal is to rapidly publish new ideas, original developments and work in progress not previously published. Topics ------ Neural Processing Letters covers all aspects of the Artificial Neural Networks field including, but not restricted to, theoretical developments, biological models, new formal models, learning, applications, software and hardware developments, and prospective research. Committee --------- Editors : François Blayo (France) and Michel Verleysen (Belgium) The preliminary editorial board currently includes:
Y. Abu-Mostafa (USA)
L. Almeida (Portugal)
S.I. Amari (Japan)
A. Babloyantz (Belgium)
J. Barhen (USA)
E. Baum (USA)
J. Cabestany (Spain)
M. Cottrell (France)
D. Del Corso (Italy)
A. Georgopoulos (USA)
A. Guerin-Dugue (France)
M. Hassoun (USA)
K. Hornik (Austria)
C. Jutten (France)
P. Lansky (Czech Republic)
J.P. Nadal (France)
G. Orban (Belgium)
R. Reilly (Ireland)
H. Ritter (Germany)
T. Roska (Hungary)
J. Stonham (United Kingdom)
E. Vittoz (Switzerland)
Instructions to authors ----------------------- Prospective authors are invited to submit a letter, written in English, not exceeding 3000 words including figures (each medium-sized figure being equivalent to 200 words). The content of the letter should focus on ideas, results and conclusions. Particular attention must be paid to clarity of presentation and synthesis of thought. Prospective authors are strongly invited to ask for the full instructions for authors. Short comments (not exceeding 300 words) on letters will be considered for publication; comments will be published with the author's reply. Book reviews are welcome (about 500 words). Reviewing process ----------------- To ensure short publication delays, no corrections will be allowed in a submitted paper. The reviewing process will be confidential, and submitted materials will be accepted as is or rejected. 
Publication ----------- Neural Processing Letters will be published every two months, beginning September 1994. The maximum delay between submission and publication will be 3 months. For further information concerning submission of papers or subscriptions, please contact the editorial office at the following address : Neural Processing Letters F. Blayo and M. Verleysen, editors D facto publications 45 rue Masui B-1210 Brussels, Belgium Tel : + 32 2 245 43 63 Fax : + 32 2 245 46 94 _____________________ ! Subscription Form ! _____________________ ******************************* *** Special temporary offer *** ******************************* Please send this form by regular mail. A copy may be sent by fax to avoid delays. Information on the subscriber (please print) Name : First name : Title (M., Mrs, Dr, Prof.) : Organization : Address : Post/Zip Code : City : Country : E-mail : Tel : Fax : VAT number (mandatory for EC customers) : O I wish to subscribe to Neural Processing Letters, for a period of one year Normal price: BEF 4400. **** Special temporary offer (before September 30, 1994) **** BEF 4000 O Please send me a free sample copy of Neural Processing Letters O Please send me the detailed instructions for authors Payment details O I wish to pay by cheque/money order made payable to : D facto s.a. 45 rue Masui B-1210 Brussels - Belgium A supplementary fee of BEF 300 must be added if the payment is made by a cheque drawn on a foreign bank or by a postal money order. This supplementary fee is not required for payment by Eurocheque. O I wish to pay by bank transfer. Account : D facto publications Account number : 310-1177992-13 Bank : Banque Bruxelles-Lambert 11, av. Winston Churchill B-1180 Brussels Belgium Please indicate "Neural Processing Letters" and the name of the subscriber on the bank transfer. A supplementary fee of BEF 300 must be added if the payment is made from a foreign bank account. Payment by credit card is not accepted. TOTAL : BEF ... 
O Please send me an invoice (for EC customers: only if VAT number is indicated) First subscription issue: Unless otherwise specified, the first issue to be sent to the subscriber is the first one published after receipt of the payment. No subscription will be honoured without payment. Subscriptions are valid for six consecutive issues only. Please indicate here if you want a subscription starting at a specified issue (for example issue n.1 - September 1994). ........................................................................... .... Date : Signature : Please send this form to: Neural Processing Letters F. Blayo and M. Verleysen editors D facto publications 45 rue Masui B-1210 Brussels, Belgium Tel : + 32 2 245 43 63 Fax : + 32 2 245 46 94 _____________________________ Neural Processing Letters D facto publications 45 rue Masui 1210 Brussels Belgium tel: +32 2 245 43 63 fax: +32 2 245 46 94 _____________________________ From Dave_Touretzky at DST.BOLTZ.CS.CMU.EDU Mon Jul 18 15:23:16 1994 From: Dave_Touretzky at DST.BOLTZ.CS.CMU.EDU (Dave_Touretzky@DST.BOLTZ.CS.CMU.EDU) Date: Mon, 18 Jul 94 15:23:16 -0400 Subject: posting paper announcements to Connectionists Message-ID: <23367.774559396@DST.BOLTZ.CS.CMU.EDU> We've had a couple of suggestions recently about the posting of paper announcements to Connectionists. Here they are; you can follow this advice or not, as you wish. 1) Don't use a generic Subject line like "paper available". Instead, include the title of your paper, or at least some helpful keywords. Example: Wrong way: "Paper available" Right way: "Paper available: Solving Towers of Hanoi with ART-4" 2) If you're going to post pointers to Postscript files, try to save a few trees by formatting your paper single spaced, instead of using the double-spaced format required for journal submissions. This also makes the paper more suitable for reading with a Postscript previewer. 
-- Dave Touretzky, CONNECTIONISTS moderator From inmanh at cogs.susx.ac.uk Tue Jul 19 14:31:00 1994 From: inmanh at cogs.susx.ac.uk (Inman Harvey) Date: Tue, 19 Jul 94 14:31 BST Subject: SAB94 Conference reminder Message-ID: Last-minute reminder for those contemplating attending SAB94, Third Intl. Conf. on Simulation of Adaptive Behavior "From Animals to Animats", in Brighton, U.K., Aug 8--12 1994. The full program can be obtained on the World Wide Web from: http://www.cogs.susx.ac.uk/lab/adapt/sab_program.html or by anonymous ftp from ftp.cogs.susx.ac.uk in directory: pub/sab94 From davec at cogs.susx.ac.uk Tue Jul 19 11:39:47 1994 From: davec at cogs.susx.ac.uk (Dave Cliff) Date: Tue, 19 Jul 1994 16:39:47 +0100 (BST) Subject: CFP: Adaptive Behavior special issue Message-ID: ------------------------------------------------------------------------------ CALL FOR PAPERS (please post) ADAPTIVE BEHAVIOR Journal Special Double Issue on COMPUTATIONAL NEUROETHOLOGY Guest editor: Dave Cliff Submission Deadline: 1 December 1994. Adaptive Behavior is an international journal published by MIT Press; Editor-in-Chief: Jean-Arcady Meyer, Ecole Normale Superieure, Paris. The aim of this special issue (to be published in 1995) is to bring together papers describing research in the field of computational neuroethology. Computational neuroethology (CNE) applies computational modelling techniques to the study of neural mechanisms underlying the generation of adaptive behaviors in embodied, situated, autonomous agents. The focus on studying agents embedded within their environments is a distinguishing feature of CNE research. CNE studies can help in the design of artificial autonomous agents, and can complement standard computational neuroscience approaches to understanding the neural control of behavior in animals. Submitted papers should emphasise the relevance of the subject matter to both real and artificial systems. Submitted papers should be delivered by 1 December 1994. 
Manuscripts should be typed or laser-printed in English (with American spelling preferred), double-spaced, and between 10000 and 15000 words in length, counting 500 words per full-page figure. Authors intending to submit should contact Dave Cliff well before the deadline, at the address below. Copies of the complete Adaptive Behavior Instructions to Contributors are available on request. Send five (5) copies of submitted papers (hardcopy only) to: Dave Cliff School of Cognitive and Computing Sciences University of Sussex Brighton BN1 9QH U.K. Phone: +44 273 678754 Fax: +44 273 671320 Email: davec at cogs.susx.ac.uk WWW: http://www.cogs.susx.ac.uk/users/davec ------------------------------------------------------------------------------ From mtx004 at cck.coventry.ac.uk Wed Jul 20 14:48:14 1994 From: mtx004 at cck.coventry.ac.uk (NSteele) Date: Wed, 20 Jul 94 14:48:14 WET DST Subject: Fuzzy Logic Symposium Message-ID: <6346.9407201348@cck.coventry.ac.uk> ICSC ISFL'95 Call for Papers First ICSC International Symposium on FUZZY LOGIC To be held at the Swiss Federal Institute of Technology (ETH) Zurich, Switzerland May 26 and 27, 1995 I. PURPOSE OF THE CONFERENCE The purpose of this conference is to assist communication of research in the field of Fuzzy Logic and its technological applications. Fuzzy Logic is a scientific revolution that has been waiting to happen for decades. Research in Fuzzy Technologies has reached a degree where industrial application is possible. This is reflected by numerous projects in the USA, Korea and Japan, where the leading corporations have invested billions in utilising fuzzy logic in technological innovations. International activities suggest that by the year 2000 numerous practical realizations will be influenced by Fuzzy Systems. It is thus timely to organize a symposium aimed at bringing together both existing and potential workers in the field. II. 
TOPICS The following topics are envisaged:
* Basic concepts such as various kinds of Fuzzy Sets, Fuzzy Relations, Possibility Theory
* Mathematical Aspects such as non-classical logics, Category Theory, Algebra, Topology, Chaos Theory
* Methodology and applications, for example in Artificial Intelligence, Expert Systems, Pattern Recognition, Clustering, Fuzzy Control, Game Theory, Mathematical Programming, Neural Networks, Genetic Algorithms, etc.
* Implementation, for example in Engineering, Process Control, Production, Medicine.
III. INTERNATIONAL SCIENTIFIC COMMITTEE
E. Badreddin, Switzerland
J.D. Nicoud, Switzerland
H.P. Geering, Switzerland
R. Palm, Germany
H. Hellendoorn, Germany
B. Reusch, Germany
M. Jamshidi, USA
N. Steele, England (Chairman)
E.P. Klement, Austria
K. Warwick, England
B. Kosko, USA
H.J. Zimmermann, Germany
R. Kruse, Germany
(list incomplete)
IV. ORGANISING COMMITTEE The ISFL'95 is a joint operation of the Swiss Federal Institute of Technology (ETH), Zurich and International Computer Science Conventions (ICSC). V. SUBMISSIONS OF MANUSCRIPTS Prospective authors are requested to send two copies of their abstracts of 500 words to the ICSC Secretariat for review by the International Scientific Committee. All abstracts must be written in English, starting with a succinct statement of the problem, the results achieved, their significance, and a comparison with previous work. If authors believe that more details are necessary to substantiate the main claims of the paper, they may include a clearly marked appendix that will be read at the discretion of the International Scientific Committee. The abstract should also include:
* Title of proposed paper
* Authors' names, affiliations, addresses
* Name of author to contact for correspondence
* Fax number of contact author
* Name of topic which best describes the paper (max. 
5 keywords) Contributions are solicited both from those working in industry who have experience in the topics of this conference and from academics. VI. CONFERENCE LANGUAGE The conference language is English. Simultaneous interpretation will not be available. VII. DEADLINES AND REGISTRATION It is the intention of the organizers to have the conference proceedings available for the delegates. Consequently the deadlines are to be strictly respected:
* Submission of Abstracts .................... August 31, 1994
* Notification of Acceptance ................. November 1, 1994
* Delivery of Full Papers .................... February 1, 1995
* Early registration (Sfrs. 700.-) .......... February 1, 1995
* Late registration (Sfrs. 850.-)
Full registration includes attendance at all sessions, the conference dinner and the conference proceedings. Full-time students with a valid student ID card may register at a reduced rate of Sfrs. 400.- for all technical sessions. Student registration, however, does not include the banquet or proceedings. Extra banquet tickets will be sold for accompanying persons and students. The proceedings can be purchased separately through the ICSC-Secretariat. VIII. ACCOMMODATION Accommodation charges are not included in the fees, but block reservations will be made by the ICSC-Secretariat for the conference period at several hotels. More information will be made available with the letter of acceptance. IX. FURTHER INFORMATION For further information please contact International Computer Science Conventions:
ICSC-Secretariat Canada: P O Box 279, Millet, Alberta T0C 1Z0, Canada. Fax: ++1-403-387-4329
ICSC-Secretariat Switzerland: P O Box 657, CH-8055 Zurich, Switzerland. Fax: ++41-1-761-9627
****************************************************************************** ========================== Nigel Steele Chairman, Division of Mathematics School of Mathematical and Information Sciences Coventry University Priory Street Coventry CV1 5FB United Kingdom. 
tel: (0203) 838568 +44 203 838568 email: NSTEELE at uk.ac.coventry (JANET) or NSTEELE at coventry.ac.uk (EARN BITNET etc.) fax: (0203) 838585 +44 203 838585 From plaut at cmu.edu Wed Jul 20 18:06:23 1994 From: plaut at cmu.edu (David Plaut) Date: Wed, 20 Jul 1994 18:06:23 -0400 Subject: Preprint: Modularity and double dissociations in damaged networks Message-ID: <16437.774741983@crab.psy.cmu.edu> Double Dissociation Without Modularity: Evidence from Connectionist Neuropsychology David C. Plaut Department of Psychology Carnegie Mellon University To appear in Journal of Clinical and Experimental Neuropsychology Special Issue on Modularity and the Brain Many theorists assume that the cognitive system is composed of a collection of encapsulated processing components or modules, each dedicated to performing a particular cognitive function. On this view, selective impairment of cognitive tasks following brain damage, as evidenced by double dissociations, is naturally interpreted in terms of the loss of particular processing components. By contrast, the current investigation examines in detail a double dissociation between concrete and abstract word reading after damage to a connectionist network that pronounces words via meaning and yet has no separable components (Plaut & Shallice, 1993, Cogn. Neuropsych.). The functional specialization in the network that gives rise to the double dissociation is not transparently related to the network's structure, as modular theories assume. Furthermore, a consideration of the distribution of effects across quantitatively equivalent individual lesions in the network raises specific concerns about the interpretation of single-case studies. The findings underscore the necessity of relating neuropsychological data to cognitive theories in the context of specific computational assumptions about how the cognitive system operates normally and after damage. 
ftp-host: hydra.psy.cmu.edu [128.2.248.152] ftp-file: pub/plaut/papers/plaut.modularity.JCEN.ps.Z [31 pages; 0.48Mb uncompressed] =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-= David Plaut plaut at cmu.edu "Doubt is not a pleasant Department of Psychology 412/268-5145 condition, but certainty Carnegie Mellon University 412/268-5060 (FAX) is an absurd one." Pittsburgh, PA 15213-3890 345H Baker Hall --Voltaire =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-= From plaut at cmu.edu Wed Jul 20 18:06:55 1994 From: plaut at cmu.edu (David Plaut) Date: Wed, 20 Jul 1994 18:06:55 -0400 Subject: Preprint: Rehabilitation and relearning in damaged networks Message-ID: <16478.774742015@crab.psy.cmu.edu> Relearning after Damage in Connectionist Networks: Toward a Theory of Rehabilitation David C. Plaut Department of Psychology Carnegie Mellon University To appear in Brain and Language Special Issue on Cognitive Approaches to Rehabilitation and Recovery in Aphasia Connectionist modeling offers a useful computational framework for exploring the nature of normal and impaired cognitive processes. The current work extends the relevance of connectionist modeling in neuropsychology to address issues in cognitive rehabilitation: the degree and speed of recovery through retraining, the extent to which improvement on treated items generalizes to untreated items, and how treated items are selected to maximize this generalization. A network previously used to model impairments in mapping orthography to semantics is retrained after damage. The degree of relearning and generalization varies considerably for different lesion locations, and has interesting implications for understanding the nature and variability of recovery in patients. 
In a second simulation, retraining on words whose semantics are atypical of their category yields more generalization than retraining on more typical words, suggesting a counterintuitive strategy for selecting items in patient therapy to maximize recovery. In a final simulation, changes in the pattern of errors produced by the network over the course of recovery are used to constrain explanations of the nature of recovery in analogous brain-damaged patients. Taken together, the findings demonstrate that the nature of relearning in damaged connectionist networks can make important contributions to a theory of rehabilitation in patients. ftp-host: hydra.psy.cmu.edu [128.2.248.152] ftp-file: pub/plaut/papers/plaut.rehab.BrLang.ps.Z [39 pages; 0.66Mb uncompressed] =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-= David Plaut plaut at cmu.edu "Doubt is not a pleasant Department of Psychology 412/268-5145 condition, but certainty Carnegie Mellon University 412/268-5060 (FAX) is an absurd one." Pittsburgh, PA 15213-3890 345H Baker Hall --Voltaire =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-= From degaris at hip.atr.co.jp Thu Jul 21 15:46:26 1994 From: degaris at hip.atr.co.jp (Hugo de Garis) Date: Thu, 21 Jul 94 15:46:26 JST Subject: ALife IV Conference Report, Hugo de Garis, ATR Message-ID: <9407210646.AA00982@gauss> ALife IV Conference Report, Hugo de Garis, ATR The 4th Artificial Life conference was held at MIT in Boston, Massachusetts, USA, July 6th to 8th, 1994, organised by Rod Brooks and Pattie Maes. About 500 people turned up, to hear roughly 60 talks spread over plenaries and dual split sessions. There were over 50 posters. A book containing only the oral talks will be published within a few weeks by MIT Press. The best talks were set for the morning of the 6th. The kickoff speech was by (of course) Chris Langton, father of and labeller of the field "Artificial Life". 
Chris spoke of the dream of ALife to build artificial biologies, so that the universal properties of all forms of life, whether biological or artificial, can be understood. He emphasized the role of evolution much more strongly than he did at the previous conference at Santa Fe in 1992. In an hour-long talk he systematically covered the steps in ALife research towards greater autonomy in the evolutionary process of production and selection, ranging over the work of Dawkins, Hillis, and Lindgren, to Ray's fully autonomous "Tierra". I was struck by this apparent about-face in Chris's attitude towards the importance and relevance of evolutionary approaches to ALife. I remember him saying to me at the 1992 conference that he was rather bored by GAs. Chris talked about his concept of "collectionism" or micro-macro dynamics, which is both top-down and bottom-up, where the macro behavior emerges in a bottom-up way from the micro local rules of simple agents, yet the macro emergent effects feed back in a top-down way on the behavior of the agents. He spoke of biological hierarchies, from prokaryotes to eukaryotes to multicells to societies. He said the future of life is in humanity's hands. It was an inspiring and fun talk, even if it did run over time, thus testing the patience of Rod, who was session chair. (Every 5 minutes over time, Rod would advance a bit, to Chris's "Uh oh!") The following two talks, by Demetri Terzopoulos et al. and Karl Sims, were the highlights of the conference in my book. Both effectively built (simulated) artificial organisms. Terzopoulos et al. simulated artificial fish, using springs and differential equations to provide the fish with lifelike motions. The scope of their work can be seen from the section titles in their paper, e.g. 
physics-based fish model and locomotion, mechanics, swimming using muscles and hydrodynamics, motor controllers, pectoral fins, learning muscle based locomotion, learning strategy, low level learning, abstraction of high level controllers, sensory perception, vision sensor, behavioral modeling, habits and mental state, intention generator, behavior routines, artificial fish types, predators, pacifists. It was an extraordinary piece of work and will probably be highly influential in the next year or so. Karl Sims's paper combined his genius at computer graphics with some solid research ability. He evolved 3D rectangloid-shaped "creatures" AND their neural network controllers, and had these creatures fight it out in pairs in a co-evolutionary competition to get as close as possible to a target cube. I had the eerie feeling watching the video of these creatures that I was witnessing the birth of a new field, namely "brain building", where the focus is on constructing increasingly elaborate artificial nervous systems. I will say more about this later. The remaining talks of the first morning were by Dave Ackley (on "Altruism in the Evolution of Communication"), Hiroaki Kitano (on "Evolution of Metabolism for Morphogenesis", which made a solid contribution to the nascent field of artificial embryology), and Craig Reynolds (of "Boid" fame) (on "Competition, Coevolution and the Game of Tag", a coevolution of an alternating cat-and-mouse game). In the afternoon of the 6th, in a plenary talk, my boss Shimohara spoke of ALife work at our Evolutionary Systems Department at ATR labs, Kyoto, Japan. (By the way, the next conference, i.e. ALife V, 1996, will be organized by Chris Langton, with local assistance from Shimohara-san, and will probably be held around mid-May in Kyoto or Nara, Japan's favorite tourist cities.) He introduced the researchers and the work of his group, e.g. 
software evolution (Tom Ray's "Tierra" and its multicell extension), my "CAM-Brain" (which hopes to evolve billion-neuron brains at electronic speeds inside cellular automata machines), Hemmi and Mizoguchi's "Evolvable Hardware" (which uses Koza's Genetic Programming to evolve tree-structured HDL (hardware description language) descriptions of electronic circuits), and other members of our group. He then briefly showed how extensive ALife research has become in Japan. Shimohara stunned his audience by stating that the long-term aim of the group, i.e. by the year 2001, is to build an artificial brain. A string of people came up to me after his talk with the comment "Is he serious?" "Yep", I said. After that, the conference split into dual sessions, so I missed half the talks. To get an overview of the best talks in the dual sessions, I asked some of the organizers and "senior attendants" who they felt gave the best or the most interesting or promising talks. As usual in these ALife reports of mine, there is a strong dose of subjective judgement and bias. Some highlights were :- Jeffrey Kephart's "A Biologically Inspired Immune System for Computers" introduced the notion of "computer immune systems" to counter computer viruses. He is from IBM, so he was woolly on details, but he said that the millions of dollars spent on viral protection made a computer immune system essential. He also stated that a running system would be ready at IBM within a year. Such a system could be the first multimillion-dollar ALife-based application. Hosokawa et al.'s talk "Dynamics of Self Assembling Systems - Analogy with Chemical Kinetics" I did not see at the conference, but had already seen at a seminar they presented at ATR. They shake cardboard triangles with internal magnets so that they self-assemble into multicelled systems. They then analyse the probabilities of forming various self-assembling shapes. 
Beckers et al.'s talk "From Local Actions to Global Tasks : Stigmergy and Collective Robotics" I did not see either. It took a foraging behavioral principle of termites (stigmergy) and applied it to minirobots. Nolfi et al.'s "How to Evolve Autonomous Robots : Different Approaches in Evolutionary Robotics" discussed the rival approaches to evolving neural controllers for robots, i.e. simulation versus real-world fitness measurements (fast and simple vs. slow and complex). A good overview paper on a complex and important issue. etc etc The other plenary talks were :- Jill Tarter and Paul Horowitz on "Search for Extra-Terrestrial Intelligence". This promised to be a fun talk, but Tarter is too nuts-and-bolts a personality and was too preoccupied by a recent funding cut to relate well to her audience. Horowitz was more fun, with a definite sense of humor matching his competence. However, what was lacking was a link between SETI and ALife. These two speakers were simply parachuted in from outside, without instructions to connect SETI to ALife. An opportunity for synergy between SETI and ALife was missed. Questions such as "what types of life should SETI expect to find, would their biochemistry necessarily be similar to ours, etc." were not even addressed. Pity. Jack Szostak spoke on "Towards the In Vitro Evolution of an RNA Replicase". This talk I found riveting. I believe that the blossoming field of molecular evolution is the hottest and most significant branch of ALife around today. It will revolutionize the fields of genetic engineering and the drug industry, and may even play a role in the long-term construction of artificial cells. This field is about GAs applied to real molecules, evolving them in a cycle of test, select, amplify. Nobels will flow from this field. Already, recognition of Gerald Joyce's pioneering work in this field has come in the form of prizes. Stay tuned. 
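The test-select-amplify cycle of molecular evolution is, in silico, just the generational loop of a genetic algorithm. As a toy sketch only (the bitstring genomes, the all-ones fitness function, and all parameter values below are invented for illustration and have nothing to do with RNA chemistry), one such loop looks like:

```python
import random

def mutate(genome, rng, rate=0.02):
    """Flip each bit with small probability (copying with error)."""
    return [1 - g if rng.random() < rate else g for g in genome]

def evolve(fitness, pop_size=50, genome_len=20, generations=40, seed=0):
    """Toy test-select-amplify loop: score each bitstring ('test'),
    keep the best half ('select'), and refill the population with
    mutated copies of the survivors ('amplify')."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)                # test
        parents = pop[: pop_size // 2]                     # select
        pop = [mutate(p, rng)                              # amplify
               for p in parents for _ in range(2)]
    return max(pop, key=fitness)

# Toy fitness: count of 1-bits (the "all-ones" problem).
best = evolve(fitness=sum)
```

With truncation selection and a low mutation rate, the population drifts steadily toward the fitness optimum; in vitro selection replaces the fitness call with a binding or catalysis assay, and the mutation step with error-prone amplification.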
Tom Ray paced up and down the stage introducing his concept of "A Proposal to Create a Network-Wide Biodiversity Reserve for Digital Organisms", i.e. putting Tierra on thousands of computers on the Internet. Tom wowed his audience with statements like ".. the digital organisms will migrate around the globe on a daily basis, staying on the dark side of the planet, because they will have discovered that there is more CPU time available at night, while users sleep". Ray dreams of "digital farming", i.e. tapping spontaneously evolved digital organisms and using them for useful purposes. He prefers spontaneous evolution to directed evolution ("autonomism" vs. "directivism"). Stefan Helmreich, an anthropologist, reported on his studies of ALifers and their work. Chris Langton introduced him saying that he (i.e. Chris) felt like a bug being examined by Helmreich. I had a rather antsy feeling listening to him, because he sounded rather like a psychoanalyst or a theologian, in the sense of not feeling compelled to put his conjectures to the test. It was most edifying to learn that most ALifers are upper-middle-class, straight, atheist WASPs, etc. The talk had a definite ideological axe-to-grind edge to it. He also read his speech, a real no-no in computer land, and spoke at machine-gun pace, totally losing his non-native-English-speaker audience. While the bullets were flying, I couldn't help thinking that surveys had shown that on average the theoretical physicists and mathematicians are the smartest groups at universities, and the anthropologists are the dumbest. Helmreich was certainly not dumb, but some of his assertions sure were antsy. The afternoon of the second day was taken up with posters and tours of MIT's Media Lab and the AI Lab. I went to the AI Lab and snapped lots of photos of the team members of "COG", Brooks's latest attempt at AI. 
It was production-line research, with a PERT chart over 3 years with more than 30 arrows, each arrow being a PhD or master's thesis. I met over a dozen young researchers working on COG, an upper-torso robot with vision, hearing, hand and finger control, and hopefully COGnitive abilities. This is a very ambitious project. Brooks will need all the luck he can get. At a recent Tokyo workshop, Brooks said that he launched COG because he felt he had only one 10-year project left in him, and he wanted to have a shot at making an AI human rather than some artificial cockroach or something equally unsexy. Good luck Rod, and a long life! The morning of the third day, Luc Steels gave a plenary talk on "Emergent functionality of robot behavior through on-line evolution". Unfortunately, I skipped the third day to meet another engagement, so I can't give an opinion. General Comments To those researchers in the field of evolutionary computation, I think you can congratulate yourselves. EC played a significant, if not dominant, role at ALife IV. Chris Langton stated in his editorial in the first issue of the new MIT Press journal "Artificial Life" that he did not want to see any more "YANNs" (i.e. yet another (evolved) neural net). This shows how powerful a tool EC has become. A journalist writing on evolvable hardware in the magazine "The Economist" in 1993 described evolution as the computational theme of the 90s. It looks that way more and more. I asked over a dozen people what they thought of the conference in general. An assortment of comments :- The field of ALife has matured. The mathematicians are starting to move in, time to move out. There was little new, just more of the same. A good solid conference, solid work, respectable. Boring, all the fringey stuff was weeded out. I must say that the last comment hit home for me. ALife IV felt like "just another conference" to me, whereas ALife III had real zing. 
Apart from a few papers on evolvable hardware, a paper on computer immunity, and a few others, there was little I could describe as being qualitatively new. It looks as though the field has matured, as evidenced by the fact that there is now an MIT Press ALife journal, and that 500 or so people turned up to ALife IV. Chris Langton's three ALife conferences were characterised by a mix of creative fun and solid competence. I felt the ALife IV conference lacked the fun element. This can be dangerous, because the "creative-crazies" who pioneer a field are a fickle lot, and can very easily move on to the next hot topic. I remember a conversation with Chris Langton, wondering what the next hot topic will be. We didn't know. Well, now I think I know what it will be. I had premonitions of it listening to Terzopoulos's and Sims's talks. My feeling is that enough people are now playing around with building artificial nervous systems (e.g. the "3 musketeers" at Sussex, UK; Beer and Arbib in the US; our group at ATR, Japan; Nolfi et al. in Italy; etc.) that the time is ripe for the birth of a new field, which I call simply "Brain Building". I'm sticking my neck out here, but I feel fairly confident this will happen. I'm predicting that the field of ALife will give birth to this new field. I'm curious to see how other people feel about this prediction. ALife V in Japan (probably Kyoto or Nara), 1996. Finally, if you have been promising yourself a trip to Japan before you get too old, here is your chance. ALife V will be held in 1996 in Japan, probably in May, in Kyoto or Nara, Japan's favorite tourist cities, with "a temple on every corner". Maybe you can combine the conference with a week or two of touristing. I live here and I still haven't exhausted what there is to see. If I'm not too busy talking with my million-neuron brain in 1996, see you there (i.e. here). MIT Press will publish the oral papers in a book due out within a matter of weeks, I'm told. Cheers, Hugo de Garis Dr. 
Hugo de Garis, Brain Builder Group, Evolutionary Systems Department, ATR Human Information Processing Research Laboratories, 2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto-fu, Kansai Science City, 619-02, Japan. tel. + 81 7749 5 1079, fax. + 81 7749 5 1008, email. degaris at hip.atr.co.jp From Mark_Kantrowitz at GLINDA.OZ.CS.CMU.EDU Thu Jul 21 15:41:56 1994 From: Mark_Kantrowitz at GLINDA.OZ.CS.CMU.EDU (Mark Kantrowitz) Date: Thu, 21 Jul 94 15:41:56 -0400 Subject: CMU Artificial Intelligence Repository Message-ID: <11485.774819716@GLINDA.OZ.CS.CMU.EDU> ** ANNOUNCING ** ++++++++++++++++++++++++++++++++++++++++++ + CMU Artificial Intelligence Repository + + and + + Prime Time Freeware for AI + ++++++++++++++++++++++++++++++++++++++++++ July 1994 The CMU Artificial Intelligence Repository was established by Carnegie Mellon University to contain public domain and freely distributable software, publications, and other materials of interest to AI researchers, educators, students, and practitioners. The AI Repository currently contains more than a gigabyte of material and is growing steadily. The AI Repository is accessible for free by anonymous FTP, AFS, and WWW. A selection of materials from the AI Repository is also being published on CD-ROM by Prime Time Freeware and should be available for purchase at AAAI-94 or direct by mail or fax from Prime Time Freeware (see below). ----------------------------- Accessing the AI Repository: ----------------------------- To access the AI Repository by anonymous FTP, ftp to: ftp.cs.cmu.edu [128.2.206.173] and cd to the directory: /user/ai/ Use username "anonymous" (without the quotes) and type your email address (in the form "user at host") as the password. 
To access the AI Repository by AFS (Andrew File System), use the directory: /afs/cs.cmu.edu/project/ai-repository/ai/ To access the AI Repository by WWW, use the URL: http://www.cs.cmu.edu:8001/Web/Groups/AI/html/repository.html Be sure to read the files 0.doc and readme.txt in this directory. ------------------------------- Contents of the AI Repository: ------------------------------- The AI Programming Languages and the AI Software Packages sections of the repository are "complete". These can be accessed in the lang/ and areas/ subdirectories of the AI Repository. Compression and archiving utilities may be found in the util/ subdirectory. Other directories, which are in varying states of completion, are events/ (Calendar of Events, Conference Calls) and pubs/ (Publications, including technical reports, books, mail/news archives). The AI Programming Languages section includes directories for Common Lisp, Prolog, Scheme, Smalltalk, and other AI-related programming languages. The AI Software Packages section includes subdirectories for: agents/ Intelligent Agent Architectures alife/ Artificial Life and Complex Adaptive Systems anneal/ Simulated Annealing blackbrd/ Blackboard Architectures bookcode/ Code From AI Textbooks ca/ Cellular Automata classics/ Classical AI Programs constrnt/ Constraint Processing dai/ Distributed AI discover/ Discovery and Data-Mining doc/ Documentation edu/ Educational Tools expert/ Expert Systems/Production Systems faq/ Frequently Asked Questions fuzzy/ Fuzzy Logic games/ Game Playing genetic/ Genetic Algorithms, Genetic Programming, Evolutionary Programming icot/ ICOT Free Software kr/ Knowledge Representation, Semantic Nets, Frames, ... 
learning/ Machine Learning misc/ Miscellaneous AI music/ Music neural/ Neural Networks, Connectionist Systems, Neural Systems nlp/ Natural Language Processing (Natural Language Understanding, Natural Language Generation, Parsing, Morphology, Machine Translation) planning/ Planning, Plan Recognition reasonng/ Reasoning (Analogical Reasoning, Case Based Reasoning, Defeasible Reasoning, Legal Reasoning, Medical Reasoning, Probabilistic Reasoning, Qualitative Reasoning, Temporal Reasoning, Theorem Proving/Automated Reasoning, Truth Maintenance) robotics/ Robotics search/ Search speech/ Speech Recognition and Synthesis testbeds/ Planning/Agent Testbeds vision/ Computer Vision The repository has standardized on using 'tar' for producing archives of files and 'gzip' for compression. ------------------------------------- Keyword Searching of the Repository: ------------------------------------- To search the keyword index by mail, send a message to: ai+query at cs.cmu.edu with one or more lines containing calls to the keys command, such as: keys lisp iteration in the message body. You'll get a response by return mail. Do not include anything else in the Subject line of the message or in the message body. For help on the query mail server, include: help instead. A Mosaic interface to the keyword searching program is in the works. We also plan to make the source code (including indexes) to this program available, as soon as it is stable. 
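The access and packaging conventions above can be sketched end to end. The FTP session itself is interactive, so it appears only as comments below, and `somepkg` is a made-up package name, not one from the repository; the executable part just demonstrates the tar/gzip round trip the repository has standardized on.

```shell
# Retrieval (interactive; shown as comments -- the package path is hypothetical):
#   ftp ftp.cs.cmu.edu          # or 128.2.206.173
#   Name: anonymous
#   Password: user@host         # your email address
#   ftp> cd /user/ai/areas
#   ftp> binary
#   ftp> get somepkg.tar.gz
#   ftp> quit

# Packaging convention: 'tar' for archiving, 'gzip' for compression.
# Build and unpack a stand-in package locally to show the round trip.
mkdir -p somepkg
echo "demo package" > somepkg/0.doc
tar cf somepkg.tar somepkg            # archive with tar
gzip -f somepkg.tar                   # compress -> somepkg.tar.gz
rm -rf somepkg
gunzip -c somepkg.tar.gz | tar xf -   # unpack a retrieved archive
cat somepkg/0.doc                     # -> demo package
```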
------------------------------------------ Contributing Materials to the Repository: ------------------------------------------ Contributions of software and other materials are always welcome, but must be accompanied by an unambiguous copyright statement that grants permission for free use, copying, and distribution, such as: - a declaration that the materials are in the public domain, or - a copyright notice that states that the materials are subject to the GNU General Public License (cite version), or - some other copyright notice (we will tell you if the copying permissions are too restrictive for us to include the materials in the repository) Inclusion of materials in the repository does not modify their copyright status in any way. Materials may be placed in: ftp.cs.cmu.edu:/user/ai/new/ When you put anything in this directory, please send mail to ai+contrib at cs.cmu.edu giving us permission to distribute the files, and state whether this permission is just for the AI Repository, or also includes publication on the CD-ROM version (Prime Time Freeware for AI). We would appreciate it if you would include a 0.doc file for your package; see /user/ai/new/package.doc for a template. (If you don't have the time to write your own, we can write it for you based on the information in your package.) ------------------------------------- Prime Time Freeware for AI (CD-ROM): ------------------------------------- A portion of the contents of the repository is published annually by Prime Time Freeware. The first issue consists of two ISO-9660 CD-ROMs bound into a 224-page book. Each CD-ROM contains approximately 600 megabytes of gzipped archives (more than 2 gigabytes uncompressed and unpacked). Prime Time Freeware for AI is particularly useful for folks who do not have FTP access, but may also be useful as a way of saving disk space and avoiding annoying FTP searches and retrievals.
Prime Time Freeware helped establish the CMU AI Repository, and sales of Prime Time Freeware for AI will continue to help support the maintenance and expansion of the repository. It sells (list) for US$60 plus applicable sales tax and shipping and handling charges. Payable through Visa, MasterCard, postal money orders in US funds, and checks in US funds drawn on a US bank. For further information on Prime Time Freeware for AI and other Prime Time Freeware products, please contact: Prime Time Freeware 370 Altair Way, Suite 150 Sunnyvale, CA 94086 USA Tel: +1 408-433-9662 Fax: +1 408-433-0727 E-mail: ptf at cfcl.com ------------------------ Repository Maintainer: ------------------------ The AI Repository was established by Mark Kantrowitz in 1993 as an outgrowth of the Lisp Utilities Repository (established 1990) and his work on the FAQ (Frequently Asked Questions) postings for the AI, Lisp, Scheme, and Prolog newsgroups. The Lisp Utilities Repository has been merged into the AI Repository. Bug reports, comments, questions and suggestions concerning the repository should be sent to Mark Kantrowitz . Bug reports, comments, questions and suggestions concerning a particular software package should be sent to the address indicated by the author. From rreilly at nova.ucd.ie Fri Jul 22 06:49:41 1994 From: rreilly at nova.ucd.ie (Ronan Reilly) Date: Fri, 22 Jul 1994 11:49:41 +0100 Subject: Research Position - Dublin, Ireland Message-ID: RESEARCH POSITION IN CONNECTIONIST AI Applications are invited for a two-year research position in the Computer Science Department of University College Dublin in the area of connectionist and symbolic knowledge representation with particular application to the construction of expert systems. The position is funded under ESPRIT Project 8162: Quality Assessment of Living with Information Technology (QUALIT). 
The ideal candidate will have a good honours degree in computer science or a related discipline and will have research experience in the area of hybrid connectionist/symbolic expert systems. Salary will be in the range IR15,000-18,000 depending on experience. Applications should be sent to: Dr Ronan Reilly Department of Computer Science University College Dublin Belfield Dublin 4 IRELAND or by e-mail to: rreilly at nova.ucd.ie Closing date for applications is 19 August, 1994. ------------------------------------------------------------------------------ Ronan Reilly, PhD e-mail: rreilly at nova.ucd.ie Dept. of Computer Science Tel.: +353-1-706 2475 University College Dublin Fax: +353-1-269 7262 Belfield, Dublin 4 IRELAND ------------------------------------------------------------------------------ From terry at salk.edu Fri Jul 22 20:26:05 1994 From: terry at salk.edu (Terry Sejnowski) Date: Fri, 22 Jul 94 17:26:05 PDT Subject: ALife IV Conference Report, Hugo de Garis, ATR Message-ID: <9407230026.AA18267@salk.edu> >I remember a conversation with Chris Langton, wondering what the >next hot topic will be. We didn't know. Well, now I think I know what it >will be. I had premonitions of it listening to Terzopoulos's and Sims's talks. >My feeling is that enough people are now playing around with building >artificial nervous systems (e.g. the "3 musketeers" at Sussex, UK; Beer and >Arbib in the US; our group at ATR, Japan; Nolfi et al. in Italy; etc.) that the >time is ripe for the birth of a new field, which I call simply "Brain >Building". I'm sticking my neck out here, but I feel fairly confident this >will happen. I'm predicting that the field of ALife will give birth to this >new field. I'm curious to see how other people feel about this prediction. The First Annual Telluride Workshop on Neuromorphic Engineering, sponsored by the NSF, recently tackled this very issue.
The goal is to develop a new low-power autonomous technology suitable for guiding robots like the ones that Mark Tilden has been evolving. (It may be significant that Mark attended this workshop rather than the AL Meeting.) Christof Koch and I helped to organize this workshop, which brought together engineers and neuroscientists from academia and industry who were interested in building living creatures. Low-power analog VLSI chips already exist that can analyze visual and auditory sensory inputs, and cortical circuit chips are being developed by Rodney Douglas and Misha Mahowald; the principles of sensorimotor integration as studied by Dana Ballard and Richard Andersen are the focus of the theoretical breakthroughs that will be needed to achieve the goal of autonomy in the real world by the next century. One important milestone was announced at the workshop: reliable analog on-chip learning has been developed in Carver Mead's laboratory that will make possible adaptive mechanisms and learning at all levels of processing, as occurs in biological systems. A report on the outcome of the workshop will be made available via ftp -- an announcement will follow in August. Terry ----- From lars at eiffel.ei.dth.dk Mon Jul 25 10:59:10 1994 From: lars at eiffel.ei.dth.dk (Lars Kai Hansen) Date: Mon, 25 Jul 94 15:59:10 +0100 Subject: Report: The Error-Reject Tradeoff Message-ID: <9407251459.AA16729@eiffel.ei.dth.dk> FTP-host: eivind.ei.dth.dk [129.142.65.123] FTP-file: /dist/hansen_reject.ps.Z [30 pages; 400kb uncompressed] The following technical report is available by anonymous ftp from the Electronics Institute ftp-server. Hardcopies are not available. ------------------------------------------------------------------------------- "THE ERROR-REJECT TRADEOFF" Lars Kai Hansen (CONNECT, Electronics Inst. B349, Tech. Univ. Denmark, DK-2800 Lyngby, Denmark); Christian Liisberg (Applied Bio Cybernetics, DK-3390 Hundested, Denmark); Peter Salamon (Dept. of Math. Sciences, San Diego State University, San Diego CA 92182, USA). Abstract: We investigate the error versus reject tradeoff for classifiers. Our analysis is motivated by the remarkable similarity in error-reject tradeoff curves for widely differing algorithms classifying handwritten characters. We present the data in a new scaled version that makes this universal character particularly evident. Based on Chow's theory of the error-reject tradeoff and its underlying Bayesian analysis, we argue that such universality is in fact to be expected for general classification problems. Furthermore, we extend Chow's theory to classifiers working from finite samples on a broad, albeit limited, class of problems. The problems we consider are effectively binary, i.e., classification problems for which almost all inputs involve a choice between the right classification and at most one predominant alternative. We show that for such problems at most half of the initially rejected inputs would have been erroneously classified. We show further that such problems arise naturally as small perturbations of the PAC model for large training sets. The perturbed model leads us to conclude that the dominant source of error comes from pairwise overlapping categories. For infinite training sets, the overlap is due to noise and/or poor preprocessing. For finite training sets there is an additional contribution from the inevitable displacement of the decision boundaries due to finiteness of the sample. In either case, a rejection mechanism which rejects inputs in a shell surrounding the decision boundaries leads to a universal form for the error-reject tradeoff. Finally, we analyze a specific reject mechanism based on the extent of consensus among an ensemble of classifiers. For the ensemble reject mechanism we find an analytic expression for the error-reject tradeoff based on a maximum entropy estimate of the problem difficulty distribution.
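As background for the universality claim, Chow's rule and tradeoff relation can be stated compactly (a standard textbook formulation supplied here for orientation; it is not quoted from the report). A pattern is rejected when the maximum posterior falls below a threshold:

```latex
% Chow's reject rule and the error-reject relation (standard form;
% a sketch for orientation, not an excerpt from the report).
\[
  \text{reject } x
  \quad\Longleftrightarrow\quad
  \max_k \, P(\omega_k \mid x) \;<\; 1 - t ,
\]
\[
  E(t) \;=\; -\int_0^{t} s \, \mathrm{d}R(s),
  \qquad
  \frac{\mathrm{d}E}{\mathrm{d}R} \;=\; -t ,
\]
```

where $E(t)$ and $R(t)$ are the error and reject rates at threshold $t$ (for the effectively binary problems of the abstract, $0 \le t \le 1/2$). The slope relation is the starting point for the universality argument: suitably scaled, the tradeoff curve is determined by the threshold, not by the particular classifier.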
Keywords: error-reject tradeoff, handwritten digits, ensembles, neural networks. ------------------------------------------------------------------------------------ - Lars Kai Hansen lkhansen at ei.dtu.dk From ifsa95 at dep.fem.unicamp.br Mon Jul 25 15:03:20 1994 From: ifsa95 at dep.fem.unicamp.br (IFSA95) Date: Mon, 25 Jul 94 14:03:20 EST Subject: IFSA'95 Call for papers Message-ID: <9407251703.AA01463@dep.fem.unicamp.br> IFSA '95 Sixth International Fuzzy Systems Association World Congress Sao Paulo-Brazil July 22-28, 1995 ___________ First Call for Papers ___________ The International Fuzzy Systems Association is pleased to announce the sixth IFSA World Congress (IFSA '95) to be held in Sao Paulo, Brazil, from July 22nd to 28th, 1995. The goals of the congress are twofold; the first is to encourage communication between researchers throughout the world whose research either draws support from, or complements, the theory and applications of fuzzy sets related models. The second goal is to explore industrial applications of fuzzy systems technology to make systems more convenient. The theme of IFSA '95 is "New Frontiers", aiming at enlarging the horizons of research in fuzzy sets beyond its traditional areas. The congress is divided into 7 main topical areas, whose themes include, but are not limited to, the theory or applications in the topics listed below with the respective area chairs. Authors are invited to submit extended abstracts for consideration by the program committee. Four copies of 4-page extended abstracts, one paper submission form, and one cadastration form filled for each author, should be sent to the secretariat (see address below) before November 1st, 1994. Fax submissions will not be accepted. The proceedings will contain the final versions of refereed papers, prepared on camera-ready sheets. For any further inquires, please send a message to ifsa95 at dep.fem.unicamp.br. 
DATES _____ Tutorials July 22 - 23, 1995 Conference July 24 - 28, 1995 Demonstrations July 22 - 28, 1995 DEADLINES _________ Reception of 4 copies of 4-page extended abstracts; 1 cadastration form for each author; and 1 paper submission form - November 1st, 1994 Notification of acceptance - February 1st, 1995 Reception of final camera-ready copy for proceedings - April 1st, 1995 SECRETARIAT ___________ Address : INPE/Setor de Eventos/ IFSA'95 Av. dos Astronautas, 1758 - Caixa Postal 515 12201-970 Sao Jose' dos Campos - SP - Brazil Phone: +55-123-418977 Fax: +55-123-218743 IFSA'95 MAILING LIST SUBSCRIPTION AND SPECIFIC INFORMATION __________________________________________________________ To subscribe to the IFSA'95 mailing list please send a message to : listserv at cesar.unicamp.br In the body of the message please write "subscribe IFSA" along with your name, in a line by itself. For further information please contact : ifsa95 at dep.fem.unicamp.br ORGANIZERS __________ General Chairman : Armando Rocha (Brazil) Vice Chairman : George Klir (USA) Honorary Chairman : Lotfi Zadeh (USA) Honorary Vice-Chairmen: J. Bezdek (USA) D. Dubois (France) E. Sanchez (France) T. Terano (Japan) H.-J. Zimmermann (Germany) Steering Committee Chairwoman : Sandra Sandri (Brazil) Scientific Board Chairman : Fernando Gomide (Brazil) Local Arrangements Chairman : Marcio Rillo (Brazil) AREA CHAIRS ___________ Artificial Intelligence - Ronald Yager (USA) _______________________________________________________________________ Approximate Reasoning, Knowledge Acquisition, Knowledge Representation, Expert Systems Design, Natural Language Issues, Decision Making, Computer Vision & Pattern Recognition, Distributed AI, Genetic Algorithms, Artificial Life, Evolutionary Systems, and other related topics.
Engineering - Kaoru Hirota (Japan) _______________________________________________________________________ Fuzzy Control, Hybrid Control, Industrial Robots, Intelligent Robotics, Industrial Systems, Fuzzy Petri Nets, Manufacturing, and other related topics. Mathematical Foundations - Peter Klement _______________________________________________________________________ Non-Classical Logics, Category Theory, Analysis, Algebra and Topology, Functional Equations, Fuzzy Measures, Approximation Theory, Evidence Theory & Probability & Statistics, Relational Equations, and other related topics. Information Sciences - Henri Prade (France) _______________________________________________________________________ Automata, Fuzzy Grammars, Formal Languages, Information Retrieval, Fuzzy Databases, Distributed Data Bases, Information Theory, Distributed Soft-Computing, and other related topics. Health Sciences, Biology, Psychology - Donna Hudson (USA) _______________________________________________________________________ Medical Diagnosis, Intelligent Patient Monitoring, Laboratory Intelligent Systems, Applications in Molecular Biology, Physiology, Pharmacology, Perception, and other related topics. Neural Nets and Hardware - Takeshi Yamakawa (Japan) _______________________________________________________________________ Neural Nets, System Identification by Neural Networks, Neural Chips, Hardware Implementations of Neural Networks, Fuzzy Hardware and Fuzzy Logic Computer Implementation, Novel Devices for Neural & Chaotic Systems, and other related topics. Fuzzy Systems - J.L. Verdegay (Spain) _______________________________________________________________________ Fuzzy Linear and Non-linear Programming, Multiple Objective Decision Making, Fuzzy Mathematical Programming, Multiple Criteria Decision, Multi-person Decision Making, Operational Research, Management, Economic Systems, Fuzzy Databases, Distributed Data Bases, and other related topics.
Also, papers not classified in the above areas. - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - CADASTRATION FORM Last Name ____________________________________________________________ First name ___________________________________________________________ Organization/Affiliation _____________________________________________ ______________________________________________________________________ Address ______________________________________________________________ ______________________________________________________________________ Zip/Postal Code _____________________ City ___________________________ Country ______________________________________________________________ Telephone ___________________________________________________________ Fax __________________________________________________________________ E-mail address ______________________________________________________ 1a) Do you intend to submit a paper at the conference ? Yes __ No __ 1b) As main author ? Yes __ No __ 2) Do you intend to attend the conference ? 
Yes __ No __ - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - PAPER SUBMISSION FORM Title ________________________________________________________________ Authors ______________________________________________________________ Reader at the conference _____________________________________________ Area of the paper (please mark only one option) [ ] Artificial Intelligence [ ] Engineering [ ] Mathematical Foundations [ ] Information Sciences [ ] Health Sciences, Biology, Psychology [ ] Neural Nets and Hardware [ ] Fuzzy Systems From jan at riks.nl Wed Jul 27 09:07:09 1994 From: jan at riks.nl (Jan Paredis) Date: Wed, 27 Jul 94 15:07:09 +0200 Subject: TR: Co-evolutionary Training of NNs Message-ID: <9407271307.AA07180@london> *** DO NOT FORWARD TO OTHER GROUPS *** The following paper is now available: TITLE: Steps towards Co-evolutionary Classification Neural Networks AUTHOR: Jan Paredis 10 pages To appear in: Proc. Artificial Life IV, R. Brooks, P. Maes (eds), MIT Press / Bradford Books. ABSTRACT This paper proposes two improvements to the genetic evolution of neural networks (NNs): life-time fitness evaluation and co-evolution. A classification task is used to demonstrate the potential of these methods and to compare them with state-of-the-art evolutionary NN approaches. Furthermore, both methods are complementary: co-evolution can be used in combination with life-time fitness evaluation. Moreover, the continuous feedback associated with life-time evaluation paves the way for the incorporation of life-time learning. This may lead to hybrid approaches which involve genetic as well as, for example, back-propagation learning. In addition to this, life-time fitness evaluation allows an apt response to noise and changes in the problem to be solved.
------------------------------- To obtain a hardcopy send an e-mail to: jan at riks.nl Subject: NN paper request Body: Your full Snail Mail address Then a hardcopy will be sent to you. Jan Paredis RIKS Postbus 463 NL-6200 AL Maastricht The Netherlands email: jan at riks.nl tel: +31 43 253433 fax: +31 43 253155 From tishby at CS.HUJI.AC.IL Wed Jul 27 09:15:55 1994 From: tishby at CS.HUJI.AC.IL (Tali Tishby) Date: Wed, 27 Jul 1994 16:15:55 +0300 Subject: Updates on 12-ICPR, Jerusalem Message-ID: <199407271315.AA04812@humus.cs.huji.ac.il> =============================================================================== ***** Updates ***** 12th ICPR : INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION 9-13 October 1994 Renaissance Hotel, Jerusalem, Israel ***** Advance Registration Deadline: 9 August 1994 ***** ***** Authors: Camera-ready due August 8 at the IEEE Computer Society ***** =============================================================================== 1. Get full updated information by sending E-Mail to icpr-info at cs.huji.ac.il. 2. A network of 15 Silicon Graphics computers and 10 NCD X-terminals, with a high-speed Internet link, will be available. Bring your Demonstrations!! You could also telnet to your own computer, of course, and read E-Mail. 3. On-Line information about Jerusalem can be obtained by telnetting to "www.huji.ac.il", logging in as www, and then selecting "[1] Line Mode Interface" followed by "[3] Databases in Israel" and "[13] The Jerusalem Mosaic". Don't worry if you get some funny symbols. If you have Mosaic you can select: http://shum.cc.huji.ac.il/jeru/jerusalem.html 4. The Banquet will be a Bedouin feast, combined with a special sight-and-sound show, at the foot of Masada. An unforgettable experience!
During the banquet, the following announcements will be made: * IAPR Announcement: New IAPR Executive Committee, Venue for 14-ICPR * Nomination of IAPR Fellows * Best Industry-Related Paper Award * Best-Paper-Award by the journal "Pattern Recognition" 5. The opening session of the conference will be on Monday, Oct 10, 08:30 AM: 8:30 Welcome Address: J. Aggarwal, President of IAPR 8:40 Presentation of the K.S. Fu Award 8:45 Address by the winner of the K.S. Fu Award 9:15 Welcome Address: 12-ICPR Conference Chairmen 9:30 Plenary Talk: Avnir, D. - Hebrew University - THE PATTERNED NATURE 10:00 Coffee Break 10:30 Start of 4 Parallel Sessions 6. Master Card is now also accepted for registration payments. =============================================================================== ------- End of Forwarded Message From ro2m at crab.psy.cmu.edu Wed Jul 27 14:57:31 1994 From: ro2m at crab.psy.cmu.edu (Randall C. O'Reilly) Date: Wed, 27 Jul 94 14:57:31 EDT Subject: TR: Hippocampal Conjunctive Encoding, Storage, and Recall Message-ID: <9407271857.AA27494@crab.psy.cmu.edu.psy.cmu.edu> The following Technical Report is available both electronically from our own FTP server or in hard copy form. Instructions for obtaining copies may be found at the end of this post. ======================================================================== HIPPOCAMPAL CONJUNCTIVE ENCODING, STORAGE, AND RECALL: AVOIDING A TRADEOFF Randall C. O'Reilly James L. McClelland Carnegie Mellon University Technical Report PDP.CNS.94.4 June 1994 The hippocampus and related structures are thought to be capable of: 1) representing cortical activity in a way that minimizes overlap of the representations assigned to different cortical patterns (pattern separation); and 2) modifying synaptic connections so that these representations can later be reinstated from partial or noisy versions of the cortical activity pattern that was present at the time of storage (pattern completion). 
We point out that there is a tradeoff between pattern separation and completion, and propose that the unique anatomical and physiological properties of the hippocampus might serve to minimize this tradeoff. We use analytical methods to determine quantitative estimates of both separation and completion for specified parameterized models of the hippocampus. These estimates are then used to evaluate the role of various properties of the hippocampus, such as the activity levels seen in different hippocampal regions, synaptic potentiation and depression, the multi-layer connectivity of the system, and the relatively focused and strong mossy fiber projections. This analysis is focused on the feedforward pathways from the Entorhinal Cortex (EC) to the Dentate Gyrus (DG) and region CA3. Among our results are the following: 1) Hebbian synaptic modification (LTP) facilitates completion but reduces separation, unless the strengths of synapses from inactive presynaptic units to active postsynaptic units are reduced (LTD). 2) Multiple layers, as in EC to DG to CA3, allow the compounding of pattern separation, but not pattern completion. 3) The variance of the input signal carried by the mossy fibers is important for separation, not the raw strength, which may explain why the mossy fiber inputs are few and relatively strong, rather than many and relatively weak like the other hippocampal pathways. 4) The EC projects to CA3 both directly and indirectly via the DG, which suggests that the two-stage pathway may dominate during pattern separation and the one-stage pathway may dominate during completion; methods the hippocampus may use to enhance this effect are discussed.
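The separation half of this tradeoff can be illustrated with a toy sketch (my own illustration, not the authors' model): a fixed random projection into a larger layer, followed by k-winners-take-all, yields sparse codes whose overlap is lower than that of two similar inputs, loosely in the spirit of the sparse, expanded coding attributed to the dentate gyrus. All sizes and the overlap measure are arbitrary assumptions.

```python
# Toy pattern separation: random expansion + k-winners-take-all.
# Illustrative only; not the parameterized hippocampal model of the TR.
import numpy as np

rng = np.random.default_rng(0)

def kwta(x, k):
    """Sparse binary code: keep the k largest entries as 1s."""
    out = np.zeros_like(x)
    out[np.argsort(x)[-k:]] = 1.0
    return out

def overlap(x, y):
    """Normalized overlap of two binary patterns."""
    return np.sum(x * y) / np.sqrt(np.sum(x) * np.sum(y))

n_in, n_out, k = 100, 1000, 20            # expansion and sparse activity (assumed)
W = rng.normal(size=(n_out, n_in))        # fixed random projection

a = (rng.random(n_in) < 0.3).astype(float)   # a dense input pattern
b = a.copy()                                  # a similar input:
flip = rng.choice(n_in, 10, replace=False)    # differs in 10 of 100 units
b[flip] = 1.0 - b[flip]

ca, cb = kwta(W @ a, k), kwta(W @ b, k)
print(round(overlap(a, b), 2), round(overlap(ca, cb), 2))
# the inputs overlap heavily; the sparse codes overlap much less
```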
======================================================================= Retrieval information for pdp.cns TRs: unix> ftp 128.2.248.152 # hydra.psy.cmu.edu Name: anonymous Password: ftp> cd pub/pdp.cns ftp> binary ftp> get pdp.cns.94.4.ps.Z ftp> quit unix> zcat pdp.cns.94.4.ps.Z | lpr # or however you print postscript NOTE: The compressed file is 249,429 bytes long. Uncompressed, the file is 754,521 bytes long. The printed version is 41 total pages long. For those who do not have FTP access, physical copies can be requested from Barbara Dorney. From jbower at smaug.bbb.caltech.edu Thu Jul 28 14:42:12 1994 From: jbower at smaug.bbb.caltech.edu (jbower@smaug.bbb.caltech.edu) Date: Thu, 28 Jul 94 11:42:12 PDT Subject: No subject Message-ID: <9407281842.AA17673@smaug.bbb.caltech.edu> >I remember a conversation with Chris Langton, wondering what the >next hot topic will be ... >the time is ripe for the birth of a new field, which I call simply "Brain >Building". I'm sticking my neck out here, but I feel fairly confident this >will happen. I'm predicting that the field of ALife will give birth to this >new field. I'm curious to see how other people feel about this prediction. >Low-power analog VLSI chips already exist that can analyze visual and auditory >sensory inputs, and cortical circuit chips are being developed by Rodney Douglas >and Misha Mahowald; the principles of sensorimotor integration as >studied by Dana Ballard and Richard Andersen are the focus of >the theoretical breakthroughs that will be needed to achieve the >goal of autonomy in the real world by the next century. So it is time for the birth of a new neural networks/AI/AL field; funding must be getting tight. This time, is it at all possible to avoid the hype inherent in words like "Brain Building", with objectives like "building living creatures"? "Neuromorphic Engineering" is bad enough. After 10 years, I continue to fail to see any intellectually justifiable reason for such descriptions.
It seems to me we have not yet completely achieved the prediction of the original neural networks DARPA report that the brain of a bee would be understood within 5 years. That was 6 years ago, and it seems to me that the bee still withstands our best efforts. Now we are being launched in the direction of creating autonomous life based on sensory/motor processing in primates "by the next century". Maybe it will be easier to understand the brain of a primate than a bee, but I doubt it. Or maybe it is not necessary to understand the device being emulated before building it. Last week the third annual Computational Neuroscience Meeting was held in Monterey, California. The entire meeting was devoted to experimental and theoretical studies of real biological "neural computation". That is, presentations at the meeting concerned the detailed structure and possible computational significance of real "brains". The meeting had 250 attendees from throughout the world and represented many if not most of the leading institutions and laboratories involved in studying real neural computation. Despite this fact, and despite repeated invitations via, among others, the connectionist mailing list, almost no one from the neural networks community attended. This mirrors what appears to be the remarkably common perception of neurobiologists who attend meetings like NIPS, INNS, etc., that there is little real interest in, or understanding of, neurobiology in the neural networks community. That said, I would like to issue an open invitation to those of you on the connectionist mailing list who would actually like to know more about the neural systems you propose to morph. The organizers of the CNS meetings have established a new mailing list for those specifically interested in computational neurobiology. That is, for those interested in trying to figure out how real brains compute.
It will be managed in much the same fashion as the connectionist mailing list with Dave Beeman in Boulder, Colorado as the first moderator. One of the initial postings will be a synopsis of the "hot research topics" in computational neuroscience that came out of the post meeting workshops (building life was not one of them). We anticipate and encourage cross-fertilization between the two mailing groups. However, it is hard to avoid the interpretation that, at least at this point, what is interesting to computational neurobiologists is apparently not of much interest to those working in neural networks and vice versa. At the same time, it is also the case that there are many fewer neurobiologists justifying their research with reference to neural networks than there are engineers claiming to be brain builders. Thus, my periodic postings. Please send your subscription requests to: comp-neuro at smaug.bbb.caltech.edu. From french at willamette.edu Thu Jul 28 20:33:29 1994 From: french at willamette.edu (Bob French) Date: Thu, 28 Jul 1994 17:33:29 -0700 Subject: TR: reduce catastrophic interference w. context biasing Message-ID: <199407290033.RAA26115@jupiter.willamette.edu> The following paper is now available from the Ohio State neuroprose archive. It will be presented at the Cognitive Science Society Conference in Atlanta in August. It is six pages long. The work presented in this paper will be part of a larger paper on catastrophic interference to appear later this fall. Any comments will be welcome. Dynamically constraining connectionist networks to produce distributed, orthogonal representations to reduce catastrophic interference Robert M. French Department of Psychology University of Wisconsin Madison, WI 53713 email: french at head.neurology.wisc.edu or: french at willamette.edu It is now well known that when a connectionist network is trained on one set of patterns and then attempts to add new patterns to its repertoire, catastrophic interference may result. 
The use of sparse, orthogonal hidden-layer representations has been shown to reduce catastrophic interference. The author demonstrates that the use of sparse representations not only adversely affects a network's ability to generalize but may, in certain cases, also result in worse performance on catastrophic interference. This paper argues for the necessity of maintaining hidden-layer representations that are both as highly distributed and as highly orthogonal as possible. The author presents a fast recurrent learning algorithm, called context-biasing, that dynamically solves the problem of constraining hidden-layer representations to simultaneously produce good orthogonality and distributedness. On the data tested for this study, context-biasing is shown to reduce catastrophic interference by more than 50% compared to standard backpropagation. In particular, this technique succeeds in reducing catastrophic interference on data where sparse, orthogonal distributions failed to produce any improvement. Retrieve this paper by anonymous ftp from: archive.cis.ohio-state.edu (128.146.8.52), in the pub/neuroprose directory. The name of the paper in this archive is: french.context-biasing.ps.Z For those without ftp access, write to me at: Robert M. French Dept. of Psychology University of Wisconsin Madison, Wisconsin 53706 and I'll send you hard copy. From massone at mimosa.eecs.nwu.edu Fri Jul 29 12:31:51 1994 From: massone at mimosa.eecs.nwu.edu (Lina Massone) Date: Fri, 29 Jul 94 11:31:51 CDT Subject: Two papers available on arm movements Message-ID: <9407291631.AA11679@mimosa.eecs.nwu.edu> The following two papers are available from the neuroprose archive. The papers are currently Technical Reports of the Neural Information Processing Laboratory of Northwestern University and have been submitted for publication. ftp-host: archive.cis.ohio-state.edu ftp-file: massone.arm_model.ps.Z A Neural Network Model of an Anthropomorphic Arm Lina L.E. Massone and Jennifer D.
Myers Abstract This paper introduces a neural network model of a planar redundant arm whose structure and operation principles were inspired by those of the human arm. We developed the model for two purposes. One purpose was to study the relative role of control strategies and plant properties in trajectory formation, namely which features of simple arm movements can be attributed to the properties of the plant alone. We address this matter in a companion paper [the next paper]. The second purpose was a motor-learning one: to design an arm model that, because of its neural-network quality, can eventually be incorporated in a parallel distributed learning scheme for the arm controller. We modeled the arm with two joints (shoulder and elbow) and six muscle-like actuators: a pair of antagonist shoulder muscles, a pair of antagonist elbow muscles and a pair of antagonist double-joint muscles. The arm was allowed to move in the horizontal plane subject to the action of gravity. The model computes the transformation between the control signals that activate the muscle-like actuators and the coordinates of the arm endpoint. This transformation comprises four interacting stages (muscle dynamics, joint geometry, forward arm dynamics, forward arm kinematics) that we modeled with a number of feedforward and recurrent neural networks. In this paper we introduce and describe in detail the modeling methods, which are efficient, highly flexible (some of the resulting networks can be easily modified to accommodate different parametric choices and temporal scales), and quite general, and hence applicable to a number of different scientific domains. ******************** ftp-host: archive.cis.ohio-state.edu ftp-file: massone.plant_properties.ps.Z A Study of the Role of Plant Properties in Arm Trajectory Formation Lina L.E. Massone and Jennifer D.
Myers Abstract This paper describes the response of a neural-network model of an anthropomorphic arm to various patterns of activation of the arm muscles. The arm model was introduced and described in detail in [the previous paper]. The purpose of the simulation experiments presented here is to study the relative role of control strategies and plant properties in trajectory formation, namely which features of simple arm movements can be attributed to the properties of the plant alone -- a study that might provide some guidelines for the design of artificial arms. Our simulations demonstrate the performance of the model at steady-state, what movements the model produces in response to various activations of its muscles, and the generalization abilities of the recurrent neural network that implements the forward dynamic transformation. The results of our simulations emphasize the role of the intrinsic properties of the plant in generating movements with anthropomorphic qualities such as smoothness and unimodal velocity profiles and demonstrate that the task of an eventual controller for such an arm could be simply that of programming the amplitudes and durations of steps of neural input without considering additional motor details. Our findings are relevant to the design of artificial arms and, with some caveats, to the study of brain strategies in the arm motor system. ******************* From pfbaldi at Juliet.Caltech.Edu Fri Jul 29 10:11:04 1994 From: pfbaldi at Juliet.Caltech.Edu (Pierre F. Baldi) Date: Fri, 29 Jul 94 07:11:04 PDT Subject: New paper: How delays affect neural dynamics and learning Message-ID: <940729071104.28808357@Juliet.Caltech.Edu> FTP-host: archive.cis.ohio-state.edu FTP-filename: /pub/neuroprose/baldi.delays1.ps.Z FTP-filename: /pub/neuroprose/baldi.delays2.ps.Z The following paper is available from the Ohio State neuroprose archive. It is scheduled to appear in: IEEE Transactions on Neural Networks, Vol. 5, 4, 626-635 (1994). 
How delays affect neural dynamics and learning P. Baldi JPL/Caltech A. Atiya Cairo University email: pfbaldi at juliet.caltech.edu or: amir at csvax.cs.caltech.edu We investigate the effects of delays on the dynamics and, in particular, the oscillatory properties of simple artificial neural network models. We treat in detail the case of ring networks, for which we derive simple conditions for oscillating behavior, and several formulas to predict the regions of bifurcation, the periods of the limit cycles and the phases of the various neurons. These results in turn can be applied to more complex architectures. In general, delays tend to increase the period of oscillations and broaden the spectrum of possible frequencies, in a quantifiable way. Theoretically predicted values are in excellent agreement with simulations. Adjustable delays are then proposed as one additional mechanism through which neural systems could tailor their own dynamics. Recurrent back-propagation learning equations are derived for the adjustment of delays and other parameters in networks with delayed interactions, and applications are briefly discussed. Retrieve this paper by anonymous ftp from: archive.cis.ohio-state.edu (128.146.8.52) in the /pub/neuroprose directory The names of the papers in this archive are: baldi.delays1.ps.Z [24 pages] baldi.delays2.ps.Z (figures) [5 pages] No hard copies available. From Connectionists-Request at cs.cmu.edu Fri Jul 1 00:05:15 1994 From: Connectionists-Request at cs.cmu.edu (Connectionists-Request@cs.cmu.edu) Date: Fri, 01 Jul 94 00:05:15 -0400 Subject: Bi-monthly Reminder Message-ID: <672.773035515@B.GP.CS.CMU.EDU> *** DO NOT FORWARD TO ANY OTHER LISTS *** This note was last updated May 5, 1994. This is an automatically posted bi-monthly reminder about how the CONNECTIONISTS list works and how to access various online resources. CONNECTIONISTS is a moderated forum for enlightened technical discussions and professional announcements.
It is not a random free-for-all like comp.ai.neural-nets. Membership in CONNECTIONISTS is restricted to persons actively involved in neural net research. The following posting guidelines are designed to reduce the amount of irrelevant messages sent to the list. Before you post, please remember that this list is distributed to thousands of busy people who don't want their time wasted on trivia. Also, many subscribers pay cash for each kbyte; they shouldn't be forced to pay for junk mail. -- Dave Touretzky & David Redish --------------------------------------------------------------------- What to post to CONNECTIONISTS ------------------------------ - The list is primarily intended to support the discussion of technical issues relating to neural computation. - We encourage people to post the abstracts of their latest papers and tech reports. - Conferences and workshops may be announced on this list AT MOST twice: once to send out a call for papers, and once to remind non-authors about the registration deadline. A flood of repetitive announcements about the same conference is not welcome here. - Requests for ADDITIONAL references. This has been a particularly sensitive subject. Please try to (a) demonstrate that you have already pursued the quick, obvious routes to finding the information you desire, and (b) give people something back in return for bothering them. The easiest way to do both these things is to FIRST do the library work to find the basic references, then POST these as part of your query. Here's an example: WRONG WAY: "Can someone please mail me all references to cascade correlation?" RIGHT WAY: "I'm looking for references to work on cascade correlation. I've already read Fahlman's paper in NIPS 2, his NIPS 3 abstract, corresponded with him directly and retrieved the code in the nn-bench archive. Is anyone aware of additional work with this algorithm? I'll summarize and post results to the list." 
- Announcements of job openings related to neural computation. - Short reviews of new textbooks related to neural computation. To send mail to everyone on the list, address it to Connectionists at CS.CMU.EDU ------------------------------------------------------------------- What NOT to post to CONNECTIONISTS: ----------------------------------- - Requests for addition to the list, change of address and other administrative matters should be sent to: "Connectionists-Request at cs.cmu.edu" (note the exact spelling: many "connectionists", one "request"). If you mention our mailing list to someone who may apply to be added to it, please make sure they use the above and NOT "Connectionists at cs.cmu.edu". - Requests for e-mail addresses of people who are believed to subscribe to CONNECTIONISTS should be sent to postmaster at appropriate-site. If the site address is unknown, send your request to Connectionists-Request at cs.cmu.edu and we'll do our best to help. A phone call to the appropriate institution may sometimes be simpler and faster. - Note that in many mail programs a reply to a message is automatically "CC"-ed to all the addresses on the "To" and "CC" lines of the original message. If the mailer you use has this property, please make sure your personal response (request for a Tech Report etc.) is NOT broadcast over the net. ------------------------------------------------------------------------------- The CONNECTIONISTS Archive: --------------------------- All e-mail messages sent to "Connectionists at cs.cmu.edu" starting 27-Feb-88 are now available for public perusal. A separate file exists for each month. The files' names are: arch.yymm where yymm stand for the obvious thing. Thus the earliest available data are in the file: arch.8802 Files ending with .Z are compressed using the standard unix compress program. To browse through these files (as well as through other files, see below) you must FTP them to your local machine. 
The file "current" in the same directory contains the archives for the current month. ------------------------------------------------------------------------------- How to FTP Files from the CONNECTIONISTS Archive ------------------------------------------------ 1. Open an FTP connection to host B.GP.CS.CMU.EDU 2. Login as user anonymous with password your username. 3. 'cd' directly to the following directory: /afs/cs/project/connect/connect-archives The archive directory is the ONLY one you can access. You can't even find out whether any other directories exist. If you are using the 'cd' command you must cd DIRECTLY into this directory. Problems? - contact us at "Connectionists-Request at cs.cmu.edu". ------------------------------------------------------------------------------- Using Mosaic and the World Wide Web ----------------------------------- You can also access these files using the following url: ftp://b.gp.cs.cmu.edu/afs/cs/project/connect/connect-archives ---------------------------------------------------------------------- The NEUROPROSE Archive ---------------------- Anonymous FTP on archive.cis.ohio-state.edu (128.146.8.52) pub/neuroprose directory This directory contains technical reports as a public service to the connectionist and neural network scientific community which has an organized mailing list (for info: connectionists-request at cs.cmu.edu) Researchers may place electronic versions of their preprints in this directory, announce availability, and other interested researchers can rapidly retrieve and print the postscripts. This saves copying, postage and handling, by having the interested reader supply the paper. We strongly discourage the merger into the repository of existing bodies of work or the use of this medium as a vanity press for papers which are not of publication quality. PLACING A FILE To place a file, put it in the Inbox subdirectory, and send mail to pollack at cis.ohio-state.edu. 
Within a couple of days, I will move and protect it, and suggest a different name if necessary. Current naming convention is author.title.filetype.Z where title is just enough to discriminate among the files of the same author. The filetype is usually "ps" for postscript, our desired universal printing format, but may be tex, which requires more local software than a spooler. The Z indicates that the file has been compressed by the standard unix "compress" utility, which results in the .Z affix. To place or retrieve .Z files, make sure to issue the FTP command "BINARY" before transferring files. After retrieval, call the standard unix "uncompress" utility, which removes the .Z affix. An example of placing a file is in the appendix. Make sure your paper is single-spaced, so as to save paper, and include an INDEX Entry, consisting of 1) the filename, 2) the email contact for problems, 3) the number of pages and 4) a one sentence description. See the INDEX file for examples. ANNOUNCING YOUR PAPER It is the author's responsibility to invite other researchers to make copies of their paper. Before announcing, have a friend at another institution retrieve and print the file, so as to avoid easily found local postscript library errors. And let the community know how many pages to expect on their printer. Finally, information about where the paper will/might appear is appropriate inside the paper as well as in the announcement.
Please add two lines to your mail header, or the top of your message, so as to facilitate the development of mailer scripts and macros which can automatically retrieve files from both NEUROPROSE and other lab-specific repositories:

FTP-host: archive.cis.ohio-state.edu
FTP-filename: /pub/neuroprose/filename.ps.Z

When you announce a paper, you should consider whether (A) you want it automatically forwarded to other groups, like NEURON-DIGEST (which gets posted to comp.ai.neural-nets), and whether you want to provide (B) free or (C) prepaid hard copies for those unable to use FTP. To prevent forwarding, place a "**DO NOT FORWARD TO OTHER GROUPS**" at the top of your file. If you do offer hard copies, be prepared for a high cost. One author reported that when they allowed combination AB, the rattling around of their "free paper offer" on the worldwide data net generated over 2000 hardcopy requests! A shell script called Getps, written by Tony Plate, is in the directory, and can perform the necessary retrieval operations, given the file name. Functions for GNU Emacs RMAIL, and other mailing systems, will also be posted as debugged and available. At any time, for any reason, the author may request that their paper be updated or removed.

For further questions contact:
Jordan Pollack, Assistant Professor
CIS Dept/OSU, Laboratory for AI Research
2036 Neil Ave, Columbus, OH 43210
Email: pollack at cis.ohio-state.edu    Phone: (614) 292-4890

APPENDIX: Here is an example of naming and placing a file:

gvax> cp i-was-right.txt.ps rosenblatt.reborn.ps
gvax> compress rosenblatt.reborn.ps
gvax> ftp cheops.cis.ohio-state.edu
Connected to cheops.cis.ohio-state.edu.
220 cheops.cis.ohio-state.edu FTP server ready.
Name: anonymous
331 Guest login ok, send ident as password.
Password: neuron
230 Guest login ok, access restrictions apply.
ftp> binary
200 Type set to I.
ftp> cd pub/neuroprose/Inbox
250 CWD command successful.
ftp> put rosenblatt.reborn.ps.Z
200 PORT command successful.
150 Opening BINARY mode data connection for rosenblatt.reborn.ps.Z
226 Transfer complete.
100000 bytes sent in 3.14159 seconds
ftp> quit
221 Goodbye.
gvax> mail pollack at cis.ohio-state.edu
Subject: file in Inbox.
Jordan, I just placed the file rosenblatt.reborn.ps.Z in the Inbox.
Here is the INDEX entry:
rosenblatt.reborn.ps.Z rosenblatt at gvax.cs.cornell.edu 17 pages.
Boastful statements by the deceased leader of the neurocomputing field.
Let me know when it is in place so I can announce it to Connectionists at cmu.
Frank
^D

AFTER FRANK RECEIVES THE GO-AHEAD, AND HAS A FRIEND TEST RETRIEVE THE FILE, HE DOES THE FOLLOWING:

gvax> mail connectionists
Subject: TR announcement: Born Again Perceptrons
FTP-host: archive.cis.ohio-state.edu
FTP-file: pub/neuroprose/rosenblatt.reborn.ps.Z

The file rosenblatt.reborn.ps.Z is now available for copying from the Neuroprose repository:

Born Again Perceptrons (17 pages)
Frank Rosenblatt
Cornell University

ABSTRACT: In this unpublished paper, I review the historical facts regarding my death at sea: Was it an accident or suicide? Moreover, I look over the past 23 years of work and find that I was right in my initial overblown assessments of the field of neural networks.
~r.signature
^D

------------------------------------------------------------------------
How to FTP Files from the NN-Bench Collection
---------------------------------------------
1. Create an FTP connection from wherever you are to machine "pt.cs.cmu.edu" (128.2.254.155).
2. Log in as user "anonymous" with password your username.
3. Change remote directory to "/afs/cs/project/connect/bench". Any subdirectories of this one should also be accessible. Parent directories should not be.
4. At this point FTP should be able to get a listing of files in this directory and fetch the ones you want.

Problems? - contact us at "nn-bench-request at cs.cmu.edu".
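The anonymous-FTP recipe above (connect, log in as "anonymous", cd, set binary mode, get) recurs throughout these announcements and can be sketched as a small script. This is a hypothetical helper, not part of any posting; it uses Python's standard ftplib, with the NN-Bench host and directory from the steps above (whether that server still answers is another question).

```python
# Illustrative sketch (not from the postings): the anonymous-FTP
# retrieval recipe used throughout this digest, via Python's ftplib.
from ftplib import FTP

NN_BENCH_HOST = "pt.cs.cmu.edu"                 # 128.2.254.155
NN_BENCH_DIR = "/afs/cs/project/connect/bench"

def fetch_plan(host, directory, filename,
               user="anonymous", password="user@host"):
    """Return the FTP command sequence the instructions describe."""
    return [
        f"open {host}",
        f"user {user} {password}",
        f"cd {directory}",
        "binary",               # required before transferring .Z files
        f"get {filename}",
        "quit",
    ]

def fetch(host, directory, filename, dest):
    """Perform the transfer itself (needs network access)."""
    with FTP(host) as ftp:
        ftp.login()                          # anonymous login
        ftp.cwd(directory)
        with open(dest, "wb") as out:
            ftp.retrbinary(f"RETR {filename}", out.write)

if __name__ == "__main__":
    for step in fetch_plan(NN_BENCH_HOST, NN_BENCH_DIR, "README"):
        print(step)
```

The same plan covers the Neuroprose and CONNECTIONISTS archives by swapping in their host and directory; the "binary" step is what keeps the compressed .Z files intact.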
From pitt at cs.uiuc.edu Fri Jul 1 15:21:49 1994 From: pitt at cs.uiuc.edu (Lenny Pitt) Date: Fri, 01 Jul 94 15:21:49 EDT Subject: ML/COLT tutorial -- Computational Learning Theory Intro & Survey Message-ID: <199407012021.AA06535@pitt.cs.uiuc.edu> ========================================================= Computational Learning Theory: Introduction and Survey ========================================================= Sunday, July 10, 1994 8:45 am to 12:15 pm Rutgers University New Brunswick, New Jersey Tutorial conducted by Lenny Pitt University of Illinois Urbana, IL 61801 pitt at cs.uiuc.edu Held in conjunction with the Eleventh International Conference on Machine Learning (ML94, July 11-13, 1994) and the Seventh Annual Conference on Computational Learning Theory (COLT94, July 12-15, 1994). This tutorial will introduce the different formal learning models (e.g., ``pac'' learning, mistake-bounded learning, learning with queries), present basic techniques for proving learnability and nonlearnability (e.g., the VC-dimension, Occam algorithms, reductions between learning problems), and survey many of the central results in the area. The tutorial is designed to give ML attendees and those with a general interest in machine learning sufficient background to appreciate past and recent results in computational learning theory. It should also help attendees appreciate the significance and contributions of the papers that will be presented at the COLT94 conference that follows. No prior knowledge of learning theory is assumed. The tutorial is one of a set of DIMACS-sponsored tutorials that are free and open to the general public. Directions to Rutgers can be found in the ML/COLT announcement, which is available via anonymous ftp from www.cs.rutgers.edu in the directory "/pub/learning94". The specific location of the tutorial will be posted, and available with conference materials.
Users of www information servers such as Mosaic can find the information at "http://www.cs.rutgers.edu/pub/learning94/learning94.html". Other available information includes a campus map, and abstracts of all workshops/tutorials. Questions can be directed to ml94 at cs.rutgers.edu, or to colt94 at research.att.com. From zoon at zoon.postech.ac.kr Wed Jul 6 08:29:09 1994 From: zoon at zoon.postech.ac.kr (Prof. Cho Sungzoon) Date: Wed, 6 Jul 94 08:29:09 KDT Subject: NN-SMP-95 Call for Papers Message-ID: <9407052229.AA28058@zoon.postech.ac.kr.noname> =========================================================================== --------------------------------------------------------------------------- --- CALL FOR PAPERS --- Workshop on the Theory of Neural Networks: The Statistical Mechanics Perspective Sponsored by Center for Theoretical Physics at Seoul National University and Basic Science Research Institute at POSTECH February 2-4, 1995 Pohang University of Science and Technology, Pohang, Korea =========================================================================== During the last decade, methods of statistical mechanics were successfully applied to the theory of neural networks. The study of neural networks became an important part of statistical physics and the results influenced neighboring fields. In this workshop, we will review the status of the statistical physics of neural networks and discuss future directions. We invite papers on the theory of neural networks both from the statistical physics community and outside. We look forward to active interdisciplinary discussions, and encourage participation from related fields such as non-linear dynamics, computer science, mathematics, statistics, information theory and neurobiology. Invited speakers S. Amari (Tokyo Univ.) H. Sompolinsky (Hebrew Univ.) D. Haussler (UCSC) I. Kanter (Bar Ilan Univ.) M. Kearns (AT&T) M. Opper (U. Wuerzburg) G. M. Shim (K.U. Leuven) H. S. Seung (AT&T) K. Y. M. Wong (HKUST) and more.
Abstract Submission Authors should submit six copies of an abstract, to be received by Tuesday, November 15, 1994, to Jong-Hoon Oh - NNSMP Department of Physics Pohang University of Science and Technology (POSTECH) Hyoja San 31, Pohang, Kyongbuk 790-784, Korea nnsmp at galaxy.postech.ac.kr E-mail submission of the abstract to the above address is also possible. The abstract should include title, authors' names, affiliations, postal and e-mail addresses, and telephone and fax numbers if any. The body of the abstract should be no longer than 300 words. A full paper should be submitted at the venue to be included in the proceedings. Program Format We encourage informal discussions among small groups of participants during the workshop. Invited talks and a limited number of contributed talks will be presented in the oral session. Most of the contributed works will be presented in a poster session. A tour of Kyoung-Ju (a 2000-year-old city) is part of the workshop. Registration In order to take advantage of a small workshop, we would like to maintain the number of participants at an appropriate size. If you are interested in participating, please inform us of your intention as early as possible. Detailed registration information will be distributed via e-mail. For further announcements, write to nnsmp-info at galaxy.postech.ac.kr. Advisory Committee S. Amari (Tokyo Univ.), S. Y. Bang (POSTECH), S. I. Choi (POSTECH), K. C. Lee (SNU), H. Sompolinsky (Hebrew Univ.). Local Organizing Committee S. Cho (POSTECH), M. Y. Choi (SNU), D. Kim (SNU), S. Kim (POSTECH), C. Kwon (Myoung-Ji U.), J.-H. Oh (POSTECH). Program Committee I. Kanter (Bar Ilan), J.-H. Oh (POSTECH), H. S. Seung (AT&T).
From mccallum at cs.rochester.edu Wed Jul 6 15:54:40 1994 From: mccallum at cs.rochester.edu (mccallum@cs.rochester.edu) Date: Wed, 06 Jul 94 15:54:40 -0400 Subject: paper available by ftp Message-ID: <199407061954.PAA04928@slate.cs.rochester.edu> FTP-host: ftp.cs.rochester.edu FTP-file: pub/papers/robotics/94.mccallum-tr502.ps.Z 27 pages. "First Results with Instance-Based State Identification for Reinforcement Learning" R. Andrew McCallum Department of Computer Science University of Rochester Technical Report 502 When a reinforcement learning agent's next course of action depends on information that is hidden from the sensors because of problems such as occlusion, restricted range, bounded field of view and limited attention, we say the agent suffers from the Hidden State Problem. State identification techniques use history information to uncover hidden state. Previous approaches to encoding history include: finite state machines [Chrisman 1992; McCallum 1992], recurrent neural networks [Lin 1992] and genetic programming with indexed memory [Teller 1994]. A chief disadvantage of all these techniques is their long training time. This report presents Instance-Based State Identification, a new approach to reinforcement learning with state identification that learns with many fewer training steps. Noting that learning with history and learning in continuous spaces both share the property that they begin without knowing the granularity of the state space, the approach applies instance-based (or ``memory-based'') learning to history sequences---instead of recording instances in a continuous geometrical space, we record instances in action-perception-reward sequence space. The first implementation of this approach, called Nearest Sequence Memory, learns with an order of magnitude fewer steps than several previous approaches. The paper is also available through the http URL below:

R. Andrew McCallum          EBOX: mccallum at cs.rochester.edu
Computer Science Dept       VOX: (716) 275-2527, (716) 275-1372 (lab)
University of Rochester     FAX: (716) 461-2018
Rochester, NY 14627-0226    http://www.cs.rochester.edu/u/mccallum

From holm at thep.lu.se Thu Jul 7 08:18:43 1994 From: holm at thep.lu.se (Holm Schwarze) Date: Thu, 7 Jul 1994 14:18:43 +0200 (MET DST) Subject: Preprint announcement: Learning by Online Gradient Descent Message-ID: <9407071218.AA23942@dacke.thep.lu.se> [A non-text attachment was scrubbed by the archiver. Name: not available, Type: text, Size: 1888 bytes. Url: https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/4306561b/attachment-0001.ksh] From dlovell at elec.uq.oz.au Fri Jul 8 15:03:14 1994 From: dlovell at elec.uq.oz.au (David Lovell) Date: Fri, 8 Jul 94 14:03:14 EST Subject: Thesis available: The Neocognitron...Limitations and Improvements Message-ID: <9407080403.AA11615@s1.elec.uq.oz.au> FTP-host: archive.cis.ohio-state.edu FTP-filename: /pub/neuroprose/Thesis/lovell.thesis.tar Title: The Neocognitron as a System for Handwritten Character Recognition: Limitations and Improvements Size: 3080192 bytes (compressed), 234 pages (10 point, single spaced, double-sided format), 100 figures. No hardcopies available (sorry!) Hi folks, This is just to let you know that my doctoral dissertation can be retrieved from the Neuroprose archive. I know 3Mb is pretty hefty for a compressed file but there are a lot of detailed PostScript figures and bitmaps in the document.
Here's how to print it out (once you have retrieved it):

tar xf lovell.thesis.tar
cd dlovell-thesis
zcat ch0.ps.Z | lpr -P(your PostScript printer)
zcat ch1.ps.Z | lpr -P(your PostScript printer)
zcat ch2.ps.Z | lpr -P(your PostScript printer)
etc...

I have written a C-shell script called "print-thesis.csh" which will automate the uncompressing and printing process for you. The README file contains an explanation of how to use "print-thesis.csh". I hope that the thesis will be useful (or at least interesting) to anyone working in the area of off-line character recognition with hierarchical neural networks. Best regards, David ------------------------------------------------------------------------------- The Neocognitron as a System for Handwritten Character Recognition: Limitations and Improvements by David R. Lovell A thesis submitted for the degree of Doctor of Philosophy Department of Electrical and Computer Engineering, University of Queensland. ABSTRACT This thesis is about the neocognitron, a neural network that was proposed by Fukushima in 1979. Inspired by Hubel and Wiesel's serial model of processing in the visual cortex, the neocognitron was initially intended as a self-organizing model of vision; however, we are concerned with the supervised version of the network, put forward by Fukushima in 1983. Through "training with a teacher", Fukushima hoped to obtain a character recognition system that was tolerant of shifts and deformations in input images. Until now, though, it has not been clear whether Fukushima's approach has resulted in a network that can rival the performance of other recognition systems. In the first three chapters of this thesis, the biological basis, operational principles and mathematical implementation of the supervised neocognitron are presented in detail. At the end of this thorough introduction, we consider a number of important issues that have not previously been addressed (at least not with any proven degree of success).
How should S-cell selectivity and other parameters be chosen so as to maximize the network's performance? How sensitive is the network's classification ability to the supervisor's choice of training patterns? Can the neocognitron achieve state-of-the-art recognition rates and, if not, what is preventing it from doing so? Chapter 4 looks at the Optimal Closed-Form Training (OCFT) algorithm, a method for adjusting S-cell selectivity, suggested by Hildebrandt in 1991. Experiments reveal flaws in the assumptions behind OCFT and provide motivation for the development and testing (in Chapter 5) of three new algorithms for selectivity adjustment: SOFT, SLOG and SHOP. Of these methods, SHOP is shown to be the most effective, determining appropriate selectivity values through the use of a validation set of handwritten characters. SHOP serves as a method for probing the behaviour of the neocognitron and is used to investigate the effect of cell masks, skeletonization of input data and choice of training patterns on the network's performance. Even though SHOP is the best selectivity adjustment algorithm to be described to date, the system's peak correct recognition rate (for isolated ZIP code digits from the CEDAR database) is around 75% (with 75% reliability) after SHOP training. It is clear that the neocognitron, as originally described by Fukushima, is unable to match the performance of today's most accurate digit recognition systems, which typically achieve 90% correct recognition with near 100% reliability. After observing the neocognitron's failure to exploit the distinguishing features of different kinds of digits in its classification of images, Chapter 6 proposes modifications to enhance the network's ability in this regard.
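The details of SHOP are in the thesis itself; purely as an illustration of the general idea it rests on (picking a selectivity value by held-out validation accuracy), a search of this kind might be sketched as follows. The model, the data and the candidate values here are all invented for the sketch and are not from the thesis:

```python
# Illustrative only: choose a "selectivity" value by validation-set accuracy.
# The dummy classifier and data below stand in for the trained neocognitron
# and the held-out handwritten characters; SHOP itself is in Chapter 5.

def classify(x, selectivity):
    # Dummy classifier: thresholds the input by the selectivity value.
    return 1 if x > selectivity else 0

def validation_accuracy(selectivity, validation_set):
    # Fraction of held-out examples classified correctly at this setting.
    return sum(1 for (x, label) in validation_set
               if classify(x, selectivity) == label) / len(validation_set)

def choose_selectivity(candidates, validation_set):
    # Keep the candidate value that scores best on the validation set.
    return max(candidates, key=lambda s: validation_accuracy(s, validation_set))

validation_set = [(0.2, 0), (0.9, 0), (1.1, 0), (1.8, 1), (2.5, 1), (3.0, 1)]
best = choose_selectivity([0.5, 1.0, 1.5, 2.0], validation_set)
print(best)
```

The point of validating on held-out characters rather than the training set is that a selectivity tuned to the training patterns alone need not transfer to unseen digits.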
Using this new architecture, a correct classification rate of 84.62% (with 96.36% reliability) was obtained on CEDAR ZIP codes, a substantial improvement but still a level of performance that is somewhat less than state-of-the-art recognition rates. Chapter 6 concludes with a critical review of the hierarchical feature extraction paradigm. The final chapter summarizes the material presented in this thesis and draws the significant findings together in a series of conclusions. In addition to the investigation of the neocognitron, this thesis also contains a derivation of statistical bounds on the errors that arise in multilayer feedforward networks as a result of weight perturbation (Appendix E). ------------------------------------------------------------------------------ David Lovell - dlovell at elec.uq.oz.au | | Dept. Electrical and Computer Engineering | "Oh bother! The pudding is ruined University of Queensland | completely now!" said Marjory, as BRISBANE 4072 | Henry the dachshund leapt up and Australia | into the lemon surprise. | tel: (07) 365 3770 | From pjh at compsci.stirling.ac.uk Fri Jul 8 12:50:59 1994 From: pjh at compsci.stirling.ac.uk (Peter J.B. Hancock) Date: 8 Jul 94 12:50:59 BST (Fri) Subject: NCPW3: Programme & registration (176 lines) Message-ID: <9407081250.AA21688@uk.ac.stir.cs.nevis> CALL FOR PARTICIPATION ********************** Third Neural Computation and Psychology Workshop. Wednesday Aug 31 - Fri Sept 2 1994. Location: University of Stirling, Scotland, UK. Provisional Programme: Tue Aug 30th 14:00-19:00 registration 19:30 reception Wed Aug 31st 09:00 Introduction. Session 1: Cognition. Chair: Prof Vicky Bruce. 09:10 David Willshaw (Centre for Cognitive Science, University of Edinburgh),: title tba 09:50 Dienes Z., Altmann G., Gao S-J, Goode A. (Experimental Psychology, University of Sussex), Mapping across domains without feedback: a neural network model of transfer of information. 10:25 Coffee 10:50 Bullinaria J. 
(Dept of Psychology, University of Edinburgh), Modelling reaction times 11:25 Glasspool D., Houghton G., Shallice T. (Dept of Psychology, University College, London), Interactions between knowledge sources in a dual-route connectionist model of spelling 12:00 Slack J.M. (Institute of Social and Applied Psychology, University of Kent), Distributed representations: a capacity constraint. 12:35 Lunch Session 2: Low-Level perception. Chair: Peter Hancock 14:00 Stone J.V. (Cognitive and Computing Sciences, University of Sussex), Learning spatio-temporal visual invariances using a self-organising neural network model. 14:35 Baddeley R. (University of Oxford), A Bayesian framework for understanding topographic map formation. 15:10 Tea 15:40 Fyfe C., Baddeley R. (University of Oxford), Edge sensitivity from exploratory projection pursuit. 16:15 Herrmann M., Bauer H.-U, Der R. (Nordita, Copenhagen, Inst. f. Theor Physik, Universitaet Frankfurt, and Inst f. Informatik, Universitaet Leipzig), The "perceptual magnet" effect: a model based on self-organizing maps. 16:40 Smyth D, Phillips W.A. , Kay J.W. (Dept of Psychology, University of Stirling, and SASS, Aberdeen), Discovery of high-order functions in multi-stream, multi-stage nets without external supervision. 17:15 Session ends Thur Sept 1st Session 3: Audition. Chair: Prof David Willshaw. 09:00 Meddis R. (Dept of Human Sciences, Loughborough University of Technology), The conceptual basis of modelling auditory processing in the brainstem. 09:45 Scott S. (MRCAPU, Cambridge), `Beats' in speech and music: a model of the perceptual centres of acoustic signals. 10:20 Coffee 10:45 Smith L. (Dept of Computing Science, University of Stirling), Data-driven Sound Segmentation. 11:20 Beauvois M.W. , Meddis R., (IRCAM, Paris, and Dept of Human Sciences, Loughborough University of Technology), Computer simulation of auditory stream segregation in pure-tone sequences. 11:55 Wiles J., Stevens C. 
(Depts of Computer Science and Psychology, University of Queensland), Music as a Componential Code: Acquisition and Representation of Temporal Relationships in a Simple Recurrent Network 12:30 Session Ends. 12:40: Poster introductions (3-5 mins per poster). 13:10 Lunch 14:10 Poster Session. CCCN 15:45 Tea Session 4: Sequence Learning. Chair: Leslie Smith. 16:15 Lovatt P.J., Bairaktaris D. (Dept of Computing Science, University of Stirling), A computational account of phonologically mediated free recall. 16:50 Harris K.D., Sanford A.J. (Dept of Psychology, University of Glasgow), Connectionist and process modelling of long-term sequence: the integration of relative judgements, representation and learning. 17:25 Bradbury D. (Human Cognition Research Lab, Open University), A model of aspects of visual perception that uses temporal neural network techniques. 18:00 Session Ends 19:00 Conference Dinner. To be arranged. Fri Sept 2nd Session 5: Vision 2. Chair: Prof Roger Watt 09:30 Tao L-M (IIASS, Vietri sul Mare, Italy), Computational color vision: the force of combining computational theory with ecological optics. 10:05 Shillcock R., Cairns P. (Centre for Cognitive Science, University of Edinburgh), Connectionist modelling of the neglect syndrome. 10:40 Coffee 11:20 Smith K.J., Humphreys G.W. (School of Psychology, University of Birmingham), Mechanisms of visual search: an implementation of guided search. 11:55 Burton M. (Dept of Psychology, University of Stirling), title tba 12:30 Lunch End of Conference *************************************** NCPW3: Neural Computing And Psychology Workshop Aug 31 - Sept 2 1994 Cottrell Building, University of Stirling. Registration Form: Please return this form (by post) to Dr. Leslie Smith, Department of Computing Science, University of Stirling, Stirling FK9 4LA, Scotland. Conference fee: Before 1 August #60 After 1 August #80 ---------- This includes coffees and lunches on all three days.
       | Dinner (#6.72)          | B&B (#16.08)
  -----+-------------------------+--------------
  Tues |                         |
  Wed  |                         |
  Thur | Conference Dinner (#20) |
  Fri  |                         |

  Dinner and B&B subtotal: ----------
  Total                    ____________

Payment may be made by
  * Cheque in UK# drawn on a UK bank
  * Eurocheque in UK#
  * Bank transfer (in UK#) to Account: University of Stirling No 1
    Bank of Scotland, Craigs House, 78 Upper Craigs, Stirling FK8 2DE, Scotland.
    Sort code: 80 91 29  Account number: 00891500
    Mark the transfer: NCPW3. Charges must be paid by sender.

Please enclose a copy of the proof of transfer with the registration form. (The University of Stirling does not accept credit cards.)

Name
Address/Affiliation
Email
Telephone
Fax
Will you be bringing a poster? ____________________

Note all prices in UK pounds.

From john at dcs.rhbnc.ac.uk Fri Jul 8 10:37:02 1994 From: john at dcs.rhbnc.ac.uk (john@dcs.rhbnc.ac.uk) Date: Fri, 08 Jul 94 15:37:02 +0100 Subject: Technical Report Series in Neural and Computational Learning Message-ID: <14468.9407081437@platon.cs.rhbnc.ac.uk> The European Community ESPRIT Basic Research Action is funding a Working Group in the area of Neural and Computational Learning Theory involving 10 European sites. As part of its activities the NeuroCOLT Working Group is maintaining a Technical Report Series of the ongoing work of its members at the coordinating site of Royal Holloway, University of London. This message is to announce the instalment of the first two reports and to describe how they can be accessed.
-------------------------------------- NeuroCOLT Technical Report NC-TR-94-1: -------------------------------------- Computing over the Reals with Addition and Order: Higher Complexity Classes by Felipe Cucker and Pascal Koiran Abstract: This paper deals with issues of structural complexity in a linear version of the Blum-Shub-Smale model of computation over the real numbers. Real versions of $\pspace$ and of the polynomial time hierarchy are defined, and their properties are investigated. Mainly two types of results are presented: \begin{itemize} \item Equivalence between quantification over the real numbers and over $\{0,1\}$; \item Characterizations of recognizable subsets of $\{0,1\}^*$ in terms of familiar discrete complexity classes. \end{itemize} The complexity of the decision and quantifier elimination problems in the theory of the reals with addition and order is also studied. -------------------------------------- NeuroCOLT Technical Report NC-TR-94-3: -------------------------------------- Probabilistic Analysis of Learning in Artificial Neural Networks: The PAC Model and its Variants by Martin Anthony Abstract: This report (72 pages) surveys the probably approximately correct model of machine learning, with emphasis on the sample complexity of learning. Applications to the theory of learning in artificial neural networks are discussed. The survey should be accessible to those unfamiliar with computational learning theory. It is assumed the reader has some familiarity with neural networks, but otherwise the survey is largely self-contained. The basic PAC model of concept learning is discussed and the key results involving the Vapnik-Chervonenkis dimension are derived. Implications for the theory of artificial neural networks are discussed through a survey of known results on the VC-dimension of neural nets. A brief discussion of the computational complexity of PAC learning follows. 
We then discuss generalisations and extensions of the PAC model: stochastic concepts, learning with respect to particular distributions, and the learnability of functions and p-concepts. (We do not discuss computational complexity in these contexts.)

Contents:
 1. Introduction
 2. The Basic PAC Model of Learning
 3. VC-Dimension and Growth Function
 4. VC-Dimension and Linear Dimension
 5. A Useful Probability Theorem
 6. PAC Learning and the VC-Dimension
 7. VC-Dimension of Binary-Output Networks
    introduction
    linearly weighted neural networks
    linear threshold networks
    other activation functions
    the effect of weight restrictions
 8. Computational Complexity of Learning
 9. Stochastic Concepts
 10. Distribution-Specific Learning
 11. Graph Dimension and Multiple-Output Nets
    the graph dimension
    multiple-output feedforward threshold networks
 12. Pseudo-Dimension and Function Learning
    the pseudo-dimension
    learning real-valued functions
 13. Capacity of a Function Space
    capacity and learning
    applications to sigmoid neural networks
 14. Scale-Sensitive Dimensions
    learnability of p-concepts
    learnability of functions
 15. Conclusions

-----------------------

The Report NC-TR-94-1 can be accessed and printed as follows:

 % ftp cscx.cs.rhbnc.ac.uk (134.219.200.45)
 Name: anonymous
 password: your full email address
 ftp> cd pub/neurocolt/tech_reports
 ftp> binary
 ftp> get nc-tr-94-1.ps.Z
 ftp> bye
 % zcat nc-tr-94-1.ps.Z | lpr -l

Similarly for the Report NC-TR-94-3. Uncompressed versions of the postscript files have also been left for anyone not having an uncompress facility. Best wishes John From C.Campbell at bristol.ac.uk Mon Jul 11 05:30:33 1994 From: C.Campbell at bristol.ac.uk (I C G Campbell) Date: Mon, 11 Jul 1994 09:30:33 +0000 (GMT) Subject: Postdoc position Message-ID: <24600.9407110930@irix.bris.ac.uk> UNIVERSITY OF BRISTOL POSTDOCTORAL POSITION IN NEURAL NETWORKS Applications are invited for a postdoctoral position at the Advanced Computing Research Centre, University of Bristol.
We are looking for a researcher with an interest in theoretical work, particularly the development of new learning algorithms suited to hardware (VLSI) implementations. Applicants should have a good background in mathematics and computing. Apart from theoretical work and computer simulations, the project will also involve collaboration with several groups with an interest in applications involving dedicated neural hardware. The position is initially for a two-year period. Applications should be supported by a Curriculum Vitae, a list of publications and a brief outline of research interests. Some sample papers would also be helpful. Further details about this position may be obtained from Dr. C. Campbell at the address below. Applications should be sent to: Dr. C. Campbell, Advanced Computing Research Centre, Queen's Building, Bristol University, Bristol BS8 1TR United Kingdom E-mail: C.Campbell at bristol.ac.uk Tel: 0272 303030 X3382 (secretary X3246) Candidates should also arrange to have 3 letters of recommendation sent to Dr. Campbell at the address above. ***Closing date: 15th September 1994*** From mbrown at aero.soton.ac.uk Mon Jul 11 16:31:25 1994 From: mbrown at aero.soton.ac.uk (Martin Brown) Date: Mon, 11 Jul 94 16:31:25 BST Subject: New Neurofuzzy Book Message-ID: <24756.9407111531@aero.soton.ac.uk> Could you please post this announcement of the following book, which may be of interest to workers in the neurofuzzy field. NEUROFUZZY ADAPTIVE MODELLING AND CONTROL, Martin Brown and Chris Harris (University of Southampton, UK), Prentice Hall, Hemel Hempstead, UK, 1994. 13-134453-6 Price: 29.95 UK pounds or 49.95 US dollars (Hardback). This book provides a unified description of several adaptive neural and fuzzy networks and introduces the associative memory class of systems, which describes the similarities and differences existing between fuzzy and neural algorithms.
Three networks are described in detail - the Albus CMAC, the B-spline network and a class of fuzzy systems - and then analysed; their desirable features (local learning, linear dependence on the parameter set, fuzzy interpretation) are emphasised, and the algorithms are all evaluated on a common time series problem and applied to a common ship control benchmark.

Contents:
1 An Introduction to Learning Modelling and Control
  1.1 Preliminaries
  1.2 Intelligent Control
  1.3 Learning Modelling and Control
  1.4 Artificial Neural Networks
  1.5 Fuzzy Control Systems
  1.6 Book Description
2 Neural Networks for Modelling and Control
  2.1 Introduction
  2.2 Neuromodelling and Control Architectures
  2.3 Neural Network Structure
  2.4 Training Algorithms
  2.5 Validation of a Neural Model
  2.6 Discussion
3 Associative Memory Networks
  3.1 Introduction
  3.2 A Common Description
  3.3 Five Associative Memory Networks
  3.4 Summary
4 Adaptive Linear Modelling
  4.1 Introduction
  4.2 Linear Models
  4.3 Performance of the Model
  4.4 Gradient Descent
  4.5 Multi-Layer Perceptrons and Back Propagation
  4.6 Network Stability
  4.7 Conclusion
5 Instantaneous Learning Algorithms
  5.1 Introduction
  5.2 Instantaneous Learning Rules
  5.3 Parameter Convergence
  5.4 The Effects of Instantaneous Estimates
  5.5 Learning Interference in Associative Memory Networks
  5.6 Higher Order Learning Rules
  5.7 Discussion
6 The CMAC Algorithm
  6.1 Introduction
  6.2 The Basic Algorithm
  6.3 Adaptation Strategies
  6.4 Higher Order Basis Functions
  6.5 Computational Requirements
  6.6 Nonlinear Time Series Modelling
  6.7 Modelling and Control Applications
  6.8 Conclusions
7 The Modelling Capabilities of the Binary CMAC
  7.1 Modelling and Generalisation in the Binary CMAC
  7.2 Measuring the Flexibility of the Binary CMAC
  7.3 Consistency Equations
  7.4 Orthogonal Functions
  7.5 Bounding the Modelling Error
  7.6 Investigating the CMAC's Coarse Coding Map
  7.7 Conclusion
8 Adaptive B-spline Networks
  8.1 Introduction
  8.2 Basic Algorithm
  8.3 B-spline Learning Rules
  8.4 B-spline Time Series Modelling
  8.5 Model Adaptation Rules
  8.6 ASMOD Time Series Modelling
  8.7 Discussion
9 B-spline Guidance Algorithms
  9.1 Introduction
  9.2 Autonomous Docking
  9.3 Constrained Trajectory Generation
  9.4 B-spline Interpolants
  9.5 Boundary and Kinematic Constraints
  9.6 Example: A Quadratic Velocity Interpolant
  9.7 Discussion
10 The Representation of Fuzzy Algorithms
  10.1 Introduction: How Fuzzy is a Fuzzy Model?
  10.2 Fuzzy Algorithms
  10.3 Fuzzy Sets
  10.4 Logical Operators
  10.5 Compositional Rule of Inference
  10.6 Defuzzification
  10.7 Conclusions
11 Adaptive Fuzzy Modelling and Control
  11.1 Introduction
  11.2 Learning Algorithms
  11.3 Plant Modelling
  11.4 Indirect Fuzzy Control
  11.5 Direct Fuzzy Control
References
Appendix A Modified Error Correction Rule
Appendix B Improved CMAC Displacement Tables
Appendix C Associative Memory Network Software Structure
  C.1 Data Structures
  C.2 Interface Functions
  C.3 Sample C Code
Appendix D Fuzzy Intersection
Appendix E Weight to Rule Confidence Vector Map

For further information about this book (mailing/shipping costs etc.) and other neurofuzzy titles in the Prentice Hall series please contact: Liz Dickinson, Prentice Hall, Paramount Publishing International, Campus 400, Maylands Avenue, Hemel Hempstead, HP2 7EZ, United Kingdom.
Tel: 0442 881900 Fax: 0442 257115 From swe at unix.brighton.ac.uk Tue Jul 12 11:17:14 1994 From: swe at unix.brighton.ac.uk (ellacott) Date: Tue, 12 Jul 94 11:17:14 BST Subject: No subject Message-ID: <1924.9407121017@unix.bton.ac.uk> ************************* MAIL FROM STEVE ELLACOTT ************************** 1st Announcement and CALL FOR PAPERS MATHEMATICS of NEURAL NETWORKS and APPLICATIONS (MANNA 1995) International Conference at Lady Margaret Hall, Oxford, July 3-7, 1995 run by the University of Huddersfield in association with the University of Brighton We are delighted to announce the first conference on the Mathematics of Neural Networks and Applications (MANNA), in which we aim to provide both top-class research and a friendly, motivating atmosphere. The venue, Lady Margaret Hall, is an Oxford college, set in an attractive and quiet location adjacent to the University Parks and River Cherwell. Applications of neural networks (NNs) have often been carried out with a limited understanding of the underlying mathematics, but it is now essential that fuller account be taken of the many topics that contribute to NNs: approximation theory, control theory, genetic algorithms, dynamical systems, numerical analysis, optimisation, statistical decision theory, statistical mechanics, computability and information theory, etc. We aim to consider the links between these topics and the insights they offer, and to identify mathematical tools and techniques for analysing and developing NN theories, algorithms and applications. Working sessions and panel discussions are planned.
Keynote speakers who have provisionally accepted invitations include:

  N M Allinson (York University, UK)
  S Grossberg (Boston, USA)
  S-i Amari (Tokyo)
  M Hirsch (Berkeley, USA)
  N Biggs (LSE, London)
  T Poggio (MIT, USA)
  G Cybenko (Dartmouth, USA)
  H Ritter (Bielefeld, Germany)
  J G Taylor (King's College, London)
  P C Parks (Oxford)

It is anticipated that about 40 contributed papers and posters will be presented. The proceedings will be published, probably as a volume of an international journal, and contributed papers will be considered for inclusion. The deadline for submission of abstracts is 17 February 1995. Accommodation will be available at Lady Margaret Hall (LMH), where many rooms have en-suite facilities - early booking is recommended. The conference will start with Monday lunch and end with Friday lunch, and there will be a full-board charge (including conference dinner) of about #235 for this period, as well as a modest conference fee (to be fixed later). We hope to be able to offer a reduction in fees to those who give submitted papers and to students. There will be a supporting social programme, including reception, outing(s) and conference dinner, and family accommodation may be arranged in local guest houses. Please indicate your interest by returning the form below. A booking form will be sent to you with the 2nd announcement. Thanking you in anticipation. Committee: S W Ellacott (Brighton) and J C Mason (Huddersfield) Co-organisers: I Aleksander, N M Allinson, N Biggs, C M Bishop, D Lowe, P C Parks, J G Taylor, K Warwick ______________________________________________________________________________ To: Ros Hawkins, School of Computing and Mathematics, University of Huddersfield, Queensgate, Huddersfield, West Yorkshire, HD1 3DH, England. (Email: j.c.mason at hud.ac.uk) Please send further information on MANNA, July 3 - 7, 1995 Name .......................Address ..........................................
............................................................................. ............................................................................. Telephone ............................. Fax .................................. E Mail ................................ I intend/do not intend to submit a paper Area of proposed contribution ................................................ ***************************************************************************** From hszu%ulysses at relay.nswc.navy.mil Wed Jul 13 12:11:22 1994 From: hszu%ulysses at relay.nswc.navy.mil (Harold Szu) Date: Wed, 13 Jul 94 12:11:22 EDT Subject: UCLA Short Course on Wavelets announcement (September 12-16 1994) Message-ID: <9407131611.AA05944@ulysses.nswc.navy.mil> ANNOUNCEMENT UCLA Extension Short Course The Wavelet Transform: Techniques and Applications Overview For many years, the Fourier Transform (FT) has been used in a wide variety of application areas, including multimedia compression of wideband ISDN for telecommunications; lossless transform for fingerprint storage, identification, and retrieval; an increased signal-to-noise ratio (SNR) for target discrimination in oil prospect seismic imaging; scale- and rotation-invariant pattern recognition in automatic target recognition; and in heart, tumor, and biomedical research. This course describes a new connectionist technique, the Wavelet Transform (WT), which replaces the windowed FT in a neural network (a "WAVENET") for the applications mentioned above. The WT uses appropriately matched bandpass kernels, called 'mother' wavelets, thereby enabling improved representation and analysis of wideband, transient, and noisy signals. The principal advantages of the WT are 1) its localized nature, which accepts less noise and enhances the SNR, and 2) the new problem-solving paradigm it offers in the treatment of nonlinear problems.
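As a minimal illustration of the multi-resolution idea behind the WT (my own toy example, not material from the course), one level of the Haar wavelet transform splits a signal into a coarse low-pass average and a bandpass detail, and is exactly invertible:

```python
# One level of the Haar wavelet transform: pairwise averages give the
# coarse (low-pass) signal, pairwise differences give the bandpass detail.
# Illustrative sketch only; real mother wavelets are matched to the signal.
import math

def haar_level(signal):
    # signal length must be even; s makes the transform orthonormal
    s = 1 / math.sqrt(2.0)
    coarse = [s * (signal[i] + signal[i + 1]) for i in range(0, len(signal), 2)]
    detail = [s * (signal[i] - signal[i + 1]) for i in range(0, len(signal), 2)]
    return coarse, detail

def haar_level_inverse(coarse, detail):
    # Exact reconstruction: x0 = s*(c+d), x1 = s*(c-d)
    s = 1 / math.sqrt(2.0)
    out = []
    for c, d in zip(coarse, detail):
        out.extend([s * (c + d), s * (c - d)])
    return out

x = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
coarse, detail = haar_level(x)
```

For a slowly varying signal like `x`, most of the energy lands in the coarse channel and the detail coefficients stay small, which is the localization property the overview credits for the WT's noise rejection.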
The course covers WT principles as well as adaptive techniques, describing how WTs mimic human ears and eyes by tuning up "best mothers" to spawn "daughter" wavelets that catch multi-resolution components, whose expansion coefficients are fed through an artificial neural network called a "wavenet". This, in turn, provides the useful automation required in multiple application areas, a powerful tool when the inputs are constrained by real-time sparse data (for example, the "cocktail party" effect, where you perceive a desired message amid the cacophony of a noisy party). Another advancement discussed in the course is the theory and experiment for solving nonlinear dynamics for information processing, e.g., environmental simulation as non-real-time virtual reality. In other words, real-time virtual reality can be achieved by the wavelet compression technique, followed by an optical flow technique to acquire the wavelet transform coefficients, then applying the inverse WT to retrieve the virtual reality's dynamical evolution. (For example, an ocean wave is analyzed by soliton envelope wavelets.) Finally, implementation techniques in optics and digital electronics are presented, including optical wavelet transforms and wavelet chips. Course Materials Course notes and relevant software are distributed on the first day of the course. The notes are for participants only, and are not for sale. Coordinator and Lecturer Harold Szu, Ph.D. Research physicist, Washington, D.C. Dr. Szu's current research involves wavelet transforms, character recognition, and constrained optimization implemented on a neural network computer. He has edited two special issues on Wavelets, Sept 1992 & July 1994, of Optical Engineering. He has chaired the SPIE Orlando Wavelet Applications Conference every year since 1992. He is also involved with the design of a next-generation computer based on the confluence of neural networks and optical database machines. Dr.
Szu is also a technical representative to ARPA and a consultant to the Office of Naval Research, and has been engaged in plasma physics, optical engineering, and electronic warfare research for the past 16 years. He holds six patents, has published about 200 technical papers, and has edited several textbooks. Dr. Szu is the editor-in-chief for the INNS Press, and currently serves as the Immediate Past President of the International Neural Network Society. Lecturer and UCLA Faculty Representative John D. Villasenor, Ph.D. Assistant Professor, Department of Electrical Engineering, University of California, Los Angeles. Dr. Villasenor has been instrumental in the development of a number of efficient algorithms for a wide range of signal and image processing tasks. His contributions include application-specific optimal compression techniques for tomographic medical images, temporal change measures using synthetic aperture radar, and motion estimation and image modeling for angiogram video compression. Prior to joining UCLA, Dr. Villasenor was with the Radar Science and Engineering section of the Jet Propulsion Laboratory, where he applied synthetic aperture radar to interferometric mapping, classification, and temporal change measurement. He has also studied parallelization of spectral analysis algorithms and multidimensional data visualization strategies. Dr. Villasenor's research activities at UCLA include still-frame and video medical image compression, processing and interpretation of satellite remote sensing images, development of fast algorithms for one- and two-dimensional spectral analysis, and studies of JPEG-based hybrid video coding techniques. For more information, call the Short Course Program Office at (310) 825-3344; Facsimile (213) 206-2815. Date: September 12-16 (Monday through Friday) Time: 8am - 5pm (subject to adjustment after the first class meeting).
Location: Room G-33 West, UCLA Extension Building, 10995 Le Conte Avenue (adjacent to the UCLA campus), Los Angeles, California. Reg# E0153M Course No. Engineering 867.121 3.0 CEU (30 hours of instruction) Fee: $1495, includes course materials From henders at linc.cis.upenn.edu Thu Jul 14 10:41:17 1994 From: henders at linc.cis.upenn.edu (Jamie Henderson) Date: Thu, 14 Jul 1994 10:41:17 -0400 Subject: paper available on connectionist NLP/temporal synchrony Message-ID: <199407141441.KAA04931@linc.cis.upenn.edu> FTP-host: linc.cis.upenn.edu FTP-filename: pub/henderson/jpr94.ps.Z The following paper on the feasibility and implications of using temporal synchrony variable binding to do syntactic parsing is available by anonymous ftp from linc.cis.upenn.edu. It's in directory pub/henderson, and is called "jpr94.ps.Z". It's 20 pages long. This paper will appear in the Journal of Psycholinguistic Research, probably volume 23, number 6, 1994. - Jamie Henderson University of Pennsylvania -------- Connectionist Syntactic Parsing Using Temporal Variable Binding James Henderson Computer and Information Science University of Pennsylvania Recent developments in connectionist architectures for symbolic computation have made it possible to investigate parsing in a connectionist network while still taking advantage of the large body of work on parsing in symbolic frameworks. The work discussed here investigates syntactic parsing in the temporal synchrony variable binding model of symbolic computation in a connectionist network. This computational architecture solves the basic problem with previous connectionist architectures, while keeping their advantages. However, the architecture does have some limitations, which impose constraints on parsing in this architecture. Despite these constraints, the architecture is computationally adequate for syntactic parsing. In addition, the constraints make some significant linguistic predictions. 
These arguments are made using a specific parsing model. The extensive use of partial descriptions of phrase structure trees is crucial to the ability of this model to recover the syntactic structure of sentences within the constraints imposed by the architecture. From biehl at connect.nbi.dk Fri Jul 15 17:56:26 1994 From: biehl at connect.nbi.dk (Michael Biehl) Date: Fri, 15 Jul 94 17:56:26 METDST Subject: paper available Message-ID: FTP-host: archive.cis.ohio-state.edu FTP-file: pub/neuroprose/biehl.online-perceptron.ps.Z The following paper has been placed in the Neuroprose archive in file biehl.online-perceptron.ps.Z (8 pages). Hardcopies are not available. ------------------------------------------------------------------------- ON-LINE LEARNING WITH A PERCEPTRON Michael Biehl CONNECT, The Niels Bohr Institute Blegdamsvej 17, 2100 Copenhagen, Denmark email: biehl at physik.uni-wuerzburg.de and Peter Riegler Institut fuer theoretische Physik Julius-Maximilians-Universitaet Wuerzburg Am Hubland, D-97074 Wuerzburg, Germany submitted to Europhysics Letters ABSTRACT We study on-line learning of a linearly separable rule with a simple perceptron. Training utilizes a sequence of uncorrelated, randomly drawn N-dimensional input examples. In the thermodynamic limit the generalization error after training with P such examples can be calculated exactly. For the standard perceptron algorithm it decreases like (N/P)^(1/3) for large (P/N), in contrast to the faster (N/P)^(1/2)-behavior of the so-called Hebbian learning. Furthermore, we show that a specific parameter-free on-line scheme, the AdaTron algorithm, gives an asymptotic (N/P)-decay of the generalization error. This coincides (up to a constant factor) with the bound for any training process based on random examples, including off-line learning. Simulations confirm our results.
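The on-line setting in the abstract is easy to simulate (this toy sketch is mine, not the authors' code): a fixed random teacher vector defines the linearly separable rule, the student sees one fresh random example per step, and for gaussian inputs the generalization error is the student-teacher angle divided by pi:

```python
# Toy simulation of on-line perceptron learning of a linearly separable
# rule. The student updates only on mistakes (standard perceptron rule);
# the generalization error for gaussian inputs is arccos(overlap)/pi.
# Sketch only: the paper's exact results hold in the thermodynamic limit.
import math
import random

random.seed(1)
N = 40                                        # input dimension
teacher = [random.gauss(0, 1) for _ in range(N)]
student = [0.0] * N

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gen_error(w, t):
    nw, nt = math.sqrt(dot(w, w)), math.sqrt(dot(t, t))
    if nw == 0:
        return 0.5                            # zero student: random guessing
    rho = max(-1.0, min(1.0, dot(w, t) / (nw * nt)))
    return math.acos(rho) / math.pi

errors = []
for p in range(4000):                         # one fresh random example per step
    x = [random.gauss(0, 1) for _ in range(N)]
    y = 1 if dot(teacher, x) > 0 else -1
    if y * dot(student, x) <= 0:              # mistake: perceptron update
        student = [w + y * xi / math.sqrt(N) for w, xi in zip(student, x)]
    errors.append(gen_error(student, teacher))
```

With P/N = 100 here, the error has fallen far below its initial value, consistent in spirit with the slow (N/P)^(1/3) decay the abstract derives for this rule.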
----------------------------------------------------------------------- --- Michael Biehl biehl at physik.uni-wuerzburg.de From biehl at connect.nbi.dk Fri Jul 15 17:57:47 1994 From: biehl at connect.nbi.dk (Michael Biehl) Date: Fri, 15 Jul 94 17:57:47 METDST Subject: paper available Message-ID: FTP-host: archive.cis.ohio-state.edu FTP-file: pub/neuroprose/marangi.clusters.ps.Z The following paper has been placed in the Neuroprose archive in file marangi.clusters.ps.Z (8 pages). Hardcopies are not available. ------------------------------------------------------------------------- SUPERVISED LEARNING FROM CLUSTERED INPUT EXAMPLES Carmela Marangi Dipartimento di Fisica dell' Universita' di Bari and I.N.F.N., Sez. di Bari Via Orabona 4, 70126 Bari, Italy Michael Biehl ^ and Sara Solla CONNECT, The Niels Bohr Institute Blegdamsvej 17, 2100 Copenhagen, Denmark ^ email: biehl at physik.uni-wuerzburg.de submitted to Europhysics Letters ABSTRACT In this paper we analyse the effect of introducing structure in the input distribution on the generalization ability of a simple perceptron. The simple case of two clusters of input data and a linearly separable rule is considered. We find that the generalization ability improves with the separation between the clusters, and is bounded from below by the result for the unstructured case. The asymptotic behavior for large training sets, however, is the same for structured and unstructured input distributions. For small training sets, the dependence of the generalization error on the number of examples is observed to be nonmonotonic for certain values of the model parameters.
----------------------------------------------------------------------- --- Michael Biehl biehl at physik.uni-wuerzburg.de From tesauro at watson.ibm.com Fri Jul 15 11:03:36 1994 From: tesauro at watson.ibm.com (tesauro@watson.ibm.com) Date: Fri, 15 Jul 94 11:03:36 EDT Subject: Neural nets in commercial anti-virus software Message-ID: IBM BRINGS NEW TECHNOLOGY TO VIRUS PROTECTION IBM's investment in leading-edge research is paying off in unexpected ways. The latest release of IBM AntiVirus uses sophisticated "neural network" technology to help detect new, previously unknown viruses. "Detecting viruses that people have never seen before, while simultaneously preventing false alarms, is a difficult balancing act," said Jeffrey O. Kephart, a manager in the High Integrity Computing Laboratory, the group at the Watson Research Center that develops IBM AntiVirus. "But with several new viruses being written every day, this has become an essential requirement for any anti-virus program." "Traditionally, virus detection heuristics have been developed by trial and error. Our neural-net detector was produced completely automatically, according to sound statistical principles. The anti-virus technical community had been hoping for such a breakthrough, but was pessimistic. We invented several new techniques that overcame previous limitations." By showing a neural network a large number of infected and uninfected files, Kephart and his colleagues trained it to discriminate between viruses and uninfected programs. After the training had taken place, they found that the neural network was able to recognize a very high percentage of previously unknown viruses. "We've been quite successful in bringing leading-edge research into the IBM AntiVirus products very quickly," explained Kephart. "In this case, just a few months after our initial conception of the idea, we are delivering novel but well-tested technology to our customers around the world." 
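The press release does not disclose how IBM's detector actually works, so the following is a purely hypothetical sketch of the general idea it describes: train a simple discriminator on features extracted from "infected" and "clean" files. The viral byte fragment, the file format, and the logistic-regression detector below are all my own stand-ins, not IBM's technology.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical stand-ins: an "infected" file is a random byte string with a
# short viral code fragment embedded at a random position; a "clean" file is
# purely random. (The fragment below is invented for illustration.)
FRAGMENT = bytes([0xEB, 0x2C, 0xCD, 0x21])
FILE_LEN = 64

def make_file(infected):
    data = bytearray(rng.integers(0, 256, FILE_LEN, dtype=np.uint8).tobytes())
    if infected:
        pos = int(rng.integers(0, FILE_LEN - len(FRAGMENT)))
        data[pos:pos + len(FRAGMENT)] = FRAGMENT
    return bytes(data)

def features(data):
    # Crude feature map: counts of every adjacent byte pair (2-grams).
    v = np.zeros(256 * 256)
    for a, b in zip(data, data[1:]):
        v[a * 256 + b] += 1
    return v

X = np.array([features(make_file(i % 2 == 0)) for i in range(100)])
y = np.array([1.0 if i % 2 == 0 else 0.0 for i in range(100)])

# Train a logistic-regression "detector" by gradient descent.
w = np.zeros(X.shape[1])
for _ in range(300):
    p = 1 / (1 + np.exp(-(X @ w)))
    w -= 0.01 * X.T @ (p - y) / len(y)

p = 1 / (1 + np.exp(-(X @ w)))
acc = float(((p > 0.5) == (y > 0.5)).mean())
print("training accuracy:", acc)
```

The point of the exercise is the one Kephart makes: the detector is produced automatically from labeled examples rather than by hand-crafted heuristics.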
IBM AntiVirus version 1.06 provides comprehensive "install-and-forget" automatic protection against computer virus attacks in DOS, Windows*, OS/2** and Novell NetWare*** computing environments. In addition to its patent-pending neural network technology, it can detect viruses inside files compressed with PKZIP****, ZIP2EXE and LZEXE. It can even detect viruses inside compressed files that themselves contain compressed files. Common viruses can be detected automatically when infected files are copied from a diskette or downloaded from a computer bulletin board system. New installation programs support automated installation from LAN servers. IBM AntiVirus for NetWare can check NetWare 3.1x and 4.0x servers for viruses in real time, as users add or modify files on the server. IBM AntiVirus protects against thousands of known viruses, including viruses that are said to be impossible to detect. "There's a lot of hype out there about 'killer' viruses," said Steve R. White, Senior Manager of the High Integrity Computing Laboratory. "Here are the facts. Many viruses are silly, badly written programs. A few viruses try to hide by changing their appearance when they spread - 'polymorphic' viruses - or by trying to prevent anti-virus software from seeing them at all - 'stealth' viruses." "People have said these viruses are impossible to detect. They are wrong. We have had no trouble analyzing new viruses and adding protection against them to IBM AntiVirus. The latest version of IBM AntiVirus detects lots of 'difficult' viruses, including Queeg, Pathogen and Junkie-1027. Keeping up with these new viruses does require a lot of expertise and technology, but that's what IBM Research is famous for. People who say that their anti-virus products can't keep up are using the wrong products." * Windows is a trademark of Microsoft Corp. ** OS/2 is a trademark of IBM Corp. *** Novell and NetWare are trademarks of Novell Corp. **** PKZIP is a trademark of PKWARE, Inc.
From gem at cogsci.indiana.edu Fri Jul 15 12:14:40 1994 From: gem at cogsci.indiana.edu (Gary McGraw) Date: Fri, 15 Jul 94 11:14:40 EST Subject: Letter Perception paper available Message-ID: The following paper (available by anonymous ftp) may be of interest to some on this list: Roles in Letter Perception: Human data and computer models CRCC-TR 90 Gary McGraw*, John Rehling*, and Robert Goldstone# * Center for Research on Concepts and Cognition Indiana University, Bloomington, Indiana 47405 & Istituto per la Ricerca Scientifica e Tecnologica Loc. Pante di Povo, I-38100 Trento, Italia gem at irst.it rehling at irst.it # Department of Psychology Indiana University, Bloomington, Indiana 47405 rgoldsto at ucs.indiana.edu Submitted to Cognitive Science We present the results of an experiment in letter recognition. Unlike most psychological studies of letter recognition, we include in our data set letters at the fringes of their categories and investigate the recognition of letters in diverse styles. We are interested in the relationship between the recognition of prototypical letters and the recognition of eccentric, highly-stylized letters. Our results provide empirical evidence for conceptual constituents of letter categories, called roles, which exert clear top-down influence on the segmentation of letterforms into structural components. The human data are analyzed in light of two computational models of letter perception --- one connectionist and the other symbolic. Performance of the models is compared and contrasted with human performance using theoretical tools that shed light on processing. Results point in the direction of a model using a role-based approach to letter perception. To obtain an electronic copy of this paper: Note that the paper (41 pages with many figures) comes in a rather large file of 455116 bytes (compressed). 
ftp ftp.cogsci.indiana.edu
login: anonymous
password:
cd /pub/
binary
get mcgraw+rehling+goldstone.roles_letter_perception.ps.Z
quit

Then at your system:

uncompress mcgraw+rehling+goldstone.roles_letter_perception.ps.Z
lpr -s mcgraw+rehling+goldstone.roles_letter_perception.ps

If you cannot obtain an electronic copy, send a request for a hard copy to helga at cogsci.indiana.edu You may also retrieve the paper via the web. Open the URL http://www.cogsci.indiana.edu and follow the "papers" pointer. Gary McGraw (gem at cogsci.indiana.edu) From hornik at ci.tuwien.ac.at Mon Jul 18 09:10:00 1994 From: hornik at ci.tuwien.ac.at (Kurt Hornik) Date: Mon, 18 Jul 94 09:10 MET DST Subject: Paper available in Neuroprose Message-ID: FTP-host: archive.cis.ohio-state.edu FTP-file: pub/neuroprose/baldi.linear.ps.Z *** DO NOT FORWARD TO OTHER GROUPS *** The file baldi.linear.ps.Z is now available for copying from the Neuroprose repository: A survey of learning in linear neural networks (24 pages) Pierre Baldi (Caltech) and Kurt Hornik (TU Wien, Austria) ABSTRACT: Networks of linear units are the simplest kind of networks, where the basic questions related to learning, generalization, and self-organization can sometimes be answered analytically. We survey most of the known results on linear networks, including: (1) back-propagation learning and the structure of the error function landscape; (2) the temporal evolution of generalization; (3) unsupervised learning algorithms and their properties. The connections to classical statistical ideas, such as principal component analysis (PCA), are emphasized, as well as several simple but challenging open questions. A few new results are also spread across the paper, including an analysis of the effect of noise on back-propagation networks and a unified view of all unsupervised algorithms.
-Kurt Hornik (Kurt.Hornik at ci.tuwien.ac.at) From esann at dice.ucl.ac.be Mon Jul 18 13:49:59 1994 From: esann at dice.ucl.ac.be (esann@dice.ucl.ac.be) Date: Mon, 18 Jul 1994 19:49:59 +0200 Subject: Neural Processing Letters - announcement and call for papers Message-ID: <9407181745.AA03058@ns1.dice.ucl.ac.be> Dear Colleagues, We are pleased to announce the creation of a new publication in the field of neural networks, "Neural Processing Letters". Neural Processing Letters is intended to provide the research community with a FAST publication medium, in order to rapidly publish new ideas, original developments, and work in progress, in all aspects of the neural networks field. Papers will be published as letters (short papers of about 4 published pages), and the maximum delay between submission and publication will be about 3 months. You will find below some information about this new publication. If you are interested, please don't hesitate to contact us, preferably by fax, to ask for more information. The first issue of the journal will be published in September 1994; if you are interested in submitting a paper, please ask as soon as possible for the instructions for authors. We hope that this new journal will become a standard for the publication of original ideas in the field, and that you will contribute to it by submitting your work and/or by subscribing to it! All correspondence should be addressed to the publisher: Neural Processing Letters F. Blayo and M. Verleysen editors D facto publications 45 rue Masui B-1210 Brussels, Belgium Tel: + 32 2 245 43 63 Fax: + 32 2 245 46 94 _______________________________________________ ! ! ! Neural Processing Letters ! ! ! _______________________________________________ A fast publication medium ------------------------- Neural Processing Letters is a rapid publication journal intended to disseminate the latest results in the field of neural processing.
The aim of the journal is to rapidly publish new ideas, original developments and work in progress not previously published. Topics ------ Neural Processing Letters covers all aspects of the Artificial Neural Networks field including, but not restricted to, theoretical developments, biological models, new formal models, learning, applications, software and hardware developments, and prospective research. Committee --------- Editors : François Blayo (France) and Michel Verleysen (Belgium) The preliminary editorial board currently includes: Y. Abu-Mostafa (USA) L. Almeida (Portugal) S.I. Amari (Japan) A. Babloyantz (Belgium) J. Barhen (USA) E. Baum (USA) J. Cabestany (Spain) M. Cottrell (France) D. Del Corso (Italy) A. Georgopoulos (USA) A. Guerin-Dugue (France) M. Hassoun (USA) K. Hornik (Austria) C. Jutten (France) P. Lansky (Czech Republic) J.P. Nadal (France) G. Orban (Belgium) R. Reilly (Ireland) H. Ritter (Germany) T. Roska (Hungary) J. Stonham (United Kingdom) E. Vittoz (Switzerland) Instructions to authors ----------------------- Prospective authors are invited to submit a letter, written in English, not exceeding 3000 words including figures (each medium-sized figure being equivalent to 200 words). The content of the letter will focus on ideas, results and conclusions. Particular attention must be paid to the clarity of the presentation and the conciseness of the argument. Prospective authors are strongly invited to ask for the full instructions for authors. Short comments (not exceeding 300 words) on letters will be considered for publication; comments will be published with the author's reply. Book reviews are welcome (about 500 words). Reviewing process ----------------- To ensure short publication delays, no corrections to a submitted paper will be allowed. The reviewing process will be confidential, and submitted material will be accepted as is or rejected.
Publication ----------- Neural Processing Letters will be published every two months, beginning September 1994. The maximum delay between submission and publication will be 3 months. For further information concerning submission of papers or subscriptions, please contact the editorial office at the following address : Neural Processing Letters F. Blayo and M. Verleysen editors D facto publications 45 rue Masui B-1210 Brussels, Belgium Tel : + 32 2 245 43 63 Fax : + 32 2 245 46 94 _____________________ ! Subscription Form ! _____________________ ******************************* *** Special temporary offer *** ******************************* Please send this form by regular mail. A copy by fax may be sent to avoid delays. Information on the subscriber (please print) Name : First name : Title (M., Mrs, Dr, Prof.) : Organization : Address : Post/Zip Code : City : Country : E-mail : Tel : Fax : VAT number (mandatory for EC customers) : O I wish to subscribe to Neural Processing Letters, for a period of one year Normal price: BEF 4400. **** Special temporary offer (before September 30, 1994) **** BEF 4000 O Please send me a free sample copy of Neural Processing Letters O Please send me the detailed instructions for authors Payment details O I wish to pay by cheque/money order made payable to : D facto s.a. 45 rue Masui B-1210 Brussels - Belgium A supplementary fee of BEF 300 must be added if the payment is made by a cheque from a foreign bank or by a postal money order. This supplementary fee is not required for payment by Eurocheque. O I wish to pay by bank transfer. Account : D facto publications Account number : 310-1177992-13 Bank : Banque Bruxelles-Lambert 11, av. Winston Churchill B-1180 Brussels Belgium Please indicate "Neural Processing Letters" and the name of the subscriber on the bank transfer. A supplementary fee of BEF 300 must be added if the payment is made from a foreign bank account. Payment by credit card is not accepted. TOTAL : BEF ...
O Please send me an invoice (for EC customers: only if VAT number is indicated) First subscription issue: Unless otherwise specified, the first issue to be sent to the subscriber is the first one published after receipt of the payment. No subscription will be honoured without payment. Subscriptions are valid for six consecutive issues only. Please indicate here if you want a subscription starting at a specified issue (for example issue n.1 - September 1994). ........................................................................... .... Date : Signature : Please send this form to: Neural Processing Letters F. Blayo and M. Verleysen editors D facto publications 45 rue Masui B-1210 Brussels, Belgium Tel : + 32 2 245 43 63 Fax : + 32 2 245 46 94 _____________________________ Neural Processing Letters D facto publications 45 rue Masui 1210 Brussels Belgium tel: +32 2 245 43 63 fax: +32 2 245 46 94 _____________________________ From Dave_Touretzky at DST.BOLTZ.CS.CMU.EDU Mon Jul 18 15:23:16 1994 From: Dave_Touretzky at DST.BOLTZ.CS.CMU.EDU (Dave_Touretzky@DST.BOLTZ.CS.CMU.EDU) Date: Mon, 18 Jul 94 15:23:16 -0400 Subject: posting paper announcements to Connectionists Message-ID: <23367.774559396@DST.BOLTZ.CS.CMU.EDU> We've had a couple of suggestions recently about the posting of paper announcements to Connectionists. Here they are; you can follow this advice or not, as you wish. 1) Don't use a generic Subject line like "paper available". Instead, include the title of your paper, or at least some helpful keywords. Example: Wrong way: "Paper available" Right way: "Paper available: Solving Towers of Hanoi with ART-4" 2) If you're going to post pointers to Postscript files, try to save a few trees by formatting your paper single spaced, instead of using the double-spaced format required for journal submissions. This also makes the paper more suitable for reading with a Postscript previewer. 
-- Dave Touretzky, CONNECTIONISTS moderator From inmanh at cogs.susx.ac.uk Tue Jul 19 14:31:00 1994 From: inmanh at cogs.susx.ac.uk (Inman Harvey) Date: Tue, 19 Jul 94 14:31 BST Subject: SAB94 Conference reminder Message-ID: Last-minute reminder for those contemplating attending SAB94, Third Intl. Conf. on Simulation of Adaptive Behavior "From Animals to Animats", in Brighton, U.K., Aug 8--12 1994. Full program can be obtained on World Wide Web from: http://www.cogs.susx.ac.uk/lab/adapt/sab_program.html or by anonymous ftp from ftp.cogs.susx.ac.uk in directory: pub/sab94 From davec at cogs.susx.ac.uk Tue Jul 19 11:39:47 1994 From: davec at cogs.susx.ac.uk (Dave Cliff) Date: Tue, 19 Jul 1994 16:39:47 +0100 (BST) Subject: CFP: Adaptive Behavior special issue Message-ID: ------------------------------------------------------------------------------ CALL FOR PAPERS (please post) ADAPTIVE BEHAVIOR Journal Special Double Issue on COMPUTATIONAL NEUROETHOLOGY Guest editor: Dave Cliff Submission Deadline: 1 December 1994. Adaptive Behavior is an international journal published by MIT Press; Editor-in-Chief: Jean-Arcady Meyer, Ecole Normale Superieure, Paris. The aim of this special issue (to be published in 1995) is to bring together papers describing research in the field of computational neuroethology. Computational neuroethology (CNE) applies computational modelling techniques to the study of neural mechanisms underlying the generation of adaptive behaviors in embodied, situated, autonomous agents. The focus on studying agents embedded within their environments is a distinguishing feature of CNE research. CNE studies can help in the design of artificial autonomous agents, and can complement standard computational neuroscience approaches to understanding the neural control of behavior in animals. Submitted papers should emphasise the relevance of the subject matter to both real and artificial systems. Submitted papers should be delivered by 1 December 1994.
Manuscripts should be typed or laser-printed in English (with American spelling preferred), double-spaced, and between 10000 and 15000 words in length, counting 500 words per full-page figure. Authors intending to submit should contact Dave Cliff well before the deadline, at the address below. Copies of the complete Adaptive Behavior Instructions to Contributors are available on request. Send five (5) copies of submitted papers (hardcopy only) to: Dave Cliff School of Cognitive and Computing Sciences University of Sussex Brighton BN1 9QH U.K. Phone: +44 273 678754 Fax: +44 273 671320 Email: davec at cogs.susx.ac.uk WWW: http://www.cogs.susx.ac.uk/users/davec ------------------------------------------------------------------------------ From mtx004 at cck.coventry.ac.uk Wed Jul 20 14:48:14 1994 From: mtx004 at cck.coventry.ac.uk (NSteele) Date: Wed, 20 Jul 94 14:48:14 WET DST Subject: Fuzzy Logic Symposium Message-ID: <6346.9407201348@cck.coventry.ac.uk> ICSC ISFL'95 Call for Papers First ICSC International Symposium on FUZZY LOGIC To be held at the Swiss Federal Institute of Technology (ETH) Zurich, Switzerland May 26 and 27, 1995 I. PURPOSE OF THE CONFERENCE The purpose of this conference is to assist communication of research in the field of Fuzzy Logic and its technological applications. Fuzzy Logic is a scientific revolution that has been waiting to happen for decades. Research in Fuzzy Technologies has reached a degree where industrial application is possible. This is reflected by numerous projects in the USA, Korea and Japan, where the leading corporations have invested billions in utilising fuzzy logic in technological innovations. International activities show that by the year 2000 numerous practical realizations will be influenced by Fuzzy Systems. It is thus timely to organize a symposium aimed at bringing together both existing and potential workers in the field. II.
TOPICS The following topics are envisaged: * Basic concepts such as various kinds of Fuzzy Sets, Fuzzy Relations, Possibility Theory * Mathematical Aspects such as non-classical logics, Category Theory, Algebra, Topology, Chaos Theory * Methodology and applications, for example in Artificial Intelligence, Expert Systems, Pattern Recognition, Clustering, Fuzzy Control, Game Theory, Mathematical Programming, Neural Networks, Genetic Algorithms, etc. * Implementation, for example in Engineering, Process Control, Production, Medicine. III. INTERNATIONAL SCIENTIFIC COMMITTEE E. Badreddin, Switzerland J.D. Nicoud, Switzerland H.P. Geering, Switzerland R. Palm, Germany H. Hellendoorn, Germany B. Reusch, Germany M. Jamshidi, USA N. Steele, England (Chairman) E.P. Klement, Austria K. Warwick, England B. Kosko, USA H.J. Zimmermann, Germany R. Kruse, Germany (list incomplete) IV. ORGANISING COMMITTEE The ISFL'95 is a joint operation of the Swiss Federal Institute of Technology (ETH), Zurich and International Computer Science Conventions (ICSC). V. SUBMISSIONS OF MANUSCRIPTS Prospective authors are requested to send two copies of their abstracts of 500 words to the ICSC Secretariat for review by the International Scientific Committee. All abstracts must be written in English, starting with a succinct statement of the problem, the results achieved, their significance, and a comparison with previous work. If authors believe that more details are necessary to substantiate the main claims of the paper, they may include a clearly marked appendix that will be read at the discretion of the International Scientific Committee. The abstract should also include: * Title of proposed paper * Authors' names, affiliations, addresses * Name of author to contact for correspondence * Fax number of contact author * Name of topic which best describes the paper (max.
5 keywords) Contributions are solicited from those working in industry and having experience in the topics of this conference as well as from academics. VI. CONFERENCE LANGUAGE The Conference language is English. Simultaneous interpretation will not be available. VII. DEADLINES AND REGISTRATION It is the intention of the organizers to have the conference proceedings available for the delegates. Consequently the deadlines are to be strictly respected: * Submission of Abstracts .................... August 31, 1994 * Notification of Acceptance ................. November 1, 1994 * Delivery of Full Papers .................... February 1, 1995 * Early registrations (Sfrs. 700.-) .......... February 1, 1995 * Late registration (Sfrs. 850.-) Full registration includes attendance at all sessions, the conference dinner and the conference proceedings. Full-time students who have a valid student ID card may register at a reduced rate of Sfrs. 400.- covering all technical sessions. Student registration, however, does not include the banquet or proceedings. Extra banquet tickets will be sold for accompanying persons and students. The proceedings can be purchased separately through the ICSC-Secretariat. VIII. ACCOMMODATION Accommodation charges are not included in the fees, but block reservations will be made by the ICSC-Secretariat for the conference period at several hotels. More information will be made available with the letter of acceptance. IX. FURTHER INFORMATION For further information please contact International Computer Science Conventions: ICSC-Secretariat Canada or ICSC-Secretariat Switzerland P O Box 279 P O Box 657 Millet, Alberta T0C 1Z0 CH-8055 Zurich Canada Switzerland Fax: ++1-403-387-4329 Fax: ++41-1-761-9627 ****************************************************************************** ========================== Nigel Steele Chairman, Division of Mathematics School of Mathematical and Information Sciences Coventry University Priory Street Coventry CV1 5FB United Kingdom.
tel: (0203) 838568 +44 203 838568 email: NSTEELE at uk.ac.coventry (JANET) or NSTEELE at coventry.ac.uk (EARN BITNET etc.) fax: (0203) 838585 +44 203 838585 From plaut at cmu.edu Wed Jul 20 18:06:23 1994 From: plaut at cmu.edu (David Plaut) Date: Wed, 20 Jul 1994 18:06:23 -0400 Subject: Preprint: Modularity and double dissociations in damaged networks Message-ID: <16437.774741983@crab.psy.cmu.edu> Double Dissociation Without Modularity: Evidence from Connectionist Neuropsychology David C. Plaut Department of Psychology Carnegie Mellon University To appear in Journal of Clinical and Experimental Neuropsychology Special Issue on Modularity and the Brain Many theorists assume that the cognitive system is composed of a collection of encapsulated processing components or modules, each dedicated to performing a particular cognitive function. On this view, selective impairment of cognitive tasks following brain damage, as evidenced by double dissociations, is naturally interpreted in terms of the loss of particular processing components. By contrast, the current investigation examines in detail a double dissociation between concrete and abstract word reading after damage to a connectionist network that pronounces words via meaning and yet has no separable components (Plaut & Shallice, 1993, Cogn. Neuropsych.). The functional specialization in the network that gives rise to the double dissociation is not transparently related to the network's structure, as modular theories assume. Furthermore, a consideration of the distribution of effects across quantitatively equivalent individual lesions in the network raises specific concerns about the interpretation of single-case studies. The findings underscore the necessity of relating neuropsychological data to cognitive theories in the context of specific computational assumptions about how the cognitive system operates normally and after damage.
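The kind of lesion experiment described in the abstract can be illustrated in miniature. The toy model below is not Plaut and Shallice's network; it is a hypothetical sketch in which a small trained network is "lesioned" by silencing a random fraction of its connections, and performance before and after the lesion is compared.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy task: classify inputs by the sign of a fixed random linear rule.
N, H, P = 10, 20, 500
X = rng.standard_normal((P, N))
y = (X @ rng.standard_normal(N) > 0).astype(float)

# A small one-hidden-layer network, trained by plain gradient descent.
W1 = rng.standard_normal((N, H)) * 0.1
w2 = rng.standard_normal(H) * 0.1

def forward(X, W1, w2):
    h = np.tanh(X @ W1)
    p = 1 / (1 + np.exp(-(h @ w2)))
    return p, h

for _ in range(2000):
    p, h = forward(X, W1, w2)
    err = p - y                                   # cross-entropy gradient wrt logit
    w2 -= 0.1 * h.T @ err / P
    W1 -= 0.1 * X.T @ (np.outer(err, w2) * (1 - h ** 2)) / P

def accuracy(W1, w2):
    p, _ = forward(X, W1, w2)
    return float(((p > 0.5) == (y > 0.5)).mean())

# "Lesion" the trained network by silencing a random 40% of its
# input-to-hidden connections, then compare performance.
W1_lesioned = W1 * (rng.random(W1.shape) > 0.4)

print("intact accuracy  :", accuracy(W1, w2))
print("lesioned accuracy:", accuracy(W1_lesioned, w2))
```

Repeating the lesion with different random masks of the same size gives quantitatively equivalent lesions with varying behavioral effects, which is the phenomenon the abstract turns against single-case interpretation.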
ftp-host: hydra.psy.cmu.edu [128.2.248.152] ftp-file: pub/plaut/papers/plaut.modularity.JCEN.ps.Z [31 pages; 0.48Mb uncompressed] =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-= David Plaut plaut at cmu.edu "Doubt is not a pleasant Department of Psychology 412/268-5145 condition, but certainty Carnegie Mellon University 412/268-5060 (FAX) is an absurd one." Pittsburgh, PA 15213-3890 345H Baker Hall --Voltaire =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-= From plaut at cmu.edu Wed Jul 20 18:06:55 1994 From: plaut at cmu.edu (David Plaut) Date: Wed, 20 Jul 1994 18:06:55 -0400 Subject: Preprint: Rehabilitation and relearning in damaged networks Message-ID: <16478.774742015@crab.psy.cmu.edu> Relearning after Damage in Connectionist Networks: Toward a Theory of Rehabilitation David C. Plaut Department of Psychology Carnegie Mellon University To appear in Brain and Language Special Issue on Cognitive Approaches to Rehabilitation and Recovery in Aphasia Connectionist modeling offers a useful computational framework for exploring the nature of normal and impaired cognitive processes. The current work extends the relevance of connectionist modeling in neuropsychology to address issues in cognitive rehabilitation: the degree and speed of recovery through retraining, the extent to which improvement on treated items generalizes to untreated items, and how treated items are selected to maximize this generalization. A network previously used to model impairments in mapping orthography to semantics is retrained after damage. The degree of relearning and generalization varies considerably for different lesion locations, and has interesting implications for understanding the nature and variability of recovery in patients. 
In a second simulation, retraining on words whose semantics are atypical of their category yields more generalization than retraining on more typical words, suggesting a counterintuitive strategy for selecting items in patient therapy to maximize recovery. In a final simulation, changes in the pattern of errors produced by the network over the course of recovery are used to constrain explanations of the nature of recovery in analogous brain-damaged patients. Taken together, the findings demonstrate that the nature of relearning in damaged connectionist networks can make important contributions to a theory of rehabilitation in patients. ftp-host: hydra.psy.cmu.edu [128.2.248.152] ftp-file: pub/plaut/papers/plaut.rehab.BrLang.ps.Z [39 pages; 0.66Mb uncompressed] =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-= David Plaut plaut at cmu.edu "Doubt is not a pleasant Department of Psychology 412/268-5145 condition, but certainty Carnegie Mellon University 412/268-5060 (FAX) is an absurd one." Pittsburgh, PA 15213-3890 345H Baker Hall --Voltaire =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-= From degaris at hip.atr.co.jp Thu Jul 21 15:46:26 1994 From: degaris at hip.atr.co.jp (Hugo de Garis) Date: Thu, 21 Jul 94 15:46:26 JST Subject: ALife IV Conference Report, Hugo de Garis, ATR Message-ID: <9407210646.AA00982@gauss> ALife IV Conference Report, Hugo de Garis, ATR The 4th Artificial Life conference was held at MIT in Boston, Massachusetts, USA, July 6th to 8th, 1994, organised by Rod Brooks and Pattie Maes. About 500 people turned up, to hear roughly 60 talks spread over plenaries and dual split sessions. There were over 50 posters. A book containing only the oral talks will be published within a few weeks by MIT Press. The best talks were set for the morning of the 6th. The kickoff speech was by (of course) Chris Langton, father of and labeller of the field "Artificial Life".
Chris spoke of the dream of ALife to build artificial biologies, so that the universal properties of all forms of life, whether biological or artificial, can be understood. He emphasized the role of evolution much more strongly than he did at the previous conference at Santa Fe in 1992. In an hour-long talk he systematically covered the steps in ALife research towards greater autonomy in the evolutionary process of production and selection, ranging over the work of Dawkins, Hillis, and Lindgren, to Ray's fully autonomous "Tierra". I was struck by this apparent "about-face" in Chris's attitude towards the importance and relevance of evolutionary approaches to ALife. I remember him saying to me at the 1992 conference that he was rather bored by GAs. Chris talked about his concept of "collectionism" or micro-macro dynamics, which is both top-down and bottom-up, where the macro behavior emerges in a bottom-up way from the micro local rules of simple agents, yet the macro emergent effects feed back in a top-down way on the behavior of the agents. He spoke of biological hierarchies, from prokaryotes to eukaryotes to multicells to societies. He said the future of life is in humanity's hands. It was an inspiring and fun talk, even if it did run over time, thus testing the patience of Rod who was session chair. (Every 5 minutes over time, Rod would advance a bit, to Chris's "Uh oh!"). The following two talks, by Demetri Terzopoulos et al and Karl Sims, were the highlights of the conference in my book. Both effectively built (simulated) artificial organisms. Terzopoulos et al simulated artificial fish using springs and differential equations to provide the fish with lifelike motions. The scope of their work can be seen from the section titles in their paper, e.g.
physics-based fish model and locomotion, mechanics, swimming using muscles and hydrodynamics, motor controllers, pectoral fins, learning muscle based locomotion, learning strategy, low level learning, abstraction of high level controllers, sensory perception, vision sensor, behavioral modeling, habits and mental state, intention generator, behavior routines, artificial fish types, predators, pacifists. It was an extraordinary piece of work and will probably be highly influential in the next year or so. Karl Sims' paper combined his genius at computer graphics with some solid research ability. He evolved 3D rectangloid-shaped "creatures" AND their neural network controllers, and had these creatures fight it out in pairs in a co-evolutionary competition to get as close as possible to a target cube. I had the eerie feeling watching the video of these creatures that I was witnessing the birth of a new field, namely "brain building", where the focus is on constructing increasingly elaborate artificial nervous systems. I will say more about this later. The remaining talks of the first morning were by Dave Ackley (on "Altruism in the Evolution of Communication"), Hiroaki Kitano (on "Evolution of Metabolism for Morphogenesis" - which made a solid contribution to the nascent field of artificial embryology), and Craig Reynolds (of "Boid" fame) (on "Competition, Coevolution and the Game of Tag", a coevolution of an alternating cat and mouse game). In the afternoon of the 6th, in a plenary talk, my boss Shimohara spoke of ALife work at our Evolutionary Systems Department at ATR labs, Kyoto, Japan. (By the way, the next conference, i.e. ALife V, 1996, will be organized by Chris Langton, with local assistance from Shimohara san, and will probably be held around mid-May in Kyoto or Nara, Japan's favorite tourist cities.) He introduced the researchers and the work of his group, e.g.
software evolution (Tom Ray's "Tierra" and its multicell extension), my "CAM-Brain" (which hopes to evolve billion-neuron brains at electronic speeds inside cellular automata machines), Hemmi and Mizoguchi's "Evolvable Hardware" (which uses Koza's Genetic Programming to evolve tree-structured HDL (hardware description language) specifications of electronic circuits), and the work of other members of our group. He then briefly showed how extensive ALife research has become in Japan. Shimohara stunned his audience by stating that the long-term aim of the group, i.e. by the year 2001, is to build an artificial brain. A string of people came up to me after his talk with the comment "Is he serious?" "Yep", I said. After that, the conference split into dual sessions, so I missed half the talks. To get an overview of the best talks in the dual sessions I asked some of the organizers and "senior attendants" who they felt gave the best or the most interesting or promising talks. As usual in these ALife reports of mine, there is a strong dose of subjective judgement and bias. Some highlights were :- Jeffrey Kephart's "A Biologically Inspired Immune System for Computers" introduced the notion of "computer immune systems" to counter computer viruses. He is from IBM, so he was woolly on details, but he said that the millions of dollars spent on viral protection made a computer immune system essential. He also stated that a running system would be ready at IBM within a year. Such a system could be the first multimillion-dollar ALife-based application. Hosokawa et al.'s talk "Dynamics of Self Assembling Systems - Analogy with Chemical Kinetics" I did not see at the conference, but had already seen at a seminar they presented at ATR. They shake cardboard triangles with internal magnets so that they self-assemble into multicelled systems. They then analyse the probabilities of forming various self-assembling shapes.
Beckers et al.'s talk "From Local Actions to Global Tasks: Stigmergy and Collective Robotics" I did not see either. It took a foraging behavioral principle of termites (stigmergy) and applied it to minirobots. Nolfi et al.'s "How to Evolve Autonomous Robots: Different Approaches in Evolutionary Robotics" discussed the rival approaches to evolving neural controllers for robots, i.e. fitness measured in simulation versus in the real world (fast and simple vs. slow and complex). A good overview paper of a complex and important issue. Etc., etc. The other plenary talks were :- Jill Tarter and Paul Horowitz on "Search for Extra-Terrestrial Intelligence". This promised to be a fun talk, but Tarter is too nuts-and-bolts a personality and was too preoccupied by a recent funding cut to relate well to her audience. Horowitz was more fun, with a definite sense of humor matching his competence. However, what was lacking was a link between SETI and ALife. These two speakers were simply parachuted in from outside, without instructions to connect SETI to ALife. An opportunity for synergy between SETI and ALife was missed. Questions such as "what types of life should SETI expect to find, and would their biochemistry necessarily be similar to ours?" were not even addressed. Pity. Jack Szostak spoke on "Towards the In Vitro Evolution of an RNA Replicase". I found this talk riveting. I believe that the blossoming field of molecular evolution is the hottest and most significant branch of ALife around today. It will revolutionize genetic engineering and the drug industry, and may even play a role in the long-term construction of artificial cells. This field is about GAs applied to real molecules, evolving them in a cycle of test, select, amplify. Nobels will flow from this field. Recognition of Gerald Joyce's pioneering work in this field has already come in the form of prizes. Stay tuned.
Tom Ray paced up and down the stage introducing his concept of "A Proposal to Create a Network-Wide Biodiversity Reserve for Digital Organisms", i.e. putting Tierra on thousands of computers on the Internet. Tom wowed his audience with statements like "... the digital organisms will migrate around the globe on a daily basis, staying on the dark side of the planet, because they will have discovered that there is more CPU time available at night, while users sleep". Ray dreams of "digital farming", i.e. tapping spontaneously evolved digital organisms and using them for useful purposes. He prefers spontaneous evolution to directed evolution ("autonomism" vs. "directivism"). Stefan Helmreich, an anthropologist, reported on his studies of ALifers and their work. Chris Langton introduced him saying that he (i.e. Chris) felt like a bug being examined by Helmreich. I had a rather antsy feeling listening to him, because he sounded rather like a psychoanalyst or a theologian, in the sense of not feeling compelled to put his conjectures to the test. It was most edifying to learn that most ALifers are upper-middle-class, straight, atheist WASPs, etc. The talk had a definite ideological axe-to-grind edge to it. He also read his speech, a real no-no in computer land, and spoke at machine-gun pace, totally losing the non-native English speakers in his audience. While the bullets were flying, I couldn't help thinking that surveys had shown that on average the theoretical physicists and mathematicians are the smartest groups at universities, and the anthropologists are the dumbest. Helmreich was certainly not dumb, but some of his assertions sure were antsy. The afternoon of the second day was taken up with posters and tours of MIT's Media Lab and the AI Lab. I went to the AI Lab and snapped lots of photos of the team members of "COG", Brooks' latest attempt at AI.
It was production-line research, with a PERT chart over 3 years with more than 30 arrows, each arrow being a PhD or master's thesis. I met over a dozen young researchers working on COG, an upper-torso robot with vision, hearing, hand and finger control, and hopefully COGnitive abilities. This is a very ambitious project. Brooks will need all the luck he can get. At a recent Tokyo workshop, Brooks said that he launched COG because he felt he had only one 10-year project left in him, and he wanted to have a shot at making an AI human rather than some artificial cockroach or something equally unsexy. Good luck, Rod, and a long life! The morning of the third day, Luc Steels gave a plenary talk on "Emergent functionality of robot behavior through on-line evolution". Unfortunately, I skipped the third day to meet another engagement, so I can't give an opinion. General Comments To those researchers in the field of evolutionary computation, I think you can congratulate yourselves. EC played a significant, if not dominant, role at ALife IV. Chris Langton stated in his editorial of the first issue of the new MIT Press journal "Artificial Life" that he did not want to see any more "YANNs" (i.e. yet another (evolved) neural net). This shows how powerful a tool EC has become. A journalist writing on evolvable hardware in the magazine "The Economist" in 1993 described evolution as the computational theme of the 90s. It looks that way more and more. I asked over a dozen people what they thought of the conference in general. An assortment of comments were :- The field of ALife has matured. The mathematicians are starting to move in, time to move out. There was little new, just more of the same. A good solid conference, solid work, respectable. Boring, all the fringey stuff was weeded out. I must say that the last comment hit home for me. ALife IV felt like "just another conference" to me, whereas ALife III had real zing.
Apart from a few papers on evolvable hardware, a paper on computer immunity, and a few others, there was little I could describe as being qualitatively new. It looks as though the field has matured, as evidenced by the fact that there is now an MIT Press ALife journal, and that 500 or so people turned up to ALife IV. Chris Langton's three ALife conferences were characterised by a mix of creative fun and solid competence. I felt the ALife IV conference lacked the fun element. This can be dangerous, because the "creative-crazies" who pioneer a field are a fickle lot, and can very easily move on to the next hot topic. I remember a conversation with Chris Langton, wondering what the next hot topic will be. We didn't know. Well, now I think I know what it will be. I had premonitions of it listening to Terzopoulos's and Sims's talks. My feeling is that enough people are now playing around with building artificial nervous systems (e.g. the "3 musketeers" at Sussex, UK; Beer and Arbib in the US; our group at ATR, Japan; Nolfi et al. in Italy; etc.) that the time is ripe for the birth of a new field, which I call simply "Brain Building". I'm sticking my neck out here, but I feel fairly confident this will happen. I'm predicting that the field of ALife will give birth to this new field. I'm curious to see how other people feel about this prediction. ALife V in Japan (probably Kyoto or Nara), 1996. Finally, if you have been promising yourself a trip to Japan before you get too old, here is your chance. ALife V will be held in 1996 in Japan, probably in May, in Kyoto or Nara, Japan's favorite tourist cities, with "a temple on every corner". Maybe you can combine the conference with a week or two of touristing. I live here and I still haven't exhausted what there is to see. If I'm not too busy talking with my million neuron brain in 1996, see you there (i.e. here). MIT Press will publish the oral papers in a book due out within a matter of weeks, I'm told. Cheers, Hugo de Garis Dr.
Hugo de Garis, Brain Builder Group, Evolutionary Systems Department, ATR Human Information Processing Research Laboratories, 2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto-fu, Kansai Science City, 619-02, Japan. tel. + 81 7749 5 1079, fax. + 81 7749 5 1008, email. degaris at hip.atr.co.jp From Mark_Kantrowitz at GLINDA.OZ.CS.CMU.EDU Thu Jul 21 15:41:56 1994 From: Mark_Kantrowitz at GLINDA.OZ.CS.CMU.EDU (Mark Kantrowitz) Date: Thu, 21 Jul 94 15:41:56 -0400 Subject: CMU Artificial Intelligence Repository Message-ID: <11485.774819716@GLINDA.OZ.CS.CMU.EDU> ** ANNOUNCING ** ++++++++++++++++++++++++++++++++++++++++++ + CMU Artificial Intelligence Repository + + and + + Prime Time Freeware for AI + ++++++++++++++++++++++++++++++++++++++++++ July 1994 The CMU Artificial Intelligence Repository was established by Carnegie Mellon University to contain public domain and freely distributable software, publications, and other materials of interest to AI researchers, educators, students, and practitioners. The AI Repository currently contains more than a gigabyte of material and is growing steadily. The AI Repository is accessible for free by anonymous FTP, AFS, and WWW. A selection of materials from the AI Repository is also being published on CD-ROM by Prime Time Freeware and should be available for purchase at AAAI-94 or direct by mail or fax from Prime Time Freeware (see below). ----------------------------- Accessing the AI Repository: ----------------------------- To access the AI Repository by anonymous FTP, ftp to: ftp.cs.cmu.edu [128.2.206.173] and cd to the directory: /user/ai/ Use username "anonymous" (without the quotes) and type your email address (in the form "user at host") as the password. 
To access the AI Repository by AFS (Andrew File System), use the directory: /afs/cs.cmu.edu/project/ai-repository/ai/ To access the AI Repository by WWW, use the URL: http://www.cs.cmu.edu:8001/Web/Groups/AI/html/repository.html Be sure to read the files 0.doc and readme.txt in this directory. ------------------------------- Contents of the AI Repository: ------------------------------- The AI Programming Languages and the AI Software Packages sections of the repository are "complete". These can be accessed in the lang/ and areas/ subdirectories of the AI Repository. Compression and archiving utilities may be found in the util/ subdirectory. Other directories, which are in varying states of completion, are events/ (Calendar of Events, Conference Calls) and pubs/ (Publications, including technical reports, books, mail/news archives). The AI Programming Languages section includes directories for Common Lisp, Prolog, Scheme, Smalltalk, and other AI-related programming languages. The AI Software Packages section includes subdirectories for: agents/ Intelligent Agent Architectures alife/ Artificial Life and Complex Adaptive Systems anneal/ Simulated Annealing blackbrd/ Blackboard Architectures bookcode/ Code From AI Textbooks ca/ Cellular Automata classics/ Classical AI Programs constrnt/ Constraint Processing dai/ Distributed AI discover/ Discovery and Data-Mining doc/ Documentation edu/ Educational Tools expert/ Expert Systems/Production Systems faq/ Frequently Asked Questions fuzzy/ Fuzzy Logic games/ Game Playing genetic/ Genetic Algorithms, Genetic Programming, Evolutionary Programming icot/ ICOT Free Software kr/ Knowledge Representation, Semantic Nets, Frames, ... 
learning/ Machine Learning misc/ Miscellaneous AI music/ Music neural/ Neural Networks, Connectionist Systems, Neural Systems nlp/ Natural Language Processing (Natural Language Understanding, Natural Language Generation, Parsing, Morphology, Machine Translation) planning/ Planning, Plan Recognition reasonng/ Reasoning (Analogical Reasoning, Case Based Reasoning, Defeasible Reasoning, Legal Reasoning, Medical Reasoning, Probabilistic Reasoning, Qualitative Reasoning, Temporal Reasoning, Theorem Proving/Automated Reasoning, Truth Maintenance) robotics/ Robotics search/ Search speech/ Speech Recognition and Synthesis testbeds/ Planning/Agent Testbeds vision/ Computer Vision The repository has standardized on using 'tar' for producing archives of files and 'gzip' for compression. ------------------------------------- Keyword Searching of the Repository: ------------------------------------- To search the keyword index by mail, send a message to: ai+query at cs.cmu.edu with one or more lines containing calls to the keys command, such as: keys lisp iteration in the message body. You'll get a response by return mail. Do not include anything else in the Subject line of the message or in the message body. For help on the query mail server, include: help instead. A Mosaic interface to the keyword searching program is in the works. We also plan to make the source code (including indexes) to this program available, as soon as it is stable. 
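Since the repository standardizes on 'tar' archives with 'gzip' compression, retrieved packages are easy to unpack from a script as well as from the shell. A minimal Python sketch of the unpacking step using only the standard library; the archive name here is hypothetical, and retrieval itself (by FTP, AFS, or WWW) is assumed to have happened already:

```python
import tarfile

def unpack(archive, dest="."):
    """Unpack a gzip-compressed tar archive (the repository's standard
    format) into dest and return the list of member names.
    'archive' is whatever package file you retrieved, e.g. a
    hypothetical "package.tar.gz"."""
    with tarfile.open(archive, "r:gz") as tf:  # "r:gz" = read tar through gzip
        tf.extractall(dest)
        return tf.getnames()
```

From a shell, running gunzip on the file and then tar xf on the result does the same job; tarfile simply keeps the whole step in one script.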
------------------------------------------ Contributing Materials to the Repository: ------------------------------------------ Contributions of software and other materials are always welcome, but must be accompanied by an unambiguous copyright statement that grants permission for free use, copying, and distribution, such as: - a declaration that the materials are in the public domain, or - a copyright notice that states that the materials are subject to the GNU General Public License (cite version), or - some other copyright notice (we will tell you if the copying permissions are too restrictive for us to include the materials in the repository) Inclusion of materials in the repository does not modify their copyright status in any way. Materials may be placed in: ftp.cs.cmu.edu:/user/ai/new/ When you put anything in this directory, please send mail to ai+contrib at cs.cmu.edu giving us permission to distribute the files, and state whether this permission is just for the AI Repository, or also includes publication on the CD-ROM version (Prime Time Freeware for AI). We would appreciate it if you would include a 0.doc file for your package; see /user/ai/new/package.doc for a template. (If you don't have the time to write your own, we can write it for you based on the information in your package.) ------------------------------------- Prime Time Freeware for AI (CD-ROM): ------------------------------------- A portion of the contents of the repository is published annually by Prime Time Freeware. The first issue consists of two ISO-9660 CD-ROMs bound into a 224-page book. Each CD-ROM contains approximately 600 megabytes of gzipped archives (more than 2 gigabytes uncompressed and unpacked). Prime Time Freeware for AI is particularly useful for folks who do not have FTP access, but may also be useful as a way of saving disk space and avoiding annoying FTP searches and retrievals.
Prime Time Freeware helped establish the CMU AI Repository, and sales of Prime Time Freeware for AI will continue to help support the maintenance and expansion of the repository. It sells (list) for US$60 plus applicable sales tax and shipping and handling charges. Payable through Visa, MasterCard, postal money orders in US funds, and checks in US funds drawn on a US bank. For further information on Prime Time Freeware for AI and other Prime Time Freeware products, please contact: Prime Time Freeware 370 Altair Way, Suite 150 Sunnyvale, CA 94086 USA Tel: +1 408-433-9662 Fax: +1 408-433-0727 E-mail: ptf at cfcl.com ------------------------ Repository Maintainer: ------------------------ The AI Repository was established by Mark Kantrowitz in 1993 as an outgrowth of the Lisp Utilities Repository (established 1990) and his work on the FAQ (Frequently Asked Questions) postings for the AI, Lisp, Scheme, and Prolog newsgroups. The Lisp Utilities Repository has been merged into the AI Repository. Bug reports, comments, questions and suggestions concerning the repository should be sent to Mark Kantrowitz . Bug reports, comments, questions and suggestions concerning a particular software package should be sent to the address indicated by the author. From rreilly at nova.ucd.ie Fri Jul 22 06:49:41 1994 From: rreilly at nova.ucd.ie (Ronan Reilly) Date: Fri, 22 Jul 1994 11:49:41 +0100 Subject: Research Position - Dublin, Ireland Message-ID: RESEARCH POSITION IN CONNECTIONIST AI Applications are invited for a two-year research position in the Computer Science Department of University College Dublin in the area of connectionist and symbolic knowledge representation with particular application to the construction of expert systems. The position is funded under ESPRIT Project 8162: Quality Assessment of Living with Information Technology (QUALIT). 
The ideal candidate will have a good honours degree in computer science or related discipline and will have research experience in the area of hybrid connectionist/symbolic expert systems. Salary will be in the range IR15,000-18,000 depending on experience. Applications should be sent to: Dr Ronan Reilly Department of Computer Science University College Dublin Belfield Dublin 4 IRELAND or by e-mail to: rreilly at nova.ucd.ie Closing date for applications is 19 August, 1994. ------------------------------------------------------------------------------ Ronan Reilly, PhD e-mail: rreilly at nova.ucd.ie Dept. of Computer Science Tel.: +353-1-706 2475 University College Dublin Fax: +353-1-269 7262 Belfield, Dublin 4 IRELAND ------------------------------------------------------------------------------ From terry at salk.edu Fri Jul 22 20:26:05 1994 From: terry at salk.edu (Terry Sejnowski) Date: Fri, 22 Jul 94 17:26:05 PDT Subject: ALife IV Conference Report, Hugo de Garis, ATR Message-ID: <9407230026.AA18267@salk.edu> >I remember a conversation with Chris Langton, wondering what the >next hot topic will be. We didnt know. Well, now I think I know what it >will be. I had premonitions of it listening to Terzopolous's and Sims's talks. >My feeling is that enough people are now playing around with building >artificial nervous systems, (e.g. the "3 musketeers" at Sussex, UK; Beer and >Arbib in the US; our group at ATR, Japan; Nolfi et al in Italy; etc) that the >time is ripe for the birth of a new field, which I call simply "Brain >Building". I'm sticking my neck out here, but I feel fairly confident this >will happen. I'm predicting that the field of ALife will give birth to this >new field. I'm curious to see how other people feel about this prediction. The First Annual Telluride Workshop on Neuromorphic Engineering sponsored by the NSF recently tackled this very issue. 
The goal is to develop a new low-power autonomous technology suitable for guiding robots like the ones that Mark Tilden has been evolving. (It may be significant that Mark attended this workshop rather than the ALife meeting.) Christof Koch and I helped to organize this workshop, which brought together engineers and neuroscientists from academia and industry who were interested in building living creatures. Low-power analog VLSI chips already exist that can analyze visual and auditory sensory inputs, and cortical circuit chips are being developed by Rodney Douglas and Misha Mahowald; the principles of sensorimotor integration, as studied by Dana Ballard and Richard Andersen, are the focus of the theoretical breakthroughs that will be needed to achieve the goal of autonomy in the real world by the next century. One important milestone was announced at the workshop: reliable analog on-chip learning has been developed in Carver Mead's laboratory that will make possible adaptive mechanisms and learning at all levels of processing, as occurs in biological systems. A report on the outcome of the workshop will be made available via ftp -- an announcement will follow in August. Terry ----- From lars at eiffel.ei.dth.dk Mon Jul 25 10:59:10 1994 From: lars at eiffel.ei.dth.dk (Lars Kai Hansen) Date: Mon, 25 Jul 94 15:59:10 +0100 Subject: Report: The Error-Reject Tradeoff Message-ID: <9407251459.AA16729@eiffel.ei.dth.dk> FTP-host: eivind.ei.dth.dk [129.142.65.123] FTP-file: /dist/hansen_reject.ps.Z [30 pages; 400kb uncompressed] The following technical report is available by anonymous ftp from the Electronics Institute ftp-server. Hardcopies are not available. ------------------------------------------------------------------------------- "THE ERROR-REJECT TRADEOFF" Lars Kai Hansen (CONNECT, Electronics Inst. B349, Tech. Univ. Denmark, DK-2800 Lyngby, Denmark), Christian Liisberg (Applied Bio Cybernetics, DK-3390 Hundested, Denmark), Peter Salamon (Dept. of Math. Sciences, San Diego State University, San Diego CA 92182, USA) Abstract: We investigate the error versus reject tradeoff for classifiers. Our analysis is motivated by the remarkable similarity in error-reject tradeoff curves for widely differing algorithms classifying handwritten characters. We present the data in a new scaled version that makes this universal character particularly evident. Based on Chow's theory of the error-reject tradeoff and its underlying Bayesian analysis we argue that such universality is in fact to be expected for general classification problems. Furthermore, we extend Chow's theory to classifiers working from finite samples on a broad, albeit limited class of problems. The problems we consider are effectively binary, i.e., classification problems for which almost all inputs involve a choice between the right classification and at most one predominant alternative. We show that for such problems at most half of the initially rejected inputs would have been erroneously classified. We show further that such problems arise naturally as small perturbations of the PAC model for large training sets. The perturbed model leads us to conclude that the dominant source of error comes from pairwise overlapping categories. For infinite training sets, the overlap is due to noise and/or poor preprocessing. For finite training sets there is an additional contribution from the inevitable displacement of the decision boundaries due to finiteness of the sample. In either case, a rejection mechanism which rejects inputs in a shell surrounding the decision boundaries leads to a universal form for the error-reject tradeoff. Finally we analyze a specific reject mechanism based on the extent of consensus among an ensemble of classifiers. For the ensemble reject mechanism we find an analytic expression for the error-reject tradeoff based on a maximum entropy estimate of the problem difficulty distribution.
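For readers meeting Chow's rule for the first time, here is a toy sketch (mine, not the report's) of the basic object of study: reject any input whose top posterior falls below a confidence threshold, then measure the error rate on the inputs that remain. All names are illustrative; the report's scaled curves and its ensemble-consensus mechanism are beyond this sketch.

```python
def error_reject_point(posteriors, labels, threshold):
    """posteriors: one list of per-class probabilities per input;
    labels: the true class indices.
    Returns (reject_rate, error_rate_on_accepted_inputs)."""
    accepted = errors = 0
    for p, y in zip(posteriors, labels):
        top = max(range(len(p)), key=p.__getitem__)  # most probable class
        if p[top] < threshold:       # Chow's rule: too uncertain -> reject
            continue
        accepted += 1
        errors += (top != y)
    n = len(labels)
    reject_rate = 1 - accepted / n
    error_rate = errors / accepted if accepted else 0.0
    return reject_rate, error_rate
```

Sweeping the threshold from 1/num_classes up to 1 traces out one error-reject curve per classifier; the report's observation is how similar such curves look across very different classifiers once suitably rescaled.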
Keywords: error-reject tradeoff, handwritten digits, ensembles, neural networks. ------------------------------------------------------------------------------------ - Lars Kai Hansen lkhansen at ei.dtu.dk From ifsa95 at dep.fem.unicamp.br Mon Jul 25 15:03:20 1994 From: ifsa95 at dep.fem.unicamp.br (IFSA95) Date: Mon, 25 Jul 94 14:03:20 EST Subject: IFSA'95 Call for papers Message-ID: <9407251703.AA01463@dep.fem.unicamp.br> IFSA '95 Sixth International Fuzzy Systems Association World Congress Sao Paulo-Brazil July 22-28, 1995 ___________ First Call for Papers ___________ The International Fuzzy Systems Association is pleased to announce the sixth IFSA World Congress (IFSA '95) to be held in Sao Paulo, Brazil, from July 22nd to 28th, 1995. The goals of the congress are twofold: the first is to encourage communication between researchers throughout the world whose research either draws support from, or complements, the theory and applications of fuzzy sets and related models. The second goal is to explore industrial applications of fuzzy systems technology to make systems more convenient. The theme of IFSA '95 is "New Frontiers", aiming at enlarging the horizons of research in fuzzy sets beyond its traditional areas. The congress is divided into 7 main topical areas, whose themes include, but are not limited to, the theory or applications in the topics listed below with the respective area chairs. Authors are invited to submit extended abstracts for consideration by the program committee. Four copies of 4-page extended abstracts, one paper submission form, and one cadastration (registration) form filled in for each author, should be sent to the secretariat (see address below) before November 1st, 1994. Fax submissions will not be accepted. The proceedings will contain the final versions of refereed papers, prepared on camera-ready sheets. For any further inquiries, please send a message to ifsa95 at dep.fem.unicamp.br.
DATES _____ Tutorials July 22 - 23, 1995 Conference July 24 - 28, 1995 Demonstrations July 22 - 28, 1995 DEADLINES _________ Reception of 4 copies of 4-page extended abstracts; 1 cadastration form for each author; and 1 paper submission form - November 1st, 1994 Notification of acceptance - February 1st, 1995 Reception of final camera-ready copy for proceedings - April 1st, 1995 SECRETARIAT ___________ Address : INPE/Setor de Eventos/ IFSA'95 Av. dos Astronautas, 1758 - Caixa Postal 515 12201-970 Sao Jose' dos Campos - SP - Brazil Phone: +55-123-418977 Fax: +55-123-218743 IFSA'95 MAILING LIST SUBSCRIPTION AND SPECIFIC INFORMATION __________________________________________________________ To subscribe to the IFSA'95 mailing list please send a message to : listserv at cesar.unicamp.br In the body of the message please write "subscribe IFSA" along with your name, in a line by itself. For further information please contact : ifsa95 at dep.fem.unicamp.br ORGANIZERS __________ General Chairman : Armando Rocha (Brazil) Vice Chairman : George Klir (USA) Honorary Chairman : Lotfi Zadeh (USA) Honorary Vice-Chairmen: J. Bezdek (USA) D. Dubois (France) E. Sanchez (France) T. Terano (Japan) H.-J. Zimmermann (Germany) Steering Committee Chairwoman : Sandra Sandri (Brazil) Scientific Board Chairman : Fernando Gomide (Brazil) Local Arrangements Chairman : Marcio Rillo (Brazil) AREA CHAIRS ___________ Artificial Intelligence - Ronald Yager (USA) _______________________________________________________________________ Approximate Reasoning, Knowledge Acquisition, Knowledge Representation, Expert Systems Design, Natural Language Issues, Decision Making, Computer Vision & Pattern Recognition, Distributed AI, Genetic Algorithms, Artificial Life, Evolutionary Systems, and other related topics.
Engineering - Kaoru Hirota (Japan) _______________________________________________________________________ Fuzzy Control, Hybrid Control, Industrial Robots, Intelligent Robotics, Industrial Systems, Fuzzy Petri Nets, Manufacturing, and other related topics. Mathematical Foundations - Peter Klement (Austria) _______________________________________________________________________ Non-Classical Logics, Category Theory, Analysis, Algebra and Topology, Functional Equations, Fuzzy Measures, Approximation Theory, Evidence Theory & Probability & Statistics, Relational Equations, and other related topics. Information Sciences - Henri Prade (France) _______________________________________________________________________ Automata, Fuzzy Grammars, Formal Languages, Information Retrieval, Fuzzy Databases, Distributed Data Bases, Information Theory, Distributed Soft-Computing, and other related topics. Health Sciences, Biology, Psychology - Donna Hudson (USA) _______________________________________________________________________ Medical Diagnosis, Intelligent Patient Monitoring, Laboratory Intelligent Systems, Applications in Molecular Biology, Physiology, Pharmacology, Perception, and other related topics. Neural Nets and Hardware - Takeshi Yamakawa (Japan) _______________________________________________________________________ Neural Nets, System Identifications by Neural Networks, Neural Chips, Hardware Implementations of Neural Networks, Fuzzy Hardware and Fuzzy Logic Computers Implementation, Novel Devices for Neural & Chaotic Systems, and other related topics. Fuzzy Systems - J.L. Verdegay (Spain) _______________________________________________________________________ Fuzzy Linear and Non-linear Programming, Multiple Objective Decision Making, Fuzzy Mathematical Programming, Multiple Criteria Decision, Multi-person Decision Making, Operational Research, Management, Economic Systems, Fuzzy Databases, Distributed Data Bases, and other related topics.
Also, papers not classified in the above areas. - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - CADASTRATION FORM Last Name ____________________________________________________________ First name ___________________________________________________________ Organization/Affiliation _____________________________________________ ______________________________________________________________________ Address ______________________________________________________________ ______________________________________________________________________ Zip/Postal Code _____________________ City ___________________________ Country ______________________________________________________________ Telephone ___________________________________________________________ Fax __________________________________________________________________ E-mail address ______________________________________________________ 1a) Do you intend to submit a paper at the conference ? Yes __ No __ 1b) As main author ? Yes __ No __ 2) Do you intend to attend the conference ? 
Yes __ No __ - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - PAPER SUBMISSION FORM Title ________________________________________________________________ Authors ______________________________________________________________ Reader at the conference _____________________________________________ Area of the paper (please mark only one option) [ ] Artificial Intelligence [ ] Engineering [ ] Mathematical Foundations [ ] Information Sciences [ ] Health Sciences, Biology, Psychology [ ] Neural Nets and Hardware [ ] Fuzzy Systems From jan at riks.nl Wed Jul 27 09:07:09 1994 From: jan at riks.nl (Jan Paredis) Date: Wed, 27 Jul 94 15:07:09 +0200 Subject: TR: Co-evolutionary Training of NNs Message-ID: <9407271307.AA07180@london> *** DO NOT FORWARD TO OTHER GROUPS *** The following paper is now available: TITLE: Steps towards Co-evolutionary Classification Neural Networks AUTHOR: Jan Paredis 10 pages To appear in: Proc. Artificial Life IV, R. Brooks, P. Maes (eds), MIT Press / Bradford Books. ABSTRACT This paper proposes two improvements to the genetic evolution of neural networks (NNs): life-time fitness evaluation and co-evolution. A classification task is used to demonstrate the potential of these methods and to compare them with state-of-the-art evolutionary NN approaches. Furthermore, both methods are complementary: co-evolution can be used in combination with life-time fitness evaluation. Moreover, the continuous feedback associated with life-time evaluation paves the way for the incorporation of life-time learning. This may lead to hybrid approaches which involve genetic as well as, for example, backpropagation learning. In addition to this, life-time fitness evaluation allows an apt response to noise and changes in the problem to be solved.
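The two ideas in this abstract, co-evolution and life-time fitness evaluation, can be illustrated in miniature. The following toy sketch is emphatically NOT Paredis's algorithm (his populations are networks and training examples); it is a minimal Hillis-style arms race under assumptions of my own, where integer "solutions" accumulate fitness over a lifetime of encounters with a co-evolving population of threshold "tests":

```python
import random

random.seed(0)  # deterministic toy run

def coevolve(generations=60, pop=20):
    """Toy co-evolution: solutions score by passing tests (sol >= test),
    tests score by defeating solutions, and each population is selected
    against the other, producing an escalating arms race."""
    sols = [random.randint(0, 10) for _ in range(pop)]
    tests = [random.randint(0, 10) for _ in range(pop)]
    for _ in range(generations):
        s_fit = [0] * pop
        t_fit = [0] * pop
        for i in range(pop):        # "lifetime": every solution meets every test
            for j in range(pop):
                if sols[i] >= tests[j]:
                    s_fit[i] += 1   # solution passes this test
                else:
                    t_fit[j] += 1   # test defeats this solution
        def next_gen(popn, fit):
            # truncation selection: keep the fitter half, duplicate, mutate by +/-1
            best = [x for _, x in sorted(zip(fit, popn), reverse=True)][: pop // 2]
            return [max(0, x + random.choice((-1, 0, 1))) for x in best + best]
        sols, tests = next_gen(sols, s_fit), next_gen(tests, t_fit)
    return sols, tests

sols, tests = coevolve()
```

Because tests keep pace with solutions, neither population can coast on a fixed fitness function; that mutual pressure is the point of co-evolutionary evaluation.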
-------------------------------
To obtain a hardcopy, send an e-mail to: jan at riks.nl
Subject: NN paper request
Body: your full snail-mail address

A hardcopy will then be sent to you.

Jan Paredis
RIKS
Postbus 463
NL-6200 AL Maastricht
The Netherlands
email: jan at riks.nl
tel: +31 43 253433
fax: +31 43 253155

From tishby at CS.HUJI.AC.IL Wed Jul 27 09:15:55 1994
From: tishby at CS.HUJI.AC.IL (Tali Tishby)
Date: Wed, 27 Jul 1994 16:15:55 +0300
Subject: Updates on 12-ICPR, Jerusalem
Message-ID: <199407271315.AA04812@humus.cs.huji.ac.il>

===============================================================================
                             ***** Updates *****
        12th ICPR : INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION
         9-13 October 1994, Renaissance Hotel, Jerusalem, Israel

        ***** Advance Registration Deadline: 9 August 1994 *****
 ***** Authors: Camera ready due August 8 at the IEEE Computer Society *****
===============================================================================

1. Get full updated information by sending e-mail to icpr-info at cs.huji.ac.il.

2. A network of 15 Silicon Graphics computers and 10 NCD X-terminals, with a
   high-speed Internet link, will be available. Bring your demonstrations!
   You can also telnet to your own computer, of course, and read e-mail.

3. On-line information about Jerusalem can be obtained by telnet to
   "www.huji.ac.il": log in as www, then select "[1] Line Mode Interface"
   followed by "[3] Databases in Israel" and "[13] The Jerusalem Mosaic".
   Don't worry if you see some odd symbols. If you have Mosaic, you can
   select: http://shum.cc.huji.ac.il/jeru/jerusalem.html

4. The banquet will be a Bedouin feast, combined with a special
   sight-and-sound show, at the foot of Masada. An unforgettable experience!
   During the banquet, the following announcements will be made:
   * IAPR announcement: new IAPR Executive Committee, venue for 14-ICPR
   * Nomination of IAPR Fellows
   * Best Industry-Related Paper Award
   * Best-Paper Award by the journal "Pattern Recognition"

5. The opening session of the conference will be on Monday, Oct 10, 08:30 AM:

    8:30  Welcome address: J. Aggarwal, President of IAPR
    8:40  Presentation of the K.S. Fu Award
    8:45  Address by the winner of the K.S. Fu Award
    9:15  Welcome address: 12-ICPR Conference Chairmen
    9:30  Plenary talk: Avnir, D. - Hebrew University - THE PATTERNED NATURE
   10:00  Coffee break
   10:30  Start of 4 parallel sessions

6. MasterCard is now also accepted for registration payments.

===============================================================================

From ro2m at crab.psy.cmu.edu Wed Jul 27 14:57:31 1994
From: ro2m at crab.psy.cmu.edu (Randall C. O'Reilly)
Date: Wed, 27 Jul 94 14:57:31 EDT
Subject: TR: Hippocampal Conjunctive Encoding, Storage, and Recall
Message-ID: <9407271857.AA27494@crab.psy.cmu.edu.psy.cmu.edu>

The following technical report is available both electronically from our
FTP server and in hard-copy form. Instructions for obtaining copies may be
found at the end of this post.

========================================================================

         HIPPOCAMPAL CONJUNCTIVE ENCODING, STORAGE, AND RECALL:
                         AVOIDING A TRADEOFF

              Randall C. O'Reilly    James L. McClelland

                     Carnegie Mellon University
                  Technical Report PDP.CNS.94.4
                            June 1994

The hippocampus and related structures are thought to be capable of:
1) representing cortical activity in a way that minimizes overlap of the
representations assigned to different cortical patterns (pattern
separation); and 2) modifying synaptic connections so that these
representations can later be reinstated from partial or noisy versions of
the cortical activity pattern that was present at the time of storage
(pattern completion).
We point out that there is a tradeoff between pattern separation and
completion, and propose that the unique anatomical and physiological
properties of the hippocampus might serve to minimize this tradeoff. We use
analytical methods to determine quantitative estimates of both separation
and completion for specified parameterized models of the hippocampus. These
estimates are then used to evaluate the role of various properties of the
hippocampus, such as the activity levels seen in different hippocampal
regions, synaptic potentiation and depression, the multi-layer connectivity
of the system, and the relatively focused and strong mossy fiber
projections. This analysis is focused on the feedforward pathways from the
Entorhinal Cortex (EC) to the Dentate Gyrus (DG) and region CA3.

Among our results are the following: 1) Hebbian synaptic modification (LTP)
facilitates completion but reduces separation, unless the strengths of
synapses from inactive presynaptic units to active postsynaptic units are
reduced (LTD). 2) Multiple layers, as in EC to DG to CA3, allow the
compounding of pattern separation, but not pattern completion. 3) The
variance of the input signal carried by the mossy fibers is important for
separation, not their raw strength, which may explain why the mossy fiber
inputs are few and relatively strong, rather than many and relatively weak
like the other hippocampal pathways. 4) The EC projects to CA3 both
directly and indirectly via the DG, which suggests that the two-stage
pathway may dominate during pattern separation and the one-stage pathway
during completion; methods the hippocampus may use to enhance this effect
are discussed.
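The pattern-separation side of this tradeoff can be illustrated with a toy
sketch. This is NOT the report's analytical model: the layer sizes, the
random "perforant path" weight matrix, and the k-winners-take-all rule
below are all invented for illustration. Two overlapping input patterns
become far less overlapping after a sparse random projection:

```python
import random

random.seed(0)

N_IN, N_OUT = 200, 1000   # EC-like input layer, DG-like output layer (toy sizes)
K_ACTIVE    = 25          # sparse activity: only 2.5% of output units fire

# Fixed random feedforward weights (a stand-in for the perforant path).
w = [[random.random() for _ in range(N_IN)] for _ in range(N_OUT)]

def represent(pattern):
    """k-winners-take-all: the K_ACTIVE most strongly driven units fire."""
    drive = [sum(w[j][i] for i in pattern) for j in range(N_OUT)]
    winners = sorted(range(N_OUT), key=drive.__getitem__, reverse=True)
    return set(winners[:K_ACTIVE])

def overlap(a, b):
    """Jaccard overlap between two sets of active units."""
    return len(a & b) / len(a | b)

# Two input patterns of 50 active units each, sharing 30 of them.
shared = random.sample(range(N_IN), 30)
rest   = [i for i in range(N_IN) if i not in shared]
only1  = random.sample(rest, 20)
only2  = random.sample([i for i in rest if i not in only1], 20)
p1, p2 = set(shared) | set(only1), set(shared) | set(only2)

r1, r2 = represent(p1), represent(p2)
print("input overlap:  %.2f" % overlap(p1, p2))   # 30/70 = 0.43 by construction
print("output overlap: %.2f" % overlap(r1, r2))   # substantially smaller
```

Sparsity drives the effect: raising K_ACTIVE toward N_OUT/2 makes the
output overlap approach the input overlap again, which is the completion
side of the tradeoff the report analyzes.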
=======================================================================

Retrieval information for pdp.cns TRs:

  unix> ftp 128.2.248.152        # hydra.psy.cmu.edu
  Name: anonymous
  Password:
  ftp> cd pub/pdp.cns
  ftp> binary
  ftp> get pdp.cns.94.4.ps.Z
  ftp> quit
  unix> zcat pdp.cns.94.4.ps.Z | lpr   # or however you print PostScript

NOTE: The compressed file is 249,429 bytes long. Uncompressed, the file is
754,521 bytes long. The printed version is 41 pages long.

For those who do not have FTP access, physical copies can be requested from
Barbara Dorney.

From jbower at smaug.bbb.caltech.edu Thu Jul 28 14:42:12 1994
From: jbower at smaug.bbb.caltech.edu (jbower@smaug.bbb.caltech.edu)
Date: Thu, 28 Jul 94 11:42:12 PDT
Subject: No subject
Message-ID: <9407281842.AA17673@smaug.bbb.caltech.edu>

>I remember a conversation with Chris Langton, wondering what the
>next hot topic will be ...
>the time is ripe for the birth of a new field, which I call simply "Brain
>Building". I'm sticking my neck out here, but I feel fairly confident this
>will happen. I'm predicting that the field of ALife will give birth to this
>new field. I'm curious to see how other people feel about this prediction.
>Low power analog vlsi chips already exist that can analyze visual and auditory
>sensory inputs and cortical circuit chips are being developed by Rodney Douglas
>and Misha Mahowald; the principles of sensorimotor integration as
>studied by Dana Ballard and Richard Andersen are at the focus of
>the theoretical breakthroughs that will be needed to achieve the
>goal of autonomy in the real world by the next century.

So it is time for the birth of a new neural networks/AI/ALife field; funding
must be getting tight. This time, is it at all possible to avoid the hype
inherent in words like "Brain Building" and objectives like "building living
creatures"? "Neuromorphic Engineering" is bad enough. After 10 years, I
continue to fail to see any intellectually justifiable reason for such
descriptions.
It seems to me we have not yet achieved the prediction of the original
DARPA neural networks report that the brain of a bee would be understood
within 5 years. That was 6 years ago, and it seems to me that the bee still
withstands our best efforts. Now we are being launched in the direction of
creating autonomous life based on sensory/motor processing in primates "by
the next century". Maybe it will be easier to understand the brain of a
primate than that of a bee, but I doubt it. Or maybe it is not necessary to
understand the device being emulated before building it.

Last week the third annual Computational Neuroscience Meeting was held in
Monterey, California. The entire meeting was devoted to experimental and
theoretical studies of real biological "neural computation". That is,
presentations at the meeting concerned the detailed structure and possible
computational significance of real "brains". The meeting had 250 attendees
from throughout the world and represented many if not most of the leading
institutions and laboratories involved in studying real neural computation.
Despite this fact, and despite repeated invitations via, among others, the
connectionist mailing list, almost no one from the neural networks
community attended. This mirrors what appears to be the remarkably common
perception among neurobiologists who attend meetings like NIPS, INNS, etc.:
that there is little real interest in, or understanding of, neurobiology in
the neural networks community.

That said, I would like to issue an open invitation to those of you on the
connectionist mailing list who would actually like to know more about the
neural systems you propose to morph. The organizers of the CNS meetings
have established a new mailing list for those specifically interested in
computational neurobiology; that is, for those interested in trying to
figure out how real brains compute.
It will be managed in much the same fashion as the connectionist mailing
list, with Dave Beeman in Boulder, Colorado as the first moderator. One of
the initial postings will be a synopsis of the "hot research topics" in
computational neuroscience that came out of the post-meeting workshops
(building life was not one of them).

We anticipate and encourage cross-fertilization between the two mailing
groups. However, it is hard to avoid the interpretation that, at least at
this point, what is interesting to computational neurobiologists is
apparently not of much interest to those working in neural networks, and
vice versa. At the same time, it is also the case that there are many fewer
neurobiologists justifying their research with reference to neural networks
than there are engineers claiming to be brain builders. Thus, my periodic
postings.

Please send your subscription requests to:

        comp-neuro at smaug.bbb.caltech.edu

From french at willamette.edu Thu Jul 28 20:33:29 1994
From: french at willamette.edu (Bob French)
Date: Thu, 28 Jul 1994 17:33:29 -0700
Subject: TR: reduce catastrophic interference w. context biasing
Message-ID: <199407290033.RAA26115@jupiter.willamette.edu>

The following paper is now available from the Ohio State neuroprose
archive. It will be presented at the Cognitive Science Society Conference
in Atlanta in August. It is six pages long. The work presented in this
paper will be part of a larger paper on catastrophic interference to appear
later this fall. Any comments are welcome.

      Dynamically constraining connectionist networks to produce
    distributed, orthogonal representations to reduce catastrophic
                           interference

                         Robert M. French
                     Department of Psychology
                     University of Wisconsin
                        Madison, WI 53713

           email: french at head.neurology.wisc.edu
              or: french at willamette.edu

It is now well known that when a connectionist network is trained on one
set of patterns and then attempts to add new patterns to its repertoire,
catastrophic interference may result.
The use of sparse, orthogonal hidden-layer representations has been shown
to reduce catastrophic interference. The author demonstrates that the use
of sparse representations not only adversely affects a network's ability to
generalize but may, in certain cases, also result in worse performance on
catastrophic interference. This paper argues for the necessity of
maintaining hidden-layer representations that are both as highly
distributed and as highly orthogonal as possible. The author presents a
fast recurrent learning algorithm, called context-biasing, that dynamically
solves the problem of constraining hidden-layer representations to
simultaneously produce good orthogonality and distributedness. On the data
tested for this study, context-biasing is shown to reduce catastrophic
interference by more than 50% compared to standard backpropagation. In
particular, this technique succeeds in reducing catastrophic interference
on data where sparse, orthogonal distributions failed to produce any
improvement.

Retrieve this paper by anonymous ftp from:

        archive.cis.ohio-state.edu (128.146.8.52)
        in the pub/neuroprose directory

The name of the paper in this archive is:

        french.context-biasing.ps.Z

For those without ftp access, write to me at:

        Robert M. French
        Dept. of Psychology
        University of Wisconsin
        Madison, Wisconsin 53706

and I'll send you a hard copy.

From massone at mimosa.eecs.nwu.edu Fri Jul 29 12:31:51 1994
From: massone at mimosa.eecs.nwu.edu (Lina Massone)
Date: Fri, 29 Jul 94 11:31:51 CDT
Subject: Two papers available on arm movements
Message-ID: <9407291631.AA11679@mimosa.eecs.nwu.edu>

The following two papers are available from the neuroprose archive. The
papers are currently technical reports of the Neural Information Processing
Laboratory of Northwestern University and have been submitted for
publication.

ftp-host: archive.cis.ohio-state.edu
ftp-file: massone.arm_model.ps.Z

        A Neural Network Model of an Anthropomorphic Arm

              Lina L.E. Massone and Jennifer D.
Myers

Abstract

This paper introduces a neural network model of a planar redundant arm
whose structure and operation principles were inspired by those of the
human arm. We developed the model for two purposes. One purpose was to
study the relative role of control strategies and plant properties in
trajectory formation, namely which features of simple arm movements can be
attributed to the properties of the plant alone. We address this matter in
a companion paper [the next paper]. The second purpose was a motor-learning
one: to design an arm model that, because of its neural-network quality,
can eventually be incorporated in a parallel distributed learning scheme
for the arm controller.

We modeled the arm with two joints (shoulder and elbow) and six muscle-like
actuators: a pair of antagonist shoulder muscles, a pair of antagonist
elbow muscles, and a pair of antagonist double-joint muscles. The arm was
allowed to move in the horizontal plane subject to the action of gravity.
The model computes the transformation between the control signals that
activate the muscle-like actuators and the coordinates of the arm endpoint.
This transformation comprises four interacting stages (muscle dynamics,
joint geometry, forward arm dynamics, forward arm kinematics) that we
modeled with a number of feedforward and recurrent neural networks. In this
paper we introduce and describe the modeling methods in detail; they are
efficient, highly flexible (some of the resulting networks can easily be
modified to accommodate different parametric choices and temporal scales),
and quite general, and hence applicable to a number of different scientific
domains.

********************

ftp-host: archive.cis.ohio-state.edu
ftp-file: massone.plant_properties.ps.Z

   A Study of the Role of Plant Properties in Arm Trajectory Formation

              Lina L.E. Massone and Jennifer D.
Myers

Abstract

This paper describes the response of a neural-network model of an
anthropomorphic arm to various patterns of activation of the arm muscles.
The arm model was introduced and described in detail in [the previous
paper]. The purpose of the simulation experiments presented here is to
study the relative role of control strategies and plant properties in
trajectory formation, namely which features of simple arm movements can be
attributed to the properties of the plant alone -- a study that might
provide some guidelines for the design of artificial arms. Our simulations
demonstrate the performance of the model at steady state, the movements the
model produces in response to various activations of its muscles, and the
generalization abilities of the recurrent neural network that implements
the forward dynamic transformation. The results of our simulations
emphasize the role of the intrinsic properties of the plant in generating
movements with anthropomorphic qualities, such as smoothness and unimodal
velocity profiles, and demonstrate that the task of an eventual controller
for such an arm could simply be that of programming the amplitudes and
durations of steps of neural input, without considering additional motor
details. Our findings are relevant to the design of artificial arms and,
with some caveats, to the study of brain strategies in the arm motor
system.

*******************

From pfbaldi at Juliet.Caltech.Edu Fri Jul 29 10:11:04 1994
From: pfbaldi at Juliet.Caltech.Edu (Pierre F. Baldi)
Date: Fri, 29 Jul 94 07:11:04 PDT
Subject: New paper: How delays affect neural dynamics and learning
Message-ID: <940729071104.28808357@Juliet.Caltech.Edu>

FTP-host: archive.cis.ohio-state.edu
FTP-filename: /pub/neuroprose/baldi.delays1.ps.Z
FTP-filename: /pub/neuroprose/baldi.delays2.ps.Z

The following paper is available from the Ohio State neuroprose archive. It
is scheduled to appear in: IEEE Transactions on Neural Networks, Vol. 5,
no. 4, pp. 626-635 (1994).
          How delays affect neural dynamics and learning

                  P. Baldi            A. Atiya
                 JPL/Caltech      Cairo University

           email: pfbaldi at juliet.caltech.edu
              or: amir at csvax.cs.caltech.edu

We investigate the effects of delays on the dynamics and, in particular,
the oscillatory properties of simple artificial neural network models. We
treat in detail the case of ring networks, for which we derive simple
conditions for oscillating behavior and several formulas to predict the
regions of bifurcation, the periods of the limit cycles, and the phases of
the various neurons. These results can in turn be applied to more complex
architectures. In general, delays tend to increase the period of
oscillations and broaden the spectrum of possible frequencies, in a
quantifiable way. Theoretically predicted values are in excellent agreement
with simulations. Adjustable delays are then proposed as one additional
mechanism through which neural systems could tailor their own dynamics.
Recurrent back-propagation learning equations are derived for the
adjustment of delays and other parameters in networks with delayed
interactions, and applications are briefly discussed.

Retrieve this paper by anonymous ftp from:

        archive.cis.ohio-state.edu (128.146.8.52)
        in the /pub/neuroprose directory

The names of the papers in this archive are:

        baldi.delays1.ps.Z (text) [24 pages]
        baldi.delays2.ps.Z (figures) [5 pages]

No hard copies available.
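The qualitative claim in the abstract (delays lengthen a ring network's
oscillation period) is easy to check numerically. The sketch below is NOT
the authors' model or parameters: it is just a minimal Euler integration of
a four-unit ring, dx_i/dt = -x_i + tanh(gain * s_i * x_{i-1}(t - delay)),
with one inhibitory connection (s_0 = -1) so that, for sufficient gain, the
only fixed point is unstable and the network settles into a limit cycle:

```python
import math
from collections import deque

def ring_period(n=4, delay=0.0, gain=3.0, t_max=200.0, dt=0.01):
    """Estimate the oscillation period of the delayed ring network
        dx_i/dt = -x_i + tanh(gain * s_i * x_{i-1}(t - delay)),
    s_0 = -1, s_i = +1 otherwise. The period is the mean interval between
    upward zero crossings of unit 0 after transients have died out."""
    lag = max(1, int(round(delay / dt)))              # delay in time steps
    hist = [deque([0.0] * lag, maxlen=lag) for _ in range(n)]  # past outputs
    x = [0.1 * (i + 1) for i in range(n)]             # small asymmetric start
    signs = [-1.0] + [1.0] * (n - 1)                  # one inhibitory link
    crossings = []
    for step in range(int(t_max / dt)):
        # hist[i - 1][0] is x_{i-1}(t - delay); Python's negative
        # indexing closes the ring for unit 0.
        new_x = [x[i] + dt * (-x[i] +
                 math.tanh(gain * signs[i] * hist[i - 1][0]))
                 for i in range(n)]
        t = step * dt
        if t > t_max / 2 and x[0] < 0.0 <= new_x[0]:  # upward zero crossing
            crossings.append(t)
        for i in range(n):
            hist[i].append(new_x[i])
        x = new_x
    if len(crossings) < 2:
        return None
    return (crossings[-1] - crossings[0]) / (len(crossings) - 1)

p_no_delay = ring_period(delay=0.0)
p_delayed  = ring_period(delay=1.0)
print("period, no delay: %.2f" % p_no_delay)
print("period, delay=1:  %.2f" % p_delayed)
```

With these toy parameters the delayed ring's period comes out noticeably
longer, matching the paper's qualitative prediction; the gain, sign
pattern, and delay value are arbitrary choices for the sketch, not values
from the paper.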