From Connectionists-Request at cs.cmu.edu Sat May 1 00:05:17 1993 From: Connectionists-Request at cs.cmu.edu (Connectionists-Request@cs.cmu.edu) Date: Sat, 01 May 93 00:05:17 -0400 Subject: Bi-monthly Reminder Message-ID: <23733.736229117@B.GP.CS.CMU.EDU> *** DO NOT FORWARD TO ANY OTHER LISTS *** This note was last updated January 4, 1993. This is an automatically posted bi-monthly reminder about how the CONNECTIONISTS list works and how to access various online resources. CONNECTIONISTS is not an edited forum like the Neuron Digest, or a free-for-all newsgroup like comp.ai.neural-nets. It's somewhere in between, relying on the self-restraint of its subscribers. Membership in CONNECTIONISTS is restricted to persons actively involved in neural net research. The following posting guidelines are designed to reduce the amount of irrelevant messages sent to the list. Before you post, please remember that this list is distributed to over a thousand busy people who don't want their time wasted on trivia. Also, many subscribers pay cash for each kbyte; they shouldn't be forced to pay for junk mail. Happy hacking. -- Dave Touretzky & David Redish --------------------------------------------------------------------- What to post to CONNECTIONISTS ------------------------------ - The list is primarily intended to support the discussion of technical issues relating to neural computation. - We encourage people to post the abstracts of their latest papers and tech reports. - Conferences and workshops may be announced on this list AT MOST twice: once to send out a call for papers, and once to remind non-authors about the registration deadline. A flood of repetitive announcements about the same conference is not welcome here. - Requests for ADDITIONAL references. This has been a particularly sensitive subject lately. Please try to (a) demonstrate that you have already pursued the quick, obvious routes to finding the information you desire, and (b) give people something back in return for bothering them. The easiest way to do both these things is to FIRST do the library work to find the basic references, then POST these as part of your query. Here's an example: WRONG WAY: "Can someone please mail me all references to cascade correlation?" RIGHT WAY: "I'm looking for references to work on cascade correlation. I've already read Fahlman's paper in NIPS 2, his NIPS 3 abstract, and found the code in the nn-bench archive. Is anyone aware of additional work with this algorithm? I'll summarize and post results to the list." - Announcements of job openings related to neural computation. - Short reviews of new text books related to neural computation. To send mail to everyone on the list, address it to Connectionists at CS.CMU.EDU ------------------------------------------------------------------- What NOT to post to CONNECTIONISTS: ----------------------------------- - Requests for addition to the list, change of address and other administrative matters should be sent to: "Connectionists-Request at cs.cmu.edu" (note the exact spelling: many "connectionists", one "request"). If you mention our mailing list to someone who may apply to be added to it, please make sure they use the above and NOT "Connectionists at cs.cmu.edu". - Requests for e-mail addresses of people who are believed to subscribe to CONNECTIONISTS should be sent to postmaster at appropriate-site. If the site address is unknown, send your request to Connectionists-Request at cs.cmu.edu and we'll do our best to help. 
A phone call to the appropriate institution may sometimes be simpler and faster. - Note that in many mail programs a reply to a message is automatically "CC"-ed to all the addresses on the "To" and "CC" lines of the original message. If the mailer you use has this property, please make sure your personal response (request for a Tech Report etc.) is NOT broadcast over the net. - Do NOT tell a friend about Connectionists at cs.cmu.edu. Tell him or her only about Connectionists-Request at cs.cmu.edu. This will save your friend from public embarrassment if she/he tries to subscribe. ------------------------------------------------------------------------------- The CONNECTIONISTS Archive: --------------------------- All e-mail messages sent to "Connectionists at cs.cmu.edu" starting 27-Feb-88 are now available for public perusal. A separate file exists for each month. The files' names are: arch.yymm where yymm stand for the obvious thing. Thus the earliest available data are in the file: arch.8802 Files ending with .Z are compressed using the standard unix compress program. To browse through these files (as well as through other files, see below) you must FTP them to your local machine. ------------------------------------------------------------------------------- How to FTP Files from the CONNECTIONISTS Archive ------------------------------------------------ 1. Open an FTP connection to host B.GP.CS.CMU.EDU (Internet address 128.2.242.8). 2. Login as user anonymous with password your username. 3. 'cd' directly to one of the following directories: /usr/connect/connectionists/archives /usr/connect/connectionists/bibliographies 4. The archives and bibliographies directories are the ONLY ones you can access. You can't even find out whether any other directories exist. If you are using the 'cd' command you must cd DIRECTLY into one of these two directories. Access will be denied to any others, including their parent directory. 5. The archives subdirectory contains back issues of the mailing list. Some bibliographies are in the bibliographies subdirectory. Problems? - contact us at "Connectionists-Request at cs.cmu.edu". ------------------------------------------------------------------------------- How to FTP Files from the Neuroprose Archive -------------------------------------------- Anonymous FTP on archive.cis.ohio-state.edu (128.146.8.52) pub/neuroprose directory This directory contains technical reports as a public service to the connectionist and neural network scientific community which has an organized mailing list (for info: connectionists-request at cs.cmu.edu) Researchers may place electronic versions of their preprints or articles in this directory, announce availability, and other interested researchers can rapidly retrieve and print the postscripts. This saves copying, postage and handling, by having the interested reader supply the paper. (Along this line, single spaced versions, if possible, will help!) To place a file, put it in the Inbox subdirectory, and send mail to pollack at cis.ohio-state.edu. Within a couple of days, I will move and protect it, and suggest a different name if necessary. When you announce a paper, you should consider whether (A) you want it automatically forwarded to other groups, like NEURON-DIGEST, (which gets posted to comp.ai.neural-networks) and if you want to provide (B) free or (C) prepaid hard copies for those unable to use FTP. If you do offer hard copies, be prepared for an onslaught. 
One author reported that when they allowed combination AB, the rattling around of their "free paper offer" on the worldwide data net generated over 2000 hardcopy requests! Experience dictates the preferred paradigm is to announce an FTP only version with a prominent "**DO NOT FORWARD TO OTHER GROUPS**" at the top of your announcement to the connectionist mailing list. Current naming convention is author.title.filetype[.Z] where title is enough to discriminate among the files of the same author. The filetype is usually "ps" for postscript, our desired universal printing format, but may be tex, which requires more local software than a spooler. Very large files (e.g. over 200k) must be squashed (with either a sigmoid function :) or the standard unix "compress" utility, which results in the .Z affix. To place or retrieve .Z files, make sure to issue the FTP command "BINARY" before transfering files. After retrieval, call the standard unix "uncompress" utility, which removes the .Z affix. An example of placing a file is attached as an appendix, and a shell script called Getps in the directory can perform the necessary retrival operations. For further questions contact: Jordan Pollack Assistant Professor CIS Dept/OSU Laboratory for AI Research 2036 Neil Ave Email: pollack at cis.ohio-state.edu Columbus, OH 43210 Phone: (614) 292-4890 Here is an example of naming and placing a file: gvax> cp i-was-right.txt.ps rosenblatt.reborn.ps gvax> compress rosenblatt.reborn.ps gvax> ftp archive.cis.ohio-state.edu Connected to archive.cis.ohio-state.edu. 220 archive.cis.ohio-state.edu FTP server ready. Name: anonymous 331 Guest login ok, send ident as password. Password:neuron 230 Guest login ok, access restrictions apply. ftp> binary 200 Type set to I. ftp> cd pub/neuroprose/Inbox 250 CWD command successful. ftp> put rosenblatt.reborn.ps.Z 200 PORT command successful. 150 Opening BINARY mode data connection for rosenblatt.reborn.ps.Z 226 Transfer complete. 100000 bytes sent in 3.14159 seconds ftp> quit 221 Goodbye. gvax> mail pollack at cis.ohio-state.edu Subject: file in Inbox. Jordan, I just placed the file rosenblatt.reborn.ps.Z in the Inbox. The INDEX sentence is "Boastful statements by the deceased leader of the neurocomputing field." Please let me know when it is ready to announce to Connectionists at cmu. BTW, I enjoyed reading your review of the new edition of Perceptrons! Frank ------------------------------------------------------------------------ How to FTP Files from the NN-Bench Collection --------------------------------------------- 1. Create an FTP connection from wherever you are to machine "pt.cs.cmu.edu" (128.2.254.155). 2. Log in as user "anonymous" with password your username. 3. Change remote directory to "/afs/cs/project/connect/bench". Any subdirectories of this one should also be accessible. Parent directories should not be. 4. At this point FTP should be able to get a listing of files in this directory and fetch the ones you want. Problems? - contact us at "nn-bench-request at cs.cmu.edu". From mozer at dendrite.cs.colorado.edu Sat May 1 15:42:26 1993 From: mozer at dendrite.cs.colorado.edu (Michael C. 
Mozer) Date: Sat, 1 May 1993 13:42:26 -0600 Subject: Final call for NIPS workshop proposals Message-ID: <199305011942.AA13154@neuron.cs.colorado.edu> CALL FOR PROPOSALS NIPS*93 Post-Conference Workshops December 3 and 4, 1993 Vail, Colorado Following the regular program of the Neural Information Processing Systems 1993 conference, workshops on current topics in neural information processing will be held on December 3 and 4, 1993, in Vail, Colorado. Proposals by qualified individuals interested in chairing one of these workshops are solicited. Past topics have included: active learning and control; architectural issues; attention; bayesian analysis; benchmarking neural network applications; computational complexity issues; computational neuroscience; fast training techniques; genetic algorithms; music; neural network dynamics; optimization; recurrent nets; rules and connectionist models; self- organization; sensory biophysics; speech; time series prediction; vision; and VLSI and optical implementations. The goal of the workshops is to provide an informal forum for researchers to discuss important issues of current interest. Sessions will meet in the morning and in the afternoon of both days, with free time in between for ongoing individual exchange or outdoor activities. Concrete open and/or controversial issues are encouraged and preferred as workshop topics. Individuals proposing to chair a workshop will have responsibilities including: arranging short informal presentations by experts working on the topic, moderating or leading the discussion and reporting its high points, findings, and conclusions to the group during evening plenary sessions (the "gong show"), and writing a brief (2 page) summary. Submission Procedure: Interested parties should submit a short proposal for a workshop of interest postmarked by May 22, 1993. (Express mail is *not* necessary. Submissions by electronic mail will also be accepted.) Proposals should include a title, a description of what the workshop is to address and accomplish, and the proposed length of the workshop (one day or two days). It should motivate why the topic is of interest or controversial, why it should be discussed and what the targeted group of participants is. In addition, please send a brief resume of the prospective workshop chair, a list of publications and evidence of scholarship in the field of interest. Mail submissions to: Mike Mozer NIPS*93 Workshops Chair Department of Computer Science University of Colorado Boulder, CO 80309-0430 USA (e-mail: mozer at cs.colorado.edu) Name, mailing address, phone number, fax number, and e-mail net address should be on all submissions. PROPOSALS MUST BE POSTMARKED BY MAY 22, 1993 Please Post From kanal at cs.UMD.EDU Sat May 1 23:40:06 1993 From: kanal at cs.UMD.EDU (Laveen N. Kanal) Date: Sat, 1 May 93 23:40:06 -0400 Subject: some recent papers which may be of interest to you. Message-ID: <9305020340.AA12865@nanik.cs.UMD.EDU> Laveen N. Kanal, " On pattern, categories, and alternate realities", published in Pattern Recognition Letters, vol 14, pp. 241-255, March 1993, Elsevier/North-Holland. Tbis is the text of the talk given by the author at The Hague, The Netherlands, when he received the King-Sun Fu award of the International Association for Pattern Recognition. Contents: Preamble Pattern Some sketches from the current pattern recognition scene Artificial neural networks Hybrid systems "Where's the AI?" Categorization Alternate realities Prospects Concluding remarks "Time goes, you say? Ah, no! 
Alas Time stays, we go;"

Pierre de Ronsard
The Paradox of Time (Austin Dobson, tr)

Other Recent Papers:

R. Bhatnagar & L.N. Kanal, "Structural and Probabilistic Knowledge for Abductive Reasoning", IEEE Trans. on Pattern Analysis and Machine Intelligence, special issue on Probabilistic Reasoning, March 1993.

L. Kanal & S. Raghavan, "Hybrid Systems - A Key to Intelligent Pattern Recognition", IJCNN-92, Proc. Int. Joint Conf. on Neural Networks, June 1992.

B.J. Hellstrom & L.N. Kanal, "Asymmetric Mean-Field Neural Networks for Multiprocessor Scheduling", Neural Networks, Vol. 5, pp. 671-686, May 1992.

L.N. Kanal & G.R. Dattatreya, "Pattern Recognition", in S. Shapiro (ed), Encyclopedia of Artificial Intelligence, 2nd edition, John Wiley, 1992.

R. Bhatnagar & L.N. Kanal, "Reasoning in Uncertain Domains - A Survey and Commentary", in A. Kent & J.G. Williams (eds), Encyclopedia of Computer Science and Technology, pp. 297-316 (also in Encyclopedia of Microcomputers, Marcel Dekker, Inc., 1992).

Laveen N. Kanal
Prof. of Computer Science
A.V. Williams Bldg.
Univ. of Maryland
College Park, MD 20742 USA
kanal at cs.umd.edu

From hwang at pierce.ee.washington.edu Mon May 3 15:17:00 1993
From: hwang at pierce.ee.washington.edu (Jenq-Neng Hwang)
Date: Mon, 3 May 93 12:17:00 PDT
Subject: back-propagation and projection pursuit learning
Message-ID: <9305031917.AA22668@pierce.ee.washington.edu.>

Technical Report available from neuroprose:

REGRESSION MODELING IN BACK-PROPAGATION AND PROJECTION PURSUIT LEARNING

Jenq-Neng Hwang, Shyh-Rong Lay
Information Processing Laboratory
Department of Electrical Engineering, FT-10
University of Washington, Seattle, WA 98195

and

Martin Maechler, Doug Martin, Jim Schimert
Department of Statistics, GN-22
University of Washington, Seattle, WA 98195

ABSTRACT

In this paper we study and compare two types of connectionist learning methods for model-free regression problems. One is the popular "back-propagation" learning (BPL), well known in the artificial neural networks literature; the other is the "projection pursuit" learning (PPL) that has emerged in recent years in the statistical estimation literature. Both the BPL and the PPL are based on projections of the data in directions determined from interconnection weights. However, unlike the use of fixed nonlinear activations (usually sigmoidal) for the hidden neurons in BPL, the PPL systematically approximates the unknown nonlinear activations. Moreover, the BPL estimates all the weights simultaneously at each iteration, while the PPL estimates the weights cyclically (neuron-by-neuron and layer-by-layer) at each iteration. Although the BPL and the PPL have comparable training speed when based on a Gauss-Newton optimization algorithm, the PPL proves more parsimonious in that it requires fewer hidden neurons to approximate the true function. To further improve the statistical performance of the PPL, an orthogonal polynomial approximation is used in place of the supersmoother method originally proposed for nonlinear activation approximation in the PPL.

================

To obtain copies of the postscript file, please use Jordan Pollack's service (no hardcopies will be provided):

Example:
unix> ftp archive.cis.ohio-state.edu (or ftp 128.146.8.52)
Name (archive.cis.ohio-state.edu): anonymous
Password (archive.cis.ohio-state.edu:anonymous):
ftp> cd pub/neuroprose
ftp> binary
ftp> get hwang.bplppl.ps.Z
ftp> quit
unix> uncompress hwang.bplppl.ps

Now print "hwang.bplppl.ps" as you would any other (postscript) file.
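For a concrete picture of the distinction the abstract draws, here is a minimal numpy sketch of a single projection-pursuit ridge term, y ~ g(w.x). It is not the authors' code: a low-order polynomial least-squares fit stands in for both the supersmoother and the orthogonal-polynomial scheme, and plain gradient steps stand in for the Gauss-Newton updates. The data, polynomial order and step size are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                    # inputs
y = np.tanh(X @ np.array([1.0, -2.0, 0.5]))      # target is an unknown ridge function
y += 0.05 * rng.normal(size=500)                 # observation noise

w = rng.normal(size=3)        # projection direction (the "interconnection weights")
degree = 5                    # polynomial order standing in for the supersmoother

for it in range(200):
    z = X @ w
    coef = np.polyfit(z, y, degree)              # (a) re-estimate the activation g
    resid = np.polyval(coef, z) - y
    dg = np.polyval(np.polyder(coef), z)         # g'(z), needed for the chain rule
    grad = X.T @ (resid * dg) / len(y)           # (b) gradient step on the direction w
    w -= 0.1 * grad

coef = np.polyfit(X @ w, y, degree)              # final activation for the final direction
print("estimated direction (up to scale):", w / np.linalg.norm(w))
print("rms error:", np.sqrt(np.mean((np.polyval(coef, X @ w) - y) ** 2)))

The point of contrast with BPL is that the shape of the activation g is itself re-estimated on every pass rather than held fixed at a sigmoid, which is what lets projection pursuit approximate the same function with fewer hidden units.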
From bovik at cs.utexas.edu Sat May 1 11:57:40 1993
From: bovik at cs.utexas.edu (Alan C. Bovik)
Date: Sat, 1 May 1993 10:57:40 -0500
Subject: No subject
Message-ID: <9305011557.AA16002@im4u.cs.utexas.edu>

FIRST IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING
November 13-16, 1994
Austin Convention Center, Austin, Texas, USA

PRELIMINARY CALL FOR PAPERS

Sponsored by the Institute of Electrical and Electronics Engineers (IEEE) Signal Processing Society, ICIP-94 is the inaugural international conference on theoretical, experimental and applied image processing. It will provide a centralized, high-quality forum for presentation of technological advances and research results by scientists and engineers working in Image Processing and associated disciplines such as multimedia and video technology. Also encouraged are image processing applications in areas such as the biomedical sciences and geosciences.

SCOPE:

1. IMAGE PROCESSING: Coding, Filtering, Enhancement, Restoration, Segmentation, Multiresolution Processing, Multispectral Processing, Image Representation, Image Analysis, Interpolation and Spatial Transformations, Motion Detection and Estimation, Image Sequence Processing, Video Signal Processing, Neural Networks for image processing and model-based compression, Noise Modeling, Architectures and Software.

2. COMPUTED IMAGING: Acoustic Imaging, Radar Imaging, Tomography, Magnetic Resonance Imaging, Geophysical and Seismic Imaging, Radio Astronomy, Speckle Imaging, Computer Holography, Confocal Microscopy, Electron Microscopy, X-ray Crystallography, Coded-Aperture Imaging, Real-Aperture Arrays.

3. IMAGE SCANNING, DISPLAY AND PRINTING: Scanning and Sampling, Quantization and Halftoning, Color Reproduction, Image Representation and Rendering, Graphics and Fonts, Architectures and Software for Display and Printing Systems, Image Quality, Visualization.

4. VIDEO: Digital video, Multimedia, HD video and packet video, video signal processor chips.

5. APPLICATIONS: Application of image processing technology to any field.

PROGRAM COMMITTEE:
GENERAL CHAIR: Alan C. Bovik, U. Texas, Austin
TECHNICAL CHAIRS: Tom Huang, U. Illinois, Champaign and John W. Woods, Rensselaer, Troy
SPECIAL SESSIONS CHAIR: Mike Orchard, U. Illinois, Champaign
EAST EUROPEAN LIAISON: Henri Maitre, TELECOM, Paris
FAR EAST LIAISON: Bede Liu, Princeton University

SUBMISSION PROCEDURES

Prospective authors are invited to propose papers for lecture or poster presentation in any of the technical areas listed above. To submit a proposal, prepare a 2-3 page summary of the paper including figures and references. Send five copies of the paper summaries to:

John W. Woods
Center for Image Processing Research
Rensselaer Polytechnic Institute
Troy, NY 12180-3590, USA.

Each selected paper (five-page limit) will be published in the Proceedings of ICIP-94, using high-quality paper for good image reproduction. Style files in LaTeX will be provided for the convenience of the authors.

SCHEDULE
Paper summaries/abstracts due: 15 February 1994
Notification of Acceptance: 1 May 1994
Camera-Ready papers: 15 July 1994
Conference: 13-16 November 1994

CONFERENCE ENVIRONMENT

ICIP-94 will be held in the recently completed state-of-the-art Convention Center in downtown Austin. The Convention Center is situated two blocks from Town Lake, and is only 12 minutes from Robert Mueller Airport. It is surrounded by many modern hotels that provide comfortable accommodation for $75-$125 per night.
Austin, the state capital, is renowned for its natural hill-country beauty and an active cultural scene. Within walking distance of the Convention Center are several hiking and jogging trails, as well as opportunities for a variety of aquatic sports. Live bands perform in various clubs around the city and at night spots along Sixth Street, offering a range of jazz, blues, country/Western, reggae, swing and rock music. Day temperatures are typically in the upper sixties in mid-November.

An exciting range of EXHIBITS, VENDOR PRESENTATIONS, and SOCIAL EVENTS is being planned. Innovative proposals for TUTORIALS and SPECIAL SESSIONS are invited. For further details about ICIP-94, please contact:

Conference Management Services
3024 Thousand Oaks Drive
Austin, Texas 78746
Tel: 512/327/4012; Fax: 512/327/8132
email: icip at pine.ece.utexas.edu

PRELIMINARY CALL FOR PAPERS
FIRST IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING
November 13-16, 1994
Austin Convention Center, Austin, Texas, USA

From hwang at pierce.ee.washington.edu Mon May 3 15:18:07 1993
From: hwang at pierce.ee.washington.edu (Jenq-Neng Hwang)
Date: Mon, 3 May 93 12:18:07 PDT
Subject: Markov random field modeling via neural networks
Message-ID: <9305031918.AA22673@pierce.ee.washington.edu.>

Technical Report available from neuroprose:

TEXTURED IMAGE SYNTHESIS AND SEGMENTATION VIA NEURAL NETWORK PROBABILISTIC MODELING

Jenq-Neng Hwang, Eric Tsung-Yen Chen
Information Processing Laboratory
Department of Electrical Engineering, FT-10
University of Washington, Seattle, WA 98195

ABSTRACT

It has been shown that a trained back-propagation neural network (BPNN) classifier with a Kullback-Leibler criterion produces outputs which can be interpreted as estimates of Bayesian "a posteriori" probabilities. Based on this interpretation, we propose a back-propagation neural network (BPNN) approach for the estimation of the local conditional distributions of textured images, which are commonly represented by a Markov random field (MRF) formulation. The proposed BPNN approach overcomes many of the difficulties encountered in using the MRF formulation. In particular, our approach does not require the trial-and-error selection of clique functions or the subsequent laborious and unreliable estimation of clique parameters. Simulations show that images synthesized using the BPNN modeling reproduce the desired artificial/real textures more consistently than MRF-based methods. Application of the proposed BPNN approach to segmentation of artificial and real-world textures is also presented.

================

To obtain copies of the postscript file, please use Jordan Pollack's service (no hardcopies will be provided):

Example:
unix> ftp archive.cis.ohio-state.edu (or ftp 128.146.8.52)
Name (archive.cis.ohio-state.edu): anonymous
Password (archive.cis.ohio-state.edu:anonymous):
ftp> cd pub/neuroprose
ftp> binary
ftp> get hwang.nnmrf.ps.Z
ftp> quit
unix> uncompress hwang.nnmrf.ps

Now print "hwang.nnmrf.ps" as you would any other (postscript) file.
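To illustrate the idea in the abstract, the following minimal sketch (not the authors' model; the stripe texture, layer sizes and learning rate are illustrative assumptions) trains a tiny softmax network with the cross-entropy (Kullback-Leibler) criterion to predict a binary pixel from its four neighbours, then treats the network's outputs as estimated local conditional probabilities and resamples an image Gibbs-style.

import numpy as np

rng = np.random.default_rng(1)

# toy binary "texture": noisy vertical stripes of width 2
img = (np.indices((64, 64))[1] // 2 % 2).astype(float)
img = np.where(rng.random(img.shape) < 0.05, 1 - img, img)

def neigh(a, i, j):
    # 4-neighbourhood of pixel (i, j)
    return np.array([a[i - 1, j], a[i + 1, j], a[i, j - 1], a[i, j + 1]])

# training pairs: neighbourhood -> centre pixel
pairs = [(neigh(img, i, j), int(img[i, j]))
         for i in range(1, 63) for j in range(1, 63)]
X = np.array([p[0] for p in pairs])
y = np.array([p[1] for p in pairs])

W1 = 0.1 * rng.normal(size=(4, 8)); b1 = np.zeros(8)     # one hidden layer
W2 = 0.1 * rng.normal(size=(8, 2)); b2 = np.zeros(2)     # softmax output

def forward(inp):
    h = np.tanh(inp @ W1 + b1)
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return h, p / p.sum(axis=-1, keepdims=True)

for epoch in range(300):                                  # plain batch back-propagation
    h, p = forward(X)
    d = p.copy(); d[np.arange(len(y)), y] -= 1.0; d /= len(y)   # cross-entropy gradient
    dW2, db2 = h.T @ d, d.sum(0)
    dh = (d @ W2.T) * (1.0 - h ** 2)
    dW1, db1 = X.T @ dh, dh.sum(0)
    W1 -= 2.0 * dW1; b1 -= 2.0 * db1; W2 -= 2.0 * dW2; b2 -= 2.0 * db2

# Gibbs-style synthesis: outputs serve as P(pixel = 1 | neighbours)
synth = rng.integers(0, 2, size=(64, 64)).astype(float)
for sweep in range(10):
    for i in range(1, 63):
        for j in range(1, 63):
            _, p = forward(neigh(synth, i, j))
            synth[i, j] = float(rng.random() < p[1])

Nothing in this sketch requires choosing clique functions or estimating clique parameters; the network absorbs whatever neighbourhood statistics are present in the training image, which is the practical advantage the abstract claims over a hand-specified MRF.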
From mel at cns.caltech.edu Mon May 3 15:30:55 1993
From: mel at cns.caltech.edu (Bartlett Mel)
Date: Mon, 3 May 93 12:30:55 PDT
Subject: NIPS*93: Deadline May 22
Message-ID: <9305031930.AA20897@plato.cns.caltech.edu>

******************** FINAL REMINDER, NOTE DEADLINE OF MAY 22 *****************

CALL FOR PAPERS

Neural Information Processing Systems
-Natural and Synthetic-

Monday, November 29 - Thursday, December 2, 1993
Denver, Colorado

This is the seventh meeting of an inter-disciplinary conference which brings together neuroscientists, engineers, computer scientists, cognitive scientists, physicists, and mathematicians interested in all aspects of neural processing and computation. There will be an afternoon of tutorial presentations (Nov 29) preceding the regular session, and two days of focused workshops will follow at a nearby ski area (Dec 3-4).

Major categories and examples of subcategories for paper submissions are the following:

Neuroscience: Studies and Analyses of Neurobiological Systems, Inhibition in cortical circuits, Signals and noise in neural computation, Computational and Theoretical Neurobiology, Neurophysics.

Theory: Computational Learning Theory, Complexity Theory, Dynamical Systems, Statistical Mechanics, Probability and Statistics, Approximation Theory.

Implementation and Simulation: VLSI, Optical, Software Simulators, Implementation Languages, Parallel Processor Design and Benchmarks.

Algorithms and Architectures: Learning Algorithms, Constructive and Pruning Algorithms, Localized Basis Functions, Tree Structured Networks, Performance Comparisons, Recurrent Networks, Combinatorial Optimization, Genetic Algorithms.

Cognitive Science & AI: Natural Language, Human Learning and Memory, Perception and Psychophysics, Symbolic Reasoning.

Visual Processing: Stereopsis, Visual Motion, Recognition, Image Coding and Classification.

Speech and Signal Processing: Speech Recognition, Coding, and Synthesis, Text-to-Speech, Adaptive Equalization, Nonlinear Noise Removal.

Control, Navigation, and Planning: Navigation and Planning, Learning Internal Models of the World, Trajectory Planning, Robotic Motor Control, Process Control.

Applications: Medical Diagnosis or Data Analysis, Financial and Economic Analysis, Time Series Prediction, Protein Structure Prediction, Music Processing, Expert Systems.

Technical Program: Plenary, contributed and poster sessions will be held. There will be no parallel sessions. The full text of presented papers will be published.

Submission Procedures: Original research contributions are solicited, and will be carefully refereed. Authors must submit six copies of both a 1000-word (or less) summary and six copies of a separate single-page 50-100 word abstract clearly stating their results, postmarked by May 22, 1993 (express mail is not necessary). Accepted abstracts will be published in the conference program. Summaries are for program committee use only. At the bottom of each abstract page and on the first summary page, indicate preference for oral or poster presentation and specify one of the above nine broad categories and, if appropriate, subcategories (for example: Poster, Applications-Expert Systems; Oral, Implementation-Analog VLSI). Include addresses of all authors at the front of the summary and the abstract, and indicate to which author correspondence should be addressed. Submissions that lack category information, separate abstract sheets, the required six copies, or author addresses, or that arrive late, will not be considered.
Mail Submissions To:
Gerry Tesauro
NIPS*93 Program Chair
The Salk Institute, CNL
10010 North Torrey Pines Rd.
La Jolla, CA 92037

Mail For Registration Material To:
NIPS*93 Registration
NIPS Foundation
PO Box 60035
Pasadena, CA 91116-6035

All submitting authors will be sent registration material automatically. Program committee decisions will be sent to the correspondence author only.

NIPS*93 Organizing Committee: General Chair, Jack Cowan, University of Chicago; Publications Chair, Joshua Alspector, Bellcore; Publicity Chair, Bartlett Mel, CalTech; Program Chair, Gerry Tesauro, IBM/Salk Institute; Treasurer, Rodney Goodman, CalTech; Local Arrangements, Chuck Anderson, Colorado State University; Tutorials Chair, Dave Touretzky, Carnegie-Mellon; Workshop Chair, Mike Mozer, University of Colorado; Program Co-Chairs: Larry Abbott, Brandeis Univ; Chris Atkeson, MIT; A. B. Bonds, Vanderbilt Univ; Gary Cottrell, UCSD; Scott Fahlman, CMU; Rod Goodman, Caltech; John Hertz, NORDITA/NIH; John Lazzaro, UC Berkeley; Todd Leen, OGI; Jay McClelland, CMU; Nelson Morgan, ICSI; Steve Nowlan, Salk Inst./Synaptics; Misha Pavel, NASA/OGI; Sandy Pentland, MIT; Tom Petsche, Siemens. Domestic Liaisons: IEEE Liaison, Terrence Fine, Cornell; Government & Corporate Liaison, Lee Giles, NEC Research Institute Inc.; Overseas Liaisons: Mitsuo Kawato, ATR; Marwan Jabri, University of Sydney; Gerard Dreyfus, Ecole Superieure, Paris; Alan Murray, University of Edinburgh; Andreas Meier, Simon Bolivar U.

DEADLINE FOR SUMMARIES & ABSTRACTS IS MAY 22, 1993 (POSTMARKED)

please post

From elman at crl.ucsd.edu Mon May 3 23:23:25 1993
From: elman at crl.ucsd.edu (Jeff Elman)
Date: Mon, 3 May 93 20:23:25 PDT
Subject: new books in MIT Neural Network/Connectionism series
Message-ID: <9305040323.AA15510@crl.ucsd.edu>

The following books have now appeared as part of the Neural Network Modeling and Connectionism Series, and may be of interest to readers of the connectionists mailing group. Detailed descriptions of each book, along with tables of contents, follow.

Jeff Elman

============================================================

Neural Network Modeling and Connectionism Series
Jeffrey Elman, editor. MIT Press/Bradford Books.

* Miikkulainen, R. "Subsymbolic Natural Language Processing: An Integrated Model of Scripts, Lexicon, and Memory"
* Mitchell, M. "Analogy-Making as Perception: A Computer Model"
* Cleeremans, A. "Mechanisms of Implicit Learning: Connectionist Models of Sequence Processing"
* Sereno, M.E. "Neural Computation of Pattern Motion: Modeling Stages of Motion Analysis in the Primate Visual Cortex"
* Miller, W.T., Sutton, R.S., & Werbos, P.J. (Eds.), "Neural Networks for Control"
* Hanson, S.J., & Olson, C.R. (Eds.) "Connectionist Modeling and Brain Function: The Developing Interface"
* Judd, S.J. "Neural Network Design and the Complexity of Learning"
* Mozer, M.C. "The Perception of Multiple Objects: A Connectionist Approach"

------------------------------------------------------------

New

Subsymbolic Natural Language Processing:
An Integrated Model of Scripts, Lexicon, and Memory

Risto Miikkulainen

Aiming to bridge the gap between low-level connectionist models and high-level symbolic artificial intelligence, Miikkulainen describes DISCERN, a complete natural language processing system implemented entirely at the subsymbolic level.
In DISCERN, distributed neural network models of parsing, generating, reasoning, lexical processing, and episodic memory are integrated into a single system that learns to read, paraphrase, and answer questions about stereotypical narratives. Using the DISCERN system as an example, Miikkulainen introduces a general approach to building high-level cognitive models from distributed neural networks, and shows how the special properties of such networks are useful in modeling human performance. In this approach connectionist networks are not only plausible models of isolated cognitive phenomena, but also sufficient constituents for complete artificial intelligence systems. Risto Miikkulainen is an Assistant Professor in the Department of Computer Sciences at the University of Texas, Austin. Contents: I.Overview. Introduction. Background. Overview of DISCERN. II. Processing Mechanisms. Backpropagation Networks. Developing Representations in FGREP Modules Building from FGREP Modules. III. Memory Mechanisms. Self-Organizing Feature Maps. Episodic Memory Organization: Hierarchical Feature Maps. Episodic Memory Storage and Retrieval: Trace Feature Maps. Lexicon. IV. Evaluation. Behavior of the Complete Model. Discussion. Comparison to Related Work. Extensions and Future Work. Conclusions. Appendixes: A Story Data. Implementation Details. Instructions for Obtaining the DISCERN Software. A Bradford Book May 1993 - 408 pp. - 129 illus. - $45.00 0-262-13290-7 MIISH ------------------------------------------------------------ New Analogy-Making as Perception A Computer Model Melanie Mitchell Analogy-Making as Perception is based on the premise that analogy-making is fundamentally a high-level perceptual process in which the interaction of perception and concepts gives rise to "conceptual slippages" which allow analogies to be made. It describes Copycat, developed by the author with Douglas Hofstadter, that models the complex, subconscious interaction between perception and concepts that underlies the creation of analogies. In Copycat, both concepts and high-level perception are emergent phenomena, arising from large numbers of low-level, parallel, non-deterministic activities. In the spectrum of cognitive modeling approaches, Copycat occupies a unique intermediate position between symbolic systems and connectionist systems - a position that is at present the most useful one for understanding the fluidity of concepts and high-level perception. On one level the work described here is about analogy-making, but on another level it is about cognition in general. It explores such issues as the nature of concepts and perception and the emergence of highly flexible concepts from a lower-level "subcognitive" substrate. Melanie Mitchell, Assistant Professor in the Department of Electrical Engineering and Computer Science at the University of Michigan, is a Fellow of the Michigan Society of Fellows. She is also Director of the Adaptive Computation Program at the Santa Fe Institute. Contents: Introduction. High-Level Perception, Conceptual Slippage, and Analogy-Making in a Microworld. The Architecture of Copycat. Copycat's Performance on the Five Target Problems. Copycat's Performance on Variants of the Five Target Problems. Summary of the Comparisons between Copycat and Human Subjects. Some Shortcomings of the Model. Results of Selected "Lesions" of Copycat. Comparisons with Related Work. Contributions of This Research. Afterword by Douglas R. Hofstadter. Appendixes. 
A Sampler of Letter-String Analogy Problems Beyond Copycat's Current Capabilities. Parameters and Formulas. More Detailed Descriptions of Codelet Types. A Bradford Book May 1993 - 382 pp. - 168 illus. - $45.00 0-262-13289-3 MITAH ------------------------------------------------------------ New Mechanisms of Implicit Learning Connectionist Models of Sequence Processing Axel Cleeremans What do people learn when they do not know that they are learning? Until recently all of the work in the area of implicit learning focused on empirical questions and methods. In this book, Axel Cleeremans explores unintentional learning from an information-processing perspective. He introduces a theoretical framework that unifies existing data and models on implicit learning, along with a detailed computational model of human performance in sequence-learning situations. The model, based on a simple recurrent network (SRN), is able to predict the successive elements of sequences generated from finite-state grammars. Human subjects are shown to exhibit a similar sensitivity to the temporal structure in a series of choice reaction time experiments of increasing complexity; yet their explicit knowledge of the sequence remains limited. Simulation experiments indicate that the SRN model is able to account for these data in great detail. Other architectures that process sequential material are considered. These are contrasted with the SRN model, which they sometimes outperform. Considered together, the models show how complex knowledge may emerge through the operation of elementary mechanisms - a key aspect of implicit learning performance. Axel Cleeremans is a Senior Research Assistant at the National Fund for Scientific Research, Belgium. Contents: Implicit Learning: Explorations in Basic Cognition. The SRN Model: Computational Aspects of Sequence Processing. Sequence Learning as a Paradigm for Studying Implicit Learning. Sequence Learning: Further Explorations. Encoding Remote Control. Explicit Sequence Learning. General Discussion. A Bradford Book April 1993 - 227 pp. - 60 illus. - $30.00 0-262-03205-8 CLEMH ------------------------------------------------------------ New Neural Computation of Pattern Motion Modeling Stages of Motion Analysis in the Primate Visual Cortex Margaret Euphrasia Sereno How does the visual system compute the global motion of an object from local views of its contours? Although this important problem in computational vision (also called the aperture problem) is key to understanding how biological systems work, there has been surprisingly little neurobiologically plausible work done on it. This book describes a neurally based model, implemented as a connectionist network, of how the aperture problem is solved. It provides a structural account of the model's performance on a number of tasks and demonstrates that the details of implementation influence the nature of the computation as well as predict perceptual effects that are unique to the model. The basic approach described can be extended to a number of different sensory computations. "This is an important book, discussing a significant and very general problem in sensory processing. The model presented is simple, and it is elegant in that we can see, intuitively, exactly why and how it works. Simplicity, clarity and elegance are virtues in any field, but not often found in work in neural networks and sensory processing. The model described in Sereno's book is an exception. This book will have a sizeable impact on the field." 
- James Anderson, Professor, Department of Cognitive and Linguistic Sciences, Brown University Contents: Introduction. Computational, Psychophysical, and Neurobiological Approaches to Motion Measurement. The Model. Simulation Results. Psychophysical Demonstrations. Summary and Conclusions. Appendix: Aperture Problem Linearity. A Bradford Book March 1993 - 181 pp.- 41 illus. - $24.95 0-262-19329-9 SERNH ------------------------------------------------------------ Neural Networks for Control edited by W. Thomas Miller, III, Richard S. Sutton, and Paul J. Werbos This book brings together examples of all of the most important paradigms in artificial neural networks (ANNs) for control, including evaluations of possible applications. An appendix provides complete descriptions of seven benchmark control problems for those who wish to explore new ideas for building automatic controllers. Contents: I.General Principles. Connectionist Learning for Control: An Overview, Andrew G. Barto. Overview of Designs and Capabilities, Paul J. Werbos. A Menu of Designs for Reinforcement Learning Over Time, Paul J. Werbos. Adaptive State Representation and Estimation Using Recurrent Connectionist Networks, Ronald J. Williams. Adaptive Control using Neural Networks, Kumpati S. Narendra. A Summary Comparison of CMAC Neural Network and Traditional Adaptive Control Systems, L. Gordon Kraft, III, and David P. Campagna. Recent Advances in Numerical Techniques for Large Scale Optimization, David F. Shanno. First Results with Dyna, An Integrated Architecture for Learning, Planning and Reacting, Richard S. Sutton. II. Motion Control. Computational Schemes and Neural Network Models for Formation and Control of Multijoint Arm Trajectory, Mitsuo Kawato. Vision-Based Robot Motion Planning, Bartlett W. Mel. Using Associative Content-Addressable Memories to Control Robots, Christopher G. Atkeson and David J. Reinkensmeyer. The Truck Backer-Upper: An Example of Self-Learning in Neural Networks, Derrick Nguyen and Bernard Widrow. An Adaptive Sensorimotor Network Inspired by the Anatomy and Physiology of the Cerebellum, James C. Houk, Satinder P. Singh, Charles Fisher, and Andrew G. Barto. Some New Directions for Adaptive Control Theory in Robotics, Judy A. Franklin and Oliver G. Selfridge. III. Application Domains. Applications of Neural Networks in Robotics and Automation for Manufacturing, Arthur C. Sanderson. A Bioreactor Benchmark for Adapive Network-based Process Control, Lyle H. Ungar. A Neural Network Baseline Problem for Control of Aircraft Flare and Touchdown, Charles C. Jorgensen and C. Schley. Intelligent Conrol for Multiple Autonomous Undersea Vehicles, Martin Herman, James S. Albus, and Tsai-Hong Hong. A Challenging Set of Control Problems, Charles W. Anderson and W. Thomas Miller. A Bradford Book 1990 - 524 pp. - $52.50 0-262-13261-3 MILNH ------------------------------------------------------------ Connectionist Modeling and Brain Function The Developing Interface edited by Stephen Jose Hanson and Carl R. Olson This tutorial on current research activity in connectionist-inspired biology-based modeling describes specific experimental approaches and also confronts general issues related to learning, associative memory, and sensorimotor development. "This volume makes a convincing case that data-rich brain scientists and model-rich cognitive psychologists can and should talk to one another. 
The topics they discuss together here - memory and perception - are of vital interest to both, and their collaboration promises continued excitement along this new scientific frontier." - George Miller, Princeton University Contents: Part I: Overview. Introduction: Connectionism and Neuroscience, S. J. Hanson and C. R. Olson. Computational Neuroscience, T. J. Sejnowski, C. Koch, and P. S. Churchland. Part II: Associative Memory and Conditioning. The Behavioral Analysis of Associative Learning in the Terrestrial Mollusc Limax Maximus: The Importance of Inter-event Relationships, C. L. Sahley. Neural Models of Classical Conditioning: A Theoretical Viewpoint, G. Tesauro. Unsupervised Perceptual Learning: A Paleocortical Model, R. Granger, J. Ambros-Ingerson, P. Anton, and G. Lynch. Part III. The Somatosensory System. Biological Constraints on a Dynamic Network: The Somatosensory Nervous System, T. Allard. A Model of Receptive Field Plasticity and Topographic Reorganization in the Somatosensory Cortex, L. H. Finkel. Spatial Representation of the Body, C. R. Olson and S. J. Hanson. Part IV: The Visual System. The Development of Ocular Dominance Columns: Mechanisms and Models. K. D. Miller and M. P. Stryker. Self- Organization in a Perceptual System: How Network Models and Information Theory May Shed Light on Neural Organization, R. Linsker. Solving the Brightness-From-Luminance Problem: A Neural Architecture for Invariant Brightness Perception, S. Grossberg and D. Todorovic. A Bradford Book 1990 - 423 pp. - $44.00 0-262-08193-8 HANCH ------------------------------------------------------------ Neural Network Design and the Complexity of Learning J. Stephen Judd Using the tools of complexity theory, Stephen Judd develops a formal description of associative learning in connectionist networks. He rigorously exposes the computational difficulties in training neural networks and explores how certain design principles will or will not make the problems easier. "Judd . . . formalized the loading problem and proved it to be NP-complete. This formal work is clearly explained in his book in such a way that it will be accessible both to the expert and nonexpert." - Eric B. Baum, IEEE Transactions on Neural Networks "Although this book is the true successor to Minsky and Papert's maligned masterpiece of 1969 (Perceptrons), Judd is not trying to demolish the field of neurocomputing. His purpose is to clarify the limitations of a wide class of network models and thereby suggest guidelines for practical applications." - Richard Forsyth, Artificial Intelligence & Behavioral Simulation Contents: Neural Netowrks: Hopes, Problems, and Goals. The Loading Problem. Other Studies of Learning. The Intractability of Loading. Subcases. Shallow Architectures. Memorization and Generalization. Conclusions. Appendices A Bradford Book 1990 - 150 pp. - $27.50 0-262-10045-2 JUDNH ------------------------------------------------------------ The Perception of Multiple Objects A Connectionist Approach Michael C. Mozer Building on the vision studies of David Marr and the connectionist modeling of the PDP group it describes a neurally inspired computational model of two-dimensional object recognition and spatial attention that can explain many characteristics of human visual perception. The model, called MORSEL, can actually recognize several two-dimensional objects at once (previous models have tended to blur multiple objects into one or to overload). Mozer's is a fully mechanistic account, not just a functional-level theory. 
"Mozer's work makes a major contribution to the study of visual information processing. He has developed a very creative and sophisticated new approach to the problem of visual object recognition. The combination of computational rigor with thorough and knowledgeable examination of psychological results is impressive and unique." - Harold Pashler, University of California at San Diego Contents: Introduction. Multiple Word Recognition. The Pull-Out Network. The Attentional Mechanism. The Visual Short-Term Memory. Psychological Phenomena Explained by MORSEL. Evaluation of MORSEL. Appendixes: A Comparison of Hardware Requirements. Letter Cluster Frequency and Discriminability Within BLIRNET's Training Set. A Bradford Book 1991 - 217 pp - $27.50 0-262-13270-2 MOZPH ------------------------------------------------------------- ORDER FORM Please send me the following book(s): Qty Author Bookcode Price ___ Cleeremans CLEMH 30.00 ___ Hanson HANCH 44.00 ___ Judd JUDNH 27.50 ___ Mikkulainen MIISH 45.00 ___ Miller MILNH 52.50 ___ Mitchell MITAH 45.00 ___ Mozer MOZPH 27.50 ___ Sereno SERNH 24.95 ___ Payment Enclosed ___ Purchase Order Attached Charge to my ___ Master Card ___ Visa Card# _______________________________ Exp.Date _______________ Signature _________________________________________________ _____ Total for book $2.75 Postage _____ Please add 50c postage for each additional book _____ Canadian customers Add 7% GST _____ TOTAL due MIT Press Send To: Name ______________________________________________________ Address ___________________________________________________ City ________________________ State ________ Zip __________ Daytime Phone ________________ Fax ________________________ Make checks payable and send order to: The MIT Press * 55 Hayward Street * Cambridge, MA 02142 For fastest service call (617) 625-8569 or toll-free 1-800-356-0343 The MIT Guarantee: If for any reason you are not completely satisfied, return your book(s) within ten days of receipt for a full refund or credit. 3ENET From rreilly at nova.ucd.ie Tue May 4 03:50:49 1993 From: rreilly at nova.ucd.ie (Ronan Reilly) Date: Tue, 4 May 1993 08:50:49 +0100 Subject: CforP: Workshop on NLP Message-ID: Call for Participation in the 2ND WORKSHOP ON THE COGNITIVE SCIENCE OF NATURAL LANGUAGE PROCESSING 26-27 July, 1993 Dublin City University Guest Speakers: Walter Daelemans University of Tilburg Ronan Reilly University College Dublin Attendance at the CSNLP workshop will be by invitation on the basis of a submitted paper. Those wishing to be considered should send a paper of not more than eight A4 pages to Sean O'Nuallain or Andy Way, School of Computer Applications, Dublin City University, Dublin 9, Ireland, by not later than 14 June, 1993. Notification of acceptance along with registration and accommodation details will be sent out by 25 June, 1993. Submitting authors should also send their fax number and/or e-mail address to help speed up the selection process. The particular focus of the workshop will be on the computational modelling of human natural language processing (NLP), and preference will be given to papers that present empirically supported computational models of any aspect of human NLP. An additional goal in selecting papers will be to provide coverage of a range of NLP areas. 
From hwang at pierce.ee.washington.edu Tue May 4 13:06:05 1993
From: hwang at pierce.ee.washington.edu (Jenq-Neng Hwang)
Date: Tue, 4 May 93 10:06:05 PDT
Subject: apology
Message-ID: <9305041706.AA24985@pierce.ee.washington.edu.>

We apologize that the postscript files we recently placed in Neuroprose turn out to be incompatible with some printers. We will fix these problems and reload these three reports ASAP. The three files are:

hwang.bplppl.ps.Z (back-propagation and projection pursuit learning)
hwang.nnmrf.ps.Z (probabilistic textured image modeling by neural networks)
hwang.srnn.ps.Z (mental image transformation via surface reconstruction nn)

Jenq-Neng Hwang, Assistant Professor
Information Processing Laboratory
Dept. of Electrical Engr., FT-10
University of Washington
Seattle, WA 98195
(206) 685-1603 (O), (206) 543-3842 (FAX)
hwang at ee.washington.edu

From hwang at pierce.ee.washington.edu Tue May 4 12:15:53 1993
From: hwang at pierce.ee.washington.edu (Jenq-Neng Hwang)
Date: Tue, 4 May 93 09:15:53 PDT
Subject: mental image transformation and surface reconstruction NN
Message-ID: <9305041615.AA24719@pierce.ee.washington.edu.>

Technical Report available from neuroprose:

MENTAL IMAGE TRANSFORMATION AND MATCHING USING SURFACE RECONSTRUCTION NEURAL NETWORKS

Jenq-Neng Hwang, Yen-Hao Tseng
Information Processing Laboratory
Department of Electrical Engineering, FT-10
University of Washington, Seattle, WA 98195

ABSTRACT

Invariant 2-D/3-D object recognition and motion estimation under detection/occlusion noise and/or partial object viewing are difficult pattern recognition tasks. On the other hand, the biological neural networks of humans are extremely adept at these tasks. Studies in experimental psychology suggest that humans match rotated and scaled shapes by gradually mentally rotating and scaling one of the shapes into the orientation and size of the other and then testing for a match. Motivated by these studies, we present a novel and robust neural network solution for these tasks based on detected surface boundary data or range data. The method operates in two stages: the object is first parametrically represented by a surface reconstruction neural network (SRNN) trained on the boundary points sampled from the exemplar object. When later presented with boundary points sampled from the distorted object without point correspondence, this parametric representation allows the mismatch information to back-propagate through the SRNN to gradually determine (align) the best similarity transform of the distorted object. A distance measure can then be computed in the reconstructed representation domain between the surface-reconstructed exemplar object and the aligned distorted object. Applications to invariant 2-D target classification and 3-D object motion estimation using sparse range data collected from a single aspect view are presented.

================

To obtain copies of the postscript file, please use Jordan Pollack's service (no hardcopies will be provided):

Example:
unix> ftp archive.cis.ohio-state.edu (or ftp 128.146.8.52)
Name (archive.cis.ohio-state.edu): anonymous
Password (archive.cis.ohio-state.edu:anonymous):
ftp> cd pub/neuroprose
ftp> binary
ftp> get hwang.srnn.ps.Z
ftp> quit
unix> uncompress hwang.srnn.ps

Now print "hwang.srnn.ps" as you would any other (postscript) file.
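To make the two-stage procedure in the abstract concrete, here is a minimal sketch. It is not the authors' SRNN: a linear least-squares fit on trigonometric features stands in for the trained reconstruction network, and a nearest-reconstructed-point assignment per iteration stands in for back-propagating the mismatch through the network itself. An exemplar boundary is fit parametrically, then a rotated, scaled and shifted copy, sampled without point correspondence, is aligned by gradient steps on the similarity-transform parameters. The ellipse, feature set and step sizes are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(2)

# exemplar boundary: an ellipse parameterised by t
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
exemplar = np.c_[2.0 * np.cos(t), 1.0 * np.sin(t)]

# stage 1: reconstruct the boundary from samples (stand-in for the trained SRNN)
def feats(u):
    return np.c_[np.ones_like(u), np.cos(u), np.sin(u), np.cos(2 * u), np.sin(2 * u)]
C, *_ = np.linalg.lstsq(feats(t), exemplar, rcond=None)
recon = feats(np.linspace(0, 2 * np.pi, 2000)) @ C        # dense reconstructed boundary

# the "distorted" object: rotated, scaled, shifted samples, no correspondence
ang, s_true, shift = 0.7, 1.3, np.array([0.5, -0.2])
R = np.array([[np.cos(ang), -np.sin(ang)], [np.sin(ang), np.cos(ang)]])
distorted = s_true * exemplar[rng.permutation(200)[:120]] @ R.T + shift

# stage 2: gradient descent on (log-scale, angle, shift) of the aligning transform
ls, th, sh = 0.0, 0.0, np.zeros(2)
for it in range(400):
    Ra = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
    aligned = np.exp(ls) * distorted @ Ra.T + sh
    nn = recon[((aligned[:, None, :] - recon[None]) ** 2).sum(-1).argmin(1)]
    err = aligned - nn                                     # mismatch to the reconstruction
    dRa = np.array([[-np.sin(th), -np.cos(th)], [np.cos(th), -np.sin(th)]])
    g_ls = np.mean(np.sum(err * (aligned - sh), axis=1))
    g_th = np.mean(np.sum(err * (np.exp(ls) * distorted @ dRa.T), axis=1))
    g_sh = err.mean(axis=0)
    ls -= 0.05 * g_ls; th -= 0.05 * g_th; sh -= 0.2 * g_sh

print("recovered inverse scale and rotation:", np.exp(ls), th)
print("mean alignment error:", np.sqrt((err ** 2).sum(1)).mean())

In the SRNN formulation the mismatch gradient flows through the trained network itself rather than through an explicit nearest-point search; the sketch only shows how a fixed parametric reconstruction lets uncorresponded boundary points drive the similarity-transform estimate, after which a distance in the reconstructed domain can score the match.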
From skalsky at aaai.org Tue May 4 14:31:56 1993 From: skalsky at aaai.org (Rick Skalsky) Date: Tue, 4 May 93 11:31:56 PDT Subject: AAAI Spring Symposium Series 1994 Call for Proposals Message-ID: <9305041831.AA01242@aaai.org> 1994 Spring Symposium Series Call for Proposals AAAI invites proposals for the 1994 Spring Symposium Series, to be held at Stanford University, March 21-23, 1994. The Spring Symposium Series is a yearly set of symposia, designed to bring colleagues together in small, intimate forums. There will be about eight symposia on various topics in the 1994 Spring Symposium Series. All symposia will be limited in size. The symposia will run in parallel for two and one-half days. The symposia will allow for presentation of speculative work and work in progress, as well as completed work. Ample discussion time will be scheduled in each symposium. Working notes will be prepared, and distributed to the participants. Chairs can determine whether the working notes of their symposia will be available as AAAI Technical Reports following the meeting. Most participants of the symposia will be selected on the basis of statements of interest or abstracts submitted to the symposia chairs; some open registration will be allowed. Participants will be expected to attend a single symposium. Proposals for symposia should be between two and five pages in length, and should contain: - A title for the symposium - A description of the symposium, identifying specific areas of interest - Evidence that the symposium is of interest at this time--such as a completed, successful one-day workshop on a related topic - The names and addresses of the organizing committee, preferably three or four people at different sites, all of whom have agreed to serve on the committee - A list of several potential participants. Ideally, the entire organizing committee should collaborate in producing the proposal. If possible, a draft proposal should be sent out to a few of the potential participants and their comments solicited. All proposals will be reviewed by the AAAI Symposium Committee (cochairs: Lynn Andrea Stein, MIT; and Jim Hendler, University of Maryland). The criteria for acceptance of proposals include: - An appropriate level of perceived interest in the topic of the symposium among AAAI members. (Symposia proposals that appear to be too popular to fit in the size constraints should be turned into regular AAAI workshops.) - No long-term ongoing series of activities in the particular topic. (The Spring Symposium Series serves more to nurture interest in particular topics than to maintain it over a number of years.) The existence of activities in related and more-general topics will help to indicate the level of interest in the particular topic. - An appropriate organizing committee. - Accepted proposals will be distributed as widely as possible over the subfields of AI, and balanced between theoretical and applied topics. Symposia bridging theory and practice are particularly solicited. Symposium proposals should be submitted as soon as possible, but no later than June 7, 1993. Proposals that are submitted significantly before this deadline can be in draft form. Comments on how to improve and complete the proposal will be returned to the submitter in time for revisions to be made before the deadline. Notifications of acceptance or rejection will be sent to submitters around June 21, 1993. The submitters of accepted proposals will become the chair of the symposium, unless alternative arrangements are made. 
The symposium organizing committees will be responsible for: - Producing, in conjunction with the general chair, a Call for Participation for the symposium, which will be published in the AI Magazine - Reviewing requests to participate in the symposium and determining symposium participants - Preparing working notes for the symposium - Scheduling the activities of the symposium - Preparing a short review of the symposium, to be printed in the AI Magazine. AAAI will provide logistical support, will take care of all local arrangements, and will arrange for reproducing and distributing the working notes. Please submit (preferably by electronic mail) your symposium proposals, and inquiries concerning symposia, to both of the chairs: Jim Hendler (hendler at cs.umd.edu) Department of Computer Science University of Maryland AV Williams Building College Park, MD 20742 USA Lynn Andrea Stein (las at ai.mit.edu) AI Laboratory Massachusetts Institute of Technology 545 Technology Square #811 Cambridge, MA 02139 USA From mwitten at hermes.chpc.utexas.edu Tue May 4 15:30:32 1993 From: mwitten at hermes.chpc.utexas.edu (mwitten@hermes.chpc.utexas.edu) Date: Tue, 4 May 93 14:30:32 CDT Subject: WORLD CONGRESS ON COMPUTATIONAL MEDICINE<-CFPP Message-ID: <9305041930.AA01085@morpheus.chpc.utexas.edu> [] ***** CALL FOR PAPERS AND PARTICIPATION ***** [] FIRST WORLD CONGRESS ON COMPUTATIONAL MEDICINE AND PUBLIC HEALTH 24-28 April 1994 Hyatt Regency Hotel Austin, Texas compmed94 at chpc.utexas.edu (this notice may be reposted/cross posted/circulated) ------------------------------------------------------------------------ *Conference Chair: Matthew Witten, UT System Center For High Performance Computing, Austin, Texas - m.witten at chpc.utexas.edu *Conference Directorate: Regina Monaco, Mt. Sinai Medical Center * Dan Davison, University of Houston * Chris Johnson, University of Utah * Lisa Fauci, Tulane University * Daniel Zelterman, University of Minnesota Minneapolis * James Hyman, Los Alamos National Laboratory * Richard Hart, Tulane University * Dennis Duke, SCRI-Florida State University * Sharon Meintz, University of Nevada Los Vegas * Dean Sittig, Vanderbilt University * Dick Tsur, World Bank and UT System CHPC * Dan Deerfield, Pittsburgh Supercomputing Center * Istvan Gyori, Szeged University School of Medicine Computing Center *Conference Theme: The appearance of high-performance computing environments has greatly enhanced the capabilities of the biomedical modeler. With increasing frequency, computational sciences are being exploited as a means with which to investigate biomedical processes at all levels of complexity, from molecular to systemic to demographic. The emergence of an increasing number of players in this field has lead to the subsequent emergence of a new transdisciplinary field which we call Computational Medicine and Public Health. The purpose of this congress is to bring together a transdisciplinary group of researchers in medicine, public health, computer science, mathematics, nursing, veterinary medicine, ecology, allied health, as well as numerous other disciplines, for the purposes of examining the grand challenge problems of the next decades. Young scientists are encouraged to attend and to present their work in this increasingly interesting discipline. Funding is being solicited from NSF, NIH, DOE, Darpa, EPA, and private foundations, as well as other sources to assist in travel support and in the offsetting of expenses for those unable to attend otherwise. 
Papers, poster presentations, tutorials, focussed topic workshops, birds of a feather groups, demonstrations, and other suggestions are solicited in, but are not limited to, the following areas: *Visualization/Sonification --- medical imaging --- molecular visualization as a clinical research tool --- simulation visualization --- microscopy --- visualization as applied to problems arising in computational molecular biology and genetics or other non-traditional disciplines *Computational Molecular Biology and Genetics --- computational ramifications of clinical needs in the Human Genome, Plant Genome, and Animal Genome Projects --- computational and grand challenge problems in molecular biology and genetics --- algorithms and methodologies --- issues of multiple datatype databases *Computational Pharmacology, Pharmacodynamics, Drug Design *Computational Chemistry as Applied to Clinical Issues *Computational Cell Biology, Physiology, and Metabolism --- Single cell metabolic models (red blood cell) --- Cancer models --- Transport models --- Single cell interaction with external factors models (laser, ultrasound, electrical stimulus) *Computational Physiology and Metabolism --- Renal System --- Cardiovascular dynamics --- Liver function --- Pulmonary dynamics --- Auditory function, cochlear dynamics, hearing --- Reproductive modeling: ovarian dynamics, reproductive ecotoxicology, modeling the hormonal cycle --- Metabolic Databases and metabolic models *Computational Demography, Epidemiology, and Statistics/Biostatistics --- Classical demographic, epidemiologic, and biostatistical modeling --- Modeling of the role of culture, poverty, and other sociological issues as they impact healthcare *Computational Disease Modeling --- AIDS --- TB --- Influenza --- Other *Computational Biofluids --- Blood flow --- Sperm dynamics --- Modeling of arteriosclerosis *Computational Dentistry, Orthodontics, and Prosthetics *Computational Veterinary Medicine --- Computational issues in modeling non-human dynamics such as equine, feline, canine dynamics (physiological/biomechanical) *Computational Allied Health Sciences --- Physical Therapy --- Neuromusic Therapy --- Respiratory Therapy *Computational Radiology --- Dose modeling --- Treatment planning *Computational Surgery --- Simulation of surgical procedures in VR worlds --- Surgical simulation as a precursor to surgical intervention *Computational Cardiology *Computational Neurobiology and Neurophysiology --- Brain modeling --- Single neuron models --- Neural nets and clinical applications --- Neurophysiological dynamics --- Neurotransmitter modeling --- Neurological disorder modeling (Alzheimer's Disease, for example) *Computational Biomechanics --- Bone Modeling --- Joint Modeling *The role of alternate reality methodologies and high performance environments in the medical and public health disciplines *Issues in the use of high performance computing environments in the teaching of health science curricula *The role of high performance environments for the handling of large medical datasets (high performance storage environments, high performance networking, high performance medical records manipulation and management, metadata structures and definitions) *Federal and private support for transdisciplinary research in computational medicine and public health *Contact: To contact the congress organizers for any reason use any of the following Electronic Mail - compmed94 at chpc.utexas.edu Fax (USA) - (512) 471-2445 Phone (USA) - (512) 471-2472 Compmed 1994
University of Texas System CHPC Balcones Research Center, 1.154CMS 10100 Burnet Road Austin, Texas 78758-4497 *Submission Procedures: Authors must submit 5 copies of a single-page 50-100 word abstract clearly discussing the topic of their presentation. In addition, authors must clearly state their choice of poster, contributed paper, tutorial, exhibit, focussed workshop or birds of a feather group along with a discussion of their presentation. Abstracts will be published as part of the preliminary conference material. To notify the congress organizing committee that you would like to participate and to be put on the congress mailing list, please fill out and return the form that follows this announcement. You may use any of the contact methods above. *Conference Deadlines: The following deadlines should be noted: 1 October 1993 - Notification of interest in participation 1 November 1993 - Abstracts for talks/posters/workshops/birds of a feather sessions/demonstrations 15 January 1994 - Notification of acceptance of abstract 15 February 1994 - Application for financial aid ============================= INTENT TO PARTICIPATE ========================== First Name: Middle Initial (if available): Family Name: Your Professional Title: [ ]Dr. [ ]Professor [ ]Mr. [ ]Mrs. [ ]Ms. [ ]Other:__________________ Office Phone (desk): Office Phone (message): Home/Evening Phone (for emergency contact): Fax: Electronic Mail (Bitnet): Electronic Mail (Internet): Postal Address: Institution or Center: Building Code: Mail Stop: Street Address1: Street Address2: City: State: Country: Zip or Country Code: Please list your three major interest areas: Interest1: Interest2: Interest3: =================================================================== From jramire at conicit.ve Tue May 4 23:47:38 1993 From: jramire at conicit.ve (Jose Ramirez G. (AVINTA)) Date: Tue, 4 May 93 23:47:38 AST Subject: Workshop on ANN Message-ID: <9305050347.AA27626@dino.conicit.ve> ************************************************************** Call For Panelists and Call For Participation Panel on "Research directions and applications of Artificial Neural Networks" The second World Congress on Expert Systems will be held in Lisbon, Portugal, 10-14 January 1994. During the congress a panel focused on "Research directions and applications of Artificial Neural Networks" will be conducted. Panelist proposals are requested, according to the following: 1. 5 or 6 panelists will be selected. The panel will have presentations of 10 min. per panelist, plus a questions and answers period of 30 min. 2. Proposals must include a brief vitae (10 lines) of the panelist and a description of the topic to be addressed during the panel (5 lines). 3. Proposals must be sent by e-mail or fax to: Jose Ramirez email: jramire at conicit.ve fax : +58-2-2832689 4. The proposals must be received by May 28, 1993. 5. The selected panelists must fill out a registration form for the congress (at a reduced fee) and confirm their participation in the event. ************************************************************* From hwang at pierce.ee.washington.edu Wed May 5 12:50:13 1993 From: hwang at pierce.ee.washington.edu (Jenq-Neng Hwang) Date: Wed, 5 May 93 09:50:13 PDT Subject: three technical reports available Message-ID: <9305051650.AA28307@pierce.ee.washington.edu.> We have fixed the postscript printing problems and reloaded the three reports in neuroprose.
These three files are now available: hwang.bplppl.ps.Z (back-propagation and projection pursuit learning) hwang.nnmrf.ps.Z (probabilistic textured image modeling by neural networks) hwang.objrec.ps.Z (single spaced) or hwang.srnn.ps.Z (double spaced) (mental image transformation via surface reconstruction neural nets) Jenq-Neng Hwang, Assistant Professor Information Processing Laboratory Dept. of Electrical Engr., FT-10 University of Washington Seattle, WA 98915 (206) 685-1603 (O), (206) 543-3842 (FAX) hwang at ee.washington.edu From robtag at udsab.dia.unisa.it Wed May 5 12:01:02 1993 From: robtag at udsab.dia.unisa.it (Tagliaferri Roberto) Date: Wed, 5 May 1993 18:01:02 +0200 Subject: WIRN 93 Programme Message-ID: <199305051601.AA21642@udsab.dia.unisa.it> Istituto Internazionale Alti Studi Scientifici (IIASS) Dipartimento di Fisica Teorica, Universita` di Salerno Dipartimento di Informatica ed Applicazioni, Universita` di Salerno Dipartimento di Scienze dell'Informazione, Universita` di Milano Istituto per la Ricerca dei Sistemi Informatici Paralleli, C.N.R., Napoli Societa` Italiana Reti Neuroniche (SIREN) 6th ITALIAN WORKSHOP ON NEURAL NETWORKS WIRN VIETRI-93 IIASS Research Center Ph: +39 89 761167 FAX:+39 89 761189 Vietri Sul Mare, Salerno, May 12-14, 1993 PRELIMINARY PROGRAM Wednesday 12 9:00 Opening of the Workshop 9:30 S. Gielen (Invited Lecture) 11:00 Coffee break 11:30 Formal Models and Pattern Recognition G. Basti, V. Bidoli et al. "Particle recognition on experimental data in a silicon calorimeter by back propagation with stochastic pre-processing" A. Borghese "Learning optimal control using neural networks" S. Brofferio, V. Rampa "A supervised-ART neural network for pattern recognition" P. Pedrazzi "On self-organizing neural character recognizers" V. Sanguineti, P. Morasso "Models of cortical maps" L. Stringa "Experiments in memory-based learning" 13:00 Lunch break 15:00 Prof. Tredici (Review Lecture on Progresses in Neuroanatomy) 16:00 Applications (1st part) E. Coccorese, R. Martone, C. Morabito "Classification of plasma equilibria in a tokamak using a three-level propagation network" E.D. Di Claudio, G. Trivelloni, G. Orlandi "Model identification of non linear dynamical systems by recurrent neural networks" P. Morasso, A. Pareto, S. Pagliano, V. Sanguineti "A self-organizing approach for diagnostic problems" 17:00 Coffee Break 17:30 Hybrid and Robotic Systems A. Chella, U. Maniscalco, R. Pirrone, F. Sorbello, P. Storniolo "A shape from shading hybrid approach to estimate superquadric parameters" Z.M. Kovacs-V., R. Guerrieri, G. Baccarini "A hybrid system for handprinted character recognition" A. Sperduti, A. Starita "Modular neural codes implementing neural trees" Thursday 13 9:00 L. Zadeh (Invited Lecture) 11:00 Coffee Break 11:30 Fuzzy neural systems E. Binaghi, A. Mazzetti, R. Orlando, A. Rampini "Integration of fuzzy reasoning techniques in the error back propagation learning algorithm" M. Costa, E. Pasero "FNC: a fuzzy neural classifier with bayesian engine" Zhiling Wiang, G. Sylos Labini "A self-organizing network of alterable competitive layer for pattern cluster" 13:00 Lunch Break 15:00 V. Cimagalli (Review Lecture on Cellular Networks) 16:00 - 17:30 Poster and Industrial Sessions 17:30 SIREN Annual Meeting Friday 14 9:00 Y. Bengio, P. Frasconi and M. Gori (Review Lecture on Recurrent Networks for Adaptive Temporal Reasoning) 10:00 Applications (2nd part) S. Cavalieri, A. Fichera "Exploiting neural network features to model and analyze noise pollution" A.M. 
Colla, N. Longo, G. Morgavi, S. Ridella "SBP: A hybrid neural model for pattern recognition" F. Piglione, G. Cirrincione "Neural-net based load-flow models for electric power systems" 11:00 Coffee Break 11:30 Hardware and Software Design A. d'Acierno, R. Vaccaro "The back-propagation learning algorithm on parallel computers: a mapping scheme" M. Gioiello, G. Vassallo, F. Sorbello "A new fully digital feed-forward network for hand-written digits recognition" F. Lauria, M. Sette "CONNET: a neural network configuration language" P. Wilke "Simulation of neural networks in a distributed computing environment using Neuro Graph" 13:00 Lunch Break 15:00 Architectures and Algorithms M. Alberti, P. Marelli, R. Posenato " A neural algorithm for the maximum satisfiability problem" E. Alpaydin "Multiple networks for function learning" D. Micci Barreca, G.C. Buttazzo "A neural architecture for failure-based learning" M. Schmitt "On the size of weights for McCulloch-Pitts neurons" Registration fee 275.000 Italian Liras (including proceedings and social dinner). No fees to be payed for students. From joe at cogsci.edinburgh.ac.uk Fri May 7 06:15:11 1993 From: joe at cogsci.edinburgh.ac.uk (Joe Levy) Date: Fri, 07 May 93 11:15:11 +0100 Subject: 2nd Neural Computation and Psychology Workshop: Language and Memory Message-ID: <8959.9305071015@galloway.cogsci.ed.ac.uk> 2nd Neural Computation and Psychology Workshop: Language and Memory. University of Edinburgh 10th-13th September Preliminary Call for Participation Following on from last year's very successful workshop on "Neurodynamics and Psychology" at Bangor University, it has been suggested that a workshop on some aspect of connectionist modelling in psychology should be held in the UK every year. This year the Connectionism and Cognition Research Group at the University of Edinburgh will host a workshop under the general theme of "language and memory" in Edinburgh between Friday 10th and Monday 13th September. We are currently preparing a program and will post details as soon as possible. The main sessions are likely to include memory, speech processing and reading. The workshop will be single track with a small number of invited speakers. Attendance will be limited to 50 people to allow ample time for discussion. For further details contact: Joe Levy Phone: +44 31 650 4450 | University of Edinburgh Fax: +44 31 650 4587 | Human Communication Research ARPA: joe%cogsci.ed.ac.uk at nsfnet-relay.ac.uk | Centre, 2 Buccleuch Place JANET: joe at uk.ac.ed.cogsci | Edinburgh EH8 9LW Scotland From fritzke at ICSI.Berkeley.EDU Fri May 7 17:43:31 1993 From: fritzke at ICSI.Berkeley.EDU (Bernd Fritzke) Date: Fri, 7 May 93 14:43:31 PDT Subject: three new papers in neuroprose Message-ID: <9305072143.AA22277@icsib14.ICSI.Berkeley.EDU> *** DO NOT FORWARD TO ANY OTHER LISTS *** The following technical reports have been placed in the neuroprose directory (ftp instructions follow the abstracts). For two of the TR's also hardcopies are available. Instructions are at the end of the posting. Comments and questions are welcome. Thanks to Jordan Pollack for maintaining the neuroprose archive. 
-Bernd International Computer Science Institute 1947 Center Street, Suite 600 Berkeley, CA 94704-1105 USA ------------------------------------------------------------ Growing Cell Structures - A Self-organizing Network for Unsupervised and Supervised Learning *) Bernd Fritzke ICSI, Berkeley TR-93-026 (34 pages) *) submitted for publication We present a new self-organizing neural network model having two variants. The first variant performs unsupervised learning and can be used for data visualization, clustering, and vector quantization. The main advantage over existing approaches, e.g., the Kohonen feature map, is the ability of the model to automatically find a suitable network structure and size. This is achieved through a controlled growth process which also includes occasional removal of units. The second variant of the model is a supervised learning method which results from the combination of the abovementioned self-organizing network with the radial basis function (RBF) approach. In this model it is possible - in contrast to earlier approaches - to perform the positioning of the RBF units and the supervised training of the weights in parallel. Therefore, the current classification error can be used to determine where to insert new RBF units. This leads to small networks which generalize very well. Results on the two-spirals benchmark and a vowel classification problem are presented which are better than any results previously published. ------------------------------------------------------------ Vector Quantization with a Growing and Splitting Elastic Net *) Bernd Fritzke ICSI, Berkeley (6 pages) *) to be presented at ICANN-93, Amsterdam A new vector quantization method is proposed which generates codebooks incrementally. New vectors are inserted in areas of the input vector space where the quantization error is especially high until the desired number of codebook vectors is reached. A one-dimensional topological neighborhood makes it possible to interpolate new vectors from existing ones. Vectors not contributing to error minimization are removed. After the desired number of vectors is reached, a stochastic approximation phase fine tunes the codebook. The final quality of the codebooks is exceptional. A comparison with two well-known methods for vector quantization was performed by solving an image compression problem. The results indicate that the new method is significantly better than both other approaches. ------------------------------------------------------------ Kohonen Feature Maps and Growing Cell Structures -- a Performance Comparison *) Bernd Fritzke ICSI, Berkeley (8 pages) *) to appear in Advances in Neural Information Processing Systems 5 C.L. Giles, S.J. Hanson, and J.D. Cowan (eds.), Morgan Kaufmann, San Mateo, CA, 1993 A performance comparison of two self-organizing networks, the Kohonen Feature Map and the recently proposed Growing Cell Structures is made. For this purpose several performance criteria for self-organizing networks are proposed and motivated. The models are tested with three example problems of increasing difficulty. The Kohonen Feature Map demonstrates slightly superior results only for the simplest problem. For the other more difficult and also more realistic problems the Growing Cell Structures exhibit significantly better performance by every criterion. Additional advantages of the new model are that all parameters are constant over time and that size as well as structure of the network are determined automatically.
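For readers who want to see the growth idea from the first two abstracts in concrete form, here is a minimal sketch (Python with numpy) of incremental codebook growth: accumulate quantization error per unit and periodically insert a new unit next to the unit with the largest accumulated error. This is not Fritzke's code; the function name, parameter values, and the simplified nearest-unit neighborhood below are assumptions made purely for illustration, and the actual Growing Cell Structures model additionally maintains a cell topology and removes superfluous units.

import numpy as np

def grow_codebook(data, n_units=20, lam=100, eps_b=0.05, eps_n=0.006, seed=0):
    # Illustrative sketch only: grow a codebook by inserting units where
    # the accumulated quantization error is largest.
    rng = np.random.default_rng(seed)
    units = data[rng.choice(len(data), size=2, replace=False)].astype(float)
    err = np.zeros(2)
    t = 0
    while True:
        x = data[rng.integers(len(data))]
        d = np.linalg.norm(units - x, axis=1)
        b, s = np.argsort(d)[:2]              # winner and runner-up
        err[b] += d[b] ** 2                   # accumulate local quantization error
        units[b] += eps_b * (x - units[b])    # move winner toward the input
        units[s] += eps_n * (x - units[s])    # move runner-up slightly
        t += 1
        if t % lam == 0:                      # periodic growth step
            if len(units) >= n_units:
                return units
            q = int(np.argmax(err))           # unit with the largest error
            dist = np.linalg.norm(units - units[q], axis=1)
            dist[q] = np.inf
            f = int(np.argmin(dist))          # its nearest other unit (assumed neighborhood)
            units = np.vstack([units, 0.5 * (units[q] + units[f])])
            err[q] *= 0.5                     # share the error with the new unit
            err = np.append(err, err[q])

# toy usage: quantize a mixture of two Gaussian clouds into 20 units
data = np.vstack([np.random.randn(500, 2), np.random.randn(500, 2) + 5.0])
codebook = grow_codebook(data)

In the supervised variant described in the first abstract, the same insertion step would be driven by accumulated classification error rather than quantization error.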
************************* ftp instructions ********************** If you have the Getps script unix> Getps fritzke.tr93-26.ps.Z unix> Getps fritzke.icann93.ps.Z unix> Getps fritzke.nips92.ps.Z (Getps ftp's the named file, decompresses it, and asks whether to print it) otherwise do first the following (to get Getps) unix> ftp archive.cis.ohio-state.edu (or ftp 128.146.8.52) Connected to archive.cis.ohio-state.edu. 220 archive.cis.ohio-state.edu FTP server ready. Name: anonymous 331 Guest login ok, send ident as password. Password: 230 Guest login ok, access restrictions apply. ftp> cd pub/neuroprose 250 CWD command successful. ftp> get Getps 200 PORT command successful. 150 Opening BINARY mode data connection for Getps (2190 bytes). 226 Transfer complete. ftp> quit 221 Goodbye. ************************* hardcopies **************************** The NIPS92 paper and the 34-page paper have appeared as ICSI technical reports TR-93-025 and TR-93-026, respectively. Hardcopies are available for a small charge for postage and handling. For details please contact Vivian Balis (balis at icsi.berkeley.edu) at ICSI. From pollack at cis.ohio-state.edu Fri May 7 11:44:36 1993 From: pollack at cis.ohio-state.edu (Jordan B Pollack) Date: Fri, 7 May 93 11:44:36 -0400 Subject: Tweaking NEUROPROSE Message-ID: <9305071544.AA07540@dendrite.cis.ohio-state.edu> *****do not forward to other groups***** Good People, There are problems of scale with NEUROPROSE, and no resources to fix them properly. Therefore, after great thought about the laws of unintended consequences, and with no insult intended to recent articles, I am hereby tweaking the practices of NEUROPROSE, and I trust you will all go along with me eventually: 1. No more multiple daily submissions; NEUROPROSE is supposed to be for relevant preprints, not a vanity press or a medium for the distribution of life works or annual reports. 2. Make sure your paper is single-spaced, even as a draft, so as to save paper. 3. Please announce the NUMBER OF PAGES with the announcement, so people are not surprised by empty laser printer trays. In your request to me, it would help to have a formatted INDEX entry with the page count as well (see appendix). 4. Before announcing, have a friend at another institution retrieve and print the file, so as to avoid easily found local postscript library errors. Lots of resources are wasted when the files do not print. 5. Add the following two lines to your mail header, or the top of your message, so as to facilitate the development of mailer scripts and macros which can automatically retrieve files from both NEUROPROSE and other lab-specific repositories (Thanks to Dave Plaut's sense of humor): FTP-host: archive.cis.ohio-state.edu FTP-filename: /pub/neuroprose/filename.ps.Z 6. Finally, unless you are posting a file with non-standard ftp arrangements, like a tar.Z file, leave the instructions off, as everyone knows at this point how to get and uncompress and print a postscript file! I have amended the README file to this effect. Please send comments to me for discussion, rather than the whole mailing list. Thanks. Jordan Pollack Assistant Professor CIS Dept/OSU Laboratory for AI Research 2036 Neil Ave Email: pollack at cis.ohio-state.edu Columbus, OH 43210 Phone: (614)292-4890 (then * to fax) From ptodd at spo.rowland.org Sun May 9 16:31:21 1993 From: ptodd at spo.rowland.org (Peter M.
Todd) Date: Sun, 9 May 93 16:31:21 EDT Subject: CFP: music/creativity issue of Connection Science Message-ID: <9305092031.AA01937@spo.rowland.org> **** PLEASE DISTRIBUTE **** MUSIC AND CREATIVITY Call for Papers for a Special Issue of Connection Science Over the last few years there has been a vertiginous growth in the connectionist exploration of many domains, including music. Music has traditionally been one of the least studied areas of cognition, in part because of the complexity of musical phenomena and their language-like connections between many levels and modalities of thought. But the application of network-based computational techniques to aspects of musicality and creativity has resulted in a variety of illuminating models. The time now seems right for an overview of the agenda being followed by connectionists in this area, the articulation of the central issues in the field, and a forum for the discussion of future directions. To this end, we are inviting papers covering the whole field of connectionist modelling of music, arts, and creativity for a special issue of the journal Connection Science. Papers may be either empirical or theoretical, but must communicate predominantly unpublished ideas. We are particularly interested in receiving work in the following areas (although we emphasize music here, other areas of creativity and artistic endeavour may be substituted): 1. The limits and possibilities for connectionism in modelling creativity. 2. Modelling cognitive aspects of music: meter, rhythm, tonality, harmony and melody. 3. The use of neural networks in creating pieces of music, choreography, visual art, etc. 4. Modelling the integration of lower- and higher-level musical knowledge, including hierarchical representations. 5. The representation of intermodal relationships between musical dimensions, e.g. tonality and rhythm. 6. Developmental models of musical cognition. 7. Psychoacoustic models underlying categorical pitch and other musical phenomena. 8. Models of auditory streaming, attention, phrasing, and grouping. 9. Connectionist models of timbre. 10. Models of cross-cultural differences or universals. 11. Comparative models of music and language. 12. The use of sequential, recurrent, predictive, and chaotic network models for creative phenomena. 13. Cognitive neuroscience models of musical phenomena. We are particularly interested in stimulating discussion with this special issue of the present and future of this field, and papers should explore the importance of issues raised by the research as broadly as possible. An awareness of the cognitive plausibility and implications of the ideas presented is also essential. Requirements for Submission All papers will be rigorously refereed. Guidelines for submission of papers to Connection Science can be found in issues of the Journal and are also available from lyn at dcs.exeter.ac.uk (or by mail from Lyn Shackleton, University of Exeter, address as below). Authors are encouraged to contact the editors with any questions about proposed papers or the relevance of their work for this special issue. Authors must submit five (5) printed copies of their papers to either of the addresses listed below by OCTOBER 15 1993. Each copy of the paper should be fronted by a separate title page listing its title, authors, their addresses, surface-mail and E-mail, and an abstract of under 200 words. Notification of receipt will be electronically mailed to the first (or designated) author. 
Notification of acceptance or rejection will be mailed by DECEMBER 31 1993. Final versions of accepted papers will be due MARCH 1 1994. Special Issue Editors: Niall Griffith Department of Computer Science, University of Exeter, Prince of Wales Road, Exeter, EX4 4PT, England. E-mail: ngr at dcs.exeter.ac.uk Peter M. Todd The Rowland Institute for Science 100 Edwin H. Land Boulevard Cambridge, MA 02142 USA E-mail: ptodd at spo.rowland.org From jain at arris.com Mon May 10 13:50:41 1993 From: jain at arris.com (Ajay Jain) Date: Mon, 10 May 93 10:50:41 PDT Subject: Position available Message-ID: <9305101750.AA17773@oyster.arris.com> ***** Please do not forward to other groups ***** RESEARCH SCIENTIST Statistics and Machine Learning Arris Pharmaceutical Corporation Arris Pharmaceutical is a drug discovery company employing a synergistic approach that combines advances and expertise in molecular biology, synthetic chemistry, and applied mathematics. The company's mission is to develop synthetic therapeutics to address major medical needs through applying proprietary structure-based drug design methods. We are seeking a person with expertise in both statistics and computer science, and with a PhD in statistics, mathematics, or computer science to join our team. The candidate must have experience designing, implementing, and using nonlinear statistical techniques (e.g., MARS, PI, CART, neural networks). Also highly desirable are experience in the application of statistical methods to experiment design, experience in database design, and strong interest and/or formal training in chemistry, biology, or medicine. The candidate should be eager to learn the relevant parts of computational chemistry and to interact with medicinal chemists and molecular biologists. The Arris drug design strategy begins by identifying a pharmaceutical target (e.g., an enzyme or a cell-surface receptor), developing assays to measure chemical binding with this target, and screening large libraries of peptides (short amino acid sequences) with these assays. The resulting data, which indicates for each compound how well it binds to the target, is analyzed by statistical algorithms to develop hypotheses that explain why some compounds bind well to the target while others do not. Information from X-ray crystallography or NMR spectroscopy may also be available to the statistical algorithm. Hypotheses will then be refined by synthesizing and testing additional peptides. Finally, medicinal chemists will synthesize small organic molecules that satisfy the hypothesis, and these will become candidate drugs to be tested for medical safety and effectiveness. The person hired will work as a member of the computational drug design group, conducting research on the application of advanced statistical and computational techniques to drug design, and developing chemical modeling tools incorporating these techniques. In particular, he or she will develop software to discover patterns in the biological activity of massive libraries of biopolymer compounds, and to predict new compounds with enhanced activity. In addition, he or she will contribute statistical expertise to experiment design and to other machine learning projects in the company. The computational drug design team currently includes Barr Bauer, David Chapman, Roger Critchlow, Tom Dietterich, Ajay Jain, Kimberle Koile, Rick Lathrop, Tomas Lozano Perez, and John Park.
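As a purely illustrative sketch of the screen-model-predict loop described in this posting, the fragment below (Python with numpy) featurizes peptides by amino-acid composition, fits a model to binding scores, and ranks unscreened candidates by predicted activity. The synthetic data, the made-up feature set, and the simple closed-form ridge regression are placeholders standing in for real assay measurements and for the nonlinear techniques named above (MARS, CART, neural networks); none of this is Arris software.

import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def composition(peptide):
    # Featurize a peptide by its amino-acid composition (fraction of each residue).
    return np.array([peptide.count(a) / len(peptide) for a in AMINO_ACIDS])

def fit_ridge(X, y, alpha=1.0):
    # Closed-form ridge regression: binding score ~ features (linear stand-in).
    return np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)

rng = np.random.default_rng(0)

# screened library: (peptide, measured binding score) pairs -- synthetic stand-ins here
library = ["".join(rng.choice(list(AMINO_ACIDS), size=8)) for _ in range(200)]
X = np.array([composition(p) for p in library])
y = X @ rng.normal(size=len(AMINO_ACIDS)) + 0.1 * rng.normal(size=len(library))

w = fit_ridge(X, y)                           # "hypothesis" relating features to activity

# rank unscreened candidates by predicted activity
candidates = ["".join(rng.choice(list(AMINO_ACIDS), size=8)) for _ in range(50)]
scores = np.array([composition(p) for p in candidates]) @ w
print([candidates[i] for i in np.argsort(scores)[::-1][:5]])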
For more information, send your resume with the names and addresses of three references to: Arris Pharmaceutical Corporation Personnel Manager 385 Oyster Point Blvd. South San Francisco CA 94080 You may also send email responses to jain at arris.com. From gerda at mail2.ai.univie.ac.at Mon May 10 15:16:30 1993 From: gerda at mail2.ai.univie.ac.at (Gerda Helscher) Date: Mon, 10 May 1993 21:16:30 +0200 Subject: CFP: EMCSR'94 European Meeting on Cybernetics and Systems Research Message-ID: <199305101916.AA12898@wachau.ai.univie.ac.at> * * * * * TWELFTH EUROPEAN MEETING * * ON * * CYBERNETICS AND SYSTEMS RESEARCH * * (EMCSR 1994) * April 5 - 8, 1994 UNIVERSITY OF VIENNA organized by the Austrian Society for Cybernetic Studies in cooperation with Dept. of Medical Cybernetics and Artificial Intelligence, Univ. of Vienna and International Federation for Systems Research Cybernetics - "the study of communication and control in the animal and the machine" (N.Wiener) - has recently returned to the forefront, not only in cyberpunk and cyberspace, but, even more importantly, by contributing to the consolidation of various scientific theories. Additionally, an ever increasing number of research areas, including social and economic theories, theoretical biology, ecology, computer science, and robotics, draw on ideas from second order cybernetics. Artificial intelligence, which evolved directly from cybernetics, has not only technological and economic, but also important social impacts. With a marked trend towards interdisciplinary cooperation and global perspectives, this important role of cybernetics is expected to be further strengthened over the next years. Since 1972, the biennial European Meetings on Cybernetics and Systems Research (EMCSR) have served as a forum for discussion of converging ideas and new aspects of different scientific disciplines. As on previous occasions, a number of sessions providing wide coverage of the rapid developments will be arranged, complemented with daily plenary meetings, where eminent speakers will present latest research results.
SESSIONS + Chairpersons: A General Systems Methodology G.J.Klir, USA B Advances in Mathematical Systems Theory M.Peschel, Germany, and F.Pichler, Austria C Fuzzy Systems, Approximate Reasoning and Knowledge-Based Systems C.Carlsson, Finland, K.-P.Adlassnig, Austria, and E.P.Klement, Austria D Designing and Systems, and Their Education B.Banathy, USA, W.Gasparski, Poland, and G.Goldschmidt, Israel E Humanity, Architecture and Conceptualization G.Pask, UK, and G.de Zeeuw, Netherlands F Biocybernetics and Mathematical Biology L.M.Ricciardi, Italy G Systems and Ecology F.J.Radermacher, Germany, and K.Fedra, Austria H Cybernetics and Informatics in Medicine G.Gell, Austria, and G.Porenta, Austria I Cybernetics of Socio-Economic Systems K.Balkus, USA, and O.Ladanyi, Austria J Systems, Management and Organization G.Broekstra, Netherlands, and R.Hough, USA K Cybernetics of National Development P.Ballonoff, USA, T.Koizumi, USA, and S.A.Umpleby, USA L Communication and Computers A M.Tjoa, Austria M Intelligent Autonomous Systems J.W.Rozenblit, USA, and H.Praehofer, Austria N Cybernetic Principles of Knowledge Development F.Heylighen, Belgium, and S.A.Umpleby, USA O Cybernetics, Systems, and Psychotherapy M.Okuyama, Japan, and H.Koizumi, USA P Artificial Neural Networks and Adaptive Systems S.Grossberg, USA, and G.Dorffner, Austria Q Artificial Intelligence and Cognitive Science V.Marik, Czechia, and R.Born, Austria R Artificial Intelligence and Systems Science for Peace Research S.Unseld, Switzerland, and R.Trappl, Austria SUBMISSION OF PAPERS: Acceptance of contributions will be determined on the basis of Draft Final Papers. These Papers must not exceed 7 single-spaced A4 pages (maximum 50 lines, final size will be 8.5 x 6 inch), in English. They have to contain the final text to be submitted, including graphs and pictures. However, these need not be of reproducible quality. The Draft Final Paper must carry the title, author(s) name(s), and affiliation in this order. Please specify the session in which you would like to present your paper. Each scientist shall submit only one paper. Please send t h r e e copies of the Draft Final Paper to the Conference Secretariat (NOT to session chairpersons!) DEADLINE FOR SUBMISSION: October 8, 1993. In order to enable careful refereeing, Draft Final Papers received after the deadline cannot be considered. FINAL PAPERS: Authors will be notified about acceptance no later than November 13, 1993. They will be provided by the conference secretariat at the same time with the detailed instructions for the preparation of the final paper. PRESENTATION: It is understood that the paper is presented personally at the Meeting by the contributor. CONFERENCE FEE: Contributors: AS 2500 if paid before January 31, 1994 AS 3200 if paid later Participants: AS 3500 if paid before January 31, 1994 AS 4200 if paid later The Conference Fee includes participation in the Twelfth European Meeting, attendance at official receptions, and the volume of the proceedings available at the Meeting. Please send cheque, or transfer the amount free of charges for beneficiary to our account no. 0026-34400/00 at Creditanstalt-Bankverein Vienna. Please state your name clearly. HOTEL ACCOMMODATIONS will be handled by OESTERREICHISCHES VERKEHRSBUERO, Kongressabteilung, Opernring 5, A-1010 Vienna, phone +43-1-58800-113, fax +43-1-5867127, telex 111 222. Reservation cards will be sent to all those returning the attached registration form. 
SCHOLARSHIPS: The Austrian Federal Ministry for Science and Research has kindly agreed to provide a limited number of scholarships covering the registration fee for the conference and part of the accommodation costs for colleagues from eastern and south-eastern European countries. Applications should be sent to the Conference Secretariat before October 8, 1993. * * * * * The Proceedings of the 1st to 11th European Meetings on Cybernetics and Systems Research were published as Pichler F. and Trappl R.(eds.): ADVANCES IN CYBERNETICS AND SYSTEMS RESEARCH, 2 vols, Transcripta Books, London, 1973. Trappl R. and Pichler F.R.(eds.): PROGRESS IN CYBERNETICS AND SYSTEMS RESEARCH, Vol.I, Hemisphere, Washington,DC / Halsted-Wiley, New York, 1975. Trappl R. and Hanika F.de P.(eds.): PROGRESS IN CYBERNETICS AND SYSTEMS RESEARCH, Vol.II, Hemisphere, Washington,DC / Halsted-Wiley, New York, 1975. Trappl R., Klir G.J. and Ricciardi L.(eds.): PROGRESS IN CYBERNETICS AND SYSTEMS RESEARCH, Vol.III, Hemisphere, Washington,DC / Halsted-Wiley, New York, 1978. Trappl R. and Pask G.(eds.): PROGRESS IN CYBERNETICS AND SYSTEMS RESEARCH, Vol.IV, Hemisphere, Washington,DC / Halsted-Wiley, New York, 1978. Trappl R., Hanika F.de P. and Pichler F.R.(eds.): PROGRESS IN CYBERNETICS AND SYSTEMS RESEARCH, Vol.V, Hemisphere, Washington,DC / Halsted-Wiley, New York, 1979. Pichler F.R. and Trappl R.(eds.): PROGRESS IN CYBERNETICS AND SYSTEMS RESEARCH, Vol.VI, Hemisphere, Washington,DC / McGraw-Hill, 1982. Pichler F.R. and Hanika F.de P.(eds.): PROGRESS IN CYBERNETICS AND SYSTEMS RESEARCH, Vol.VII, Hemisphere, Washington,DC, 1980. Trappl R., Klir G.J. and Pichler F.R.(eds.): PROGRESS IN CYBERNETICS AND SYSTEMS RESEARCH, Vol.VIII, Hemisphere, Washington,DC / McGraw-Hill, 1982. Trappl R., Ricciardi L. and Pask G.(eds.): PROGRESS IN CYBERNETICS AND SYSTEMS RESEARCH, Vol.IX, Hemisphere, Washington,DC / McGraw-Hill, 1982. Trappl R., Hanika F.de P. and Tomlinson R.(eds.): PROGRESS IN CYBERNETICS AND SYSTEMS RESEARCH, Vol.X, Hemisphere, Washington,DC / McGraw-Hill, 1982. Trappl R., Findler N.V. and Horn W.(eds.): PROGRESS IN CYBERNETICS AND SYSTEMS RESEARCH, Vol. XI, Hemisphere, Washington,DC / McGraw-Hill, 1982. Trappl R.(ed.): CYBERNETICS AND SYSTEMS RESEARCH, North-Holland, Amsterdam, 1982. Trappl R.(ed.): CYBERNETICS AND SYSTEMS RESEARCH 2, Elsevier, Amsterdam, 1984. Trappl R.(ed.): CYBERNETICS AND SYSTEMS '86, Reidel, Dordrecht, 1986. Trappl R.(ed.): CYBERNETICS AND SYSTEMS '88, 2 vols., Kluwer, Dordrecht, 1988. Trappl R.(ed.): CYBERNETICS AND SYSTEMS '90, World Scientific, Singapore, 1990. Trappl R.(ed.): CYBERNETICS AND SYSTEMS '92, 2 vols., World Scientific, Singapore, 1992. Please contact the conference secretariat for more details. ------------------------------------------------------------------------ CHAIRMAN of the Meeting: Robert Trappl, President Austrian Society for Cybernetic Studies SECRETARIAT: I. Ghobrial-Willmann and G. Helscher Austrian Society for Cybernetic Studies A-1010 Vienna 1, Schottengasse 3 (Austria) Phone: +43-1-53532810 Fax: +43-1-5320652 E-mail: sec at ai.univie.ac.at PROGRAMME COMMITTEE: K.-P. Adlassnig (Austria) G. J. Klir (USA) K. Balkus (USA) T. Koizumi (USA) P. Ballonoff (USA) O. Ladanyi (Austria) B. Banathy (USA) V. Marik (Czechia) R. Born (Austria) G. Pask (UK) G. Broekstra (Netherlands) M. Peschel (Germany) E. Buchberger (Austria) F. Pichler (Austria) C. Carlsson (Finland) G. Porenta (Austria) G. Chroust (Austria) H. Praehofer (Austria) G. Dorffner (Austria) F. J. Radermacher (Germany) K. 
Fedra (Austria) J. Retti (Austria) W. Gasparski (Poland) L. M. Ricciardi (Italy) G. Gell (Austria) J. W. Rozenblit (USA) G. Goldschmidt (Israel) N. Rozsenich (Austria) S. Grossberg (USA) A M. Tjoa (Austria) F. Heylighen (Belgium) R. Trappl (Austria) W. Horn (Austria) H. Trost (Austria) R. Hough (USA) S. A. Umpleby (USA) N. C. Hu (China) S. Unseld (Switzerland) E. P. Klement (Austria) G. de Zeeuw (Netherlands) ORGANIZING COMMITTEE: E. Buchberger P. Petta G. Chroust F. Pichler I. Ghobrial-Willmann R. Trappl G. Helscher H. Trost W. Horn M. Veitl J. Matiasek ****************************************** PAPER SUBMISSION DEADLINE: October 8, 1993 ****************************************** ------------------------------------------------------------------------ EMCSR-94 TWELFTH EUROPEAN MEETING ON CYBERNETICS AND SYSTEMS RESEARCH Please return to: Austrian Society for Cybernetic Studies Schottengasse 3, A-1010 VIENNA, AUSTRIA (EUROPE) E-mail: sec at ai.univie.ac.at o I plan to attend the Meeting. o I intend to submit a paper to Session ..... o I enclose the Draft Final Paper. o My Draft Final Paper will arrive prior to October 8, 1993. o My cheque for AS ....... covering the Conference Fee is enclosed. o I have transferred AS ........ to your account 0026-34400/00 at Creditanstalt Vienna. o I shall not be at the Meeting but am interested to receive particulars of the Proceedings. From pja at cis.ohio-state.edu Tue May 11 10:27:55 1993 From: pja at cis.ohio-state.edu (Peter J Angeline) Date: Tue, 11 May 93 10:27:55 -0400 Subject: EP94 Call For Papers Message-ID: <9305111427.AA15156@neuron.cis.ohio-state.edu> ---------------------------------------------------------------------- The Third Annual Conference on Evolutionary Programming CALL FOR PAPERS February 24-25, 1994 San Diego, California Evolutionary programming is a stochastic optimization technique that can be used to address various optimization problems. Papers regarding the theory and application of evolutionary programming to complex problem solving are solicited. Topics include, but are not limited to: automatic control neural network training and design system identification adaptive representation forecasting robotics combinatorial optimization pattern recognition and the relationship between evolutionary programming and other optimization methods. On or before June 30, 1993, prospective authors should submit a 100-250 word abstract and a three-page extended summary of the proposed paper to the Technical Program Chairman: Lawrence J. Fogel ORINCON Corporation 9363 Towne Centre Dr. San Diego, CA 92121 Authors will be notified of the program committee's decision on or before September 30, 1993. Completed papers will be due January 15, 1994. Paper format, page requirements and registration information will be detailed upon acceptance. General Chairman: Anthony V. Sebald, UC San Diego Technical Chairman: Lawrence J. Fogel, ORINCON Corporation Program Committee: Peter Angeline, The Ohio State Univ. Roman Galar, Tech. Univ. Wroclaw Wirt Atmar, AICS Research Inc. Douglas Hoskins, The Boeing Company Thomas Back, Univ. Dortmund Gerald Joyce, Scripps Clin./Res. Found. George Burgin, Titan Systems/Linkabit John McDonnell, NCCOSC Michael Conrad, Wayne State Univ. Stuart Rubin, NCCOSC David B. Fogel, ORINCON Corporation Hans-Paul Schwefel, Univ. Dortmund Gary B. Fogel, UC Los Angeles William M.
Spears, Naval Research Lab Finance Chairman: Bill Porto, ORINCON Corporation Publicity Co-Chairs: Ward Page, NCCOSC Patrick Simpson, ORINCON Corporation Sponsored by the Evolutionary Programming Society In Cooperation with the IEEE Neural Networks Council From fogel at ece.UCSD.EDU Tue May 11 12:57:21 1993 From: fogel at ece.UCSD.EDU (Fogel) Date: Tue, 11 May 93 09:57:21 PDT Subject: Email Digest for Evolutionary Programming Message-ID: <9305111657.AA29071@sunshine.ucsd.edu> ANNOUNCING EVOLUTIONARY PROGRAMMING EMAIL DIGEST We are pleased to announce that as of May 10, 1993, an email digest covering transactions on evolutionary programming will be available. The digest is intended to promote discussions on a wide range of technical issues in evolutionary optimization, as well as provide information on upcoming conferences, events, journals, special issues, and other items of interest to the EP community. Discussions on all areas of evolutionary computation are welcomed, including artificial life, evolution strategies, and genetic algorithms. The digest is meant to encourage interdisciplinary communications. Your suggestions and comments regarding the digest are always welcome. To subscribe to the digest, send mail to ep-list-request at magenta.me.fau.edu and include the line "subscribe ep-list" in the body of the text. Further instructions will follow your subscription. The digest will be moderated by N. Saravanan of Florida Atlantic University. Sincerely, David Fogel fogel at sunshine.ucsd.edu N. Saravanan saravan at amber.me.fau.edu From mikewj at signal.dra.hmg.gb Tue May 11 07:00:55 1993 From: mikewj at signal.dra.hmg.gb (Mike Wynne-Jones) Date: Tue, 11 May 93 12:00:55 +0100 Subject: Neural nets applications meeting in UK Message-ID: AA08130@milne.dra.hmg.gb *********************************** NEURAL COMPUTING APPLICATIONS FORUM *********************************** 23 - 24 June 1993 Fitzwilliam College, Cambridge University, UK ***************************************** PRACTICAL APPLICATIONS OF NEURAL NETWORKS ***************************************** Neural Computing Applications Forum is the primary meeting place for people developing Neural Network applications in industry and academia. It has 150 members from the UK and Europe, from universities, small companies and big ones, and holds four main meetings each year. It has been running for 3 years, and is cheap to join. This meeting spans two days with informal workshops on 23 June and the main meeting comprising talks about neural network techniques and applications on 24 June. ********* WORKSHOPS - these talks are planned; additional short talks are sought. ********* ********************************************************** Constructing structured networks of Radial Basis Functions 23 June, 13.00 to 15.00 ********************************************************** Including : Robert Debenham (Logica Cambridge): "Online construction of RBFs during training" Richard Bostock (Aston University): "Bump-tree construction by genetic algorithms" ********************************************************* Self Organising Networks 23 June, 15.30 to 17.30 ********************************************************* Including: Nigel Allinson (York University): "Self Organising Networks: fast training, case studies and digital implementations" ************************************************************ Evening: Punting on the Cam followed by liquid refreshments!
************************************************************ ***************************** MAIN MEETING - 24 June 1993 ***************************** 8.30 Registration 9.05 Welcome 9.15 Douglas Kell (University of Wales): "Detection of impurities in olive oil" 9.55 Mahesan Niranjan (University of Cambridge): "On-line learning algorithms for prediction and control applications" 10.30 Coffee 11.00 Tony Robinson (University of Cambridge): "Application of recurrent nets to phone probability estimation in speech recognition" 11.40 Prof. Cabrol-Bass (LARTIC, France): "Indices for the Evaluation of Neural Network Performance as classifiers: Application to Structural Elucidation in Infra Red Spectroscopy" 12.15 Lunch 2.00 Stephen Roberts (Oxford University): "Probabilistic Growth of RBFs for detection of novelty" 2.40 Dave Cressy (Logica Cambridge Research): "Neural Control of an Experimental Batch Distillation Column" 3.15 Tea 3.40 Tom Harris (Brunel University): "Kohonen nets in machine health monitoring" 4.10 Discussions 4.30 Close ACCOMMODATION is available in Fitzwilliam College at 30 pounds (single) and 47 pounds (twin), and **MUST** be booked and paid for in advance. There are also lots of hotels in Cambridge. ***************** Application ***************** Members of NCAF get free entry to all meetings for a year. (This is very good value - main meetings, tutorials, special interest meetings). It also includes subscription to Springer Verlag's new(ish) journal "Neural Computing and Applications". Full membership: 250 pounds. - anybody in your small company / research group in a big company. Individual membership: 140 pounds - named individual only. Student membership (with journal): 55 pounds - copy of student ID required. Student membership (no journal, very cheap!): 25 pounds - copy of student ID required. Entry to this meeting without membership costs 35 pounds for the workshops, and 80 pounds for the main day. Payment in advance if possible; 5 pounds charge for issue of invoice if credit is required; need an official order number. Email enquiries to Mike Wynne-Jones, mikewj at signal.dra.hmg.gb. Postal to Mike Wynne-Jones, NCAF, PO Box 62, Malvern, WR14 4NU, UK. Fax to Mike Wynne-Jones, (+44/0) 684 894384 From yz%TRBOUN.BITNET at FRMOP11.CNUSC.FR Wed May 12 16:58:29 1993 From: yz%TRBOUN.BITNET at FRMOP11.CNUSC.FR (yz%TRBOUN.BITNET@FRMOP11.CNUSC.FR) Date: Wed, 12 May 1993 16:58:29 EDT Subject: Second Turkish AI and ANN Symposium Message-ID: <0096C66F.211B38A0.12766@trboun.bitnet> SECOND TURKISH ARTIFICIAL INTELLIGENCE AND ARTIFICIAL NEURAL NETWORKS SYMPOSIUM Bogazici University, Istanbul, Turkey June 24-25, 1993 SPONSORS: Bogazici University IEEE Control Systems Society Turkey Chapter TUBITAK IEEE Computer Society Turkey Chapter Bilkent University Middle East Technical University SYMPOSIUM CHAIR: Selahattin Kuru, Bogazici University PROGRAM COMMITTEE: Levent Akin, Bogazici University Varol Akman, Bilkent University Ethem Alpaydin, Bogazici University (Chair) Isil Bozma, Bogazici University M. Kemal Ciliz, Bogazici University Fikret Gurgen, Bogazici University H. Altay Guvenir, Bilkent University Ugur Halici, M.E.T.U. Yorgo Istefanopulos, Bogazici University Sakir Kocabas, TUBITAK Gebze Arastirma Merkezi Selahattin Kuru, Bogazici University Kemal Oflazer, Bilkent University A. C. Cem Say, Bogazici University Nese Yalabik, M.E.T.U.
ORGANIZATION COMMITTEE: Levent Akin, Bogazici University (Chair) Ethem Alpaydin, Bogazici University Ruhan Alpaydin, Bogazici University Hakan Aygun, Bogazici University Sema Oktug, Bogazici University A. C. Cem Say, Bogazici University Mehmet Yagci, Bogazici University INVITED SPEAKER: Herbert E. Rauch , Lockheed Palo Alto Research Lab., President-elect, IEEE Control Systems Society IEEE Distinguished Lecturer AIM: Recent advances in the theory and engineering of computational sciences extended the application of automatic computers to novel domains, introducing a new generation of software and hardware systems. These artificially intelligent systems generally are equipped with sensors and actuators through which they interact with the environment, leading to abilities like vision, speech recognition and synthesis, and mobility. They are able to learn from experience and represent learned knowledge internally in an abstract, task-oriented form and operate based on this knowledge which leads to cognitive abilities like natural language understanding, problem solving and planning. They may also need to be autonomous thus not requiring intense operator supervision and intervention. Researching on these ideas is generally coupled with a reverse engineering effort of the natural systems which are the living examples. One major line here is building new computing systems inspired from the neural network organization of animal brains. Realization of such systems is a large inter-disciplinary effort requiring the collective work of researchers from as distant domains as cognitive psychology, neuroscience, linguistics, and physics from natural sciences, signal processing, electronics, and mechanics from engineering sciences, additional to computer science. The objective of the Symposium is to bring together Turkish and foreign researchers in the field, and to publicise their studies. There will be a panel in the closing session. The subject of this panel will be determined during the Symposium. PRELIMINARY PROGRAM JUNE 24, 1993 THURSDAY 08:30 - 12:00 REGISTRATION 09:00 ROOM A: OPENING CEREMONY 09:20 ROOM A: INVITED SPEAKER Neural Networks for Control, Identification and Diagnosis Herbert E. Rauch, Palo Alto Research Lab., USA. IEEE Control Systems Society President Elect 10:20 TEA BREAK 10:40 ROOM A: SESSION A.1.1 (NEURAL NETWORKS THEORY) 1. Learning in Hybrid Neural Models A. M. Colla, Elsag Bailey spa, N. Longo, F. Masulli, S. Ridella, University of Genoa, G. Morgavi, I. C. E. C. N. R., Italy. 2. A Weighted Least-Squares Algorithm for Neural Network Learning in Recognition of Low Probability Events D. J. Munro, O. K. Ersoy, M. R. Bell, J. S. Sadowsky, Purdue University, USA. 3. Memory Based Function Approximation Using Neural Networks S. Aratma, E. Alpaydin, Bogazici University, Turkey. 4. Statistical Physics, Neural Networks and Combinatorial Optimization Problems H. El Ghaziri, Ecole Polytechnique Fdrale de Lausanne, Switzerland. 5. Genetic Synthesis of Unsupervised Learning Algorithms A. Dasdan, K. Oflazer, Bilkent University, Turkey. 10:40 ROOM B: SESSION B.1.1 (NATURAL LANGUAGE PROCESSING I) 1. Two-level Description of Turkish Morphology K. Oflazer, Bilkent University, Turkey. 2. ATN Representation of Turkish Morphology T. Gungor, S. Kuru, Bogazici University, Turkey. 3. A Text Tagger for Turkish K. Oflazer, Bilkent University, Turkey. 4. A Spelling Checker and Corrector for Turkish H. L. Akin, S. Kuru, T. Gungor, I. Hamzaoglu, D. Arbatli, Bogazici University, Turkey. 5. 
Utilizing Connectionist Paradigm for Lexicon Access in Natural Language Processing Systems M. U. Sencan, K.Oflazer, Bilkent University, Turkey. 12:20 LUNCH 14:00 ROOM A: SESSION A.1.2 (IMAGE PROCESSING) 1. A Classification Algorithm Based on Feature Partitioning I. Sirin, H. A. Guvenir, Bilkent University, Turkey. 2. Dense Stereo Correspondance Using Elastic Nets U. M. Leloglu, TUBITAK AEAGE, Turkey. 3. Design and Development of an Image Processing and Computer Vision Software for Generating Data Files of 2D Physical Objects for Drafting Software Packages E. F. Arslan, A. Erden, Middle East Technical University, Turkey. 4. B-Spline Fonksiyonlari ile Goruntu Aradegerleme Isleminde Spline Katsayilarinin Hesaplanmasinda Iyilestirme. S. Albayrak, M. Y. Karsligil, Yildiz Technical University, D. Demir, TUBITAK MAE, Turkey. 5. Laser ve Kamera Araciligiyla Alinmis Kirinim Temelli Resimlerde Cisim Kenari Belirleme A. Kuzucu, Istanbul Technical University, M. Yilmaz, Erciyes University, Turkey. 14:00 ROOM B: SESSION B.1.2 (LOGIC AND REASONING) 1. A Resolution Principle for Quantificational S5 with an Application to Artificial Intelligence M. Baaz, C. G. Fermuller, Technische Universitt Wien, Austria. 2. Object-Oriented Logic Programming on Finite Domains L. V. Ciortuz, "Al. I. Cuza" University of Iasi, Romania. 3. Ontology for Buying and Selling: A Preliminary Study M. Ersan, E. Ersan, V. Akman, Bilkent University, Turkey. 4. Kuzgun Paradoksu: Akil ile Zeka Arasinda bir Ikilem M. M. Dagli, TUBITAK., Turkey. 5. Representing Emotions in Terms of Object Directedness H. G. Unsal, V. Akman, Bilkent University, Turkey. 15:40 TEA BREAK 16:00 ROOM A: SESSION A.1.3 (NEURAL NETWORKS APPLICATIONS I) 1. Arms Race Modeling Using Neural Networks: A Case Study A. N. Refenes, A. Zapranis, University College London, UK, C. Kollias, University of Crete, Greece. 2. A Comparative Study of Various Objective Functions for Feedforward Neural Networks A. U. Unluakin, F. Gurgen, Bogazici University, Turkey. 3. A Neural Network Approach to the Single Machine Total Tardiness Scheduling Problem B. Gurgun, I. Sabuncuoglu, Bilkent University, Turkey. 4. Evaluation of Doppler Blood Velocity Waveform Indices for Prenatal Surveillance Using Back Propagation Training Algorithm N. Baykal, A. Erkmen, N.Yalabik, Middle East Technical University, S. Beksac, Hacettepe University, I. Altintas, Middle East Technical University, Turkey. 5. Prediction of Postoperative Hemorrhage in Open Heart Surgery Patients Using Thromboelastographic Data and Neural Networks H. L. Akin, Bogazici University , S.Celikel, Istanbul University, Turkey. 16:00 ROOM B: SESSION B.1.3 (ARTIFICIAL INTELLIGENCE APPLICATIONS) 1. Combining AI Means and Traditional Programming to Solve Some Problems of FMS Simulation, Scheduling and Control G. Kovacs, Hungarian Academy of Sciences, Hungary. 2. Design and Implementation of an Expert System in the Prognosis of Hepatology I. Bonfa, Computer Systems Sector, C. Maioli, University of Bologna, F. Sarti, Belleria Hospital , G.Mazzoni, University of Bologna,G. L. Milandri, P. R. Dal Monte, Belleria Hospital, Italy. 3. Fuzzy Controller Against PD Controller for Servo Motor Process A. Nasar, Cairo University, M. S. El-Sherif, M. S. Abd El-Samee, Electronics Research Institute, Egypt. 4. Bilgisayar Kontrollu Robot Manipulatoru S. Celik, F. Daldaban, F. Canbulut, Erciyes University, Turkey. 5. "Bir Kelime-Bir Islem" Oynayan Program C. Say, S. Sen, R. Barengi, Bogazici University, Turkey. 
17:40 BREAK 18:00-20:30 COCKTAIL JUNE 25, 1993 FRIDAY 09:00 ROOM A: SESSION A.2.1 (OPTICAL CHARACTER RECOGNITION) 1. Combining Unsupervised Techniques With Supervised Neural Methods for Optical Character Recognition M. Yagci, E. Alpaydin, Bogazici University, Turkey. 2. Pipelined Associative Memories for Handwritten Character Recognition Z. M. Kovacs, V. R. Guerrieri, University of Bologna, Italy. 3. On-line Cursive Handwriting Recognition Using Cooperating Neural Networks N. S. Flann, Utah State University,USA. 4. An Algorithm for Automatic Recognition of Arabic Cursive Handwritten A. I. El-Desoky, M. M. Salem, El-Mansoura University, N. H. Hegazi, M. M. Farag, National Research Center, Egypt. 09:00 ROOM B: SESSION B.2.1 (KNOWLEDGE REPRESENTATION) 1. Representation of Descriptive and Prescriptive Knowledge in Intelligent Systems S. Kocabas, Istanbul Technical University, TUBITAK MAE, Turkey. 2. A Proposed Knowledge Representation Scheme for Hybrid Learning Systems S. Abdel-Hamid, A. Abdel-Wahab, A. El-Dessouki, Electronics Research Institute, Egypt. 3. Frame Matching in an Extended Relational Language A. Kulenovic, A. Lagumdzija-Kulenovic, Marmara University, Turkey. 4. Mechanical Construction of Carnap-style Knowledge Bases D. Kain, MacLean Hunter Publishers, Austria. 10:20 TEA BREAK 10:40 ROOM A: SESSION A.2.2 (NEURAL NETWORKS APPLICATIONS II) 1. Implementation of a Neural Network Model on a Massively Parallel Computer Architecture M. K. Ciliz, A. Paksoy, Bogazici University, Turkey. 2. Dynamic Hill Climbing: Overcoming the Limitations of Optimization Algortihms. D. Yuret, M. de la Maza, M.I.T, USA. 3. Yapay Noron Aglarindan Cift-Yonlu Iliskili Bellek Icin Kullanilan Uc Kodlama Yontemi S.Oge, Yildiz Technical University, F. Gurgen, Bogazici University, Turkey. 4. Yapay Sinir Aglarinin bir Tanimina Dogru C. Guzelis, Istanbul Technical University, Turkey. 5. Yapay Noron Agi Adaptif Resonans Teori Yontemi ile Siniflandirma Sistemi ve Bir Uygulama M. E. Karsligil, M. Y. Karsligil, Yildiz Technical University, Turkey. 10:40 ROOM B: SESSION B.2.2 (NATURAL LANGUAGE PROCESSING II) 1. A Lexical-Functional Grammar for a Subset of Turkish Z. Gungordu, K. Oflazer, Bilkent University, Turkey. 2. An ATN Grammar for a Subset of Turkish C. Demir, K. Oflazer, Bilkent University, Turkey. 3. Resolution of Pronominal Anaphora in Turkish E. Tin, V. Akman, Bilkent University, Turkey. 4. Automatic Natural Language Identification F. Kocan, M.U. Karakas, Hacettepe University, Turkey. 5. Language Recognition Using 3-Gram Statistical Analysis E. Gokcay, Bilkent Universitesi, D. Gokcay, Middle East Technical University, Turkey. 12:20 LUNCH 14:00 ROOM A: SESSION A.2.3 (MACHINE TRANSLATION) 1. Machine Translation from Turkish to its Dialects I. Hamzaoglu, S. Kuru, Bogazici University, Turkey. 2. Connectionism in Machine Translation of Macedonian and English Prepositions K. Cundeva, Institut za Informatika, Macedonia. 3. A Turkish / English Translation Application Using Neural Networks in Morphological Analysis D. Gokcay, U. Halici, Middle East Technical University, Turkey. 14:00 ROOM B: SESSION B.2.3 (PHILOSOPHICAL ISSUES) 1. Elements of Scientific Creativity S. Kocabas, Istanbul Technical University, TUBITAK MAE, Turkey. 2. Similarities Between Humans and Machines A. E. Gunhan, University of Bergen, Norway. 15:00 TEA BREAK 15:30 - 17:00 ROOM A: PANEL AND CLOSING CEREMONY ACCOMMODATION Participants should contact the hotel facilities directly. 
Reservations of 3-4 weeks in advance are advised, considering the city's heavy tourism load during the period of the Symposium. Listed below are a few suggestions of ours. All prices include Value Added Tax except where indicated. Ciragan Palace Kempinski (*****) overlooks the Bosphorus (Near Besiktas) Single ("park" side) $195 Single (sea side) $230 Double ("park" side) $235 Double (sea side) $270 Tel: +90 (1) 2583377 Dedeman Oteli (****) in Gayrettepe Single $110 Double $150 Tel: +90 (1) 2748800 Fax: +90 (1) 2751100 Kalyon Otel(****) in the Old City Single $96 Double $120 Tel: +90 (1) 5174400 Fax: +90 (1) 6381111 (Excl. VAT. Concession of 20% for the Symposium participants) Movenpick Hotel(****) at Maslak Single (single bed in double Room) $165 Double $165 Breakfast $15 Tel: +90 (1) 2850900 Fax: +90 (1) 2850951 Bebek Hotel (***) relatively close to the Symposium venue Single (single bed in double Room, sea side, incl. breakfast) $90 Single ("city" side, incl. breakfast) $50 Double (sea side, incl. breakfast) $90 Double ("city" side, incl. breakfast) $60 Tel: +90 (1) 263 30 00 (2 lines) Telex: 27201 HOBE TR. Ist. Anadolu Otelcilik ve Turizm Meslek Lisesi Uygulama Oteli relatively close to the Symposium venue Single (single bed in double Room, incl. breakfast) 224.000 TL Double (incl. breakfast) 224.000 TL Tel: +90 (1) 278 19 97 (2 lines) Fax: +90 (1) 278 19 99 ARRIVAL TO BOGAZICI UNIVERSITY (BU) WITHIN ISTANBUL BU is located in between Bebek and Etiler. You can reach BU by max 3 transit vehicles using combinations of public bus, minibus, "dolmus" and ferry. Below table shows distance between some centres to BU. Taxi fee is 5,000 TL as initial and 4,000 TL per km. Centre Arrival Distance Taxi * Bakirkoy P 30 100,000 TL Yesilkoy P 37 125,000 TL Eminonu P 12 55,000 TL Taksim P 8 40,000 TL Besiktas P 7 33,000 TL Kadikoy P SP DP 18 85,000 TL Bostanci P SP DP 24 110,000 TL Levent P 4 25,000 TL Maslak MP P 10 45,000 TL Bebek Walk 2 13,000 TL P- Public Bus, M- Minibus, D- Dolmus (special kind of minibus), S- Ship *May increase about 10 % by June. Arrival column shows the transit vehicle combinations to come BU (eg, SP means first ship then public bus trip). REGISTRATION FEE: Until June 1, 1993, normal registration fee is $ 200, and student fee is $ 100. After June 1, the normal and student fees will be applied as $ 250 and $ 125, respectively. Fees include the Proceedings, the cocktail and tea services. APPLICATION ADDRESS: Assist. Prof. Dr. Levent Akin (Second Turkish AI and ANN Symposium) Bogazici University, Department of Computer Engineering, 80815 Bebek, ISTANBUL, TURKEY. TEL : +90 (1) 263 15 00/ 1769 FAX : +90 (1) 265 84 88 E-MAIL: YZ at TRBOUN.BITNET PRE-REGISTRATION FORM I will attend the II. Turkish Artificial Intelligence and Artificial Neural Networks Symposium to be held on June 24 and 25, 1993. Name : ................................... Job : ................................... Institution/Company : .................................. Addess ................................. ........................................................... Telephone : .................................. E-Mail : .................................. SYMPOSIUM FEE : $ ....................... I have deposited the above indicated amount at Garanti Bankasi Bogazici University Branch account no 6610126/2 (YAPAY ZEKA SEMPOZYUM). Bank receipt is enclosed. 
From jbower at smaug.cns.caltech.edu Wed May 12 17:10:17 1993 From: jbower at smaug.cns.caltech.edu (Jim Bower) Date: Wed, 12 May 93 14:10:17 PDT Subject: CNS*93 Message-ID: <9305122110.AA11014@smaug.cns.caltech.edu> Registration information for the Second Annual Computation and Neural Systems Meeting CNS*93 July 31 through August 6,1993 Washington DC This posting announces registration for this year's Computation and Neural Systems meeting (CNS*93). This is the second in a series of annual inter-disciplinary conferences intended to address the broad range of research approaches and issues involved in the general field of computational neuroscience. As last year, this meeting will bring together experimental and theoretical neurobiologists along with engineers, computer scientists, cognitive scientists, physicists, and mathematicians interested in understanding how biological neural systems compute. The meeting will equally emphasize experimental, model-based, and more abstract theoretical approaches to understanding neurobiological computation. Meeting Structure The meeting will be composed of three parts: a day of tutorials, three and a half days of research presentations, and two and a half days of follow up workshops. The first day of the meeting (July 30) will be devoted to tutorial presentations and workshops focused on particular technical issues confronting computational neurobiology as well as general issues related to computational neurobiology. Introductory tutorials will also be given. The next three and a half days will include the main technical program consisting of plenary, contributed and poster sessions. Oral presentations will be made in one continuous session. Posters will be presented each evening. Following the main meeting, there will be two and a half days of focused workshops at a resort in West Virginia. Workshop topics will be established via email communication prior to the meeting as well as during the main meeting based on papers presented and issues raised. Location The tutorial day and the main meeting itself will be held at the Hyatt Regency Bethesda. This modern hotel is located at One Bethesda Metro Center in downtown Bethesda with easy access to the DC metro system. Following the main meeting, two days of postmeeting workshops will be held at the Coolfont resort which is set within 1350 mountainous acres in the Eastern Panhandle of West Virginia. Accommodations Main Meeting. We have reserved a block of rooms at special rates in the conference hotel. Regular registrants (i.e. non students) $89 single, $125 double, and full time students $79 single or double. Registering for the meeting, WILL NOT result in an automatic room reservation. Instead you must make your own reservations by contacting: The Hyatt Regency Bethesda One Bethesda Metro Center Bethesda, MD 20814 (301) 657-1234 or 1-800-233-1234 NOTE: IN ORDER TO GET THE REDUCED RATES, YOU MUST INDICATE THAT YOU ARE REGISTERING FOR THE CNS*93 MEETING. STUDENTS WILL BE ASKED TO VERIFY STATUS. Workshops. Accommodations for the workshops will be provided onsite at Coolfont resort and are included in the price of registration. All meals are also included in the registration fee. Acknowledgment of registration for the workshops and payment of fees WILL constitute a guarantee of accommodations at Coolfont. However, the total accommodations available for the workshops are limited, so please register early. 
Registering for the meeting We would recommend registering for the meeting as soon as possible as space for some meeting events is limited. Participants can register for the meeting in several different ways. 1) electronically, 2) via email, 3) via regular surface mail. Each different method is described below. Please only register using one method. You will receive a confirmation of registration within two weeks. 1) Interactive electronic registration: For those of you with internet connectivity who would like to register electronically for CNS*93 we have provided an internet account through which you may submit your registration information. To use this service you need only "telnet" to "mordor.cns.caltech.edu" and login as "cns93". No password is required. For example: yourhost% telnet mordor.cns.caltech.edu Trying 131.215.137.69 ... Connected to mordor.cns.caltech.edu. Escape character is '^]'. SunOS UNIX (mordor) login: cns93 Now answer all questions Note that all registration through this electronic service is subject to verification of payment. 2) For those with easy access to electronic mail, simply fill in the attached registration form and email it to: cp at smaug.cns.caltech.edu 3) Finally, for those who elect neither of the above options, please print out the attached registration form and send with payment via surface mail to the address indicated. In each case, registration will not be accepted as final until all fees are paid. Those registering by 1 or 2 above, but paying with check or money order should send payment to the following address as with your name and institution clearly marked. CNS*93 Registrations Division of Biology 216-76 Caltech Pasadena, CA 91125 ================================================== ************************ REGISTRATION FORM CNS*93 WASHINGTON D.C. July 31 - August 8 1993 ************************ Name : Title : Organization : Address : City : State : Zip : Country : Telephone : email address : Registration Fees : Tutorial (July 31) _____ $ 25 (includes lunch) Technical Program (August 1-4) _____ $ 300 Regular _____ $ 125 Full-time Student (Include verification of status) _____ $ 50 Banquet (for each additional banquet ticket) (main registration includes one banquet ticket and book of abstracts) Post-meeting Workshop (August 4-7) _____ $ 300 (includes round-trip transportation, meals and lodging) $ ______ Total Payment Please indicate method of payment : ____ Check or Money Order (Payable in US. dollars to CNS*92 - Caltech) will be sent to CNS*92 Registrations Division of Biology 216-76 Caltech Pasadena, CA 91125 ___ Visa ___ Mastercard ___ American Express Charge my card number ________________________________________ Expiration date ____________ Name of cardholder ___________________ Signature as appears on card : _________________________ Date ____________ Please make sure to indicate CNS*93 and YOUR name on all money transfers Did you submit an abstract & summary ? ( ) yes ( ) no title : Do you have special dietary preferences or restrictions (e.g., diabetic, low sodium, kosher, vegetarian)? If so, please note: Some grants to cover partial travel expenses may become available. Do you wish further information ? ( ) yes ( ) no ================================================== On-line access to additional meeting information Additional information about the meeting is available via FTP over the internet (address: 131.215.135.69 ). 
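For readers who prefer a scripted rather than interactive retrieval, the same files can be fetched with a short program. The following is only an illustrative sketch (not part of the official meeting instructions) using Python's standard ftplib module; it assumes the host address, the cns93 directory, and the file names that appear in the FTP walkthrough below, and the anonymous password is, as usual, your e-mail address.

    from ftplib import FTP

    # Connect anonymously to the CNS*93 information server
    # (the host address used in the ftp transcript below).
    ftp = FTP("131.215.137.69")
    ftp.login("ftp", "yourname@yourhost.yourdomain")  # password: your e-mail address
    ftp.cwd("cns93")

    print(ftp.nlst())            # see what is available (agenda, registration, ...)

    for name in ("agenda", "registration"):
        with open(name, "wb") as out:
            ftp.retrbinary("RETR " + name, out.write)  # save each file locally

    ftp.quit()

The interactive session that follows performs the same steps by hand.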
To obtain information about the agenda, currently registered attendees, or paper abstracts, the initial sequence is the same (Things you type are in ""): > yourhost% "ftp 131.215.137.69" > 220 mordor FTP server (SunOS 4.1) ready. Name (131.215.139.69:): "ftp" > 331 Guest login OK, send ident as password. Password: "yourname at yourhost.yourside.yourdomain" > 230 Guest login OK, access restrictions apply. ftp> "cd cns93" > 250 CWD command successful. ftp> At this point you can do one of several things: 1) To examine what is available type: "ls" 2) To download the meeting registration form type: "get registration" 3) To download the meeting agenda type: "get agenda" 4) To download a list of attendees type: "get attendees" 5) To download meeting abstracts first type: "cd cns93/abstracts" a) to view the list of abstracts type: "ls" b) to download specific abstracts type: "get " c) to download all abstracts type: "mget *" Once you have obtained what you want type: "quit" From petsche at learning.siemens.com Thu May 13 18:26:36 1993 From: petsche at learning.siemens.com (Thomas Petsche) Date: Thu, 13 May 93 18:26:36 EDT Subject: Position available immediately Message-ID: <9305132226.AA07416@learning.siemens.com> A Siemens subsidiary in Atlanta Georgia has an immediate opening for an engineer with a background that includes neural networks and electric machines (motors and generators). The position requires a master's degree or equivalent experience. Your responsibilities would include developing, implementing and testing neural network, statistical, and/or machine learning based algorithms for electric machine diagnosis. Siemens AG is a worldwide supplier of electrical and electronic devices with sales in excess of 4Billion$ in the US and 40Billion$ worldwide. If you are interested, send a cover letter and resume to me and I'll forward it to the relevant people. Thomas Petsche Siemens Corporate Research 755 College Road East Princeton, NJ 08540 Fax: 609-734-3392 From isk at lautaro.fb10.tu-berlin.de Thu May 13 03:18:28 1993 From: isk at lautaro.fb10.tu-berlin.de (I.SANTIBANEZ-KOREF) Date: Thu, 13 May 93 09:18:28 +0200 Subject: Evolutionary Structuring of Artificial Neural Networks Message-ID: <9305130718.AA06992@lautaro.fb10.tu-berlin.de > *** DO NOT FORWARD TO ANY OTHER LISTS *** A postscript copy of the following Technical Report can be obtained by anonymous ftp at ftp-bionik.fb10.tu-berlin.de (ftp-instructions at the end of the message) : Evolutionary Structuring of Artificial Neural Networks H.--M. Voigt, J. Born, I. Santibanez--Koref Technical University Berlin Bionics and Evolution Techniques Laboratory Bio-- and Neuroinformatics Research Group The report summarizes research on the structuring of Artificial Neural Networks by a stochastic graph generation grammar.The main feature of the approach is to carry out the graph generation in view of an individual development process which is embedded in an evolutionary framework.We explain this approach by examples, and evaluate its practicability. Comments and questions are welcome. ========== ftp-instructions: unix %ftp ftp-bionik.fb10.tu-berlin.de Connected to lautaro.fb10.TU-Berlin.DE. Name (ftp-bionik.fb10.tu-berlin.de:pqp):anonymous 331 Guest login ok, send your complete e-mail address as password. Password: 230 Guest login ok, access restrictions apply. ftp> cd pub/papers/Bionik 250 CWD command successful. ftp> bin 200 Type set to I. ftp> get tr-02-93.ps.Z 200 PORT command successful. 
150 Opening BINARY mode data connection for tr-02-93.ps.Z (157041 bytes). 226 Transfer complete. local: tr-02-93.ps.Z remote: tr-02-93.ps.Z 157041 bytes received in 0.44 seconds (3.5e+02 Kbytes/s) ftp> quit 221 Goodbye. unix % zcat tr-02-93.ps.Z | lpr -P =============== Ivan Santibanez-Koref FG: Bionik und Evolutionstechnik FoG Bio- und Neuroinformatik Sekr. ACK1 Ackerstrasse 71-76 1000 Berlin 6 GERMANY Tel.: +49 - 30 - 314 72 677 Fax.: +49 - 30 - 541 98 72 E-mail: isk at fb10.tu-berlin.dbp.de

From ferna1 at sis.ucm.es Thu May 13 09:18:30 1993 From: ferna1 at sis.ucm.es (Fernando M. Pescador) Date: Thu, 13 May 1993 9:18:30 UTC+0200 Subject: Request for Connectionists Information Message-ID:

I am sending you the program of our 1st FORUM on Neural Networks (the original announcement was in Spanish; an English rendering follows). Sincerely, Fernando Pescador ferna1 at sis.ucm.es

* * * * * FORUM DE REDES NEURONALES (Neural Networks Forum) * * * * *

Thursday, May 27, 1993. Universidad Complutense. Auditorium (Salon de Actos) of the U.C.M. Computing Center (Centro de Proceso de Datos). Time: 9:30 a.m. Organization: Servicio de Informatica, Universidad Complutense de Madrid. Fernando Pescador ( ferna1 at sis.ucm.es )

------ Talks (Ponencias) --------

1. "MOLECULAR COMPUTING: or how to design neural networks with proteins, and its application to the design of nano-computers." Speaker: Rafael Lahoz Beltra, Dept. of Biomathematics (Applied Mathematics), U.C.M. Abstract: The aim of the talk is to introduce the concept of molecular computing, to show how neural networks can be physically implemented with networks of polymers (specifically proteins), and to discuss the results we have obtained and published in the journals COMPUTER, BioSystems, etc.

2. "REPRESENTATION AND ENCODING OF ORGANIC-CHEMISTRY STRUCTURES USING NEURAL NETWORKS." Speaker: Manfred Stud, Institute of Chemistry, C.S.I.C. Abstract: A graphical module is presented that transforms an organic-chemistry molecular structure into a neural network whose processing yields a code used as a representation of that structure in a backpropagation (bp) association with one of its properties (its biological activity).

3. "IDENTIFYING TYPES OF DAYS WITH A SELF-ORGANIZING MAP." Speaker: Alvaro Garcia Tejedor, Knowledge Engineering Dept., ERITEL. Abstract: The goal is to predict national electricity demand on an hourly scale (the hourly demand curve) for any day of the year. Demand behavior, however, depends strongly on the day being predicted, and the number and types of days present in a year are not known in advance, since many of the factors that influence demand cannot be quantified. A Kohonen map has been proposed as a method for classifying days from their hourly demand profiles. The results have been compared with other classification techniques (Principal Component Analysis and Discriminant Analysis using Mahalanobis distances).

4. "DEFINING THE INPUT LAYER." Speaker: Susana Lopez Ornat, Dept. of Basic Psychology: Cognitive Processes, U.C.M. Abstract: The problem of defining the input layer for models of language acquisition. The experimental results obtained show that the input is co-defined by the system's own processes in the patterns arriving at the input layer.

5. "LEARNING THROUGH ADAPTIVE VALUE." Speakers: Antonio Murciano Cespedosa and Javier Zamora Romero, Dept. of Biomathematics (Applied Mathematics), U.C.M.
Abstract: A model for the centering of visual stimuli will be presented as an example of learning through adaptive value. The model is based on biologically inspired, evolutionarily selected structures. This type of learning allows behaviors that depend on the environment (both the training and the operating environment).

6. "METABOLIC NETWORKS: PROTEINS AND TURING MACHINES." Speaker: Mario Reviriego Eiros, Dept. of Biomathematics (Applied Mathematics), U.C.M. Abstract: The talk deals with the simulation of metabolic networks by means of enzymes (proteins) that "behave" as Turing machines and metabolites that are modeled as binary strings.

7. "COMPUTING PROTEIN SECONDARY STRUCTURE WITH AN UNSUPERVISED NEURAL NETWORK." Speaker: Pablo Chacon Montes, Dept. of Molecular Biology I (Faculty of Chemistry), U.C.M. Abstract: A Kohonen network has been developed for the topological classification of proteins from circular dichroism spectra. The resulting maps show an ordering that depends on the different types of secondary structure. This classification is then used to compute the percentage of secondary structure of new proteins.

8. "SELF-ORGANIZATION OF ON-OFF RECEPTIVE FIELDS IN THE VISUAL CORTEX." Speaker: Miguel A. Andrade Navarro, Dept. of Molecular Biology I (Faculty of Chemistry), U.C.M. Abstract: The self-organization of synaptic connections that takes place in the mammalian visual cortex during the early stages of development is modeled. A two-layer neural network is used in which the evolution of the connections depends on the correlation of neural activity through Hebbian rules. Neurons sensitive to the position, orientation and size of a stimulus are observed to emerge.

9. "NT5000: A NEURAL NETWORK PROCESSING SYSTEM." Speaker: Jose C. Chacon Gomez, Vision Laboratory, Faculty of Psychology. Abstract: Presentation of this system (specialized hardware plus software running on a PC) for the design and computation of neural networks.

10. "ASPIRIN/MIGRAINES: A TOOL FOR THE DESIGN AND ANALYSIS OF NEURAL NETWORKS." Speaker: Fernando Pescador, Servicio de Informatica, U.C.M. Abstract: The freely distributed tool AM6.0 is presented; it runs on workstations and supercomputers and is used for the design and analysis of neural networks.

===============================================================================

From MASETTI at BOLOGNA.INFN.IT Mon May 17 05:57:00 1993 From: MASETTI at BOLOGNA.INFN.IT (MASETTI@BOLOGNA.INFN.IT) Date: Mon, 17 MAY 93 09:57 GMT Subject: call for papers: SAC '94 Message-ID:

CALL FOR PAPERS
1994 ACM Symposium on Applied Computing (SAC'94)
TRACK ON FUZZY LOGIC IN APPLICATIONS
Phoenix Civic Plaza, Phoenix, Arizona, USA
March 6-8, 1994

SAC'94 is the annual conference of the ACM Special Interest Groups on Applied Computing (SIGAPP), APL (SIGAPL), Biomedical Computing (SIGBIO), Business Information Technology (SIGBIT), Computer Uses in Education (SIGCUE), Forth (SIGFORTH), and Small and Personal Computer (SIGSMALL/PC). Over the past nine years, SAC has become a primary forum for applied computing practitioners and researchers.
Once again SAC'94 will be held in conjunction with the 1994 ACM Computer Science Conference (CSC'94). Fuzzy Logic in Applications is one of the major tracks in SAC. The purpose of this track is to provide a forum for the interchange of ideas, research, development activities, and applications among academics and practitioners in areas related to fuzzy logic in applications. State-of-the-art and state-of-the-practice original papers relevant to the track themes, as well as panel proposals, are solicited.

RELEVANT TOPICS: Applications of Fuzzy Systems to:
- System Control
- Signal Processing
- Intelligent Information Systems
- Image Understanding
- Case-Based Reasoning
- Pattern Recognition
- Decision Making and Analysis
- Robotics and Automation
- Modelling
- Medical Diagnosis and MRI
- Databases and Information Retrieval
- Evolutionary Computation
- Neural Systems

IMPORTANT DATES: Submission of draft papers: 17 September 1993. Notification of acceptance: 1 November 1993. Camera-ready copy due: 20 November 1993.

TRACK CHAIR: Madjid Fathi, FB Informatik, LS1, P.O. Box 500 500, University of Dortmund, D-4600 Dortmund 50, Germany. Tel: +49231-7556372 FAX: +49231-7556555 Email: fathi at ls1.informatik.uni-dortmund.de

HONORARY ADVISOR: Lotfi A. Zadeh, University of California, Berkeley

TRACK ADVISORY: Y. Attikiouzel, Univ. of Western Australia; H. Berenji, NASA Ames Division, AI Research, CA, USA; M. Jamshidi, Univ. of New Mexico, NM, USA; A. Kandel, Univ. of South Florida, USA; R. Kruse, Univ. of Braunschweig, Germany; E.H. Mamdani, Univ. of London, GB; M. Masetti, Univ. of Bologna, Italy; H. Prade, Univ. of Paul Sabatier, France; B. Reusch, Univ. of Dortmund, Germany; E.H. Ruspini, SRI International, USA; H. Tanaka, Univ. of Osaka, Japan; L. Valverde, Univ. de les Illes Baleares, Spain; R.R. Yager, Iona College, Editor-in-Chief, USA; H.J. Zimmermann, Univ. of Aachen, Germany

GUIDELINES FOR SUBMISSION: Several categories of papers will be considered for presentation and publication, including (i) original and unpublished research articles, and (ii) reports of applications in business, government, industry, the arts, science, and engineering. Accepted papers will be published in the ACM/SAC'94 Conference Proceedings to be printed by the ACM Press. In order to facilitate the blind external review process, the submission guidelines must be strictly adhered to:
- Submit 5 copies of your manuscript to the track chair.
- Authors' names and addresses MUST NOT appear in the body of the paper; self-reference must be in the third person, attribution to the author(s) must be in the form of "author", and bibliographical entries by the author(s) must also be in the form of "author".
- The body of the paper should not exceed 5,000 words (approximately 20 double-spaced pages).
- A separate cover sheet should be attached to each copy, containing the title of the paper, the author(s) and affiliation(s), and the address (including e-mail address and fax number, if available) to which correspondence should be sent.
- Panel proposals must include an abstract of the topics and a copy of the moderator's resume/vita.

------- End of Forwarded Message

From Prahlad.Gupta at K.GP.CS.CMU.EDU Mon May 17 18:57:24 1993 From: Prahlad.Gupta at K.GP.CS.CMU.EDU (Prahlad.Gupta@K.GP.CS.CMU.EDU) Date: Mon, 17 May 93 18:57:24 EDT Subject: Cognitive Science Preprint Message-ID: FTP-host: reports.adm.cs.cmu.edu (128.2.218.42) FTP-filename: /1993/CMU-CS-93-146.ps The following article will appear in the Cognitive Science journal.
A preprint of the paper is available as CMU Computer Science Technical Report No. CMU-CS-93-146, in electronic as well as hard-copy form. Information follows about electronic retrieval (free), as well as ordering hard copies (for a small charge). Comments on the paper are invited. Note: A preliminary and substantially different version of this paper was announced in the neuroprose electronic archive in December-January 1991-92 as the file gupta.stress.ps.Z (which is no longer available). -- Prahlad ------------------------------------------------------------------------------ Connectionist Models and Linguistic Theory: Investigations of Stress Systems in Language Prahlad Gupta and David S. Touretzky Carnegie Mellon University (To appear in Cognitive Science) Abstract We question the widespread assumption that linguistic theory should guide the formulation of mechanistic accounts of human language processing. We develop a pseudo-linguistic theory for the domain of linguistic stress, based on observation of the learning behavior of a perceptron exposed to a variety of stress patterns. There are significant similarities between our analysis of perceptron stress learning and metrical phonology, the linguistic theory of human stress. Both approaches attempt to identify salient characteristics of the stress systems under examination without reference to the workings of the underlying processor. Our theory and computer simulations exhibit some strikingly suggestive correspondences with metrical theory. We show, however, that our high-level pseudo-linguistic account bears no causal relation to processing in the perceptron, and provides little insight into the nature of this processing. Because of the persuasive similarities between the nature of our theory and linguistic theorizing, we suggest that linguistic theory may be in much the same position. Contrary to the usual assumption, it may not provide useful guidance in attempts to identify processing mechanisms underlying human language. ------------------------------------------------------------------------------ INSTRUCTIONS FOR ELECTRONIC RETRIEVAL VIA ANONYMOUS FTP unix> ftp reports.adm.cs.cmu.edu # or ftp 128.2.218.42 Connected to reports.adm.cs.cmu.edu. 220 REPORTS.ADM.CS.CMU.EDU FTP server (Version 4.105 of 10-Jul-90 12:08) ready. Name (reports.adm.cs.cmu.edu:prahlad): anonymous 331 Guest login ok, send username at node as password. Password: # you must include the "@" 230-Filenames can not begin with "/.." . Other than that, everything is ok. 230 User anon logged in. ftp> cd 1993 250 Directory path set to 1993. ftp> get CMU-CS-93-146.ps 200 PORT command successful. 150 Opening data connection for CMU-CS-93-146.ps (128.2.248.83,1073) (591324 byt es). 226 Transfer complete. local: CMU-CS-93-146.ps remote: CMU-CS-93-146.ps 600021 bytes received in 10 seconds (57 Kbytes/s) ftp> quit unix> lpr -P CMU-CS-93-146.ps # or however you print PostScript files ------------------------------------------------------------------------------ ORDERING A HARD COPY (The TR No. 
is CMU-CS-93-146) Contact: Computer Science Documentation School of Computer Science Carnegie Mellon University Pittsburgh, Pennsylvania 15213, USA Phone: (412) 268-2596 Internet: reports at cs.cmu.edu ------------------------------------------------------------------------------ From sjodin at sics.se Tue May 18 09:11:23 1993 From: sjodin at sics.se (Gunnar Sj|din) Date: Tue, 18 May 1993 15:11:23 +0200 Subject: Two research positions at the Swedish Institute of Computer Science available. Message-ID: <9305181311.AA06261@sics.se> SICS is the joint effort of Swedish industry and government in computer science research. We are now entering the area of neural networks and would like to permanently employ one researcher, and invite a guest researcher for one year. They should be willing to help build the group and its research program. We are interested in candidates with a strong background both in the theory of the field and its applications. On the application side, we are particularly interested in methods for telecommunications and robotics but other areas may come in as well. Duties to begin as soon as possible after September 1, 1993. Apply, by June 15, in writing, email or fax to Gunnar Sjodin SICS Box 1263 S-164 28 Kista Sweden phone: +46-8-752 15 48, fax +46-8-751 72 30 email:sjodin at sics.se Please enclose a curriculum vitae, a list of publications, and the names, addresses, and phone numbers of two referees. From stiber at cs.ust.hk Wed May 19 14:14:10 1993 From: stiber at cs.ust.hk (Dr. Michael D. Stiber) Date: Wed, 19 May 93 14:14:10 HKT Subject: Paper in neuroprose: Learning In Neural Models With Complex Dynamics Message-ID: <9305190614.AA21563@cs.ust.hk> The following preprint has been placed in the Neuroprose archives at Ohio State (filename: stiber.dynlearn.ps.Z). If you cannot use FTP, I can email the file to you. "Learning In Neural Models With Complex Dynamics" (4 pages) Michael Stiber Department of Computer Science The Hong Kong University of Science and Technology Clear Water Bay, Kowloon, Hong Kong stiber at cs.ust.hk Jose P. Segundo Department of Anatomy and Cell Biology and Brain Research Institute University of California Los Angeles, California 90024, USA iaqfjps at mvs.oac.ucla.edu Abstract Interest in the ANN field has recently focused on dynamical neural {\em networks} for performing temporal operations, as more realistic models of biological information processing, and to extend ANN learning techniques. While this represents a step towards realism, it is important to note that {\em individual} neurons are complex dynamical systems, interacting through nonlinear, nonmonotonic connections. The result is that the ANN concept of {\em learning}, even when applied to a single synaptic connection, is a nontrivial subject. Based on recent results from living and simulated neurons, a first pass is made at clarifying this problem. We summarize how synaptic changes in a 2-neuron, single synapse neural network can change system behavior and how this constrains the type of modification scheme that one might want to use for realistic neuron-like processors. Dr. 
Michael Stiber stiber at cs.ust.hk Department of Computer Science tel: (852) 358 6981 The Hong Kong University of Science & Technology fax: (852) 358 1477 Clear Water Bay, Kowloon, Hong Kong

From pastor at max.ee.lsu.edu Wed May 19 16:41:28 1993 From: pastor at max.ee.lsu.edu (John Pastor) Date: Wed, 19 May 93 15:41:28 CDT Subject: No subject Message-ID: <9305192041.AA28051@max.ee.lsu.edu>

The following technical report is now available. If you would like to have a copy, please let me know. ------------------------------------------------------------------ Technical Report ECE/LSU 93-04 Another Alternative to Backpropagation: A One Pass Classification Scheme for Use with the Kak Algorithm John F. Pastor Department of Electrical and Computer Engineering Louisiana State University Baton Rouge, La. 70803 April 26, 1993 email: pastor at max.ee.lsu.edu

ABSTRACT Kak [1] provides a new technique for designing and training a feedforward neural network. Training with the Kak algorithm is much faster, and much easier to implement, than training with the backpropagation algorithm [2]. The Kak algorithm calls for the construction of a network with one hidden layer. Each hidden neuron classifies an input vector in the training set that maps to a nonzero output vector. Kak [1] also presents two classification algorithms. The first, CC1, provides generalization comparable to backpropagation [2] but may require numerous passes through the training set to classify one input vector. The second, CC2, requires only inspection of the vector we wish to classify but does not provide generalization. An extension of CC2 is suggested as a new classification scheme that classifies an input vector with only one pass through the training set yet provides generalization. Simulation results are presented which demonstrate that the new classification scheme not only significantly reduces training time but also provides better generalization than classifying with CC1. Thus, the Kak algorithm, used with this new classification scheme, is an even better alternative to backpropagation.
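The report and Kak's papers should be consulted for the actual CC1 and CC2 constructions and the extension proposed above; none of them are reproduced here. Purely as a rough illustration of the one-pass, prototype-style idea described in the abstract (one stored hidden unit per training example, with generalization coming from proximity to stored examples rather than from iterative weight updates), a minimal sketch might look as follows. The function names, the Hamming-distance rule, and the toy data are assumptions made for illustration only, not the report's method.

    # Illustrative only: NOT the CC1/CC2 algorithms or the extension in the
    # report. One "hidden unit" is stored per training example; a new binary
    # input is assigned the label of the nearest stored example in Hamming
    # distance, which is what provides generalization without iterative training.

    def train(examples):
        # the single "training pass" is just storage of the examples
        return list(examples)

    def hamming(a, b):
        # number of positions in which two binary vectors differ
        return sum(x != y for x, y in zip(a, b))

    def classify(units, x):
        # one pass over the stored units; the nearest prototype wins
        return min(units, key=lambda unit: hamming(unit[0], x))[1]

    units = train([((1, 0, 1, 1), "A"), ((0, 1, 0, 0), "B")])
    print(classify(units, (1, 0, 0, 1)))   # prints "A": one bit away from the first example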
From mozer at dendrite.cs.colorado.edu Sat May 1 15:42:26 1993 From: mozer at dendrite.cs.colorado.edu (Michael C. Mozer) Date: Sat, 1 May 1993 13:42:26 -0600 Subject: Final call for NIPS workshop proposals Message-ID: <199305011942.AA13154@neuron.cs.colorado.edu>

CALL FOR PROPOSALS NIPS*93 Post-Conference Workshops December 3 and 4, 1993 Vail, Colorado

Following the regular program of the Neural Information Processing Systems 1993 conference, workshops on current topics in neural information processing will be held on December 3 and 4, 1993, in Vail, Colorado. Proposals by qualified individuals interested in chairing one of these workshops are solicited. Past topics have included: active learning and control; architectural issues; attention; bayesian analysis; benchmarking neural network applications; computational complexity issues; computational neuroscience; fast training techniques; genetic algorithms; music; neural network dynamics; optimization; recurrent nets; rules and connectionist models; self-organization; sensory biophysics; speech; time series prediction; vision; and VLSI and optical implementations. The goal of the workshops is to provide an informal forum for researchers to discuss important issues of current interest. Sessions will meet in the morning and in the afternoon of both days, with free time in between for ongoing individual exchange or outdoor activities. Concrete open and/or controversial issues are encouraged and preferred as workshop topics.
Individuals proposing to chair a workshop will have responsibilities including: arranging short informal presentations by experts working on the topic, moderating or leading the discussion and reporting its high points, findings, and conclusions to the group during evening plenary sessions (the "gong show"), and writing a brief (2 page) summary. Submission Procedure: Interested parties should submit a short proposal for a workshop of interest postmarked by May 22, 1993. (Express mail is *not* necessary. Submissions by electronic mail will also be accepted.) Proposals should include a title, a description of what the workshop is to address and accomplish, and the proposed length of the workshop (one day or two days). It should motivate why the topic is of interest or controversial, why it should be discussed and what the targeted group of participants is. In addition, please send a brief resume of the prospective workshop chair, a list of publications and evidence of scholarship in the field of interest. Mail submissions to: Mike Mozer NIPS*93 Workshops Chair Department of Computer Science University of Colorado Boulder, CO 80309-0430 USA (e-mail: mozer at cs.colorado.edu) Name, mailing address, phone number, fax number, and e-mail net address should be on all submissions. PROPOSALS MUST BE POSTMARKED BY MAY 22, 1993 Please Post From kanal at cs.UMD.EDU Sat May 1 23:40:06 1993 From: kanal at cs.UMD.EDU (Laveen N. Kanal) Date: Sat, 1 May 93 23:40:06 -0400 Subject: some recent papers which may be of interest to you. Message-ID: <9305020340.AA12865@nanik.cs.UMD.EDU> Laveen N. Kanal, " On pattern, categories, and alternate realities", published in Pattern Recognition Letters, vol 14, pp. 241-255, March 1993, Elsevier/North-Holland. Tbis is the text of the talk given by the author at The Hague, The Netherlands, when he received the King-Sun Fu award of the International Association for Pattern Recognition. Contents: Preamble Pattern Some sketches from the current pattern recognition scene Artificial neural networks Hybrid systems "Where's the AI?" Categorization Alternate realities Prospects Concluding remarks "Time goes, you say? Ah, no! Alas Time stays, we go;" Pierre de Ronsard The Paradox of Time (Austin Dobson, tr) Other Recent Papers: R. Bhatnagar & L.N. Kanal, "Structural and Probabilistic Knowledge for Abductive Reasoning",IEEE Trans. on Pattern Analysis and Machine Intelligence, special issue on Probabilistic Reasoning, March 1993. L. Kanal & S. Raghavan," Hybrid Systems- A Key to Intelligent Pattern Recognition", IJCNN-92, Proc. Int. Joint. Conf on Neural Networks, June 1992. B.J. Hellstrom & L.N. Kanal, "Asymmetric Mean-Field Neural Networks for Multiprocessor Scheduling", Neural Networks, Vol. 5, pp 671-686, May 1992. L.N. Kanal & G.R. Dattatreya, "Pattern Recognition", in S. Shapiro (ed), Encyclopedia of Artificial Intelligence, 2nd edition John Wiley 1992. R. Bhatnagar & L.N. Kanal, " Reasoning in Uncertain Domains-A Survey and Commentary", in A. Kent & J.G. Williams (eds), Encyclopedia of Computer Science and Technology,p. 297-316,(also in Encyclopedia of Microcomputers, Marcel Dekker, Inc, 1992. Laveen N. Kanal Prof. of Computer Science A.V. Williams Bldg. Univ. 
of Maryland College Park, MD 20742 USA kanal at cs.umd.edu From hwang at pierce.ee.washington.edu Mon May 3 15:17:00 1993 From: hwang at pierce.ee.washington.edu (Jenq-Neng Hwang) Date: Mon, 3 May 93 12:17:00 PDT Subject: back-propagation and projection pursuit learning Message-ID: <9305031917.AA22668@pierce.ee.washington.edu.> Technical Report available from neuroprose: REGRESSION MODELING IN BACK-PROPAGATION AND PROJECTION PURSUIT LEARNING Jenq-Neng Hwang, Shyh-Rong Lay Information Processing Laboratory Department of Electrical Engineering, FT-10 University of Washington, Seattle, WA 98195 and Martin Maechler, Doug Martin, Jim Schimert Department of Statistics, GN-22 University of Washington, Seattle, WA 98195 ABSTRACT We studied and compared two types of connectionist learning methods for model-free regression problems in this paper. One is the popular "back-propagation" learning (BPL) well known in the artificial neural networks literature; the other is the "projection pursuit" learning (PPL) emerged in recent years in the statistical estimation literature. Both the BPL and the PPL are based on projections of the data in directions determined from interconnection weights. However, unlike the use of fixed nonlinear activations (usually sigmoidal) for the hidden neurons in BPL, the PPL systematically approximates the unknown nonlinear activations. Moreover, the BPL estimates all the weights simultaneously at each iteration, while the PPL estimates the weights cyclically (neuron-by-neuron and layer-by-layer) at each iteration. Although the BPL and the PPL have comparable training speed when based on a Gauss-Newton optimization algorithm, the PPL proves more parsimonious in that the PPL requires a fewer hidden neurons to approximate the true function. To further improve the statistical performance of the PPL, an orthogonal polynomial approximation is used in place of the supersmoother method originally proposed for nonlinear activation approximation in the PPL. ================ To obtain copies of the postscript file, please use Jordan Pollack's service (no hardcopies will be provided): Example: unix> ftp archive.cis.ohio-state.edu (or ftp 128.146.8.52) Name (archive.cis.ohio-state.edu): anonymous Password (archive.cis.ohio-state.edu:anonymous): ftp> cd pub/neuroprose ftp> binary ftp> get hwang.bplppl.ps.Z ftp> quit unix> uncompress hwang.bplppl.ps Now print "hwang.bplppl.ps" as you would any other (postscript) file. From bovik at cs.utexas.edu Sat May 1 11:57:40 1993 From: bovik at cs.utexas.edu (Alan C. Bovik) Date: Sat, 1 May 1993 10:57:40 -0500 Subject: No subject Message-ID: <9305011557.AA16002@im4u.cs.utexas.edu> FIRST IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING November 13-16, 1994 Austin Convention Center, Austin, Texas, USA PRELIMINARY CALL FOR PAPERS Sponsored by the Institute of Electrical and Electronics En- gineers (IEEE) Signal Processing Society, ICIP-94 is the inaugur- al international conference on theoretical, experimental and ap- plied image processing. It will provide a centralized, high- quality forum for presentation of technological advances and research results by scientists and engineers working in Image Processing and associated disciplines such as multimedia and video technology. Also encouraged are image processing applica- tions in areas such as the biomedical sciences and geosciences. SCOPE: 1. 
IMAGE PROCESSING: Coding, Filtering, Enhancement, Restoration, Segmentation, Multiresolution Processing, Multispectral Process- ing, Image Representation, Image Analysis, Interpolation and Spa- tial Transformations, Motion Detection and Estimation, Image Se- quence Processing, Video Signal Processing, Neural Networks for image processing and model-based compression, Noise Modeling, Architectures and Software. 2. COMPUTED IMAGING: Acoustic Imaging, Radar Imaging, Tomography, Magnetic Resonance Imaging, Geophysical and Seismic Imaging, Ra- dio Astronomy, Speckle Imaging, Computer Holography, Confocal Mi- croscopy, Electron Microscopy, X-ray Crystallography, Coded- Aperture Imaging, Real-Aperture Arrays. 3. IMAGE SCANNING DISPLAY AND PRINTING: Scanning and Sampling, Quantization and Halftoning, Color Reproduction, Image Represen- tation and Rendering, Graphics and Fonts, Architectures and Software for Display and Printing Systems, Image Quality, Visual- ization. 4. VIDEO: Digital video, Multimedia, HD video and packet video, video signal processor chips. 5. APPLICATIONS: Application of image processing technology to any field. PROGRAM COMMITTEE: GENERAL CHAIR: Alan C. Bovik, U. Texas, Austin TECHNICAL CHAIRS: Tom Huang, U. Illinois, Champaign and John W. Woods, Rensselaer, Troy SPECIAL SESSIONS CHAIR: Mike Orchard, U. Illinois, Champaign EAST EUROPEAN LIASON: Henri Maitre, TELECOM, Paris FAR EAST LIASON: Bede Liu, Princeton University SUBMISSION PROCEDURES Prospective authors are invited to propose papers for lecture or poster presentation in any of the technical areas listed above. To submit a proposal, prepare a 2-3 page summary of the paper in- cluding figures and references. Send five copies of the paper summaries to: John W. Woods Center for Image Processing Research Rensselaer Polytechnic Institute Troy, NY 12180-3590, USA. Each selected paper (five-page limit) will be published in the Proceedings of ICIP-94, using high-quality paper for good image reproduction. Style files in LaTeX will be provided for the con- venience of the authors. SCHEDULE Paper summaries/abstracts due: 15 February 1994 Notification of Acceptance: 1 May 1994 Camera-Ready papers: 15 July 1994 Conference: 13-16 November 1994 CONFERENCE ENVIRONMENT ICIP-94 will be held in the recently completed state-of-the-art Convention Center in downtown Austin. The Convention Center is situated two blocks from the Town Lake, and is only 12 minutes from Robert Meuller Airport. It is surrounded by many modern hotels that provide comfortable accommodation for $75-$125 per night. Austin, the state capital, is renowned for its natural hill- country beauty and an active cultural scene. Within walking dis- tance of the Convention Center are several hiking and jogging trails, as well as opportunities for a variety of aquatic sports. Live bands perform in various clubs around the city and at night spots along Sixth Street, offering a range of jazz, blues, country/Western, reggae, swing and rock music. Day temperatures are typically in the upper sixties in mid-November. An exciting range of EXHIBITS, VENDOR PRESENTATIONS, and SOCIAL EVENTS is being planned. Innovative proposals for TUTORIALS, and SPECIAL SESSIONS are invited. 
For further details about ICIP-94, please contact: Conference Management Services 3024 Thousand Oaks Drive Austin, Texas 78746 Tel: 512/327/4012; Fax:512/327/8132 email: icip at pine.ece.utexas.edu PRELIMINARY CALL FOR PAPERS FIRST IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING November 13-16, 1994 Austin Convention Center, Austin, Texas, USA From hwang at pierce.ee.washington.edu Mon May 3 15:18:07 1993 From: hwang at pierce.ee.washington.edu (Jenq-Neng Hwang) Date: Mon, 3 May 93 12:18:07 PDT Subject: Markov random field modeling via neural networks Message-ID: <9305031918.AA22673@pierce.ee.washington.edu.> Technical Report available from neuroprose: TEXTURED IMAGE SYNTHESIS AND SEGMENTATION VIA NEURAL NETWORK PROBABILISTIC MODELING Jenq-Neng Hwang, Eric Tsung-Yen Chen Information Processing Laboratory Department of Electrical Engineering, FT-10 University of Washington, Seattle, WA 98195 ABSTRACT It has been shown that a trained back-propagation neural network (BPNN) classifier with Kullback-Leibler criterion produces outputs which can be interpreted as estimates of Bayesian "a posteriori" probabilities. Based on this interpretation, we propose a back-propagation neural network (BPNN) approach for the estimation of the local conditional distributions of textured images, which are commonly represented by a Markov random field (MRF) formulation. The proposed BPNN approach overcomes many of the difficulties encountered in using MRF formulation. In particular our approach does not require the trial-and-error selection of clique functions or the subsequent laborious and unreliable estimation of clique parameters. Simulations show that the images synthesized using BPNN modeling produced desired artificial/real textures more consistently than MRF based methods. Application of the proposed BPNN approach to segmentation of artificial and real-world textures is also presented. ================ To obtain copies of the postscript file, please use Jordan Pollack's service (no hardcopies will be provided): Example: unix> ftp archive.cis.ohio-state.edu (or ftp 128.146.8.52) Name (archive.cis.ohio-state.edu): anonymous Password (archive.cis.ohio-state.edu:anonymous): ftp> cd pub/neuroprose ftp> binary ftp> get hwang.nnmrf.ps.Z ftp> quit unix> uncompress hwang.nnmrf.ps Now print "hwang.nnmrf.ps" as you would any other (postscript) file. From mel at cns.caltech.edu Mon May 3 15:30:55 1993 From: mel at cns.caltech.edu (Bartlett Mel) Date: Mon, 3 May 93 12:30:55 PDT Subject: NIPS*93: Deadline May 22 Message-ID: <9305031930.AA20897@plato.cns.caltech.edu> ******************** FINAL REMINDER, NOTE DEADLINE OF MAY 22 ***************** CALL FOR PAPERS Neural Information Processing Systems -Natural and Synthetic- Monday, November 29 - Thursday, December 2, 1993 Denver, Colorado This is the seventh meeting of an inter-disciplinary conference which brings together neuroscientists, engineers, computer scien- tists, cognitive scientists, physicists, and mathematicians in- terested in all aspects of neural processing and computation. There will be an afternoon of tutorial presentations (Nov 29) preceding the regular session and two days of focused workshops will follow at a nearby ski area (Dec 3-4). Major categories and examples of subcategories for paper submis- sions are the following: Neuroscience: Studies and Analyses of Neurobiological Systems, Inhibition in cortical circuits, Signals and noise in neural computation, Computational and Theoretical Neurobiology, Neu- rophysics. 
Theory: Computational Learning Theory, Complexity Theory, Dynamical Systems, Statistical Mechanics, Probability and Statistics, Approximation Theory. Implementation and Simulation: VLSI, Optical, Software Simulators, Implementation Languages, Parallel Processor Design and Benchmarks. Algorithms and Architectures: Learning Algorithms, Constructive and Pruning Algorithms, Localized Basis Functions, Tree Structured Networks, Performance Comparisons, Recurrent Networks, Combinatorial Optimization, Genetic Algorithms. Cognitive Science & AI: Natural Language, Human Learning and Memory, Perception and Psychophysics, Symbolic Reasoning. Visual Processing: Stereopsis, Visual Motion, Recognition, Image Coding and Classification. Speech and Signal Processing: Speech Recognition, Coding, and Synthesis, Text-to-Speech, Adaptive Equalization, Nonlinear Noise Removal. Control, Navigation, and Planning: Navigation and Planning, Learning Internal Models of the World, Trajectory Planning, Robotic Motor Control, Process Control. Applications: Medical Diagnosis or Data Analysis, Financial and Economic Analysis, Timeseries Prediction, Protein Structure Prediction, Music Processing, Expert Systems. Technical Program: Plenary, contributed and poster sessions will be held. There will be no parallel sessions. The full text of presented papers will be published. Submission Procedures: Original research contributions are solicited, and will be carefully refereed. Authors must submit six copies of a 1000-word (or less) summary and six copies of a separate single-page 50-100 word abstract clearly stating their results, postmarked by May 22, 1993 (express mail is not necessary). Accepted abstracts will be published in the conference program. Summaries are for program committee use only. At the bottom of each abstract page and on the first summary page, indicate preference for oral or poster presentation and specify one of the above nine broad categories and, if appropriate, subcategories (for example: Poster, Applications-Expert Systems; Oral, Implementation-Analog VLSI). Include addresses of all authors at the front of the summary and the abstract and indicate to which author correspondence should be addressed. Submissions that lack category information, separate abstract sheets, the required six copies, or author addresses, or that arrive late, will not be considered. Mail Submissions To: Gerry Tesauro NIPS*93 Program Chair The Salk Institute, CNL 10010 North Torrey Pines Rd. La Jolla, CA 92037 Mail For Registration Material To: NIPS*93 Registration NIPS Foundation PO Box 60035 Pasadena, CA 91116-6035 All submitting authors will be sent registration material automatically. Program committee decisions will be sent to the correspondence author only. NIPS*93 Organizing Committee: General Chair, Jack Cowan, University of Chicago; Publications Chair, Joshua Alspector, Bellcore; Publicity Chair, Bartlett Mel, CalTech; Program Chair, Gerry Tesauro, IBM/Salk Institute; Treasurer, Rodney Goodman, CalTech; Local Arrangements, Chuck Anderson, Colorado State University; Tutorials Chair, Dave Touretzky, Carnegie-Mellon; Workshop Chair, Mike Mozer, University of Colorado; Program Co-Chairs: Larry Abbott, Brandeis Univ; Chris Atkeson, MIT; A. B.
Bonds, Vanderbilt Univ; Gary Cottrell, UCSD; Scott Fahlman, CMU; Rod Goodman, Caltech; John Hertz, NORDITA/NIH; John Lazzaro, UC Berkeley; Todd Leen, OGI; Jay McClelland, CMU; Nelson Morgan, ICSI; Steve Nowlan, Salk Inst./Synaptics; Misha Pavel, NASA/OGI; Sandy Pentland, MIT; Tom Petsche, Siemens. Domestic Liaisons: IEEE Liaison, Terrence Fine, Cornell; Government & Corporate Liaison, Lee Giles, NEC Research Institute Inc.; Overseas Liaisons: Mitsuo Kawato, ATR; Marwan Jabri, University of Sydney; Gerard Dreyfus, Ecole Superieure, Paris; Alan Murray, University of Edinburgh; Andreas Meier, Simon Bolivar U. DEADLINE FOR SUMMARIES & ABSTRACTS IS MAY 22, 1993 (POSTMARKED) please post From elman at crl.ucsd.edu Mon May 3 23:23:25 1993 From: elman at crl.ucsd.edu (Jeff Elman) Date: Mon, 3 May 93 20:23:25 PDT Subject: new books in MIT Neural Network/Connectionism series Message-ID: <9305040323.AA15510@crl.ucsd.edu> The following books have now appeared as part of the Neural Network Modeling and Connectionism Series, and may be of interest to readers of the connectionists mailing group. Detailed descriptions of each book, along with tables of contents, follow. Jeff Elman ============================================================ Neural Network Modeling and Connectionism Series Jeffrey Elman, editor. MIT Press/Bradford Books. * Miikkulainen, R. "Subsymbolic Natural Language Processing An Integrated Model of Scripts, Lexicon, and Memory" * Mitchell, M. "Analogy-Making as Perception A Computer Model" * Cleeremans, A. "Mechanisms of Implicit Learning Connectionist Models of Sequence Processing" * Sereno, M.E. "Neural Computation of Pattern Motion Modeling Stages of Motion Analysis in the Primate Visual Cortex" * Miller, W.T., Sutton, R.S., & Werbos, P.J. (Eds.), "Neural Networks for Control" * Hanson, S.J., & Olson, C.R. (Eds.) "Connectionist Modeling and Brain Function The Developing Interface" * Judd, S.J. "Neural Network Design and the Complexity of Learning" * Mozer, M.C. "The Perception of Multiple Objects A Connectionist Approach" ------------------------------------------------------------ New Subsymbolic Natural Language Processing An Integrated Model of Scripts, Lexicon, and Memory Risto Miikkulainen Aiming to bridge the gap between low-level connectionist models and high-level symbolic artificial intelligence, Miikkulainen describes DISCERN, a complete natural language processing system implemented entirely at the subsymbolic level. In DISCERN, distributed neural network models of parsing, generating, reasoning, lexical processing, and episodic memory are integrated into a single system that learns to read, paraphrase, and answer questions about stereotypical narratives. Using the DISCERN system as an example, Miikkulainen introduces a general approach to building high-level cognitive models from distributed neural networks, and shows how the special properties of such networks are useful in modeling human performance. In this approach, connectionist networks are not only plausible models of isolated cognitive phenomena, but also sufficient constituents for complete artificial intelligence systems. Risto Miikkulainen is an Assistant Professor in the Department of Computer Sciences at the University of Texas, Austin. Contents: I. Overview. Introduction. Background. Overview of DISCERN. II. Processing Mechanisms. Backpropagation Networks. Developing Representations in FGREP Modules. Building from FGREP Modules. III. Memory Mechanisms. Self-Organizing Feature Maps.
Episodic Memory Organization: Hierarchical Feature Maps. Episodic Memory Storage and Retrieval: Trace Feature Maps. Lexicon. IV. Evaluation. Behavior of the Complete Model. Discussion. Comparison to Related Work. Extensions and Future Work. Conclusions. Appendixes: A Story Data. Implementation Details. Instructions for Obtaining the DISCERN Software. A Bradford Book May 1993 - 408 pp. - 129 illus. - $45.00 0-262-13290-7 MIISH ------------------------------------------------------------ New Analogy-Making as Perception A Computer Model Melanie Mitchell Analogy-Making as Perception is based on the premise that analogy-making is fundamentally a high-level perceptual process in which the interaction of perception and concepts gives rise to "conceptual slippages" which allow analogies to be made. It describes Copycat, developed by the author with Douglas Hofstadter, that models the complex, subconscious interaction between perception and concepts that underlies the creation of analogies. In Copycat, both concepts and high-level perception are emergent phenomena, arising from large numbers of low-level, parallel, non-deterministic activities. In the spectrum of cognitive modeling approaches, Copycat occupies a unique intermediate position between symbolic systems and connectionist systems - a position that is at present the most useful one for understanding the fluidity of concepts and high-level perception. On one level the work described here is about analogy-making, but on another level it is about cognition in general. It explores such issues as the nature of concepts and perception and the emergence of highly flexible concepts from a lower-level "subcognitive" substrate. Melanie Mitchell, Assistant Professor in the Department of Electrical Engineering and Computer Science at the University of Michigan, is a Fellow of the Michigan Society of Fellows. She is also Director of the Adaptive Computation Program at the Santa Fe Institute. Contents: Introduction. High-Level Perception, Conceptual Slippage, and Analogy-Making in a Microworld. The Architecture of Copycat. Copycat's Performance on the Five Target Problems. Copycat's Performance on Variants of the Five Target Problems. Summary of the Comparisons between Copycat and Human Subjects. Some Shortcomings of the Model. Results of Selected "Lesions" of Copycat. Comparisons with Related Work. Contributions of This Research. Afterword by Douglas R. Hofstadter. Appendixes. A Sampler of Letter-String Analogy Problems Beyond Copycat's Current Capabilities. Parameters and Formulas. More Detailed Descriptions of Codelet Types. A Bradford Book May 1993 - 382 pp. - 168 illus. - $45.00 0-262-13289-3 MITAH ------------------------------------------------------------ New Mechanisms of Implicit Learning Connectionist Models of Sequence Processing Axel Cleeremans What do people learn when they do not know that they are learning? Until recently all of the work in the area of implicit learning focused on empirical questions and methods. In this book, Axel Cleeremans explores unintentional learning from an information-processing perspective. He introduces a theoretical framework that unifies existing data and models on implicit learning, along with a detailed computational model of human performance in sequence-learning situations. The model, based on a simple recurrent network (SRN), is able to predict the successive elements of sequences generated from finite-state grammars. 
Human subjects are shown to exhibit a similar sensitivity to the temporal structure in a series of choice reaction time experiments of increasing complexity; yet their explicit knowledge of the sequence remains limited. Simulation experiments indicate that the SRN model is able to account for these data in great detail. Other architectures that process sequential material are considered. These are contrasted with the SRN model, which they sometimes outperform. Considered together, the models show how complex knowledge may emerge through the operation of elementary mechanisms - a key aspect of implicit learning performance. Axel Cleeremans is a Senior Research Assistant at the National Fund for Scientific Research, Belgium. Contents: Implicit Learning: Explorations in Basic Cognition. The SRN Model: Computational Aspects of Sequence Processing. Sequence Learning as a Paradigm for Studying Implicit Learning. Sequence Learning: Further Explorations. Encoding Remote Control. Explicit Sequence Learning. General Discussion. A Bradford Book April 1993 - 227 pp. - 60 illus. - $30.00 0-262-03205-8 CLEMH ------------------------------------------------------------ New Neural Computation of Pattern Motion Modeling Stages of Motion Analysis in the Primate Visual Cortex Margaret Euphrasia Sereno How does the visual system compute the global motion of an object from local views of its contours? Although this important problem in computational vision (also called the aperture problem) is key to understanding how biological systems work, there has been surprisingly little neurobiologically plausible work done on it. This book describes a neurally based model, implemented as a connectionist network, of how the aperture problem is solved. It provides a structural account of the model's performance on a number of tasks and demonstrates that the details of implementation influence the nature of the computation as well as predict perceptual effects that are unique to the model. The basic approach described can be extended to a number of different sensory computations. "This is an important book, discussing a significant and very general problem in sensory processing. The model presented is simple, and it is elegant in that we can see, intuitively, exactly why and how it works. Simplicity, clarity and elegance are virtues in any field, but not often found in work in neural networks and sensory processing. The model described in Sereno's book is an exception. This book will have a sizeable impact on the field." - James Anderson, Professor, Department of Cognitive and Linguistic Sciences, Brown University Contents: Introduction. Computational, Psychophysical, and Neurobiological Approaches to Motion Measurement. The Model. Simulation Results. Psychophysical Demonstrations. Summary and Conclusions. Appendix: Aperture Problem Linearity. A Bradford Book March 1993 - 181 pp.- 41 illus. - $24.95 0-262-19329-9 SERNH ------------------------------------------------------------ Neural Networks for Control edited by W. Thomas Miller, III, Richard S. Sutton, and Paul J. Werbos This book brings together examples of all of the most important paradigms in artificial neural networks (ANNs) for control, including evaluations of possible applications. An appendix provides complete descriptions of seven benchmark control problems for those who wish to explore new ideas for building automatic controllers. Contents: I.General Principles. Connectionist Learning for Control: An Overview, Andrew G. Barto. 
Overview of Designs and Capabilities, Paul J. Werbos. A Menu of Designs for Reinforcement Learning Over Time, Paul J. Werbos. Adaptive State Representation and Estimation Using Recurrent Connectionist Networks, Ronald J. Williams. Adaptive Control using Neural Networks, Kumpati S. Narendra. A Summary Comparison of CMAC Neural Network and Traditional Adaptive Control Systems, L. Gordon Kraft, III, and David P. Campagna. Recent Advances in Numerical Techniques for Large Scale Optimization, David F. Shanno. First Results with Dyna, An Integrated Architecture for Learning, Planning and Reacting, Richard S. Sutton. II. Motion Control. Computational Schemes and Neural Network Models for Formation and Control of Multijoint Arm Trajectory, Mitsuo Kawato. Vision-Based Robot Motion Planning, Bartlett W. Mel. Using Associative Content-Addressable Memories to Control Robots, Christopher G. Atkeson and David J. Reinkensmeyer. The Truck Backer-Upper: An Example of Self-Learning in Neural Networks, Derrick Nguyen and Bernard Widrow. An Adaptive Sensorimotor Network Inspired by the Anatomy and Physiology of the Cerebellum, James C. Houk, Satinder P. Singh, Charles Fisher, and Andrew G. Barto. Some New Directions for Adaptive Control Theory in Robotics, Judy A. Franklin and Oliver G. Selfridge. III. Application Domains. Applications of Neural Networks in Robotics and Automation for Manufacturing, Arthur C. Sanderson. A Bioreactor Benchmark for Adaptive Network-based Process Control, Lyle H. Ungar. A Neural Network Baseline Problem for Control of Aircraft Flare and Touchdown, Charles C. Jorgensen and C. Schley. Intelligent Control for Multiple Autonomous Undersea Vehicles, Martin Herman, James S. Albus, and Tsai-Hong Hong. A Challenging Set of Control Problems, Charles W. Anderson and W. Thomas Miller. A Bradford Book 1990 - 524 pp. - $52.50 0-262-13261-3 MILNH ------------------------------------------------------------ Connectionist Modeling and Brain Function The Developing Interface edited by Stephen Jose Hanson and Carl R. Olson This tutorial on current research activity in connectionist-inspired biology-based modeling describes specific experimental approaches and also confronts general issues related to learning, associative memory, and sensorimotor development. "This volume makes a convincing case that data-rich brain scientists and model-rich cognitive psychologists can and should talk to one another. The topics they discuss together here - memory and perception - are of vital interest to both, and their collaboration promises continued excitement along this new scientific frontier." - George Miller, Princeton University Contents: Part I: Overview. Introduction: Connectionism and Neuroscience, S. J. Hanson and C. R. Olson. Computational Neuroscience, T. J. Sejnowski, C. Koch, and P. S. Churchland. Part II: Associative Memory and Conditioning. The Behavioral Analysis of Associative Learning in the Terrestrial Mollusc Limax Maximus: The Importance of Inter-event Relationships, C. L. Sahley. Neural Models of Classical Conditioning: A Theoretical Viewpoint, G. Tesauro. Unsupervised Perceptual Learning: A Paleocortical Model, R. Granger, J. Ambros-Ingerson, P. Anton, and G. Lynch. Part III. The Somatosensory System. Biological Constraints on a Dynamic Network: The Somatosensory Nervous System, T. Allard. A Model of Receptive Field Plasticity and Topographic Reorganization in the Somatosensory Cortex, L. H. Finkel. Spatial Representation of the Body, C. R. Olson and S. J. Hanson.
Part IV: The Visual System. The Development of Ocular Dominance Columns: Mechanisms and Models, K. D. Miller and M. P. Stryker. Self-Organization in a Perceptual System: How Network Models and Information Theory May Shed Light on Neural Organization, R. Linsker. Solving the Brightness-From-Luminance Problem: A Neural Architecture for Invariant Brightness Perception, S. Grossberg and D. Todorovic. A Bradford Book 1990 - 423 pp. - $44.00 0-262-08193-8 HANCH ------------------------------------------------------------ Neural Network Design and the Complexity of Learning J. Stephen Judd Using the tools of complexity theory, Stephen Judd develops a formal description of associative learning in connectionist networks. He rigorously exposes the computational difficulties in training neural networks and explores how certain design principles will or will not make the problems easier. "Judd . . . formalized the loading problem and proved it to be NP-complete. This formal work is clearly explained in his book in such a way that it will be accessible both to the expert and nonexpert." - Eric B. Baum, IEEE Transactions on Neural Networks "Although this book is the true successor to Minsky and Papert's maligned masterpiece of 1969 (Perceptrons), Judd is not trying to demolish the field of neurocomputing. His purpose is to clarify the limitations of a wide class of network models and thereby suggest guidelines for practical applications." - Richard Forsyth, Artificial Intelligence & Behavioral Simulation Contents: Neural Networks: Hopes, Problems, and Goals. The Loading Problem. Other Studies of Learning. The Intractability of Loading. Subcases. Shallow Architectures. Memorization and Generalization. Conclusions. Appendices. A Bradford Book 1990 - 150 pp. - $27.50 0-262-10045-2 JUDNH ------------------------------------------------------------ The Perception of Multiple Objects A Connectionist Approach Michael C. Mozer Building on the vision studies of David Marr and the connectionist modeling of the PDP group, this book describes a neurally inspired computational model of two-dimensional object recognition and spatial attention that can explain many characteristics of human visual perception. The model, called MORSEL, can actually recognize several two-dimensional objects at once (previous models have tended to blur multiple objects into one or to overload). Mozer's is a fully mechanistic account, not just a functional-level theory. "Mozer's work makes a major contribution to the study of visual information processing. He has developed a very creative and sophisticated new approach to the problem of visual object recognition. The combination of computational rigor with thorough and knowledgeable examination of psychological results is impressive and unique." - Harold Pashler, University of California at San Diego Contents: Introduction. Multiple Word Recognition. The Pull-Out Network. The Attentional Mechanism. The Visual Short-Term Memory. Psychological Phenomena Explained by MORSEL. Evaluation of MORSEL. Appendixes: A Comparison of Hardware Requirements. Letter Cluster Frequency and Discriminability Within BLIRNET's Training Set.
A Bradford Book 1991 - 217 pp - $27.50 0-262-13270-2 MOZPH ------------------------------------------------------------- ORDER FORM Please send me the following book(s): Qty Author Bookcode Price ___ Cleeremans CLEMH 30.00 ___ Hanson HANCH 44.00 ___ Judd JUDNH 27.50 ___ Miikkulainen MIISH 45.00 ___ Miller MILNH 52.50 ___ Mitchell MITAH 45.00 ___ Mozer MOZPH 27.50 ___ Sereno SERNH 24.95 ___ Payment Enclosed ___ Purchase Order Attached Charge to my ___ Master Card ___ Visa Card# _______________________________ Exp.Date _______________ Signature _________________________________________________ _____ Total for book $2.75 Postage _____ Please add 50c postage for each additional book _____ Canadian customers Add 7% GST _____ TOTAL due MIT Press Send To: Name ______________________________________________________ Address ___________________________________________________ City ________________________ State ________ Zip __________ Daytime Phone ________________ Fax ________________________ Make checks payable and send order to: The MIT Press * 55 Hayward Street * Cambridge, MA 02142 For fastest service call (617) 625-8569 or toll-free 1-800-356-0343 The MIT Guarantee: If for any reason you are not completely satisfied, return your book(s) within ten days of receipt for a full refund or credit. 3ENET From rreilly at nova.ucd.ie Tue May 4 03:50:49 1993 From: rreilly at nova.ucd.ie (Ronan Reilly) Date: Tue, 4 May 1993 08:50:49 +0100 Subject: CforP: Workshop on NLP Message-ID: Call for Participation in the 2ND WORKSHOP ON THE COGNITIVE SCIENCE OF NATURAL LANGUAGE PROCESSING 26-27 July, 1993 Dublin City University Guest Speakers: Walter Daelemans University of Tilburg Ronan Reilly University College Dublin Attendance at the CSNLP workshop will be by invitation on the basis of a submitted paper. Those wishing to be considered should send a paper of not more than eight A4 pages to Sean O'Nuallain or Andy Way, School of Computer Applications, Dublin City University, Dublin 9, Ireland, no later than 14 June, 1993. Notification of acceptance along with registration and accommodation details will be sent out by 25 June, 1993. Submitting authors should also send their fax number and/or e-mail address to help speed up the selection process. The particular focus of the workshop will be on the computational modelling of human natural language processing (NLP), and preference will be given to papers that present empirically supported computational models of any aspect of human NLP. An additional goal in selecting papers will be to provide coverage of a range of NLP areas. From hwang at pierce.ee.washington.edu Tue May 4 13:06:05 1993 From: hwang at pierce.ee.washington.edu (Jenq-Neng Hwang) Date: Tue, 4 May 93 10:06:05 PDT Subject: apology Message-ID: <9305041706.AA24985@pierce.ee.washington.edu.> We apologize that the postscript files we recently placed in Neuroprose turned out to be incompatible with some of your printers. We will fix these problems and reload these three reports ASAP. These three files are: hwang.bplppl.ps.Z (back-propagation and projection pursuit learning) hwang.nnmrf.ps.Z (probabilistic textured image modeling by neural networks) hwang.srnn.ps.Z (mental image transformation via surface reconstruction nn) Jenq-Neng Hwang, Assistant Professor Information Processing Laboratory Dept.
of Electrical Engr., FT-10 University of Washington Seattle, WA 98195 (206) 685-1603 (O), (206) 543-3842 (FAX) hwang at ee.washington.edu From hwang at pierce.ee.washington.edu Tue May 4 12:15:53 1993 From: hwang at pierce.ee.washington.edu (Jenq-Neng Hwang) Date: Tue, 4 May 93 09:15:53 PDT Subject: mental image transformation and surface reconstruction NN Message-ID: <9305041615.AA24719@pierce.ee.washington.edu.> Technical Report available from neuroprose: MENTAL IMAGE TRANSFORMATION AND MATCHING USING SURFACE RECONSTRUCTION NEURAL NETWORKS Jenq-Neng Hwang, Yen-Hao Tseng Information Processing Laboratory Department of Electrical Engineering, FT-10 University of Washington, Seattle, WA 98195 ABSTRACT Invariant 2-D/3-D object recognition and motion estimation under detection/occlusion noise and/or partial object viewing are difficult pattern recognition tasks. On the other hand, the biological neural networks of humans are extremely adept at these tasks. Studies in experimental psychology suggest that humans match rotated and scaled shapes by mentally rotating and scaling one of the shapes gradually into the orientation and size of the other and then testing for a match. Motivated by these studies, we present a novel and robust neural network solution for these tasks based on detected surface boundary data or range data. The method operates in two stages: The object is first parametrically represented by a surface reconstruction neural network (SRNN) trained on the boundary points sampled from the exemplar object. When later presented with boundary points sampled from the distorted object without point correspondence, this parametric representation allows the mismatch information to back-propagate through the SRNN to gradually determine (align) the best similarity transform of the distorted object. The distance measure can then be computed in the reconstructed representation domain between the surface-reconstructed exemplar object and the aligned distorted object. Applications to invariant 2-D target classification and 3-D object motion estimation using sparse range data collected from a single aspect view are presented. ================ To obtain copies of the postscript file, please use Jordan Pollack's service (no hardcopies will be provided): Example: unix> ftp archive.cis.ohio-state.edu (or ftp 128.146.8.52) Name (archive.cis.ohio-state.edu): anonymous Password (archive.cis.ohio-state.edu:anonymous): ftp> cd pub/neuroprose ftp> binary ftp> get hwang.srnn.ps.Z ftp> quit unix> uncompress hwang.srnn.ps Now print "hwang.srnn.ps" as you would any other (postscript) file. From skalsky at aaai.org Tue May 4 14:31:56 1993 From: skalsky at aaai.org (Rick Skalsky) Date: Tue, 4 May 93 11:31:56 PDT Subject: AAAI Spring Symposium Series 1994 Call for Proposals Message-ID: <9305041831.AA01242@aaai.org> 1994 Spring Symposium Series Call for Proposals AAAI invites proposals for the 1994 Spring Symposium Series, to be held at Stanford University, March 21-23, 1994. The Spring Symposium Series is a yearly set of symposia, designed to bring colleagues together in small, intimate forums. There will be about eight symposia on various topics in the 1994 Spring Symposium Series. All symposia will be limited in size. The symposia will run in parallel for two and one-half days. The symposia will allow for presentation of speculative work and work in progress, as well as completed work. Ample discussion time will be scheduled in each symposium.
Working notes will be prepared, and distributed to the participants. Chairs can determine whether the working notes of their symposia will be available as AAAI Technical Reports following the meeting. Most participants of the symposia will be selected on the basis of statements of interest or abstracts submitted to the symposia chairs; some open registration will be allowed. Participants will be expected to attend a single symposium. Proposals for symposia should be between two and five pages in length, and should contain: - A title for the symposium - A description of the symposium, identifying specific areas of interest - Evidence that the symposium is of interest at this time--such as a completed, successful one-day workshop on a related topic - The names and addresses of the organizing committee, preferably three or four people at different sites, all of whom have agreed to serve on the committee - A list of several potential participants. Ideally, the entire organizing committee should collaborate in producing the proposal. If possible, a draft proposal should be sent out to a few of the potential participants and their comments solicited. All proposals will be reviewed by the AAAI Symposium Committee (cochairs: Lynn Andrea Stein, MIT; and Jim Hendler, University of Maryland). The criteria for acceptance of proposals include: - An appropriate level of perceived interest in the topic of the symposium among AAAI members. (Symposia proposals that appear to be too popular to fit in the size constraints should be turned into regular AAAI workshops.) - No long-term ongoing series of activities in the particular topic. (The Spring Symposium Series serves more to nurture interest in particular topics than to maintain it over a number of years.) The existence of activities in related and more-general topics will help to indicate the level of interest in the particular topic. - An appropriate organizing committee. - Accepted proposals will be distributed as widely as possible over the subfields of AI, and balanced between theoretical and applied topics. Symposia bridging theory and practice are particularly solicited. Symposium proposals should be submitted as soon as possible, but no later than June 7, 1993. Proposals that are submitted significantly before this deadline can be in draft form. Comments on how to improve and complete the proposal will be returned to the submitter in time for revisions to be made before the deadline. Notifications of acceptance or rejection will be sent to submitters around June 21, 1993. The submitters of accepted proposals will become the chair of the symposium, unless alternative arrangements are made. The symposium organizing committees will be responsible for: - Producing, in conjunction with the general chair, a Call for Participation for the symposium, which will be published in the AI Magazine - Reviewing requests to participate in the symposium and determining symposium participants - Preparing working notes for the symposium - Scheduling the activities of the symposium - Preparing a short review of the symposium, to be printed in the AI Magazine. AAAI will provide logistical support, will take care of all local arrangements, and will arrange for reproducing and distributing the working notes. 
Please submit (preferably by electronic mail) your symposium proposals, and inquiries concerning symposia, to both of the chairs: Jim Hendler (hendler at cs.umd.edu) Department of Computer Science University of Maryland AV Williams Building College Park, MD 20742 USA Lynn Andrea Stein (las at ai.mit.edu) AI Laboratory Massachusetts Institute of Technology 545 Technology Square #811 Cambridge, MA 02139 USA From mwitten at hermes.chpc.utexas.edu Tue May 4 15:30:32 1993 From: mwitten at hermes.chpc.utexas.edu (mwitten@hermes.chpc.utexas.edu) Date: Tue, 4 May 93 14:30:32 CDT Subject: WORLD CONGRESS ON COMPUTATIONAL MEDICINE<-CFPP Message-ID: <9305041930.AA01085@morpheus.chpc.utexas.edu> [] ***** CALL FOR PAPERS AND PARTICIPATION ***** [] FIRST WORLD CONGRESS ON COMPUTATIONAL MEDICINE AND PUBLIC HEALTH 24-28 April 1994 Hyatt Regency Hotel Austin, Texas compmed94 at chpc.utexas.edu (this notice may be reposted/cross posted/circulated) ------------------------------------------------------------------------ *Conference Chair: Matthew Witten, UT System Center For High Performance Computing, Austin, Texas - m.witten at chpc.utexas.edu *Conference Directorate: Regina Monaco, Mt. Sinai Medical Center * Dan Davison, University of Houston * Chris Johnson, University of Utah * Lisa Fauci, Tulane University * Daniel Zelterman, University of Minnesota Minneapolis * James Hyman, Los Alamos National Laboratory * Richard Hart, Tulane University * Dennis Duke, SCRI-Florida State University * Sharon Meintz, University of Nevada Las Vegas * Dean Sittig, Vanderbilt University * Dick Tsur, World Bank and UT System CHPC * Dan Deerfield, Pittsburgh Supercomputing Center * Istvan Gyori, Szeged University School of Medicine Computing Center *Conference Theme: The appearance of high-performance computing environments has greatly enhanced the capabilities of the biomedical modeler. With increasing frequency, computational sciences are being exploited as a means with which to investigate biomedical processes at all levels of complexity, from molecular to systemic to demographic. The emergence of an increasing number of players in this field has led to the subsequent emergence of a new transdisciplinary field which we call Computational Medicine and Public Health. The purpose of this congress is to bring together a transdisciplinary group of researchers in medicine, public health, computer science, mathematics, nursing, veterinary medicine, ecology, allied health, as well as numerous other disciplines, for the purposes of examining the grand challenge problems of the next decades. Young scientists are encouraged to attend and to present their work in this increasingly interesting discipline. Funding is being solicited from NSF, NIH, DOE, DARPA, EPA, and private foundations, as well as other sources to assist in travel support and in the offsetting of expenses for those unable to attend otherwise.
Papers, poster presentations, tutorials, focussed topic workshops, birds of a feather groups, demonstrations, and other suggestions are solicited in, but are not limited to, the following areas: *Visualization/Sonification --- medical imaging --- molecular visualization as a clinical research tool --- simulation visualization --- microscopy --- visualization as applied to problems arising in computational molecular biology and genetics or other non-traditional disciplines *Computational Molecular Biology and Genetics --- computational ramifications of clinical needs in the Human Genome, Plant Genome, and Animal Genome Projects --- computational and grand challenge problems in molecular biology and genetics --- algorithms and methodologies --- issues of multiple datatype databases *Computational Pharmacology, Pharmacodynamics, Drug Design *Computational Chemistry as Applied to Clinical Issues *Computational Cell Biology, Physiology, and Metabolism --- Single cell metabolic models (red blood cell) --- Cancer models --- Transport models --- Single cell interaction with external factors models (laser, ultrasound, electrical stimulus) *Computational Physiology and Metabolism --- Renal System --- Cardiovascular dynamics --- Liver function --- Pulmonary dynamics --- Auditory function, cochlear dynamics, hearing --- Reproductive modeling: ovarian dynamics, reproductive ecotoxicology, modeling the hormonal cycle --- Metabolic Databases and metabolic models *Computational Demography, Epidemiology, and Statistics/Biostatistics --- Classical demographic, epidemiologic, and biostatistical modeling --- Modeling of the role of culture, poverty, and other sociological issues as they impact healthcare *Computational Disease Modeling --- AIDS --- TB --- Influenza --- Other *Computational Biofluids --- Blood flow --- Sperm dynamics --- Modeling of arteriosclerosis *Computational Dentistry, Orthodontics, and Prosthetics *Computational Veterinary Medicine --- Computational issues in modeling non-human dynamics such as equine, feline, canine dynamics (physiological/biomechanical) *Computational Allied Health Sciences --- Physical Therapy --- Neuromusic Therapy --- Respiratory Therapy *Computational Radiology --- Dose modeling --- Treatment planning *Computational Surgery --- Simulation of surgical procedures in VR worlds --- Surgical simulation as a precursor to surgical intervention *Computational Cardiology *Computational Neurobiology and Neurophysiology --- Brain modeling --- Single neuron models --- Neural nets and clinical applications --- Neurophysiological dynamics --- Neurotransmitter modeling --- Neurological disorder modeling (Alzheimer's Disease, for example) *Computational Biomechanics --- Bone Modeling --- Joint Modeling *The role of alternate reality methodologies and high performance environments in the medical and public health disciplines *Issues in the use of high performance computing environments in the teaching of health science curricula *The role of high performance environments for the handling of large medical datasets (high performance storage environments, high performance networking, high performance medical records manipulation and management, metadata structures and definitions) *Federal and private support for transdisciplinary research in computational medicine and public health *Contact: To contact the congress organizers for any reason, use any of the following: Electronic Mail - compmed94 at chpc.utexas.edu Fax (USA) - (512) 471-2445 Phone (USA) - (512) 471-2472 Compmed 1994
University of Texas System CHPC Balcones Research Center, 1.154CMS 10100 Burnet Road Austin, Texas 78758-4497 *Submission Procedures: Authors must submit 5 copies of a single-page 50-100 word abstract clearly discussing the topic of their presentation. In addition, authors must clearly state their choice of poster, contributed paper, tutorial, exhibit, focussed workshop, or birds of a feather group along with a discussion of their presentation. Abstracts will be published as part of the preliminary conference material. To notify the congress organizing committee that you would like to participate and to be put on the congress mailing list, please fill out and return the form that follows this announcement. You may use any of the contact methods above. *Conference Deadlines: The following deadlines should be noted: 1 October 1993 - Notification of interest in participation 1 November 1993 - Abstracts for talks/posters/workshops/birds of a feather sessions/demonstrations 15 January 1994 - Notification of acceptance of abstract 15 February 1994 - Application for financial aid ============================= INTENT TO PARTICIPATE ========================== First Name: Middle Initial (if available): Family Name: Your Professional Title: [ ]Dr. [ ]Professor [ ]Mr. [ ]Mrs. [ ]Ms. [ ]Other:__________________ Office Phone (desk): Office Phone (message): Home/Evening Phone (for emergency contact): Fax: Electronic Mail (Bitnet): Electronic Mail (Internet): Postal Address: Institution or Center: Building Code: Mail Stop: Street Address1: Street Address2: City: State: Country: Zip or Country Code: Please list your three major interest areas: Interest1: Interest2: Interest3: =================================================================== From jramire at conicit.ve Tue May 4 23:47:38 1993 From: jramire at conicit.ve (Jose Ramirez G. (AVINTA)) Date: Tue, 4 May 93 23:47:38 AST Subject: Workshop on ANN Message-ID: <9305050347.AA27626@dino.conicit.ve> ************************************************************** Call For Panelists and Call For Participation Panel on "Research directions and applications of Artificial Neural Networks" The second World Congress on Expert Systems will be held in Lisbon, Portugal, 10-14 January 1994. During the congress a panel focused on "Research directions and applications of Artificial Neural Networks" will be conducted. Panelist proposals are requested, according to the following: 1. 5 or 6 panelists will be selected. The panel will have presentations of 10 min. per panelist, plus a questions and answers period of 30 min. 2. Proposals must include a brief vitae (10 lines) of the panelist and a description of the topic to be addressed during the panel (5 lines). 3. Proposals must be sent by e-mail or fax to: Jose Ramirez email: jramire at conicit.ve fax : +58-2-2832689 4. The proposals must be received by May 28, 1993. 5. The selected panelists must fill in a registration form for the congress (at a reduced fee) and confirm their participation in the event. ************************************************************* From hwang at pierce.ee.washington.edu Wed May 5 12:50:13 1993 From: hwang at pierce.ee.washington.edu (Jenq-Neng Hwang) Date: Wed, 5 May 93 09:50:13 PDT Subject: three technical reports available Message-ID: <9305051650.AA28307@pierce.ee.washington.edu.> We have fixed the postscript printing problems and reloaded the three reports in neuroprose.
These three files are now available: hwang.bplppl.ps.Z (back-propagation and projection pursuit learning) hwang.nnmrf.ps.Z (probabilistic textured image modeling by neural networks) hwang.objrec.ps.Z (single spaced) or hwang.srnn.ps.Z (double spaced) (mental image transformation via surface reconstruction neural nets) Jenq-Neng Hwang, Assistant Professor Information Processing Laboratory Dept. of Electrical Engr., FT-10 University of Washington Seattle, WA 98915 (206) 685-1603 (O), (206) 543-3842 (FAX) hwang at ee.washington.edu From robtag at udsab.dia.unisa.it Wed May 5 12:01:02 1993 From: robtag at udsab.dia.unisa.it (Tagliaferri Roberto) Date: Wed, 5 May 1993 18:01:02 +0200 Subject: WIRN 93 Programme Message-ID: <199305051601.AA21642@udsab.dia.unisa.it> Istituto Internazionale Alti Studi Scientifici (IIASS) Dipartimento di Fisica Teorica, Universita` di Salerno Dipartimento di Informatica ed Applicazioni, Universita` di Salerno Dipartimento di Scienze dell'Informazione, Universita` di Milano Istituto per la Ricerca dei Sistemi Informatici Paralleli, C.N.R., Napoli Societa` Italiana Reti Neuroniche (SIREN) 6th ITALIAN WORKSHOP ON NEURAL NETWORKS WIRN VIETRI-93 IIASS Research Center Ph: +39 89 761167 FAX:+39 89 761189 Vietri Sul Mare, Salerno, May 12-14, 1993 PRELIMINARY PROGRAM Wednesday 12 9:00 Opening of the Workshop 9:30 S. Gielen (Invited Lecture) 11:00 Coffee break 11:30 Formal Models and Pattern Recognition G. Basti, V. Bidoli et al. "Particle recognition on experimental data in a silicon calorimeter by back propagation with stochastic pre-processing" A. Borghese "Learning optimal control using neural networks" S. Brofferio, V. Rampa "A supervised-ART neural network for pattern recognition" P. Pedrazzi "On self-organizing neural character recognizers" V. Sanguineti, P. Morasso "Models of cortical maps" L. Stringa "Experiments in memory-based learning" 13:00 Lunch break 15:00 Prof. Tredici (Review Lecture on Progresses in Neuroanatomy) 16:00 Applications (1st part) E. Coccorese, R. Martone, C. Morabito "Classification of plasma equilibria in a tokamak using a three-level propagation network" E.D. Di Claudio, G. Trivelloni, G. Orlandi "Model identification of non linear dynamical systems by recurrent neural networks" P. Morasso, A. Pareto, S. Pagliano, V. Sanguineti "A self-organizing approach for diagnostic problems" 17:00 Coffee Break 17:30 Hybrid and Robotic Systems A. Chella, U. Maniscalco, R. Pirrone, F. Sorbello, P. Storniolo "A shape from shading hybrid approach to estimate superquadric parameters" Z.M. Kovacs-V., R. Guerrieri, G. Baccarini "A hybrid system for handprinted character recognition" A. Sperduti, A. Starita "Modular neural codes implementing neural trees" Thursday 13 9:00 L. Zadeh (Invited Lecture) 11:00 Coffee Break 11:30 Fuzzy neural systems E. Binaghi, A. Mazzetti, R. Orlando, A. Rampini "Integration of fuzzy reasoning techniques in the error back propagation learning algorithm" M. Costa, E. Pasero "FNC: a fuzzy neural classifier with bayesian engine" Zhiling Wiang, G. Sylos Labini "A self-organizing network of alterable competitive layer for pattern cluster" 13:00 Lunch Break 15:00 V. Cimagalli (Review Lecture on Cellular Networks) 16:00 - 17:30 Poster and Industrial Sessions 17:30 SIREN Annual Meeting Friday 14 9:00 Y. Bengio, P. Frasconi and M. Gori (Review Lecture on Recurrent Networks for Adaptive Temporal Reasoning) 10:00 Applications (2nd part) S. Cavalieri, A. Fichera "Exploiting neural network features to model and analyze noise pollution" A.M. 
Colla, N. Longo, G. Morgavi, S. Ridella "SBP: A hybrid neural model for pattern recognition" F. Piglione, G. Cirrincione "Neural-net based load-flow models for electric power systems" 11:00 Coffee Break 11:30 Hardware and Software Design A. d'Acierno, R. Vaccaro "The back-propagation learning algorithm on parallel computers: a mapping scheme" M. Gioiello, G. Vassallo, F. Sorbello "A new fully digital feed-forward network for hand-written digits recognition" F. Lauria, M. Sette "CONNET: a neural network configuration language" P. Wilke "Simulation of neural networks in a distributed computing environment using Neuro Graph" 13:00 Lunch Break 15:00 Architectures and Algorithms M. Alberti, P. Marelli, R. Posenato " A neural algorithm for the maximum satisfiability problem" E. Alpaydin "Multiple networks for function learning" D. Micci Barreca, G.C. Buttazzo "A neural architecture for failure-based learning" M. Schmitt "On the size of weights for McCulloch-Pitts neurons" Registration fee 275.000 Italian Liras (including proceedings and social dinner). No fees to be payed for students. From joe at cogsci.edinburgh.ac.uk Fri May 7 06:15:11 1993 From: joe at cogsci.edinburgh.ac.uk (Joe Levy) Date: Fri, 07 May 93 11:15:11 +0100 Subject: 2nd Neural Computation and Psychology Workshop: Language and Memory Message-ID: <8959.9305071015@galloway.cogsci.ed.ac.uk> 2nd Neural Computation and Psychology Workshop: Language and Memory. University of Edinburgh 10th-13th September Preliminary Call for Participation Following on from last year's very successful workshop on "Neurodynamics and Psychology" at Bangor University, it has been suggested that a workshop on some aspect of connectionist modelling in psychology should be held in the UK every year. This year the Connectionism and Cognition Research Group at the University of Edinburgh will host a workshop under the general theme of "language and memory" in Edinburgh between Friday 10th and Monday 13th September. We are currently preparing a program and will post details as soon as possible. The main sessions are likely to include memory, speech processing and reading. The workshop will be single track with a small number of invited speakers. Attendance will be limited to 50 people to allow ample time for discussion. For further details contact: Joe Levy Phone: +44 31 650 4450 | University of Edinburgh Fax: +44 31 650 4587 | Human Communication Research ARPA: joe%cogsci.ed.ac.uk at nsfnet-relay.ac.uk | Centre, 2 Buccleuch Place JANET: joe at uk.ac.ed.cogsci | Edinburgh EH8 9LW Scotland From fritzke at ICSI.Berkeley.EDU Fri May 7 17:43:31 1993 From: fritzke at ICSI.Berkeley.EDU (Bernd Fritzke) Date: Fri, 7 May 93 14:43:31 PDT Subject: three new papers in neuroprose Message-ID: <9305072143.AA22277@icsib14.ICSI.Berkeley.EDU> *** DO NOT FORWARD TO ANY OTHER LISTS *** The following technical reports have been placed in the neuroprose directory (ftp instructions follow the abstracts). For two of the TR's also hardcopies are available. Instructions are at the end of the posting. Comments and questions are welcome. Thanks to Jordan Pollack for maintaining the neuroprose archive. 
-Bernd International Computer Science Institute 1947 Center Street, Suite 600 Berkeley, CA 94704-1105 USA ------------------------------------------------------------ Growing Cell Structures - A Self-organizing Network for Unsupervised and Supervised Learning *) Bernd Fritzke ICSI, Berkeley TR-93-026 (34 pages) *) submitted for publication We present a new self-organizing neural network model having two variants. The first variant performs unsupervised learning and can be used for data visualization, clustering, and vector quantization. The main advantage over existing approaches, e.g., the Kohonen feature map, is the ability of the model to automatically find a suitable network structure and size. This is achieved through a controlled growth process which also includes occasional removal of units. The second variant of the model is a supervised learning method which results from the combination of the abovementioned self-organizing network with the radial basis function (RBF) approach. In this model it is possible - in contrast to earlier approaches - to perform the positioning of the RBF units and the supervised training of the weights in parallel. Therefore, the current classification error can be used to determine where to insert new RBF units. This leads to small networks which generalize very well. Results on the two-spirals benchmark and a vowel classification problem are presented which are better than any results previously published. ------------------------------------------------------------ Vector Quantization with a Growing and Splitting Elastic Net *) Bernd Fritzke ICSI, Berkeley (6 pages) *) to be presented at ICANN-93, Amsterdam A new vector quantization method is proposed which generates codebooks incrementally. New vectors are inserted in areas of the input vector space where the quantization error is especially high until the desired number of codebook vectors is reached. A one-dimensional topological neighborhood makes it possible to interpolate new vectors from existing ones. Vectors not contributing to error minimization are removed. After the desired number of vectors is reached, a stochastic approximation phase fine tunes the codebook. The final quality of the codebooks is exceptional. A comparison with two well-known methods for vector quantization was performed by solving an image compression problem. The results indicate that the new method is significantly better than both other approaches. ------------------------------------------------------------ Kohonen Feature Maps and Growing Cell Structures -- a Performance Comparison *) Bernd Fritzke ICSI, Berkeley (8 pages) *) to appear in Advances in Neural Information Processing Systems 5 C.L. Giles, S.J. Hanson, and J.D. Cowan (eds.), Morgan Kaufmann, San Mateo, CA, 1993 A performance comparison of two self-organizing networks, the Kohonen Feature Map and the recently proposed Growing Cell Structures is made. For this purpose several performance criteria for self-organizing networks are proposed and motivated. The models are tested with three example problems of increasing difficulty. The Kohonen Feature Map demonstrates slightly superior results only for the simplest problem. For the other more difficult and also more realistic problems the Growing Cell Structures exhibit significantly better performance by every criterion. Additional advantages of the new model are that all parameters are constant over time and that size as well as structure of the network are determined automatically.
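------------------------------------------------------------ For readers who want a concrete picture of the controlled growth process described in the first abstract above, the short Python sketch below illustrates the basic insertion idea: each unit accumulates quantization error, and every few input signals a new unit is interpolated near the unit with the largest accumulated error. It is only an illustration under simplifying assumptions (winner-only adaptation, full connectivity instead of the simplex topology of the report, ad hoc constants EPS_WINNER and LAMBDA, and a crude error redistribution); it is not code from the technical reports.

import numpy as np

rng = np.random.default_rng(0)

# Start with two reference vectors (units) in a 2-D input space.
units = [rng.random(2) for _ in range(2)]
errors = [0.0, 0.0]          # accumulated squared quantization error per unit

EPS_WINNER = 0.05            # adaptation rate of the best-matching unit (assumed)
LAMBDA = 100                 # insert a new unit every LAMBDA input signals (assumed)

def adapt(x):
    """Move the best-matching unit towards input x and accumulate its error."""
    dists = [np.linalg.norm(x - w) for w in units]
    s = int(np.argmin(dists))            # index of the best-matching unit
    errors[s] += dists[s] ** 2           # local error counter
    units[s] += EPS_WINNER * (x - units[s])
    return s

def grow():
    """Insert a new unit halfway between the worst unit and its worst neighbour.
    Every unit is treated here as a neighbour of every other one, a
    simplification of the simplex connectivity used in the report."""
    q = int(np.argmax(errors))                       # unit with largest error
    others = [i for i in range(len(units)) if i != q]
    f = max(others, key=lambda i: errors[i])         # its "worst" neighbour
    units.append(0.5 * (units[q] + units[f]))        # interpolate the new unit
    errors[q] *= 0.5                                 # crude error redistribution:
    errors[f] *= 0.5                                 # the new unit inherits part
    errors.append(0.5 * (errors[q] + errors[f]))     # of its parents' error

# Drive the growth process with random 2-D inputs.
for t in range(1, 1001):
    adapt(rng.random(2))
    if t % LAMBDA == 0:
        grow()

print(f"{len(units)} units after 1000 signals")

In the full model this growth is complemented by occasional removal of units, and in the supervised variant the classification error of the RBF units, rather than the quantization error, decides where to insert.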
************************* ftp instructions ********************** If you have the Getps script unix> Getps fritzke.tr93-26.ps.Z unix> Getps fritzke.icann93.ps.Z unix> Getps fritzke.nips92.ps.Z (Getps ftp's the named file, decompresses it, and asks whether to print it) otherwise, first do the following (to get Getps) unix> ftp archive.cis.ohio-state.edu (or ftp 128.146.8.52) Connected to archive.cis.ohio-state.edu. 220 archive.cis.ohio-state.edu FTP server ready. Name: anonymous 331 Guest login ok, send ident as password. Password: 230 Guest login ok, access restrictions apply. ftp> cd pub/neuroprose 250 CWD command successful. ftp> get Getps 200 PORT command successful. 150 Opening BINARY mode data connection for Getps (2190 bytes). 226 Transfer complete. ftp> quit 221 Goodbye. ************************* hardcopies **************************** The NIPS92 paper and the 34-page paper have appeared as ICSI technical reports TR-93-025 and TR-93-026, respectively. Hardcopies are available for a small charge for postage and handling. For details please contact Vivian Balis (balis at icsi.berkeley.edu) at ICSI. From pollack at cis.ohio-state.edu Fri May 7 11:44:36 1993 From: pollack at cis.ohio-state.edu (Jordan B Pollack) Date: Fri, 7 May 93 11:44:36 -0400 Subject: Tweaking NEUROPROSE Message-ID: <9305071544.AA07540@dendrite.cis.ohio-state.edu> *****do not forward to other groups***** Good People, There are problems of scale with NEUROPROSE, and no resources to fix them properly. Therefore, after great thought about the laws of unintended consequences, and with no insult intended to recent articles, I am hereby tweaking the practices of NEUROPROSE, and I trust you will all go along with me eventually: 1. No more multiple daily submissions. NEUROPROSE is supposed to be for relevant preprints, not a vanity press or a medium for the distribution of life works or annual reports. 2. Make sure your paper is single-spaced, even as a draft, so as to save paper. 3. Please announce the NUMBER OF PAGES with the announcement, so people are not surprised by empty laser printer trays. In your request to me, it would help to have a formatted INDEX entry with the page count as well (see appendix). 4. Before announcing, have a friend at another institution retrieve and print the file, so as to avoid easily found local postscript library errors. Lots of resources are wasted when the files do not print. 5. Add the following two lines to your mail header, or the top of your message, so as to facilitate the development of mailer scripts and macros which can automatically retrieve files from both NEUROPROSE and other lab-specific repositories (Thanks to Dave Plaut's sense of humor): FTP-host: archive.cis.ohio-state.edu FTP-filename: /pub/neuroprose/filename.ps.Z 6. Finally, unless you are posting a file with non-standard ftp arrangements, like a tar.Z file, leave the instructions off, as everyone knows at this point how to get and uncompress and print a postscript file! I have amended the README file to this effect. Please send comments to me for discussion, rather than the whole mailing list. Thanks. Jordan Pollack Assistant Professor CIS Dept/OSU Laboratory for AI Research 2036 Neil Ave Email: pollack at cis.ohio-state.edu Columbus, OH 43210 Phone: (614)292-4890 (then * to fax) From ptodd at spo.rowland.org Sun May 9 16:31:21 1993 From: ptodd at spo.rowland.org (Peter M.
Todd) Date: Sun, 9 May 93 16:31:21 EDT Subject: CFP: music/creativity issue of Connection Science Message-ID: <9305092031.AA01937@spo.rowland.org> **** PLEASE DISTRIBUTE **** MUSIC AND CREATIVITY Call for Papers for a Special Issue of Connection Science Over the last few years there has been a vertiginous growth in the connectionist exploration of many domains, including music. Music has traditionally been one of the least studied areas of cognition, in part because of the complexity of musical phenomena and their language-like connections between many levels and modalities of thought. But the application of network-based computational techniques to aspects of musicality and creativity has resulted in a variety of illuminating models. The time now seems right for an overview of the agenda being followed by connectionists in this area, the articulation of the central issues in the field, and a forum for the discussion of future directions. To this end, we are inviting papers covering the whole field of connectionist modelling of music, arts, and creativity for a special issue of the journal Connection Science. Papers may be either empirical or theoretical, but must communicate predominantly unpublished ideas. We are particularly interested in receiving work in the following areas (although we emphasize music here, other areas of creativity and artistic endeavour may be substituted): 1. The limits and possibilities for connectionism in modelling creativity. 2. Modelling cognitive aspects of music: meter, rhythm, tonality, harmony and melody. 3. The use of neural networks in creating pieces of music, choreography, visual art, etc. 4. Modelling the integration of lower- and higher-level musical knowledge, including hierarchical representations. 5. The representation of intermodal relationships between musical dimensions, e.g. tonality and rhythm. 6. Developmental models of musical cognition. 7. Psychoacoustic models underlying categorical pitch and other musical phenomena. 8. Models of auditory streaming, attention, phrasing, and grouping. 9. Connectionist models of timbre. 10. Models of cross-cultural differences or universals. 11. Comparative models of music and language. 12. The use of sequential, recurrent, predictive, and chaotic network models for creative phenomena. 13. Cognitive neuroscience models of musical phenomena. We are particularly interested in stimulating discussion with this special issue of the present and future of this field, and papers should explore the importance of issues raised by the research as broadly as possible. An awareness of the cognitive plausibility and implications of the ideas presented is also essential. Requirements for Submission All papers will be rigorously refereed. Guidelines for submission of papers to Connection Science can be found in issues of the Journal and are also available from lyn at dcs.exeter.ac.uk (or by mail from Lyn Shackleton, University of Exeter, address as below). Authors are encouraged to contact the editors with any questions about proposed papers or the relevance of their work for this special issue. Authors must submit five (5) printed copies of their papers to either of the addresses listed below by OCTOBER 15 1993. Each copy of the paper should be fronted by a separate title page listing its title, authors, their addresses, surface-mail and E-mail, and an abstract of under 200 words. Notification of receipt will be electronically mailed to the first (or designated) author. 
Notification of acceptance or rejection will be mailed by DECEMBER 31 1993. Final versions of accepted papers will be due MARCH 1 1994.
Special Issue Editors:
Niall Griffith Department of Computer Science, University of Exeter, Prince of Wales Road, Exeter, EX4 4PT, England. E-mail: ngr at dcs.exeter.ac.uk
Peter M. Todd The Rowland Institute for Science 100 Edwin H. Land Boulevard Cambridge, MA 02142 USA E-mail: ptodd at spo.rowland.org

From jain at arris.com Mon May 10 13:50:41 1993
From: jain at arris.com (Ajay Jain)
Date: Mon, 10 May 93 10:50:41 PDT
Subject: Position available
Message-ID: <9305101750.AA17773@oyster.arris.com>
***** Please do not forward to other groups *****
RESEARCH SCIENTIST Statistics and Machine Learning Arris Pharmaceutical Corporation
Arris Pharmaceutical is a drug discovery company employing a synergistic approach that combines advances and expertise in molecular biology, synthetic chemistry, and applied mathematics. The company's mission is to develop synthetic therapeutics to address major medical needs through applying proprietary structure-based drug design methods. We are seeking a person with expertise in both statistics and computer science, and with a PhD in statistics, mathematics, or computer science, to join our team. The candidate must have experience designing, implementing, and using nonlinear statistical techniques (e.g., MARS, PI, CART, neural networks). Also highly desirable are experience in the application of statistical methods to experiment design, experience in database design, and a strong interest and/or formal training in chemistry, biology, or medicine. The candidate should be eager to learn the relevant parts of computational chemistry and to interact with medicinal chemists and molecular biologists.
The Arris drug design strategy begins by identifying a pharmaceutical target (e.g., an enzyme or a cell-surface receptor), developing assays to measure chemical binding with this target, and screening large libraries of peptides (short amino acid sequences) with these assays. The resulting data, which indicate for each compound how well it binds to the target, are analyzed by statistical algorithms to develop hypotheses that explain why some compounds bind well to the target while others do not. Information from X-ray crystallography or NMR spectroscopy may also be available to the statistical algorithm. Hypotheses will then be refined by synthesizing and testing additional peptides. Finally, medicinal chemists will synthesize small organic molecules that satisfy the hypothesis, and these will become candidate drugs to be tested for medical safety and effectiveness.
The person hired will work as a member of the computational drug design group, conducting research on the application of advanced statistical and computational techniques to drug design, and developing chemical modeling tools incorporating these techniques. In particular, he or she will develop software to discover patterns in the biological activity of massive libraries of biopolymer compounds, and to predict new compounds with enhanced activity. In addition, he or she will contribute statistical expertise to experiment design and to other machine learning projects in the company. The computational drug design team currently includes Barr Bauer, David Chapman, Roger Critchlow, Tom Dietterich, Ajay Jain, Kimberle Koile, Rick Lathrop, Tomas Lozano Perez, and John Park.
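The analyze-then-predict loop described above (fit a nonlinear model to assay data for screened compounds, then use it to rank unscreened candidates) can be pictured with a toy example. Everything below is invented for illustration: the descriptors, the synthetic "assay", and the small one-hidden-layer network stand in for whichever techniques (MARS, CART, neural networks, etc.) are actually used; Python with numpy is assumed, and this is not Arris code.

# Hypothetical illustration only -- not Arris software. A tiny one-hidden-layer
# network is fit to made-up "descriptor -> assay activity" data, then used to
# rank new, unscreened compounds.
import numpy as np

rng = np.random.default_rng(0)

# Invented descriptors for 200 screened peptides (10 features each) and a
# synthetic nonlinear "assay" used as the activity to be modelled.
X = rng.normal(size=(200, 10))
w_true = rng.normal(size=10)
y = np.tanh(X @ w_true) + 0.1 * rng.normal(size=200)

# One hidden layer, squared-error loss, plain gradient descent.
H, lr = 16, 0.01
W1 = 0.1 * rng.normal(size=(10, H)); b1 = np.zeros(H)
W2 = 0.1 * rng.normal(size=H);       b2 = 0.0
for step in range(2000):
    h = np.tanh(X @ W1 + b1)               # hidden activations
    err = (h @ W2 + b2) - y                # prediction error
    gW2 = h.T @ err / len(y); gb2 = err.mean()
    dh = np.outer(err, W2) * (1.0 - h**2)  # backpropagated error
    gW1 = X.T @ dh / len(y); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

# Score a batch of unscreened candidates and pick the most promising one.
candidates = rng.normal(size=(50, 10))
scores = np.tanh(candidates @ W1 + b1) @ W2 + b2
print("highest-scoring candidate:", int(np.argmax(scores)))

The point is only the shape of the workflow: model the compounds that have been screened, then score the ones that have not.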
For more information, send your resume with the names and addresses of three references to: Arris Pharmaceutical Corporation Personnel Manager 385 Oyster Point Blvd. South San Francisco CA 94080 You may also send email responses to jain at arris.com.

From gerda at mail2.ai.univie.ac.at Mon May 10 15:16:30 1993
From: gerda at mail2.ai.univie.ac.at (Gerda Helscher)
Date: Mon, 10 May 1993 21:16:30 +0200
Subject: CFP: EMCSR'94 European Meeting on Cybernetics and Systems Research
Message-ID: <199305101916.AA12898@wachau.ai.univie.ac.at>
* * * * *
TWELFTH EUROPEAN MEETING ON CYBERNETICS AND SYSTEMS RESEARCH (EMCSR 1994)
April 5 - 8, 1994, UNIVERSITY OF VIENNA
organized by the Austrian Society for Cybernetic Studies in cooperation with Dept. of Medical Cybernetics and Artificial Intelligence, Univ. of Vienna and International Federation for Systems Research
Cybernetics - "the study of communication and control in the animal and the machine" (N. Wiener) - has recently returned to the forefront, not only in cyberpunk and cyberspace, but, even more importantly, by contributing to the consolidation of various scientific theories. Additionally, an ever increasing number of research areas, including social and economic theories, theoretical biology, ecology, computer science, and robotics, draws on ideas from second-order cybernetics. Artificial intelligence, which evolved directly from cybernetics, has not only technological and economic, but also important social impacts. With a marked trend towards interdisciplinary cooperation and global perspectives, this important role of cybernetics is expected to be further strengthened over the next years.
Since 1972, the biennial European Meetings on Cybernetics and Systems Research (EMCSR) have served as a forum for discussion of converging ideas and new aspects of different scientific disciplines. As on previous occasions, a number of sessions providing wide coverage of the rapid developments will be arranged, complemented with daily plenary meetings, where eminent speakers will present the latest research results.
SESSIONS + Chairpersons: A General Systems Methodology G.J.Klir, USA B Advances in Mathematical Systems Theory M.Peschel, Germany, and F.Pichler, Austria C Fuzzy Systems, Approximate Reasoning and Knowledge-Based Systems C.Carlsson, Finland, K.-P.Adlassnig, Austria, and E.P.Klement, Austria D Designing and Systems, and Their Education B.Banathy, USA, W.Gasparski, Poland, and G.Goldschmidt, Israel E Humanity, Architecture and Conceptualization G.Pask, UK, and G.de Zeeuw, Netherlands F Biocybernetics and Mathematical Biology L.M.Ricciardi, Italy G Systems and Ecology F.J.Radermacher, Germany, and K.Fedra, Austria H Cybernetics and Informatics in Medicine G.Gell, Austria, and G.Porenta, Austria I Cybernetics of Socio-Economic Systems K.Balkus, USA, and O.Ladanyi, Austria J Systems, Management and Organization G.Broekstra, Netherlands, and R.Hough, USA K Cybernetics of National Development P.Ballonoff, USA, T.Koizumi, USA, and S.A.Umpleby, USA L Communication and Computers A M.Tjoa, Austria M Intelligent Autonomous Systems J.W.Rozenblit, USA, and H.Praehofer, Austria N Cybernetic Principles of Knowledge Development F.Heylighen, Belgium, and S.A.Umpleby, USA O Cybernetics, Systems, and Psychotherapy M.Okuyama, Japan, and H.Koizumi, USA P Artificial Neural Networks and Adaptive Systems S.Grossberg, USA, and G.Dorffner, Austria Q Artificial Intelligence and Cognitive Science V.Marik, Czechia, and R.Born, Austria R Artificial Intelligence and Systems Science for Peace Research S.Unseld, Switzerland, and R.Trappl, Austria SUBMISSION OF PAPERS: Acceptance of contributions will be determined on the basis of Draft Final Papers. These Papers must not exceed 7 single-spaced A4 pages (maximum 50 lines, final size will be 8.5 x 6 inch), in English. They have to contain the final text to be submitted, including graphs and pictures. However, these need not be of reproducible quality. The Draft Final Paper must carry the title, author(s) name(s), and affiliation in this order. Please specify the session in which you would like to present your paper. Each scientist shall submit only one paper. Please send t h r e e copies of the Draft Final Paper to the Conference Secretariat (NOT to session chairpersons!) DEADLINE FOR SUBMISSION: October 8, 1993. In order to enable careful refereeing, Draft Final Papers received after the deadline cannot be considered. FINAL PAPERS: Authors will be notified about acceptance no later than November 13, 1993. They will be provided by the conference secretariat at the same time with the detailed instructions for the preparation of the final paper. PRESENTATION: It is understood that the paper is presented personally at the Meeting by the contributor. CONFERENCE FEE: Contributors: AS 2500 if paid before January 31, 1994 AS 3200 if paid later Participants: AS 3500 if paid before January 31, 1994 AS 4200 if paid later The Conference Fee includes participation in the Twelfth European Meeting, attendance at official receptions, and the volume of the proceedings available at the Meeting. Please send cheque, or transfer the amount free of charges for beneficiary to our account no. 0026-34400/00 at Creditanstalt-Bankverein Vienna. Please state your name clearly. HOTEL ACCOMMODATIONS will be handled by OESTERREICHISCHES VERKEHRSBUERO, Kongressabteilung, Opernring 5, A-1010 Vienna, phone +43-1-58800-113, fax +43-1-5867127, telex 111 222. Reservation cards will be sent to all those returning the attached registration form. 
SCHOLARSHIPS: The Austrian Federal Ministry for Science and Research has kindly agreed to provide a limited number of scholarships covering the registration fee for the conference and part of the accommodation costs for colleagues from eastern and south-eastern European countries. Applications should be sent to the Conference Secretariat before October 8, 1993. * * * * * The Proceedings of the 1st to 11th European Meetings on Cybernetics and Systems Research were published as Pichler F. and Trappl R.(eds.): ADVANCES IN CYBERNETICS AND SYSTEMS RESEARCH, 2 vols, Transcripta Books, London, 1973. Trappl R. and Pichler F.R.(eds.): PROGRESS IN CYBERNETICS AND SYSTEMS RESEARCH, Vol.I, Hemisphere, Washington,DC / Halsted-Wiley, New York, 1975. Trappl R. and Hanika F.de P.(eds.): PROGRESS IN CYBERNETICS AND SYSTEMS RESEARCH, Vol.II, Hemisphere, Washington,DC / Halsted-Wiley, New York, 1975. Trappl R., Klir G.J. and Ricciardi L.(eds.): PROGRESS IN CYBERNETICS AND SYSTEMS RESEARCH, Vol.III, Hemisphere, Washington,DC / Halsted-Wiley, New York, 1978. Trappl R. and Pask G.(eds.): PROGRESS IN CYBERNETICS AND SYSTEMS RESEARCH, Vol.IV, Hemisphere, Washington,DC / Halsted-Wiley, New York, 1978. Trappl R., Hanika F.de P. and Pichler F.R.(eds.): PROGRESS IN CYBERNETICS AND SYSTEMS RESEARCH, Vol.V, Hemisphere, Washington,DC / Halsted-Wiley, New York, 1979. Pichler F.R. and Trappl R.(eds.): PROGRESS IN CYBERNETICS AND SYSTEMS RESEARCH, Vol.VI, Hemisphere, Washington,DC / McGraw-Hill, 1982. Pichler F.R. and Hanika F.de P.(eds.): PROGRESS IN CYBERNETICS AND SYSTEMS RESEARCH, Vol.VII, Hemisphere, Washington,DC, 1980. Trappl R., Klir G.J. and Pichler F.R.(eds.): PROGRESS IN CYBERNETICS AND SYSTEMS RESEARCH, Vol.VIII, Hemisphere, Washington,DC / McGraw-Hill, 1982. Trappl R., Ricciardi L. and Pask G.(eds.): PROGRESS IN CYBERNETICS AND SYSTEMS RESEARCH, Vol.IX, Hemisphere, Washington,DC / McGraw-Hill, 1982. Trappl R., Hanika F.de P. and Tomlinson R.(eds.): PROGRESS IN CYBERNETICS AND SYSTEMS RESEARCH, Vol.X, Hemisphere, Washington,DC / McGraw-Hill, 1982. Trappl R., Findler N.V. and Horn W.(eds.): PROGRESS IN CYBERNETICS AND SYSTEMS RESEARCH, Vol. XI, Hemisphere, Washington,DC / McGraw-Hill, 1982. Trappl R.(ed.): CYBERNETICS AND SYSTEMS RESEARCH, North-Holland, Amsterdam, 1982. Trappl R.(ed.): CYBERNETICS AND SYSTEMS RESEARCH 2, Elsevier, Amsterdam, 1984. Trappl R.(ed.): CYBERNETICS AND SYSTEMS '86, Reidel, Dordrecht, 1986. Trappl R.(ed.): CYBERNETICS AND SYSTEMS '88, 2 vols., Kluwer, Dordrecht, 1988. Trappl R.(ed.): CYBERNETICS AND SYSTEMS '90, World Scientific, Singapore, 1990. Trappl R.(ed.): CYBERNETICS AND SYSTEMS '92, 2 vols., World Scientific, Singapore, 1992. Please contact the conference secretariat for more details. ------------------------------------------------------------------------ CHAIRMAN of the Meeting: Robert Trappl, President Austrian Society for Cybernetic Studies SECRETARIAT: I. Ghobrial-Willmann and G. Helscher Austrian Society for Cybernetic Studies A-1010 Vienna 1, Schottengasse 3 (Austria) Phone: +43-1-53532810 Fax: +43-1-5320652 E-mail: sec at ai.univie.ac.at PROGRAMME COMMITTEE: K.-P. Adlassnig (Austria) G. J. Klir (USA) K. Balkus (USA) T. Koizumi (USA) P. Ballonoff (USA) O. Ladanyi (Austria) B. Banathy (USA) V. Marik (Czechia) R. Born (Austria) G. Pask (UK) G. Broekstra (Netherlands) M. Peschel (Germany) E. Buchberger (Austria) F. Pichler (Austria) C. Carlsson (Finland) G. Porenta (Austria) G. Chroust (Austria) H. Praehofer (Austria) G. Dorffner (Austria) F. J. Radermacher (Germany) K. 
Fedra (Austria) J. Retti (Austria) W. Gasparski (Poland) L. M. Ricciardi (Italy) G. Gell (Austria) J. W. Rozenblit (USA) G. Goldschmidt (Israel) N. Rozsenich (Austria) S. Grossberg (USA) A M. Tjoa (Austria) F. Heylighen (Belgium) R. Trappl (Austria) W. Horn (Austria) H. Trost (Austria) R. Hough (USA) S. A. Umpleby (USA) N. C. Hu (China) S. Unseld (Switzerland) E. P. Klement (Austria) G. de Zeeuw (Netherlands)
ORGANIZING COMMITTEE: E. Buchberger P. Petta G. Chroust F. Pichler I. Ghobrial-Willmann R. Trappl G. Helscher H. Trost W. Horn M. Veitl J. Matiasek
******************************************
PAPER SUBMISSION DEADLINE: October 8, 1993
******************************************
------------------------------------------------------------------------
EMCSR-94 TWELFTH EUROPEAN MEETING ON CYBERNETICS AND SYSTEMS RESEARCH
Please return to: Austrian Society for Cybernetic Studies Schottengasse 3, A-1010 VIENNA, AUSTRIA (EUROPE) E-mail: sec at ai.univie.ac.at
o I plan to attend the Meeting.
o I intend to submit a paper to Session .....
o I enclose the Draft Final Paper.
o My Draft Final Paper will arrive prior to October 8, 1993.
o My cheque for AS ....... covering the Conference Fee is enclosed.
o I have transferred AS ........ to your account 0026-34400/00 at Creditanstalt Vienna.
o I shall not be at the Meeting but am interested to receive particulars of the Proceedings.

From pja at cis.ohio-state.edu Tue May 11 10:27:55 1993
From: pja at cis.ohio-state.edu (Peter J Angeline)
Date: Tue, 11 May 93 10:27:55 -0400
Subject: EP94 Call For Papers
Message-ID: <9305111427.AA15156@neuron.cis.ohio-state.edu>
----------------------------------------------------------------------
The Third Annual Conference on Evolutionary Programming
CALL FOR PAPERS
February 24-25, 1994, San Diego, California
Evolutionary programming is a stochastic optimization technique that can be used to address various optimization problems. Papers regarding the theory and application of evolutionary programming to complex problem solving are solicited. Topics include, but are not limited to:
automatic control
neural network training and design
system identification
adaptive representation
forecasting
robotics
combinatorial optimization
pattern recognition
and the relationship between evolutionary programming and other optimization methods.
On or before June 30, 1993, prospective authors should submit a 100-250 word abstract and a three-page extended summary of the proposed paper to the Technical Program Chairman: Lawrence J. Fogel ORINCON Corporation 9363 Towne Centre Dr. San Diego, CA 92121
Authors will be notified of the program committee's decision on or before September 30, 1993. Completed papers will be due January 15, 1994. Paper format, page requirements and registration information will be detailed upon acceptance.
General Chairman: Anthony V. Sebald, UC San Diego
Technical Chairman: Lawrence J. Fogel, ORINCON Corporation
Program Committee: Peter Angeline, The Ohio State Univ. Roman Galar, Tech. Univ. Wroclaw Wirt Atmar, AICS Research Inc. Douglas Hoskins, The Boeing Company Thomas Back, Univ. Dortmund Gerald Joyce, Scripps Clin./Res. Found. George Burgin, Titam Systems/Linkabit John McDonnell, NCCOSC Michael Conrad, Wayne State Univ. Stuart Rubin, NCCOSC David B. Fogel, ORINCON Corporation Hans-Paul Schwefel, Univ. Dortmund Gary B. Fogel, UC Los Angeles William M.
Spears, Naval Research Lab
Finance Chairman: Bill Porto, ORINCON Corporation
Publicity Co-Chairs: Ward Page, NCCOSC Patrick Simpson, ORINCON Corporation
Sponsored by the Evolutionary Programming Society In Cooperation with the IEEE Neural Networks Council

From fogel at ece.UCSD.EDU Tue May 11 12:57:21 1993
From: fogel at ece.UCSD.EDU (Fogel)
Date: Tue, 11 May 93 09:57:21 PDT
Subject: Email Digest for Evolutionary Programming
Message-ID: <9305111657.AA29071@sunshine.ucsd.edu>
ANNOUNCING EVOLUTIONARY PROGRAMMING EMAIL DIGEST
We are pleased to announce that as of May 10, 1993, an email digest covering transactions on evolutionary programming will be available. The digest is intended to promote discussions on a wide range of technical issues in evolutionary optimization, as well as provide information on upcoming conferences, events, journals, special issues, and other items of interest to the EP community. Discussions on all areas of evolutionary computation are welcomed, including artificial life, evolution strategies, and genetic algorithms. The digest is meant to encourage interdisciplinary communications. Your suggestions and comments regarding the digest are always welcome. To subscribe to the digest, send mail to ep-list-request at magenta.me.fau.edu and include the line "subscribe ep-list" in the body of the text. Further instructions will follow your subscription. The digest will be moderated by N. Saravanan of Florida Atlantic University.
Sincerely, David Fogel fogel at sunshine.ucsd.edu N. Saravanan saravan at amber.me.fau.edu

From mikewj at signal.dra.hmg.gb Tue May 11 07:00:55 1993
From: mikewj at signal.dra.hmg.gb (Mike Wynne-Jones)
Date: Tue, 11 May 93 12:00:55 +0100
Subject: Neural nets applications meeting in UK
Message-ID: AA08130@milne.dra.hmg.gb
***********************************
NEURAL COMPUTING APPLICATIONS FORUM
***********************************
23 - 24 June 1993, Fitzwilliam College, Cambridge University, UK
*****************************************
PRACTICAL APPLICATIONS OF NEURAL NETWORKS
*****************************************
Neural Computing Applications Forum is the primary meeting place for people developing Neural Network applications in industry and academia. It has 150 members from the UK and Europe, from universities, small companies and big ones, and holds four main meetings each year. It has been running for 3 years, and is cheap to join. This meeting spans two days, with informal workshops on 23 June and the main meeting, comprising talks about neural network techniques and applications, on 24 June.
*********
WORKSHOPS - these talks are planned; additional short talks are sought.
*********
**********************************************************
Constructing structured networks of Radial Basis Functions 23 June, 13.00 to 15.00
**********************************************************
Including: Robert Debenham (Logica Cambridge): "Online construction of RBFs during training" Richard Bostock (Aston University): "Bump-tree construction by genetic algorithms"
*********************************************************
Self Organising Networks 23 June, 15.30 to 17.30
*********************************************************
Including: Nigel Allinson (York University): "Self Organising Networks: fast training, case studies and digital implementations"
************************************************************
Evening: Punting on the Cam followed by liquid refreshments!
************************************************************
*****************************
MAIN MEETING - 24 June 1993
*****************************
8.30 Registration
9.05 Welcome
9.15 Douglas Kell (University of Wales): "Detection of impurities in olive oil"
9.55 Mahesan Niranjan (University of Cambridge): "On-line learning algorithms for prediction and control applications"
10.30 Coffee
11.00 Tony Robinson (University of Cambridge): "Application of recurrent nets to phone probability estimation in speech recognition"
11.40 Prof. Cabrol-Bass (LARTIC, France): "Indices for the Evaluation of Neural Network Performance as classifiers: Application to Structural Elucidation in Infra Red Spectroscopy"
12.15 Lunch
2.00 Stephen Roberts (Oxford University): "Probabilistic Growth of RBFs for detection of novelty"
2.40 Dave Cressy (Logica Cambridge Research): "Neural Control of an Experimental Batch Distillation Column"
3.15 Tea
3.40 Tom Harris (Brunel University): "Kohonen nets in machine health monitoring"
4.10 Discussions
4.30 Close
ACCOMMODATION is available in Fitzwilliam College at 30 pounds (single) and 47 pounds (twin), and **MUST** be booked and paid for in advance. There are also lots of hotels in Cambridge.
*****************
Application
*****************
Members of NCAF get free entry to all meetings for a year. (This is very good value - main meetings, tutorials, special interest meetings.) It also includes subscription to Springer Verlag's new(ish) journal "Neural Computing and Applications".
Full membership: 250 pounds - anybody in your small company / research group in a big company.
Individual membership: 140 pounds - named individual only.
Student membership (with journal): 55 pounds - copy of student ID required.
Student membership (no journal, very cheap!): 25 pounds - copy of student ID required.
Entry to this meeting without membership costs 35 pounds for the workshops, and 80 pounds for the main day. Payment in advance if possible; 5 pounds charge for issue of an invoice if credit is required; an official order number is needed.
Email enquiries to Mike Wynne-Jones, mikewj at signal.dra.hmg.gb. Postal to Mike Wynne-Jones, NCAF, PO Box 62, Malvern, WR14 4NU, UK. Fax to Mike Wynne-Jones, (+44/0) 684 894384
ORGANIZATION COMMITTEE: Levent Akin, Bogazici University (Chair) Ethem Alpaydin, Bogazici University Ruhan Alpaydin, Bogazici University Hakan Aygun, Bogazici University Sema Oktug, Bogazici University A. C. Cem Say, Bogazici University Mehmet Yagci, Bogazici University
INVITED SPEAKER: Herbert E. Rauch, Lockheed Palo Alto Research Lab., President-elect, IEEE Control Systems Society IEEE Distinguished Lecturer
AIM: Recent advances in the theory and engineering of the computational sciences have extended the application of automatic computers to novel domains, introducing a new generation of software and hardware systems. These artificially intelligent systems are generally equipped with sensors and actuators through which they interact with the environment, leading to abilities like vision, speech recognition and synthesis, and mobility. They are able to learn from experience, represent learned knowledge internally in an abstract, task-oriented form, and operate based on this knowledge, which leads to cognitive abilities like natural language understanding, problem solving and planning. They may also need to be autonomous, thus not requiring intense operator supervision and intervention. Research on these ideas is generally coupled with an effort to reverse-engineer the natural systems that are the living examples. One major line here is building new computing systems inspired by the neural network organization of animal brains. The realization of such systems is a large interdisciplinary effort requiring the collective work of researchers from domains as distant as cognitive psychology, neuroscience, linguistics, and physics from the natural sciences, and signal processing, electronics, and mechanics from the engineering sciences, in addition to computer science. The objective of the Symposium is to bring together Turkish and foreign researchers in the field, and to publicise their studies. There will be a panel in the closing session. The subject of this panel will be determined during the Symposium.
PRELIMINARY PROGRAM
JUNE 24, 1993 THURSDAY
08:30 - 12:00 REGISTRATION
09:00 ROOM A: OPENING CEREMONY
09:20 ROOM A: INVITED SPEAKER Neural Networks for Control, Identification and Diagnosis Herbert E. Rauch, Palo Alto Research Lab., USA. IEEE Control Systems Society President Elect
10:20 TEA BREAK
10:40 ROOM A: SESSION A.1.1 (NEURAL NETWORKS THEORY)
1. Learning in Hybrid Neural Models A. M. Colla, Elsag Bailey spa, N. Longo, F. Masulli, S. Ridella, University of Genoa, G. Morgavi, I. C. E. C. N. R., Italy.
2. A Weighted Least-Squares Algorithm for Neural Network Learning in Recognition of Low Probability Events D. J. Munro, O. K. Ersoy, M. R. Bell, J. S. Sadowsky, Purdue University, USA.
3. Memory Based Function Approximation Using Neural Networks S. Aratma, E. Alpaydin, Bogazici University, Turkey.
4. Statistical Physics, Neural Networks and Combinatorial Optimization Problems H. El Ghaziri, Ecole Polytechnique Federale de Lausanne, Switzerland.
5. Genetic Synthesis of Unsupervised Learning Algorithms A. Dasdan, K. Oflazer, Bilkent University, Turkey.
10:40 ROOM B: SESSION B.1.1 (NATURAL LANGUAGE PROCESSING I)
1. Two-level Description of Turkish Morphology K. Oflazer, Bilkent University, Turkey.
2. ATN Representation of Turkish Morphology T. Gungor, S. Kuru, Bogazici University, Turkey.
3. A Text Tagger for Turkish K. Oflazer, Bilkent University, Turkey.
4. A Spelling Checker and Corrector for Turkish H. L. Akin, S. Kuru, T. Gungor, I. Hamzaoglu, D. Arbatli, Bogazici University, Turkey.
5.
Utilizing Connectionist Paradigm for Lexicon Access in Natural Language Processing Systems M. U. Sencan, K.Oflazer, Bilkent University, Turkey. 12:20 LUNCH 14:00 ROOM A: SESSION A.1.2 (IMAGE PROCESSING) 1. A Classification Algorithm Based on Feature Partitioning I. Sirin, H. A. Guvenir, Bilkent University, Turkey. 2. Dense Stereo Correspondance Using Elastic Nets U. M. Leloglu, TUBITAK AEAGE, Turkey. 3. Design and Development of an Image Processing and Computer Vision Software for Generating Data Files of 2D Physical Objects for Drafting Software Packages E. F. Arslan, A. Erden, Middle East Technical University, Turkey. 4. B-Spline Fonksiyonlari ile Goruntu Aradegerleme Isleminde Spline Katsayilarinin Hesaplanmasinda Iyilestirme. S. Albayrak, M. Y. Karsligil, Yildiz Technical University, D. Demir, TUBITAK MAE, Turkey. 5. Laser ve Kamera Araciligiyla Alinmis Kirinim Temelli Resimlerde Cisim Kenari Belirleme A. Kuzucu, Istanbul Technical University, M. Yilmaz, Erciyes University, Turkey. 14:00 ROOM B: SESSION B.1.2 (LOGIC AND REASONING) 1. A Resolution Principle for Quantificational S5 with an Application to Artificial Intelligence M. Baaz, C. G. Fermuller, Technische Universitt Wien, Austria. 2. Object-Oriented Logic Programming on Finite Domains L. V. Ciortuz, "Al. I. Cuza" University of Iasi, Romania. 3. Ontology for Buying and Selling: A Preliminary Study M. Ersan, E. Ersan, V. Akman, Bilkent University, Turkey. 4. Kuzgun Paradoksu: Akil ile Zeka Arasinda bir Ikilem M. M. Dagli, TUBITAK., Turkey. 5. Representing Emotions in Terms of Object Directedness H. G. Unsal, V. Akman, Bilkent University, Turkey. 15:40 TEA BREAK 16:00 ROOM A: SESSION A.1.3 (NEURAL NETWORKS APPLICATIONS I) 1. Arms Race Modeling Using Neural Networks: A Case Study A. N. Refenes, A. Zapranis, University College London, UK, C. Kollias, University of Crete, Greece. 2. A Comparative Study of Various Objective Functions for Feedforward Neural Networks A. U. Unluakin, F. Gurgen, Bogazici University, Turkey. 3. A Neural Network Approach to the Single Machine Total Tardiness Scheduling Problem B. Gurgun, I. Sabuncuoglu, Bilkent University, Turkey. 4. Evaluation of Doppler Blood Velocity Waveform Indices for Prenatal Surveillance Using Back Propagation Training Algorithm N. Baykal, A. Erkmen, N.Yalabik, Middle East Technical University, S. Beksac, Hacettepe University, I. Altintas, Middle East Technical University, Turkey. 5. Prediction of Postoperative Hemorrhage in Open Heart Surgery Patients Using Thromboelastographic Data and Neural Networks H. L. Akin, Bogazici University , S.Celikel, Istanbul University, Turkey. 16:00 ROOM B: SESSION B.1.3 (ARTIFICIAL INTELLIGENCE APPLICATIONS) 1. Combining AI Means and Traditional Programming to Solve Some Problems of FMS Simulation, Scheduling and Control G. Kovacs, Hungarian Academy of Sciences, Hungary. 2. Design and Implementation of an Expert System in the Prognosis of Hepatology I. Bonfa, Computer Systems Sector, C. Maioli, University of Bologna, F. Sarti, Belleria Hospital , G.Mazzoni, University of Bologna,G. L. Milandri, P. R. Dal Monte, Belleria Hospital, Italy. 3. Fuzzy Controller Against PD Controller for Servo Motor Process A. Nasar, Cairo University, M. S. El-Sherif, M. S. Abd El-Samee, Electronics Research Institute, Egypt. 4. Bilgisayar Kontrollu Robot Manipulatoru S. Celik, F. Daldaban, F. Canbulut, Erciyes University, Turkey. 5. "Bir Kelime-Bir Islem" Oynayan Program C. Say, S. Sen, R. Barengi, Bogazici University, Turkey. 
17:40 BREAK 18:00-20:30 COCKTAIL JUNE 25, 1993 FRIDAY 09:00 ROOM A: SESSION A.2.1 (OPTICAL CHARACTER RECOGNITION) 1. Combining Unsupervised Techniques With Supervised Neural Methods for Optical Character Recognition M. Yagci, E. Alpaydin, Bogazici University, Turkey. 2. Pipelined Associative Memories for Handwritten Character Recognition Z. M. Kovacs, V. R. Guerrieri, University of Bologna, Italy. 3. On-line Cursive Handwriting Recognition Using Cooperating Neural Networks N. S. Flann, Utah State University,USA. 4. An Algorithm for Automatic Recognition of Arabic Cursive Handwritten A. I. El-Desoky, M. M. Salem, El-Mansoura University, N. H. Hegazi, M. M. Farag, National Research Center, Egypt. 09:00 ROOM B: SESSION B.2.1 (KNOWLEDGE REPRESENTATION) 1. Representation of Descriptive and Prescriptive Knowledge in Intelligent Systems S. Kocabas, Istanbul Technical University, TUBITAK MAE, Turkey. 2. A Proposed Knowledge Representation Scheme for Hybrid Learning Systems S. Abdel-Hamid, A. Abdel-Wahab, A. El-Dessouki, Electronics Research Institute, Egypt. 3. Frame Matching in an Extended Relational Language A. Kulenovic, A. Lagumdzija-Kulenovic, Marmara University, Turkey. 4. Mechanical Construction of Carnap-style Knowledge Bases D. Kain, MacLean Hunter Publishers, Austria. 10:20 TEA BREAK 10:40 ROOM A: SESSION A.2.2 (NEURAL NETWORKS APPLICATIONS II) 1. Implementation of a Neural Network Model on a Massively Parallel Computer Architecture M. K. Ciliz, A. Paksoy, Bogazici University, Turkey. 2. Dynamic Hill Climbing: Overcoming the Limitations of Optimization Algortihms. D. Yuret, M. de la Maza, M.I.T, USA. 3. Yapay Noron Aglarindan Cift-Yonlu Iliskili Bellek Icin Kullanilan Uc Kodlama Yontemi S.Oge, Yildiz Technical University, F. Gurgen, Bogazici University, Turkey. 4. Yapay Sinir Aglarinin bir Tanimina Dogru C. Guzelis, Istanbul Technical University, Turkey. 5. Yapay Noron Agi Adaptif Resonans Teori Yontemi ile Siniflandirma Sistemi ve Bir Uygulama M. E. Karsligil, M. Y. Karsligil, Yildiz Technical University, Turkey. 10:40 ROOM B: SESSION B.2.2 (NATURAL LANGUAGE PROCESSING II) 1. A Lexical-Functional Grammar for a Subset of Turkish Z. Gungordu, K. Oflazer, Bilkent University, Turkey. 2. An ATN Grammar for a Subset of Turkish C. Demir, K. Oflazer, Bilkent University, Turkey. 3. Resolution of Pronominal Anaphora in Turkish E. Tin, V. Akman, Bilkent University, Turkey. 4. Automatic Natural Language Identification F. Kocan, M.U. Karakas, Hacettepe University, Turkey. 5. Language Recognition Using 3-Gram Statistical Analysis E. Gokcay, Bilkent Universitesi, D. Gokcay, Middle East Technical University, Turkey. 12:20 LUNCH 14:00 ROOM A: SESSION A.2.3 (MACHINE TRANSLATION) 1. Machine Translation from Turkish to its Dialects I. Hamzaoglu, S. Kuru, Bogazici University, Turkey. 2. Connectionism in Machine Translation of Macedonian and English Prepositions K. Cundeva, Institut za Informatika, Macedonia. 3. A Turkish / English Translation Application Using Neural Networks in Morphological Analysis D. Gokcay, U. Halici, Middle East Technical University, Turkey. 14:00 ROOM B: SESSION B.2.3 (PHILOSOPHICAL ISSUES) 1. Elements of Scientific Creativity S. Kocabas, Istanbul Technical University, TUBITAK MAE, Turkey. 2. Similarities Between Humans and Machines A. E. Gunhan, University of Bergen, Norway. 15:00 TEA BREAK 15:30 - 17:00 ROOM A: PANEL AND CLOSING CEREMONY ACCOMMODATION Participants should contact the hotel facilities directly. 
Reservations 3-4 weeks in advance are advised, considering the city's heavy tourism load during the period of the Symposium. Listed below are a few suggestions of ours. All prices include Value Added Tax except where indicated.
Ciragan Palace Kempinski (*****), overlooks the Bosphorus (near Besiktas). Single ("park" side) $195, Single (sea side) $230, Double ("park" side) $235, Double (sea side) $270. Tel: +90 (1) 2583377
Dedeman Oteli (****), in Gayrettepe. Single $110, Double $150. Tel: +90 (1) 2748800 Fax: +90 (1) 2751100
Kalyon Otel (****), in the Old City. Single $96, Double $120. Tel: +90 (1) 5174400 Fax: +90 (1) 6381111 (Excl. VAT. Concession of 20% for Symposium participants)
Movenpick Hotel (****), at Maslak. Single (single bed in double room) $165, Double $165, Breakfast $15. Tel: +90 (1) 2850900 Fax: +90 (1) 2850951
Bebek Hotel (***), relatively close to the Symposium venue. Single (single bed in double room, sea side, incl. breakfast) $90, Single ("city" side, incl. breakfast) $50, Double (sea side, incl. breakfast) $90, Double ("city" side, incl. breakfast) $60. Tel: +90 (1) 263 30 00 (2 lines) Telex: 27201 HOBE TR.
Ist. Anadolu Otelcilik ve Turizm Meslek Lisesi Uygulama Oteli, relatively close to the Symposium venue. Single (single bed in double room, incl. breakfast) 224.000 TL, Double (incl. breakfast) 224.000 TL. Tel: +90 (1) 278 19 97 (2 lines) Fax: +90 (1) 278 19 99
ARRIVAL TO BOGAZICI UNIVERSITY (BU) WITHIN ISTANBUL
BU is located between Bebek and Etiler. You can reach BU with at most 3 transit vehicles, using combinations of public bus, minibus, "dolmus" and ferry. The table below shows the distance from some centres to BU. The taxi fare is 5,000 TL initially plus 4,000 TL per km.
Centre     Arrival   Distance (km)  Taxi*
Bakirkoy   P         30             100,000 TL
Yesilkoy   P         37             125,000 TL
Eminonu    P         12              55,000 TL
Taksim     P          8              40,000 TL
Besiktas   P          7              33,000 TL
Kadikoy    P SP DP   18              85,000 TL
Bostanci   P SP DP   24             110,000 TL
Levent     P          4              25,000 TL
Maslak     MP P      10              45,000 TL
Bebek      Walk       2              13,000 TL
P - Public Bus, M - Minibus, D - Dolmus (special kind of minibus), S - Ship
* May increase by about 10% by June. The Arrival column shows the transit-vehicle combinations needed to reach BU (e.g., SP means ship first, then public bus).
REGISTRATION FEE: Until June 1, 1993, the normal registration fee is $200 and the student fee is $100. After June 1, the normal and student fees will be $250 and $125, respectively. Fees include the Proceedings, the cocktail and tea services.
APPLICATION ADDRESS: Assist. Prof. Dr. Levent Akin (Second Turkish AI and ANN Symposium), Bogazici University, Department of Computer Engineering, 80815 Bebek, ISTANBUL, TURKEY. TEL: +90 (1) 263 15 00 / 1769 FAX: +90 (1) 265 84 88 E-MAIL: YZ at TRBOUN.BITNET
PRE-REGISTRATION FORM
I will attend the II. Turkish Artificial Intelligence and Artificial Neural Networks Symposium to be held on June 24 and 25, 1993.
Name : ...................................
Job : ...................................
Institution/Company : ..................................
Address : .................................
...........................................................
Telephone : ..................................
E-Mail : ..................................
SYMPOSIUM FEE : $ .......................
I have deposited the above indicated amount at Garanti Bankasi, Bogazici University Branch, account no 6610126/2 (YAPAY ZEKA SEMPOZYUM). Bank receipt is enclosed.
From jbower at smaug.cns.caltech.edu Wed May 12 17:10:17 1993 From: jbower at smaug.cns.caltech.edu (Jim Bower) Date: Wed, 12 May 93 14:10:17 PDT Subject: CNS*93 Message-ID: <9305122110.AA11014@smaug.cns.caltech.edu> Registration information for the Second Annual Computation and Neural Systems Meeting CNS*93 July 31 through August 6,1993 Washington DC This posting announces registration for this year's Computation and Neural Systems meeting (CNS*93). This is the second in a series of annual inter-disciplinary conferences intended to address the broad range of research approaches and issues involved in the general field of computational neuroscience. As last year, this meeting will bring together experimental and theoretical neurobiologists along with engineers, computer scientists, cognitive scientists, physicists, and mathematicians interested in understanding how biological neural systems compute. The meeting will equally emphasize experimental, model-based, and more abstract theoretical approaches to understanding neurobiological computation. Meeting Structure The meeting will be composed of three parts: a day of tutorials, three and a half days of research presentations, and two and a half days of follow up workshops. The first day of the meeting (July 30) will be devoted to tutorial presentations and workshops focused on particular technical issues confronting computational neurobiology as well as general issues related to computational neurobiology. Introductory tutorials will also be given. The next three and a half days will include the main technical program consisting of plenary, contributed and poster sessions. Oral presentations will be made in one continuous session. Posters will be presented each evening. Following the main meeting, there will be two and a half days of focused workshops at a resort in West Virginia. Workshop topics will be established via email communication prior to the meeting as well as during the main meeting based on papers presented and issues raised. Location The tutorial day and the main meeting itself will be held at the Hyatt Regency Bethesda. This modern hotel is located at One Bethesda Metro Center in downtown Bethesda with easy access to the DC metro system. Following the main meeting, two days of postmeeting workshops will be held at the Coolfont resort which is set within 1350 mountainous acres in the Eastern Panhandle of West Virginia. Accommodations Main Meeting. We have reserved a block of rooms at special rates in the conference hotel. Regular registrants (i.e. non students) $89 single, $125 double, and full time students $79 single or double. Registering for the meeting, WILL NOT result in an automatic room reservation. Instead you must make your own reservations by contacting: The Hyatt Regency Bethesda One Bethesda Metro Center Bethesda, MD 20814 (301) 657-1234 or 1-800-233-1234 NOTE: IN ORDER TO GET THE REDUCED RATES, YOU MUST INDICATE THAT YOU ARE REGISTERING FOR THE CNS*93 MEETING. STUDENTS WILL BE ASKED TO VERIFY STATUS. Workshops. Accommodations for the workshops will be provided onsite at Coolfont resort and are included in the price of registration. All meals are also included in the registration fee. Acknowledgment of registration for the workshops and payment of fees WILL constitute a guarantee of accommodations at Coolfont. However, the total accommodations available for the workshops are limited, so please register early. 
Registering for the meeting
We would recommend registering for the meeting as soon as possible, as space for some meeting events is limited. Participants can register for the meeting in several different ways: 1) electronically, 2) via email, 3) via regular surface mail. Each method is described below. Please register using only one method. You will receive a confirmation of registration within two weeks.
1) Interactive electronic registration: For those of you with internet connectivity who would like to register electronically for CNS*93, we have provided an internet account through which you may submit your registration information. To use this service you need only "telnet" to "mordor.cns.caltech.edu" and login as "cns93". No password is required. For example:
yourhost% telnet mordor.cns.caltech.edu
Trying 131.215.137.69 ...
Connected to mordor.cns.caltech.edu.
Escape character is '^]'.
SunOS UNIX (mordor)
login: cns93
Now answer all questions
Note that all registration through this electronic service is subject to verification of payment.
2) For those with easy access to electronic mail, simply fill in the attached registration form and email it to: cp at smaug.cns.caltech.edu
3) Finally, for those who elect neither of the above options, please print out the attached registration form and send it with payment via surface mail to the address indicated.
In each case, registration will not be accepted as final until all fees are paid. Those registering by 1 or 2 above, but paying with check or money order, should send payment to the following address with your name and institution clearly marked.
CNS*93 Registrations Division of Biology 216-76 Caltech Pasadena, CA 91125
==================================================
************************
REGISTRATION FORM CNS*93 WASHINGTON D.C. July 31 - August 8 1993
************************
Name : Title : Organization :
Address : City : State : Zip : Country :
Telephone : email address :
Registration Fees :
Tutorial (July 31) _____ $ 25 (includes lunch)
Technical Program (August 1-4)
_____ $ 300 Regular
_____ $ 125 Full-time Student (Include verification of status)
_____ $ 50 Banquet (for each additional banquet ticket)
(main registration includes one banquet ticket and book of abstracts)
Post-meeting Workshop (August 4-7) _____ $ 300 (includes round-trip transportation, meals and lodging)
$ ______ Total Payment
Please indicate method of payment :
____ Check or Money Order (Payable in US dollars to CNS*93 - Caltech) will be sent to CNS*93 Registrations Division of Biology 216-76 Caltech Pasadena, CA 91125
___ Visa ___ Mastercard ___ American Express
Charge my card number ________________________________________
Expiration date ____________ Name of cardholder ___________________
Signature as appears on card : _________________________ Date ____________
Please make sure to indicate CNS*93 and YOUR name on all money transfers
Did you submit an abstract & summary ? ( ) yes ( ) no title :
Do you have special dietary preferences or restrictions (e.g., diabetic, low sodium, kosher, vegetarian)? If so, please note:
Some grants to cover partial travel expenses may become available. Do you wish further information ? ( ) yes ( ) no
==================================================
On-line access to additional meeting information
Additional information about the meeting is available via FTP over the internet (address: 131.215.137.69).
To obtain information about the agenda, currently registered attendees, or paper abstracts, the initial sequence is the same (things you type are in ""):
> yourhost% "ftp 131.215.137.69"
> 220 mordor FTP server (SunOS 4.1) ready.
Name (131.215.137.69:): "ftp"
> 331 Guest login OK, send ident as password.
Password: "yourname at yourhost.yoursite.yourdomain"
> 230 Guest login OK, access restrictions apply.
ftp> "cd cns93"
> 250 CWD command successful.
ftp>
At this point you can do one of several things:
1) To examine what is available type: "ls"
2) To download the meeting registration form type: "get registration"
3) To download the meeting agenda type: "get agenda"
4) To download a list of attendees type: "get attendees"
5) To download meeting abstracts first type: "cd cns93/abstracts"
a) to view the list of abstracts type: "ls"
b) to download specific abstracts type: "get "
c) to download all abstracts type: "mget *"
Once you have obtained what you want type: "quit"

From petsche at learning.siemens.com Thu May 13 18:26:36 1993
From: petsche at learning.siemens.com (Thomas Petsche)
Date: Thu, 13 May 93 18:26:36 EDT
Subject: Position available immediately
Message-ID: <9305132226.AA07416@learning.siemens.com>
A Siemens subsidiary in Atlanta, Georgia has an immediate opening for an engineer with a background that includes neural networks and electric machines (motors and generators). The position requires a master's degree or equivalent experience. Your responsibilities would include developing, implementing and testing neural network, statistical, and/or machine learning based algorithms for electric machine diagnosis. Siemens AG is a worldwide supplier of electrical and electronic devices with sales in excess of $4 billion in the US and $40 billion worldwide. If you are interested, send a cover letter and resume to me and I'll forward it to the relevant people.
Thomas Petsche Siemens Corporate Research 755 College Road East Princeton, NJ 08540 Fax: 609-734-3392

From isk at lautaro.fb10.tu-berlin.de Thu May 13 03:18:28 1993
From: isk at lautaro.fb10.tu-berlin.de (I.SANTIBANEZ-KOREF)
Date: Thu, 13 May 93 09:18:28 +0200
Subject: Evolutionary Structuring of Artificial Neural Networks
Message-ID: <9305130718.AA06992@lautaro.fb10.tu-berlin.de >
*** DO NOT FORWARD TO ANY OTHER LISTS ***
A postscript copy of the following Technical Report can be obtained by anonymous ftp at ftp-bionik.fb10.tu-berlin.de (ftp-instructions at the end of the message):
Evolutionary Structuring of Artificial Neural Networks
H.--M. Voigt, J. Born, I. Santibanez--Koref
Technical University Berlin, Bionics and Evolution Techniques Laboratory, Bio-- and Neuroinformatics Research Group
The report summarizes research on the structuring of Artificial Neural Networks by a stochastic graph generation grammar. The main feature of the approach is to carry out the graph generation in view of an individual development process which is embedded in an evolutionary framework. We explain this approach by examples, and evaluate its practicability. Comments and questions are welcome.
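For readers who want a concrete picture of what "graph generation embedded in an evolutionary framework" can mean, here is a deliberately tiny sketch. It is illustrative only: the two rewriting rules, the duplication probability, and the stand-in fitness below are inventions for the example, not the grammar or the objective used in the report, and Python is assumed.

# Illustrative sketch only -- not the system described in the report. A tiny
# stochastic graph grammar grows a hidden-unit topology by repeated rewriting
# (duplicate a node together with its links, or add a random edge), and a
# simple evolutionary loop adapts the duplication probability. The fitness is
# a made-up stand-in (prefer topologies with about 20 connections).
import random

def develop(p_duplicate, steps=10, seed=0):
    rng = random.Random(seed)
    nodes = [0, 1]                       # start from two connected units
    edges = {(0, 1)}
    for _ in range(steps):
        if rng.random() < p_duplicate:   # rule 1: duplicate a unit
            src = rng.choice(nodes)
            new = len(nodes)
            nodes.append(new)
            neighbours = [a if b == src else b
                          for (a, b) in edges if src in (a, b)]
            for n in neighbours:         # the copy inherits src's connections
                edges.add((min(new, n), max(new, n)))
        else:                            # rule 2: add a random edge
            a, b = rng.sample(nodes, 2)
            edges.add((min(a, b), max(a, b)))
    return nodes, edges

def fitness(p_duplicate):
    _, edges = develop(p_duplicate)
    return -abs(len(edges) - 20)         # stand-in objective

population = [random.random() for _ in range(10)]
for generation in range(30):
    population.sort(key=fitness, reverse=True)
    parents = population[:5]
    children = [min(1.0, max(0.0, p + random.gauss(0.0, 0.1))) for p in parents]
    population = parents + children
print("best duplication probability:", round(max(population, key=fitness), 2))

In the report's setting the developed graph would be interpreted as a network topology and evaluated by training it on a task; the stand-in fitness above merely keeps the example self-contained.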
========== ftp-instructions:
unix %ftp ftp-bionik.fb10.tu-berlin.de
Connected to lautaro.fb10.TU-Berlin.DE.
Name (ftp-bionik.fb10.tu-berlin.de:pqp):anonymous
331 Guest login ok, send your complete e-mail address as password.
Password:
230 Guest login ok, access restrictions apply.
ftp> cd pub/papers/Bionik
250 CWD command successful.
ftp> bin
200 Type set to I.
ftp> get tr-02-93.ps.Z
200 PORT command successful.
150 Opening BINARY mode data connection for tr-02-93.ps.Z (157041 bytes).
226 Transfer complete.
local: tr-02-93.ps.Z remote: tr-02-93.ps.Z
157041 bytes received in 0.44 seconds (3.5e+02 Kbytes/s)
ftp> quit
221 Goodbye.
unix % zcat tr-02-93.ps.Z | lpr -P
===============
Ivan Santibanez-Koref
FG: Bionik und Evolutionstechnik, FoG Bio- und Neuroinformatik
Sekr. ACK1, Ackerstrasse 71-76, 1000 Berlin 6, GERMANY
Tel.: +49 - 30 - 314 72 677 Fax.: +49 - 30 - 541 98 72
E-mail: isk at fb10.tu-berlin.dbp.de

From ferna1 at sis.ucm.es Thu May 13 09:18:30 1993
From: ferna1 at sis.ucm.es (Fernando M. Pescador)
Date: Thu, 13 May 1993 9:18:30 UTC+0200
Subject: Request for Connectionists Information
Message-ID:
I am sending you the programme of our 1st Forum on Neural Networks. Apologies: the original announcement is in Spanish. Sincerely, Fernando Pescador ferna1 at sis.ucm.es
* * * * * NEURAL NETWORKS FORUM (FORUM DE REDES NEURONALES) * * * * *
Thursday, 27 May 1993, Universidad Complutense.
Venue: lecture hall of the U.C.M. Computing Centre (Centro de Proceso de Datos). Time: 9:30 a.m.
Organization: Servicio de Informatica, Universidad Complutense de Madrid. Fernando Pescador (ferna1 at sis.ucm.es)
------ Talks --------
1. "Molecular computing: or how to design neural networks with proteins, and its application to the design of nano-computers."
Author: Rafael Lahoz Beltra, Dpto. de Biomatematica (Matematica Aplicada), U.C.M.
Abstract: The aim of the talk is to introduce the concept of molecular computing, to show how neural networks can be implemented physically with networks of polymers (specifically proteins), and to discuss the results we have obtained and published in COMPUTER, BioSystems and other journals.
2. "Representation and coding of organic-chemistry structures using neural networks."
Author: Manfred Stud, Instituto de Quimica del C.S.I.C.
Abstract: A graphical module is presented that transforms a molecular structure from organic chemistry into a neural network whose processing yields a code; that code is used as the representation of the structure in a backpropagation-based association with one of its properties (its biological activity).
3. "Identification of day types with a self-organizing map."
Author: Alvaro Garcia Tejedor, Dpto. Ingenieria del Conocimiento, ERITEL.
Abstract: The aim is to predict national electricity demand on an hourly scale (the hourly demand curve) for any day of the year. The behaviour of the demand, however, depends strongly on the day being predicted, and the number and types of days in a year are not known, since many of the factors that influence demand cannot be quantified. A Kohonen map has been proposed as a method for classifying days from their hourly demand profiles. The result has been compared with other classification techniques (Principal Component Analysis and Discriminant Analysis using Mahalanobis distances).
4. "Defining the input layer."
Author: Susana Lopez Ornat, Dpto. Psicologia Basica: Procesos Cognitivos, U.C.M.
Abstract: The problem of defining the input layer for models of language acquisition. The experimental results obtained show that the input is co-defined by the system's processes in the patterns presented to the input layer.
5. "Learning by adaptive value."
Authors: Antonio Murciano Cespedosa and Javier Zamora Romero, Dpto. Biomatematica (Matematica Aplicada), U.C.M.
Abstract: A model for the centring of visual stimuli will be presented as an example of learning by adaptive value. The model is based on biologically inspired, evolutionarily selected structures. This kind of learning allows behaviour that depends on the environment (both the training environment and the operating environment).
6. "Metabolic networks: proteins and Turing machines."
Author: Mario Reviriego Eiros, Dpto. Biomatematica (Matematica Aplicada), U.C.M.
Abstract: The topic is the simulation of metabolic networks by means of enzymes (proteins) that "behave" as Turing machines and metabolites that are modelled as binary strings.
7. "Computing protein secondary structure with an unsupervised neural network."
Author: Pablo Chacon Montes, Dpto. Biologia Molecular I (Fac. Quimicas), U.C.M.
Abstract: A Kohonen network has been developed for the topological classification of proteins from circular dichroism spectra. The resulting maps show an ordering that depends on the different types of secondary structure. This classification is then used to compute the percentage of secondary structure in new proteins.
8. "Self-organization of ON-OFF receptive fields in the visual cortex."
Author: Miguel A. Andrade Navarro, Dpto. Biologia Molecular I (Fac. Quimicas), U.C.M.
Abstract: The self-organization of synaptic connections in the mammalian visual cortex during the early stages of development is modelled, using a two-layer neural network in which the evolution of the connections depends on the correlation of neural activity through Hebbian rules. Neurons sensitive to the position, orientation and size of a stimulus are observed to emerge.
9. "NT5000: a neural network processing system."
Author: Jose C. Chacon Gomez, Laboratorio de Vision, Fac. Psicologia.
Abstract: Presentation of this system (specialized hardware plus software running on a PC) for the design and computation of neural networks.
10. "Aspirin/MIGRAINES, a tool for the design and analysis of neural networks."
Author: Fernando Pescador, Servicio de Informatica de la U.C.M.
Abstract: The freely distributed tool AM6.0 is presented; it runs on workstations and supercomputers and is used for the design and analysis of neural networks.
===============================================================================

From MASETTI at BOLOGNA.INFN.IT Mon May 17 05:57:00 1993
From: MASETTI at BOLOGNA.INFN.IT (MASETTI@BOLOGNA.INFN.IT)
Date: Mon, 17 MAY 93 09:57 GMT
Subject: call for papers: SAC '94
Message-ID:
===========================================================
                     CALL FOR PAPERS
                     ===============
      1994 ACM Symposium on Applied Computing (SAC'94)
            TRACK ON FUZZY LOGIC IN APPLICATIONS
            ------------------------------------
         Phoenix Civic Plaza, Phoenix, Arizona, USA
                      March 6-8, 1994
===========================================================
SAC'94 is the annual conference of the ACM Special Interest Groups on Applied Computing (SIGAPP), APL (SIGAPL), Biomedical Computing (SIGBIO), Business Information Technology (SIGBIT), Computer Uses in Education (SIGCUE), Forth (SIGFORTH), and Small and Personal Computer (SIGSMALL/PC). Over the past nine years, SAC has become a primary forum for applied computing practitioners and researchers.
Once again SAC'94 will be held in conjunction with the 1994 ACM Computer Science Conference (CSC'94). Fuzzy Logic in Applications is one of the major tracks in SAC. The purpose of this track is to provide a forum for the interchange of ideas, research, development activities, and applications among academics and practitioners in areas related to fuzzy logic in applications. State-of-the-art and state-of-the-practice original papers relevant to the track themes, as well as panel proposals, are solicited.
RELEVANT TOPICS: Applications of Fuzzy Systems to:
- System Control
- Signal Processing
- Intelligent Information Systems
- Image Understanding
- Case-Based Reasoning
- Pattern Recognition
- Decision Making and Analysis
- Robotics and Automation
- Modelling
- Medical Diagnosis and MRI
- Databases and Information Retrieval
- Evolutionary Computation
- Neural Systems
IMPORTANT DATES:
Submission of draft papers: 17.09.1993
Notification of acceptance: 01.11.1993
Camera-ready copy due: 20.11.1993
TRACK CHAIR: Madjid Fathi, FB Informatik, LS1, P.O.BOX 500 500, University of Dortmund, D-4600 Dortmund 50, Germany. Tel: +49231-7556372 FAX: +49231-7556555 Email: fathi at ls1.informatik.uni-dortmund.de
HONORARY ADVISOR: Lotfi A. Zadeh, University of California, Berkeley
TRACK ADVISORY: Y. Attikiouzel, Univ. of Western Australia; H. Berenji, NASA Ames Division, AI Research, CA, USA; M. Jamshidi, Univ. of New Mexico, NM, USA; A. Kandel, Univ. of South Florida, USA; R. Kruse, Univ. of Braunschweig, Germany; E.H. Mamdani, Univ. of London, GB; M. Masetti, Univ. of Bologna, Italy; H. Prade, Univ. of Paul Sabatier, France; B. Reusch, Univ. of Dortmund, Germany; E.H. Ruspini, SRI International, USA; H. Tanaka, Univ. of Osaka, Japan; L. Valverde, Univ. de les Illes Baleares, Spain; R.R. Yager, Iona College, Editor in-chief, USA; H.J. Zimmermann, Univ. of Aachen, Germany
GUIDELINES FOR SUBMISSION
Several categories of papers will be considered for presentation and publication, including: (i) original and unpublished research articles, and (ii) reports of applications in business, government, industry, arts, science and engineering. Accepted papers will be published in the ACM/SAC'94 Conference Proceedings to be printed by the ACM Press. In order to facilitate the blind external review process, submission guidelines must be strictly adhered to:
- Submit 5 copies of your manuscript to the track chair.
- Authors' names and addresses MUST NOT appear in the body of the paper, self-reference must be in the third person, attribution to the author(s) must be in the form of "author", and bibliographical entries by the author(s) must also be in the form of "author".
- The body of the paper should not exceed 5,000 words (approximately 20 double-spaced pages).
- A separate cover sheet should be attached to each copy, containing the title of the paper, the author(s) and affiliation(s), and the address (including e-mail address and fax number, if available) to which correspondence should be sent.
- Panel proposals must include an abstract of the topics and a copy of the moderator's resume/vita.
------- End of Forwarded Message

From Prahlad.Gupta at K.GP.CS.CMU.EDU Mon May 17 18:57:24 1993
From: Prahlad.Gupta at K.GP.CS.CMU.EDU (Prahlad.Gupta@K.GP.CS.CMU.EDU)
Date: Mon, 17 May 93 18:57:24 EDT
Subject: Cognitive Science Preprint
Message-ID:
FTP-host: reports.adm.cs.cmu.edu (128.2.218.42)
FTP-filename: /1993/CMU-CS-93-146.ps
The following article will appear in the Cognitive Science journal.
A preprint of the paper is available as CMU Computer Science Technical Report
No. CMU-CS-93-146, in electronic as well as hard-copy form. Information
follows about electronic retrieval (free), as well as ordering hard copies
(for a small charge). Comments on the paper are invited.

Note: A preliminary and substantially different version of this paper was
announced in the neuroprose electronic archive in December-January 1991-92 as
the file gupta.stress.ps.Z (which is no longer available).

-- Prahlad

------------------------------------------------------------------------------

             Connectionist Models and Linguistic Theory:
           Investigations of Stress Systems in Language

                Prahlad Gupta and David S. Touretzky
                     Carnegie Mellon University

                 (To appear in Cognitive Science)

                              Abstract

We question the widespread assumption that linguistic theory should guide the
formulation of mechanistic accounts of human language processing. We develop a
pseudo-linguistic theory for the domain of linguistic stress, based on
observation of the learning behavior of a perceptron exposed to a variety of
stress patterns. There are significant similarities between our analysis of
perceptron stress learning and metrical phonology, the linguistic theory of
human stress. Both approaches attempt to identify salient characteristics of
the stress systems under examination without reference to the workings of the
underlying processor. Our theory and computer simulations exhibit some
strikingly suggestive correspondences with metrical theory. We show, however,
that our high-level pseudo-linguistic account bears no causal relation to
processing in the perceptron, and provides little insight into the nature of
this processing. Because of the persuasive similarities between the nature of
our theory and linguistic theorizing, we suggest that linguistic theory may be
in much the same position. Contrary to the usual assumption, it may not
provide useful guidance in attempts to identify processing mechanisms
underlying human language.

------------------------------------------------------------------------------

INSTRUCTIONS FOR ELECTRONIC RETRIEVAL VIA ANONYMOUS FTP

unix> ftp reports.adm.cs.cmu.edu      # or ftp 128.2.218.42
Connected to reports.adm.cs.cmu.edu.
220 REPORTS.ADM.CS.CMU.EDU FTP server (Version 4.105 of 10-Jul-90 12:08) ready.
Name (reports.adm.cs.cmu.edu:prahlad): anonymous
331 Guest login ok, send username at node as password.
Password:                             # you must include the "@"
230-Filenames can not begin with "/.." . Other than that, everything is ok.
230 User anon logged in.
ftp> cd 1993
250 Directory path set to 1993.
ftp> get CMU-CS-93-146.ps
200 PORT command successful.
150 Opening data connection for CMU-CS-93-146.ps (128.2.248.83,1073) (591324 bytes).
226 Transfer complete.
local: CMU-CS-93-146.ps remote: CMU-CS-93-146.ps
600021 bytes received in 10 seconds (57 Kbytes/s)
ftp> quit
unix> lpr -P CMU-CS-93-146.ps         # or however you print PostScript files
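For readers who would rather script the retrieval than type the session above,
a minimal sketch using Python's standard ftplib module -- assuming the host,
directory and file name shown above are still current -- is:

# Minimal sketch: fetch the preprint by anonymous FTP with Python's ftplib.
# Host, path and file name are taken from the instructions above; adjust if
# the archive has moved.
from ftplib import FTP

HOST = "reports.adm.cs.cmu.edu"
FILENAME = "CMU-CS-93-146.ps"

ftp = FTP(HOST)
ftp.login(user="anonymous", passwd="user@example.org")  # password: your e-mail address
ftp.cwd("1993")
with open(FILENAME, "wb") as f:
    ftp.retrbinary("RETR " + FILENAME, f.write)         # binary transfer of the PostScript file
ftp.quit()
print("saved", FILENAME)

A binary transfer avoids any newline conversion, so the file arrives
byte-for-byte as stored on the server.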
------------------------------------------------------------------------------

ORDERING A HARD COPY  (The TR No. is CMU-CS-93-146)

Contact:  Computer Science Documentation
          School of Computer Science
          Carnegie Mellon University
          Pittsburgh, Pennsylvania 15213, USA
          Phone: (412) 268-2596
          Internet: reports at cs.cmu.edu

------------------------------------------------------------------------------

From sjodin at sics.se Tue May 18 09:11:23 1993
From: sjodin at sics.se (Gunnar Sjodin)
Date: Tue, 18 May 1993 15:11:23 +0200
Subject: Two research positions at the Swedish Institute of Computer Science available.
Message-ID: <9305181311.AA06261@sics.se>

SICS is the joint effort of Swedish industry and government in computer
science research. We are now entering the area of neural networks and would
like to employ one researcher permanently and to invite a guest researcher for
one year. Both should be willing to help build the group and its research
program. We are interested in candidates with a strong background in both the
theory of the field and its applications. On the application side, we are
particularly interested in methods for telecommunications and robotics, but
other areas may be considered as well. Duties begin as soon as possible after
September 1, 1993.

Apply by June 15, in writing, by email, or by fax, to

  Gunnar Sjodin
  SICS
  Box 1263
  S-164 28 Kista
  Sweden
  phone: +46-8-752 15 48, fax: +46-8-751 72 30
  email: sjodin at sics.se

Please enclose a curriculum vitae, a list of publications, and the names,
addresses, and phone numbers of two referees.

From stiber at cs.ust.hk Wed May 19 14:14:10 1993
From: stiber at cs.ust.hk (Dr. Michael D. Stiber)
Date: Wed, 19 May 93 14:14:10 HKT
Subject: Paper in neuroprose: Learning In Neural Models With Complex Dynamics
Message-ID: <9305190614.AA21563@cs.ust.hk>

The following preprint has been placed in the Neuroprose archives at Ohio
State (filename: stiber.dynlearn.ps.Z). If you cannot use FTP, I can email the
file to you.

     "Learning In Neural Models With Complex Dynamics"  (4 pages)

                        Michael Stiber
                Department of Computer Science
       The Hong Kong University of Science and Technology
               Clear Water Bay, Kowloon, Hong Kong
                      stiber at cs.ust.hk

                        Jose P. Segundo
           Department of Anatomy and Cell Biology and
                    Brain Research Institute
                    University of California
               Los Angeles, California 90024, USA
                  iaqfjps at mvs.oac.ucla.edu

                            Abstract

Interest in the ANN field has recently focused on dynamical neural {\em
networks} for performing temporal operations, as more realistic models of
biological information processing, and as a way to extend ANN learning
techniques. While this represents a step towards realism, it is important to
note that {\em individual} neurons are complex dynamical systems, interacting
through nonlinear, nonmonotonic connections. The result is that the ANN
concept of {\em learning}, even when applied to a single synaptic connection,
is a nontrivial subject. Based on recent results from living and simulated
neurons, a first pass is made here at clarifying this problem. We summarize
how synaptic changes in a 2-neuron, single-synapse neural network can change
system behavior, and how this constrains the type of modification scheme that
one might want to use for realistic neuron-like processors.
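As a toy illustration of the kind of point made above -- not the model or the
parameters analyzed in the paper -- the following sketch drives a leaky
integrate-and-fire cell through a single synapse with a periodic presynaptic
spike train; changing the synaptic strength w changes not just the output rate
but the qualitative locking pattern:

# Toy two-neuron system: a periodic presynaptic spike train drives a leaky
# integrate-and-fire (LIF) cell through one synapse of strength w.  This is a
# generic illustration only; the paper's model and parameters differ.
import numpy as np

def lif_response(w, t_max=2.0, dt=1e-4, tau=0.02, v_th=1.0, pre_period=0.015):
    """Return the output/input spike-count ratio for synaptic strength w."""
    v = 0.0
    n_pre = n_post = 0
    next_pre = pre_period
    for step in range(int(t_max / dt)):
        t = step * dt
        v += dt * (-v / tau)              # membrane leak
        if t >= next_pre:                 # presynaptic spike arrives
            v += w
            n_pre += 1
            next_pre += pre_period
        if v >= v_th:                     # postsynaptic spike and reset
            v = 0.0
            n_post += 1
    return n_post / n_pre

for w in (0.35, 0.6, 0.8, 1.1):
    print(f"w = {w:4.2f}   post/pre spike ratio = {lif_response(w):.2f}")

Even this caricature passes through qualitatively different regimes (silence,
roughly 1:3 and 1:2 locking, then 1:1 following) as w grows, which is one
sense in which a change at a single synapse can alter system behavior.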
Dr. Michael Stiber                                     stiber at cs.ust.hk
Department of Computer Science                         tel: (852) 358 6981
The Hong Kong University of Science & Technology       fax: (852) 358 1477
Clear Water Bay, Kowloon, Hong Kong

From pastor at max.ee.lsu.edu Wed May 19 16:41:28 1993
From: pastor at max.ee.lsu.edu (John Pastor)
Date: Wed, 19 May 93 15:41:28 CDT
Subject: No subject
Message-ID: <9305192041.AA28051@max.ee.lsu.edu>

The following technical report is now available. If you would like to have a
copy, please let me know.

------------------------------------------------------------------

                 Technical Report ECE/LSU 93-04

              Another Alternative to Backpropagation:
  A One Pass Classification Scheme for Use with the Kak algorithm

                         John F. Pastor
         Department of Electrical and Computer Engineering
                   Louisiana State University
                    Baton Rouge, La. 70803

                         April 26, 1993

                  email: pastor at max.ee.lsu.edu

                            ABSTRACT

Kak [1] provides a new technique for designing and training a feedforward
neural network. Training with the Kak algorithm is much faster, and much more
easily implemented, than training with the backpropagation algorithm [2]. The
Kak algorithm calls for the construction of a network with one hidden layer,
in which each hidden neuron classifies an input vector in the training set
that maps to a nonzero output vector. Kak [1] also presents two classification
algorithms. The first, CC1, provides generalization comparable to
backpropagation [2], but may require numerous passes through the training set
to classify one input vector. The second, CC2, requires inspection only of the
vector we wish to classify, but does not provide generalization. An extension
of CC2 is suggested here as a new classification scheme that classifies an
input vector with only one pass through the training set yet provides
generalization. Simulation results are presented demonstrating that the new
classification scheme not only significantly reduces training time but also
provides better generalization than classifying with CC1. Thus the Kak
algorithm, used with this new classification scheme, is an even better
alternative to backpropagation.
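To give a concrete feel for the general style of construction described above
-- one hidden unit devoted to each training vector with a nonzero target --
here is a toy prototype-based classifier for binary vectors. It uses a
Hamming-distance tolerance to obtain some generalization and is offered for
illustration only; it is NOT the CC1 or CC2 procedure, which should be taken
from Kak [1].

# Generic toy sketch of a "one hidden unit per stored example" classifier for
# binary vectors.  A hidden unit fires when the input is within `radius`
# Hamming distance of its stored example; the output is the OR of the targets
# of all firing units.  Illustration only -- NOT the CC1/CC2 algorithms of [1].
import numpy as np

def build_hidden_layer(X, Y):
    """Keep one (example, target) pair per training vector with a nonzero target."""
    return [(x, y) for x, y in zip(X, Y) if np.any(y)]

def classify(units, x, radius=1):
    """OR together the targets whose stored example is within `radius` bits of x."""
    out = None
    for proto, target in units:
        if np.sum(proto != x) <= radius:
            out = target if out is None else np.logical_or(out, target).astype(int)
    return out if out is not None else np.zeros_like(units[0][1])

# Tiny demonstration: 4-bit inputs, 1-bit target ("at least three 1s").
X = np.array([[1,1,1,0], [1,0,1,1], [0,1,1,1], [1,0,0,0], [0,0,1,0]])
Y = np.array([[1], [1], [1], [0], [0]])
units = build_hidden_layer(X, Y)
print(classify(units, np.array([1,1,1,1])))   # unseen input near stored positives -> [1]
print(classify(units, np.array([0,0,0,0])))   # far from all stored positives -> [0]

The stored-example-plus-radius idea mirrors the trade-off discussed in the
abstract: with radius 0 the classifier only memorizes, while a small positive
radius lets nearby unseen inputs inherit the stored targets.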