From esann at dice.ucl.ac.be Sun Mar 1 14:34:52 1998 From: esann at dice.ucl.ac.be (ESANN) Date: Sun, 01 Mar 1998 20:34:52 +0100 Subject: 1998 European Symposium on Artificial Neural Networks Message-ID: <3.0.3.32.19980301203452.006aaa20@ns1.dice.ucl.ac.be> ************************************************************************ Please accept our apologies if you received this message more than once. We do our best to maintain our mailing lists and avoid duplicates. However, your e-mail may be part of an alias-list maintained elsewhere, that we used for the ESANN'98 mailing. ************************************************************************ ------------------------------- | 6th European Symposium on | | Artificial Neural Networks | | | | ESANN'98 | | | | Bruges - April 22-24, 1998 | ------------------------------- -- Preliminary programme -- http://www.dice.ucl.ac.be/esann The programme of the ESANN'98 conference, the 6th European Symposium on Artificial Neural Networks, including information for registration, is available on the Web at the above URL. A printed copy of this programme is also available on request (mailto:esann at dice.ucl.ac.be). Please do not hesitate to contact us (preferably by email) if you need any other information concerning this conference. ===================================================== ESANN - European Symposium on Artificial Neural Networks http://www.dice.ucl.ac.be/esann * For submissions of papers, reviews,... Michel Verleysen Univ. Cath. de Louvain - Microelectronics Laboratory 3, pl. 
du Levant - B-1348 Louvain-la-Neuve - Belgium tel: +32 10 47 25 51 - fax: + 32 10 47 25 98 mailto:esann at dice.ucl.ac.be * Conference secretariat D facto conference services 45 rue Masui - B-1000 Brussels - Belgium tel: + 32 2 203 43 63 - fax: + 32 2 203 42 94 mailto:esann at dice.ucl.ac.be ===================================================== From kainen at cs.umd.edu Mon Mar 2 23:02:50 1998 From: kainen at cs.umd.edu (Paul Kainen) Date: Mon, 2 Mar 1998 23:02:50 -0500 (EST) Subject: paper available: Approximation by neural networks is not continuous Message-ID: <199803030402.XAA21306@tove.cs.umd.edu> Dear Colleagues, The paper described below is accessible via the web at http://www.clark.net/pub/kainen/not-cont.ps It is 10 pages printed, 174 KB; sorry, hard copy not available. The paper has been submitted for a special issue of a journal. Approximation by neural networks is not continuous Paul C. Kainen, Vera Kurkova and Andrew Vogt It is shown that in a Banach space X satisfying mild conditions, for an infinite, independent subset G, there is no continuous best approximation map from X to the n-span, span_n G. The hypotheses are satisfied when X is an L_p space, 1 < p < \infty, and G is the set of functions computed by the hidden units of a typical neural network (e.g., Gaussian, Heaviside or hyperbolic tangent). If G is finite and span_n G is not a subspace of X, it is also shown that there is no continuous map from X to span_n G within any positive constant of a best approximation. Keywords. nonlinear approximation, one-hidden-layer neural network, rates of approximation, continuous selection, metric projection, proximinal set, Chebyshev set, n-width, geometry of Banach spaces. 
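In symbols, the central non-existence result stated in the abstract reads as follows (a restatement of the abstract's claim for readability, not the paper's exact formulation; span_n G denotes combinations of at most n elements of G):

```latex
\[
\operatorname{span}_n G \;=\; \Bigl\{\, \sum_{i=1}^{n} c_i\, g_i \;:\; c_i \in \mathbb{R},\ g_i \in G \,\Bigr\},
\]
% Non-existence of a continuous best-approximation map:
\[
\nexists\ \text{continuous } M : X \to \operatorname{span}_n G
\quad\text{such that}\quad
\|x - M(x)\| \;=\; \inf_{s \in \operatorname{span}_n G} \|x - s\|
\quad \text{for all } x \in X .
\]
```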
kainen at gumath1.math.georgetown.edu vera at uivt.cas.cz andy at gumath1.math.georgetown.edu

From prefenes at lbs.ac.uk Tue Mar 3 07:05:14 1998 From: prefenes at lbs.ac.uk (Paul Refenes) Date: Tue, 3 Mar 1998 12:05:14 UTC Subject: Post-Doc and PhD Scholarships at LBS Message-ID: <6D5DBFA6218@nemesis.lbs.ac.uk>

Dear Connectionists,

Scholarships for the following posts are available at London Business School.

===================================================
London Business School
Decision Technology Centre
Computational Finance Programme

* Post-doctoral Research in Computational Finance
* PhD Research Scholarship in Computational Finance

The Department of Decision Science at London Business School is offering two scholarships on its Computational Finance programme. The research areas include Neural Networks, Non-parametric Statistics, Financial Engineering, Simulation, Optimisation and Decision Analysis.

1. Post-doctoral Research in Computational Finance: the use of advanced decision technologies such as neural networks, non-parametric statistics and adaptive systems for the development of financial risk models in the fixed income and currency markets. Our industrial collaborators are leading European banks and have a special interest in fixed income arbitrage and relative value models.

2. PhD Research Scholarship in Computational Finance: to utilise developments from time series theory and from the non-parametric statistics field for developing distribution theories, statistical diagnostics, and test procedures for neural model identification.

London Business School offers students enrolled in the doctoral programme core courses in Research Methodology and Statistical Analysis, as well as a choice of advanced specialised subject area courses including Financial Economics, Equity Investment, Derivatives Research, etc.
Candidates with a strong background in mathematics, operations research, computer science, nonparametric statistics, and/or econometrics are invited to apply. Applicants should have at least an upper second class degree and, ideally, an MSc in Computer Science, Statistics/Econometrics, Operations Research, or related areas. Please send a CV and addresses of 2 referees by March 20 to: Dr A-P. N. Refenes London Business School Regents Park, London NW1 4SA Tel: ++ 44 171 262 50 50 Fax: ++ 44 171 728 78 75 Application forms for the PhD programme can be obtained from and should be returned to: Dr Raymond Madden London Business School Regents Park, London NW1 4SA Tel: ++ 44 171 262 50 50 Fax: ++ 44 171 728 78 75 The Department ============ The Department of Decision Sciences of the London Business School is actively involved in innovative multi-disciplinary research on the application of new business modelling methodologies to individual and organisation decision-making. In seeking to extend the effectiveness of conventional methods of management science, statistical methods and decision support systems, with the latest generation of software platforms, artificial intelligence, neural networks, genetic algorithms and computationally intensive methods, the research themes of the department remain at the forefront of new practice. The Decision Science department of the London Business School is internationally known for its research in the areas of forecasting, optimisation, simulation and intelligent decision support methods. The Computational Finance Research Programme ==================================== The Computational Finance research programme at London Business School is the major centre in Europe for research into neural networks, non-parametric statistics and financial engineering. 
With funding from the DTI, the European Commission and a consortium of leading financial institutions, the research unit has attained a world-wide reputation for collaborative research. Doctoral students work in a team of highly motivated post-doctoral fellows, research fellows, doctoral students and faculty who are amongst Europe's leading authorities in the field.

From yx at pics91.cis.pku.edu.cn Wed Mar 4 02:55:25 1998 From: yx at pics91.cis.pku.edu.cn (Yu Xiang) Date: Wed, 4 Mar 1998 15:55:25 +0800 (CST) Subject: ICNN&B'98: Second Call for Papers Message-ID:

Final Announcement and Call for Papers

1998 International Conference on Neural Network and Brain (NN&B'98)
October 27-30, 1998, Beijing, China
http://www.cie-china.org

Sponsored by: China Neural Networks Council
Co-sponsored by: IEEE NNC, IEEE Beijing Section, IEEE NNC Beijing RIG, INNS-SIG
Supported by: National Natural Science Foundation of China

CALL FOR PAPERS

Over the past decade or so, neural networks have emerged as a research area with active involvement by researchers from a number of different disciplines, including cognitive science, computer science, engineering, mathematics, neurobiology, physics, and statistics. The aim of the International Conference on Neural Network and Brain (NN&B'98) is to provide an opportunity for interdisciplinary collaboration and exchange of ideas, which will lead us to address research issues in this area from different perspectives, and to promote its application in industry. This conference is intended to bring together researchers from different disciplines to review the current status of neural networks and understand higher brain functions. Submissions of papers related, but not limited, to the topics listed below are invited.
General Topics
Adaptive Filtering
Architectures
Associative Memory
Brain Functions
Cognitive Science
Cellular Neural Networks
Computational Intelligence
Data Analysis
Fuzzy Neural Systems
Genetic and Annealing Algorithms
Hybrid Systems
Image Signal Processing
Industrial Automation
Intelligent Control and Robotics
Learning and Memory
Machine Vision
Model Identification
Motion Analysis
Motion Vision
Neurobiology
Neurocognition
Neural Modeling
Neurosensors and Wavelets
Neurodynamics and Chaos
Optimization
Parameter Estimation
Pattern Recognition and Signal Processing
Prediction
Sensation and Perception
Sensorimotor Systems
Speech, Hearing, and Language
Supervised/Unsupervised Learning
System Identification
Time Series Analysis

Implementations
Hardware Implementation
Optical Implementation
Parallel and Distributed Computing Environment
Simulation

Application Areas
Business, Chemical Engineering, Communications, Economics and Finance, Industry, Manufacturing, Medicine, OCR, etc.

Submission
Authors are invited to submit 3 copies of a detailed synopsis of about 1000 words written in English, together with a cover sheet giving the author's title, name, affiliation, telephone and fax numbers, mailing address and e-mail. Synopsis and manuscript should be sent to:

Mr. Yu Xiang
Center for Information Science
Peking University
Beijing 100871
P. R. China
Tel: 86-10-6275 1937
Fax: 86-10-6275 5654
E-mail: yx at cis.pku.edu.cn

Important Dates
Submission of synopsis: March 31, 1998
Notification of acceptance: April 30, 1998
Submission of photo-ready accepted paper: June 30, 1998

GENERAL INFORMATION

Conference Language
The official language is English. No simultaneous translation is available.

Conference Schedule
October 27: Registration
October 28-30: Parallel sessions
October 31: One-day excursion to the Great Wall and Ming Tombs (tickets may be purchased at US$35)

Venue
The conference will be held at the Media Center, located at Fuxing Road, Beijing, China, approximately 3 km west of Tiananmen Square.
Accommodations
A block of rooms has been reserved for NN&B'98 participants at the Media Center, which is an attractive and modern hotel.
Single occupancy: US$40 per day
Double occupancy (2 beds): US$56 per day

Registration
The registration fee covers admission to the conference, reception, refreshments and a copy of the proceedings.

Registration Fee   by September 10, 1998   after September 10, 1998
Regular            US$360                  US$395
Student            US$275                  US$325

In case of cancellation, a fee of US$50 will be deducted from the refund. Cancellations should be made in writing to the NN&B'98 Secretariat by Oct. 15, 1998. No cancellation will be allowed after Oct. 15, 1998, but a copy of the conference proceedings will be mailed to you.

On-Site Registration
The registration desk at the Media Center will be open during the following hours for on-site registration:
October 27, 1998: 8:30-23:00
October 28, 1998: 7:30-9:00 AM

Banquet
There will be a banquet at 18:30 on October 30, 1998. Tickets are available at a cost of US$30.

Method of Payment
All payments including registration, accommodation and tours should be made in US dollars by bank transfer to:
Account number: 71404625
Account name: Chinese Institute of Electronics
Bank of China, Headquarters
Fuchengmen Road, Beijing, China
Attn: Ms. Fang Min, NN&B'98 Secretariat
Your name and "NN&B'98" must be stated on all your payments.

Weather
In October the temperature is around 10 C during the day and 2 C at night.

Visas
All travelers to China must have a valid visa. Visas may be obtained from the Chinese Consulate in most major cities. Conference registrants will be mailed an official invitation letter, to be used for the visa application, once their registration form is received.

Companions' Program
A companions' program will be available during the conference.

Airport Transportation
On October 27, 1998, the Conference staff will assist you at the Beijing airport in getting a taxi to the Media Center.
Please provide your flight information in the registration form for this purpose. The taxi fare from the airport to the Media Center is approximately US$20. Please find the NN&B'98 sign at the arrival hall of the airport.

POST-CONFERENCE TOURS
Two post-conference tours will be offered to the conference attendees and accompanying persons.

Tour A: November 1-8, Beijing-Xian-Guilin-Guangzhou (exiting via Hong Kong)
US$998 per person for double occupancy
US$1159 per person for single occupancy

Tour B: November 1-8, Beijing-Hangzhou-Shanghai-Suzhou-Guangzhou (exiting via Hong Kong)
US$989 per person for double occupancy
US$1139 per person for single occupancy

Note: The above fees include accommodations, meals, transportation between cities in China, and airport transfer. The organizer reserves the right to cancel any tour or offer new prices if a minimum of 10 persons is not reached.

FOR FURTHER INFORMATION

About papers and the program:
Mr. Yu Xiang
Center for Information Science
Peking University
Beijing 100871, China
Tel: 86-10-6275 1937
Fax: 86-10-6275 5654
E-mail: yx at cis.pku.edu.cn

About registration:
Ms. Min Fang
NN&B'98 Secretariat
P.O. Box 165, Beijing 100036, China
Tel: (8610) 6828 3463
Fax: (8610) 6828 3458
E-mail: shaz at sun.ihep.ac.cn

From jimmy at ecowar.demon.co.uk Wed Mar 4 05:59:50 1998 From: jimmy at ecowar.demon.co.uk (Jimmy Shadbolt) Date: Wed, 4 Mar 1998 10:59:50 +0000 Subject: research position at Econostat, Ltd. (UK) Message-ID:

RESEARCH POSITION OFFERED IN FINANCIAL MARKET PREDICTION
--------------------------------------------------------

Position: Quantitative Analyst

Environment: A small financial advisory company involved in predicting bonds and equities from a broad base of economic and financial indicators, using state-of-the-art statistical techniques (regression, neural networks, GAs, and others).
Job Description: Research and development of expected return models, based on in-house techniques and their expansion into further areas, such as Bayesian methods, wavelets, information measures and geometry.

Applications to: Jimmy Shadbolt, jimmy at ecowar.demon.co.uk

Start Date: Immediate

Qualifications
--------------
First degree in a numerate scientific discipline (maths, engineering, physics, statistics, etc).
PhD (or MSc) in econometrics, mathematical statistics, applied mathematics or another related field of study.
Strong interest in financial economics.

Training and experience
-----------------------
Programming in C/C++ and/or Splus.
User experience in PC (word processing and spreadsheet) and Unix environments.

Aptitude and Ability
--------------------
Good oral and writing skills.
Creative and problem-solving approach to research.

Personal Attributes
-------------------
Ability to work without close supervision as a member of a team.
Flexibility to meet changing opportunities in a dynamic research environment.

--
Jimmy Shadbolt
Econostat Ltd
Hennerton House
Wargrave
Berks RG10 8PD
United Kingdom

From jose at tractatus.rutgers.edu Fri Mar 6 12:16:59 1998 From: jose at tractatus.rutgers.edu (Stephen Hanson) Date: Fri, 6 Mar 1998 12:16:59 -0500 Subject: SYS ADM // COGSCI Message-ID: <199803061716.MAA00793@tractatus.rutgers.edu>

******* IMMEDIATE OPENING ******* 3/6/98

RUTGERS - Newark Campus - PSYCHOLOGY DEPARTMENT/COGNITIVE SCIENCE
Systems Administration/Cognitive Science Research

We are looking for an individual to do research in Cognitive Science and to help administer the computing resources of the Psychology Department at Rutgers University (Newark Campus). Resources include a network of Sun workstations, PCs and Macs, printers, a PC voice-mail system and various peripheral devices. The individual will be responsible for installing and debugging software, and various routine system administration activities.
At least half their time will be spent on research in Cognitive Science, especially related to Connectionist networks (or Neural Networks) and Computational Neuroscience. Familiarity with C programming, UNIX system internals (BSD, System V, Solaris, Linux), Windows (95, NT) and local area networks running TCP/IP is required. Image processing or graphics programming experience is a plus. Candidates should possess either a BS/MS in Computer Science, Cognitive Science, AI or other relevant fields, or equivalent experience. Salary will be dependent upon qualifications and experience. Rutgers University is an equal opportunity/affirmative action employer.

Please send resumes and references to:
Stephen J. Hanson
Department of Psychology
101 Warren Street
Rutgers University
Newark, New Jersey 07102

Direct email inquiries or resumes to: jose at psychology.rutgers.edu
Please indicate on the SUBJECT line: SYS ADM as a keyword.

From lba at inesc.pt Fri Mar 6 11:05:16 1998 From: lba at inesc.pt (Luis B. Almeida) Date: Fri, 06 Mar 1998 16:05:16 +0000 Subject: Paper available Message-ID: <35001EBC.5DB14D5D@inesc.pt>

The following paper is available for download:

Parameter Adaptation in Stochastic Optimization
Luis B. Almeida, Thibault Langlois, Jose D. Amaral and Alexander Plakhov

ABSTRACT

Optimization is an important operation in many domains of science and technology. Local optimization techniques typically employ some form of iterative procedure, based on derivatives of the function to be optimized (objective function). These techniques normally involve parameters that must be set by the user, often by trial and error. Those parameters can have a strong influence on the convergence speed of the optimization. In several cases, a significant speed advantage could be gained if one could vary these parameters during the optimization, to reflect the local characteristics of the function being optimized.
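The idea described here, varying a parameter (such as the step size) during optimization in response to the local behaviour of the objective, can be illustrated with a toy sketch. The adaptation rule below is a generic sign-agreement heuristic chosen only for illustration, not the method derived in this paper, and all names are invented:

```python
import random

def adaptive_gradient_descent(grad, w, step=0.1, steps=300, noise=0.01, seed=0):
    """Gradient descent whose step size is adapted on-line:
    increased while successive (noisy) gradients agree in sign,
    decreased when they disagree.  A delta-bar-delta-style
    heuristic, NOT the rule of Almeida et al."""
    rng = random.Random(seed)
    g_prev = 0.0
    for _ in range(steps):
        g = grad(w) + rng.gauss(0.0, noise)  # stochastic ("on-line") gradient
        if g * g_prev > 0:
            step *= 1.05   # consistent direction: accelerate
        elif g * g_prev < 0:
            step *= 0.5    # sign flip (overshoot): back off
        w -= step * g
        g_prev = g
    return w

# Minimize f(w) = (w - 3)^2, whose gradient is 2*(w - 3).
w_star = adaptive_gradient_descent(lambda w: 2.0 * (w - 3.0), w=0.0)
```

Near the minimum the noisy gradients change sign at random, so the step size shrinks on average and the iterate settles close to the optimum; far from it, agreeing gradient signs let the step size grow.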
Some parameter adaptation methods have been proposed for this purpose, for deterministic optimization situations. For stochastic (also called on-line) optimization situations, there appears to be no simple and effective parameter adaptation method. This paper proposes a new method for parameter adaptation in stochastic optimization. The method is applicable to a wide range of objective functions, as well as to a large set of local optimization techniques. We present the derivation of the method, details of its application to gradient descent and to some of its variants, and examples of its use in the gradient optimization of several functions, as well as in the training of a multilayer perceptron by on-line backpropagation.

The paper has 24 pages, and is available in compressed postscript form (162 kB) at ftp://146.193.2.131/pub/lba/papers/adsteps.ps.gz and in uncompressed postscript form (956 kB) at ftp://146.193.2.131/pub/lba/papers/adsteps.ps

Comments are welcome.

Luis B. Almeida
INESC                     Phone: +351-1-3100246, +351-1-3544607
R. Alves Redol, 9         Fax: +351-1-3145843
1000 Lisboa, Portugal     E-mail: lba at inesc.pt
http://ilusion.inesc.pt/~lba/lba.html
------------------------------------------------------------------------
*** Indonesia is killing innocent people in East Timor *** see http://amadeus.inesc.pt/~jota/Timor/

From jin at mail.utexas.edu Sun Mar 8 17:09:48 1998 From: jin at mail.utexas.edu (Hiroshi Jin) Date: Sun, 8 Mar 1998 16:09:48 -0600 Subject: No subject Message-ID:

To users of the tlearn neural network simulator:

Version 1.0.1 of the tlearn simulator is now available for ftp at two sites:
ftp://ftp.psych.ox.ac.uk/pub/tlearn/ (Old World users)
ftp://crl.ucsd.edu/pub/neuralnets/tlearn (New World users)

This version mostly involves bug fixes to the earlier version.
A complete user manual for the software plus a set of tutorial exercises is available in: Plunkett and Elman (1997), "Exercises in Rethinking Innateness: A Handbook for Connectionist Simulations", MIT Press.

For WWW access, the San Diego tlearn page is http://crl.ucsd.edu/innate/tlearn.html
This contains a link to the directory containing the binaries: ftp://crl.ucsd.edu/pub/neuralnets/tlearn
Then click on the filename(s) to download.

For direct ftp/fetch access via anonymous login:
- ftp/fetch to crl.ucsd.edu (132.239.63.1)
- login anonymous/email address
- cd pub/neuralnets/tlearn

At the Oxford site:
ftp://ftp.psych.ox.ac.uk/pub/tlearn/wintlrn1.0.1.zip is a zip archive that contains the Windows 95 tlearn executable.
ftp://ftp.psych.ox.ac.uk/pub/tlearn/wintlrn.zip is a link that always points to the latest version, in this case wintlrn1.0.1.zip.
The Mac version is in the following location: ftp://ftp.psych.ox.ac.uk/pub/tlearn/mac_tlearn_1.0.1.sea.hqx

From terry at salk.edu Mon Mar 9 17:59:02 1998 From: terry at salk.edu (Terry Sejnowski) Date: Mon, 9 Mar 1998 14:59:02 -0800 (PST) Subject: NEURAL COMPUTATION 10:3 Message-ID: <199803092259.OAA03105@helmholtz.salk.edu>

Neural Computation - Contents
Volume 10, Number 3 - April 1, 1998

ARTICLE
Towards a Biophysically Plausible Bidirectional Hebbian Rule
  Norberto M. Grzywacz and Pierre-Yves Burgi

NOTES
Axon Guidance: Stretching Gradients to the Limit
  Geoffrey Goodhill and Herwig Baier
Equivalence of a Sprouting-and-Retraction Model and Correlation-Based Plasticity Models of Neural Development
  Kenneth D. Miller
Axonal Processes and Neural Plasticity: A Reply
  T. Elliott, C. I. Howarth and N. R. Shadbolt

LETTERS
Synaptic Delay Learning in Pulse-Coupled Neurons
  Harold Huning, Helmut Glunder and Gunther Palm
Neural Processing in the Subsecond Time Range in the Temporal Cortex
  Kiyohiko Nakamura
Temporal-to-Rate-Code Conversion by Neuronal Phase-Locked Loops
  Ehud Ahissar
Deformation Theory of the Dynamic Link Matching
  Toru Aonishi and Koji Kurata
Constrained Optimization for Neural Map Formation: A Unifying Framework for Weight Growth and Normalization
  Laurenz Wiskott and Terrence J. Sejnowski
Breaking Rotational Symmetry in a Self-Organizing Map Model for Orientation Map Development
  M. Riesenhuber, H.-U. Bauer, D. Brockmann and T. Geisel
Nonlinear Time-Series Prediction with Missing and Noisy Data
  Volker Tresp and Reimar Hofmann
Issues in Bayesian Analysis of Neural Network Models
  Peter Muller and David Rios Insua

-----
ABSTRACTS - http://mitpress.mit.edu/NECO/

SUBSCRIPTIONS - 1998 - VOLUME 10 - 8 ISSUES

                  USA     Canada*   Other Countries
Student/Retired   $50     $53.50    $78
Individual        $82     $87.74    $110
Institution       $285    $304.95   $318
* includes 7% GST

(Back issues from Volumes 1-9 are regularly available for $28 each to institutions and $14 each to individuals. Add $5 for postage per issue outside the USA and Canada. Add 7% GST for Canada.)

MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902.
Tel: (617) 253-2889  Fax: (617) 258-6779  mitpress-orders at mit.edu
-----

From franz at homer.njit.edu Mon Mar 9 18:46:44 1998 From: franz at homer.njit.edu (Franz Kurfess) Date: Mon, 9 Mar 1998 18:46:44 -0500 Subject: CfP Special Issue "Neural Networks and Structured Knowledge" Message-ID: <199803092346.SAA21513@vector.njit.edu>

Special Issue "Neural Networks and Structured Knowledge"
in Applied Intelligence: The International Journal of Artificial Intelligence, Neural Networks, and Complex Problem-Solving Techniques

Call for Contributions

The submission of papers is invited for a special issue on "Neural Networks and Structured Knowledge" of the Applied Intelligence Journal.
Issue Theme

The representation and processing of knowledge in computers has traditionally been concentrated on symbol-oriented approaches, where knowledge items are associated with symbols. These symbols are grouped into structures, reflecting the important relationships between the knowledge items. Processing of knowledge then consists of manipulation of the symbolic structures, and the result of the manipulations can be interpreted by the user. Whereas this approach has seen some remarkable successes, there are also domains and problems where it does not seem adequate. Some of the problems are computational complexity, rigidity of the representation, the difficulty of reconciling the artificial model with the real world, the integration of learning into the model, and the treatment of incomplete or uncertain knowledge.

Neural networks, on the other hand, have advantages that make them good candidates for overcoming some of the above problems. Whereas approaches to using neural networks for the representation and processing of structured knowledge have been around for quite some time, especially in the area of connectionism, they frequently suffer from problems with expressiveness, knowledge acquisition, adaptivity and learning, or human interpretation. In recent years much progress has been made in the theoretical understanding and the construction of neural systems capable of representing and processing structured knowledge in an adequate way, while maintaining essential capabilities of neural networks such as learning, tolerance of noise, treatment of inconsistencies, and parallel operation.

The theme of this special issue comprises
* the investigation of the underlying theoretical foundations,
* the implementation and evaluation of methods for representation and processing of structured knowledge with neural networks, and
* applications of such approaches in various domains.

Topics of Interest

The list below gives some examples of intended topics.
* Concepts and Methods:
  o extraction, injection and refinement of structured knowledge from, into and by neural networks
  o inductive discovery/formation of structured knowledge
  o combining symbolic machine learning techniques with neural learning paradigms to improve performance
  o classification, recognition, prediction, matching and manipulation of structured information
  o neural methods that use or discover structural similarities
  o neural models to infer hierarchical categories
  o structuring of network architectures: methods for introducing coarse-grained structure into networks, unsupervised learning of internal modularity

* Application Areas:
  o medical and technical diagnosis: discovery and manipulation of structured dependencies, constraints, explanations
  o molecular biology and chemistry: prediction of molecular structure unfolding, classification of chemical structures, DNA analysis
  o automated reasoning: robust matching, manipulation of logical terms, proof plans, search space reduction
  o software engineering: quality testing, modularisation of software
  o geometrical and spatial reasoning: robotics, structured representation of objects in space, figure animation, layout of objects
  o other applications that use, generate or manipulate structures with neural methods: structures in music composition, legal reasoning, architectures, technical configuration, ...

The central theme of this issue will be the treatment of structured information using neural networks, independent of the particular network type or processing paradigm. Thus the theme is orthogonal to the question of connectionist/symbolic integration, and is not intended as a continuation of the more philosophically oriented discussion of symbolic vs. subsymbolic representation and processing.

Submission Process

Prospective authors should send an electronic mail message indicating their intent to submit a paper to the guest editor of the special issue, Franz J. Kurfess (kurfess at cis.njit.edu).
This message should contain a preliminary abstract and three to five keywords. Six hard copies of the final manuscript should be sent to the guest editor (not to the Applied Intelligence Editorial office): Prof. Franz J. Kurfess New Jersey Institute of Technology Phone: (973) 596 5767 Department of Computer and Information Science Fax: (973) 596 5777 University Heights Email: kurfess at cis.njit.edu Newark, NJ 07102-1982 WWW: http://www.cis.njit.edu/~franz To speed up the reviewing process, authors should also send a PostScript version of the paper via email to the guest editor. Prospective authors can find further information about the journal on the home page http://kapis.www.wkap.nl/journalhome.htm/0924-669X Schedule Paper submission deadline: May 1, 1998 Review decision by: July 31, 1998 Final manuscript due: August 31, 1998 Tentative publication date: November 1998 From lba at inesc.pt Tue Mar 10 13:18:21 1998 From: lba at inesc.pt (Luis B. Almeida) Date: Tue, 10 Mar 1998 18:18:21 +0000 Subject: new version of paper Message-ID: <350583ED.D61BBD84@inesc.pt> A new version of the paper Parameter Adaptation in Stochastic Optimization Luis B. Almeida, Thibault Langlois, Jose D. Amaral and Alexander Plakhov is available. In the new version, a few typos in equations have been corrected, following kind remarks from a reader. A few other details also were ironed out. The new version has replaced the old one at the ftp site, so you should use the same addresses to download it: Compressed postscript form (162 kB) at ftp://146.193.2.131/pub/lba/papers/adsteps.ps.gz and uncompressed postscript form (956 kB) at ftp://146.193.2.131/pub/lba/papers/adsteps.ps Comments are welcome. Luis B. Almeida Phone: +351-1-3100246,+351-1-3544607 INESC Fax: +351-1-3145843 R. 
Alves Redol, 9 E-mail: lba at inesc.pt 1000 Lisboa, Portugal http://ilusion.inesc.pt/~lba/lba.html
------------------------------------------------------------------------
*** Indonesia is killing innocent people in East Timor *** see http://amadeus.inesc.pt/~jota/Timor/

From P.N.Roper at lboro.ac.uk Tue Mar 10 21:21:58 1998 From: P.N.Roper at lboro.ac.uk (Peter Roper) Date: Wed, 11 Mar 1998 02:21:58 +0000 Subject: Postdoc available Message-ID: <3505F546.1142@lboro.ac.uk>

RESEARCH ASSOCIATE
Neurodynamical Model of Locomotion in a Simple Vertebrate
Department of Mathematical Sciences

A postdoctoral research associate is required to work for two years with Professor P C Bressloff and Dr S Coombes in the Nonlinear and Complex Systems Group, Department of Mathematical Sciences, Loughborough University, UK. The post is funded by EPSRC and will be available from 1 May 1998, or as soon as possible thereafter. The aim of the project is to develop a general dynamical theory of pulse-coupled oscillator networks, and to apply this to a neurobiological model of the swimming and struggling behaviour of the Xenopus tadpole. The work is in collaboration with Professor Alan Roberts, School of Biological Sciences, Bristol University.

The appointment will be on the Research Grade 1A salary scale £15,159 - £22,785 per annum, dependent on qualifications and experience.

More information about the Nonlinear and Complex Systems Group may be found at http://info.lboro.ac.uk/departments/ma/research/ncsg/index.html and specifically about the post at http://info.lboro.ac.uk/departments/ma/research/ncsg/job.html

Informal enquiries and requests for application forms should be addressed to Professor P C Bressloff, Department of Mathematical Sciences, Loughborough University, Loughborough, Leicestershire, LE11 3TU, UK. Tel: 01509-223188, fax: 01509-223969, email P.C.Bressloff at lboro.ac.uk. Please quote reference MA/188/W. Closing date: 13 April 1998.
--
______________________________________________________________
Peter Roper
Dept Mathematical Sciences
Loughborough University
LEICS LE11 3TU UK
Phone (+44) (0) 1509 228206 work
email P.N.Roper at lboro.ac.uk
http://www.lboro.ac.uk/departments/ma/pg/peterRoper.html
______________________________________________________________

From Johan.Suykens at esat.kuleuven.ac.be Wed Mar 11 08:41:24 1998 From: Johan.Suykens at esat.kuleuven.ac.be (Johan.Suykens@esat.kuleuven.ac.be) Date: Wed, 11 Mar 1998 14:41:24 +0100 Subject: International Workshop Message-ID: <199803111341.OAA00032@euler.esat.kuleuven.ac.be>

Second call for papers

International Workshop on
*** ADVANCED BLACK-BOX TECHNIQUES FOR NONLINEAR MODELING: THEORY AND APPLICATIONS ***
with !!! TIME-SERIES PREDICTION COMPETITION !!!

Date: July 8-10, 1998
Place: Katholieke Universiteit Leuven, Belgium
On-line Info: http://www.esat.kuleuven.ac.be/sista/workshop/

Organized at the Department of Electrical Engineering (ESAT-SISTA) and the Interdisciplinary Center for Neural Networks (ICNN) in the framework of the project KIT and the Belgian Interuniversity Attraction Pole IUAP P4/02. In cooperation with the IEEE Circuits and Systems Society.

* GENERAL SCOPE

The rapid growth of the field of neural networks, fuzzy systems and wavelets is offering a variety of new techniques for modeling nonlinear systems in the broad sense. These topics have been investigated from different points of view, including statistics, identification and control theory, approximation theory, signal processing, nonlinear dynamics, information theory, physics and optimization theory, among others. The aim of this workshop is to serve as an interdisciplinary forum for bringing together specialists in these research disciplines. Issues related to the fundamental theory as well as real-life applications will be addressed at the workshop.
* TIME-SERIES PREDICTION COMPETITION Within the framework of this workshop a time-series prediction competition will be held. The results of the competition will be announced during the workshop, where an award will be presented to the winner. Participants in the competition are asked to submit their predicted data together with a short description and references for the methods used. In order to stimulate wide participation in the competition, attendance at the workshop is not mandatory but is of course encouraged. All information about this contest is available at http://www.esat.kuleuven.ac.be/sista/workshop/ . * INVITED SPEAKERS (confirmed) L. Feldkamp (Ford Research, USA) - Extended Kalman filtering C. Micchelli (IBM T.J. Watson, USA) - Density estimation U. Parlitz (Gottingen, Germany) - Nonlinear time-series analysis J. Sjoberg (Goeteborg, Sweden) - Nonlinear system identification S. Tan (Beijing, China) - Wavelet-based system modeling V. Vapnik (AT&T Labs-Research, USA) - Support vector method of function estimation M. Vidyasagar (Bangalore, India) - Statistical learning theory V. Wertz (Louvain-la-Neuve, Belgium) - Fuzzy modeling A workshop book containing the invited talks will be published by Kluwer and will be available at the workshop.
* TOPICS include but are not limited to Nonlinear system identification Backpropagation Time series analysis Learning and nonlinear optimization Multilayer perceptrons Recursive algorithms Radial basis function networks Extended Kalman filtering Fuzzy modelling Embedding dimension Wavelets Subspace methods Piecewise linear models Identifiability Mixture of experts Model selection and validation Universal approximation Simulated annealing Recurrent networks Genetic algorithms Regularization Forecasting Bayesian estimation Frequency domain identification Density estimation Classification Information geometry Real-life applications Generalization Software * REGISTRATION Registration fee: 6500 BF for IEEE members and students, and 7500 BF for others (1 US Dollar is approximately 37.5 BF). It includes the workshop book, proceedings, lunches, dinners, refreshments/coffee. For the registration form and payment details, see http://www.esat.kuleuven.ac.be/sista/workshop/ . * HOTEL INFORMATION A block of rooms has been reserved at Begijnhof Congreshotel, New Damshire, Holiday Inn and Ibis. For hotel contact details, see http://www.esat.kuleuven.ac.be/sista/workshop/. * IMPORTANT DATES Deadline paper submission: April 2, 1998 Notification of acceptance: May 4, 1998 Workshop: July 8-10, 1998 Time-series competition: Deadline data submission: March 20, 1998 * Chairman: Johan Suykens Katholieke Universiteit Leuven Departement Elektrotechniek - ESAT/SISTA Kardinaal Mercierlaan 94 B-3001 Leuven (Heverlee), Belgium Tel: 32/16/32 18 02 Fax: 32/16/32 19 70 Email: Johan.Suykens at esat.kuleuven.ac.be Program Committee: B. De Moor, E. Deprettere, D. Roose, J. Schoukens, S. Tan, J. Vandewalle, V. Wertz, Y.
Yu From cdr at lobimo.rockefeller.edu Wed Mar 11 11:07:35 1998 From: cdr at lobimo.rockefeller.edu (George Reeke) Date: Wed, 11 Mar 1998 11:07:35 -0500 Subject: Postdoctoral Position Available Message-ID: <980311110735.ZM1252@grane.rockefeller.edu> POSTDOCTORAL ASSOCIATE Laboratory of Biological Modelling The Rockefeller University A postdoctoral position is available for an individual interested in collaborating with the Lab Head, Dr. George Reeke, to develop biologically realistic neural models for behaviors in which temporal interval and pattern recognition and production are main components. Current work in the Laboratory is aimed at applying theoretical principles of neuronal group selection ("neural Darwinism") to generate models for a variety of paradigmatic cases that can be tested by comparing results of computational simulations with data from psychophysical experiments. Applicants should have a Ph.D. in a relevant area of neurobiology or psychology and strong computer skills. Experience with realistic neural simulations or neural networks is desirable. Starting date is flexible. The position is available for 1-2 years depending on accomplishment. Send curriculum vitae, statement of interests, list of publications, and names of three references by regular mail, e-mail, or FAX to the undersigned. The Rockefeller University is an Equal Opportunity Employer. 
George Reeke Laboratory of Biological Modelling The Rockefeller University 1230 York Avenue New York, NY 10021 phone: (212)-327-7627 FAX: (212)-327-7469 email: reeke at lobimo.rockefeller.edu From sala at digame.dgcd.doc.ca Wed Mar 11 08:17:28 1998 From: sala at digame.dgcd.doc.ca (sala@digame.dgcd.doc.ca) Date: Wed, 11 Mar 1998 08:17:28 -0500 Subject: Research Position Message-ID: <199803111317.IAA10821@digame.doc.ca> Announcement of Research Position in Neural Network Research Neural Network Research Scientist Communications Research Center Ottawa, Ontario Essential Requirements Doctoral degree or equivalent from a recognized university in Physics, Electrical Engineering, or Computer Science. A working knowledge of English or French is required for this position. Experience in conducting independent research in the field of neural networks and in the application of neural networks to problems in pattern recognition and classification. An ability to communicate scientific knowledge effectively, both orally and in writing. Applicants may be required to undergo security clearance prior to hiring. Desirable Requirements A knowledge of neural network architectures and learning paradigms and their role in different types of pattern recognition tasks. A thorough knowledge of computer systems and operating systems (WindowsNT, Windows95, SunOS, and/or Solaris), experience with advanced computer software for the purposes of simulating and controlling neural network circuitry, experience with the design and testing of communications devices, experience with digital signal processing software, and a working knowledge of the MatLab programming environment. In addition, the successful candidate is expected to show a high degree of personal motivation, initiative, an ability to work as a member of a research team, and an ability to form and maintain working interpersonal relationships.
This position is currently funded for a period of three years with a strong possibility of an extension beyond that period. The successful candidate will be paid in accordance with the salary scales appropriate to the SE-RES-01 category (salary range $37,036 to $48,727 Cdn.) and will enjoy all the benefits associated with that position. The Communications Research Center is the premier communications research facility of Industry Canada and is situated on the outskirts of Ottawa at Shirley Bay. The CRC shares the research site with the Defense Research Establishment Ottawa and the Canadian Space Agency's David Florida Lab. The site is served by public transportation and is close to residential neighborhoods with ample rental properties. Please reply by: (1) e-mail with resume as attachment to research at digame.dgcd.crc.ca (2) fax resume and cover letter to 613-991-0246 (3) If preferred, you may submit your resume document directly by anonymous FTP to digame.dgcd.crc.ca, placing the document in the folder "resume". PLEASE BE SURE to send a brief, accompanying e-mail to research at digame.dgcd.crc.ca indicating that you have uploaded your resume, your name, and the exact name of the file deposited on digame.dgcd.crc.ca. From zhangw at redwood.rt.cs.boeing.com Wed Mar 11 19:01:28 1998 From: zhangw at redwood.rt.cs.boeing.com (Wei Zhang) Date: Wed, 11 Mar 1998 16:01:28 -0800 Subject: dissertation available Message-ID: <199803120001.QAA12301@darwin.network-b> Dear colleagues, I have finally decided to make my dissertation available on the Internet. It is in the neuroprose archive at ftp://archive.cis.ohio-state.edu/pub/neuroprose/Thesis/zhang.rl4jss.ps.Z Here are the title and abstract. Thanks. Reinforcement Learning for Job-Shop Scheduling Wei Zhang Oregon State University Department of Computer Science May 1996 173 pages, double-sided. Abstract.
This dissertation studies the application of reinforcement learning algorithms to the automatic discovery of good domain-specific heuristics for job-shop scheduling. It focuses on the NASA space shuttle payload processing problem. The problem involves scheduling a set of tasks to satisfy a set of temporal and resource constraints while also seeking to minimize the total length (makespan) of the schedule. The approach described in the dissertation employs a repair-based scheduling problem space that starts with a critical-path schedule and incrementally repairs constraint violations with the goal of finding a short conflict-free schedule. The temporal difference (TD) learning algorithm $TD(\lambda)$ is applied to train a neural network to learn a heuristic evaluation function for choosing repair actions over schedules. This learned evaluation function is used by a one-step lookahead search procedure to find solutions to new scheduling problems. Several important issues that affect the success and the efficiency of learning have been identified and studied in depth. These issues include schedule representation, network architectures, and learning strategies. A number of modifications to the $TD(\lambda)$ algorithm are developed to improve learning performance. Learning is investigated based on both hand-engineered features and raw features. For learning from raw features, a time-delay neural network architecture is developed to extract features from irregular-length schedules. The learning approach is evaluated on synthetic problems and on problems from a NASA space shuttle payload processing task. The evaluation function is learned on small problems and then applied to solve larger problems. Both learning-based schedulers (using hand-engineered features and raw features respectively) perform better than the best existing algorithm for this task---Zweben's iterative repair method. It is important to understand why TD learning works in this application.
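For readers unfamiliar with the method, the core $TD(\lambda)$ update used to train such an evaluation function can be sketched as follows. This is a minimal, generic sketch with a linear value function; the function name, feature encoding, and parameter values are illustrative assumptions, not the settings used in the dissertation (which trains a neural network).

```python
def td_lambda_episode(features, rewards, w, alpha=0.01, gamma=1.0, lam=0.7):
    """One episode of TD(lambda) with a linear value function.

    features : list of feature vectors, one per visited state
    rewards  : list of rewards, rewards[t] received on leaving state t
    w        : weight vector defining the value estimate V(s) = w . phi(s)
    Returns the updated weight vector.
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    z = [0.0] * len(w)                       # eligibility trace
    for t in range(len(rewards)):
        v_t = dot(w, features[t])
        v_next = dot(w, features[t + 1]) if t + 1 < len(features) else 0.0
        delta = rewards[t] + gamma * v_next - v_t          # TD error
        z = [gamma * lam * zi + fi for zi, fi in zip(z, features[t])]
        w = [wi + alpha * delta * zi for wi, zi in zip(w, z)]
    return w
```

In the scheduling setting described above, each "state" would be a (partially repaired) schedule and the learned value estimate would serve as the heuristic scoring function for the one-step lookahead search.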
Several performance measures are employed to investigate learning behavior. We verified that TD learning works properly in capturing the evaluation function. It is concluded that TD learning along with a set of good features and a proper neural network is the key to this success. The success shows that reinforcement learning methods have the potential for quickly finding high-quality solutions to other combinatorial optimization problems. #=====================================================================# | Dr. Wei Zhang | ___ ___ ___ ___ ___ | | Computer Scientist | /__// //__ / /\ // _ | | Adaptive Systems | /__//__//__ _/_ / //__/ | | Applied Research & Technology | | | | P.O. Box 3707, M/S 7L-66 | | Voice: (425) 865-2602 | Seattle, WA 98124-2207 | | FAX: (425) 865-2964 | -- or for ground express mail -- | | | 2710 160th Ave. S.E., Bldg. 33-07 | | zhangw at redwood.rt.cs.boeing.com | Bellevue, WA 98008 | #=====================================================================# From recruit at phz.com Thu Mar 12 13:17:31 1998 From: recruit at phz.com (PHZ Recruiting) Date: Thu, 12 Mar 98 13:17:31 EST Subject: financial modeling job available in Boston area Message-ID: <9803121817.AA11786@phz.com> Applied research position available immediately in QUANTITATIVE MODELING OF FINANCIAL MARKETS at PHZ CAPITAL PARTNERS LP March, 1998 PHZ is a small Boston area company founded in 1993 which invests client money in global financial markets using state-of-the-art proprietary statistical models. Our principals are Tomaso Poggio, Jim Hutchinson, and Xiru Zhang, and our partners include Commodities Corporation LLC, a Goldman Sachs company. PHZ's strong trading performance to date has led to exceptional client interest and asset growth. To further expand our business, PHZ is now looking for a talented, hard working person to join our research and development team to work on our next generation of trading systems. The successful applicant for this position will have a Ph.D. 
in computer science, math, finance, or a related field, or 4-5 years of work experience in an applied research setting. Experience with machine learning / advanced statistical modeling techniques (e.g. neural networks, genetic algorithms, etc) and their application to real world numerical data sets is required, as are strong software engineering skills (esp. on PCs and Unix). Knowledge of financial markets is also a plus. Depending on candidate interests and skills, this position will involve or lead into basic research and application of sophisticated model development tools, exploratory data gathering and analysis, development of our trading and risk management software platform, and/or trading and monitoring of live models. The growth potential of this position is large, both in terms of responsibilities and compensation. Initial compensation will be competitive based on qualifications, possibly including partnership equity. Interested applicants should email resumes (ascii format) to recruiting at phz.com, or send by US mail to: Attn: Recruiting PHZ Capital Partners LP 111 Speen St, Suite 313 Framingham, MA 01701 USA From lbl at nagoya.riken.go.jp Fri Mar 13 02:49:43 1998 From: lbl at nagoya.riken.go.jp (Bao-Liang Lu) Date: Fri, 13 Mar 1998 16:49:43 +0900 Subject: TR available: Task Decomposition and Module Combination Message-ID: <9803130749.AA13600@xian> The following Technical Report is available via anonymous FTP. 
FTP-host:ftp.bmc.riken.go.jp FTP-file:pub/publish/Lu/lu-ieee-tnn-98-2nd-rev.ps.gz ========================================================================== TITLE: Task Decomposition and Module Combination Based on Class Relations: A Modular Neural Network for Pattern Classification BMC Technical Report BMC-TR-98-1 AUTHORS: Bao-Liang Lu and Masami Ito ORGANISATIONS: Bio-Mimetic Control Research Center, The Institute of Physical and Chemical Research (RIKEN) ABSTRACT: In this paper, we propose a new method for decomposing pattern classification problems based on the class relations among training data. By using this method, we can divide a $K$-class classification problem into a series of ${K\choose 2}$ two-class problems. These two-class problems are to discriminate class ${\cal C}_{i}$ from class ${\cal C}_{j}$ for $i=1,\, \cdots,\, K-1$ and $j=i+1,\, \cdots,\, K$, while the existence of the training data belonging to the other $K-2$ classes is ignored. If the two-class problem of discriminating class ${\cal C}_{i}$ from class ${\cal C}_{j}$ is still hard to learn, we can further break it down into a set of two-class subproblems as small as we desire. Since each of the two-class problems can be treated as a completely separate classification problem with the proposed learning paradigm, the two-class problems can be learned by different network modules in parallel. We also propose two module combination principles which give practical guidelines for integrating individual trained modules. After each of the two-class problems has been learned by a network module, we can easily integrate all of the trained modules into a min-max modular (${\rm M}^{3}$) network according to the module combination principles and obtain a solution to the original problem. Consequently, a large-scale and complex $K$-class classification problem can be solved effortlessly and efficiently by learning a series of smaller and simpler two-class problems in parallel.
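The min-max combination step described in the abstract can be illustrated with a small sketch. The function name and the pairwise-output convention below (one scalar output per ordered class pair, high value favouring the first class) are assumptions for illustration following the abstract's description, not the report's exact formulation.

```python
def m3_combine(module_outputs, K):
    """Min-max combination of pairwise two-class modules.

    module_outputs[(i, j)] : output of the module trained to discriminate
    class i from class j on the current input (high value favours class i).
    For each class i, take the MIN over its pairwise modules (class i must
    beat every rival), then pick the class with the MAX combined score.
    """
    scores = []
    for i in range(K):
        scores.append(min(module_outputs[(i, j)] for j in range(K) if j != i))
    return max(range(K), key=lambda i: scores[i])
```

With trained modules in place of the lookup table, this is the sense in which the $K$-class decision is recovered from the ${K\choose 2}$ independently learned two-class problems.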
(38 pages, 4.8 MB) Any comments are appreciated. Bao-Liang Lu ====================================== Bio-Mimetic Control Research Center The Institute of Physical and Chemical Research (RIKEN) 2271-130, Anagahora, Shimoshidami, Moriyama-ku Nagoya 463-0003, Japan Tel: +81-52-736-5870 Fax: +81-52-736-5871 Email: lbl at bmc.riken.go.jp From blair at csee.uq.edu.au Fri Mar 13 03:26:01 1998 From: blair at csee.uq.edu.au (Alan Blair) Date: Fri, 13 Mar 1998 18:26:01 +1000 (EST) Subject: CFP - Simulated Evolution And Learning (SEAL'98) Message-ID: *********************************** * C A L L F O R P A P E R S * *********************************** The Second Asia-Pacific Conference on Simulated Evolution And Learning (SEAL'98) Canberra, Australia 24-27 November 1998 Hosted by School of Computer Science University College, the University of New South Wales Australian Defence Force Academy, Canberra, ACT 2600, Australia in cooperation with COMPLEX SYSTEMS '98, Sydney, Australia, Nov 30 - Dec 4, 1998 IEEE ACT Section Conference URL: http://www.cs.adfa.oz.au/conference/seal98 Aims and Scope --------------- Evolution and learning are two fundamental forms of adaptation. This conference follows the successful SEAL'96 (The First Asia-Pacific Conference on Simulated Evolution And Learning) in Taejon, Korea, 9-12 November 1996, and aims at exploring these two forms of adaptation and their roles and interactions in adaptive systems. Cross-fertilisation between evolutionary learning and other machine learning approaches, such as neural network learning, reinforcement learning, decision tree learning, fuzzy system learning, etc., will be strongly encouraged by the conference. The other major theme of the conference is optimisation by evolutionary approaches or hybrid evolutionary approaches. The conference will feature both academic and application streams in order to encourage more interactions between researchers and practitioners in the field of simulated evolution and learning.
To provide timely feedback, the applications stream will use a two-stage reviewing process featuring preliminary acceptance by abstract and final acceptance by complete paper. The refereeing panel for the applications stream will consist of eminent practitioners in the field. The topics of interest to this conference include but are not limited to the following: 1. Evolutionary Learning + Fundamental Issues in Evolutionary Learning (e.g., Generalisation, Scalability, and Computational Complexity) + Co-Evolutionary Learning + Modular Evolutionary Learning Systems + Classifier Systems + Artificial Immune Systems + Representation Issue in Evolutionary Learning (e.g., rules, trees, graphs, etc.) + Interactions Between Learning and Evolution + Comparison between Evolutionary Learning and Other Learning Approaches (Neural Network, Decision Tree, Reinforcement Learning, etc.) 2. Evolutionary Optimisation + Global (Numerical/Function) Optimisation + Combinatorial Optimisation (e.g., scheduling, allocation, planning, packing, transportation, and various tree/graph problems.) + Comparison between evolutionary and non-evolutionary optimisation algorithms + Hybrid Optimisation Algorithms 3. Hybrid Learning + Evolutionary Artificial Neural Networks + Evolutionary Fuzzy Systems + Combinations Between Evolutionary and Reinforcement Learning + Combinations Between Evolutionary and Decision Tree Learning + Evolutionary Clustering and Unsupervised Learning + Genetic Programming + Other Hybrid Learning Systems 4. Adaptive Systems + Complexity in Adaptive Systems + Evolutionary Robotics + Artificial Ecology 5. Evolutionary Economics and Games + Analysis and Simulation in Evolutionary Economics, Finance and Marketing + Evolutionary Games + Evolutionary Computation Techniques in Economics, Finance and Marketing 6. 
Theoretical Issues in Evolutionary Computation + Convergence and Convergence Rate of Evolutionary Algorithms + Computational Complexity of Evolutionary Algorithms + Self-Adaptation in Evolutionary Algorithms 7. Evolvable Hardware (EHW) + FPGA Implementation of EHW + Algorithms for EHW + EHW Systems and Chips 8. Applications + Novel Applications of Evolutionary Techniques + Optimisation Algorithms for Real-World Problems + Solving classical OR (Operations Research) problems by Evolutionary Algorithms + Pattern Classification and Recognition + Time-Series Prediction + System Identification + Others Important Dates --------------- Regular Papers 03 July 1998 Deadline for submission of papers (<=8 pages) 21 Aug. 1998 Notification of acceptance 25 Sep. 1998 Deadline for camera-ready copies of accepted papers 24-27 Nov. 1998 Conference sessions Applications Papers 19 June 1998 Deadline for submission of applications abstracts (<= 1 page) (abstracts to be submitted by email) 03 July 1998 Notification of preliminary acceptance of applications papers 31 July 1998 Deadline for submission of applications papers (<=8 pages) 21 Aug. 1998 Notification of acceptance of applications papers 25 Sep. 1998 Deadline for camera-ready copies of applications papers 24-27 Nov. 1998 Conference sessions Paper Submission ---------------- Applications paper abstracts should be submitted by email before the cutoff date to SEAL98 at cs.adfa.oz.au. FOUR (4) hard copies of the completed paper should be submitted to the programme committee chair by the due date. All manuscripts should be prepared in LaTeX according to Springer-Verlag's llncs style (URL: gopher://trick.ntp.springer.de/11/tex/latex/llncs). Each submitted paper must include a title, a 300-400 word abstract, a list of keywords, the names and addresses of all authors (including email addresses, and telephone and fax numbers), and the body. 
The length of submitted papers must be no more than 8 single-spaced, single-column pages including all figures, tables, and bibliography. Shorter papers are strongly encouraged. Papers should be submitted to the following address: Dr Xin Yao School of Computer Science University College, UNSW Australian Defence Force Academy Canberra, ACT 2600, Australia Publications ------------ All accepted papers which are presented at the conference will be included in the conference proceedings. Outstanding papers will be considered for inclusion in a proposed volume of Springer Verlag's Lecture Notes in Artificial Intelligence (LNAI). The authors of some of these papers will be invited to further expand and revise their papers for inclusion in an international journal. (Selections from SEAL'96 papers were published by Springer-Verlag as Volume 1285 of LNAI.) Special Sessions and Tutorials ------------------------------ Special sessions and tutorials will be organised at the conference. The conference is calling for special session and tutorial proposals.
Contact Persons --------------- Conference General Chair: Professor Charles Newton School of Computer Science University College, UNSW, ADFA Canberra, ACT 2600, Australia Organising Committee Chair: Dr Bob McKay School of Computer Science University College, UNSW, ADFA Canberra, ACT 2600, Australia Programme Committee Co-Chair: Dr Xin Yao School of Computer Science University College, UNSW, ADFA Canberra, ACT 2600, Australia Programme Committee Co-Chair: Professor Jong-Hwan Kim Department of Electrical Engineering KAIST 373-1, Kusung-dong, Yusung-gu, Taejon-shi 305-701, Republic of Korea Email: johkim at vivaldi.kaist.ac.kr Programme Committee Co-Chair: Professor Takeshi Furuhashi Department of Information Electronics Nagoya University Furo-cho, Chikusa-ku, Nagoya 464-01 Japan Email: furuhashi at nuee.nagoya-u.ac.jp Special Sessions Chair: Professor Kit Po Wong Department of Electrical Engineering The University of Western Australia Nedlands, WA 6009, Australia Email: kitpo at ee.uwa.edu.au Conference Secretary: Miss Alison McMaster School of Computer Science University College, UNSW, ADFA Canberra, ACT 2600, Australia Email: SEAL98 at cs.adfa.oz.au Phone: +61 2 6268 8184, Fax: +61 2 6268 8581 From barba at cvs.rochester.edu Mon Mar 16 12:40:13 1998 From: barba at cvs.rochester.edu (Barbara Arnold) Date: Mon, 16 Mar 1998 12:40:13 -0500 Subject: 21st CVS Symposium Message-ID: 21st CVS Symposium "Environmental Structure, Statistical Learning & Visual Perception" June 4 - 6, 1998 CENTER FOR VISUAL SCIENCE University of Rochester Rochester, NY The Center for Visual Science at the University of Rochester is proud to present the 21st Symposium, "Environmental Structure, Statistical Learning and Visual Perception". The three-day symposium will consist of five sessions plus an open house and lab tours on Saturday afternoon. The meeting will begin with a Reception/Buffet on Wednesday evening, June 3. 
Formal sessions start Thursday morning, June 4, and end at noon on Saturday. There will be optional banquets held on Thursday and Friday evenings, and a cookout lunch on Saturday. Informal discussion gatherings will follow the banquets. PROGRAM Wednesday, June 3 4:00-10:00 PM Registration 6:00-8:00 PM Reception/Buffet Thursday, June 4 SESSION I: Image Statistics E Simoncelli, New York University C Chubb, University of CA Irvine D Ruderman, The Salk Institute SESSION II: Color Constancy D. Brainard, Univ of CA Santa Barbara S Shevell, University of Chicago A Hurlbert, Univ of Newcastle, England Friday, June 5 SESSION III:Surface Perception T Adelson, MIT L Maloney, New York University Zili Liu, NEC Research Institute SESSION IV: Object Perception D Knill , University of Pennsylvania K Nakayama, Harvard University P Kellman, University of CA Los Angeles Saturday, June 6 SESSION V: Neural Coding and Plasticity W Geisler, University of Texas Austin N Logothetis, Max-Planck Institute SESSION VI: OPEN HOUSE Center for Visual Science Open House and Lab Tours REGISTRATION FEES Preregistration, Regular $125.00 Preregistration, Student $ 95.00 On-site, Regular $180.00 On-site, Student $130.00 To preregister, please return the form posted on our website http://www.cvs.rochester.edu/symposium/propsymposia98.html Please send a separate form for each person registering. No preregistrations will be accepted after May 15. If you do not have access to our website please contact Barbara Arnold at barba at cvs.rochester.edu or 716-275-8659 ACCOMMODATIONS AND MEALS The University has moderate cost rooms available for symposium attendees. Residence halls are centrally located on the campus and are a short walk to Hoyt Hall where the symposium sessions will be held. A special package of residence hall room and all meals and banquets is being offered to Symposium participants. This package includes all meals from Thursday breakfast through the Saturday barbecue. 
TRAVEL AWARDS A small number of travel awards are available to graduate and postdoctoral students. Applications for travel assistance must be received by May 1, 1998. Please refer to the travel award application form, posted on our website, for more information. http://www.cvs.rochester.edu/symposium/propsymposia98.html If you do not have access to our website please contact Barbara Arnold at barba at cvs.rochester.edu or 716-275-8659 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Barbara Arnold email: barba at cvs.rochester.edu Center for Visual Science phone: 716 275 8659 Room 274 Meliora Hall fax: 716 271 3043 University of Rochester Rochester NY 14627-0270 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ From aperez at lslsun.epfl.ch Tue Mar 17 08:15:59 1998 From: aperez at lslsun.epfl.ch (Andres Perez-Uribe) Date: Tue, 17 Mar 1998 14:15:59 +0100 Subject: Java Applet:Black Jack and Reinforcement Learning Message-ID: <350E778E.136C53F9@lslsun.epfl.ch> Dear Connectionists, This is to announce a Java applet that implements a simplified version of the game of Black Jack. One or two players can play against the dealer (i.e., the casino). Either or both players can be controlled by the computer. By default, the computer plays in a random manner. However, you may let it play against the dealer and learn to play Black Jack from experience. The learning algorithm it uses is SARSA, a reinforcement learning algorithm introduced by G. Rummery and M. Niranjan. URL : http://lslwww.epfl.ch/~aperez/BlackJack/ For further information on reinforcement learning and Black Jack playing, you may refer to the www page "Learning to Play Black Jack with Artificial Neural Networks" : URL : http://lslwww.epfl.ch/~aperez/rlbj.html also at the Logic Systems Laboratory, Swiss Federal Institute of Technology-Lausanne.
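For reference, the on-policy TD update that SARSA performs (the rule the applet is said to use) can be sketched as follows. The state encoding, function names, and parameter values below are illustrative assumptions, not taken from the applet; the applet itself uses a neural network rather than the tabular form shown here.

```python
import random

def sarsa_update(Q, s, a, r, s_next, a_next, alpha=0.1, gamma=1.0):
    """SARSA: update Q(s, a) toward r + gamma * Q(s', a'), where a' is the
    action actually chosen in the next state (hence "on-policy")."""
    target = r + gamma * Q.get((s_next, a_next), 0.0)
    Q[(s, a)] = Q.get((s, a), 0.0) + alpha * (target - Q.get((s, a), 0.0))

def epsilon_greedy(Q, s, actions, eps=0.1):
    """Mostly pick the highest-valued action; explore with probability eps."""
    if random.random() < eps:
        return random.choice(actions)
    return max(actions, key=lambda a: Q.get((s, a), 0.0))
```

In the Black Jack setting, a state might encode the player's hand total and the dealer's visible card, the actions would be "hit" and "stand", and the reward would be the outcome of the hand.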
Best regards, -- Andres PEREZ-URIBE Logic Systems Laboratory Computer Science Department Swiss Federal Institute of Technology-Lausanne 1015 Lausanne, Switzerland Email: aperez at lslsun.epfl.ch http://lslwww.epfl.ch/~aperez Tel: +41-21-693-2652 Fax: +41-21-693 3705 From trevor.clarkson at kcl.ac.uk Wed Mar 18 09:24:27 1998 From: trevor.clarkson at kcl.ac.uk (Trevor Clarkson) Date: Wed, 18 Mar 1998 14:24:27 +0000 Subject: 4-month RA post at King's College Message-ID: <1.5.4.16.19980318142427.297f250c@mail.kcl.ac.uk> RESEARCH ASSISTANT (NEURAL NETWORKS) from June - September 1998. A post is available for 4 months, starting in June 1998, for a Research Assistant at point 9 on the scale, at King's College London. The post is intended for a post-doctoral researcher. A feasibility study will be carried out with a customer in Cambridge to develop a neural network system to provide accurate ink-drop placement in an ink-jet printer. Some travel to Cambridge will be required and this will be fully funded. Experience in neural networks, real-time systems or neural hardware will be required. The ability to work to a tight schedule is essential. Owing to the imminent start date, applicants should preferably be EU citizens. For further details please contact Professor Trevor Clarkson at King's College London (tgc at kcl.ac.uk). (Please pass on this message to potential candidates. Thank you.)
From harnad at coglit.soton.ac.uk Wed Mar 18 11:09:25 1998 From: harnad at coglit.soton.ac.uk (Stevan Harnad) Date: Wed, 18 Mar 1998 16:09:25 +0000 (GMT) Subject: Words in the Brain's Language: BBS Call for Commentators Message-ID: Below is the abstract of a forthcoming BBS target article on: WORDS IN THE BRAIN'S LANGUAGE by Friedemann Pulvermueller This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please send EMAIL to: bbs at cogsci.soton.ac.uk or write to: Behavioral and Brain Sciences Department of Psychology University of Southampton Highfield, Southampton SO17 1BJ UNITED KINGDOM http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/ ftp://ftp.princeton.edu/pub/harnad/BBS/ ftp://ftp.cogsci.soton.ac.uk/pub/bbs/ gopher://gopher.princeton.edu:70/11/.libraries/.pujournals If you are not a BBS Associate, please send your CV and the name of a BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work. All past BBS authors, referees and commentators are eligible to become BBS Associates. To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. An electronic draft of the full text is available for inspection with a WWW browser, anonymous ftp or gopher according to the instructions that follow after the abstract. 
___________________________________________________________________ WORDS IN THE BRAIN'S LANGUAGE Friedemann Pulvermueller Fachgruppe Psychologie Universitaet Konstanz 78434 Konstanz Germany pumue at uni-tuebingen.de KEYWORDS: associative learning, cell assembly, cognition, cortex, language, word category ABSTRACT: If the cortex is an associative memory, strongly connected cell assemblies will form when neurons in different cortical areas are frequently active at the same time. The cortical distributions of these assemblies must be a consequence of where in the cortex correlated neuronal activity occurred during learning. An assembly can be considered a functional unit exhibiting activity states such as full activation (ignition) after appropriate sensory stimulation (possibly related to perception) and continuous reverberation of excitation within the assembly (a putative memory process). This has implications for cortical topographies and activity dynamics of cell assemblies representing words. Cortical topographies of assemblies should be related to aspects of the meaning of the words they represent, and physiological signs of cell assembly ignition should be followed by possible indicators of reverberation. The following postulates are discussed in detail: (1) assemblies representing phonological word forms are strongly lateralized and distributed over perisylvian cortices; (2) assemblies representing highly abstract words, such as grammatical function words, are also strongly lateralized and restricted to these perisylvian regions; (3) assemblies representing concrete content words include additional neurons in both hemispheres; (4) assemblies representing words referring to visual stimuli include neurons in visual cortices; (5) assemblies representing words referring to actions include neurons in motor cortices. 
Two main sources of evidence are used for evaluating these proposals: (a) imaging studies aiming at localizing word processing in the brain, based on stimulus-triggered event-related potentials (ERP), positron emission tomography (PET) and functional magnetic resonance imaging (fMRI), and (b) studies of the temporal dynamics of fast activity changes in the brain, as revealed by high-frequency responses recorded in the electroencephalogram (EEG) and magnetoencephalogram (MEG). These data provide evidence for processing differences between words and matched meaningless pseudowords, and between word classes such as concrete content and abstract function words, and words evoking visual or motor associations. There is evidence for early word class-specific spreading of neuronal activity and for equally specific high-frequency responses occurring later. These results support a neurobiological model of language in the Hebbian tradition. Competing large-scale neuronal theories of language are discussed in the light of the summarized data. A final paragraph addresses neurobiological perspectives on the problem of serial order of words in syntactic strings. -------------------------------------------------------------- To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the World Wide Web or by anonymous ftp or gopher from the US or UK BBS Archive. Ftp instructions follow below. Please do not prepare a commentary on this draft. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. 
The URLs you can use to get to the BBS Archive: http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/Archive/bbs.pulvermueller.html ftp://ftp.princeton.edu/pub/harnad/BBS/bbs.pulvermueller ftp://ftp.cogsci.soton.ac.uk/pub/bbs/Archive/bbs.pulvermueller gopher://gopher.princeton.edu:70/11/.libraries/.pujournals To retrieve a file by ftp from an Internet site, type either: ftp ftp.princeton.edu or ftp 128.112.128.1 When you are asked for your login, type: anonymous Enter password as queried (your password is your actual userid: yourlogin at yourhost.whatever.whatever - be sure to include the "@") cd /pub/harnad/BBS To show the available files, type: ls Next, retrieve the file you want with (for example): get bbs.pulvermueller When you have the file(s) you want, type: quit From ssingh at soc.plym.ac.uk Thu Mar 19 14:13:18 1998 From: ssingh at soc.plym.ac.uk (Sameer Singh) Date: Thu, 19 Mar 1998 19:13:18 +0000 Subject: PhD studentship Message-ID: <199803191909.TAA20911@hebe.soc.plym.ac.uk> UNIVERSITY OF PLYMOUTH, UK SCHOOL OF COMPUTING PhD Studentship in Financial Forecasting using Neural Networks Applications are invited for a PhD studentship in the area of financial forecasting using neural networks. The studentship only covers the UK/EEC fees and maintenance. The applicant should have a good background in programming with C/C++ and ideally should have experience with neural networks. Postgraduate students with an MSc degree are particularly encouraged to apply. Applications can be made by e-mailing your CV to Dr. Sameer Singh at the address given below. For further information, Dr. Singh may be emailed at s1singh at plym.ac.uk.
Deadline for applications: 1 April, 1998 ___________________________________________________ School of Computing University of Plymouth Kirkby Place Plymouth PL4 8AA UK Tel: +44-1752-232612 Fax: +44-1752-232540 e-mail: s1singh at plym.ac.uk/ ssingh at soc.plym.ac.uk web: http://www.soc.plym.ac.uk/soc/sameer __________________________________________________ From rich at cs.umass.edu Thu Mar 19 14:35:12 1998 From: rich at cs.umass.edu (Rich Sutton) Date: Thu, 19 Mar 1998 14:35:12 -0500 Subject: New Textbook on Reinforcement Learning Message-ID: Dear Colleagues, This note is to announce the availability of a new textbook on reinforcement learning by Andy Barto and me. As many of you know, we have been working on this book for over four years. A few weeks ago we received our authors' copies, and the book is now available by internet/mail order and in bookstores: Sutton, R.S., Barto, A.G. (1998) Reinforcement Learning: An Introduction. MIT Press, Cambridge, MA. The rest of this note says a little more about the book and points to further information. As its title indicates, the book is meant to be an introductory treatment of reinforcement learning, emphasizing foundations and ideas rather than the latest developments and mathematical proofs. We divide the ideas underlying the field into a half dozen primary dimensions, consider each in detail, and then combine them to form a much larger space of possible methods including all the most popular ones from Q-learning to value iteration and heuristic search. In this way we have tried to make the book interesting to both newcomers and experts alike. We have tried to make the work accessible to the broadest possible audiences in artificial intelligence, control engineering, operations research, psychology, and neuroscience. If you are a teacher, we urge you to consider creating or altering a course to use the book. 
We have found that the book works very well as the basis for an independent course on reinforcement learning at the graduate or advanced undergraduate level. The eleven chapters can be covered one per week. Exercises are provided in each chapter to help the students think on their own about the material. Answers to the exercises are available to instructors, for now from me, and probably later from MIT Press in an instructor's manual. Programming projects are also suggested throughout the book. Of course, the book can also be used to help teach reinforcement learning as it is most commonly done now, that is, as part of a broader course on machine learning, artificial intelligence, neural networks, or advanced control. I have taught all the material in the book in as little as four weeks, and of course subsets can be covered in less time. Finally, if you are interested in reviewing the book for a major journal or magazine, please contact our MIT Press publicist, Gita Manaktala (manak at mit.edu or 617-253-5643), directly. Further information about the book, including ordering information and detailed information about its contents, can be obtained from its home page at http://www.cs.umass.edu/~rich/book/the-book.html. Rich Sutton rich at cs.umass.edu From nic at idsia.ch Fri Mar 20 06:15:22 1998 From: nic at idsia.ch (Nici Schraudolph) Date: Fri, 20 Mar 1998 12:15:22 +0100 Subject: TR available: fast exponentiation Message-ID: <199803201115.MAA01103@idsia.ch> Dear colleagues, the following technical note (5 pages) is available by anonymous ftp; it may be of interest to those who write their own C/C++ neural network code. Best regards, -- Dr. Nicol N. Schraudolph Tel: +41-91-970-3877 IDSIA Fax: +41-91-911-9839 Corso Elvezia 36 CH-6900 Lugano http://www.idsia.ch/~nic/ Switzerland http://www.cnl.salk.edu/~schraudo/ Technical Report IDSIA-07-98: A Fast, Compact Approximation of the Exponential Function --------------------------------------------------------- Nicol N. 
Schraudolph Neural network simulations often spend a large proportion of their time computing exponential functions. Since the exponentiation routines of typical math libraries are rather slow, their replacement with a fast approximation can greatly reduce the overall computation time. This note describes how exponentiation can be approximated by manipulating the components of a standard (IEEE-754) floating-point representation. This models the exponential function as well as a lookup table with linear interpolation, but is significantly faster and more compact. ftp://ftp.idsia.ch/pub/nic/exp.ps.gz From Jakub.Zavrel at kub.nl Fri Mar 20 07:31:21 1998 From: Jakub.Zavrel at kub.nl (Jakub.Zavrel@kub.nl) Date: Fri, 20 Mar 1998 13:31:21 +0100 (MET) Subject: Software release: TiMBL 1.0 Message-ID: <199803201231.NAA10874@kubsuw.kub.nl> ---------------------------------------------------------------------- Software release: TiMBL 1.0 Tilburg Memory Based Learner ILK Research Group, http://ilk.kub.nl/ ---------------------------------------------------------------------- The ILK (Induction of Linguistic Knowledge) Research Group at Tilburg University, The Netherlands, announces the release of TiMBL, Tilburg Memory Based Learner (version 1.0). TiMBL is a machine learning program implementing a family of Memory-Based Learning techniques for discrete data. TiMBL stores a representation of the training set explicitly in memory (hence `Memory Based'), and classifies new cases by extrapolating from the most similar stored cases. TiMBL features the following (optional) metrics and speed-up optimizations that enhance the underlying k-nearest neighbour classifier engine: - Information Gain weighting for dealing with features of differing importance (the IB1-IG learning algorithm). - Stanfill & Waltz's / Cost & Salzberg's (Modified) Value Difference metric for making graded guesses of the match between two different symbolic values.
- Conversion of the flat instance memory into a decision tree, and inverted indexing of the instance memory, both yielding faster classification. - Further compression and pruning of the decision tree, guided by feature information gain differences, for an even larger speed-up (the IGTREE learning algorithm). TiMBL accepts commandline arguments by which these metrics and optimizations can be selected and combined. TiMBL can read the C4.5 and WEKA's ARFF data file formats as well as column files and compact (fixed-width delimiter-less) data. -[download]----------------------------------------------------------- You are invited to download the TiMBL package for educational or non-commercial research purposes. When downloading the package you are asked to register, and express your agreement with the license terms. TiMBL is *not* shareware or public domain software. The TiMBL software package can be downloaded from http://ilk.kub.nl/software.html or by following the `Software' link under the ILK home page at http://ilk.kub.nl/ . The TiMBL package contains the following: - Source code (C++) with a Makefile. - A reference guide containing descriptions of the incorporated algorithms, detailed descriptions of the commandline options, and a brief hands-on tutorial. - Some example datasets. - The text of the license agreement. - A postscript version of the paper that describes IGTREE. The package should be easy to install on most UNIX systems. -[background]--------------------------------------------------------- Memory-based learning (MBL) has proven to be quite successful in a large number of tasks in Natural Language Processing (NLP) -- MBL of NLP tasks (text-to-speech, part-of-speech tagging, chunking, light parsing) is the main theme of research of the ILK group.
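The core of the memory-based approach described above, a nearest-neighbour engine over symbolic features with per-feature weights, can be sketched compactly. This is a generic illustration, not TiMBL code: TiMBL derives the weights from Information Gain (the IB1-IG algorithm), whereas here they are supplied by hand, and k is fixed at 1.

```c
#define NFEAT 4  /* features per instance (illustrative) */

/* One stored training case: symbolic feature values coded as ints. */
typedef struct { int f[NFEAT]; int cls; } Case;

/* Weighted overlap distance: each disagreeing feature adds its weight. */
double distance(const int *a, const int *b, const double *w) {
    double d = 0.0;
    for (int i = 0; i < NFEAT; i++)
        if (a[i] != b[i]) d += w[i];
    return d;
}

/* 1-nearest-neighbour classification: extrapolate the class of the
   most similar case stored in the instance base. */
int classify(const Case *base, int n, const int *query, const double *w) {
    int best = 0;
    double best_d = distance(base[0].f, query, w);
    for (int i = 1; i < n; i++) {
        double d = distance(base[i].f, query, w);
        if (d < best_d) { best_d = d; best = i; }
    }
    return base[best].cls;
}
```

Raising a feature's weight makes disagreement on that feature costlier, which is how information-gain weighting lets important features dominate the neighbour search.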
At one point it was decided to build a well-coded and generic tool that would combine the group's algorithms, favorite optimization tricks, and interface desiderata, the whole of which is now version 1.0 of TiMBL. We think TiMBL can be a useful tool for NLP research, and, for that matter, for any other domain with discrete classification tasks. For information on the ILK Research Group, visit our site at http://ilk.kub.nl/ . On this site you can find links to (postscript versions of) publications relating to the algorithms incorporated in TiMBL and on their application to NLP tasks. The reference guide ("TiMBL: Tilburg Memory-Based Learner, version 1.0, Reference Guide.", Walter Daelemans, Jakub Zavrel, Ko van der Sloot, and Antal van den Bosch. ILK Technical Report 98-03) can be downloaded separately and directly from http://ilk.kub.nl/~ilk/papers/ilk9803.ps.gz . For comments and bug reports relating to TiMBL, please send mail to Timbl at kub.nl ---------------------------------------------------------------------- From mkearns at research.att.com Sat Mar 21 14:36:14 1998 From: mkearns at research.att.com (Michael J. Kearns) Date: Sat, 21 Mar 1998 14:36:14 -0500 (EST) Subject: NIPS*98 Call for Papers Message-ID: <199803211936.OAA22406@radish.research.att.com> CALL FOR PAPERS -- NIPS*98 Neural Information Processing Systems -- Natural and Synthetic Monday November 30 - Saturday December 5, 1998 Denver, Colorado This is the twelfth meeting of an interdisciplinary conference which brings together cognitive scientists, computer scientists, engineers, neuroscientists, physicists, and mathematicians interested in all aspects of neural processing and computation. The conference will include invited talks and oral and poster presentations of refereed papers. The conference is single track and is highly selective. Preceding the main session, there will be one day of tutorial presentations (Nov.
30), and following it there will be two days of focused workshops on topical issues at a nearby ski area (Dec. 4-5). Major categories for paper submission, with example subcategories (by no means exhaustive), are as follows: Algorithms and Architectures: supervised and unsupervised learning algorithms, model selection algorithms, active learning algorithms, feedforward and recurrent network architectures, localized basis functions, mixture models, belief networks, graphical models, Gaussian processes, factor analysis, topographic maps, combinatorial optimization. Applications: handwriting recognition, DNA and protein sequence analysis, expert systems, fault diagnosis, medical diagnosis, analysis of medical images, data analysis, database mining, network traffic, music processing, time-series prediction, financial analysis. Artificial Intelligence: inductive reasoning, problem solving and planning, natural language, hybrid symbolic-subsymbolic systems. Cognitive Science: perception and psychophysics, neuropsychology, cognitive neuroscience, development, conditioning, human learning and memory, attention, language. Implementation: analog and digital VLSI, optical neurocomputing systems, novel neurodevices, simulation tools. Neuroscience: neural encoding, spiking neurons, synchronicity, sensory processing, systems neurophysiology, neuronal development, synaptic plasticity, neuromodulation, dendritic computation, channel dynamics. Reinforcement Learning and Control: exploration, planning, navigation, Q-learning, TD-learning, dynamic programming, robotic motor control, process control, Markov decision processes. Speech and Signal Processing: speech recognition, speech coding, speech synthesis, auditory scene analysis, source separation, hidden Markov models, models of human speech perception. 
Theory: computational learning theory, statistical physics of learning, information theory, prediction and generalization, regularization, Boltzmann machines, Helmholtz machines, decision trees, support vector machines, online learning, dynamics of learning algorithms, approximation and estimation theory, learning of dynamical systems, model selection, complexity theory. Visual Processing: image processing, image coding, object recognition, visual psychophysics, stereopsis, motion detection and tracking. REVIEW CRITERIA: All submitted papers will be thoroughly refereed on the basis of technical quality, significance, and clarity. Novelty of the work is also a strong consideration in paper selection, but to encourage interdisciplinary contributions, we will consider work which has been submitted or presented in part elsewhere, if it is unlikely to have been seen by the NIPS audience. Authors should not be dissuaded from submitting recent work, as there will be an opportunity after the meeting to revise accepted manuscripts before submitting final camera-ready copy. PAPER FORMAT: Submitted papers may be up to seven pages in length, including figures and references, using a font no smaller than 10 point. Text is to be confined within a 8.25in by 5in rectangle. Submissions failing to follow these guidelines will not be considered. Authors are strongly encouraged to use the NIPS LaTeX style files obtainable by anonymous FTP at the site given below. Papers must indicate (1) physical and e-mail addresses of all authors; (2) one of the nine major categories listed above, and a subcategory if desired; (3) if the work, or any substantial part thereof, has been submitted to or has appeared in other scientific conferences; (4) the authors' preference, if any, for oral or poster presentation (this preference will play no role in paper acceptance); and (5) author to whom correspondence should be addressed. 
SUBMISSION INSTRUCTIONS: Send eight copies of submitted papers to the address below; electronic or FAX submission is not acceptable. Include one additional copy of the abstract only, to be used for preparation of the abstracts booklet distributed at the meeting. SUBMISSIONS MUST BE RECEIVED BY MAY 22, 1998. From within the U.S., submissions will be accepted if mailed first class and postmarked by May 19, 1998. Mail submissions to: Sara A. Solla NIPS*98 Program Chair Department of Physiology Ward Building 5-003, MC211 Northwestern University Medical School 303 E. Chicago Avenue Chicago, IL 60611-3008, USA Mail general inquiries and requests for registration material to: NIPS Foundation Computational Neurobiology Laboratory Salk Institute for Biological Studies 10010 North Torrey Pines Road La Jolla, CA 92037 FAX: (619)587-0417 E-mail: nipsinfo at salk.edu Copies of the LaTeX style files for NIPS are available via anonymous ftp at ftp.cs.cmu.edu (128.2.206.173) in /afs/cs/Web/Groups/NIPS/formatting The style files and other conference information may also be retrieved via World Wide Web at http://www.cs.cmu.edu/Web/Groups/NIPS NIPS*98 Organizing Committee: General Chair, Michael Kearns, AT&T Labs Research; Program Chair, Sara Solla, Northwestern University; Publications Chair, David Cohn, Harlequin; Tutorial Chair, Klaus Mueller, GMD First; Workshops Co-Chairs, Richard Zemel, University of Arizona, and Sue Becker, McMaster University; Publicity Chair, Jonathan Baxter, Australian National University; Treasurer, Bartlett Mel, University of Southern California; Web Master, Doug Baker, Carnegie Mellon University; Government Liaison, Gary Blasdel, Harvard Medical School; Contracts, Steve Hanson, Rutgers University, Scott Kirkpatrick, IBM, Gerry Tesauro, IBM. 
NIPS*98 Program Committee: Andrew Barto, University of Massachusetts; Joachim Buhmann, University of Bonn; Yoav Freund, AT&T Labs Research; Lars Kai Hansen, Danish Technical University; Nathan Intrator, Brown University; Robert Jacobs, University of Rochester; Esther Levin, AT&T Labs Research; Alexandre Pouget, Georgetown University; David Saad, Aston University; Lawrence Saul, AT&T Labs Research; Sara Solla, Northwestern University (chair); Sebastian Thrun, Carnegie Mellon University; Yair Weiss, MIT. DEADLINE FOR RECEIPT OF SUBMISSIONS IS MAY 22, 1998 - please post - From mkearns at research.att.com Sat Mar 21 14:36:41 1998 From: mkearns at research.att.com (Michael J. Kearns) Date: Sat, 21 Mar 1998 14:36:41 -0500 (EST) Subject: NIPS*98 Call for Workshop Proposals Message-ID: <199803211936.OAA22409@radish.research.att.com> CALL FOR PROPOSALS NIPS*98 Post Conference Workshops December 4 and 5, 1998 Breckenridge, Colorado Following the regular program of the Neural Information Processing Systems 1998 conference, workshops on current topics in neural information processing will be held on December 4 and 5, 1998, in Breckenridge, Colorado. Proposals by qualified individuals interested in chairing one of these workshops are solicited. Past topics have included: Active Learning, Architectural Issues, Attention, Audition, Bayesian Analysis, Bayesian Networks, Benchmarking, Brain Imaging, Computational Complexity, Computational Molecular Biology, Control, Genetic Algorithms, Graphical Models, Hippocampus and Memory, Hybrid HMM/ANN Systems, Implementations, Music, Neural Plasticity, Language Processing, Lexical Acquisition, Network Dynamics, On-Line Learning, Optimization, Recurrent Nets, Robot Learning, Rule Extraction, Self-Organization, Sensory Biophysics, Signal Processing, Support Vectors, Speech, Time Series, Topological Maps, and Vision Models and Applications. 
The goal of the workshops is to provide an informal forum for researchers to discuss important issues of current interest. There will be two workshop sessions a day, for a total of six hours, with free time in between for ongoing individual exchange or outdoor activities. Concrete open and/or controversial issues are encouraged and preferred as workshop topics. Representation of alternative viewpoints and panel-style discussions are particularly encouraged. Workshop organizers will have responsibilities including: 1) coordinating workshop participation and content, which involves arranging short informal presentations by experts working in an area, arranging for expert commentators to sit on a discussion panel and formulating a set of discussion topics, etc. 2) moderating or leading the discussion and reporting its high points, findings, and conclusions to the group during evening plenary sessions 3) writing a brief summary and/or coordinating submitted material for post-conference electronic dissemination. Submission Instructions ----------------------- Interested parties should submit via e-mail a short proposal for a workshop of interest by May 29, 1998. Proposals should include a title, a description of what the workshop is to address and accomplish, the proposed length of the workshop (one day or two days), the planned format (mini-conference, panel discussion, or group discussion, combinations of the above, etc), and the proposed number of speakers. Where possible, please also indicate potential invitees (particularly for panel discussions). Please note that this year we are looking for fewer "mini-conference" workshops and greater variety of workshop formats. The time allotted to workshops is six hours each day, in two sessions of three hours each. We strongly encourage that the organizers reserve a significant portion of time for open discussion. 
The proposal should motivate why the topic is of interest or controversial, why it should be discussed and who the targeted group of participants is. In addition, please send a brief resume of the prospective workshop chair, a list of publications, and evidence of scholarship in the field of interest. Submissions should include contact name, address, e-mail address, phone number and fax number if available. Proposals should be mailed electronically to zemel at u.arizona.edu. All proposals must be RECEIVED by May 29, 1998. If e-mail is unavailable, mail so as to arrive by the deadline to: NIPS*98 Workshops c/o Richard Zemel Department of Psychology University of Arizona Tucson, AZ 85721 Questions may be addressed to either of the Workshop Co-Chairs: Richard Zemel Sue Becker University of Arizona McMaster University zemel at u.arizona.edu becker at mcmaster.ca PROPOSALS MUST BE RECEIVED BY MAY 29, 1998 -Please Post- From aweigend at stern.nyu.edu Sat Mar 21 16:54:31 1998 From: aweigend at stern.nyu.edu (Andreas Weigend) Date: Sat, 21 Mar 1998 16:54:31 -0500 (EST) Subject: New Book: Decision Technologies for Financial Engineering Message-ID: <199803212154.QAA02093@sabai.stern.nyu.edu> From georgiou at wiley.csusb.edu Mon Mar 23 03:45:31 1998 From: georgiou at wiley.csusb.edu (georgiou@wiley.csusb.edu) Date: Mon, 23 Mar 1998 00:45:31 -0800 (PST) Subject: ICCIN'98: Call for Papers Message-ID: <199803230845.AAA26948@wiley.csusb.edu> Call for Papers 3rd International Conference on COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE http://www.csci.csusb.edu/iccin Sheraton Imperial Hotel & Convention Center Research Triangle Park, North Carolina October 23-28, 1998 Conference Co-chairs: Subhash C. Kak, Louisiana State University Jeffrey P.
Sutton, Harvard University This conference is part of the Fourth Joint Conference on Information Sciences. Plenary Speakers include the following: James Anderson Panos J. Antsaklis John Baillieul Walter Freeman David Fogel Stephen Grossberg Stuart Hameroff Yu Chi Ho Thomas S. Huang George J. Klir Teuvo Kohonen John Koza Richard G. Palmer Zdzislaw Pawlak Karl Pribram Azriel Rosenfeld Julius T. Tou I. Burhan Turksen Paul J. Werbos A.K.C. Wong Lotfi A. Zadeh Hans J. Zimmermann Areas for which papers are sought include: o Artificial Life o Artificially Intelligent NNs o Associative Memory o Cognitive Science o Computational Intelligence o Efficiency/Robustness Comparisons o Evolutionary Computation for Neural Networks o Feature Extraction & Pattern Recognition o Implementations (Electronic, Optical, Biochips) o Intelligent Control o Learning and Memory o Neural Network Architectures o Neurocognition o Neurodynamics o Optimization o Parallel Computer Applications o Theory of Evolutionary Computation Summary (4 pages) Submission Deadline: June 1, 1998 Decision & Notification: August 1, 1998 For more information please see the Conference Web site: http://www.csci.csusb.edu/iccin From rmeir at dumbo.technion.ac.il Mon Mar 23 04:57:23 1998 From: rmeir at dumbo.technion.ac.il (Ron Meir) Date: Mon, 23 Mar 1998 12:57:23 +0300 (IDT) Subject: Postdoc or Ph.D. position at the Technion, Israel Message-ID: Postdoctoral or PhD Position Aerospace and Electrical Engineering Technion - Israel Institute of Technology The Technion has an immediate opening for a two-year post-doc position, to pursue research on non-linear fault detection and isolation, with application to robust and affordable flight control systems for small commercial aircraft. The project is part of a large-scale European effort and is funded by the Brite-Euram foundation.
The successful candidate is expected to have some working knowledge of at least one of the following fields: flight control, neural networks, non-linear filtering, and non-linear system identification. Experience in applying non-linear approaches, such as neural networks, extended Kalman filtering etc. to real-world problems is a definite asset. Candidates are expected to have a Ph.D. in Aerospace or Electrical Engineering, Applied Mathematics, or Computer Science. Strong analytical skills and demonstrated ability to perform creative research, along with practical experience with Matlab, C, or C++ are essential. The position is also open to practicing engineers with similar backgrounds, who wish to pursue a two year research program on the above topics. For candidates holding the Masters degree, the program may lead to a topic for a PhD thesis in Aerospace or Electrical Engineering at the Technion. Salaries and social benefits are commensurate with those of senior research associates or senior engineers at the Technion, depending on the candidates' background and credentials. The project will be conducted under the joint supervision of: Dr. Moshe Idan Dr. Ron Meir Faculty of Aerospace Eng. Faculty of Electrical Engineering Technion - IIT Technion - IIT Haifa, 32000, Israel Haifa, 32000, Israel Tel. : ++972-4-8293489 Tel. : ++972-4-8294658 Fax. : ++972-4-8231848 Fax. : ++972-4-8323041 E-mail : aeridan at aeron.technion.ac.il E-mail : rmeir at dumbo.technion.ac.il From jfeldman at ICSI.Berkeley.EDU Tue Mar 24 12:08:34 1998 From: jfeldman at ICSI.Berkeley.EDU (Jerry Feldman) Date: Tue, 24 Mar 1998 09:08:34 -0800 Subject: Backprop w/o Math, Not. Message-ID: <3517E892.52741CC7@icsi.berkeley.edu> Last fall, I posted a request for ideas on how to teach back propagation to undergrads, some from linguistics, etc., who had little math. There were quite a few clever suggestions, but my conclusion was that it couldn't be done. 
There is no problem conveying the ideas of searching weight space, local minima, etc. But how could they understand the functional dependence w/o math? The students had already done some simple exercises with Tlearn and this seemed to help get them motivated. Following several suggestions and PDP v.3, I started with the integer based perceptron "delta rule", and left that on the board. The next step was to do the "delta rule" for linear nets with no hidden units. But, before doing that I "reminded" them about partial derivatives, using the volume of a cylinder, V = pi*r*r*h. There was an overhead with pictures of how the two partials affected V: dV/dh = pi*r*r was a thin dotted disk dV/dr = 2*pi*r*h was a thin dotted sleeve The only other math needed is the chain rule and it worked to motivate that directly from the error formula for the linear case. They saw that the error is expressed in terms of the output, but that one needs to know the effect of a weight change, etc. The fact that the result had the same form as the perceptron case was, of course, satisfying. They had already seen various activation functions and knew that the sigmoid had several advantages, but was obviously more complex. I derived the delta rule for a network with only one top node and using only one input pattern, this eliminates lots of subscripts. The derivation of the sigmoid derivative = f*(1-f) was given as a handout in gory detail, but I only went over it briefly. The idea was to get them all to believe that they could work it through and maybe some of them did. At that point, I just hand-waved about the delta for hidden layers being the appropriate function of the deltas to which it contributed and gave the final result. We then talked about search, local minima and the learning rate. Since they used momentum in Tlearn, there was another slide and story on that. My impression is that this approach works and that nothing simpler would suffice. 
There were only about thirty students and questions were allowed; it would certainly be harder with a large lecture. This was all done in one lecture because of the nature of the course. With more time, I would have followed some other suggestions and had them work through a tiny example by hand in class. For this course, we next went to a discussion of Regier's system, which uses some backprop extensions to push learning into the structured part of the net. I was able to describe Regier's techniques quite easily based on their being familiar with the derivation of backprop. I would still be interested in feedback on the overall course design: www.icsi.berkeley.edu/~mbrodsky/cogsci110/ -- Jerry Feldman From cns-cas at cns.bu.edu Tue Mar 24 11:33:53 1998 From: cns-cas at cns.bu.edu (Boston University - Cognitive and Neural Systems) Date: Tue, 24 Mar 1998 11:33:53 -0500 Subject: call for registration Message-ID: <199803241633.LAA28911@maverick.bu.edu> CALL FOR REGISTRATION and MEETING SCHEDULE OF EVENTS SECOND INTERNATIONAL CONFERENCE ON COGNITIVE AND NEURAL SYSTEMS May 27-30, 1998 Boston University 677 Beacon Street Boston, Massachusetts http://cns-web.bu.edu/cns-meeting/ Sponsored by Boston University's Center for Adaptive Systems and Department of Cognitive and Neural Systems with financial support from DARPA and ONR How Does the Brain Control Behavior? How Can Technology Emulate Biological Intelligence? The conference will include 125 invited lectures and contributed lectures and posters by experts from 25 countries on the biology and technology of how the brain and other intelligent systems adapt to a changing world. The conference is aimed at researchers and students of computational neuroscience, connectionist cognitive science, artificial neural networks, neuromorphic engineering, and artificial intelligence. A single oral or poster session enables all presented work to be highly visible. 
Costs are kept at a minimum without compromising the quality of meeting handouts and social events. Although Memorial Day falls on Saturday, May 30, it is observed on Monday, May 25, 1998. Over 200 people attended last year's meeting, so early registration is recommended. To register, please fill out the registration form below. Student registrations must be accompanied by a letter of verification from a department chairperson or faculty/research advisor. If paying by check, mail to the address on the form. If paying by credit card, mail as above, or fax to (617) 353-7755, or email to cindy at cns.bu.edu. Registration fees will be returned on request only until April 30, 1998. ************************* REGISTRATION FORM Second International Conference on Cognitive and Neural Systems Department of Cognitive and Neural Systems Boston University 677 Beacon Street Boston, Massachusetts 02215 Tutorials: May 27, 1998 Meeting: May 28-30, 1998 FAX: (617) 353-7755 (Please Type or Print) Mr/Ms/Dr/Prof: _____________________________________________________ Name: ______________________________________________________________ Affiliation: _______________________________________________________ Address: ___________________________________________________________ City, State, Postal Code: __________________________________________ Phone and Fax: _____________________________________________________ Email: _____________________________________________________________ The conference registration fee includes the meeting program, reception, two coffee breaks each day, and meeting proceedings. The tutorial registration fee includes tutorial notes and two coffee breaks. 
CHECK ONE: ( ) $70 Conference plus Tutorial (Regular) ( ) $45 Conference plus Tutorial (Student) ( ) $45 Conference Only (Regular) ( ) $30 Conference Only (Student) ( ) $25 Tutorial Only (Regular) ( ) $15 Tutorial Only (Student) METHOD OF PAYMENT (please fax or mail): [ ] Enclosed is a check made payable to "Boston University". Checks must be made payable in US dollars and issued by a US correspondent bank. Each registrant is responsible for any and all bank charges. [ ] I wish to pay my fees by credit card (MasterCard, Visa, or Discover Card only). Name as it appears on the card: _____________________________________ Type of card: _______________________________________________________ Account number: _____________________________________________________ Expiration date: ____________________________________________________ Signature: __________________________________________________________ ************************* MEETING SCHEDULE Wednesday, May 27, 1998 (Tutorials): 7:45am---8:30am MEETING REGISTRATION 8:30am--10:00am Larry Abbott: "Short-term synaptic plasticity: Mathematical description and computational function" 10:00am--10:30am COFFEE BREAK 10:30am--12:00pm George Cybenko: "Understanding Q-learning and other adaptive learning methods" 12:00pm---1:30pm LUNCH 1:30pm---3:00pm Ennio Mingolla: "Neural models of biological vision" 3:00pm---3:30pm COFFEE BREAK 3:30pm---5:00pm Alex Pentland: "Visual recognition of people and their behavior" Thursday, May 28, 1998 (Invited Talks, Contributed Talks, and Posters): 7:15am---8:00am MEETING REGISTRATION 7:55am---8:00am Stephen Grossberg: "Welcome and Introduction" 8:00am---8:45am Azriel Rosenfeld: "Understanding object motion" 8:45am---9:30am Takeo Kanade: "Computational sensors: Further progress" 9:30am--10:15am Tomaso Poggio: "Sparse representations for learning" 10:15am--10:45am COFFEE BREAK AND POSTER SESSION I 10:45am--11:30am Gail Carpenter: "Applications of ART neural networks" 11:30am--12:15pm Rodney Brooks: 
"Experiments in developmental models for a neurally controlled humanoid robot" 12:15pm---1:00pm Lee Feldkamp: "Recurrent networks: Promise and practice" 1:00pm---2:15pm LUNCH 2:15pm---3:15pm PLENARY TALK: Stephen Grossberg: "Adaptive resonance theory: From biology to technology" 3:15pm---3:30pm T.P. Caudell, P. Soliz, S.C. Nemeth, and G.P. Matthews: "Adaptive resonance theory: Diagnostic environment for clinical ophthalmology" 3:30pm---3:45pm Nabeel Murshed, Adnan Amin, and Samir Singh: "Recognition of handwritten Chinese characters with the fuzzy ARTMAP neural network" 3:45pm---4:00pm Yukinori Suzuki and Junji Maeda: "ECG wave form recognition with ART 2" 4:00pm---4:15pm Thomas E. Sandidge and Cihan H. Dagli: "Toward optimal fuzzy associative systems using interactive self-organizing maps and multi-layer feed forward principles" 4:15pm---4:30pm Alan Stocker: "Application of neural computational principles to compute smooth optical flow" 4:30pm---4:45pm Sorin Draghici and Valeriu Beiu: "On issues related to VLSI implementations of neural networks" 4:45pm---5:15pm COFFEE BREAK 4:45pm---7:45pm POSTER SESSION I (see below for details) Friday, May 29, 1998 (Invited and Contributed Talks): 7:30am---8:00am MEETING REGISTRATION 8:00am---8:45am J. Anthony Movshon: "Contrast gain control in the visual cortex" 8:45am---9:30am Hugh Wilson: "Global processes at intermediate levels of form vision" 9:30am--10:15am Mel Goodale: "Biological teleassistance: Perception and action in the human visual system" 10:15am--10:45am COFFEE BREAK 10:45am--11:30am Ken Stevens: "The categorical representation of speech and its traces in acoustics and articulation" 11:30am--12:15pm Carol Fowler: "Production-perception links in speech" 12:15pm---1:00pm Frank Guenther: "A theoretical framework for speech acquisition and production" 1:00pm---2:15pm LUNCH 2:15pm---2:30pm S. Oddo, J. Beck, and E. Mingolla: "Texture segregation in chromatic element- arrangement patterns" 2:30pm---2:45pm Joseph S. 
Lappin and Warren D. Craft: "The spatial structure of visual input explains perception of local surface shape" 2:45pm---3:00pm Glenn Becker and Peter Bock: "The ALISA shape module: Adaptive shape classification using a radial feature token" 3:00pm---3:15pm Sachin Ahuja and Bart Farell: "Stereo vision in a layered world" 3:15pm---3:30pm A.W. Przybyszewski, W. Foote, and D.A. Pollen: "Contrast gain of primate LGN neurons is controlled by feedback from V1" 3:30pm---3:45pm Bertram R. Payne and Stephen G. Lomber: "It doesn't add up: Non-linear interactions in the visual cerebral network" 3:45pm---4:00pm Larry Cauller: "NeuroInteractivism: Dynamical sensory/motor solutions to exploration inverse problems based upon the functional architecture of cerebral cortex" 4:00pm---4:30pm COFFEE BREAK 4:30pm---4:45pm Rashmi Sinha, William Heindel, and Leslie Welch: "Evidence for retinotopy in category learning" 4:45pm---5:00pm Michele Fabre-Thorpe, Ghislaine Richard, and Arnaud Delorme: "Color is not necessary for rapid categorization of natural images" 5:00pm---5:15pm Timothy C. Pearce, Todd A. Dickenson, David R. Walt, and John S. Kauer: "Exploiting statistics of signals obtained from large numbers of chemically sensitive polymer beads to implement hyperacuity in an artificial olfactory system" 5:15pm---5:30pm A. Wichert and G. Palm: "Hierarchical categorization" 5:30pm---5:45pm Nabil H. 
Farhat: "Bifurcation networks: An approach to cortical modeling and higher-level brain function" 5:45pm---6:00pm Robert Hecht-Nielsen: "A theory of the cerebral cortex" 6:00pm---8:00pm MEETING RECEPTION Saturday, May 30 (Invited Talks, Contributed Talks, and Posters): 7:30am---8:00am MEETING REGISTRATION 8:00am---8:45am Howard Eichenbaum: "The hippocampus and mechanisms of declarative memory" 8:45am---9:30am Earl Miller: "Neural mechanisms for working memory and cognition" 9:30am--10:15am Bruce McNaughton: "Neuronal population dynamics and the interpretation of dreams" 10:15am--10:45am COFFEE BREAK AND POSTER SESSION II 10:45am--11:30am Richard Thompson: "The cerebellar circuitry essential for classical conditioning of discrete behavioral responses" 11:30am--12:15pm Daniel Bullock: "Cortical control of arm movements" 12:15pm---1:00pm Andrew Barto: "Reinforcement learning applied to large-scale dynamic optimization" 1:00pm---2:15pm LUNCH 2:15pm---3:15pm PLENARY TALK: Ken Nakayama: "Psychological studies of visual attention" 3:15pm---3:30pm Emery N. Brown, Loren M. Frank, Dengda Tang, Michael C. Quirk, and Matthew A. Wilson: "A statistical model of spatial information encoding in the rat hippocampus" 3:30pm---3:45pm Michael Herrmann, Klaus Pawelzik, and Theo Geisel: "Self-localization of a robot by simultaneous self-organization of place and direction selectivity" 3:45pm---4:00pm Stefan Schaal and Dagmar Sternad: "Segmentation of endpoint trajectories does not imply segmented control" 4:00pm---4:15pm Stefan Schaal and Dagmar Sternad: "Nonlinear dynamics as a coherent framework for discrete and rhythmic movement primitives" 4:15pm---4:30pm Andrew L. Kun and W. Thomas Miller: "Unified walking control for a biped robot using neural networks" 4:30pm---4:45pm J.L. Buessler and J.P. 
Urban: "Global training of modular neural architectures in robotics" 4:45pm---5:15pm COFFEE BREAK 4:45pm---7:45pm POSTER SESSION II (see below for details) POSTER SESSION I: Thursday, May 28, 1998 All posters will be displayed for the full day. Cognition, Learning, Recognition (B): #1 A. Tijsseling, M. Casey, and S. Harnad: "Categories as attractors" #2 Vinoth Jagaroo: "Allocentric spatial processing and some of their cortical neural nodes: A neuropsychological investigation" #3 M.J. Denham and S.L. McCabe: "A dynamic learning rule for synaptic modification" #4 M.J. Denham and S.L. McCabe: "A computational model of predictive learning in hippocampal CA3 principal cells of the rat during spatial activity" #5 Nina Emilia Poriet Ramirez and Andreu Catala Mallofre: "Qualitative approximation of neural cognitive maps" #6 Gary C.-W. Shyi: "Computing representations for bound and unbound 3D object matching" #7 Ghislaine Richard, Michele Fabre-Thorpe, and Arnaud Delorme: "On the similarity between fast visual categorization of natural images in monkeys and humans" #8 Robert Proulx and Sylvain Chartier: "Reproduction of the list-length and the list-strength effect in unsupervised neural networks" #9 Jean-Claude Dreher and Emmanuel Guigon: "A model of dopamine modulation on sustained activities in prefrontal cortex" #10 Oury Monchi and John G. Taylor: "Neural modeling of the anatomical areas involved in working memory tasks" Cognition, Learning, Recognition (T): #11 Christophe Lecerf: "The double loop learning model" #12 V. Petridis and Ath. Kehagias: "A general convergence result for data allocation in online unsupervised learning methods" #13 C.S. Liu and C.H. Tseng: "Hierarchical decomposition training algorithm for multilayer Perceptron networks" #14 Antonio Ballarin and Simona Gervasi: "Political surveys and scenario simulations" #15 Regis Quelavoine and Pascal Nocera: "Transients classification, learning with expert interaction" #16 John R. 
Alexander Jr.: "How technology CAN emulate biological intelligence: Begin at the beginning - a speculation" #17 Maria P. Alvarez: "A supervised learning algorithm for feedforward networks with inhibitory lateral connections" #18 Mark A. Rubin, Michael A. Cohen, Joanne S. Luciano, and Jacqueline A. Samson: "Can we predict the outcome of treatment for depression?" #19 Irak Vicarte Mayer and Haruhisa Takahashi: "Object matching by principal component analysis" #20 Jun Saiki: "A neural network model for computation of object-based spatial relations" #21 Harald Ruda and Magnus Snorrason: "An algorithm for the construction of a hierarchical classifier using single trial learning and self-organizing neural networks" #22 Gail A. Carpenter and William W. Streilein: "Fuzzy ARTMAP neural networks for data fusion and sonar classification" #23 William W. Streilein and Paolo Gaudiano: "Autonomous robotics: Object identification and classification through sonar" #24 Nabeel Murshed, Ana Cristina de Carvalho, Regiane Aires, and Sergio Ossamu Ioshii: "Detection of carcinoma with the fuzzy ARTMAP NN" #25 Gail A. Carpenter, Sucharita Gopal, Scott Macomber, Siegfried Martens, and Curtis E. Woodcock: "Mapping vegetation ground cover with fuzzy ARTMAP" #26 Siegfried Martens and Paolo Gaudiano: "Mobile robot sensor fusion with an ARTMAP neural network" #27 Tayeb Nedjari and Younes Bennani: "Symbolic-connectionist interaction" #28 Eduardo da Fonesca Melo and Edson Costa de Barros Carvalho Filho: "An autonomous multi feature selective attention neural network model" #29 Wonil Kim, Kishan Mehrotra, and Chilukuri K. Mohan: "Learning collages: An adaptive multi-module approximation network" #30 Christine Lisetti: "Connectionist modeling of emotional arousal along the autonomic nervous system" Neural System Models (B): #31 Roger A. 
Drake: "Redundant behavioral measures of activation: Leftward visual inattention" #32 Michael Lamport Commons: "Can neural nets be conscious and have a sense of free will?" #33 Fabio R. Melfi and Andre C.P. Carvalho: "Human performance in maze navigation problems" #34 K. Gurney, A. Prescott, and P. Redgrave: "A model of intrinsic processing in the basal ganglia" Neural System Models (T): #35 (presentation withdrawn by the authors) #36 Robert Alan Brown: "Machine bonding" #37 Kit S. Choy and Peter D. Scott: "Reinforcement learning enhanced by learning from exemplary behaviors" #38 Andras Peter: "A neural network for self-adaptive classification" POSTER SESSION II: Saturday, May 30, 1998 All posters will be displayed for the full day. Vision (B): #1 Magnus Snorrason, Harald Ruda, and James Hoffman: "Visual search patterns in photo-realistic imagery" #2 Drazen Domijan: "New mechanism for luminance channel in the network model of brightness perception" #3 Colin W.G. Clifford and Michael R. Ibbotson: "Adaptive encoding of visual motion: Modelling the response properties of directional neurons in the accessory optic system" #4 J.R. Williamson and S. Grossberg: "How cortical development leads to perceptual grouping" #5 Stephen G. Lomber and Bertram R. Payne: "Behavioral dissociations in visual cortex: A multi-dimensional view of the cerebrum" Audition, Speech, Language (B + T): #6 Robert A. Baxter and Thomas F. Quatieri: "AM-FM estimation by shunting neural networks" #7 Lars Koop and Holger U. Prante: "Classification of artificial and natural sounds with stationary and temporal self-organized feature maps" #8 S.L. McCabe: "Synaptic depression and temporal order identification" #9 Fatima T. Husain and Frank H. 
Guenther: "Experimental tests of neural models of the perceptual magnet effect" #10 Jerome Braun and Haim Levkowitz: "Perceptually guided training in recurrent neural networks based automatic language identification" #11 Shinji Karasawa, Ken-ichi Suzuki, and Jun-ichii Oomori: "The artificial intelligence organized by decoders for language processing" Spatial Mapping and Navigation (B): #12 Herve Frezza-Buet and Frederic Alexandre: "A model of cortical activation for robot navigation" #13 William Gnadt and Stephen Grossberg: "A self-organizing neural network model of spatial navigation and planning" Control and Robotics (T): #14 Jesse Reichler and Clay Holroyd: "An architecture for autonomous control and planning in uncertain domains" #15 Fernando J. Corbacho: "Biologically inspired design principles for autonomous agents" #16 Erol Sahin, Paolo Gaudiano, and Robert Wagner: "A comparison of visual looming and sonar as mobile robot range sensors" #17 J.J. Collins and Malachy Eaton: "Situated pursuit and evasion using temporal difference learning" #18 Kent Thompson and Wayne Lu: "The big eyeballs project" #19 Stevo Bozinovski, Georgi Stojanov, and Liljana Bozinovska: "Learning sparse environments using emotionally reinforced neural network" #20 Catalin V. Buhusi and Nestor A. Schmajuk: "Stimulus selection mechanisms: Implications for artificial life systems" VLSI (T): #21 Richard Izak and Thomas Zahn: "Modeling auditory pathway: A neuromorphic VLSI system with integrate and fire neurons and on-chip learning synapses" #22 Todd Hinck and Allyn E. Hubbard: "Combining featural and boundary information to create a surface representation: A hardware paradigm" #23 Christian Karl and Allyn E. Hubbard: "Pipelined asynchronous communication between large arrays" #24 Catherine Collin and Susanne Still: "Towards a neuronally-controlled walking machine" #25 Radu Dogaru and Leon O. 
Chua: "Emergent computation in cellular neural networks with FitzHugh-Nagumo cells: A novel approach based on the local activity theory" Hybrid Systems and Industrial Applications (T): #26 Geoffrey N. Hone and Richard Scaife: "The SME Machine: A non-learning network implemented in a commercial spreadsheet delivers Subject Matter Expert judgments" #27 A. Bernatzki, W. Eppler, and H. Gemmeke: "Neural network debugger (NNDB) using local principal component analysis (LPCA) for high-dimensional input data" #28 Peter Sincak, Norbert Kopco, Rudolf Jaksa, and Marek Bundzel: "Computational intelligence tools for environmental applications" #29 Leo Chau-Kuang Liau and Robert W. McLaren: "The application of a neural net to parameter optimization of a continuous stirred tank reactor" #30 Brian M. O'Rourke: "Advanced time series modeling with neural networks" Neural System Models (B): #31 Clay Holroyd, Jesse Reichler, and Michael G.H. Coles: "Generation of error-related scalp potentials by a mesencephalic dopamine system for error processing" #32 Christian W. Eurich, Klaus Pawelzik, and John G. Milton: "Encoding temporal patterns by delay adaptation in networks of spiking neurons" #33 J.F. Gomez, F.J. Lopera, E. Marin, D. Pineda, and A. Rios: "A computer model using neuroimaging of a cyclic cortical wave for brain development, adult-brain-steady-state, and cerebral degeneration" Neural System Models (T): #34 M. Sreenivasa Rao and Arun K. Pujari: "A new neural network architecture with associative memory, pruning and order-sensitive learning" #35 L.M. Gelman and N.I. Bouraou: "Adaptive method for object recognition" #36 Alexandra I. 
Cristea and Toshio Okamoto: "Deduction of an L-based energy function for SE prediction" #37 Wei Cao and James Burghart: "Pattern-up-mapping method and fractional convergence" #38 Norbert Jankowski: "Controlling the structure of neural networks that grow and shrink" From salzamas at netlab.it Tue Mar 24 12:08:40 1998 From: salzamas at netlab.it (salzano) Date: Tue, 24 Mar 1998 18:08:40 +0100 Subject: New Journal Message-ID: <1.5.4.32.19980324170840.006e5390@mbox.netlab.it> Sorry for multiple posting. Hello everybody, We are now publishing a new journal, Economics & Complexity. Is anybody interested in topics such as NN, complex systems, chaos, fuzzy sets and fuzzy choice theory applied to economics, public finance, and the financial system? We would like to increase the list of experts connected with the Journal. Let me know. In the attachment you will find information about "Economics & Complexity". Massimo Salzano, University of Salerno - Italy ECONOMICS & COMPLEXITY an Interdisciplinary Journal on Public, Financial, Globalisation and Social Economics Federico Caffè Centre Publisher Roskilde University POBOX 260 DK-4000 Roskilde, DENMARK ISSN 1398-1706 VOL. 1 N. 1 WINTER 1997-98 CONTENTS The Control of Economics as a Complex System Massimo Salzano Towards a Model of Economic Growth Embodying an Evolutionistic Perspective Davide Infante Long-Term Memory Stability in the Italian Stock Market Marco Corazza Financial Time Series and Non Linear Models A. Amendola, M.S. Andreano, C. Vitale Announcements Scientific board: Prof. Bruno Amoroso - Economics of Globalization - University of Roskilde - DK Prof. Elio Canestrelli - Financial Mathematics - University of Venice - Ca' Foscari - IT Prof. Stefano Ecchia - Financial Markets - University of Naples - Federico II - IT Prof. Stuart Holland - International Political Economy - University of Roskilde - DK Prof. Roman Strongin - Math. Optimisation - Nizhni Novgorod - Lobachevsky University - RF Prof.
Salvatore Vinci - Economics - "Navale" University - Napoli - IT Prof. Cosimo Vitale - Statistics - University of Salerno - IT Managing editor: Prof. Massimo Salzano - Public Finance - University of Salerno - University of Calabria - IT Tel: (39) 089 962158 (39) 984 492443 fax: (39) 984 492421 The aim of this journal is to spread the use of a complex, interdisciplinary, methodological approach to the study of economics. Every issue will be devoted to a specific topic, which will take into account the importance of having different perspectives on the subject. Many books and articles have been written on complexity in economics but, generally, they are oriented towards a more theoretical approach, and it is very difficult, at present, to speak of a complex approach to public policy. Perhaps this is because the complex approach lies at the crossroads of economics, public finance, banking, the financial sector, mathematics and statistics. The choice of the scientific board takes this reality into consideration. Only academics who are very well known in their respective fields, yet still open to new ideas, can be the guarantors that this new approach will be applied to economics whilst maintaining substantial respect for disciplinary and interdisciplinary methodology. All contributions to the journal will be refereed using the usual "invisible referee" approach. For works presented at conferences organised by the journal, there will be a double screening: first, acceptance for the conference will depend on a referee; then, after the conference, the results of the public discussion will be taken into account. The journal will be published twice a year, though some special numbers may appear. Each number will be published in Denmark, both on paper and on the Internet. Generally we will produce fifty physical copies. With the author's acceptance, provisional versions of articles to be published in the Journal may be posted on the Internet.
The language of the journal is English. Instructions to Authors (1) Papers must be in English. (2) Papers for publication (2 copies and an electronic manuscript, i.e., on disk or by e-mail, with the accompanying manuscript) should be sent to: Professor M. Salzano, Managing Editor, Economics & Complexity, Department of Economics, University of Salerno, Fisciano - ITALY; e-mail: salzamas at netlab.it Submission of a paper will be held to imply that it contains original unpublished work and is not being submitted for publication elsewhere. The Editor does not accept responsibility for damage or loss of papers submitted. Upon acceptance of an article, the author(s) will be asked to authorise publication of the article on the Internet as well. (3) The preferred format is Winword 8 for Win95. Do not submit your original paper on paper only. Do submit the accepted version of your paper as an electronic manuscript. Make absolutely sure that the file on the disk and the printout are identical. Label the disk with your name; also specify the software and hardware used as well as the title of the file to be processed. Please check that the file is correctly formatted. Do not allow your word processor to introduce word breaks and do not use a justified layout. Please adhere strictly to the general instructions below on style, arrangement and, in particular, the reference style of the journal. (4) Manuscripts should be typewritten on one side of the paper only, double-spaced with wide margins. All pages should be numbered consecutively. Title and subtitles should be short (less than half a line). (5) The first page of the manuscript should contain the following information: (i) the title; (ii) the name(s) and institutional affiliation(s) of the author(s); (iii) an abstract of not more than 80 words.
A footnote on the same sheet should give: a) the name, address, and telephone and fax numbers of the corresponding author, as well as an e-mail address; b) acknowledgements and information on grants received. (6) The first page of the manuscript should also supply up to five key words. (7) Footnotes should be kept to a minimum and numbered consecutively throughout the text with superscript arabic numerals. (8) For formulae, please use basic Winword MathType, not an advanced version. (9) References in the text should be as follows: "Allison (1915) demonstrates that ..." The list of references should appear at the end of the main text. It should be single-spaced and listed in alphabetical order by author's name. References should appear as follows: Author1, N., Author2, N.B., and Author3, L.J., (1938): The tax advantage. MacMillan, New York Author1, W., and Author2, E. (1998): The Elements of Economics, Journal of Economics and Society The journal title should not be abbreviated. (10) Illustrations should be inserted in the text. All graphs and diagrams, numbered consecutively, should be referred to as figures. Illustrations can be printed in colour only if they are judged by the Editor to be essential to the presentation. Further information concerning colour illustrations and the costs to the author can be obtained from the publisher. Any manuscript which does not conform to the above instructions will not be considered for publication. Usually, no page proofs will be sent to the corresponding author. Twenty offprints of each paper are supplied free of charge to the corresponding author; additional offprints are available at cost if they are ordered when the revised final version of the work is posted. ISSN 1398-1706 From kaplane at rockvax.rockefeller.edu Wed Mar 25 08:13:18 1998 From: kaplane at rockvax.rockefeller.edu (Dr.
Ehud Kaplan) Date: Wed, 25 Mar 1998 08:13:18 -0500 Subject: Post-Doc position in Computational Neuroscience Message-ID: <01BD57C5.DF0DA870.kaplane@rockvax.rockefeller.edu> Post-Doctoral position in Computational Neuroscience available-- We are looking for a post-doctoral fellow to work with us in computational neuroscience at the Mount Sinai School of Medicine in New York City. We are developing an approach to simulation of large-scale neuronal ensembles, and are looking for someone with expertise and interest in computer simulations, dynamical systems, and applied mathematics. Knowledge of neuroscience is an obvious advantage. Our group includes, among others: Bruce Knight, Larry Sirovich and Ehud Kaplan, and is involved in both theoretical (mathematical and computational simulations) and experimental (optical imaging and electrophysiology of the visual cortex) approaches. Physicists, mathematicians, neurobiologists and engineers work together. Please apply to: postdoc at camelot.mssm.edu Ehud Kaplan, Ph.D. Jules & Doris Stein Research-to-Prevent-Blindness Professor Depts. of Ophthalmology, Physiology & Biophysics Mount Sinai School of Medicine One Gustave Levy Place NY, NY, 10029 From dayan at flies.mit.edu Wed Mar 25 12:08:34 1998 From: dayan at flies.mit.edu (Peter Dayan) Date: Wed, 25 Mar 1998 12:08:34 -0500 (EST) Subject: postdoc job Message-ID: <199803251708.MAA19298@flies.mit.edu> A non-text attachment was scrubbed... 
Name: not available Type: text Size: 1627 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/b4b8dad3/attachment.ksh From leila at ida.his.se Wed Mar 25 18:31:03 1998 From: leila at ida.his.se (Leila Khammari) Date: Thu, 26 Mar 1998 00:31:03 +0100 Subject: ICANN 98 - EXTENDED DEADLINE Message-ID: <351993B6.DDE4DF@ida.his.se> _________________________________________________________________ ICANN 98 - DEADLINE EXTENDED to Monday, March 30 Due to numerous requests for an extension of the ICANN 98 paper submission deadline, all papers received by Monday, March 30, 11.00 p.m. CET will be included in the review process. _________________________________________________________________ 8th INTERNATIONAL CONFERENCE ON ARTIFICIAL NEURAL NETWORKS September 1-4, 1998, Skoevde, Sweden For details see: http://www.his.se/ida/icann98/ _________________________________________________________________ SUBMISSION Submissions are sought in all areas of artificial neural network research, in particular - Theory - Applications - Computational Neuroscience & Brain Theory - Connectionist Cognitive Science - Autonomous Robotics & Adaptive Behaviour - Hardware & Implementations Papers of at most 6 pages can be submitted by March 30, 1998, * either ELECTRONICALLY via the web-service * or via EXPRESS/COURIER MAIL to the conference secretariat. Templates for LaTeX, Word and Framemaker are available for download. For details please see: http://www.his.se/ida/icann98/submission or contact the conference secretariat (see below). All papers accepted for oral or poster presentation will appear in the conference proceedings published by Springer-Verlag.
_________________________________________________________________ IMPORTANT DATES March 30, 1998 - NEW DEADLINE, papers must be received May 6, 1998 - notification of acceptance May 28, 1998 - camera ready papers must be received September 1, 1998 - ICANN 98 tutorials September 2-4, 1998 - ICANN 98 takes place _________________________________________________________________ CONFERENCE SECRETARIAT ICANN 98 Hoegskolan Skoevde Hoegskolevaegen 1 S-541 28 Skoevde SWEDEN Email: icann98 at ida.his.se Telefax: +46 (0)500-46 47 25 WWW: http://www.his.se/ida/icann98/ _________________________________________________________________ From kchen at cis.ohio-state.edu Thu Mar 26 11:20:08 1998 From: kchen at cis.ohio-state.edu (Ke CHEN) Date: Thu, 26 Mar 1998 11:20:08 -0500 (EST) Subject: preprint. Message-ID: Dear Connectionists, The following preprint is available now on line: http://www.cis.ohio-state.edu/~kchen/jnc98.ps Best regards, -kc ---------------------------------------------------- Dr. Ke CHEN Department of Computer and Information Science The Ohio State University 583 Dreese Laboratories 2015 Neil Avenue Columbus, Ohio 43210-1277 U.S.A. Phone: 1-614-292-4890(O) (with an answering machine) Fax: 1-614-292-2911 E-Mail: kchen at cis.ohio-state.edu WWW: http://www.cis.ohio-state.edu/~kchen ------------------------------------------------------ ########################################################################## A Method of Combining Multiple Probabilistic Classifiers through Soft Competition on Different Feature Sets Ke Chen{1,2} and Huisheng Chi{1} {1} National Lab of Machine Perception and Center for Information Science Peking University, Beijing 100871, China {2} Dept of CIS and Center for Cognitive Science The Ohio State University, Columbus, OH 43210-1277, USA To appear in NEUROCOMPUTING - AN INTERNATIONAL JOURNAL, 1998. ABSTRACT A novel method is proposed for combining multiple probabilistic classifiers on different feature sets. 
To achieve improved classification performance, a generalized finite mixture model is proposed as a linear combination scheme and implemented with radial basis function networks. In the linear combination scheme, soft competition on different feature sets is adopted as an automatic feature ranking mechanism, so that different feature sets can always be used simultaneously, in an optimal way, to determine the linear combination weights. For training the linear combination scheme, a learning algorithm is developed based on the Expectation-Maximization (EM) algorithm. The proposed method has been applied to a typical real-world problem, viz. speaker identification, in which different feature sets often need to be considered simultaneously for robustness. Simulation results show that the proposed method yields good performance in speaker identification. Keywords: Combination of multiple classifiers, soft competition, different feature sets, Expectation-Maximization (EM) algorithm, speaker identification ########################################################################## From omori at cc.tuat.ac.jp Thu Mar 26 04:13:20 1998 From: omori at cc.tuat.ac.jp (=?ISO-2022-JP?B?GyRCQmc/OSEhTjQ7ShsoSg==?=) Date: Thu, 26 Mar 1998 18:13:20 +0900 Subject: ICONIP'98 : call for paper : EXTENDED DEADLINE to MAY 15 Message-ID: <01BD58E2.DBEF2CE0@BRAIN> - Please accept our apologies if you receive multiple copies of this message. - We would be most grateful if you would forward this message to potentially interested parties.
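[Note on the Chen & Chi abstract above: the idea of learning linear combination weights for several probabilistic classifiers by EM can be illustrated with a minimal sketch. This is not the authors' implementation (which uses RBF networks and soft competition across feature sets); it is a simplified mixture-weight EM loop, assuming each classifier reports the posterior probability it assigns to the true class of each training sample. The function name `em_combination_weights` is invented for illustration.]

```python
# Illustrative sketch only: EM estimation of mixture weights for
# combining probabilistic classifiers, a simplified version of the
# finite-mixture combination idea described in the abstract above.

def em_combination_weights(posteriors, n_iter=50):
    """posteriors[k][i] = probability that classifier k assigns to the
    true class of training sample i.  Returns mixture weights w with
    sum(w) == 1, so the combined posterior is sum_k w[k] * P_k."""
    n_clf = len(posteriors)
    n_samples = len(posteriors[0])
    w = [1.0 / n_clf] * n_clf              # start from uniform weights
    for _ in range(n_iter):
        # E-step: responsibility of classifier k for sample i
        resp = [[0.0] * n_samples for _ in range(n_clf)]
        for i in range(n_samples):
            denom = sum(w[k] * posteriors[k][i] for k in range(n_clf))
            for k in range(n_clf):
                resp[k][i] = w[k] * posteriors[k][i] / denom
        # M-step: each weight is the mean responsibility, so classifiers
        # compete "softly" for credit on the training data
        w = [sum(resp[k]) / n_samples for k in range(n_clf)]
    return w
```

In this toy form, a classifier that is consistently more confident on the true class accumulates more responsibility and hence a larger combination weight.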
+-----------------------------------------+ | SUBMISSION DEADLINE EXTENDED TO MAY 15 | +-----------------------------------------+ Call for Papers The Fifth International Conference on Neural Information Processing ICONIP'98 +----------------------------------------------------+ | http://jnns-www.okabe.rcast.u-tokyo.ac.jp/jnns/ICONIP98.html | +----------------------------------------------------+ Organized by Japanese Neural Network Society (JNNS) Sponsored by Asian Pacific Neural Network Assembly (APNNA) October 21-23,1998 Kitakyushu International Conference Center 3-9-30 Asano, Kokura-ku, Kitakyushu 802, Japan In 1998, the annual conference of the Asian Pacific Neural Network Assembly, ICONIP'98, will be held jointly with the ninth annual conference of Japanese Neural Network Society, from 21 to 23 October 1998 in Kitakyushu, Japan. The goal of ICONIP'98 is to provide a forum for researchers and engineers from academia and industries to meet and to exchange ideas on advanced techniques and recent developments in neural information processing. The conference further serves to stimulate local and regional interests in neural information processing and its potential applications to industries indigenous to this region. 
Topics of Interest Track 1: Neurobiological Basis of Brain Functions Track 2: Mathematical Theory of Brain Functions Track 3: Cognitive and Behavioral Aspects of Brain Functions Track 4: Technical Aspects of Neural Networks Track 5: Distributed Processing Systems Track 6: Applications of Neural Networks Track 7: Implementations of Neural Networks Topics cover (Key Words): Neuroscience, Neurobiology and Biophysics, Learning and Plasticity, Sensory and Motor Systems, Cognition and Perception Algorithms and Architectures, Learning and Generalization, Memory, Neurodynamics and Chaos, Probabilistic and Statistical Methods, Neural Coding Emotion, Consciousness and Attention, Visual and Auditory Computation, Speech and Languages, Neural Control and Robotics, Pattern Recognition and Signal Processing, Time Series Forecasting, Blind Separation, Knowledge Acquisition, Data Mining, Rule Extraction Emergent Computation, Distributed AI Systems, Agent-Based Systems, Soft Computing, Real World Systems, Neuro-Fuzzy Systems Neural Devices and Hardware, Neural and Brain Computers, Software Tools, System Integration Conference Committee Conference Chair: Kunihiko Fukushima, Osaka University Conference Vice-chair: Minoru Tsukada, Tamagawa University Organizing Chair: Shuji Yoshizawa, Tokyo University Program Chair: Shiro Usui, Toyohashi University of Technology International Advisory Committee (tentative) Chair: Shun-ichi Amari, Institute of Physical and Chemical Research Members: S. Bang (Korea), J. Bezdek (USA), J. Dayhoff (USA), R. Eckmiller (Germany), W. Freeman (USA), N. Kasabov (New Zealand), H. Mallot (Germany), G. Matsumoto (Japan), N. Sugie (Japan), R. Suzuki (Japan), K. Toyama (Japan), Y. Wu (China), Lei Xu (Hong Kong), J. Zurada (USA) Call for Papers The Programme Committee is looking for original papers on the above-mentioned topics.
Authors should pay special attention to the explanation of the theoretical and technical choices involved, point out possible limitations, and describe the current state of their work. All received papers will be reviewed by the Programme Committee. The authors will be informed of the decision of the review process by June 30, 1998. All accepted papers will be published. As the conference is a multi-disciplinary meeting, the papers are required to be comprehensible to a wide rather than a very specialized audience. Instructions to Authors Papers must be received by May 15, 1998. The papers must be submitted in camera-ready format. Papers will be presented at the conference either in an oral or in a poster session. Please submit six copies of the paper written in English on A4-format white paper with equal-sized left and right margins and an 18 mm top margin, in two-column format, on not more than 4 pages, single-spaced, in Times or a similar font of 10 points, and printed on one side of the page only. Centred at the top of the first page should be the complete title, author(s), mailing and e-mail addresses, followed by an abstract and the text. In the covering letter, the track and the topics (3-4 keywords) of the paper according to the list above should be indicated. No changes will be possible after submission of your manuscript. Authors may also retrieve the ICONIP style files "iconip98.tex", "iconip98.sty", "epsbox.sty", and "sample.ps" (compressed as "form.tar.gz") for the conference from the homepage. Language The use of English is required for papers and presentations. No simultaneous interpretation will be provided. Workshops No tutorials will be held before or after the conference. Two satellite workshops will be held. One, the "Satellite Workshop for Young Researchers on Information Processing", will be held after the conference. Details can be obtained from http://jnns-www.okabe.rcast.u-tokyo.ac.jp/jnns/iconip98/iconip98_ws.html. 
Another workshop is the Riken-Tamagawa International Dynamic Brain Forum (DBF'98), October 18-20, 1998. This will take place at the Brain Science Research Center, Tamagawa University Research Institute. It is organized by S. Amari, M. Tsukada, K. Aihara, and H. Dinse, and the tentative list of invited speakers includes the following people: J. P. Segundo (Univ. of California, USA), W. Freeman (USA), Maass, Longtin, Gerstein (USA), S. Thorpe, P. Werbos (USA), Ad Aertsen (Albert-Ludwigs-Univ., Germany), H. Dinse (Ruhr-University, Germany), G. Hauske (TU Munich, Germany), W. von Seelen (Ruhr-University, Germany), G. Sandner (Univ. Louis Pasteur, France), C. E. Schreiner (Univ. of California, USA), N. M. Weinberger (Univ. of California, USA), P. Erdi (Academy of Sciences, Hungary) Important Dates for ICONIP'98 Papers Due: May 15, 1998 Notification of Paper Acceptance: June 30, 1998 Second Circular (with Registration Form): June 30, 1998 Registration of at least one author of a paper: July 31, 1998 Early Registration: July 31, 1998 Conference: October 21-23, 1998 Workshops: October 24-26, 1998 For further information, please contact: ICONIP'98 Secretariat Mr. Masahito Matue Japan Technical Information Service Sogo Kojimachi No.3 Bldg. 1-6 Kojimachi, Chiyoda-ku, Tokyo 102, Japan Tel: +81-3-3239-4565 Fax: +81-3-3239-4714 E-mail: jatisc at msm.com Please pass this announcement on to friends and acquaintances who may be interested in ICONIP'98-Kitakyushu. Thank you. From elman at crl.ucsd.edu Wed Mar 25 17:22:22 1998 From: elman at crl.ucsd.edu (Jeff Elman) Date: Wed, 25 Mar 1998 14:22:22 -0800 (PST) Subject: Postdoc announcement: CRL/UC San Diego Message-ID: <199803252222.OAA27121@crl.ucsd.edu> CENTER FOR RESEARCH IN LANGUAGE UNIVERSITY OF CALIFORNIA, SAN DIEGO ANNOUNCEMENT OF POSTDOCTORAL FELLOWSHIPS Applications are invited for postdoctoral fellowships in Language, Communication and Brain at the Center for Research in Language at the University of California, San Diego. 
The fellowships are supported by the National Institutes of Health (NIDCD), and provide an annual stipend ranging from $20,292 to $26,900 depending upon years of postdoctoral experience. In addition, funding is provided for medical insurance and limited travel. The program provides interdisciplinary training in: (1) psycholinguistics, including language processing in adults and language development in children; (2) communication disorders, including childhood language disorders and adult aphasia; (3) electrophysiological studies of language; and (4) neural network models of language learning and processing. Candidates are expected to work in at least one of these four areas, and preference will be given to candidates with background and interests involving more than one area. Executive Committee Members: Elizabeth Bates, Depts. of Cognitive Science & Psychology, UCSD Jeffrey Elman, Dept. of Cognitive Science, UCSD Marta Kutas, Dept. of Cognitive Science, UCSD David Swinney, Dept. of Psychology, UCSD Beverly Wulfeck, Dept. of Communicative Disorders, San Diego State University Grant conditions require that candidates be citizens or permanent residents of the U.S. In addition, trainees will incur a payback obligation during their first year of postdoctoral NRSA support and are required to complete a Payback Agreement. Applications must be RECEIVED by May 1, 1998. Training may begin as early as July 1, 1998 and as late as May 30, 1999. This is a one-year appointment. Questions regarding this program may be sent to Joanna Mancusi, jmancusi at ucsd.edu. Applicants should send a statement of interest, three letters of recommendation, a curriculum vitae and copies of relevant publications to: Postdoc Fellowship Committee Center for Research in Language 0526 University of California, San Diego 9500 Gilman Drive La Jolla, California 92093-0526 (619) 534-2536 Women and minority candidates are specifically invited to apply. 
From paolo at McCulloch.ING.UNIFI.IT Fri Mar 27 06:15:19 1998 From: paolo at McCulloch.ING.UNIFI.IT (Paolo Frasconi) Date: Fri, 27 Mar 1998 12:15:19 +0100 (MET) Subject: PhD Scholarship Message-ID: PhD Scholarship in Adaptive Processing of Data Structures Faculty of Informatics University of Wollongong Australia An Australian Research Council Large grant for 1998 -- 2000 was awarded to Professors Tsoi (University of Wollongong), Gori (University of Siena), and Sperduti (University of Pisa) to study adaptive processing of data structures, a new way of approaching problems which can be represented as data structures. Many practical problems, e.g., image understanding, document understanding, and modelling of access behaviour on the Internet, are more naturally modelled by data structures, owing to the ease with which data structures handle problems with dynamic and variable structure. A PhD scholarship for three years, tenable at the University of Wollongong, in adaptive processing of data structures is available for a suitably qualified candidate to take up immediately. The candidate must have a good first-class undergraduate degree in computer science, computer engineering, mathematics or another related discipline, with some familiarity with neural networks, data structures, and automata theory. It is desirable for the candidate to have some postgraduate training in neural networks. Interested candidates should access our project web site: http://www.dsi.unifi.it/~paolo/datas for more information on the project, the researchers involved in the project, and papers relevant to the project. Further information can be obtained by contacting Professor A. C. Tsoi, Dean, Faculty of Informatics, University of Wollongong, Email: act at wumpus.uow.edu.au; Phone: +(61) 2-42-21-38-43; Fax: +(61) 2-42-21-48-43. 
Applications for the PhD scholarship, including a brief curriculum vitae, transcripts of academic results, and recommendations from three lecturers who know you, should be sent to Ms Cathy McIvor, Personnel Services, University of Wollongong, Northfields Avenue, Wollongong, NSW 2522, Australia by 24th April, 1998. Paolo Frasconi Universita' di Firenze Dipartimento di Sistemi e Informatica Via di Santa Marta 3 50139 Firenze (Italy) tel: +39 (55) 479-6362 fax: +39 (55) 479-6363 http://www.dsi.unifi.it/~paolo/ From niall at zeus.csis.ul.ie Sat Mar 28 08:37:00 1998 From: niall at zeus.csis.ul.ie (Niall Griffith) Date: Sat, 28 Mar 1998 13:37:00 GMT Subject: IEE Colloquium - Neural Nets and MultiMedia Message-ID: <9803281337.AA20770@zeus.csis.ul.ie> Please pass this on to anyone or any group you think may be interested. ============================================================== IEE Colloquium on "Neural Networks in Multimedia Interactive Systems" Thursday 22 October 1998, Savoy Place, London. Call for Papers --------------- The IEE are holding a colloquium at Savoy Place on the use of neural network models in multimedia systems. This is a developing field of importance both to multimedia application developers who want to build more responsive and adaptive systems and to neural network researchers. The aim of the colloquium is to present a range of current neural network applications in the area of interactive multimedia, covering topics including learning, intelligent agents within multimedia systems, data mining, image processing and intelligent application interfaces. Invited Speakers: ----------------- Bruce Blumberg, MIT Media Lab. Jim Austin, York. Russell Beale, Birmingham. 
Call For Papers --------------- Submissions are invited in (but not limited to) the following areas: Adaptive and plastic behaviour in multimedia systems Concept and behaviour learning and acquisition Browsing mechanisms Preference and strategy identification and learning Data mining Image processing in multimedia systems Cross-modal and media representations and processes Intelligent agents Interested parties are invited to submit a two-page (maximum) abstract of their proposed talk to either Dr. Niall Griffith, Department of Computer Science and Information Science, University of Limerick, Limerick, Ireland. email: niall.griffith at ul.ie Telephone: +353 61 202785 Fax: +353 61 330876 or Professor Nigel M Allinson Dept. of Elec. Eng. & Electronics UMIST PO Box 88 Manchester, M60 1QD, UK Voice: (+44) (0) 161-200-4641 Fax: (+44) (0) 161-200-4781/4 Internet: allinson at umist.ac.uk Timetable: ---------- 29th April: Deadline for talk submissions 15th June: Authors notified. 24th November: Colloquium at IEE, Savoy Place, London ===================================================== From skremer at q.cis.uoguelph.ca Sun Mar 29 16:44:29 1998 From: skremer at q.cis.uoguelph.ca (Stefan C. Kremer) Date: Sun, 29 Mar 1998 16:44:29 -0500 (EST) Subject: M.Sc. program in connectionism (Guelph, Ont., CANADA) Message-ID: The University of Guelph (Ontario, Canada) is now accepting applications from students with an interest in artificial neural and connectionist networks for its M.Sc. program in Computing and Information Science. The department has one of the largest neural network research groups in Canada and offers graduate courses in Artificial Neural Networks, Genetic Algorithms, Autonomous Robotics, as well as traditional Artificial Intelligence and other topics in Computer Science. Joint research projects with industry and collaboration with other neural network groups in the area enhance the learning environment. 
To be considered for admission, applicants must have a minimum 73% ('B') average during the previous four semesters of university study (though actual cut-offs are usually much higher) and are expected to possess a four-year honours degree in computer science. Students who are externally funded are especially encouraged to apply, though local funding may also be available for outstanding applicants. Most available spaces will be filled in May for entry in September. To assist in identifying a suitable thesis advisor, applicants are requested to submit descriptions of their research interests. For more information please e-mail Prof. Stefan Kremer at the address listed below. -- Dr. Stefan C. Kremer, Assistant Prof., Dept. of Computing and Information Science University of Guelph, Guelph, Ontario N1G 2W1 WWW: http://hebb.cis.uoguelph.ca/~skremer Tel: (519) 824-4120 Ext. 8913 Fax: (519) 837-0323 E-mail: skremer at snowhite.cis.uoguelph.ca From segevr at post.tau.ac.il Mon Mar 30 06:14:44 1998 From: segevr at post.tau.ac.il (Ronen Segev) Date: Mon, 30 Mar 1998 14:14:44 +0300 (IDT) Subject: Paper: Self-Wiring of Neural Networks. Message-ID: Dear Connectionists, The following paper has been published in Physics Letters A, Vol. 237/4-5 (1998), pp. 307-313. Hard copies can be obtained by sending an email to: segevr at post.tau.ac.il Your comments are welcome! Ronen Segev, email: segevr at post.tau.ac.il, School of Physics & Astronomy, Tel Aviv University. ========================================================================== TITLE: Self-Wiring of Neural Networks AUTHORS: Ronen Segev and Eshel Ben-Jacob. Tel Aviv University, Tel Aviv, Israel. ABSTRACT: In order to form the intricate network of synaptic connections in the brain, the growth cones migrate through the embryonic environment to their targets using chemical communication. As a first step to study self-wiring, 2D model systems of neurons have been used. 
We present a simple model to reproduce the salient features of the 2D systems. The model incorporates random walkers representing the growth cones, which migrate in response to chemotactic substances secreted by the soma and communicate with each other and with the soma by means of attractive chemotactic "feedback". From nic at idsia.ch Mon Mar 30 04:20:51 1998 From: nic at idsia.ch (Nici Schraudolph) Date: Mon, 30 Mar 1998 11:20:51 +0200 Subject: TR: online local step size adaptation Message-ID: <199803300920.LAA00873@idsia.ch> Dear colleagues, the following technical report (10 pages, 143kB gzipped postscript) is available by anonymous ftp from the address given below. Best regards, -- Dr. Nicol N. Schraudolph IDSIA, Corso Elvezia 36, CH-6900 Lugano, Switzerland Tel: +41-91-970-3877 Fax: +41-91-911-9839 http://www.idsia.ch/~nic/ http://www.cnl.salk.edu/~schraudo/ Technical Report IDSIA-09-98: Online Local Gain Adaptation for Multi-Layer Perceptrons -------------------------------------------------------- Nicol N. Schraudolph We introduce a new method for adapting the step size of each individual weight in a multi-layer perceptron trained by stochastic gradient descent. Our technique derives from the K1 algorithm for linear systems (Sutton, 1992), which in turn is based on a diagonalized Kalman Filter. We expand upon Sutton's work in two regards: K1 is a) extended to nonlinear systems, and b) made more efficient by linearizing an exponentiation operation. The resulting ELK1 (extended, linearized K1) algorithm is computationally little more expensive than alternative proposals (Zimmermann, 1994; Almeida et al., 1997, 1998), and does not require an arbitrary smoothing parameter. On a first benchmark problem ELK1 clearly outperforms these alternatives, as well as stochastic gradient descent with momentum, even when the number of floating-point operations required per weight update is taken into account. 
Unlike the method of Almeida et al., ELK1 does not require statistical independence between successive training patterns. ftp://ftp.idsia.ch/pub/nic/olga.ps.gz From bogus@does.not.exist.com Mon Mar 30 08:32:40 1998 From: bogus@does.not.exist.com () Date: Mon, 30 Mar 98 14:32:40 +0100 Subject: No subject Message-ID: PhD Studentship in Neural Networks Neuroendocrine Research Group, Department of Physiology, University of Edinburgh, UK MODELLING THE EFFECTS OF SYNAPTIC INPUT ON A BURST-GENERATING NEURONE This BBSRC special committee studentship will involve the construction of mathematical models of oxytocin neurones, assessing the properties of the models by computer simulation and analytical techniques where possible, and planning, in conjunction with electrophysiologists, the experimental testing of the models, e.g. by devising critical experiments to discriminate between hypotheses. Random synaptic input is an important element of the environment of oxytocin neurones, which have important physiological functions related to their two modes of firing. Tonic activity is involved in regulation of the osmotic pressure of the blood, whereas bursts of intense activity coincide with milk ejections in lactating females, and with contractions during parturition. Mathematical models will be constructed at various levels of complexity, starting with a development of the simple leaky integrator model, via well-known models of bursting neurones, to more biophysically based models. Our general philosophy is to use models which are as simple as possible while still explaining the experimental findings in broad terms. This facilitates the use of analytical techniques to underpin simulation studies of more realistic models. Models are only further elaborated when the simple models are clearly inadequate. 
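The modelling philosophy described above, starting from a simple leaky integrator driven by random synaptic input, can be sketched in a few lines. This is an illustrative toy, not the project's actual model; the function name and all parameter values (time constant, threshold, EPSP size, input rate) are assumptions:

```python
import random

def simulate_leaky_integrator(n_steps=10000, dt=1.0, tau=20.0,
                              threshold=1.0, input_rate=0.05, epsp=0.2,
                              seed=42):
    """Leaky integrator neurone driven by random synaptic input.

    The membrane variable v decays toward rest with time constant tau;
    each random synaptic event adds a fixed EPSP; when v crosses the
    threshold a spike is counted and v is reset. All parameters are
    illustrative, not fitted to oxytocin-cell data.
    """
    rng = random.Random(seed)
    v, spikes = 0.0, 0
    for _ in range(n_steps):
        v += dt * (-v / tau)                # passive leak toward rest
        if rng.random() < input_rate * dt:  # random synaptic event
            v += epsp
        if v >= threshold:                  # threshold crossing -> spike
            spikes += 1
            v = 0.0                         # reset after firing
    return spikes
```

Even this toy shows the qualitative switch the studentship description mentions: with sparse input the cell is nearly silent, while denser input produces sustained tonic firing.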
The studentship will be supervised by Professor Gareth Leng in the Physiology Department, Edinburgh, and David Brown in the Biomathematics Laboratory, Babraham Institute, Cambridge, in collaboration with neuroendocrinologists and other mathematicians in Edinburgh. This is an exciting opportunity to be involved in mathematical work on a neuronal system for which there is much experimental data, whose properties are interesting from both a physiological and a mathematical point of view, and which has so far been little modelled. Applicants should have, or be likely to get, a good degree in a quantitative subject (e.g. mathematics, physics), preferably with an interest in biology. Maintenance grant of about £6,800 per annum (for UK residents). The studentship will start in October 1998 and run for 3 years. The project will involve some travel between Edinburgh and Cambridge, so as to facilitate regular contact with the mathematicians and biologists at the two sites. Further information from Gareth Leng (0131 650 2869, email gareth.leng at ed.ac.uk) or David Brown (01223 832312 ext 224, email david.brown at bbsrc.ac.uk). Applications in the form of a CV (detailing academic performance so far) and the names and addresses of two referees to Professor Gareth Leng, Department of Physiology, Medical School, University of Edinburgh, Teviot Place, Edinburgh, or David Brown, Laboratory of Biomathematics, Babraham Institute, Cambridge CB2 4AT as soon as possible. 
From iiass at tin.it Tue Mar 24 04:20:50 1998 From: iiass at tin.it (IIASS) Date: Tue, 24 Mar 1998 10:20:50 +0100 Subject: call Message-ID: <35177AF2.751B@tin.it>
From prefenes at lbs.ac.uk Tue Mar 3 07:05:14 1998 From: prefenes at lbs.ac.uk (Paul Refenes) Date: Tue, 3 Mar 1998 12:05:14 UTC Subject: Post-Doc and PhD Scholarships at LBS Message-ID: <6D5DBFA6218@nemesis.lbs.ac.uk> Dear Connectionists, Scholarships for the following posts are available at London Business School. =================================================== London Business School Decision Technology Centre Computational Finance Programme * Post-doctoral Research in Computational Finance * PhD Research Scholarship in Computational Finance The Department of Decision Science at London Business School is offering two scholarships on its Computational Finance programme. The research areas include Neural Networks, Non-parametric Statistics, Financial Engineering, Simulation, Optimisation and Decision Analysis. 1. Post-doctoral Research in Computational Finance: The use of advanced decision technologies such as neural networks, non-parametric statistics and adaptive systems for the development of financial risk models in the fixed income and currency markets. Our industrial collaborators are leading European banks and have a special interest in fixed income arbitrage and relative value models. 2. PhD Research Scholarship in Computational Finance: to utilise developments from time series theory and from the non-parametric statistics field for developing distribution theories, statistical diagnostics, and test procedures for neural model identification. London Business School offers students enrolled in the doctoral programme core courses in Research Methodology and Statistical Analysis, as well as a choice of advanced specialised subject-area courses including Financial Economics, Equity Investment, Derivatives Research, etc. 
Candidates with a strong background in mathematics, operations research, computer science, nonparametric statistics, and/or econometrics are invited to apply. Applicants should have at least an upper second class degree and, ideally, an MSc in Computer Science, Statistics/Econometrics, Operations Research, or related areas. Please send a CV and the addresses of two referees by March 20 to: Dr A-P. N. Refenes London Business School Regents Park, London NW1 4SA Tel: +44 171 262 50 50 Fax: +44 171 728 78 75 Application forms for the PhD programme can be obtained from and should be returned to: Dr Raymond Madden London Business School Regents Park, London NW1 4SA Tel: +44 171 262 50 50 Fax: +44 171 728 78 75 The Department ============ The Department of Decision Sciences of the London Business School is actively involved in innovative multi-disciplinary research on the application of new business modelling methodologies to individual and organisational decision-making. By seeking to extend the effectiveness of conventional methods of management science, statistical methods and decision support systems with the latest generation of software platforms, artificial intelligence, neural networks, genetic algorithms and computationally intensive methods, the research themes of the department remain at the forefront of new practice. The Decision Science department of the London Business School is internationally known for its research in the areas of forecasting, optimisation, simulation and intelligent decision support methods. The Computational Finance Research Programme ==================================== The Computational Finance research programme at London Business School is the major centre in Europe for research into neural networks, non-parametric statistics and financial engineering. 
With funding from the DTI, the European Commission and a consortium of leading financial institutions, the research unit has attained a world-wide reputation for collaborative research. Doctoral students work in a team of highly motivated post-doctoral fellows, research fellows, doctoral students and faculty who are amongst Europe's leading authorities in the field. From yx at pics91.cis.pku.edu.cn Wed Mar 4 02:55:25 1998 From: yx at pics91.cis.pku.edu.cn (Yu Xiang) Date: Wed, 4 Mar 1998 15:55:25 +0800 (CST) Subject: ICNN&B'98: Second Call for Papers Message-ID: Final Announcement and Call for Papers 1998 International Conference on Neural Network and Brain (NN&B'98) October 27-30, 1998 Beijing, China http://www.cie-china.org Sponsored by: China Neural Networks Council Co-sponsored by: IEEE NNC IEEE Beijing Section IEEE NNC Beijing RIG INNS-SIG Supported by: National Natural Science Foundation of China CALL FOR PAPERS Over the past decade or so, neural networks have emerged as a research area with active involvement by researchers from a number of different disciplines, including cognitive science, computer science, engineering, mathematics, neurobiology, physics, and statistics. The theme of the International Conference on Neural Network and Brain (NN&B'98) is to provide an opportunity for interdisciplinary collaboration and exchange of ideas which will lead us to address research issues in this area from different perspectives, and to promote their application to industry. This conference is intended to bring together researchers from different disciplines to review the current status of neural networks and to understand higher brain functions. Submissions of papers related, but not limited, to the topics listed below are invited. 
General Topics Adaptive Filtering Architectures Associative Memory Brain Functions Cognitive Science Cellular Neural Networks Computational Intelligence Data Analysis Fuzzy Neural Systems Genetic and Annealing Algorithms Hybrid Systems Image Signal Processing Industrial Automation Intelligent Control and Robotics Learning and Memory Machine Vision Model Identification Motion Analysis Motion Vision Neurobiology Neurocognition Neural Modeling Neurosensors and Wavelets Neurodynamics and Chaos Optimization Parameter Estimation Pattern Recognition and Signal Processing Prediction Sensation and Perception Sensorimotor Systems Speech, Hearing, and Language Supervised/Unsupervised Learning System Identification Time Series Analysis Implementations Hardware Implementation Optical Implementation Parallel and Distributed Computing Environment Simulation Application Areas Business, Chemical Engineering, Communications, Economics and Finance, Industry, Manufacturing, Medicine, OCR, etc. Submission Authors are invited to submit 3 copies of a detailed synopsis of about 1000 words written in English, with a cover sheet giving the author's title, name, affiliation, telephone and fax numbers, mailing address and e-mail address. Synopses and manuscripts should be sent to: Mr. Yu Xiang Center for Information Science Peking University Beijing 100871 P. R. China Tel: 86-10-6275 1937 Fax: 86-10-6275 5654 E-mail: yx at cis.pku.edu.cn Important Dates Submission of synopsis March 31, 1998 Notification of acceptance April 30, 1998 Submission of photo-ready accepted paper June 30, 1998 GENERAL INFORMATION Conference Language The official language is English. No simultaneous translation is available. Conference Schedule October 27 Registration October 28-30 Parallel sessions October 31 One-day excursion to the Great Wall and Ming Tombs (tickets may be purchased at US$35) Venue The conference will be held at the Media Center, located at Fuxing Road, Beijing, China, approximately 3 km west of Tiananmen Square. 
Accommodations A block of rooms has been reserved for the NN&B'98 participants at the Media Center, which is an attractive and modern hotel. Single occupancy: US$40 per day Double occupancy (2 beds): US$56 per day Registration The registration fee covers admission to the conference, reception, refreshments and a copy of the proceedings. Registration Fee by September 10, 1998 / after September 10, 1998 Regular US$360 / US$395 Student US$275 / US$325 In case of cancellation, a fee of US$50 will be deducted from the refund. Cancellations should be made in writing to the NN&B'98 Secretariat by Oct. 15, 1998. No cancellations will be allowed after Oct. 15, 1998, but a copy of the conference proceedings will be mailed to the registrant. On-Site Registration The registration desk at the Media Center will be open during the following hours for on-site registration: October 27, 1998 8:30-23:00 October 28, 1998 7:30-9:00 AM Banquet There will be a banquet at 18:30 on October 30, 1998. Tickets are available at a cost of US$30. Method of Payment All payments including registration, accommodation and tours should be made in US dollars by bank transfer to: Account number: 71404625 Account name: Chinese Institute of Electronics Bank of China, Headquarters Fuchengmen Road, Beijing, China Attn: Ms. Fang Min, NN&B'98 Secretariat Your name and NN&B'98 must be stated on all your payments. Weather The temperature is around 10°C during the day and 2°C at night in October. Visas All travelers to China must have a valid visa. Visas may be obtained from the Chinese Consulate in most major cities. Conference registrants will be mailed an official invitation letter, to be used for the visa application, upon receipt of the registration form. Companions' Program A companions' program will be available during the conference. Airport Transportation On October 27, 1998, the Conference staff will assist you at the Beijing airport in getting a taxi to the Media Center. 
Please provide your flight information in the registration form for this purpose. The taxi fare from the airport to the Media Center is approximately US$20. Please find the NN&B'98 sign at the arrival hall of the airport. POST-CONFERENCE TOURS Two post-conference tours will be offered to the conference attendees and accompanying persons. Tour A: November 1-8 Beijing-Xian-Guilin-Guangzhou (Hong Kong Exiting) US$998 per person for double occupancy US$1159 per person for single occupancy Tour B: November 1-8 Beijing-Hangzhou-Shanghai-Suzhou-Guangzhou (Hong Kong Exiting) US$989 per person for double occupancy US$1139 per person for single occupancy Note: The above fees include accommodations, meals, transportation between cities in China, and airport transfers. The organizer reserves the right to cancel any tour or offer new prices if a minimum number of 10 persons is not reached. FOR FURTHER INFORMATION About papers and the program Mr. Yu Xiang Center for Information Science Peking University Beijing 100871 China Tel: 86-10-6275 1937 Fax: 86-10-6275 5654 E-mail: yx at cis.pku.edu.cn About registration Ms. Min Fang NN&B'98 Secretariat P.O. Box 165, Beijing 100036, China Tel.: (8610) 6828 3463 Fax: (8610) 6828 3458 E-mail: shaz at sun.ihep.ac.cn From jimmy at ecowar.demon.co.uk Wed Mar 4 05:59:50 1998 From: jimmy at ecowar.demon.co.uk (Jimmy Shadbolt) Date: Wed, 4 Mar 1998 10:59:50 +0000 Subject: research position at Econostat, Ltd. (UK) Message-ID: RESEARCH POSITION OFFERED IN FINANCIAL MARKET PREDICTION -------------------------------------------------------- Position Quantitative Analyst Environment A small financial advisory company involved in predicting bonds and equities from a broad base of economic and financial indicators, using state-of-the-art statistical techniques (regression, neural networks, GAs, and others). 
Job Description: Research and development of expected return models, based on in-house techniques, and their expansion into further areas such as Bayesian methods, wavelets, information measures and geometry.

Applications to: Jimmy Shadbolt, jimmy at ecowar.demon.co.uk

Start Date: Immediate

Qualifications
--------------
First degree in a numerate scientific discipline (maths, engineering, physics, statistics, etc).
PhD (or MSc) in econometrics, mathematical statistics, applied mathematics or another related field of study.
Strong interest in financial economics.

Training and experience
-----------------------
Programming in C/C++ and/or S-Plus.
User experience in PC (word processing and spreadsheet) and Unix environments.

Aptitude and Ability
--------------------
Good oral and written communication skills.
A creative, problem-solving approach to research.

Personal Attributes
-------------------
Ability to work without close supervision as a member of a team.
Flexibility to meet changing opportunities in a dynamic research environment.

-- Jimmy Shadbolt, Econostat Ltd, Hennerton House, Wargrave, Berks RG10 8PD, United Kingdom

From jose at tractatus.rutgers.edu Fri Mar 6 12:16:59 1998 From: jose at tractatus.rutgers.edu (Stephen Hanson) Date: Fri, 6 Mar 1998 12:16:59 -0500 Subject: SYS ADM // COGSCI Message-ID: <199803061716.MAA00793@tractatus.rutgers.edu>

******* IMMEDIATE OPENING ******* 3/6/98

RUTGERS-Newark Campus - PSYCHOLOGY DEPARTMENT/COGNITIVE SCIENCE
Systems Administration/Cognitive Science Research

We are looking for an individual to do research in Cognitive Science and to help administer the computing resources of the Psychology Department at Rutgers University (Newark Campus). Resources include a network of Sun workstations, PCs and Macs, printers, a PC voice-mail system and various peripheral devices. The individual will be responsible for installing and debugging software, and for various routine system administration activities.
At least half of their time will be spent on research in Cognitive Science, especially related to connectionist networks (or neural networks) and computational neuroscience. Familiarity with C programming, UNIX system internals (BSD, System V, Solaris, Linux), Windows (95, NT) and local area networks running TCP/IP is required. Image processing or graphics programming experience is a plus. Candidates should possess either a BS/MS in Computer Science, Cognitive Science, AI or another relevant field, or equivalent experience. Salary will be dependent upon qualifications and experience. Rutgers University is an equal opportunity, affirmative action employer. Please send resumes and references to:

Stephen J. Hanson
Department of Psychology
101 Warren Street
Rutgers University
Newark, New Jersey 07102

Direct email inquiries or resumes to: jose at psychology.rutgers.edu. Please include the keyword SYS ADM in the Subject line.

From lba at inesc.pt Fri Mar 6 11:05:16 1998 From: lba at inesc.pt (Luis B. Almeida) Date: Fri, 06 Mar 1998 16:05:16 +0000 Subject: Paper available Message-ID: <35001EBC.5DB14D5D@inesc.pt>

The following paper is available for download:

Parameter Adaptation in Stochastic Optimization
Luis B. Almeida, Thibault Langlois, Jose D. Amaral and Alexander Plakhov

ABSTRACT
Optimization is an important operation in many domains of science and technology. Local optimization techniques typically employ some form of iterative procedure, based on derivatives of the function to be optimized (objective function). These techniques normally involve parameters that must be set by the user, often by trial and error. Those parameters can have a strong influence on the convergence speed of the optimization. In several cases, a significant speed advantage could be gained if one could vary these parameters during the optimization, to reflect the local characteristics of the function being optimized.
Some parameter adaptation methods have been proposed for this purpose, for deterministic optimization situations. For stochastic (also called on-line) optimization situations, there appears to be no simple and effective parameter adaptation method. This paper proposes a new method for parameter adaptation in stochastic optimization. The method is applicable to a wide range of objective functions, as well as to a large set of local optimization techniques. We present the derivation of the method, details of its application to gradient descent and to some of its variants, and examples of its use in the gradient optimization of several functions, as well as in the training of a multilayer perceptron by on-line backpropagation.

The paper has 24 pages, and is available in compressed postscript form (162 kB) at ftp://146.193.2.131/pub/lba/papers/adsteps.ps.gz and in uncompressed postscript form (956 kB) at ftp://146.193.2.131/pub/lba/papers/adsteps.ps Comments are welcome.

Luis B. Almeida
INESC
R. Alves Redol, 9
1000 Lisboa, Portugal
Phone: +351-1-3100246, +351-1-3544607
Fax: +351-1-3145843
E-mail: lba at inesc.pt
http://ilusion.inesc.pt/~lba/lba.html
------------------------------------------------------------------------
*** Indonesia is killing innocent people in East Timor *** see http://amadeus.inesc.pt/~jota/Timor/

From jin at mail.utexas.edu Sun Mar 8 17:09:48 1998 From: jin at mail.utexas.edu (Hiroshi Jin) Date: Sun, 8 Mar 1998 16:09:48 -0600 Subject: No subject Message-ID:

To users of the tlearn neural network simulator: Version 1.0.1 of the tlearn simulator is now available for ftp at two sites:
ftp://ftp.psych.ox.ac.uk/pub/tlearn/ (Old World users)
ftp://crl.ucsd.edu/pub/neuralnets/tlearn (New World users)
This version mostly involves bug fixes to the earlier version.
A complete user manual for the software, plus a set of tutorial exercises, is available in: Plunkett and Elman (1997) "Exercises in Rethinking Innateness: A Handbook for Connectionist Simulations". MIT Press.

For WWW access, the San Diego tlearn page is http://crl.ucsd.edu/innate/tlearn.html This contains a link to the directory containing the binaries: ftp://crl.ucsd.edu/pub/neuralnets/tlearn Then click on the filename(s) to download.

For direct ftp/fetch access via anonymous login:
- ftp/fetch to crl.ucsd.edu (132.239.63.1)
- login anonymous/email address
- cd pub/neuralnets/tlearn

At the Oxford site: ftp://ftp.psych.ox.ac.uk/pub/tlearn/wintlrn1.0.1.zip is a zip archive that contains the Windows 95 tlearn executable. ftp://ftp.psych.ox.ac.uk/pub/tlearn/wintlrn.zip is a link that always points to the latest version, in this case wintlrn1.0.1.zip. The Mac version is in the following location: ftp://ftp.psych.ox.ac.uk/pub/tlearn/mac_tlearn_1.0.1.sea.hqx

From terry at salk.edu Mon Mar 9 17:59:02 1998 From: terry at salk.edu (Terry Sejnowski) Date: Mon, 9 Mar 1998 14:59:02 -0800 (PST) Subject: NEURAL COMPUTATION 10:3 Message-ID: <199803092259.OAA03105@helmholtz.salk.edu>

Neural Computation - Contents Volume 10, Number 3 - April 1, 1998

ARTICLE
Towards A Biophysically Plausible Bidirectional Hebbian Rule - Norberto M. Grzywacz and Pierre-Yves Burgi

NOTES
Axon Guidance: Stretching Gradients to the Limit - Geoffrey Goodhill and Herwig Baier
Equivalence of a Sprouting-And-Retraction Model and Correlation-Based Plasticity Models of Neural Development - Kenneth D. Miller
Axonal Processes and Neural Plasticity: A Reply - T. Elliott, C.I. Howarth and N. R.
Shadbolt

LETTERS
Synaptic Delay Learning in Pulse-Coupled Neurons - Harold Huning, Helmut Glunder and Gunther Palm
Neural Processing in the Subsecond Time Range in the Temporal Cortex - Kiyohiko Nakamura
Temporal-to-Rate-Code Conversion by Neuronal Phase-Locked Loops - Ehud Ahissar
Deformation Theory of the Dynamic Link Matching - Toru Aonishi and Koji Kurata
Constrained Optimization for Neural Map Formation: A Unifying Framework for Weight Growth and Normalization - Laurenz Wiskott and Terrence J. Sejnowski
Breaking Rotational Symmetry in a Self-Organizing Map Model for Orientation Map Development - M. Riesenhuber, H.-U. Bauer, D. Brockmann and T. Geisel
Nonlinear Time-Series Prediction with Missing and Noisy Data - Volker Tresp and Reimar Hofmann
Issues in Bayesian Analysis of Neural Network Models - Peter Muller and David Rios Insua

-----
ABSTRACTS - http://mitpress.mit.edu/NECO/

SUBSCRIPTIONS - 1998 - VOLUME 10 - 8 ISSUES

                 USA    Canada*   Other Countries
Student/Retired  $50    $53.50    $78
Individual       $82    $87.74    $110
Institution      $285   $304.95   $318
* includes 7% GST

(Back issues from Volumes 1-9 are regularly available for $28 each to institutions and $14 each for individuals. Add $5 for postage per issue outside the USA and Canada. Add 7% GST for Canada.)

MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 258-6779 mitpress-orders at mit.edu
-----

From franz at homer.njit.edu Mon Mar 9 18:46:44 1998 From: franz at homer.njit.edu (Franz Kurfess) Date: Mon, 9 Mar 1998 18:46:44 -0500 Subject: CfP Special Issue "Neural Networks and Structured Knowledge" Message-ID: <199803092346.SAA21513@vector.njit.edu>

Special Issue "Neural Networks and Structured Knowledge" in Applied Intelligence: The International Journal of Artificial Intelligence, Neural Networks, and Complex Problem-Solving Techniques

Call for Contributions

The submission of papers is invited for a special issue on "Neural Networks and Structured Knowledge" of the Applied Intelligence journal.
Issue Theme

The representation and processing of knowledge in computers has traditionally concentrated on symbol-oriented approaches, where knowledge items are associated with symbols. These symbols are grouped into structures, reflecting the important relationships between the knowledge items. Processing of knowledge then consists of manipulating the symbolic structures, and the result of the manipulations can be interpreted by the user. Whereas this approach has seen some remarkable successes, there are also domains and problems where it does not seem adequate. Some of the problems are computational complexity, rigidity of the representation, the difficulty of reconciling the artificial model with the real world, the integration of learning into the model, and the treatment of incomplete or uncertain knowledge. Neural networks, on the other hand, have advantages that make them good candidates for overcoming some of the above problems. Whereas approaches using neural networks for the representation and processing of structured knowledge have been around for quite some time, especially in the area of connectionism, they frequently suffer from problems with expressiveness, knowledge acquisition, adaptivity and learning, or human interpretation. In recent years much progress has been made in the theoretical understanding and the construction of neural systems capable of representing and processing structured knowledge in an adequate way, while maintaining essential capabilities of neural networks such as learning, tolerance of noise, treatment of inconsistencies, and parallel operation. The theme of this special issue comprises
* the investigation of the underlying theoretical foundations,
* the implementation and evaluation of methods for representation and processing of structured knowledge with neural networks, and
* applications of such approaches in various domains.

Topics of Interest

The list below gives some examples of intended topics.
* Concepts and Methods:
  o extraction, injection and refinement of structured knowledge from, into and by neural networks
  o inductive discovery/formation of structured knowledge
  o combining symbolic machine learning techniques with neural learning paradigms to improve performance
  o classification, recognition, prediction, matching and manipulation of structured information
  o neural methods that use or discover structural similarities
  o neural models to infer hierarchical categories
  o structuring of network architectures: methods for introducing coarse-grained structure into networks, unsupervised learning of internal modularity
* Application Areas:
  o medical and technical diagnosis: discovery and manipulation of structured dependencies, constraints, explanations
  o molecular biology and chemistry: prediction of molecular structure unfolding, classification of chemical structures, DNA analysis
  o automated reasoning: robust matching, manipulation of logical terms, proof plans, search space reduction
  o software engineering: quality testing, modularisation of software
  o geometrical and spatial reasoning: robotics, structured representation of objects in space, figure animation, layout of objects
  o other applications that use, generate or manipulate structures with neural methods: structures in music composition, legal reasoning, architectures, technical configuration, ...

The central theme of this issue will be the treatment of structured information using neural networks, independent of the particular network type or processing paradigm. Thus the theme is orthogonal to the question of connectionist/symbolic integration, and is not intended as a continuation of the more philosophically oriented discussion of symbolic vs. subsymbolic representation and processing.

Submission Process

Prospective authors should send an electronic mail message indicating their intent to submit a paper to the guest editor of the special issue, Franz J. Kurfess (kurfess at cis.njit.edu).
This message should contain a preliminary abstract and three to five keywords. Six hard copies of the final manuscript should be sent to the guest editor (not to the Applied Intelligence editorial office):

Prof. Franz J. Kurfess
Department of Computer and Information Science
New Jersey Institute of Technology
University Heights
Newark, NJ 07102-1982
Phone: (973) 596 5767
Fax: (973) 596 5777
Email: kurfess at cis.njit.edu
WWW: http://www.cis.njit.edu/~franz

To speed up the reviewing process, authors should also send a PostScript version of the paper via email to the guest editor. Prospective authors can find further information about the journal on the home page http://kapis.www.wkap.nl/journalhome.htm/0924-669X

Schedule
Paper submission deadline: May 1, 1998
Review decision by: July 31, 1998
Final manuscript due: August 31, 1998
Tentative publication date: November 1998

From lba at inesc.pt Tue Mar 10 13:18:21 1998 From: lba at inesc.pt (Luis B. Almeida) Date: Tue, 10 Mar 1998 18:18:21 +0000 Subject: new version of paper Message-ID: <350583ED.D61BBD84@inesc.pt>

A new version of the paper "Parameter Adaptation in Stochastic Optimization" by Luis B. Almeida, Thibault Langlois, Jose D. Amaral and Alexander Plakhov is available. In the new version, a few typos in equations have been corrected, following kind remarks from a reader. A few other details were also ironed out. The new version has replaced the old one at the ftp site, so you should use the same addresses to download it: compressed postscript form (162 kB) at ftp://146.193.2.131/pub/lba/papers/adsteps.ps.gz and uncompressed postscript form (956 kB) at ftp://146.193.2.131/pub/lba/papers/adsteps.ps Comments are welcome. Luis B. Almeida Phone: +351-1-3100246, +351-1-3544607 INESC Fax: +351-1-3145843 R.
Alves Redol, 9 E-mail: lba at inesc.pt 1000 Lisboa, Portugal http://ilusion.inesc.pt/~lba/lba.html ------------------------------------------------------------------------ *** Indonesia is killing innocent people in East Timor *** see http://amadeus.inesc.pt/~jota/Timor/

From P.N.Roper at lboro.ac.uk Tue Mar 10 21:21:58 1998 From: P.N.Roper at lboro.ac.uk (Peter Roper) Date: Wed, 11 Mar 1998 02:21:58 +0000 Subject: Postdoc available Message-ID: <3505F546.1142@lboro.ac.uk>

RESEARCH ASSOCIATE
Neurodynamical Model Of Locomotion In A Simple Vertebrate
Department Of Mathematical Sciences

A postdoctoral research associate is required to work for two years with Professor P C Bressloff and Dr S Coombes in the Nonlinear and Complex Systems Group, Department of Mathematical Sciences, Loughborough University, UK. The post is funded by EPSRC and will be available from 1 May 1998, or as soon as possible thereafter. The aim of the project is to develop a general dynamical theory of pulse-coupled oscillator networks, and to apply this to a neurobiological model of the swimming and struggling behaviour of the Xenopus tadpole. The work is in collaboration with Professor Alan Roberts, School of Biological Sciences, Bristol University. The appointment will be on the Research Grade 1A salary scale £15,159 - £22,785 per annum, dependent on qualifications and experience. More information about the Nonlinear and Complex Systems Group may be found at http://info.lboro.ac.uk/departments/ma/research/ncsg/index.html and specifically about the post at http://info.lboro.ac.uk/departments/ma/research/ncsg/job.html Informal enquiries and requests for application forms should be addressed to Professor P C Bressloff, Department of Mathematical Sciences, Loughborough University, Loughborough, Leicestershire, LE11 3TU, UK tel: 01509-223188, fax: 01509-223969, email P.C.Bressloff at lboro.ac.uk. Please quote reference MA/188/W. Closing date 13 April 1998.
-- ______________________________________________________________ Peter Roper Dept Mathematical Sciences Loughborough University LEICS LE11 3TU UK Phone (+44) (0) 1509 228206 work email P.N.Roper at lboro.ac.uk http://www.lboro.ac.uk/departments/ma/pg/peterRoper.html ______________________________________________________________

From Johan.Suykens at esat.kuleuven.ac.be Wed Mar 11 08:41:24 1998 From: Johan.Suykens at esat.kuleuven.ac.be (Johan.Suykens@esat.kuleuven.ac.be) Date: Wed, 11 Mar 1998 14:41:24 +0100 Subject: International Workshop Message-ID: <199803111341.OAA00032@euler.esat.kuleuven.ac.be>

Second call for papers

International Workshop on *** ADVANCED BLACK-BOX TECHNIQUES FOR NONLINEAR MODELING: THEORY AND APPLICATIONS *** with !!! TIME-SERIES PREDICTION COMPETITION !!!

Date: July 8-10, 1998
Place: Katholieke Universiteit Leuven, Belgium
On-line Info: http://www.esat.kuleuven.ac.be/sista/workshop/

Organized at the Department of Electrical Engineering (ESAT-SISTA) and the Interdisciplinary Center for Neural Networks (ICNN) in the framework of the project KIT and the Belgian Interuniversity Attraction Pole IUAP P4/02. In cooperation with the IEEE Circuits and Systems Society.

* GENERAL SCOPE
The rapid growth of the field of neural networks, fuzzy systems and wavelets is offering a variety of new techniques for modeling nonlinear systems in the broad sense. These topics have been investigated from different points of view, including statistics, identification and control theory, approximation theory, signal processing, nonlinear dynamics, information theory, physics and optimization theory, among others. The aim of this workshop is to serve as an interdisciplinary forum for bringing together specialists in these research disciplines. Issues related to the fundamental theory as well as real-life applications will be addressed at the workshop.
* TIME-SERIES PREDICTION COMPETITION
Within the framework of this workshop a time-series prediction competition will be held. The results of the competition will be announced during the workshop, where the winner will receive an award. Participants in the competition are asked to submit their predicted data together with a short description and references of the methods used. In order to stimulate wide participation in the competition, attendance at the workshop is not mandatory but is of course encouraged. All information about this contest is available at http://www.esat.kuleuven.ac.be/sista/workshop/ .

* INVITED SPEAKERS (confirmed)
L. Feldkamp (Ford Research, USA) - Extended Kalman filtering
C. Micchelli (IBM T.J. Watson, USA) - Density estimation
U. Parlitz (Gottingen, Germany) - Nonlinear time-series analysis
J. Sjoberg (Goeteborg, Sweden) - Nonlinear system identification
S. Tan (Beijing, China) - Wavelet-based system modeling
V. Vapnik (AT&T Labs-Research, USA) - Support vector method of function estimation
M. Vidyasagar (Bangalore, India) - Statistical learning theory
V. Wertz (Louvain-la-Neuve, Belgium) - Fuzzy modeling

A workshop book containing the invited talks will be published by Kluwer and will be available at the workshop.
* TOPICS include but are not limited to: nonlinear system identification, backpropagation, time series analysis, learning and nonlinear optimization, multilayer perceptrons, recursive algorithms, radial basis function networks, extended Kalman filtering, fuzzy modelling, embedding dimension, wavelets, subspace methods, piecewise linear models, identifiability, mixture of experts, model selection and validation, universal approximation, simulated annealing, recurrent networks, genetic algorithms, regularization, forecasting, Bayesian estimation, frequency domain identification, density estimation, classification, information geometry, real-life applications, generalization, software.

* REGISTRATION
Registration fee: 6500 BF for IEEE members and students, and 7500 BF for others (1 US dollar is approximately 37.5 BF). It includes the workshop book, proceedings, lunches, dinners, and refreshments/coffee. For the registration form and payment details, see http://www.esat.kuleuven.ac.be/sista/workshop/ .

* HOTEL INFORMATION
A block of rooms has been reserved at the Begijnhof Congreshotel, New Damshire, Holiday Inn and Ibis. For contact information for the hotels, see http://www.esat.kuleuven.ac.be/sista/workshop/ .

* IMPORTANT DATES
Deadline for paper submission: April 2, 1998
Notification of acceptance: May 4, 1998
Workshop: July 8-10, 1998
Time-series competition, deadline for data submission: March 20, 1998

* Chairman: Johan Suykens, Katholieke Universiteit Leuven, Departement Elektrotechniek - ESAT/SISTA, Kardinaal Mercierlaan 94, B-3001 Leuven (Heverlee), Belgium. Tel: 32/16/32 18 02, Fax: 32/16/32 19 70, Email: Johan.Suykens at esat.kuleuven.ac.be

Program Committee: B. De Moor, E. Deprettere, D. Roose, J. Schoukens, S. Tan, J. Vandewalle, V. Wertz, Y.
Yu From cdr at lobimo.rockefeller.edu Wed Mar 11 11:07:35 1998 From: cdr at lobimo.rockefeller.edu (George Reeke) Date: Wed, 11 Mar 1998 11:07:35 -0500 Subject: Postdoctoral Position Available Message-ID: <980311110735.ZM1252@grane.rockefeller.edu> POSTDOCTORAL ASSOCIATE Laboratory of Biological Modelling The Rockefeller University A postdoctoral position is available for an individual interested in collaborating with the Lab Head, Dr. George Reeke, to develop biologically realistic neural models for behaviors in which temporal interval and pattern recognition and production are main components. Current work in the Laboratory is aimed at applying theoretical principles of neuronal group selection ("neural Darwinism") to generate models for a variety of paradigmatic cases that can be tested by comparing results of computational simulations with data from psychophysical experiments. Applicants should have a Ph.D. in a relevant area of neurobiology or psychology and strong computer skills. Experience with realistic neural simulations or neural networks is desirable. Starting date is flexible. The position is available for 1-2 years depending on accomplishment. Send curriculum vitae, statement of interests, list of publications, and names of three references by regular mail, e-mail, or FAX to the undersigned. The Rockefeller University is an Equal Opportunity Employer. 
George Reeke
Laboratory of Biological Modelling
The Rockefeller University
1230 York Avenue
New York, NY 10021
phone: (212)-327-7627
FAX: (212)-327-7469
email: reeke at lobimo.rockefeller.edu

From sala at digame.dgcd.doc.ca Wed Mar 11 08:17:28 1998 From: sala at digame.dgcd.doc.ca (sala@digame.dgcd.doc.ca) Date: Wed, 11 Mar 1998 08:17:28 -0500 Subject: Research Position Message-ID: <199803111317.IAA10821@digame.doc.ca>

Announcement of Research Position in Neural Network Research

Neural Network Research Scientist, Communications Research Center, Ottawa, Ontario

Essential Requirements
A doctoral degree or equivalent from a recognized university in Physics, Electrical Engineering, or Computer Science. A working knowledge of English or French is required for this position. Experience in conducting independent research in the field of neural networks and in the application of neural networks to problems in pattern recognition and classification. An ability to communicate scientific knowledge effectively, both orally and in writing. Applicants may be required to undergo security clearance prior to hiring.

Desirable Requirements
A knowledge of neural network architectures and learning paradigms and their role in different types of pattern recognition tasks. A thorough knowledge of computer systems and operating systems (Windows NT, Windows 95, SunOS, and/or Solaris), experience with advanced computer software for the purposes of simulating and controlling neural network circuitry, experience with the design and testing of communications devices, experience with digital signal processing software, and a working knowledge of the MATLAB programming environment. In addition, the successful candidate is expected to show a high degree of personal motivation, initiative, an ability to work as a member of a research team, and an ability to form and maintain working interpersonal relationships.
This position is currently funded for a period of three years, with a strong possibility of an extension beyond that period. The successful candidate will be paid in accordance with the salary scales appropriate to the SE-RES-01 category (salary range $37,036 to $48,727 Cdn.) and will enjoy all the benefits associated with that position. The Communications Research Center is the premier communications research facility of Industry Canada and is situated on the outskirts of Ottawa at Shirley Bay. The CRC shares the research site with the Defense Research Establishment Ottawa and the Canadian Space Agency's David Florida Lab. The site is served by public transportation and is close to residential neighborhoods with ample rental properties.

Please reply by:
(1) e-mail, with your resume as an attachment, to research at digame.dgcd.crc.ca
(2) fax of your resume and cover letter to 613-991-0246
(3) if preferred, you may submit your resume document directly by anonymous FTP to digame.dgcd.crc.ca, placing the document in the folder "resume". PLEASE BE SURE to send a brief accompanying e-mail to research at digame.dgcd.crc.ca indicating that you have uploaded your resume, your name, and the exact name of the file deposited on digame.dgcd.crc.ca.

From zhangw at redwood.rt.cs.boeing.com Wed Mar 11 19:01:28 1998 From: zhangw at redwood.rt.cs.boeing.com (Wei Zhang) Date: Wed, 11 Mar 1998 16:01:28 -0800 Subject: dissertation available Message-ID: <199803120001.QAA12301@darwin.network-b>

Dear colleagues, I have finally decided to make my dissertation available on the Internet. It is in the neuroprose archive at ftp://archive.cis.ohio-state.edu/pub/neuroprose/Thesis/zhang.rl4jss.ps.Z Here are the title and abstract. Thanks.

Reinforcement Learning for Job-Shop Scheduling
Wei Zhang
Oregon State University, Department of Computer Science
May 1996, 173 pages, double-sided.

Abstract.
This dissertation studies applying reinforcement learning algorithms to automatically discover good domain-specific heuristics for job-shop scheduling. It focuses on the NASA space shuttle payload processing problem. The problem involves scheduling a set of tasks to satisfy a set of temporal and resource constraints while also seeking to minimize the total length (makespan) of the schedule. The approach described in the dissertation employs a repair-based scheduling problem space that starts with a critical-path schedule and incrementally repairs constraint violations, with the goal of finding a short conflict-free schedule. The temporal difference (TD) learning algorithm $TD(\lambda)$ is applied to train a neural network to learn a heuristic evaluation function for choosing repair actions over schedules. This learned evaluation function is used by a one-step lookahead search procedure to find solutions to new scheduling problems. Several important issues that affect the success and the efficiency of learning have been identified and studied in depth. These issues include schedule representation, network architectures, and learning strategies. A number of modifications to the $TD(\lambda)$ algorithm are developed to improve learning performance. Learning is investigated based on both hand-engineered features and raw features. For learning from raw features, a time-delay neural network architecture is developed to extract features from irregular-length schedules. The learning approach is evaluated on synthetic problems and on problems from a NASA space shuttle payload processing task. The evaluation function is learned on small problems and then applied to solve larger problems. Both learning-based schedulers (using hand-engineered features and raw features, respectively) perform better than the best existing algorithm for this task: Zweben's iterative repair method. It is important to understand why TD learning works in this application.
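For readers unfamiliar with the core algorithm, the following is a toy illustration of tabular TD(lambda) with accumulating eligibility traces. It is not the dissertation's method: the dissertation trains a neural network evaluation function over schedule features rather than a value table, and the two-state chain task in the usage below is an invented stand-in chosen only to show the update rule.

```python
def td_lambda(episodes, n_states, alpha=0.1, gamma=1.0, lam=0.8):
    """Estimate state values V from sampled episodes using TD(lambda).

    Each episode is a list of (state, reward, next_state) transitions,
    with next_state=None at termination. This is a minimal sketch of
    the general algorithm, not the dissertation's network-based version.
    """
    V = [0.0] * n_states
    for episode in episodes:
        e = [0.0] * n_states  # eligibility traces, reset per episode
        for s, r, s_next in episode:
            # TD error: reward plus discounted next value, minus current value
            next_v = V[s_next] if s_next is not None else 0.0
            delta = r + gamma * next_v - V[s]
            e[s] += 1.0  # accumulating trace for the visited state
            for i in range(n_states):
                V[i] += alpha * delta * e[i]  # credit all recently visited states
                e[i] *= gamma * lam           # decay traces
    return V
```

For example, repeatedly observing the chain 0 -> 1 -> terminal with a final reward of 1 drives both state values toward 1 (with gamma = 1), state 0 being credited through its decayed trace.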
Several performance measures are employed to investigate learning behavior. We verified that TD learning works properly in capturing the evaluation function. It is concluded that TD learning along with a set of good features and a proper neural network is the key to this success. The success shows that reinforcement learning methods have the potential for quickly finding high-quality solutions to other combinatorial optimization problems. #=====================================================================# | Dr. Wei Zhang | ___ ___ ___ ___ ___ | | Computer Scientist | /__// //__ / /\ // _ | | Adaptive Systems | /__//__//__ _/_ / //__/ | | Applied Research & Technology | | | | P.O. Box 3707, M/S 7L-66 | | Voice: (425) 865-2602 | Seattle, WA 98124-2207 | | FAX: (425) 865-2964 | -- or for ground express mail -- | | | 2710 160th Ave. S.E., Bldg. 33-07 | | zhangw at redwood.rt.cs.boeing.com | Bellevue, WA 98008 | #=====================================================================# From recruit at phz.com Thu Mar 12 13:17:31 1998 From: recruit at phz.com (PHZ Recruiting) Date: Thu, 12 Mar 98 13:17:31 EST Subject: financial modeling job available in Boston area Message-ID: <9803121817.AA11786@phz.com> Applied research position available immediately in QUANTITATIVE MODELING OF FINANCIAL MARKETS at PHZ CAPITAL PARTNERS LP March, 1998 PHZ is a small Boston area company founded in 1993 which invests client money in global financial markets using state-of-the-art proprietary statistical models. Our principals are Tomaso Poggio, Jim Hutchinson, and Xiru Zhang, and our partners include Commodities Corporation LLC, a Goldman Sachs company. PHZ's strong trading performance to date has led to exceptional client interest and asset growth. To further expand our business, PHZ is now looking for a talented, hard working person to join our research and development team to work on our next generation of trading systems. The successful applicant for this position will have a Ph.D. 
in computer science, math, finance, or a related field, or 4-5 years of work experience in an applied research setting. Experience with machine learning / advanced statistical modeling techniques (e.g. neural networks, genetic algorithms, etc) and their application to real world numerical data sets is required, as are strong software engineering skills (esp. on PCs and Unix). Knowledge of financial markets is also a plus. Depending on candidate interests and skills, this position will involve or lead into basic research and application of sophisticated model development tools, exploratory data gathering and analysis, development of our trading and risk management software platform, and/or trading and monitoring of live models. The growth potential of this position is large, both in terms of responsibilities and compensation. Initial compensation will be competitive based on qualifications, possibly including partnership equity. Interested applicants should email resumes (ascii format) to recruiting at phz.com, or send by US mail to: Attn: Recruiting PHZ Capital Partners LP 111 Speen St, Suite 313 Framingham, MA 01701 USA From lbl at nagoya.riken.go.jp Fri Mar 13 02:49:43 1998 From: lbl at nagoya.riken.go.jp (Bao-Liang Lu) Date: Fri, 13 Mar 1998 16:49:43 +0900 Subject: TR available: Task Decomposition and Module Combination Message-ID: <9803130749.AA13600@xian> The following Technical Report is available via anonymous FTP. 
FTP-host: ftp.bmc.riken.go.jp
FTP-file: pub/publish/Lu/lu-ieee-tnn-98-2nd-rev.ps.gz
==========================================================================
TITLE: Task Decomposition and Module Combination Based on Class Relations: A Modular Neural Network for Pattern Classification
BMC Technical Report BMC-TR-98-1
AUTHORS: Bao-Liang Lu and Masami Ito
ORGANISATION: Bio-Mimetic Control Research Center, The Institute of Physical and Chemical Research (RIKEN)
ABSTRACT: In this paper, we propose a new method for decomposing pattern classification problems based on the class relations among the training data. Using this method, we can divide a $K$-class classification problem into a series of ${K\choose 2}$ two-class problems: discriminating class ${\cal C}_{i}$ from class ${\cal C}_{j}$ for $i=1,\,\cdots,\,K-1$ and $j=i+1,\,\cdots,\,K$, while the training data belonging to the other $K-2$ classes are ignored. If the two-class problem of discriminating class ${\cal C}_{i}$ from class ${\cal C}_{j}$ is still hard to learn, we can further break it down into a set of two-class subproblems as small as we wish. Since each of the two-class problems can be treated as a completely separate classification problem under the proposed learning paradigm, the two-class problems can be learned by different network modules in parallel. We also propose two module combination principles that give practical guidelines for integrating the individually trained modules. After each of the two-class problems has been learned by a network module, we can easily integrate all of the trained modules into a min-max modular (${\rm M}^{3}$) network according to the module combination principles and obtain a solution to the original problem. Consequently, a large-scale and complex $K$-class classification problem can be solved effortlessly and efficiently by learning a series of smaller and simpler two-class problems in parallel.
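The pairwise decomposition and min-max combination described in the abstract can be sketched in a few lines of Python. This is only an illustration: the pairwise modules here are toy nearest-centroid scorers with a sigmoid output, not the trained network modules of the report, and all function names and data are invented for the example.

```python
import math
from itertools import combinations

def centroid(vectors):
    n = len(vectors)
    return [sum(v[k] for v in vectors) / n for k in range(len(vectors[0]))]

def dist(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def train_modules(data):
    """One module per unordered class pair (i, j), trained while the
    other K-2 classes are ignored, as in the decomposition above."""
    cents = {c: centroid(vs) for c, vs in data.items()}
    modules = {}
    for i, j in combinations(sorted(data), 2):
        ci, cj = cents[i], cents[j]
        # Output > 0.5 when x looks more like class i than class j.
        modules[(i, j)] = (lambda x, ci=ci, cj=cj:
                           1.0 / (1.0 + math.exp(dist(x, ci) - dist(x, cj))))
    return modules

def m3_classify(modules, classes, x):
    # Min-max combination: class i's score is the MIN over all modules
    # involving class i; the predicted class is the MAX of these scores.
    scores = {}
    for i in classes:
        outs = [m(x) if a == i else 1.0 - m(x)
                for (a, b), m in modules.items() if i in (a, b)]
        scores[i] = min(outs)
    return max(scores, key=scores.get)
```

With a few well-separated point clouds, taking the minimum over each class's pairwise modules and then the argmax over classes recovers the correct label, mirroring the min-max integration principle.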
(38 pages, 4.8 MB) Any comments are appreciated. Bao-Liang Lu
======================================
Bio-Mimetic Control Research Center
The Institute of Physical and Chemical Research (RIKEN)
2271-130, Anagahora, Shimoshidami, Moriyama-ku
Nagoya 463-0003, Japan
Tel: +81-52-736-5870 Fax: +81-52-736-5871
Email: lbl at bmc.riken.go.jp

From blair at csee.uq.edu.au Fri Mar 13 03:26:01 1998 From: blair at csee.uq.edu.au (Alan Blair) Date: Fri, 13 Mar 1998 18:26:01 +1000 (EST) Subject: CFP - Simulated Evolution And Learning (SEAL'98) Message-ID:

***********************************
*  C A L L   F O R   P A P E R S  *
***********************************

The Second Asia-Pacific Conference on Simulated Evolution And Learning (SEAL'98), Canberra, Australia, 24-27 November 1998. Hosted by the School of Computer Science, University College, the University of New South Wales, Australian Defence Force Academy, Canberra, ACT 2600, Australia, in cooperation with COMPLEX SYSTEMS '98, Sydney, Australia, Nov 30 - Dec 4, 1998. An IEEE ACT Section Conference. URL: http://www.cs.adfa.oz.au/conference/seal98

Aims and Scope
--------------
Evolution and learning are two fundamental forms of adaptation. This conference follows the successful SEAL'96 (The First Asia-Pacific Conference on Simulated Evolution And Learning) in Taejon, Korea, 9-12 November 1996, and aims to explore these two forms of adaptation and their roles and interactions in adaptive systems. Cross-fertilisation between evolutionary learning and other machine learning approaches, such as neural network learning, reinforcement learning, decision tree learning, fuzzy system learning, etc., will be strongly encouraged by the conference. The other major theme of the conference is optimisation by evolutionary or hybrid evolutionary approaches. The conference will feature both academic and application streams in order to encourage more interaction between researchers and practitioners in the field of simulated evolution and learning.
To provide timely feedback, the applications stream will use a two-stage reviewing process featuring preliminary acceptance by abstract and final acceptance by complete paper. The refereeing panel for the applications stream will consist of eminent practitioners in the field. The topics of interest to this conference include but are not limited to the following: 1. Evolutionary Learning + Fundamental Issues in Evolutionary Learning (e.g., Generalisation, Scalability, and Computational Complexity) + Co-Evolutionary Learning + Modular Evolutionary Learning Systems + Classifier Systems + Artificial Immune Systems + Representation Issue in Evolutionary Learning (e.g., rules, trees, graphs, etc.) + Interactions Between Learning and Evolution + Comparison between Evolutionary Learning and Other Learning Approaches (Neural Network, Decision Tree, Reinforcement Learning, etc.) 2. Evolutionary Optimisation + Global (Numerical/Function) Optimisation + Combinatorial Optimisation (e.g., scheduling, allocation, planning, packing, transportation, and various tree/graph problems.) + Comparison between evolutionary and non-evolutionary optimisation algorithms + Hybrid Optimisation Algorithms 3. Hybrid Learning + Evolutionary Artificial Neural Networks + Evolutionary Fuzzy Systems + Combinations Between Evolutionary and Reinforcement Learning + Combinations Between Evolutionary and Decision Tree Learning + Evolutionary Clustering and Unsupervised Learning + Genetic Programming + Other Hybrid Learning Systems 4. Adaptive Systems + Complexity in Adaptive Systems + Evolutionary Robotics + Artificial Ecology 5. Evolutionary Economics and Games + Analysis and Simulation in Evolutionary Economics, Finance and Marketing + Evolutionary Games + Evolutionary Computation Techniques in Economics, Finance and Marketing 6. 
Theoretical Issues in Evolutionary Computation + Convergence and Convergence Rate of Evolutionary Algorithms + Computational Complexity of Evolutionary Algorithms + Self-Adaptation in Evolutionary Algorithms 7. Evolvable Hardware (EHW) + FPGA Implementation of EHW + Algorithms for EHW + EHW Systems and Chips 8. Applications + Novel Applications of Evolutionary Techniques + Optimisation Algorithms for Real-World Problems + Solving classical OR (Operations Research) problems by Evolutionary Algorithms + Pattern Classification and Recognition + Time-Series Prediction + System Identification + Others Important Dates --------------- Regular Papers 03 July 1998 Deadline for submission of papers (<=8 pages) 21 Aug. 1998 Notification of acceptance 25 Sep. 1998 Deadline for camera-ready copies of accepted papers 24-27 Nov. 1998 Conference sessions Applications Papers 19 June 1998 Deadline for submission of applications abstracts (<= 1 page) (abstracts to be submitted by email) 03 July 1998 Notification of preliminary acceptance of applications papers 31 July 1998 Deadline for submission of applications papers (<=8 pages) 21 Aug. 1998 Notification of acceptance of applications papers 25 Sep. 1998 Deadline for camera-ready copies of applications papers 24-27 Nov. 1998 Conference sessions Paper Submission ---------------- Applications paper abstracts should be submitted by email before the cutoff date to SEAL98 at cs.adfa.oz.au. FOUR (4) hard copies of the completed paper should be submitted to the programme committee chair by the due date. All manuscripts should be prepared in LaTeX according to Springer-Verlag's llncs style (URL: gopher://trick.ntp.springer.de/11/tex/latex/llncs). Each submitted paper must include a title, a 300-400 word abstract, a list of keywords, the names and addresses of all authors (including email addresses, and telephone and fax numbers), and the body. 
The length of submitted papers must be no more than 8 single-spaced, single-column pages, including all figures, tables, and bibliography. Shorter papers are strongly encouraged. Papers should be submitted to the following address: Dr Xin Yao, School of Computer Science, University College, UNSW, Australian Defence Force Academy, Canberra, ACT 2600, Australia.

Publications
------------
All accepted papers that are presented at the conference will be included in the conference proceedings. Outstanding papers will be considered for inclusion in a proposed volume of Springer-Verlag's Lecture Notes in Artificial Intelligence (LNAI). The authors of some of these papers will be invited to further expand and revise their papers for inclusion in an international journal. (Selections from SEAL'96 papers were published by Springer-Verlag as Volume 1285 of LNAI.)

Special Sessions and Tutorials
------------------------------
Special sessions and tutorials will be organised at the conference. The conference is calling for special session and tutorial proposals.
Contact Persons --------------- Conference General Chair: Professor Charles Newton School of Computer Science University College, UNSW, ADFA Canberra, ACT 2600, Australia Organising Committee Chair: Dr Bob McKay School of Computer Science University College, UNSW, ADFA Canberra, ACT 2600, Australia Programme Committee Co-Chair: Dr Xin Yao School of Computer Science University College, UNSW, ADFA Canberra, ACT 2600, Australia Programme Committee Co-Chair: Professor Jong-Hwan Kim Department of Electrical Engineering KAIST 373-1, Kusung-dong, Yusung-gu, Taejon-shi 305-701, Republic of Korea Email: johkim at vivaldi.kaist.ac.kr Programme Committee Co-Chair: Professor Takeshi Furuhashi Department of Information Electronics Nagoya University Furo-cho, Chikusa-ku, Nagoya 464-01 Japan Email: furuhashi at nuee.nagoya-u.ac.jp Special Sessions Chair: Professor Kit Po Wong Department of Electrical Engineering The University of Western Australia Nedlands, WA 6009, Australia Email: kitpo at ee.uwa.edu.au Conference Secretary: Miss Alison McMaster School of Computer Science University College, UNSW, ADFA Canberra, ACT 2600, Australia Email: SEAL98 at cs.adfa.oz.au Phone: +61 2 6268 8184, Fax: +61 2 6268 8581 From barba at cvs.rochester.edu Mon Mar 16 12:40:13 1998 From: barba at cvs.rochester.edu (Barbara Arnold) Date: Mon, 16 Mar 1998 12:40:13 -0500 Subject: 21st CVS Symposium Message-ID: 21st CVS Symposium "Environmental Structure, Statistical Learning & Visual Perception" June 4 - 6, 1998 CENTER FOR VISUAL SCIENCE University of Rochester Rochester, NY The Center for Visual Science at the University of Rochester is proud to present the 21st Symposium, "Environmental Structure, Statistical Learning and Visual Perception". The three-day symposium will consist of five sessions plus an open house and lab tours on Saturday afternoon. The meeting will begin with a Reception/Buffet on Wednesday evening, June 3. 
Formal sessions start Thursday morning, June 4, and end at noon on Saturday. There will be optional banquets held on Thursday and Friday evenings, and a cookout lunch on Saturday. Informal discussion gatherings will follow the banquets.

PROGRAM

Wednesday, June 3
4:00-10:00 PM Registration
6:00-8:00 PM Reception/Buffet

Thursday, June 4
SESSION I: Image Statistics
E Simoncelli, New York University
C Chubb, University of CA Irvine
D Ruderman, The Salk Institute
SESSION II: Color Constancy
D Brainard, Univ of CA Santa Barbara
S Shevell, University of Chicago
A Hurlbert, Univ of Newcastle, England

Friday, June 5
SESSION III: Surface Perception
T Adelson, MIT
L Maloney, New York University
Zili Liu, NEC Research Institute
SESSION IV: Object Perception
D Knill, University of Pennsylvania
K Nakayama, Harvard University
P Kellman, University of CA Los Angeles

Saturday, June 6
SESSION V: Neural Coding and Plasticity
W Geisler, University of Texas Austin
N Logothetis, Max-Planck Institute
SESSION VI: OPEN HOUSE
Center for Visual Science Open House and Lab Tours

REGISTRATION FEES
Preregistration, Regular $125.00
Preregistration, Student $ 95.00
On-site, Regular $180.00
On-site, Student $130.00

To preregister, please return the form posted on our website http://www.cvs.rochester.edu/symposium/propsymposia98.html Please send a separate form for each person registering. No preregistrations will be accepted after May 15. If you do not have access to our website, please contact Barbara Arnold at barba at cvs.rochester.edu or 716-275-8659

ACCOMMODATIONS AND MEALS
The University has moderate-cost rooms available for symposium attendees. Residence halls are centrally located on the campus and are a short walk to Hoyt Hall, where the symposium sessions will be held. A special package of residence hall room and all meals and banquets is being offered to Symposium participants. This package includes all meals from Thursday breakfast through the Saturday barbecue.
TRAVEL AWARDS
A small number of travel awards are available to graduate and postdoctoral students. Applications for travel assistance must be received by May 1, 1998. Please refer to the travel award application form, posted on our website, for more information. http://www.cvs.rochester.edu/symposium/propsymposia98.html If you do not have access to our website, please contact Barbara Arnold at barba at cvs.rochester.edu or 716-275-8659

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Barbara Arnold, Center for Visual Science
Room 274 Meliora Hall, University of Rochester, Rochester NY 14627-0270
email: barba at cvs.rochester.edu
phone: 716 275 8659
fax: 716 271 3043
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

From aperez at lslsun.epfl.ch Tue Mar 17 08:15:59 1998 From: aperez at lslsun.epfl.ch (Andres Perez-Uribe) Date: Tue, 17 Mar 1998 14:15:59 +0100 Subject: Java Applet: Black Jack and Reinforcement Learning Message-ID: <350E778E.136C53F9@lslsun.epfl.ch>

Dear Connectionists, This is to announce a Java applet that implements a simplified version of the game of Black Jack. One or two players can play against the dealer (i.e., the casino), and either or both players can be controlled by the computer. By default, the computer plays in a random manner. However, you may let it play against the dealer and learn to play Black Jack from experience. The learning algorithm it uses is SARSA, a reinforcement learning algorithm introduced by G. Rummery and M. Niranjan. URL: http://lslwww.epfl.ch/~aperez/BlackJack/ For further information on reinforcement learning and Black Jack playing, you may refer to the www page "Learning to Play Black Jack with Artificial Neural Networks": URL: http://lslwww.epfl.ch/~aperez/rlbj.html also at the Logic Systems Laboratory, Swiss Federal Institute of Technology-Lausanne.
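For readers unfamiliar with SARSA, its on-policy update can be sketched on a toy episodic task. This is only an illustration: the three-state corridor below, with its "move"/"quit" actions, constants, and function names, is an invented stand-in for the Black Jack environment of the applet.

```python
import random

# Toy episodic task: a corridor of states 0, 1, 2.  "move" advances one
# step, and reaching state 3 pays reward +1 and ends the episode;
# "quit" ends the episode immediately with reward 0.
ACTIONS = ["move", "quit"]

def step(state, action):
    """Return (next_state, reward, done) for the toy environment."""
    if action == "quit":
        return state, 0.0, True
    nxt = state + 1
    return nxt, (1.0 if nxt == 3 else 0.0), nxt == 3

def eps_greedy(Q, state, eps):
    if random.random() < eps:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def sarsa(episodes=2000, alpha=0.3, gamma=0.9, eps=0.1):
    Q = {(s, a): 0.0 for s in range(3) for a in ACTIONS}
    for _ in range(episodes):
        s = 0
        a = eps_greedy(Q, s, eps)
        done = False
        while not done:
            s2, r, done = step(s, a)
            if done:
                target = r                   # terminal: no successor action
            else:
                a2 = eps_greedy(Q, s2, eps)  # on-policy: the action actually taken next
                target = r + gamma * Q[(s2, a2)]
            Q[(s, a)] += alpha * (target - Q[(s, a)])
            if not done:
                s, a = s2, a2
    return Q
```

Because the update bootstraps from the action actually selected next (here eps-greedy), SARSA learns the value of the policy it follows: the terminal reward is backed up the corridor, discounted by gamma at each step.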
Best regards,
--
Andres PEREZ-URIBE
Logic Systems Laboratory, Computer Science Department
Swiss Federal Institute of Technology-Lausanne
1015 Lausanne, Switzerland
Email: aperez at lslsun.epfl.ch
http://lslwww.epfl.ch/~aperez
Tel: +41-21-693-2652 Fax: +41-21-693-3705

From trevor.clarkson at kcl.ac.uk Wed Mar 18 09:24:27 1998 From: trevor.clarkson at kcl.ac.uk (Trevor Clarkson) Date: Wed, 18 Mar 1998 14:24:27 +0000 Subject: 4-month RA post at King's College Message-ID: <1.5.4.16.19980318142427.297f250c@mail.kcl.ac.uk>

RESEARCH ASSISTANT (NEURAL NETWORKS), June - September 1998. A post is available for 4 months, starting in June 1998, for a Research Assistant at point 9 on the scale at King's College London. The post is intended for a post-doctoral researcher. A feasibility study will be carried out with a customer in Cambridge to develop a neural network system that provides accurate ink-drop placement in an ink-jet printer. Some travel to Cambridge will be required, and this will be fully funded. Experience in neural networks, real-time systems or neural hardware is required. The ability to work to a tight schedule is essential. Owing to the imminent start date, applicants should preferably be EU citizens. For further details please contact Professor Trevor Clarkson at King's College London (tgc at kcl.ac.uk). (Please pass on this message to potential candidates. Thank you.)
From harnad at coglit.soton.ac.uk Wed Mar 18 11:09:25 1998 From: harnad at coglit.soton.ac.uk (Stevan Harnad) Date: Wed, 18 Mar 1998 16:09:25 +0000 (GMT) Subject: Words in the Brain's Language: BBS Call for Commentators Message-ID: Below is the abstract of a forthcoming BBS target article on: WORDS IN THE BRAIN'S LANGUAGE by Friedemann Pulvermueller This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please send EMAIL to: bbs at cogsci.soton.ac.uk or write to: Behavioral and Brain Sciences Department of Psychology University of Southampton Highfield, Southampton SO17 1BJ UNITED KINGDOM http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/ ftp://ftp.princeton.edu/pub/harnad/BBS/ ftp://ftp.cogsci.soton.ac.uk/pub/bbs/ gopher://gopher.princeton.edu:70/11/.libraries/.pujournals If you are not a BBS Associate, please send your CV and the name of a BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work. All past BBS authors, referees and commentators are eligible to become BBS Associates. To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. An electronic draft of the full text is available for inspection with a WWW browser, anonymous ftp or gopher according to the instructions that follow after the abstract. 
___________________________________________________________________ WORDS IN THE BRAIN'S LANGUAGE Friedemann Pulvermueller Fachgruppe Psychologie Universitaet Konstanz 78434 Konstanz Germany pumue at uni-tuebingen.de KEYWORDS: associative learning, cell assembly, cognition, cortex, language, word category ABSTRACT: If the cortex is an associative memory, strongly connected cell assemblies will form when neurons in different cortical areas are frequently active at the same time. The cortical distributions of these assemblies must be a consequence of where in the cortex correlated neuronal activity occurred during learning. An assembly can be considered a functional unit exhibiting activity states such as full activation (ignition) after appropriate sensory stimulation (possibly related to perception) and continuous reverberation of excitation within the assembly (a putative memory process). This has implications for cortical topographies and activity dynamics of cell assemblies representing words. Cortical topographies of assemblies should be related to aspects of the meaning of the words they represent, and physiological signs of cell assembly ignition should be followed by possible indicators of reverberation. The following postulates are discussed in detail: (1) assemblies representing phonological word forms are strongly lateralized and distributed over perisylvian cortices; (2) assemblies representing highly abstract words, such as grammatical function words, are also strongly lateralized and restricted to these perisylvian regions; (3) assemblies representing concrete content words include additional neurons in both hemispheres; (4) assemblies representing words referring to visual stimuli include neurons in visual cortices; (5) assemblies representing words referring to actions include neurons in motor cortices. 
Two main sources of evidence are used for evaluating these proposals: (a) imaging studies aiming at localizing word processing in the brain, based on stimulus-triggered event-related potentials (ERP), positron emission tomography (PET) and functional magnetic resonance imaging (fMRI), and (b) studies of the temporal dynamics of fast activity changes in the brain, as revealed by high-frequency responses recorded in the electroencephalogram (EEG) and magnetoencephalogram (MEG). These data provide evidence for processing differences between words and matched meaningless pseudowords, and between word classes such as concrete content and abstract function words, and words evoking visual or motor associations. There is evidence for early word class-specific spreading of neuronal activity and for equally specific high-frequency responses occurring later. These results support a neurobiological model of language in the Hebbian tradition. Competing large-scale neuronal theories of language are discussed in the light of the summarized data. A final paragraph addresses neurobiological perspectives on the problem of serial order of words in syntactic strings. -------------------------------------------------------------- To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the World Wide Web or by anonymous ftp or gopher from the US or UK BBS Archive. Ftp instructions follow below. Please do not prepare a commentary on this draft. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. 
The URLs you can use to get to the BBS Archive:
http://www.princeton.edu/~harnad/bbs/
http://www.cogsci.soton.ac.uk/bbs/Archive/bbs.pulvermueller.html
ftp://ftp.princeton.edu/pub/harnad/BBS/bbs.pulvermueller
ftp://ftp.cogsci.soton.ac.uk/pub/bbs/Archive/bbs.pulvermueller
gopher://gopher.princeton.edu:70/11/.libraries/.pujournals

To retrieve a file by ftp from an Internet site, type either:
ftp ftp.princeton.edu
or
ftp 128.112.128.1
When you are asked for your login, type: anonymous
Enter password as queried (your password is your actual userid: yourlogin at yourhost.whatever.whatever - be sure to include the "@")
cd /pub/harnad/BBS
To show the available files, type: ls
Next, retrieve the file you want with (for example): get bbs.pulvermueller
When you have the file(s) you want, type: quit

From ssingh at soc.plym.ac.uk Thu Mar 19 14:13:18 1998 From: ssingh at soc.plym.ac.uk (Sameer Singh) Date: Thu, 19 Mar 1998 19:13:18 +0000 Subject: PhD studentship Message-ID: <199803191909.TAA20911@hebe.soc.plym.ac.uk>

UNIVERSITY OF PLYMOUTH, UK
SCHOOL OF COMPUTING
PhD Studentship in Financial Forecasting using Neural Networks

Applications are invited for a PhD studentship in the area of financial forecasting using neural networks. The studentship covers only UK/EEC fees and maintenance. The applicant should have a good background in programming with C/C++ and ideally should have experience with neural networks. Postgraduate students with an MSc degree are particularly encouraged to apply. Applications can be made by e-mailing your CV to Dr. Sameer Singh at the address given below. For further information, Dr. Singh may be emailed at s1singh at plym.ac.uk.
Deadline for applications: 1 April, 1998 ___________________________________________________ School of Computing University of Plymouth Kirkby Place Plymouth PL4 8AA UK Tel: +44-1752-232612 Fax: +44-1752-232540 e-mail: s1singh at plym.ac.uk/ ssingh at soc.plym.ac.uk web: http://www.soc.plym.ac.uk/soc/sameer __________________________________________________ From rich at cs.umass.edu Thu Mar 19 14:35:12 1998 From: rich at cs.umass.edu (Rich Sutton) Date: Thu, 19 Mar 1998 14:35:12 -0500 Subject: New Textbook on Reinforcement Learning Message-ID: Dear Colleagues, This note is to announce the availability of a new textbook on reinforcement learning by Andy Barto and me. As many of you know, we have been working on this book for over four years. A few weeks ago we received our authors' copies, and the book is now available by internet/mail order and in bookstores: Sutton, R.S., Barto, A.G. (1998) Reinforcement Learning: An Introduction. MIT Press, Cambridge, MA. The rest of this note says a little more about the book and points to further information. As its title indicates, the book is meant to be an introductory treatment of reinforcement learning, emphasizing foundations and ideas rather than the latest developments and mathematical proofs. We divide the ideas underlying the field into a half dozen primary dimensions, consider each in detail, and then combine them to form a much larger space of possible methods including all the most popular ones from Q-learning to value iteration and heuristic search. In this way we have tried to make the book interesting to both newcomers and experts alike. We have tried to make the work accessible to the broadest possible audiences in artificial intelligence, control engineering, operations research, psychology, and neuroscience. If you are a teacher, we urge you to consider creating or altering a course to use the book. 
We have found that the book works very well as the basis for an independent course on reinforcement learning at the graduate or advanced undergraduate level. The eleven chapters can be covered one per week. Exercises are provided in each chapter to help the students think on their own about the material. Answers to the exercises are available to instructors, for now from me, and probably later from MIT Press in an instructor's manual. Programming projects are also suggested throughout the book. Of course, the book can also be used to help teach reinforcement learning as it is most commonly done now, that is, as part of a broader course on machine learning, artificial intelligence, neural networks, or advanced control. I have taught all the material in the book in as little as four weeks, and of course subsets can be covered in less time. Finally, if you are interested in reviewing the book for a major journal or magazine, please contact our MIT Press publicist, Gita Manaktala (manak at mit.edu or 617-253-5643), directly. Further information about the book, including ordering information and detailed information about its contents, can be obtained from its home page at http://www.cs.umass.edu/~rich/book/the-book.html. Rich Sutton rich at cs.umass.edu From nic at idsia.ch Fri Mar 20 06:15:22 1998 From: nic at idsia.ch (Nici Schraudolph) Date: Fri, 20 Mar 1998 12:15:22 +0100 Subject: TR available: fast exponentiation Message-ID: <199803201115.MAA01103@idsia.ch> Dear colleagues, the following technical note (5 pages) is available by anonymous ftp; it may be of interest to those who write their own C/C++ neural network code. Best regards, -- Dr. Nicol N. Schraudolph Tel: +41-91-970-3877 IDSIA Fax: +41-91-911-9839 Corso Elvezia 36 CH-6900 Lugano http://www.idsia.ch/~nic/ Switzerland http://www.cnl.salk.edu/~schraudo/ Technical Report IDSIA-07-98: A Fast, Compact Approximation of the Exponential Function --------------------------------------------------------- Nicol N. 
Schraudolph

Neural network simulations often spend a large proportion of their time computing exponential functions. Since the exponentiation routines of typical math libraries are rather slow, their replacement with a fast approximation can greatly reduce the overall computation time. This note describes how exponentiation can be approximated by manipulating the components of a standard (IEEE-754) floating-point representation. This models the exponential function as well as a lookup table with linear interpolation, but is significantly faster and more compact. ftp://ftp.idsia.ch/pub/nic/exp.ps.gz

From Jakub.Zavrel at kub.nl Fri Mar 20 07:31:21 1998 From: Jakub.Zavrel at kub.nl (Jakub.Zavrel@kub.nl) Date: Fri, 20 Mar 1998 13:31:21 +0100 (MET) Subject: Software release: TiMBL 1.0 Message-ID: <199803201231.NAA10874@kubsuw.kub.nl>

----------------------------------------------------------------------
Software release: TiMBL 1.0
Tilburg Memory Based Learner
ILK Research Group, http://ilk.kub.nl/
----------------------------------------------------------------------
The ILK (Induction of Linguistic Knowledge) Research Group at Tilburg University, The Netherlands, announces the release of TiMBL, Tilburg Memory Based Learner (version 1.0). TiMBL is a machine learning program implementing a family of Memory-Based Learning techniques for discrete data. TiMBL stores a representation of the training set explicitly in memory (hence `Memory Based'), and classifies new cases by extrapolating from the most similar stored cases. TiMBL features the following (optional) metrics and speed-up optimizations that enhance the underlying k-nearest neighbour classifier engine:
- Information Gain weighting for dealing with features of differing importance (the IB1-IG learning algorithm).
- Stanfill & Waltz's / Cost & Salzberg's (Modified) Value Difference metric for making graded guesses of the match between two different symbolic values.
- Conversion of the flat instance memory into a decision tree, and inverted indexing of the instance memory, both yielding faster classification.
- Further compression and pruning of the decision tree, guided by feature information gain differences, for an even larger speed-up (the IGTREE learning algorithm).
TiMBL accepts command-line arguments by which these metrics and optimizations can be selected and combined. TiMBL can read the C4.5 and WEKA's ARFF data file formats as well as column files and compact (fixed-width, delimiter-less) data.

-[download]-----------------------------------------------------------
You are invited to download the TiMBL package for educational or non-commercial research purposes. When downloading the package you are asked to register and express your agreement with the license terms. TiMBL is *not* shareware or public domain software. The TiMBL software package can be downloaded from http://ilk.kub.nl/software.html or by following the `Software' link under the ILK home page at http://ilk.kub.nl/ . The TiMBL package contains the following:
- Source code (C++) with a Makefile.
- A reference guide containing descriptions of the incorporated algorithms, detailed descriptions of the command-line options, and a brief hands-on tutorial.
- Some example datasets.
- The text of the licence agreement.
- A postscript version of the paper that describes IGTREE.
The package should be easy to install on most UNIX systems.

-[background]---------------------------------------------------------
Memory-based learning (MBL) has proven to be quite successful in a large number of tasks in Natural Language Processing (NLP) -- MBL of NLP tasks (text-to-speech, part-of-speech tagging, chunking, light parsing) is the main theme of research of the ILK group.
We eventually decided to build a well-coded, generic tool combining the group's algorithms, favorite optimization tricks, and interface desiderata; the result is now version 1.0 of TiMBL. We think TiMBL can be a useful tool for NLP research and, for that matter, for any other domain with discrete classification tasks. For information on the ILK Research Group, visit our site at http://ilk.kub.nl/ On this site you can find links to (postscript versions of) publications relating to the algorithms incorporated in TiMBL and to their application to NLP tasks. The reference guide ("TiMBL: Tilburg Memory-Based Learner, version 1.0, Reference Guide.", Walter Daelemans, Jakub Zavrel, Ko van der Sloot, and Antal van den Bosch. ILK Technical Report 98-03) can be downloaded separately and directly from http://ilk.kub.nl/~ilk/papers/ilk9803.ps.gz For comments and bug reports relating to TiMBL, please send mail to Timbl at kub.nl
----------------------------------------------------------------------

From mkearns at research.att.com Sat Mar 21 14:36:14 1998 From: mkearns at research.att.com (Michael J. Kearns) Date: Sat, 21 Mar 1998 14:36:14 -0500 (EST) Subject: NIPS*98 Call for Papers Message-ID: <199803211936.OAA22406@radish.research.att.com>

CALL FOR PAPERS -- NIPS*98
Neural Information Processing Systems -- Natural and Synthetic
Monday November 30 - Saturday December 5, 1998
Denver, Colorado

This is the twelfth meeting of an interdisciplinary conference which brings together cognitive scientists, computer scientists, engineers, neuroscientists, physicists, and mathematicians interested in all aspects of neural processing and computation. The conference will include invited talks and oral and poster presentations of refereed papers. The conference is single track and is highly selective. Preceding the main session, there will be one day of tutorial presentations (Nov.
30), and following it there will be two days of focused workshops on topical issues at a nearby ski area (Dec. 4-5). Major categories for paper submission, with example subcategories (by no means exhaustive), are as follows: Algorithms and Architectures: supervised and unsupervised learning algorithms, model selection algorithms, active learning algorithms, feedforward and recurrent network architectures, localized basis functions, mixture models, belief networks, graphical models, Gaussian processes, factor analysis, topographic maps, combinatorial optimization. Applications: handwriting recognition, DNA and protein sequence analysis, expert systems, fault diagnosis, medical diagnosis, analysis of medical images, data analysis, database mining, network traffic, music processing, time-series prediction, financial analysis. Artificial Intelligence: inductive reasoning, problem solving and planning, natural language, hybrid symbolic-subsymbolic systems. Cognitive Science: perception and psychophysics, neuropsychology, cognitive neuroscience, development, conditioning, human learning and memory, attention, language. Implementation: analog and digital VLSI, optical neurocomputing systems, novel neurodevices, simulation tools. Neuroscience: neural encoding, spiking neurons, synchronicity, sensory processing, systems neurophysiology, neuronal development, synaptic plasticity, neuromodulation, dendritic computation, channel dynamics. Reinforcement Learning and Control: exploration, planning, navigation, Q-learning, TD-learning, dynamic programming, robotic motor control, process control, Markov decision processes. Speech and Signal Processing: speech recognition, speech coding, speech synthesis, auditory scene analysis, source separation, hidden Markov models, models of human speech perception. 
Theory: computational learning theory, statistical physics of learning, information theory, prediction and generalization, regularization, Boltzmann machines, Helmholtz machines, decision trees, support vector machines, online learning, dynamics of learning algorithms, approximation and estimation theory, learning of dynamical systems, model selection, complexity theory. Visual Processing: image processing, image coding, object recognition, visual psychophysics, stereopsis, motion detection and tracking. REVIEW CRITERIA: All submitted papers will be thoroughly refereed on the basis of technical quality, significance, and clarity. Novelty of the work is also a strong consideration in paper selection, but to encourage interdisciplinary contributions, we will consider work which has been submitted or presented in part elsewhere, if it is unlikely to have been seen by the NIPS audience. Authors should not be dissuaded from submitting recent work, as there will be an opportunity after the meeting to revise accepted manuscripts before submitting final camera-ready copy. PAPER FORMAT: Submitted papers may be up to seven pages in length, including figures and references, using a font no smaller than 10 point. Text is to be confined within an 8.25in by 5in rectangle. Submissions failing to follow these guidelines will not be considered. Authors are strongly encouraged to use the NIPS LaTeX style files obtainable by anonymous FTP at the site given below. Papers must indicate (1) physical and e-mail addresses of all authors; (2) one of the ten major categories listed above, and a subcategory if desired; (3) if the work, or any substantial part thereof, has been submitted to or has appeared in other scientific conferences; (4) the authors' preference, if any, for oral or poster presentation (this preference will play no role in paper acceptance); and (5) author to whom correspondence should be addressed.
SUBMISSION INSTRUCTIONS: Send eight copies of submitted papers to the address below; electronic or FAX submission is not acceptable. Include one additional copy of the abstract only, to be used for preparation of the abstracts booklet distributed at the meeting. SUBMISSIONS MUST BE RECEIVED BY MAY 22, 1998. From within the U.S., submissions will be accepted if mailed first class and postmarked by May 19, 1998. Mail submissions to: Sara A. Solla NIPS*98 Program Chair Department of Physiology Ward Building 5-003, MC211 Northwestern University Medical School 303 E. Chicago Avenue Chicago, IL 60611-3008, USA Mail general inquiries and requests for registration material to: NIPS Foundation Computational Neurobiology Laboratory Salk Institute for Biological Studies 10010 North Torrey Pines Road La Jolla, CA 92037 FAX: (619)587-0417 E-mail: nipsinfo at salk.edu Copies of the LaTeX style files for NIPS are available via anonymous ftp at ftp.cs.cmu.edu (128.2.206.173) in /afs/cs/Web/Groups/NIPS/formatting The style files and other conference information may also be retrieved via World Wide Web at http://www.cs.cmu.edu/Web/Groups/NIPS NIPS*98 Organizing Committee: General Chair, Michael Kearns, AT&T Labs Research; Program Chair, Sara Solla, Northwestern University; Publications Chair, David Cohn, Harlequin; Tutorial Chair, Klaus Mueller, GMD First; Workshops Co-Chairs, Richard Zemel, University of Arizona, and Sue Becker, McMaster University; Publicity Chair, Jonathan Baxter, Australian National University; Treasurer, Bartlett Mel, University of Southern California; Web Master, Doug Baker, Carnegie Mellon University; Government Liaison, Gary Blasdel, Harvard Medical School; Contracts, Steve Hanson, Rutgers University, Scott Kirkpatrick, IBM, Gerry Tesauro, IBM. 
NIPS*98 Program Committee: Andrew Barto, University of Massachusetts; Joachim Buhmann, University of Bonn; Yoav Freund, AT&T Labs Research; Lars Kai Hansen, Danish Technical University; Nathan Intrator, Brown University; Robert Jacobs, University of Rochester; Esther Levin, AT&T Labs Research; Alexandre Pouget, Georgetown University; David Saad, Aston University; Lawrence Saul, AT&T Labs Research; Sara Solla, Northwestern University (chair); Sebastian Thrun, Carnegie Mellon University; Yair Weiss, MIT. DEADLINE FOR RECEIPT OF SUBMISSIONS IS MAY 22, 1998 - please post - From mkearns at research.att.com Sat Mar 21 14:36:41 1998 From: mkearns at research.att.com (Michael J. Kearns) Date: Sat, 21 Mar 1998 14:36:41 -0500 (EST) Subject: NIPS*98 Call for Workshop Proposals Message-ID: <199803211936.OAA22409@radish.research.att.com> CALL FOR PROPOSALS NIPS*98 Post Conference Workshops December 4 and 5, 1998 Breckenridge, Colorado Following the regular program of the Neural Information Processing Systems 1998 conference, workshops on current topics in neural information processing will be held on December 4 and 5, 1998, in Breckenridge, Colorado. Proposals by qualified individuals interested in chairing one of these workshops are solicited. Past topics have included: Active Learning, Architectural Issues, Attention, Audition, Bayesian Analysis, Bayesian Networks, Benchmarking, Brain Imaging, Computational Complexity, Computational Molecular Biology, Control, Genetic Algorithms, Graphical Models, Hippocampus and Memory, Hybrid HMM/ANN Systems, Implementations, Music, Neural Plasticity, Language Processing, Lexical Acquisition, Network Dynamics, On-Line Learning, Optimization, Recurrent Nets, Robot Learning, Rule Extraction, Self-Organization, Sensory Biophysics, Signal Processing, Support Vectors, Speech, Time Series, Topological Maps, and Vision Models and Applications. 
The goal of the workshops is to provide an informal forum for researchers to discuss important issues of current interest. There will be two workshop sessions a day, for a total of six hours, with free time in between for ongoing individual exchange or outdoor activities. Concrete open and/or controversial issues are encouraged and preferred as workshop topics. Representation of alternative viewpoints and panel-style discussions are particularly encouraged. Workshop organizers will have responsibilities including: 1) coordinating workshop participation and content, which involves arranging short informal presentations by experts working in an area, arranging for expert commentators to sit on a discussion panel and formulating a set of discussion topics, etc. 2) moderating or leading the discussion and reporting its high points, findings, and conclusions to the group during evening plenary sessions 3) writing a brief summary and/or coordinating submitted material for post-conference electronic dissemination. Submission Instructions ----------------------- Interested parties should submit via e-mail a short proposal for a workshop of interest by May 29, 1998. Proposals should include a title, a description of what the workshop is to address and accomplish, the proposed length of the workshop (one day or two days), the planned format (mini-conference, panel discussion, or group discussion, combinations of the above, etc), and the proposed number of speakers. Where possible, please also indicate potential invitees (particularly for panel discussions). Please note that this year we are looking for fewer "mini-conference" workshops and greater variety of workshop formats. The time allotted to workshops is six hours each day, in two sessions of three hours each. We strongly encourage that the organizers reserve a significant portion of time for open discussion. 
The proposal should motivate why the topic is of interest or controversial, why it should be discussed, and who the targeted group of participants is. In addition, please send a brief resume of the prospective workshop chair, a list of publications, and evidence of scholarship in the field of interest. Submissions should include contact name, address, e-mail address, phone number and fax number if available. Proposals should be mailed electronically to zemel at u.arizona.edu. All proposals must be RECEIVED by May 29, 1998. If e-mail is unavailable, mail so as to arrive by the deadline to: NIPS*98 Workshops c/o Richard Zemel Department of Psychology University of Arizona Tucson, AZ 85721 Questions may be addressed to either of the Workshop Co-Chairs: Richard Zemel Sue Becker University of Arizona McMaster University zemel at u.arizona.edu becker at mcmaster.ca PROPOSALS MUST BE RECEIVED BY MAY 29, 1998 -Please Post- From aweigend at stern.nyu.edu Sat Mar 21 16:54:31 1998 From: aweigend at stern.nyu.edu (Andreas Weigend) Date: Sat, 21 Mar 1998 16:54:31 -0500 (EST) Subject: New Book: Decision Technologies for Financial Engineering Message-ID: <199803212154.QAA02093@sabai.stern.nyu.edu> A non-text attachment was scrubbed... Name: not available Type: text Size: 3010 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/d9e75983/attachment-0001.ksh From georgiou at wiley.csusb.edu Mon Mar 23 03:45:31 1998 From: georgiou at wiley.csusb.edu (georgiou@wiley.csusb.edu) Date: Mon, 23 Mar 1998 00:45:31 -0800 (PST) Subject: ICCIN'98: Call for Papers Message-ID: <199803230845.AAA26948@wiley.csusb.edu> Call for Papers 3rd International Conference on COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE http://www.csci.csusb.edu/iccin Sheraton Imperial Hotel & Convention Center Research Triangle Park, North Carolina October 23-28, 1998 Conference Co-chairs: Subhash C. Kak, Louisiana State University Jeffrey P.
Sutton, Harvard University This conference is part of the Fourth Joint Conference on Information Sciences. Plenary Speakers include the following: James Anderson Panos J. Antsaklis John Baillieul Walter Freeman David Fogel Stephen Grossberg Stuart Hameroff Yu Chi Ho Thomas S. Huang George J. Klir Teuvo Kohonen John Koza Richard G. Palmer Zdzislaw Pawlak Karl Pribram Azriel Rosenfeld Julius T. Tou I. Burhan Turksen Paul J. Werbos A.K.C. Wong Lotfi A. Zadeh Hans J. Zimmermann Areas for which papers are sought include: o Artificial Life o Artificially Intelligent NNs o Associative Memory o Cognitive Science o Computational Intelligence o Efficiency/Robustness Comparisons o Evolutionary Computation for Neural Networks o Feature Extraction & Pattern Recognition o Implementations (Electronic, Optical, Biochips) o Intelligent Control o Learning and Memory o Neural Network Architectures o Neurocognition o Neurodynamics o Optimization o Parallel Computer Applications o Theory of Evolutionary Computation Summary (4 pages) Submission Deadline: June 1, 1998 Decision & Notification: August 1, 1998 For more information please see Conference Web site: http://www.csci.csusb.edu/iccin From rmeir at dumbo.technion.ac.il Mon Mar 23 04:57:23 1998 From: rmeir at dumbo.technion.ac.il (Ron Meir) Date: Mon, 23 Mar 1998 12:57:23 +0300 (IDT) Subject: Postdoc or Ph.D. position at the Technion, Israel Message-ID: Postdoctoral or PhD Position Aerospace and Electrical Engineering Technion - Israel Institute of Technology The Technion has an immediate opening for a two-year post-doc position, to pursue research on non-linear fault detection and isolation, with application to robust and affordable flight control systems for small commercial aircraft. The project is part of a large-scale European effort and is funded by the Brite-Euram foundation.
The successful candidate is expected to have some working knowledge of at least one of the following fields: flight control, neural networks, non-linear filtering, and non-linear system identification. Experience in applying non-linear approaches, such as neural networks, extended Kalman filtering, etc., to real-world problems is a definite asset. Candidates are expected to have a Ph.D. in Aerospace or Electrical Engineering, Applied Mathematics, or Computer Science. Strong analytical skills and demonstrated ability to perform creative research, along with practical experience with Matlab, C, or C++, are essential. The position is also open to practicing engineers with similar backgrounds, who wish to pursue a two-year research program on the above topics. For candidates holding the Master's degree, the program may lead to a topic for a PhD thesis in Aerospace or Electrical Engineering at the Technion. Salaries and social benefits are commensurate with those of senior research associates or senior engineers at the Technion, depending on the candidates' background and credentials. The project will be conducted under the joint supervision of: Dr. Moshe Idan Dr. Ron Meir Faculty of Aerospace Eng. Faculty of Electrical Engineering Technion - IIT Technion - IIT Haifa, 32000, Israel Haifa, 32000, Israel Tel. : ++972-4-8293489 Tel. : ++972-4-8294658 Fax. : ++972-4-8231848 Fax. : ++972-4-8323041 E-mail : aeridan at aeron.technion.ac.il E-mail : rmeir at dumbo.technion.ac.il From jfeldman at ICSI.Berkeley.EDU Tue Mar 24 12:08:34 1998 From: jfeldman at ICSI.Berkeley.EDU (Jerry Feldman) Date: Tue, 24 Mar 1998 09:08:34 -0800 Subject: Backprop w/o Math, Not. Message-ID: <3517E892.52741CC7@icsi.berkeley.edu> Last fall, I posted a request for ideas on how to teach back propagation to undergrads, some from linguistics, etc., who had little math. There were quite a few clever suggestions, but my conclusion was that it couldn't be done.
There is no problem conveying the ideas of searching weight space, local minima, etc. But how could they understand the functional dependence w/o math? The students had already done some simple exercises with Tlearn and this seemed to help get them motivated. Following several suggestions and PDP v.3, I started with the integer-based perceptron "delta rule", and left that on the board. The next step was to do the "delta rule" for linear nets with no hidden units. But before doing that, I "reminded" them about partial derivatives, using the volume of a cylinder, V = pi*r*r*h. There was an overhead with pictures of how the two partials affected V: dV/dh = pi*r*r was a thin dotted disk; dV/dr = 2*pi*r*h was a thin dotted sleeve. The only other math needed is the chain rule, and it worked to motivate that directly from the error formula for the linear case. They saw that the error is expressed in terms of the output, but that one needs to know the effect of a weight change, etc. The fact that the result had the same form as the perceptron case was, of course, satisfying. They had already seen various activation functions and knew that the sigmoid had several advantages, but was obviously more complex. I derived the delta rule for a network with only one top node and using only one input pattern; this eliminates lots of subscripts. The derivation of the sigmoid derivative = f*(1-f) was given as a handout in gory detail, but I only went over it briefly. The idea was to get them all to believe that they could work it through, and maybe some of them did. At that point, I just hand-waved about the delta for hidden layers being the appropriate function of the deltas to which it contributed and gave the final result. We then talked about search, local minima, and the learning rate. Since they used momentum in Tlearn, there was another slide and story on that. My impression is that this approach works and that nothing simpler would suffice.
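For readers who want the one-top-node, one-pattern derivation in executable form, here is a minimal Python sketch (my own illustration, not Feldman's classroom material, and Tlearn itself is not used) of the sigmoid delta rule described above:

```python
import math

def sigmoid(net):
    return 1.0 / (1.0 + math.exp(-net))

def delta_rule_step(w, x, target, rate=0.5):
    """One update for a single sigmoid output unit on one pattern.
    With error E = (target - f)**2 / 2, the chain rule gives
    dE/dw_i = -(target - f) * f*(1 - f) * x_i,
    where f*(1 - f) is the sigmoid derivative from the handout."""
    f = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
    delta = (target - f) * f * (1.0 - f)
    return [wi + rate * delta * xi for wi, xi in zip(w, x)], f

w = [0.0, 0.0]
x = [1.0, 1.0]          # second input clamped to 1.0 acts as a bias
for _ in range(5000):
    w, out = delta_rule_step(w, x, target=0.9)
print(round(out, 2))    # the output has climbed to the 0.9 target
```

Note that the update has the same form as the perceptron rule, with the extra f*(1-f) factor; dropping that factor (setting it to 1) recovers the delta rule for the linear net with no hidden units.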
There were only about thirty students and questions were allowed; it would certainly be harder with a large lecture. This was all done in one lecture because of the nature of the course. With more time, I would have followed some other suggestions and had them work through a tiny example by hand in class. For this course, we next went to a discussion of Regier's system, which uses some backprop extensions to push learning into the structured part of the net. I was able to describe Regier's techniques quite easily based on their being familiar with the derivation of backprop. I would still be interested in feedback on the overall course design: www.icsi.berkeley.edu/~mbrodsky/cogsci110/ -- Jerry Feldman From cns-cas at cns.bu.edu Tue Mar 24 11:33:53 1998 From: cns-cas at cns.bu.edu (Boston University - Cognitive and Neural Systems) Date: Tue, 24 Mar 1998 11:33:53 -0500 Subject: call for registration Message-ID: <199803241633.LAA28911@maverick.bu.edu> CALL FOR REGISTRATION and MEETING SCHEDULE OF EVENTS SECOND INTERNATIONAL CONFERENCE ON COGNITIVE AND NEURAL SYSTEMS May 27-30, 1998 Boston University 677 Beacon Street Boston, Massachusetts http://cns-web.bu.edu/cns-meeting/ Sponsored by Boston University's Center for Adaptive Systems and Department of Cognitive and Neural Systems with financial support from DARPA and ONR How Does the Brain Control Behavior? How Can Technology Emulate Biological Intelligence? The conference will include 125 invited lectures and contributed lectures and posters by experts from 25 countries on the biology and technology of how the brain and other intelligent systems adapt to a changing world. The conference is aimed at researchers and students of computational neuroscience, connectionist cognitive science, artificial neural networks, neuromorphic engineering, and artificial intelligence. A single oral or poster session enables all presented work to be highly visible. 
Costs are kept at a minimum without compromising the quality of meeting handouts and social events. Although Memorial Day falls on Saturday, May 30, it is observed on Monday, May 25, 1998. Over 200 people attended last year's meeting, so early registration is recommended. To register, please fill out the registration form below. Student registrations must be accompanied by a letter of verification from a department chairperson or faculty/research advisor. If paying by check, mail to the address on the form. If paying by credit card, mail as above, or fax to (617) 353-7755, or email to cindy at cns.bu.edu. Registration fees will be returned on request only until April 30, 1998. ************************* REGISTRATION FORM Second International Conference on Cognitive and Neural Systems Department of Cognitive and Neural Systems Boston University 677 Beacon Street Boston, Massachusetts 02215 Tutorials: May 27, 1998 Meeting: May 28-30, 1998 FAX: (617) 353-7755 (Please Type or Print) Mr/Ms/Dr/Prof: _____________________________________________________ Name: ______________________________________________________________ Affiliation: _______________________________________________________ Address: ___________________________________________________________ City, State, Postal Code: __________________________________________ Phone and Fax: _____________________________________________________ Email: _____________________________________________________________ The conference registration fee includes the meeting program, reception, two coffee breaks each day, and meeting proceedings. The tutorial registration fee includes tutorial notes and two coffee breaks. 
CHECK ONE: ( ) $70 Conference plus Tutorial (Regular) ( ) $45 Conference plus Tutorial (Student) ( ) $45 Conference Only (Regular) ( ) $30 Conference Only (Student) ( ) $25 Tutorial Only (Regular) ( ) $15 Tutorial Only (Student) METHOD OF PAYMENT (please fax or mail): [ ] Enclosed is a check made payable to "Boston University". Checks must be made payable in US dollars and issued by a US correspondent bank. Each registrant is responsible for any and all bank charges. [ ] I wish to pay my fees by credit card (MasterCard, Visa, or Discover Card only). Name as it appears on the card: _____________________________________ Type of card: _______________________________________________________ Account number: _____________________________________________________ Expiration date: ____________________________________________________ Signature: __________________________________________________________ ************************* MEETING SCHEDULE Wednesday, May 27, 1998 (Tutorials): 7:45am---8:30am MEETING REGISTRATION 8:30am--10:00am Larry Abbott: "Short-term synaptic plasticity: Mathematical description and computational function" 10:00am--10:30am COFFEE BREAK 10:30am--12:00pm George Cybenko: "Understanding Q-learning and other adaptive learning methods" 12:00pm---1:30pm LUNCH 1:30pm---3:00pm Ennio Mingolla: "Neural models of biological vision" 3:00pm---3:30pm COFFEE BREAK 3:30pm---5:00pm Alex Pentland: "Visual recognition of people and their behavior" Thursday, May 28, 1998 (Invited Talks, Contributed Talks, and Posters): 7:15am---8:00am MEETING REGISTRATION 7:55am---8:00am Stephen Grossberg: "Welcome and Introduction" 8:00am---8:45am Azriel Rosenfeld: "Understanding object motion" 8:45am---9:30am Takeo Kanade: "Computational sensors: Further progress" 9:30am--10:15am Tomaso Poggio: "Sparse representations for learning" 10:15am--10:45am COFFEE BREAK AND POSTER SESSION I 10:45am--11:30am Gail Carpenter: "Applications of ART neural networks" 11:30am--12:15pm Rodney Brooks: 
"Experiments in developmental models for a neurally controlled humanoid robot" 12:15pm---1:00pm Lee Feldkamp: "Recurrent networks: Promise and practice" 1:00pm---2:15pm LUNCH 2:15pm---3:15pm PLENARY TALK: Stephen Grossberg: "Adaptive resonance theory: From biology to technology" 3:15pm---3:30pm T.P. Caudell, P. Soliz, S.C. Nemeth, and G.P. Matthews: "Adaptive resonance theory: Diagnostic environment for clinical ophthalmology" 3:30pm---3:45pm Nabeel Murshed, Adnan Amin, and Samir Singh: "Recognition of handwritten Chinese characters with the fuzzy ARTMAP neural network" 3:45pm---4:00pm Yukinori Suzuki and Junji Maeda: "ECG wave form recognition with ART 2" 4:00pm---4:15pm Thomas E. Sandidge and Cihan H. Dagli: "Toward optimal fuzzy associative systems using interactive self-organizing maps and multi-layer feed forward principles" 4:15pm---4:30pm Alan Stocker: "Application of neural computational principles to compute smooth optical flow" 4:30pm---4:45pm Sorin Draghici and Valeriu Beiu: "On issues related to VLSI implementations of neural networks" 4:45pm---5:15pm COFFEE BREAK 4:45pm---7:45pm POSTER SESSION I (see below for details) Friday, May 29, 1998 (Invited and Contributed Talks): 7:30am---8:00am MEETING REGISTRATION 8:00am---8:45am J. Anthony Movshon: "Contrast gain control in the visual cortex" 8:45am---9:30am Hugh Wilson: "Global processes at intermediate levels of form vision" 9:30am--10:15am Mel Goodale: "Biological teleassistance: Perception and action in the human visual system" 10:15am--10:45am COFFEE BREAK 10:45am--11:30am Ken Stevens: "The categorical representation of speech and its traces in acoustics and articulation" 11:30am--12:15pm Carol Fowler: "Production-perception links in speech" 12:15pm---1:00pm Frank Guenther: "A theoretical framework for speech acquisition and production" 1:00pm---2:15pm LUNCH 2:15pm---2:30pm S. Oddo, J. Beck, and E. Mingolla: "Texture segregation in chromatic element- arrangement patterns" 2:30pm---2:45pm Joseph S. 
Lappin and Warren D. Craft: "The spatial structure of visual input explains perception of local surface shape" 2:45pm---3:00pm Glenn Becker and Peter Bock: "The ALISA shape module: Adaptive shape classification using a radial feature token" 3:00pm---3:15pm Sachin Ahuja and Bart Farell: "Stereo vision in a layered world" 3:15pm---3:30pm A.W. Przybyszewski, W. Foote, and D.A. Pollen: "Contrast gain of primate LGN neurons is controlled by feedback from V1" 3:30pm---3:45pm Bertram R. Payne and Stephen G. Lomber: "It doesn't add up: Non-linear interactions in the visual cerebral network" 3:45pm---4:00pm Larry Cauller: "NeuroInteractivism: Dynamical sensory/motor solutions to exploration inverse problems based upon the functional architecture of cerebral cortex" 4:00pm---4:30pm COFFEE BREAK 4:30pm---4:45pm Rashmi Sinha, William Heindel, and Leslie Welch: "Evidence for retinotopy in category learning" 4:45pm---5:00pm Michele Fabre-Thorpe, Ghislaine Richard, and Arnaud Delorme: "Color is not necessary for rapid categorization of natural images" 5:00pm---5:15pm Timothy C. Pearce, Todd A. Dickenson, David R. Walt, and John S. Kauer: "Exploiting statistics of signals obtained from large numbers of chemically sensitive polymer beads to implement hyperacuity in an artificial olfactory system" 5:15pm---5:30pm A. Wichert and G. Palm: "Hierarchical categorization" 5:30pm---5:45pm Nabil H. 
Farhat: "Bifurcation networks: An approach to cortical modeling and higher-level brain function" 5:45pm---6:00pm Robert Hecht-Nielsen: "A theory of the cerebral cortex" 6:00pm---8:00pm MEETING RECEPTION Saturday, May 30 (Invited Talks, Contributed Talks, and Posters): 7:30am---8:00am MEETING REGISTRATION 8:00am---8:45am Howard Eichenbaum: "The hippocampus and mechanisms of declarative memory" 8:45am---9:30am Earl Miller: "Neural mechanisms for working memory and cognition" 9:30am--10:15am Bruce McNaughton: "Neuronal population dynamics and the interpretation of dreams" 10:15am--10:45am COFFEE BREAK AND POSTER SESSION II 10:45am--11:30am Richard Thompson: "The cerebellar circuitry essential for classical conditioning of discrete behavioral responses" 11:30am--12:15pm Daniel Bullock: "Cortical control of arm movements" 12:15pm---1:00pm Andrew Barto: "Reinforcement learning applied to large-scale dynamic optimization" 1:00pm---2:15pm LUNCH 2:15pm---3:15pm PLENARY TALK: Ken Nakayama: "Psychological studies of visual attention" 3:15pm---3:30pm Emery N. Brown, Loren M. Frank, Dengda Tang, Michael C. Quirk, and Matthew A. Wilson: "A statistical model of spatial information encoding in the rat hippocampus" 3:30pm---3:45pm Michael Herrmann, Klaus Pawelzik, and Theo Geisel: "Self-localization of a robot by simultaneous self-organization of place and direction selectivity" 3:45pm---4:00pm Stefan Schaal and Dagmar Sternad: "Segmentation of endpoint trajectories does not imply segmented control" 4:00pm---4:15pm Stefan Schaal and Dagmar Sternad: "Nonlinear dynamics as a coherent framework for discrete and rhythmic movement primitives" 4:15pm---4:30pm Andrew L. Kun and W. Thomas Miller: "Unified walking control for a biped robot using neural networks" 4:30pm---4:45pm J.L. Buessler and J.P. 
Urban: "Global training of modular neural architectures in robotics" 4:45pm---5:15pm COFFEE BREAK 4:45pm---7:45pm POSTER SESSION II (see below for details) POSTER SESSION I: Thursday, May 28, 1998 All posters will be displayed for the full day. Cognition, Learning, Recognition (B): #1 A. Tijsseling, M. Casey, and S. Harnad: "Categories as attractors" #2 Vinoth Jagaroo: "Allocentric spatial processing and some of their cortical neural nodes: A neuropsychological investigation" #3 M.J. Denham and S.L. McCabe: "A dynamic learning rule for synaptic modification" #4 M.J. Denham and S.L. McCabe: "A computational model of predictive learning in hippocampal CA3 principal cells of the rat during spatial activity" #5 Nina Emilia Poriet Ramirez and Andreu Catala Mallofre: "Qualitative approximation of neural cognitive maps" #6 Gary C.-W. Shyi: "Computing representations for bound and unbound 3D object matching" #7 Ghislaine Richard, Michele Fabre-Thorpe, and Arnaud Delorme: "On the similarity between fast visual categorization of natural images in monkeys and humans" #8 Robert Proulx and Sylvain Chartier: "Reproduction of the list-length and the list-strength effect in unsupervised neural networks" #9 Jean-Claude Dreher and Emmanuel Guigon: "A model of dopamine modulation on sustained activities in prefrontal cortex" #10 Oury Monchi and John G. Taylor: "Neural modeling of the anatomical areas involved in working memory tasks" Cognition, Learning, Recognition (T): #11 Christophe Lecerf: "The double loop learning model" #12 V. Petridis and Ath. Kehagias: "A general convergence result for data allocation in online unsupervised learning methods" #13 C.S. Liu and C.H. Tseng: "Hierarchical decomposition training algorithm for multilayer Perceptron networks" #14 Antonio Ballarin and Simona Gervasi: "Political surveys and scenario simulations" #15 Regis Quelavoine and Pascal Nocera: "Transients classification, learning with expert interaction" #16 John R. 
Alexander Jr.: "How technology CAN emulate biological intelligence: Begin at the beginning - a speculation" #17 Maria P. Alvarez: "A supervised learning algorithm for feedforward networks with inhibitory lateral connections" #18 Mark A. Rubin, Michael A. Cohen, Joanne S. Luciano, and Jacqueline A. Samson: "Can we predict the outcome of treatment for depression?" #19 Irak Vicarte Mayer and Haruhisa Takahashi: "Object matching by principal component analysis" #20 Jun Saiki: "A neural network model for computation of object-based spatial relations" #21 Harald Ruda and Magnus Snorrason: "An algorithm for the construction of a hierarchical classifier using single trial learning and self-organizing neural networks" #22 Gail A. Carpenter and William W. Streilein: "Fuzzy ARTMAP neural networks for data fusion and sonar classification" #23 William W. Streilein and Paolo Gaudiano: "Autonomous robotics: Object identification and classification through sonar" #24 Nabeel Murshed, Ana Cristina de Carvalho, Regiane Aires, and Sergio Ossamu Ioshii: "Detection of carcinoma with the fuzzy ARTMAP NN" #25 Gail A. Carpenter, Sucharita Gopal, Scott Macomber, Siegfried Martens, and Curtis E. Woodcock: "Mapping vegetation ground cover with fuzzy ARTMAP" #26 Siegfried Martens and Paolo Gaudiano: "Mobile robot sensor fusion with an ARTMAP neural network" #27 Tayeb Nedjari and Younes Bennani: "Symbolic-connectionist interaction" #28 Eduardo da Fonesca Melo and Edson Costa de Barros Carvalho Filho: "An autonomous multi feature selective attention neural network model" #29 Wonil Kim, Kishan Mehrotra, and Chilukuri K. Mohan: "Learning collages: An adaptive multi-module approximation network" #30 Christine Lisetti: "Connectionist modeling of emotional arousal along the autonomic nervous system" Neural System Models (B): #31 Roger A. 
Drake: "Redundant behavioral measures of activation: Leftward visual inattention" #32 Michael Lamport Commons: "Can neural nets be conscious and have a sense of free will?" #33 Fabio R. Melfi and Andre C.P. Carvalho: "Human performance in maze navigation problems" #34 K. Gurney, A. Prescott, and P. Redgrave: "A model of intrinsic processing in the basal ganglia" Neural System Models (T): #35 (presentation withdrawn by the authors) #36 Robert Alan Brown: "Machine bonding" #37 Kit S. Choy and Peter D. Scott: "Reinforcement learning enhanced by learning from exemplary behaviors" #38 Andras Peter: "A neural network for self-adaptive classification" POSTER SESSION II: Saturday, May 30, 1998 All posters will be displayed for the full day. Vision (B): #1 Magnus Snorrason, Harald Ruda, and James Hoffman: "Visual search patterns in photo-realistic imagery" #2 Drazen Domijan: "New mechanism for luminance channel in the network model of brightness perception" #3 Colin W.G. Clifford and Michael R. Ibbotson: "Adaptive encoding of visual motion: Modelling the response properties of directional neurons in the accessory optic system" #4 J.R. Williamson and S. Grossberg: "How cortical development leads to perceptual grouping" #5 Stephen G. Lomber and Bertram R. Payne: "Behavioral dissociations in visual cortex: A multi-dimensional view of the cerebrum" Audition, Speech, Language (B + T): #6 Robert A. Baxter and Thomas F. Quatieri: "AM-FM estimation by shunting neural networks" #7 Lars Koop and Holger U. Prante: "Classification of artificial and natural sounds with stationary and temporal self-organized feature maps" #8 S.L. McCabe: "Synaptic depression and temporal order identification" #9 Fatima T. Husain and Frank H. 
Guenther: "Experimental tests of neural models of the perceptual magnet effect" #10 Jerome Braun and Haim Levkowitz: "Perceptually guided training in recurrent neural networks based automatic language identification" #11 Shinji Karasawa, Ken-ichi Suzuki, and Jun-ichii Oomori: "The artificial intelligence organized by decoders for language processing" Spatial Mapping and Navigation (B): #12 Herve Frezza-Buet and Frederic Alexandre: "A model of cortical activation for robot navigation" #13 William Gnadt and Stephen Grossberg: "A self-organizing neural network model of spatial navigation and planning" Control and Robotics (T): #14 Jesse Reichler and Clay Holroyd: "An architecture for autonomous control and planning in uncertain domains" #15 Fernando J. Corbacho: "Biologically inspired design principles for autonomous agents" #16 Erol Sahin, Paolo Gaudiano, and Robert Wagner: "A comparison of visual looming and sonar as mobile robot range sensors" #17 J.J. Collins and Malachy Eaton: "Situated pursuit and evasion using temporal difference learning" #18 Kent Thompson and Wayne Lu: "The big eyeballs project" #19 Stevo Bozinovski, Georgi Stojanov, and Liljana Bozinovska: "Learning sparse environments using emotionally reinforced neural network" #20 Catalin V. Buhusi and Nestor A. Schmajuk: "Stimulus selection mechanisms: Implications for artificial life systems" VLSI (T): #21 Richard Izak and Thomas Zahn: "Modeling auditory pathway: A neuromorphic VLSI system with integrate and fire neurons and on-chip learning synapses" #22 Todd Hinck and Allyn E. Hubbard: "Combining featural and boundary information to create a surface representation: A hardware paradigm" #23 Christian Karl and Allyn E. Hubbard: "Pipelined asynchronous communication between large arrays" #24 Catherine Collin and Susanne Still: "Towards a neuronally-controlled walking machine" #25 Radu Dogaru and Leon O. 
Chua: "Emergent computation in cellular neural networks with FitzHugh-Nagumo cells: A novel approach based on the local activity theory" Hybrid Systems and Industrial Applications (T): #26 Geoffrey N. Hone and Richard Scaife: "The SME Machine: A non-learning network implemented in a commercial spreadsheet delivers Subject Matter Expert judgments" #27 A. Bernatzki, W. Eppler, and H. Gemmeke: "Neural network debugger (NNDB) using local principal component analysis (LPCA) for high-dimensional input data" #28 Peter Sincak, Norbert Kopco, Rudolf Jaksa, and Marek Bundzel: "Computational intelligence tools for environmental applications" #29 Leo Chau-Kuang Liau and Robert W. McLaren: "The application of a neural net to parameter optimization of a continuous stirred tank reactor" #30 Brian M. O'Rourke: "Advanced time series modeling with neural networks" Neural System Models (B): #31 Clay Holroyd, Jesse Reichler, and Michael G.H. Coles: "Generation of error-related scalp potentials by a mesencephalic dopamine system for error processing" #32 Christian W. Eurich, Klaus Pawelzik, and John G. Milton: "Encoding temporal patterns by delay adaptation in networks of spiking neurons" #33 J.F. Gomez, F.J. Lopera, E. Marin, D. Pineda, and A. Rios: "A computer model using neuroimaging of a cyclic cortical wave for brain development, adult-brain-steady-state, and cerebral degeneration" Neural System Models (T): #34 M. Sreenivasa Rao and Arun K. Pujari: "A new neural network architecture with associative memory, pruning and order-sensitive learning" #35 L.M. Gelman and N.I. Bouraou: "Adaptive method for object recognition" #36 Alexandra I. 
Cristea and Toshio Okamoto: "Deduction of an L-based energy function for SE prediction" #37 Wei Cao and James Burghart: "Pattern-up-mapping method and fractional convergence" #38 Norbert Jankowski: "Controlling the structure of neural networks that grow and shrink" From salzamas at netlab.it Tue Mar 24 12:08:40 1998 From: salzamas at netlab.it (salzano) Date: Tue, 24 Mar 1998 18:08:40 +0100 Subject: New Journal Message-ID: <1.5.4.32.19980324170840.006e5390@mbox.netlab.it> Sorry for multiple posting. Hello to everybody, We are now publishing a new journal, Economics & Complexity. Is anybody interested in topics such as NN, complex systems, chaos, fuzzy sets and fuzzy choice theory applied to economics, public finance and financial systems? We would like to increase the list of experts connected with the Journal. Let me know. In the attachment you will find information about "Economics & Complexity". Massimo Salzano, University of Salerno - Italy ECONOMICS & COMPLEXITY an Interdisciplinary Journal on Public, Financial, Globalisation and Social Economics Federico Caffè Centre Publisher Roskilde University POBOX 260 DK-4000 Roskilde, DENMARK ISSN 1398-1706 VOL. 1 N. 1 WINTER 1997-98 CONTENTS The Control of Economics as a Complex System Massimo Salzano Towards a Model of Economic Growth Embodying an Evolutionistic Perspective Davide Infante Long-Term Memory Stability in the Italian Stock Market Marco Corazza Financial Time Series and Non Linear Models A.Amendola, M.S. Andreano, C. Vitale Announcements Scientific board: Prof. Bruno Amoroso - Economics of Globalization - University of Roskilde - DK Prof. Elio Canestrelli - Financial Mathematics - University of Venice - Ca' Foscari - IT Prof. Stefano Ecchia - Financial Markets - University of Naples - Federico II - IT Prof. Stuart Holland - International Political Economy - University of Roskilde - DK Prof. Roman Strongin - Math. Optimisation - Nizhni Novgorod - Lobachevsky University - RF Prof.
Salvatore Vinci - Economics - "Navale" University - Napoli - IT Prof. Cosimo Vitale - Statistics - University of Salerno - IT Managing editor: Prof. Massimo Salzano - Public Finance - University of Salerno - University of Calabria - IT Tel: (39) 089 962158 (39) 984 492443 fax: (39) 984 492421 The aim of this journal is to spread the use of a complex, interdisciplinary, methodological approach to the study of economics. Every issue will be devoted to a specific topic which will take into account the importance of having different perspectives on the subject. Many books and articles have been written on complexity in economics but, generally, they are oriented towards a more theoretical approach, and at present it is very difficult to speak of a complex approach to public policy. Perhaps this is because the complex approach lies at a crossroads between economics, public finance, banking, the financial sector, mathematics and statistics. The choice of the scientific board takes this reality into consideration. Only academics who are very well known in their respective fields, but are still open to new ideas, can be the guarantors that this new approach will be applied to economics whilst maintaining substantial respect for disciplinary and interdisciplinary methodology. All contributions to the journal will be refereed using the usual "invisible referee" approach. For works presented at conferences organised by the journal, there will be a double screening: first, acceptance for the conference will depend on a referee; then, after the conference, the results of the public discussion will be taken into account. The journal will be published twice a year, though some special numbers may appear. Each number will be published in Denmark, both on paper and on the Internet. Generally we will produce fifty physical copies. With the author's acceptance, provisional versions of articles to be published in the Journal may be posted on the Internet.
The language of the journal is English. Instructions to Authors (1) Papers must be in English. (2) Papers for publication (2 copies and an electronic manuscript, i.e., on disk or by e-mail with the accompanying manuscript) should be sent to: Professor M. Salzano, Managing Editor, Economics & Complexity, Department of Economics, University of Salerno, Fisciano - ITALY; e-mail: salzamas at netlab.it Submission of a paper will be held to imply that it contains original unpublished work and is not being submitted for publication elsewhere. The Editor does not accept responsibility for damage or loss of papers submitted. Upon acceptance of an article, the author(s) will be asked to authorise publication of the article on the Internet as well. (3) The preferred format is Winword 8 for Win95. Do not submit your original paper on paper only. Do submit the accepted version of your paper as an electronic manuscript. Make absolutely sure that the file on the disk and the printout are identical. Label the disk with your name; also specify the software and hardware used as well as the title of the file to be processed. Please check that the file is correctly formatted. Do not allow your word processor to introduce word breaks and do not use a justified layout. Please adhere strictly to the general instructions below on style, arrangement and, in particular, the reference style of the journal. (4) Manuscripts should be typewritten on one side of the paper only, double-spaced with wide margins. All pages should be numbered consecutively. Title and subtitles should be short (less than half a line). (5) The first page of the manuscript should contain the following information: (i) the title; (ii) the name(s) and institutional affiliation(s) of the author(s); (iii) an abstract of not more than 80 words.
A footnote on the same sheet should give (a) the name, address, and telephone and fax numbers of the corresponding author, as well as an e-mail address; (b) acknowledgements and information on grants received. (6) The first page of the manuscript should also contain up to five key words. (7) Footnotes should be kept to a minimum and numbered consecutively throughout the text with superscript arabic numerals. (8) For formulae, please use basic Winword MathType, not an advanced version. (9) References in the text should be as follows: "Allison (1915) demonstrates that ...". The list of references should appear at the end of the main text. It should be single spaced and listed in alphabetical order by author's name. References should appear as follows: Author1, N., Author2, N.B., and Author3, L.J., (1938): The tax advantage. MacMillan, New York Author1, W., and Author2, E. (1998): The Elements of Economics, Journal of Economics and Society The journal title should not be abbreviated. (10) Illustrations should be inserted in the text. All graphs and diagrams, numbered consecutively, should be referred to as figures. Illustrations can be printed in colour only if they are judged by the Editor to be essential to the presentation. Further information concerning colour illustrations and the costs to the author can be obtained from the publisher. Any manuscript which does not conform to the above instructions will not be considered for publication. Usually, no page proofs will be sent to the corresponding author. Twenty offprints of each paper are supplied free of charge to the corresponding author; additional offprints are available at cost if they are ordered when the revised final version of the work is submitted. ISSN 1398-1706 From kaplane at rockvax.rockefeller.edu Wed Mar 25 08:13:18 1998 From: kaplane at rockvax.rockefeller.edu (Dr.
Ehud Kaplan) Date: Wed, 25 Mar 1998 08:13:18 -0500 Subject: Post-Doc position in Computational Neuroscience Message-ID: <01BD57C5.DF0DA870.kaplane@rockvax.rockefeller.edu> Post-Doctoral position in Computational Neuroscience available-- We are looking for a post-doctoral fellow to work with us in computational neuroscience at the Mount Sinai School of Medicine in New York City. We are developing an approach to simulation of large-scale neuronal ensembles, and are looking for someone with expertise and interest in computer simulations, dynamical systems, and applied mathematics. Knowledge of neuroscience is an obvious advantage. Our group includes, among others: Bruce Knight, Larry Sirovich and Ehud Kaplan, and is involved in both theoretical (mathematical and computational simulations) and experimental (optical imaging and electrophysiology of the visual cortex) approaches. Physicists, mathematicians, neurobiologists and engineers work together. Please apply to: postdoc at camelot.mssm.edu Ehud Kaplan, Ph.D. Jules & Doris Stein Research-to-Prevent-Blindness Professor Depts. of Ophthalmology, Physiology & Biophysics Mount Sinai School of Medicine One Gustave Levy Place NY, NY, 10029 From dayan at flies.mit.edu Wed Mar 25 12:08:34 1998 From: dayan at flies.mit.edu (Peter Dayan) Date: Wed, 25 Mar 1998 12:08:34 -0500 (EST) Subject: postdoc job Message-ID: <199803251708.MAA19298@flies.mit.edu> A non-text attachment was scrubbed... 
Name: not available Type: text Size: 1627 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/b4b8dad3/attachment-0001.ksh From leila at ida.his.se Wed Mar 25 18:31:03 1998 From: leila at ida.his.se (Leila Khammari) Date: Thu, 26 Mar 1998 00:31:03 +0100 Subject: ICANN 98 - EXTENDED DEADLINE Message-ID: <351993B6.DDE4DF@ida.his.se> _________________________________________________________________ ICANN 98 - DEADLINE EXTENDED to Monday, March 30 Due to numerous requests for an extension of the ICANN 98 paper submission deadline, all papers received by Monday, March 30, 11.00 p.m. CET will be included in the review process. _________________________________________________________________ 8th INTERNATIONAL CONFERENCE ON ARTIFICIAL NEURAL NETWORKS September 1-4, 1998, Skoevde, Sweden For details see: http://www.his.se/ida/icann98/ _________________________________________________________________ SUBMISSION Submissions are sought in all areas of artificial neural network research, in particular - Theory - Applications - Computational Neuroscience & Brain Theory - Connectionist Cognitive Science - Autonomous Robotics & Adaptive Behaviour - Hardware & Implementations Papers of maximum 6 pages can be submitted by March 30, 1998, * either ELECTRONICALLY via the web-service * or via EXPRESS/COURIER MAIL to the conference secretariat. Templates for LaTeX, Word and Framemaker are available for download. For details please see: http://www.his.se/ida/icann98/submission or contact the conference secretariat (see below). All papers accepted for oral or poster presentation will appear in the conference proceedings published by Springer-Verlag.
_________________________________________________________________ IMPORTANT DATES March 30, 1998 - NEW DEADLINE, papers must be received May 6, 1998 - notification of acceptance May 28, 1998 - camera ready papers must be received September 1, 1998 - ICANN 98 tutorials September 2-4, 1998 - ICANN 98 takes place _________________________________________________________________ CONFERENCE SECRETARIAT ICANN 98 Hoegskolan Skoevde Hoegskolevaegen 1 S-541 28 Skoevde SWEDEN Email: icann98 at ida.his.se Telefax: +46 (0)500-46 47 25 WWW: http://www.his.se/ida/icann98/ _________________________________________________________________ From kchen at cis.ohio-state.edu Thu Mar 26 11:20:08 1998 From: kchen at cis.ohio-state.edu (Ke CHEN) Date: Thu, 26 Mar 1998 11:20:08 -0500 (EST) Subject: preprint. Message-ID: Dear Connectionists, The following preprint is available now on line: http://www.cis.ohio-state.edu/~kchen/jnc98.ps Best regards, -kc ---------------------------------------------------- Dr. Ke CHEN Department of Computer and Information Science The Ohio State University 583 Dreese Laboratories 2015 Neil Avenue Columbus, Ohio 43210-1277 U.S.A. Phone: 1-614-292-4890(O) (with an answering machine) Fax: 1-614-292-2911 E-Mail: kchen at cis.ohio-state.edu WWW: http://www.cis.ohio-state.edu/~kchen ------------------------------------------------------ ########################################################################## A Method of Combining Multiple Probabilistic Classifiers through Soft Competition on Different Feature Sets Ke Chen{1,2} and Huisheng Chi{1} {1} National Lab of Machine Perception and Center for Information Science Peking University, Beijing 100871, China {2} Dept of CIS and Center for Cognitive Science The Ohio State University, Columbus, OH 43210-1277, USA To appear in NEUROCOMPUTING - AN INTERNATIONAL JOURNAL, 1998. ABSTRACT A novel method is proposed for combining multiple probabilistic classifiers on different feature sets. 
In order to achieve improved classification performance, a generalized finite mixture model is proposed as a linear combination scheme and implemented based on radial basis function networks. In the linear combination scheme, soft competition on different feature sets is adopted as an automatic feature ranking mechanism so that different feature sets can always be used simultaneously in an optimal way to determine the linear combination weights. For training the linear combination scheme, a learning algorithm is developed based on the Expectation-Maximization (EM) algorithm. The proposed method has been applied to a typical real world problem, viz. speaker identification, in which different feature sets often need to be considered simultaneously for robustness. Simulation results show that the proposed method yields good performance in speaker identification. Keywords: Combination of multiple classifiers, soft competition, different feature sets, Expectation-Maximization (EM) algorithm, speaker identification ########################################################################## From omori at cc.tuat.ac.jp Thu Mar 26 04:13:20 1998 From: omori at cc.tuat.ac.jp (=?ISO-2022-JP?B?GyRCQmc/OSEhTjQ7ShsoSg==?=) Date: Thu, 26 Mar 1998 18:13:20 +0900 Subject: ICONIP'98 : call for paper : EXTENDED DEADLINE to MAY 15 Message-ID: <01BD58E2.DBEF2CE0@BRAIN> - - Please accept our apologies if you receive multiple copies of this message. - - We would be most grateful if you would forward this message to potentially interested parties.
+-----------------------------------------+ | SUBMISSION DEADLINE EXTENDED TO MAY 15 | +-----------------------------------------+ Call for Papers The Fifth International Conference on Neural Information Processing ICONIP'98 +----------------------------------------------------+ | http://jnns-www.okabe.rcast.u-tokyo.ac.jp/jnns/ICONIP98.html | +----------------------------------------------------+ Organized by Japanese Neural Network Society (JNNS) Sponsored by Asian Pacific Neural Network Assembly (APNNA) October 21-23,1998 Kitakyushu International Conference Center 3-9-30 Asano, Kokura-ku, Kitakyushu 802, Japan In 1998, the annual conference of the Asian Pacific Neural Network Assembly, ICONIP'98, will be held jointly with the ninth annual conference of Japanese Neural Network Society, from 21 to 23 October 1998 in Kitakyushu, Japan. The goal of ICONIP'98 is to provide a forum for researchers and engineers from academia and industries to meet and to exchange ideas on advanced techniques and recent developments in neural information processing. The conference further serves to stimulate local and regional interests in neural information processing and its potential applications to industries indigenous to this region. 
Topics of Interest Track 1: Neurobiological Basis of Brain Functions Track 2: Mathematical Theory of Brain Functions Track 3: Cognitive and Behavioral Aspects of Brain Functions Track 4: Technical Aspects of Neural Networks Track 5: Distributed Processing Systems Track 6: Applications of Neural Networks Track 7: Implementations of Neural Networks Topics covered (Key Words): Neuroscience, Neurobiology and Biophysics, Learning and Plasticity, Sensory and Motor Systems, Cognition and Perception, Algorithms and Architectures, Learning and Generalization, Memory, Neurodynamics and Chaos, Probabilistic and Statistical Methods, Neural Coding, Emotion, Consciousness and Attention, Visual and Auditory Computation, Speech and Language, Neural Control and Robotics, Pattern Recognition and Signal Processing, Time Series Forecasting, Blind Separation, Knowledge Acquisition, Data Mining, Rule Extraction, Emergent Computation, Distributed AI Systems, Agent-Based Systems, Soft Computing, Real World Systems, Neuro-Fuzzy Systems, Neural Devices and Hardware, Neural and Brain Computers, Software Tools, System Integration Conference Committee Conference Chair: Kunihiko Fukushima, Osaka University Conference Vice-chair: Minoru Tsukada, Tamagawa University Organizing Chair: Shuji Yoshizawa, Tokyo University Program Chair: Shiro Usui, Toyohashi University of Technology International Advisory Committee (tentative) Chair: Shun-ichi Amari, Institute of Physical and Chemical Research Members: S. Bang (Korea), J. Bezdek (USA), J. Dayhoff (USA), R. Eckmiller (Germany), W. Freeman (USA), N. Kasabov (New Zealand), H. Mallot (Germany), G. Matsumoto (Japan), N. Sugie (Japan), R. Suzuki (Japan), K. Toyama (Japan), Y. Wu (China), Lei Xu (Hong Kong), J. Zurada (USA) Call for Papers The Programme Committee is looking for original papers on the above-mentioned topics.
Authors should pay special attention to the explanation of theoretical and technical choices involved, point out possible limitations and describe the current state of their work. All received papers will be reviewed by the Programme Committee. The authors will be informed about the decision of the review process by June 30, 1998. All accepted papers will be published. As the conference is a multidisciplinary meeting, the papers are required to be comprehensible to a wide rather than a very specialized audience. Instructions to Authors Papers must be received by May 15, 1998. The papers must be submitted in a camera-ready format. Papers will be presented at the conference either in an oral or in a poster session. Please submit six copies of the paper written in English on A4-format white paper with equal sized left and right margins, and 18 mm from the top, in two column format, on not more than 4 pages, single-spaced, in Times or similar font of 10 points, and printed on one side of the page only. Centred at the top of the first page should be the complete title, author(s), mailing and e-mail addresses, followed by an abstract and the text. In the covering letter the track and the topics (3-4 keywords) of the paper according to the list above should be indicated. No changes will be possible after submission of your manuscript. Authors may also retrieve the ICONIP style files "iconip98.tex", "iconip98.sty", "epsbox.sty", and "sample.ps" (compressed together as "form.tar.gz") for the conference from the homepage. Language The use of English is required for papers and presentations. No simultaneous interpretation will be provided. Workshops No tutorials will be held before or after the conference. Two satellite workshops will be held. One, the "Satellite Workshop for Young Researchers on Information Processing", will be held after the conference. The details can be obtained from http://jnns-www.okabe.rcast.u-tokyo.ac.jp/jnns/iconip98/iconip98_ws.html.
Another workshop is the Riken-Tamagawa International Dynamic Brain Forum (DBF'98), October 18-20, 1998. This will take place in the Brain Science Research Center, Tamagawa University Research Institute. It is organized by S.Amari, M.Tsukada, K.Aihara, and H.Dinse, and the tentative list of invited speakers includes the following people: J.P.Segundo (Univ. of California, USA), W.Freeman (USA), Maass, Longtin, Gerstein (USA), S.Thorpe, P.Werbos (USA), Ad Aertsen (Albert-Ludwigs-Univ., Germany), H.Dinse (Ruhr-University, Germany), G.Hauske (TU Munich, Germany), W.Von Seelen (Ruhr-University, Germany), G.Sandner (Univ. Louis Pasteur, France), C.E.Schreiner (Univ. of California, USA), N.M.Weinberger (Univ. of California, USA), P.Erdi (Academy of Sciences, Hungary) Important Dates for ICONIP'98 Papers Due: May 15, 1998 Notification of Paper Acceptance: June 30, 1998 Second Circular (with Registration Form): June 30, 1998 Registration of at least one author of a paper: July 31, 1998 Early Registration: July 31, 1998 Conference: October 21-23, 1998 Workshops: October 24-26, 1998 For further information, please contact: ICONIP'98 Secretariat Mr. Masahito Matue Japan Technical Information Service Sogo Kojimachi No.3 Bldg. 1-6 Kojimachi, Chiyoda-ku, Tokyo 102, Japan Tel: +81-3-3239-4565 Fax: +81-3-3239-4714 E-mail: jatisc at msm.com Could you suggest ICONIP'98-Kitakyushu to friends and acquaintances who might be interested? Thank you. From elman at crl.ucsd.edu Wed Mar 25 17:22:22 1998 From: elman at crl.ucsd.edu (Jeff Elman) Date: Wed, 25 Mar 1998 14:22:22 -0800 (PST) Subject: Postdoc announcement: CRL/UC San Diego Message-ID: <199803252222.OAA27121@crl.ucsd.edu> CENTER FOR RESEARCH IN LANGUAGE UNIVERSITY OF CALIFORNIA, SAN DIEGO ANNOUNCEMENT OF POSTDOCTORAL FELLOWSHIPS Applications are invited for postdoctoral fellowships in Language, Communication and Brain at the Center for Research in Language at the University of California, San Diego.
The fellowships are supported by the National Institutes of Health (NIDCD), and provide an annual stipend ranging from $20,292 to $26,900 depending upon years of postdoctoral experience. In addition, funding is provided for medical insurance and limited travel. The program provides interdisciplinary training in: (1) psycholinguistics, including language processing in adults and language development in children; (2) communication disorders, including childhood language disorders and adult aphasia; (3) electrophysiological studies of language, and (4) neural network models of language learning and processing. Candidates are expected to work in at least one of these four areas, and preference will be given to candidates with background and interests involving more than one area. Executive Committee Members: Elizabeth Bates, Depts. of Cognitive Science & Psychology, UCSD Jeffrey Elman, Dept. of Cognitive Science, UCSD Marta Kutas, Dept. of Cognitive Science, UCSD David Swinney, Dept. of Psychology, UCSD Beverly Wulfeck, Dept. of Communicative Disorders, San Diego State University Grant conditions require that candidates be citizens or permanent residents of the U.S. In addition, trainees will incur a payback obligation during their first year of postdoctoral NRSA support and are required to complete a Payback Agreement. Applications must be RECEIVED by May 1, 1998. Training may begin as early as July 1, 1998 and as late as May 30, 1999. This is a one year appointment. Questions regarding this program may be sent to Joanna Mancusi, jmancusi at ucsd.edu. Applicants should send a statement of interest, three letters of recommendation, a curriculum vitae and copies of relevant publications to: Postdoc Fellowship Committee Center for Research in Language 0526 University of California, San Diego 9500 Gilman Drive La Jolla, California 92093-0526 (619) 534-2536 Women and minority candidates are specifically invited to apply. 
From paolo at McCulloch.ING.UNIFI.IT Fri Mar 27 06:15:19 1998 From: paolo at McCulloch.ING.UNIFI.IT (Paolo Frasconi) Date: Fri, 27 Mar 1998 12:15:19 +0100 (MET) Subject: PhD Scholarship Message-ID: PhD Scholarship in Adaptive Processing of Data Structures Faculty of Informatics University of Wollongong Australia An Australian Research Council Large grant for 1998 -- 2000 was awarded to Professors Tsoi (University of Wollongong), Gori (University of Siena), and Sperduti (University of Pisa) to study adaptive processing of data structures, a new way of studying problems which can be represented as data structures. Many practical problems, e.g., image understanding, document understanding, modelling of access behaviour on the internet, are more suitably modelled by data structures, owing to the relative ease of handling problems with dynamic and variable structures. A PhD scholarship for three years, tenable at the University of Wollongong, in adaptive processing of data structures is available for a suitably qualified candidate to take up immediately. The candidate must have a good first class undergraduate degree in computer science, computer engineering, mathematics or another related discipline, with some familiarity with neural networks, data structures and automata theory. It is desirable for the candidate to have some postgraduate training in neural networks. Interested candidates should access our project web site: http://www.dsi.unifi.it/~paolo/datas for more information on the project, researchers involved in the project, and papers relevant to the project. Further information can be obtained by contacting Professor A. C. Tsoi, Dean, Faculty of Informatics, University of Wollongong, Email: act at wumpus.uow.edu.au; Phone: +(61) 2-42-21-38-43; Fax: +(61) 2-42-21-48-43.
Application for the PhD scholarship, including a brief curriculum vitae, transcripts of academic results, and recommendations from three lecturers who know you, should be sent to Ms Cathy McIvor, Personnel Services, University of Wollongong, Northfields Avenue, Wollongong, NSW 2522, Australia by 24th April, 1998. Paolo Frasconi, Universita' di Firenze, Dipartimento di Sistemi e Informatica, Via di Santa Marta 3, 50139 Firenze (Italy), tel: +39 (55) 479-6362, fax: +39 (55) 479-6363, http://www.dsi.unifi.it/~paolo/ From niall at zeus.csis.ul.ie Sat Mar 28 08:37:00 1998 From: niall at zeus.csis.ul.ie (Niall Griffith) Date: Sat, 28 Mar 1998 13:37:00 GMT Subject: IEE Colloquium - Neural Nets and MultiMedia Message-ID: <9803281337.AA20770@zeus.csis.ul.ie> Please pass this on to anyone or any group you think may be interested. ============================================================== IEE Colloquium on "Neural Networks in Multimedia Interactive Systems" Thursday 22 October 1998, Savoy Place, London. Call for Papers --------------- The IEE are holding a colloquium at Savoy Place on the use of neural network models in multimedia systems. This is a developing field of importance both to multimedia application developers, who want to build more responsive and adaptive systems, and to neural network researchers. The aim of the colloquium is to present a range of current neural network applications in the area of interactive multimedia, covering topics including learning, intelligent agents within multimedia systems, data mining, image processing and intelligent application interfaces. Invited Speakers: ----------------- Bruce Blumberg, MIT Media Lab. Jim Austin, York. Russell Beale, Birmingham.
Call For Papers --------------- Submissions are invited in (but not limited to) the following areas: Adaptive and plastic behaviour in multi-media systems Concept and behaviour learning and acquisition Browsing mechanisms Preference and strategy identification and learning Data mining Image processing in multimedia systems Cross modal and media representations and processes Intelligent agents Interested parties are invited to submit a two page (maximum) abstract of their proposed talk to either Dr. Niall Griffith, Department of Computer Science and Information Science, University of Limerick, Limerick, Ireland. email: niall.griffith at ul.ie Telephone: +353 61 202785 Fax: +353 61 330876 or Professor Nigel M Allinson Dept. of Elec. Eng. & Electronics UMIST PO Box 88 Manchester, M60 1QD, UK Voice: (+44) (0) 161-200-4641 Fax: (+44) (0) 161-200-4781/4 Internet: allinson at umist.ac.uk Timetable: ---------- 29th April: Deadline for talk submissions 15th June: Authors notified. 24th November: Colloquium at IEE, Savoy Place, London ===================================================== From skremer at q.cis.uoguelph.ca Sun Mar 29 16:44:29 1998 From: skremer at q.cis.uoguelph.ca (Stefan C. Kremer) Date: Sun, 29 Mar 1998 16:44:29 -0500 (EST) Subject: M.Sc. program in connectionism (Guelph, Ont., CANADA) Message-ID: The University of Guelph (Ontario, Canada) is now accepting applications from students with an interest in artificial neural and connectionist networks for its M.Sc. program in Computing and Information Science. The department has one of the largest neural network research groups in Canada and offers graduate courses in Artificial Neural Networks, Genetic Algorithms, Autonomous Robotics, as well as traditional Artificial Intelligence and other topics in Computer Science. Joint research projects with industry and collaboration with other neural network groups in the area enhance the learning environment.
To be considered for admission, applicants must have a minimum 73% (`B') average during the previous four semesters of university study (though actual cut-offs are usually much higher) and are expected to possess a four-year honours degree in computer science. Students who are externally funded are especially encouraged to apply, though local funding may also be available for outstanding applicants. Most available spaces will be filled in May for entry in September. To assist in identifying a suitable thesis advisor, applicants are requested to submit descriptions of their research interests. For more information please e-mail Prof. Stefan Kremer at the address listed below. -- Dr. Stefan C. Kremer, Assistant Prof., Dept. of Computing and Information Science University of Guelph, Guelph, Ontario N1G 2W1 WWW: http://hebb.cis.uoguelph.ca/~skremer Tel: (519)824-4120 Ext.8913 Fax: (519)837-0323 E-mail: skremer at snowhite.cis.uoguelph.ca From segevr at post.tau.ac.il Mon Mar 30 06:14:44 1998 From: segevr at post.tau.ac.il (Ronen Segev) Date: Mon, 30 Mar 1998 14:14:44 +0300 (IDT) Subject: Paper: Self-Wiring of Neural Networks. Message-ID: Dear Connectionists, The following paper has been published in Physics Letters A, Vol. 237/4-5 (1998), pp. 307-313. Hard copies can be obtained by sending an email to: segevr at post.tau.ac.il Your comments are welcome! Ronen Segev, email: segevr at post.tau.ac.il, School of Physics & Astronomy, Tel Aviv University. ========================================================================== TITLE: Self-Wiring of Neural Networks AUTHORS: Ronen Segev and Eshel Ben-Jacob. Tel Aviv University, Tel Aviv, Israel. ABSTRACT: In order to form the intricate network of synaptic connections in the brain, growth cones migrate through the embryonic environment to their targets using chemical communication. As a first step toward studying self-wiring, 2D model systems of neurons have been used.
We present a simple model to reproduce the salient features of the 2D systems. The model incorporates random walkers representing the growth cones, which migrate in response to chemotactic substances secreted by the soma and communicate with each other and with the soma by means of attractive chemotactic "feedback". From nic at idsia.ch Mon Mar 30 04:20:51 1998 From: nic at idsia.ch (Nici Schraudolph) Date: Mon, 30 Mar 1998 11:20:51 +0200 Subject: TR: online local step size adaptation Message-ID: <199803300920.LAA00873@idsia.ch> Dear colleagues, the following technical report (10 pages, 143kB gzipped postscript) is available by anonymous ftp from the address given below. Best regards, -- Dr. Nicol N. Schraudolph Tel: +41-91-970-3877 IDSIA Fax: +41-91-911-9839 Corso Elvezia 36 CH-6900 Lugano http://www.idsia.ch/~nic/ Switzerland http://www.cnl.salk.edu/~schraudo/ Technical Report IDSIA-09-98: Online Local Gain Adaptation for Multi-Layer Perceptrons -------------------------------------------------------- Nicol N. Schraudolph We introduce a new method for adapting the step size of each individual weight in a multi-layer perceptron trained by stochastic gradient descent. Our technique derives from the K1 algorithm for linear systems (Sutton, 1992), which in turn is based on a diagonalized Kalman Filter. We expand upon Sutton's work in two regards: K1 is a) extended to nonlinear systems, and b) made more efficient by linearizing an exponentiation operation. The resulting ELK1 (extended, linearized K1) algorithm is computationally little more expensive than alternative proposals (Zimmermann, 1994; Almeida et al., 1997, 1998), and does not require an arbitrary smoothing parameter. On a first benchmark problem ELK1 clearly outperforms these alternatives, as well as stochastic gradient descent with momentum, even when the number of floating-point operations required per weight update is taken into account.
Unlike the method of Almeida et al., ELK1 does not require statistical independence between successive training patterns. ftp://ftp.idsia.ch/pub/nic/olga.ps.gz From bogus@does.not.exist.com Mon Mar 30 08:32:40 1998 From: bogus@does.not.exist.com () Date: Mon, 30 Mar 98 14:32:40 +0100 Subject: No subject Message-ID: PhD Studentship in Neural Networks Neuroendocrine Research Group, Department of Physiology, University of Edinburgh, UK MODELLING THE EFFECTS OF SYNAPTIC INPUT ON A BURST-GENERATING NEURONE This BBSRC special committee studentship will involve constructing mathematical models of oxytocin neurones, assessing the properties of the models by computer simulation and, where possible, analytical techniques, and planning, in conjunction with electrophysiologists, the experimental testing of the models, e.g. by devising critical experiments to discriminate between hypotheses. Random synaptic input is an important element of the environment of oxytocin neurones, which have important physiological functions related to their two modes of firing. Tonic activity is involved in the regulation of the osmotic pressure of the blood, whereas bursts of intense activity coincide with milk ejections in lactating females, and with contractions during parturition. Mathematical models will be constructed at various levels of complexity, starting with a development of the simple leaky integrator model, via well-known models of bursting neurones, to more biophysically based models. Our general philosophy is to use models which are as simple as possible while still explaining the experimental findings in broad terms. This facilitates the use of analytical techniques to underpin simulation studies of more realistic models. Models are only further elaborated when the simple models are clearly inadequate.
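(The leaky integrator mentioned in the studentship description is the textbook leaky integrate-and-fire model. As a purely illustrative sketch of that starting point — all parameter values below are invented for the example and are not taken from the project — a leaky integrator neurone driven by random synaptic input might look like this:)

```python
import random

def leaky_integrator(steps=2000, dt=0.1, tau=10.0, v_rest=0.0,
                     v_thresh=1.0, v_reset=0.0, rate=0.05,
                     epsp=0.3, seed=1):
    """Leaky integrate-and-fire neurone with random synaptic input.

    The membrane potential V decays toward v_rest with time constant
    tau; each time step an excitatory postsynaptic potential of size
    `epsp` arrives with probability `rate`.  When V crosses v_thresh,
    a spike time is recorded and V is reset.
    """
    rng = random.Random(seed)
    v = v_rest
    spikes = []
    for i in range(steps):
        # Leak term: forward-Euler step of dV/dt = -(V - v_rest)/tau.
        v += dt * (v_rest - v) / tau
        # Random synaptic bombardment (Bernoulli approximation of a
        # Poisson input train).
        if rng.random() < rate:
            v += epsp
        # Threshold-and-reset spiking mechanism.
        if v >= v_thresh:
            spikes.append(i * dt)
            v = v_reset
    return spikes

spikes = leaky_integrator()
```

(With these toy parameters the mean input drive exceeds the leak, so the model fires irregularly — tonic-like activity; bursting would require additional slow variables, as in the better-known bursting-neurone models the advertisement alludes to.)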
The studentship will be supervised by Professor Gareth Leng in the Physiology Department, Edinburgh, and by David Brown in the Biomathematics Laboratory, Babraham Institute, Cambridge, in collaboration with neuroendocrinologists and other mathematicians in Edinburgh. This is an exciting opportunity to be involved in mathematical work on a neuronal system for which there is much experimental data, whose properties are interesting from both a physiological and a mathematical point of view, and which has been little modelled so far. Applicants should have, or expect to obtain, a good degree in a quantitative subject (e.g. mathematics, physics), preferably with an interest in biology. The maintenance grant is about £6,800 per annum (for UK residents). The studentship will start in October 1998 and run for 3 years. The project will involve some travel between Edinburgh and Cambridge, so as to facilitate regular contact with the mathematicians and biologists at the two sites. Further information from Gareth Leng (0131 650 2869, email gareth.leng at ed.ac.uk) or David Brown (01223 832312 ext 224, email david.brown at bbsrc.ac.uk). Applications, in the form of a CV (detailing academic performance so far) and the names and addresses of two referees, should be sent to Professor Gareth Leng, Department of Physiology, Medical School, University of Edinburgh, Teviot Place, Edinburgh, or to David Brown, Laboratory of Biomathematics, Babraham Institute, Cambridge CB2 4AT as soon as possible. From iiass at tin.it Tue Mar 24 04:20:50 1998 From: iiass at tin.it (IIASS) Date: Tue, 24 Mar 1998 10:20:50 +0100 Subject: call Message-ID: <35177AF2.751B@tin.it>