From sschaal at hip.atr.co.jp Tue Aug 1 13:31:27 1995
From: sschaal at hip.atr.co.jp (Stefan Schaal)
Date: Tue, 1 Aug 95 13:31:27 JST
Subject: Paper available on incremental local learning
Message-ID: <9508010431.AA05186@hsun06>

The following paper is available by anonymous FTP or WWW:

FROM ISOLATION TO COOPERATION: AN ALTERNATIVE VIEW OF A SYSTEM OF EXPERTS

Stefan Schaal and Christopher G. Atkeson
submitted to NIPS'95

We introduce a constructive, incremental learning system for regression problems that models data by means of locally linear experts. In contrast to other approaches, the experts are trained independently and do not compete for data during learning. Only when a prediction for a query is required do the experts cooperate by blending their individual predictions. Each expert is trained by minimizing a penalized local cross validation error using second order methods. In this way, an expert is able to adjust the size and shape of the receptive field in which its predictions are valid, and also to adjust its bias on the importance of individual input dimensions. The size and shape adjustment corresponds to finding a local distance metric, while the bias adjustment accomplishes local dimensionality reduction. We derive asymptotic results for our method. In a variety of simulations we demonstrate the properties of the algorithm with respect to interference, learning speed, prediction accuracy, feature detection, and task oriented incremental learning.

-------------------------

The paper is 8 pages long, requires 2.3 MB of memory (uncompressed), and is ftp-able as:

ftp://ftp.cc.gatech.edu/people/sschaal/schaal-NIPS95.ps.gz

or can be accessed through:

http://www.cc.gatech.edu/fac/Stefan.Schaal/
http://www.hip.atr.co.jp/~sschaal/

Comments are most welcome.

From malcolm%flash at newcastle.ac.uk Tue Aug 1 14:35:00 1995
From: malcolm%flash at newcastle.ac.uk (Malcolm Young)
Date: Tue, 1 Aug 1995 12:35:00 -0600
Subject: Assistant Professorships in the U.K.
Message-ID: <9508011235.ZM7279@flash.ncl.ac.uk>

UNIVERSITY OF NEWCASTLE UPON TYNE
DEPARTMENT OF PSYCHOLOGY

Lecturers in Cognitive and Computational Neuroscience or Behavioural Ecology

Applications from suitably qualified and experienced scientists are invited for two lectureships in this enthusiastic and expanding department. Candidates should have a good honours degree, a doctorate, knowledge of computing and multivariate data analysis, and a record of research activity in published papers and research support. Applicants who will contribute to three areas of research, namely (1) cognitive, sensory and systems neuroscience, (2) neural network computing, and (3) behavioural ecology, are particularly encouraged to apply. Excellent research facilities and support will be provided. Salary will be on the Lecturer Scale (to GBP26,430), at a position determined by qualifications and experience.

The Department of Psychology at Newcastle is broadly-based and rapidly growing. All undergraduate courses offered by the Department are heavily over-subscribed and the Department's postgraduate research community is the fastest growing of any in the university. Staff have research programs in cognitive neuroscience, behavioural ecology, and applied, occupational, health, social and cognitive psychology. The Department is also the home of the Neural Systems Group, an interdisciplinary centre for research in sensory, systems, computational and cognitive neuroscience, and neural network computing. The members of the Neural Systems Group are drawn from the Departments of Psychology, Physiological Sciences and the School of Neurosciences. Further details of the department and the Neural Systems Group can be found on the WWW at http://york37.ncl.ac.uk/www/psychol.html

No forms of application are to be issued. Applications should take the form of a covering letter and three copies of the applicant's full curriculum vitae, with details of present salary and the names and addresses of three referees.
Informal enquiries may be made to Prof. Malcolm Young (0191-222-7525, mpy%crunch at ncl.ac.uk). Applications should be sent to: the Director of Personnel, University of Newcastle upon Tyne, NE1 7RU, UK. -- +------------------------------------------------------------------------+ + + + Professor Malcolm P. Young Tel: 0191 222 7525 + + Department of Psychology Fax: 0191 222 5622 + + Ridley Building + + University of Newcastle + + Newcastle upon Tyne + + NE1 7RU Email: malcolm at flash.ncl.ac.uk + + UK mpy at crunch.ncl.ac.uk + + nmpy at ncl.ac.uk + +------------------------------------------------------------------------+ From black at signal.dra.hmg.gb Tue Aug 1 12:03:09 1995 From: black at signal.dra.hmg.gb (John V. Black) Date: Tue, 01 Aug 95 17:03:09 +0100 Subject: Vacancies at the Defence Research Agency (Malvern, UK) Message-ID: VACANCIES IN PATTERN & INFORMATION PROCESSING AT THE DEFENCE RESEARCH AGENCY (DRA) IN MALVERN, WORCESTERSHIRE, ENGLAND Vacancies The Pattern & Information Processing (PIP) group within the Signal & Information Processing department has vacancies for research staff interested in Artificial Neural Networks or Statistical Pattern Recognition. Members of PIP invent, develop and provide advice on techniques which can be used to understand and interpret data/information in order to either aid decision-makers or automate decision-making. Techniques currently under investigation include novel self-organising networks; modifications to standard techniques to incorporate prior knowledge and known invariances, statistical pattern recognition techniques; multi-level information processing; fusion algorithms; tracking; classification; Bayesian inference; neural networks; decision making; and analogue information processing systems. Our aim is to research the underlying theoretical foundation of relevant techniques to create the knowledge and deep understanding required to provide advice for all DRA projects involving these techniques. 
Mathematical rigour with a methodical approach to experimental validation provides the necessary solid foundations for our work. The general emphasis is on techniques which are applicable to many domains but support is also provided to specific collaborative projects which both demonstrate the usefulness of our techniques and stimulate further developments. All members of staff are encouraged to develop to their full potential via continuing professional development based upon a wide range of training opportunities, including international conferences. Initially, a successful recruit may undertake work ranging from long term research to shorter term applications studies and consultancy, if appropriate. They will also be given every encouragement to initiate new research projects, develop/expand existing research themes and liaise with other projects. Starting salary will be based upon relevant experience in the range 10-26K Pounds Sterling (under review) or higher for exceptional candidates. Qualifications & Experience We have the flexibility to make appointments appropriate for applicants ranging from new graduates to experienced researchers. Successful candidates will probably have a good first degree in an appropriate scientific discipline. Proven research experience, typically a higher degree or equivalent, in one or more relevant areas such as information fusion, information processing, pattern recognition (including neural networks), decision support, statistical modelling or physical modelling, is preferable. It is also important that applicants be experienced in the software development of algorithms, models and experiments. Knowledge of defence applications projects would be an advantage for more senior appointments. Please note that successful applicants will need to be security cleared by the UK Ministry of Defence. DRA is an equal opportunities employer. 
Applications

Application forms can be obtained from:

CIS Recruitment Office
G Building
Defence Research Agency
St Andrews Road
Malvern
Worcestershire WR14 3PS
Telephone: (01684) 895642

Please mark any such applications for the attention of G D Whitaker, CIS5 department.

Further information

Further information about the DRA can be obtained on the world wide web (http://www.dra.hmg.gb/) although the PIP home page is still under construction and so not yet visible. Alternatively, e-mail gdw at signal.dra.hmg.gb with any specific questions or with CVs prior to submission of a formal application.

From wimw at mbfys.kun.nl Wed Aug 2 22:13:33 1995
From: wimw at mbfys.kun.nl (Wim Wiegerinck)
Date: Thu, 3 Aug 1995 04:13:33 +0200
Subject: SNN symposium Neural Networks and AI - Final Program
Message-ID: <199508030213.EAA13487@septimius.mbfys.kun.nl>

NEURAL NETWORKS AND ARTIFICIAL INTELLIGENCE
Scientific track of the 3rd SNN Neural Network Symposium
September 14-15, 1995
Nijmegen, the Netherlands
http://www.mbfys.kun.nl/SNN/Symposium/

////////////////////////////////////////////////////////////////////////////

On September 14 and 15 SNN will organize its annual Symposium on Neural Networks in the University Auditorium and Conference Centre of the University of Nijmegen. The topic of the conference is "Neural Networks and Artificial Intelligence". The term "Artificial Intelligence" is often associated with "traditional AI" methodology. Here it is used in its literal sense, referring to the problem of creating intelligence by artificial means, regardless of the method that is being used. The aim of the conference is two-fold: to give an overview of new developments in neurobiology and the cognitive sciences that may lead to novel computational paradigms, and to give an overview of recent achievements for some of the major conceptual problems of artificial intelligence and data modelling. The conference consists of four half-day single-track sessions.
Each session will consist of 2 invited contributions and 2 or 3 contributions selected from the submitted papers. In addition there will be poster sessions. There are about 50 selected papers and we expect approximately 250 attendees. The proceedings will be published by Springer-Verlag.

////////////////////////////////////////////////////////////////////////////
PROGRAM
////////////////////////////////////////////////////////////////////////////

Thursday, 14 September

 8.50 - C.C.A.M. Gielen
        Opening remarks

Neurobiology
------------
 9.00 - R. Eckhorn
        Segmentation coding in the visual system: neural signals that possibly support scene segmentation
 9.30 - B. van Dijk
        Synchrony and plasticity in the visual cortex
10.00 - K. Kopecz
        Dynamic representations provide the gradual specification of movement parameters
10.20 - J. Krueger
        Recording from foveal striate cortex while the monkey is looking at the stimulus
10.40 - Coffee & posters
11.10 - L. van Hemmen
        What coherently spiking neurons are good for
11.40 - A. Herz
        Rapid local synchronization: from spiking cells to synfire webs
12.10 - M.-O. Gewaltig
        Propagation of synfire activity in cortical networks: a statistical approach
12.30 - M. Arndt
        Propagation of synfire activity in cortical networks: a dynamical systems approach
12.50 - Lunch & posters

Cognitive modeling and rule extraction
--------------------------------------
14.30 - F. Wiersma
        The BB neural network rule extraction method
14.50 - N. Bakker
        Implications of Hadley's definition of systematicity
15.10 - G.J. Dalenoort and P.H. de Vries
        The binding problem in distributed systems
15.30 - Tea & posters
16.10 - J.A. Tepper
        Integrating symbolic and subsymbolic architecture for parsing arithmetic expressions and natural language sentences
16.30 - L. Martignon
        Bayesian strategies for machine learning: rule extraction and concept detection
17.00 - Reception & posters
18.30 - Closing

===========================================================================

Friday, 15 September

Robotics and vision
-------------------
 9.00 - F. Groen
        A visually guided robot and a neural network join to grasp slanted objects
 9.30 - S. Thrun
        The mobile robot Rhino
10.00 - R.P. Wurtz
        Background invariant face recognition
10.20 - Coffee & posters
11.00 - B. Krose
        Learning structure from motion: how to represent two-valued functions
11.20 - D. Roobaert
        A natural object recognition system using Self-organizing Translation-Invariant Maps (STIMs)
11.40 - R. Geraets
        Affine scale-space for discrete point sets
12.00 - Posters
12.30 - Lunch

Statistical pattern recognition
-------------------------------
13.30 - B. Ripley
        Statistical ideas for selecting network architectures
14.00 - D. MacKay
        Developments in probabilistic modelling with neural networks -- ensemble learning
14.30 - C. Campbell
        A constructive algorithm for building a feed-forward neural network
14.50 - S.H. Lokerse
        Density estimation using SOFM and adaptive kernels
15.10 - H. von Hasseln
        Maximum likelihood estimates for Markov networks using inhomogeneous Markov chains
15.30 - Tea & posters

Applications of neural networks
-------------------------------
16.00 - E. Turkcan
        Optimal training for neural network applied to nuclear technology
16.20 - H.A.B. te Braake
        Nonlinear predictive control with neural models
16.40 - G. Schram
        System identification with orthogonal basis functions and neural networks
17.00 - Posters
17.30 - Closing

///////////////////////////////////////////////////////////////////////////////

Throughout the whole conference about 30 posters will be presented.
More information on the poster sessions can be found on WWW:
http://www.mbfys.kun.nl/SNN/Symposium/posters.html

///////////////////////////////////////////////////////////////////////////////

PROGRAM COMMITTEE

Aertsen (Weizmann Institute, Israel), Amari (Tokyo University), Buhmann (University of Bonn), van Dijk (University of Amsterdam), Eckhorn (University of Marburg), Gielen (University of Nijmegen), van Hemmen (University of Munchen), Herz (University of Oxford), Heskes (University of Nijmegen), Kappen (University of Nijmegen), Krose (University of Amsterdam), Lopez (University of Madrid), Martinetz (Siemens Research, Munchen), MacKay (University of Cambridge), von Seelen (University of Bochum, Germany), Taylor (King's College, London)

INDUSTRIAL TRACK

There will be an industrial conference entitled NEURAL NETWORKS IN PRACTICE which runs concurrently with the scientific conference. A selection of the best working and 'money making' applications of neural networks in Europe will be presented. The industrial track is organized as part of the activities of the Esprit project SIENA, which aims to conduct a survey of successful and unsuccessful applications of neural networks in the European market. For additional information and a detailed program, please contact SNN at the address below.

VENUE

The conference will be held at the University Auditorium and Conference Centre of the University of Nijmegen. Instructions on how to get to the University Auditorium and Conference Centre will be sent to you with your conference registration.

CONFERENCE REGISTRATION

The registration fee for the industrial track is Hfl 450 for two days and includes coffee/tea and lunches. One day registration is Hfl 300. Industrial registration gives access to the industrial track presentations as well as the scientific oral and poster presentations, and includes a copy of the proceedings.
The registration fee for the scientific track is Hfl 300 for the two days and includes coffee/tea and lunches. One day registration is Hfl 200. Scientific registration gives access to the scientific oral and poster presentations, and includes a copy of the proceedings.

Full-time PhD or Master students may register at a reduced rate. The registration fee for students is Hfl 150 for the two days and includes coffee/tea and lunches. Student registration gives access to the scientific oral and poster presentations. Students must send a copy of their university registration card or a letter from their supervisor together with the registration form.

Methods of payment are outlined in the enclosed registration form. To those who have completed the registration form with remittance of the appropriate fees, a receipt will be sent. This receipt should be presented at the registration desk at the conference. Payment must have been received by us before the conference. If not, you will have to pay in cash or with personal cheques at the conference. At the conference site, credit cards will be accepted (e.g. American Express, Master Card and VISA).

CANCELLATION

Notification of cancellation must be sent in writing to the Conference Organizing Bureau of the University of Nijmegen (see address below). Cancellations will not be refunded, but the proceedings will be mailed.

ACCOMMODATIONS

Hotel reservations will be made for you as indicated on the registration form. Payment can be made on arrival at or departure from the hotel, depending on the hotel's policy. All hotels are in central Nijmegen and within a ten minute bus ride from the University.

LIABILITY

SNN cannot be held liable for any personal accident or damage to the private property of participants during the conference.
////////////////////////////////////////////////////////////////////////////////////

REGISTRATION FORM

Send this registration form to:
University of Nijmegen
Conference organization bureau
POBox 9111
6500 HN Nijmegen, The Netherlands
tel: +31 80 615968 or 612184
fax: +31 80 567956

Name: .................................................Mr/Mrs
Affiliation: ................................................
Address: ....................................................
Zipcode/City: ...............................................
Country: ....................................................

Conference Registration

() I will participate at the SNN symposium                Amount
() industrial registration 2 days                         Hfl 450,=
() industrial registration 1 day: 14/15 september*)       Hfl 300,=
() academic registration 2 days                           Hfl 300,=
() academic registration 1 day: 14/15 september*)         Hfl 200,=
() student registration                                   Hfl 150,=
*) please strike what is not applicable

() Bank transfer has been made (free of bank charges) to SNN conferences, Credit Lyonnais Nederland NV Nijmegen, on bank account number 637984838, swift code CRLIJNL2RS.
   Amount: .....................................

() Charge my credit card for the amount of .......................
   () VISA  () Master Card  () American Express
   Card no.:
   Expiration date:
   Signature:

() Please make hotel reservations in my name:
   Date of arrival:
   Date of departure:
   Single/double room (strike what is not applicable)

                       single    double   (prices per night)
() Hotel Mercure       145.00    165.00
() Hotel Apollo         90.00    120.00
() Hotel Catharina      57.50    115.00
() Hotel Catharina      43.50     87.00   (without shower and toilet)

/////////////////////////////////////////////////////////////////////////////

Enquiries should be addressed to: Prof.dr. C. Gielen, Dr. H. Kappen, Mrs. E.
Burg
Foundation for Neural Networks (SNN)
University of Nijmegen
PObox 9101
6500 HB Nijmegen, The Netherlands
tel: +31 80 614245
fax: +31 80 541435
email: snn at mbfys.kun.nl

More information on both the scientific and the industrial track can be found on WWW:
http://www.mbfys.kun.nl/SNN/Symposium/

Best regards,
Wim Wiegerinck

From wimw at mbfys.kun.nl Thu Aug 3 05:25:05 1995
From: wimw at mbfys.kun.nl (Wim Wiegerinck)
Date: Thu, 3 Aug 1995 11:25:05 +0200 (MET DST)
Subject: SNN symposium Neural Networks in Practice - Final Program
Message-ID: <199508030925.LAA25549@anastasius.mbfys.kun.nl>

NEURAL NETWORKS IN PRACTICE
Applications of neural networks in business and industry
Industrial track of the 3rd SNN Neural Network Symposium
September 14-15, 1995
Nijmegen, the Netherlands
http://www.mbfys.kun.nl/SNN/Symposium/

///////////////////////////////////////////////////////////////////////////
PROGRAM
///////////////////////////////////////////////////////////////////////////

The ability of neural networks to learn solutions, rather than having to be programmed, makes them suitable for a wide range of applications in business and industry. A rapidly increasing number of organisations already benefit from neural computing. This symposium offers you the opportunity to see how your organisation too might benefit from the possibilities of neural networks.

WHAT DOES THE SYMPOSIUM OFFER?

At this two day symposium a selection of the best neural network applications in diverse sectors of industry and business will be presented and discussed. The presentations are first-hand, given by people who work with these applications themselves. Moreover there will be stands which can provide you with more information about possible applications of neural networks in your organisation, giving you the opportunity to make contacts to help you realise these applications.

WHO SHOULD ATTEND?

This is not a technical or scientific symposium.
It is intended for managers from business and industry who want to know more about the relevance and applicability of neural networks in their organisation.

LANGUAGE and PROCEEDINGS

The main language of this symposium will be Dutch. The presentations of the non-Dutch speakers will be in English. The proceedings (in English) will be published by Springer-Verlag.

ORGANISATION: Foundation for Neural Networks (SNN)

IN COLLABORATION WITH:
InnovatieCentrum Midden- en Zuid-Gelderland (IC)
Vereniging Artificiele Neurale Netwerken (VANN)

This symposium is organised as part of SIENA (Esprit project EP-9811).

//////////////////////////////////////////////////////////////////////////////

PROGRAM

Registration from 8.00

Thursday 14 September 1995
NEURAL NETWORKS in the PRODUCTION SECTOR

OPENING
 8.50 - B. Kappen (SNN, NL)
        Welcome
 9.00 - E. Au'ee (InnovatieCentrum, NL)
        Applicability of neural networks in small and medium-sized enterprises

CALCULATION
 9.30 - H. Brockmeijer (Smit Transformatoren, NL)
        Cost calculation for customer-specific transformers
10.00 - coffee

PROCESS CONTROL
11.00 - A. de Weyer (Akzo Nobel, NL)
        Modelling industrial processes using neural networks and genetic algorithms
11.30 - T. Martinetz (Siemens AG, D)
        Neural network control for steel rolling mills
12.00 - lunch
14.00 - J. MacIntyre (University of Sunderland / National Power, UK)
        Condition monitoring with National Power
14.30 - M. Collins (NCS, UK)
        Electronic nose for process control
15.00 - tea

QUALITY CONTROL
16.00 - H.-J. Kolb (MEDAV GmbH, D)
        Acoustic quality control of roofing tiles
16.30 - A. Timmermans (ATO-DLO, NL)
        Automatic sorting of potted plants
17.00 - reception

===============================================================================

Friday 15 September 1995
NEURAL NETWORKS in the SERVICES SECTOR

OPENING
 8.50 - B. Kappen (SNN, NL)
        Welcome

MARKETING
 9.00 - R. ter Heide (SMR, NL)
        Neural networks in direct marketing
 9.30 - R. J. Schüring (BrandmarC BV, NL)
        Modelling market dynamics in food products and consumer durables
10.00 - coffee

ADMINISTRATION
11.00 - A. Hogervorst (Document Access BV, NL)
        Handwriting recognition using neural networks

FINANCE
11.30 - H. G. Zimmerman (Siemens AG, D)
        Neural networks - the future of forecasting in finance?
12.00 - lunch

SAFETY
14.00 - G. Hesketh (AEA Technology, UK)
        Countermatch: a neural network approach to automatic signature verification
14.30 - J. Kopecz (Zentrum für Neuroinformatik, D)
        Access control by recognition of human faces
15.00 - tea

TRANSPORT
16.00 - H. Wüst (Rijkswaterstaat, NL)
        Current forecasts for vessel traffic guidance at IJmuiden
17.00 - closing

////////////////////////////////////////////////////////////////////////////////

INFORMATION STANDS
==================

During the conference much time is reserved for visiting stands from e.g. technology suppliers. If your company would like to present itself with a stand during our symposium, please contact SNN at the address below.

FOUNDATION FOR NEURAL NETWORKS (SNN)
Geert Grooteplein Noord 21
6525 ZX Nijmegen
The Netherlands
tel: +31 80-614245
fax: +31 80-541435
email: snn at mbfys.kun.nl

///////////////////////////////////////////////////////////////////////////////

SCIENTIFIC TRACK

There will be a scientific track entitled NEURAL NETWORKS AND ARTIFICIAL INTELLIGENCE which runs concurrently with the industrial track. New fundamental and applied developments in the area of neural networks will be presented. Registration for the industrial track gives access to the scientific track. For additional information and a detailed program, please contact SNN at the address below.

VENUE

The conference will be held at the University Auditorium and Conference Centre of the University of Nijmegen. Instructions on how to get to the University Auditorium and Conference Centre will be sent to you with your conference registration.
CONFERENCE REGISTRATION

The registration fee for the industrial track is Hfl 450 for two days and includes coffee/tea and lunches. One day registration is Hfl 300. Industrial registration gives access to the industrial track presentations as well as the scientific oral and poster presentations, and includes a copy of the proceedings.

The registration fee for the scientific track is Hfl 300 for the two days and includes coffee/tea and lunches. One day registration is Hfl 200. Scientific registration gives access to the scientific oral and poster presentations, and includes a copy of the proceedings.

Full-time PhD or Master students may register at a reduced rate. The registration fee for students is Hfl 150 for the two days and includes coffee/tea and lunches. Student registration gives access to the scientific oral and poster presentations. Students must send a copy of their university registration card or a letter from their supervisor together with the registration form.

Methods of payment are outlined in the enclosed registration form. To those who have completed the registration form with remittance of the appropriate fees, a receipt will be sent. This receipt should be presented at the registration desk at the conference. Payment must have been received by us before the conference. If not, you will have to pay in cash or with personal cheques at the conference. At the conference site, credit cards will be accepted (e.g. American Express, Master Card and VISA).

CANCELLATION

Notification of cancellation must be sent in writing to the Conference Organizing Bureau of the University of Nijmegen (see address below). Cancellations will not be refunded, but the proceedings will be mailed.

ACCOMMODATIONS

Hotel reservations will be made for you as indicated on the registration form. Payment can be made on arrival at or departure from the hotel, depending on the hotel's policy. All hotels are in central Nijmegen and within a ten minute bus ride from the University.
The Foundation for Neural Networks and the Conference Organizing bureau cannot be held responsible for hotel reservations and related costs.

DISCLAIMER

The organisation is not responsible for the contents of the presentations or the information supplied by the stands. The organisation cannot be held liable for any personal accident or damage to the private property of participants during the conference. In unforeseen circumstances, the organisation reserves the right to change the program.

/////////////////////////////////////////////////////////////////////////////

REGISTRATION FORM

Send this registration form to:
University of Nijmegen
Conference organization bureau
POBox 9111
6500 HN Nijmegen, The Netherlands
tel: +31 80 615968 or 612184
fax: +31 80 567956

Name: ...............................................................Mr/Mrs
Affiliation: ..............................................................
Address: ..................................................................
Zipcode/City: .............................................................
Country: ..................................................................

Conference Registration

() I will participate at the SNN symposium                Amount
() industrial registration 2 days                         Hfl 450,=
() industrial registration 1 day: 14/15 september*)       Hfl 300,=
() academic registration 2 days                           Hfl 300,=
() academic registration 1 day: 14/15 september*)         Hfl 200,=
() student registration                                   Hfl 150,=
*) please strike what is not applicable

() Bank transfer has been made (free of bank charges) to SNN conferences, Credit Lyonnais Nederland NV Nijmegen, on bank account number 637984838, swift code CRLIJNL2RS.
   Amount: .....................................

() Charge my credit card for the amount of .......................
   () VISA  () Master Card  () American Express
   Card no.:
   Expiration date:
   Signature:

() Please make hotel reservations in my name:
   Date of arrival:
   Date of departure:
   Single/double room (strike what is not applicable)

                       single    double   (prices per night)
() Hotel Mercure       145.00    165.00
() Hotel Apollo         90.00    120.00
() Hotel Catharina      57.50    115.00
() Hotel Catharina      43.50     87.00   (without shower and toilet)

/////////////////////////////////////////////////////////////////////////////

For enquiries, please contact:
Prof.dr. C. Gielen, Dr. H. Kappen, Mrs. E. Burg
Foundation for Neural Networks (SNN)
University of Nijmegen
PObox 9101
6500 HB Nijmegen, The Netherlands
tel: +31 80 614245
fax: +31 80 541435
email: snn at mbfys.kun.nl

More information on both the industrial and the scientific track can be found on WWW:
http://www.mbfys.kun.nl/SNN/Symposium/

Best regards,
Wim Wiegerinck

From richardc at sedal.su.oz.au Fri Aug 4 02:31:24 1995
From: richardc at sedal.su.oz.au (Richard Coggins)
Date: Fri, 4 Aug 1995 16:31:24 +1000
Subject: Position announcement at Sydney University
Message-ID: <199508040631.QAA04568@sedal.sedal.su.OZ.AU>

Systems Engineering and Design Automation Laboratory
Sydney University Electrical Engineering

RENEWABLE Research Engineer
Time Series Prediction and Signal Compression

Applications are invited for the position of Research Engineer with the Systems Engineering and Design Automation Laboratory (SEDAL) at the University of Sydney. SEDAL has a number of collaborative projects in the area of time series forecasting and signal compression spanning the applications of financial forecasting, electricity load forecasting and heart signal recognition and compression. The applicant will be expected to contribute to these projects by developing novel forecasting and coding software models based on neural networks and other statistical techniques.
The project collaborators include a major international bank, a major utility supplier and a world leader in implantable cardiac pacemakers.

For appointment at HEO Level 6, applicants must have a degree in Electrical Engineering or Mathematics as well as knowledge and experience of C programming (preferably in a Unix environment), time series forecasting and/or source coding. The successful candidate will be encouraged to enrol in one of these disciplines, if applicable.

For appointment at Level 7, applicants must, in addition to the above, have extensive experience in the area of Neural Computing, Information Theory or Statistics, or in the design of signal processing systems and their implementation in either software or hardware. Experience in the development and implementation of data compression systems, and the ability to supervise final year students and junior research staff, is also essential. Experience in integrated circuit design and/or biomedical signal processing, and a master's degree or higher in Neural Computing, Information Theory or Statistics, are desirable.

The position is available for an initial period of one and a half years and is renewable for up to a further 3 years subject to progress and funding. Applicants must be able to take up the position within 2 months of receiving an offer.

Salary:
Higher Education Officer Level 6, $34,026 to $36,842 depending on experience.
Higher Education Officer Level 7, $37,546 to $41,065 depending on experience and qualifications.

Closing Date: 31st August, 1995.
Reference Number: B29/10

For further information please contact: Dr.
Marwan Jabri Tel: +61 2 351 2240 Fax: +61 2 660 1228 Email: marwan at sedal.su.oz.au From JCONNOR at lbs.lon.ac.uk Fri Aug 4 19:21:38 1995 From: JCONNOR at lbs.lon.ac.uk (Jerry Connor) Date: Fri, 4 Aug 1995 19:21:38 BST Subject: NNCM-95 PRELIMINARY PROGRAMME Message-ID: NNCM 95 - PRELIMINARY PROGRAMME October 11, 12, and 13 1995 WWW site http://www.lbs.lon.ac.uk/desci/nncm.htm This year's Neural Networks in the Capital Markets Conference will be held in two parts. The Tutorials day offers two tracks. The Finance track is a series of two two-hour sessions designed to give engineers, mathematicians, and neural network practitioners an overview of the financial markets, pricing models for derivative securities, and the dynamics of price behaviour in high frequency markets. The Statistics & Neural Networks track is a series of two two-hour sessions designed to give finance professionals an overview of nonlinear techniques and mathematics that can be applied to data analysis, predictive modelling and analysis of financial markets. It will be held at London Business School, Sussex Place, London NW1 4SA on Wednesday 11 October. The Main conference will offer eight plenary sessions with invited speakers and original research contributions on Derivative & Term Structure Models, Equity & Commodity Models, Foreign Exchange, Corporate Distress & Risk Models, Macroeconomic & Retail Finance applications, and two sessions on Advances in Methodology. Overall, the Main conference includes over 50 oral and poster paper presentations. It will be held on Thursday and Friday October 12 - 13 at the Langham Hilton, 1 Portland Place, London W1N 4JA, which is a short walk from London Business School. TUTORIAL SESSIONS, Wednesday, October 11, 1995 08:30 - 10:30 Finance Tutorial I "Pricing Models for Derivative Securities" Prof. Stewart Hodges, Warwick University, Warwick 11.00 - 13.00 Finance Tutorial II "Price Behavior and Models for High Frequency Data in Finance" Dr.
Michel Dacorogna, Olsen & Associates

14.00 - 16.00 Statistics & Neural Nets Tutorial I
"Statistical Inference & Nonparametric Models: Linear Lessons about Nonlinear Prediction"
Prof. Leo Breiman, University of California, Berkeley

16.30 - 18.30 Statistics & Neural Nets Tutorial II
"Neural Networks for Time Series and Trading"
Prof. Andreas Weigend, University of Colorado, Boulder

MAIN CONFERENCE, THURSDAY, October 12, 1995

08.30 - 10.30 Oral Presentations: Derivative & Term Structure Models I
Invited speaker: "Option Pricing and Artificial Neural Networks", Prof. Hal White, UCSD
11.00 - 13.00 Oral Presentations: Corporate Distress & Risk Models
13.00 - 15.00 Lunch & Poster Sessions
15.00 - 17.00 Oral Presentations: Foreign Exchange
Invited speaker (to be confirmed): "High Frequency Data in Financial Models, Issues & Applications", Prof. Charles Goodhart, LSE
17.30 - 19.30 Oral Presentations: Macroeconomics & Retail Finance

MAIN CONFERENCE, FRIDAY, October 13, 1995

08.30 - 10.30 Oral Presentations: Derivative & Term Structure Models II
Invited speaker: "Validation of Volatility Models", Prof. Yaser Abu-Mostafa, Caltech
11.00 - 13.00 Oral Presentations: Advances in Methodology I
13.00 - 15.00 Lunch & Poster Sessions
15.00 - 17.00 Oral Presentations: Equities & Commodities
Invited speaker: "Towards Minimal Risk: Model Selection Strategies for Time Series Prediction and Trading Strategies", Prof. John Moody, Oregon Graduate Institute
17.30 - 19.30 Oral Presentations: Advances in Methodology II

ACCEPTED PAPERS

DERIVATIVE AND TERM STRUCTURE MODELS
Neural Networks for Contingent Claim Pricing via the Galerkin Method, E. Barucci
Futures Trading Using Artificial Neural Networks, H. Pi & T. Rognavldsson
Modelling the Term Structure of Interbank Interest Rates, B. Dasgupta & D. Wood
Modelling Non-Linear Cointegration in European Equity Index Futures, A. N. Burgess
Neural Network Pricing of All Ordinaries SPI Options on Futures, P.
Lajbcygier
A Comparative Study of Forecasting Models for Foreign Exchange Rates Using ANN and Option Prices, G. S. Maddala & M. Qi
Neural Networks in Derivative Securities Pricing Forecasting in Brazilian Capital Markets, L. A. R. Gaspar & G. Lacitermacher
Statistical Yield Curve Arbitrage in Eurodollar Futures using Neural Networks, A. N. Burgess

EQUITIES AND COMMODITIES
The Use of Neural Networks For Property Investment Forecasting, G. Clarke
The Predictability of Security Returns with Simple Technical Trading Rules, R. Gencay
On-Line Learning for Multi-Layered Neural Network: Application to S&P 500 Prediction, C. E. Pedreira
Applying Neural Networks in Copper Trading: A Technical Analysis Simulation, C. Naylor
The Predictability of Stock Returns with Local Versus Global Nonparametric Estimators, R. Gencay
A 'World' Model of Integrated Financial Markets Using Artificial Neural Networks, T. Poddig
Stock Price Prediction Using an Integrated Pattern Recognition Paradigm, D. H. Kil
Applications of Artificial Neural Networks in Emerging Financial Markets, C. Siriopoulos
Stock Selection Using Recon, G. H. John, P. Miller, & R. Kerber
An Embedded Fuzzy Knowledge Base for Technical Analysis of Stocks, K. P. Lam
Stock Price Predictions by Recurrent Multilayer Neural Network Architectures, D. F. Bassi
Equity Forecasting: A Case Study in the KLSE Index, J. Yao
Use of Neural Networks and Expert/Knowledgebase Systems for Stock Market Analysis, Prediction and Trading, A. Chartzaniotis
Applying Neural Networks in Copper Trading: Weekly Transaction ANN Model with a Linear Filter, C. Naylor
Short Term Forecasts of Financial Time Series Using Artificial Neural Networks, S. Avouyi-Dovi

FOREIGN EXCHANGE
An Artificial Neural Network Based Trade Forecasting System for Capital Markets, B. G. Flower
Applying Neural Networks to Currency Trading - A Case Study, C. Lee
Exchange Rate Forecasting Comparison: Neural Networks, Symbolic Machine Learning and Linear Models, E.
Steurer
Identification of FX Arbitrage Opportunities with a Non-Linear Multivariate Kalman Filter, P. J. Bolland & J. T. Connor
Predicting Returns on Canadian Exchange Rates with Artificial Neural Networks and EGARCH-M Models, A. Episcopos
Genetic Programming of Fuzzy Logic Production Rules with Application to Financial Trading, A. Edmonds
Forecasting Foreign Exchange Rates: Bayesian Model Comparison and Non-Gaussian Distributions, S. Butlin & J. T. Connor
Short-term FX Market Analysis and Prediction, H. Beran

CORPORATE DISTRESS AND RISK MODELS
Neural Networks in Corporate Failure Prediction: The UK Experience, Y. Alici
Corporate Distress Diagnosis - An International Comparison, M. Kerling
Assessing Financial Distress With Probabilistic Neural Networks, E. Tyree & J. A. Long
Connectivity & Financial Network Shutdown, L. Eisenberg

MACRO-ECONOMICS AND RETAIL FINANCE
Minimising the Cost of Money in Branch Offices, F. Avila
Exploratory Data Analysis by the Self-Organizing Map: Structures of Welfare and Poverty in the World, S. Kaski & T. Kohonen
Commercial Mortgage Default: A Comparison of the Proportional Hazard Model with Artificial Neural Networks, A. Episcopos
Empirical Regularities and The Forecasting of Industrial Production, C. Hafke
Loan Risk Analysis Using Neural Networks, A. N. Burgess

ADVANCES IN METHODOLOGY
Clearning, A. S. Weigend & H.-G. Zimmermann
Ordinal Models for Neural Networks, M. Mathieson
R/S Analysis and Hurst Exponents: A Critique, J. Moody & L. Wu
Combination of Buffered Back-Propagation And RPCL-CLP By Mixture of Experts Model for Foreign Exchange Rate Forecasting, L. Xu, Y. M. Cheung, & W. M. Leung
Prediction with Robustness Towards Outliers, Trends, and Level Shifts, J. T. Connor, D. Martin, & A. Bruce
Avoiding overfitting by locally matching the noise level of the data, A. S. Weigend & M. Mangeas
Trading Using Committees, J. Moody, S. Rehfuss, & L. Wu
Reliable Neural Network Predictions in the Presence of Outliers and Non-Constant Variances, D.
Ormoneit
An Analysis of Stops and Profit Objectives in Trading Systems, A. Atiya
The Meaning of the Mean, C. Hafke
An Interval Neural Network Architecture for Time Series Prediction, M. Fialho & C. E. Pedreira

REGISTRATION: To register, complete the registration form and mail it to the secretariat. Please note that attendance is limited and will be allocated on a "first-come, first-served" basis.

SECRETARIAT: For further information, please contact the NNCM-95 secretariat: Ms Busola Oguntula, London Business School, Sussex Place, Regent's Park, London NW1 4SA, UK; e-mail: boguntula at lbs.lon.ac.uk; phone (+44) (0171) 262 50 50; fax (+44) (0171) 724 78 75.

LOCATION: The main conference will be held at The Langham Hilton, which is situated near Regent's Park and is a short walk from Baker Street Underground Station. Further directions, including a map, will be sent to all registrants.

PROGRAMME COMMITTEE
Dr A. Refenes, London Business School (Chairman)
Dr Y. Abu-Mostafa, Caltech
Dr A. Atiya, Cairo University
Dr N. Biggs, London School of Economics
Dr D. Bunn, London Business School
Dr M. Jabri, University of Sydney
Dr B. LeBaron, University of Wisconsin
Dr A. Lo, MIT Sloan School
Dr J. Moody, Oregon Graduate Institute
Dr C. Pedreira, Catholic University PUC-Rio
Dr M. Steiner, Universitaet Muenster
Dr A. Timmermann, University of California, San Diego
Dr A. Weigend, University of Colorado
Dr H.
White, University of California, San Diego HOTEL ACCOMMODATION: Convenient hotels include: The Langham Hilton 1 Portland Place London W1N 4JA Tel: (+44) (0171) 636 10 00 Fax: (+44) (0171) 323 23 40 Sherlock Holmes Hotel 108 Baker Street, London NW1 1LB Tel: (+44) (0171) 486 61 61 Fax: (+44) (0171) 486 08 84 The White House Hotel Albany St., Regent's Park, London NW1 Tel: (+44) (0171) 387 12 00 Fax: (+44) (0171) 388 00 91 --------------------------REGISTRATION FORM -------------------------- -- NNCM-95 Registration Form Third International Conference on Neural Networks in the Capital Markets October 12-13 1995 Name:____________________________________________________ Affiliation:_____________________________________________ Mailing Address: ________________________________________ _________________________________________________________ Telephone:_______________________________________________ ****Please circle the applicable fees and write the total below**** Main Conference (October 12-13): (British Pounds) Registration fee 450 Discounted fee for academicians 250 (letter on university letterhead required) Discounted fee for full-time students 100 (letter from registrar or faculty advisor required) Tutorials (October 11): You must be registered for the main conference in order to register for the tutorials. 
(British Pounds) Morning Session Only 100 Afternoon Session Only 100 Both Sessions 150 Full-time students 50 (letter from registrar or faculty advisor required) TOTAL: _________ Payment may be made by: (please tick) ____ Check payable to London Business School ____ VISA ____Access ____American Express Card Number:___________________________________ From terry at salk.edu Fri Aug 4 19:11:14 1995 From: terry at salk.edu (Terry Sejnowski) Date: Fri, 4 Aug 95 16:11:14 PDT Subject: Neural Net Job in San Diego Message-ID: <9508042311.AA23955@salk.edu> From: Janice Holstein Status: R Pam Surko at SAIC is looking for a full-time neural net programmer to start as soon as possible, preferably a bachelor or master's level person with a good practical knowledge of neural nets and a nose for data analysis, with their feet on the ground, who would be happy solving very applied problems. The candidate should be smart and creative, and able to communicate with customers on occasion. The group works in UNIX land, Sun and HP, and mostly do their own system administration, so the successful candidate would have to be a mid-level UNIX guru. Candidates can email Pam ascii or postscript resumes or snail mail or sneaker express paper to Dr. Pamela Surko Science Applications Int'l Corporation Pamela.T.Surko at cpmx.saic.com 10260 Campus Pt. 
Dr., M/S C2 619-546-6386 San Diego, CA 92121 619-546-6360 FAX USA Janice From piuri at elet.polimi.it Fri Aug 4 19:43:08 1995 From: piuri at elet.polimi.it (Vincenzo Piuri) Date: Sat, 5 Aug 1995 00:43:08 +0100 Subject: call for papers Message-ID: <9508042343.AA27079@ipmel2.elet.polimi.it> ================================================================ ISCAS'96 IEEE International Symposium on Circuits and Systems Atlanta, Georgia, USA - May 12-15, 1996 ================================================================ Call for Papers for the Special Session on "Neural Technologies for Prediction, Identification and Control" ================================================================ This special session will discuss several aspects concerning the use of neural technologies for modelling complex non-linear dynamic systems. The theoretical foundations of these methodologies will be treated as the basis for system prediction, identification and control. Presentation of some real applications will allow evaluation of the effectiveness of this approach with respect to traditional techniques. Papers are solicited on all aspects of neural technologies concerning system identification, prediction, and control: theory, design methodologies, realizations, case studies, and applications are welcome. Prospective authors are invited to send an abstract (1-2 pages) to Prof. Vincenzo Piuri (email and fax submissions are strongly encouraged) by August 31, 1995. It should contain the paper's title, the authors' names and affiliations, and the contact person with complete address, phone, fax and email. Acceptance/rejection will be mailed by December 4, 1995. The final version of the paper will be due by January 15, 1996. Prof.
Vincenzo Piuri, Organizer of the Special Session on "Neural Technologies for Prediction, Identification and Control", Department of Electronics and Information, Politecnico di Milano, Italy. phone +39-2-2399-3606 secretary +39-2-2399-3623 fax +39-2-2399-3411 email piuri at elet.polimi.it ================================================================ From hali at nada.kth.se Mon Aug 7 22:17:53 1995 From: hali at nada.kth.se (Hans Liljenström) Date: Tue, 8 Aug 1995 04:17:53 +0200 Subject: Workshop on Fluctuations in Biology Message-ID: <199508080217.EAA25981@sanscalc.nada.kth.se> International Workshop on THE ROLE AND CONTROL OF RANDOM EVENTS IN BIOLOGICAL SYSTEMS Sigtuna, Sweden 4-9 September 1995 At this interdisciplinary workshop we intend to discuss the relevance of chaos and noise in biological systems in general, but with a focus on neural systems. The approach will be theoretical as well as experimental, and the meeting is intended to attract participants from various fields, such as biology, physics, and computer science. Invited talks are given by L. Agnati, A. Babloyantz, A. Bulsara, R. Cotterill, W. Freeman, H. Haken, D. Hansel, K. Hepp, J. Hopfield, S. Kelso, C. Koch, W. Levy, L. Liebovitch, M. Mackey, F. Moss, S. Pogun, R. de Ruyter, and I. Tsuda. A few places are still available, and to get the desired balance between disciplines and between theory and experiments we welcome, in particular, further participants with experience from experimental (neuro)biology. For more information, see our WWW page, http://www.nada.kth.se/~hali/workshop.html, or contact Hans Liljenström Dept.
of Numerical Analysis and Computing Science Royal Institute of Technology S-100 44 Stockholm, SWEDEN Email: hali at nada.kth.se Phone: +46-(0)8-790 6909 Fax: +46-(0)8-790 0930 From linster at katla.harvard.edu Tue Aug 8 19:25:53 1995 From: linster at katla.harvard.edu (Christiane Linster) Date: Tue, 8 Aug 1995 19:25:53 -0400 (EDT) Subject: No subject Message-ID: The following two papers are now available by ftp: TOWARDS A COGNITIVE UNDERSTANDING OF ODOR DISCRIMINATION: COMBINING EXPERIMENTAL AND THEORETICAL APPROACHES Claudine Masson (1) and Christiane Linster (2) (1) Laboratoire de Neurobiologie Comparée des Invertébrés, INRA-CNRS (URA 1190), BP 23, 91440 Bures-sur-Yvette, France tel, fax: 69 07 20 59 e-mail: masson at jouy.inra.fr (2) Laboratoire d'Electronique, ESPCI, 10 rue Vauquelin, 75005 Paris tel: 40 79 44 61 e-mail: linster at neurons.fr Key words: feature extraction; modeling; odor discrimination; olfactory processing. Abstract In response to changes in odorous environmental conditions, most species (ranging from lower invertebrates to mammals) demonstrate highly adaptive behavioral performance. Complex natural chemical signals (i.e. odorous blends involved in food search) are particularly unstable and fluctuating in quality, space and time. Nevertheless, adapted behavioral responses related to meaningful odor signals can be observed even in complex natural odorous environments, demonstrating that the underlying olfactory neural network is a very dynamic pattern recognition device. In the honeybee, a large amount of experimental data has been collected at different levels of observation within the olfactory system, from signal processing to behavior, including cellular and molecular properties. However, no set of data considered by itself can give insight into the mechanisms underlying odor discrimination and pattern recognition.
Here, by concentrating on deciphering the neural mechanisms underlying encoding and decoding of the olfactory signal in the first two layers of the neural network, we illustrate how a theoretical approach helps us to integrate the different experimental data and to extract relevant parameters (features) which might be selected and used to store an odor representation in a behavioral context. To appear in Behavioral Processes, Special Edition on Cognition and Evolution.

A NEURAL MODEL OF OLFACTORY SENSORY MEMORY IN THE HONEYBEE ANTENNAL LOBE Christiane Linster, Laboratoire d'Electronique, ESPCI, 10, Rue Vauquelin, 75005 Paris, linster at neurones.espci.fr; Claudine Masson, Laboratoire de Neurobiologie Comparée des Invertébrés, INRA-CNRS (URA 1190), 91440 Bures-sur-Yvette, France, masson at jouy.inra.fr Abstract We present a neural model for olfactory sensory memory in the honeybee's antennal lobe. In order to investigate the neural mechanisms underlying odor discrimination and memorization, we exploit a variety of morphological, physiological and behavioral data. The model allows us to study the computational capacities of the known neural circuitry, and to interpret in a new light experimental data on the cellular as well as on the neuronal assembly level. We propose a scheme for memorization of the neural activity pattern after stimulus offset by changing the local balance between excitation and inhibition. This modulation is achieved by changing the intrinsic parameters of local inhibitory neurons or synapses. To appear in Neural Computation, Volume 8 (1).

Both papers can be obtained as postscript files by ftp:

ftp katla.harvard.edu
login: ftp
password: your email address
cd linster
get filename.ps

*********************************************************** * Christiane Linster * * * * Dept.
of Psychology 920 Tel: 1 617 495 3875 * * Harvard University Fax: 1 617 495 3728 * * Cambridge MA 02138 linster at katla.harvard.edu * *********************************************************** From ntl at bbcnc.org.uk Wed Aug 9 05:55:24 1995 From: ntl at bbcnc.org.uk (Neural Technologies Limited) Date: Wed, 09 Aug 1995 10:55:24 +0100 Subject: Urgent Vacancies Message-ID: <9508090955.aa26952@auntie.bbcnc.org.uk> Neural Technologies Limited is the leading UK company working in the application and exploitation of neural computing across a wide range of industrial and commercial environments. Our continued growth has led to various new openings for people to help in the development and deployment of working neural computing solutions. You will be working within a small and highly motivated team in our laboratories in Petersfield, Hampshire. Urgent vacancies exist for: Neural Scientists, Neural Engineers, Information Analysts, Team Leaders. Details on each below...

Contact: (Fax or send CV) Simon Hancock, Project Manager, Neural Technologies Limited, Bedford Road, Petersfield, Hampshire GU32 3QA (or call): Phone: 01730 260256 Fax: 01730 260466 E-mail: ntl at bbcnc.org.uk

NEURAL SCIENTIST
Required skills:
- well versed in neural network algorithm development and its practical application
- MUST be fluent in C or C++ within the PC environment
- good general presentation skills
- experience in the following would also be appreciated:
  - knowledge of conventional statistics
  - signal processing techniques [speech, vision...]
  - application domains [database analysis, machine health monitoring...]
  - AI techniques and KBS
  - system level integration [DSP, neural accelerators...]
  - Visual C++ and MFC for Windows
All candidates should be working at a practical research level or have extensive industrial experience. A keen view of the commercial realities of working within a small, but fast growing, company is required.

NEURAL ENGINEER
Required skills:
- knowledge of practical application of neural networks and general data analysis
- MUST be fluent in C or C++ within the PC environment
- good general communication and presentation skills
- experience in the following would also be appreciated:
  - knowledge of conventional statistics
  - signal processing techniques [speech, vision...]
  - application domains [database analysis, machine health monitoring...]
  - AI techniques and KBS
  - system level integration [DSP, neural accelerators...]
  - Visual C++ and MFC for Windows
All candidates should have practical experience of using a variety of neural computing techniques applied to real problem domains. A keen view of the commercial realities of working within a small, but fast growing, company is required.

SOFTWARE ENGINEER
Required skills:
- mature development skills in the PC Windows environment (Visual C++ with MFC for Windows)
- experience in working to rigorous quality procedures (ISO9000 or equivalent)
- experience of the following areas would also be beneficial:
  - Graphics Server
  - Spread/VBX++
  - HDK (on-line help compiler)
All candidates should have been working within the software industry for a period of at least 1-2 years and be able to demonstrate skills and successes in Windows software development.

SENIOR DATA ANALYST / TEAM LEADER
Our continued growth has led to the need for a Senior Data Analyst / Team Leader to help in both internal project development and bespoke customer solutions.
Required skills:
- statistical background: e.g. regression analysis; neural methods useful
- team leadership on large projects utilising up-to-date development methods, ideally with 2-3 engineers reporting directly to you, working on a number of projects simultaneously
- experience in working to rigorous quality procedures (ISO9000 or equivalent)
- the following are also essential:
  - experience in report writing
  - presentation preparation
  - excellent inter-personal skills
Experience of the following areas would also be beneficial:
- training of personnel
All candidates should have been working within the IT industry for a period of 3-5 years and be able to demonstrate skills and successes in both data analysis and leadership of project teams. -------------------------------------------- Dr. G.R. Bolt, Neural Scientist Neural Technologies Limited, Ideal House, Petersfield, Hampshire GU32 3QA U.K. Tel: (0) 1730 260256 Fax: (0) 1730 260466 From radford at cs.toronto.edu Thu Aug 10 13:04:42 1995 From: radford at cs.toronto.edu (Radford Neal) Date: Thu, 10 Aug 1995 13:04:42 -0400 Subject: Software for Bayesian learning available Message-ID: <95Aug10.130447edt.859@neuron.ai.toronto.edu> Announcing Software for BAYESIAN LEARNING FOR NEURAL NETWORKS Radford Neal, University of Toronto Software for Bayesian learning of models based on multilayer perceptron networks, using Markov chain Monte Carlo methods, is now available by ftp. This software implements the methods described in my Ph.D. thesis, "Bayesian Learning for Neural Networks". Use of the software is free for research and educational purposes. The software supports models for regression and classification problems based on networks with any number of hidden layers, using a wide variety of prior distributions for network parameters and hyperparameters. The advantages of Bayesian learning include the automatic determination of "regularization" parameters, without the need for a validation set, avoidance of overfitting when using large networks, and quantification of the uncertainty in predictions. The software implements the Automatic Relevance Determination (ARD) approach to handling inputs that may turn out to be irrelevant (developed with David MacKay).
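[Archive editor's illustration.] Neal's package uses sophisticated Markov chain Monte Carlo methods described in the thesis; as a rough sketch of the underlying idea only, the toy example below draws posterior samples for a tiny one-hidden-layer regression network with a plain random-walk Metropolis sampler. All names, sizes, and constants here are invented for the sketch and are not taken from the announced software.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (sizes chosen only for the demonstration)
X = rng.uniform(-1, 1, size=(20, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=20)

H = 5                        # hidden units
dim = H + H + H + 1          # input-to-hidden, hidden biases, hidden-to-output, output bias

def predict(w, X):
    """One-hidden-layer tanh network, all weights packed into one vector."""
    w1 = w[:H].reshape(1, H)
    b1 = w[H:2 * H]
    w2 = w[2 * H:3 * H]
    b2 = w[3 * H]
    return np.tanh(X @ w1 + b1) @ w2 + b2

def log_post(w, sigma=0.1, prior_sd=1.0):
    # Gaussian likelihood plus a Gaussian prior on every weight
    resid = y - predict(w, X)
    return (-0.5 * np.sum(resid ** 2) / sigma ** 2
            - 0.5 * np.sum(w ** 2) / prior_sd ** 2)

# Random-walk Metropolis over the weight vector
w = rng.normal(size=dim)
lp = log_post(w)
samples, accepted = [], 0
for t in range(5000):
    w_prop = w + 0.05 * rng.normal(size=dim)
    lp_prop = log_post(w_prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis acceptance test
        w, lp = w_prop, lp_prop
        accepted += 1
    if t % 10 == 0:
        samples.append(w.copy())

# The spread of the posterior predictive quantifies uncertainty in predictions
preds = np.array([predict(s, X) for s in samples[len(samples) // 2:]])
mean, sd = preds.mean(axis=0), preds.std(axis=0)
```

Averaging predictions over posterior samples, rather than using a single weight vector, is what gives the "quantification of the uncertainty in predictions" mentioned above.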
For problems and networks of moderate size (eg, 200 training cases, 10 inputs, 20 hidden units), full training (to the point where one can be reasonably sure that the correct Bayesian answer has been found) typically takes several hours to a day on our SGI machine. However, quite good results, competitive with other methods, are often obtained after training for under an hour. (Of course, your machine may not be as fast as ours!) The software is written in ANSI C, and has been tested on SGI and Sun machines. Full source code is included. Both the software and my thesis can be obtained by anonymous ftp, or via the World Wide Web, starting at my home page. It is essential for you to have read the thesis before trying to use the software. The URL of my home page is http://www.cs.toronto.edu/~radford. If for some reason this doesn't work, you can get to the same place using the URL ftp://ftp.cs.toronto.edu/pub/radford/www/homepage.html. From the home page, you will be able to get both the thesis and the software. To get the thesis and the software by anonymous ftp, use the host name ftp.cs.toronto.edu, or one of the addresses 128.100.3.6 or 128.100.1.105. After logging in as "anonymous", with your e-mail address as the password, change to directory pub/radford, make sure you are in "binary" mode, and get the files thesis.ps.Z and bnn.tar.Z. The file bnn.doc contains just the documentation for the software, but this is included in bnn.tar.Z, so you will need it only if you need to read how to unpack a tar archive, or don't want to transfer the whole thing. The files ending in .Z should be uncompressed with the "uncompress" command. The thesis may be printed on a Postscript printer, or viewed with ghostview. If you have any problems obtaining the thesis or the software, please contact me at one of the addresses below. --------------------------------------------------------------------------- Radford M. Neal radford at cs.toronto.edu Dept. of Statistics and Dept. 
of Computer Science radford at stat.toronto.edu University of Toronto http://www.cs.toronto.edu/~radford --------------------------------------------------------------------------- From eric at research.nj.nec.com Thu Aug 10 14:41:24 1995 From: eric at research.nj.nec.com (Eric B. Baum) Date: Thu, 10 Aug 1995 14:41:24 -0400 Subject: Job Announcement Message-ID: <199508101841.OAA10568@yin> I recently posted a job ad. Below is a revised version of this ad. Note especially the new entry entitled "Term" as well as some more details of experiments. ---------------------------------------------------------------------------- Research Programmer Wanted. Prerequisites: Experience in getting large programs to work. Some mathematical sophistication. E.g. at least equivalent of a good undergraduate degree in math, physics, theoretical computer science or related field. Salary: Depends on experience. Job: Implementing various novel algorithms. The previous holder of this position (Charles Garrett) implemented our new Bayesian approach to games, with striking success. We are now engaged in an effort to produce a world championship chess program based on these methods and several new ideas regarding learning. This is an experimental learning project as well as a game-playing project. The chess program is being written in Modula 3. Experience in Modula 3 is useful but not essential so long as you are willing to learn it. Other projects may include TD learning, GA's, DNA computing, etc. To access papers on our approach to games, and get some idea of general nature of other projects, (e.g. a paper on GA's Garrett worked on) see my home page http://www.neci.nj.nec.com:80/homepages/eric/eric.html A paper on a classifier-like learning system with Garrett will appear there RSN (but don't wait to apply). This system trains a complex `mind' formed of multiple agents interacting in an economic model and is currently being tested on Blocks World planning problems. 
This project will continue, and running experiments on it will be part of the job. The successful applicant will (a) have experience getting *large* programs to *work*, (b) be able to understand the papers on my home page and convert them to computer experiments. These projects are at the leading edge of basic research in algorithms/cognition/learning, so expect the work to be both interesting and challenging. Term: The initial job offer will be for a short provisional period. If performance is satisfactory a longer term will be offered. There is a strong possibility of conversion to permanent status if performance is highly satisfactory. To apply please send cv, cover letter and list of references to: eric at research.nj.nec.com .ps or plain text please! NOTE- EMAIL ONLY. Hardcopy, e.g. US mail or Fedex etc, will not be opened. Equal Opportunity Employer M/F/D/V ------------------------------------- Eric Baum NEC Research Institute, 4 Independence Way, Princeton NJ 08540 PHONE:(609) 951-2712, FAX:(609) 951-2482, Inet:eric at research.nj.nec.com From biehl at Physik.Uni-Wuerzburg.DE Fri Aug 11 14:49:20 1995 From: biehl at Physik.Uni-Wuerzburg.DE (Michael Biehl) Date: Fri, 11 Aug 95 14:49:20 MESZ Subject: paper available: on-line backpropagation Message-ID: <199508111249.OAA08313@wptx08.physik.uni-wuerzburg.de> FTP-host: ftp.physik.uni-wuerzburg.de FTP-filename: /pub/preprint/WUE-ITP-95-018.ps.gz The following paper is now available via anonymous ftp: (See below for the retrieval procedure) ------------------------------------------------------------------ "On-line backpropagation in two-layered neural networks" Peter Riegler and Michael Biehl Ref. WUE-ITP-95-018 Abstract We present an exact analysis of learning a rule by on-line gradient descent in a two-layered neural network with adjustable hidden-to-output weights (backpropagation of error). Results are compared with the training of networks having the same architecture but fixed weights in the second layer.
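[Archive editor's illustration.] The setting analysed in the abstract — on-line gradient descent in a two-layer network learning a rule from a teacher, with both layers adjustable — can be sketched numerically. The sizes, learning rate, and teacher below are invented for illustration and are not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

N, K = 50, 3            # input dimension and hidden units (illustrative sizes)
eta = 0.005             # learning rate (chosen for the demo, not from the paper)

# Teacher network defining the rule to be learned
W_teacher = rng.normal(size=(K, N)) / np.sqrt(N)
v_teacher = np.ones(K)

# Student network: both layers trainable ("adjustable hidden-to-output weights")
W = rng.normal(size=(K, N)) / np.sqrt(N)
v = rng.normal(size=K)

def gen_error(W, v, n=2000):
    """Monte Carlo estimate of the generalization error on fresh Gaussian inputs."""
    xs = rng.normal(size=(n, N))
    student = np.tanh(xs @ W.T) @ v
    teacher = np.tanh(xs @ W_teacher.T) @ v_teacher
    return 0.5 * np.mean((student - teacher) ** 2)

e0 = gen_error(W, v)
for _ in range(30000):
    x = rng.normal(size=N)      # on-line learning: one fresh example per step
    h = np.tanh(W @ x)
    delta = h @ v - np.tanh(W_teacher @ x) @ v_teacher
    # Backpropagation of the quadratic error through both layers
    v -= eta * delta * h                                  # hidden-to-output update
    W -= eta * np.outer(delta * v * (1 - h ** 2), x)      # input-to-hidden update
e1 = gen_error(W, v)
```

Freezing the `v -= ...` line reproduces the fixed-second-layer comparison case mentioned at the end of the abstract.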
---------------------------------------------------------------------
Retrieval procedure:

unix> ftp ftp.physik.uni-wuerzburg.de
Name: anonymous
Password: {your e-mail address}
ftp> cd pub/preprint
ftp> get WUE-ITP-95-018.ps.gz (*)
ftp> quit
unix> gunzip WUE-ITP-95-018.ps.gz
e.g. unix> lp WUE-ITP-95-018.ps [8 pages]

(*) can be replaced by "get WUE-ITP-95-018.ps". The file will then be uncompressed before transmission (slow!).
_____________________________________________________________________
-- Michael Biehl, Institut fuer Theoretische Physik, Julius-Maximilians-Universitaet Wuerzburg, Am Hubland, D-97074 Wuerzburg email: biehl at physik.uni-wuerzburg.de Tel.: (+49) (0)931 888 5865 " " " 5131 Fax : (+49) (0)931 888 5141 From mao at almaden.ibm.com Thu Aug 10 23:54:04 1995 From: mao at almaden.ibm.com (Jianchang Mao 927-1932) Date: Thu, 10 Aug 95 19:54:04 -0800 Subject: call for papers, IEEE Journal on SELECTED AREAS IN COMMUNICATIONS Message-ID: <9508110254.AA24519@powerocr.almaden.ibm.com> ==================================================================== CALL FOR PAPERS IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS COMPUTATIONAL AND ARTIFICIAL INTELLIGENCE IN HIGH SPEED NETWORKS Recent research in high speed networks has resulted in key architectural trends which are likely to fundamentally influence all facets of the communications infrastructure. A major opportunity is now the integration of diverse services on these networks. Unlike traditional teletraffic, many of these current and emerging services have poorly understood traffic parameters and user behaviors. These networks must be self-managing and self-healing and be able to maintain their quality of service, deal with congestion and failures, and allow dynamic reconfiguration with minimal intervention. This has led many researchers to investigate algorithms that have adaptive and even learning behaviors.
There is a consensus among many researchers that to manage these new networks and their workloads, a class of techniques exhibiting some form of computational intelligence will be needed. Despite some progress, key challenges remain. One problem is that these resource management algorithms need to be able to respond anticipatively or preventatively to problems, since the time-bandwidth product of these networks does not always allow for reactive behavior. A second problem is that in some cases, decisions must be made on a very rapid (sometimes submicrosecond) time scale in order to optimize switching behavior. Thirdly, while the network adapts, its traffic sources and sinks are also adapting intelligently to the network's behavior, making for added complexity.

Computational intelligence encompasses the information processing paradigms of adaptive systems such as neural networks and fuzzy logic. Examples of artificial intelligence techniques include expert systems and search techniques. Computational intelligence paradigms have the ability to learn from experience and to predict future behaviors. In some cases these learning rules are explicit, but in other cases the learning algorithms are implicit in a more general structure such as a neural network. In particular, neural networks have been shown to have properties that can help in managing congestion in networks, dealing with changing workloads, etc. Analog circuits that implement neural networks have been shown to be capable of solving optimization problems in submicrosecond timescales, fast enough to make on-the-fly switch routing decisions.
Original papers are solicited on the application of any technique in computational or artificial intelligence to topics including (but not limited to):

- Fast packet switching
- Fault tolerant and dynamic routing
- Multimedia source measurement and modeling
- Call admission control
- Traffic policing
- Congestion and flow control
- Estimation of quality of service parameters
- Wireless and mobile networks
- Disconnectible terminal management

Authors wishing to submit papers should send six copies to Prof. Ibrahim Habib at the address below. The following schedule applies:

Submission deadline: January 15, 1996
Notification of acceptance: May 15, 1996
Final manuscript due: July 1, 1996
Publication: 1st Quarter, 1997

GUEST EDITORS:

Prof. Ibrahim Habib
Department of Electrical Engineering
City University of New York, City College
137 Street at Convent Avenue
New York, N.Y. 10031
email: ibhcc at cunyvm.cuny.edu

Dr. Robert Morris
Manager, Data Systems Technology
IBM Almaden Research Center
San Jose, CA 95120
email: rjtm at almaden.ibm.com

Dr. Hiroshi Saito
Distinguished Technical Member
NTT Telecommunications Networks Laboratories
3-9-11, Midori-chi, Musashino-shi
Tokyo 180, Japan
email: saito at hashi.ntt.jp

Prof. Bjorn Pehrson
Royal Institute of Technology
Electrum 204
S-164 Kista, Sweden
email: bjorn at it.kth.se

From v.dimitrov at uws.edu.au Tue Aug 15 05:49:17 1995
From: v.dimitrov at uws.edu.au (Vladimir Dimitrov)
Date: Tue, 15 Aug 1995 19:49:17 +1000
Subject: FLAMOC'96
Message-ID: <199508150948.AA02462@hotel.uws.EDU.AU>

International Discourse on FUZZY LOGIC AND THE MANAGEMENT OF COMPLEXITY (FLAMOC'96)
Sydney, 15-18 January, 1996

SECOND ANNOUNCEMENT AND FINAL CALL FOR PAPERS

FLAMOC'96 is an International Discourse targeted on the growing use of Fuzzy Logic when dealing with Complexity in various fields of application (Industry, Business, Finance, Management, Ecology, Medicine, Social Science, etc.).
FLAMOC'96 intends to contribute insight and foresight regarding the creation of innovative and practically efficient ways of implementing Fuzzy Logic in problem situations impregnated with Uncertainty, Intricacy, and Hazard. In an age of increasing Technological, Environmental and Social Complexity, FLAMOC'96 emphasises the synergy between Fuzzy Logic and Neural Networks, Genetic Algorithms, Non-linear Simulation Techniques, Fractal and Chaos Theory, not only in fuzzy engineering practice but also in the search for a better understanding of the changes around and in us: learning how to handle paradoxes and risk, how to avoid conflicts and look for collaboration and consensus, and how to improve our personal and organizational achievements by integrating many diverse and contradictory requirements into coherent, complementary, and useful outputs.

FLAMOC'96 provides a rich tutorial program and an excellent opportunity to meet and talk with world-known experts in Fuzzy Logic and Soft Computing. FLAMOC'96 is the only discourse of its kind that intends to explore the diversity of practical and theoretical applications of Fuzzy Logic in managing real-life Complexity, using Fuzzy Thinking as a bridge between science and the humanities.

THE PROGRAM

Keynote speaker: Prof. Lotfi Zadeh
Invited speakers: Prof. George Klir, Prof. Michio Sugeno

Three basic streams build the conceptual framework of this discourse:

- Soft Computing Technology (Fuzzy Logic, Neural Networks and Genetic Algorithms): with applications in Process Control, Intelligent Manufacturing Systems, Artificial Intelligence (Expert Systems, Decision Support Systems, Knowledge Based Learning Systems, Robotics), Fuzzy Mathematics, Data Analysis, Linguistics, Biomedical Engineering, Medical Informatics.
- Fuzzy Logic in Organising Systems: with applications in Social Science (Consensus Seeking, Conflict Analysis, Human Decision Making, Public Participation, Qualitative Reasoning, Education), Philosophy (Post-Aristotelian Logic, Postmodernism), Psychology, Economics (Stock Market and Financial Analysis).

- Approximate Reasoning in Environmental Applications: Cleaner Production, Environmental Management, Mass Load Analysis, Risk Management, Sustainable Development Practice, Ecology, Ecocybernetics.

Participants of FLAMOC'96 are free to suggest other topics that might be of interest to attendees. FLAMOC'96 will include open discussions on issues selected by participants (Soft Computing and Fuzzy Set Theory, System Science and Sustainable Development, Chaos and Complexity), as well as introductory and advanced tutorials, hands-on demonstrations, an international exhibition of fuzzy products, lectures, visual summaries (poster sessions), and multi-paper sessions.

FLAMOC'96 provides a special day for business people and managers ("FLAMOC Business Day") with presentations delivered by leading experts and tutorials on how to apply Fuzzy Logic effectively in business, marketing, finance, and management.

Participants are invited to submit proposals for presentation in a form that is most appropriate to the subject matter they would cover (e.g. paper, poster presentation, tutorial, lecture, computer demonstration, discussion, exhibit of a fuzzy product, performance on a fuzzy music system, etc.).

THE ATTENDEES

FLAMOC'96 is aimed at system and complexity scientists and researchers, control engineers, social scientists, managers in business, industry and government, academics, conflict resolution practitioners and facilitators, bio-medical engineers, environmental managers and environmentalists, fuzzy soft- and hardware specialists, information science professionals and students.

INTERNATIONAL PROGRAM COMMITTEE

L. Zadeh (USA) - Honorary Chairman
V. Dimitrov (Univ.
Western Sydney, Hawkesbury, Australia) - Coordinator J. Bezdek (Univ. West Florida, USA) H. Berenji (NASA, USA) Z. Bien (Advanced Inst. Science and Technology, Korea) C. Carlsson (Abo Academy, Finland) E. Cox (Metus Systems Group, USA) J. Dimitrov (Univ. Western Sydney, Nepean, Australia) D. Driankov (Univ. Linkoeping, Sweden) S. Dyer (OrgMetrics, USA) D. Filev (Ford, USA) T. Gedeon (Univ. NSW, Australia) H. Guesgen (Univ. Auckland, New Zealand) M. Gupta (Univ. Saskatchewan, Canada) J. Kacprzyk (Academy of Sciences, Poland) N. Kasabov (Univ. Otago, New Zealand) D. Keweley (Defence Science and Technology Organisation, Australia) P. Kloeden (Deakin Univ., Australia) G. Klir (State Univ. New York, USA) L. Koczy (Techn. Univ. Budapest, Hungary) R. Kowalczyk (CRA, Advanced Technical Development, Australia) V. Kreinovich (Univ. Texas, USA) K. Leung (The Chinese Univ. Hong Kong) D. Lakov (Academy of Sciences, Bulgaria) M. Mizumoto (Osaka EC Univ., Japan) S. Murugesan (Univ.Western Sydney, Macarthur, Australia) A. Patki (Department of Electronics, India) L. Reznik (Victoria Univ. Technology, Australia) A. Ramer (Univ. NSW, Australia) B. Rieger (Univ. Trier, Germany) A. Salski (Univ. Kiel, Germany) M. Smithson (James Cook Univ., Australia) M. Sugeno (Tokyo Inst. Technology, Japan) T. Terano (Tokyo Inst. Technology, Japan) T. Vladimirova (Univ. Surrey, UK) X. Yao (Defence Force Academy, Australia) J. Yen (Texas A&M Univ., USA) ORGANIZING/MANAGEMENT COMMITTEE J. Dimitrov (Univ. Western Sydney, Nepean) - Chair V. Dimitrov (Univ. Western Sydney, Hawkesbury) X. Hu (Univ. Sydney) G. Jaros (Univ. Sydney) K. Kopra (Univ. Western Sydney, Hawkesbury) J. Peperides (Omron Electronics) G. Sheather (Univ. Technology, Sydney) D. Tayler (Hawkesbury Technology) IMPORTANT DATES 15 September 1995 Deadline for Extended Abstract (1-2 pages) Submission. 
Address for submission:
Dr Vladimir Dimitrov
School of Social Ecology, UWS-Hawkesbury, Richmond 2753, Australia
Fax: +61(45) 701901 Phone: +61(47) 701903
E-mail: v.dimitrov at uws.edu.au

15 October 1995: Preliminary Acceptance
30 November 1995: Deadline for Camera Ready Copy of Full Paper (5 pages)

All accepted papers will be published in the FLAMOC'96 Proceedings: "Fuzzy Logic and the Management of Complexity". After the discourse, selected papers will be published in a separate volume.

REGISTRATION FEE

AUS$450 (before 15.11.1995) and AUS$500 (after 15.11.1995). Students' Fee: AUS$100.

TUTORIALS

Tutorials include the following topics:
(1) Introduction to Fuzzy Logic (FL) and its Applications.
(2) Industrial Applications of FL.
(3) Applications of FL in Management Practice.
(4) Applications of FL in Business and Financial Forecasting.
(5) Clinical Applications of FL.
(6) Application of FL in Environmental Management.
(7) Advanced Design Methodology of Fuzzy Systems: Neuro-Fuzzy and Fuzzy-Genetic Systems.
(8) Fuzzy Semantics.
(9) Approximate Reasoning.
(10) Research Topics in Soft Computing.

FEE FOR TUTORIALS: AUS$100 for one selected tutorial. AUS$180 for two selected tutorials. AUS$250 for three selected tutorials. AUS$300 for four or more tutorials. Students' fee: AUS$50 (full-day tutorials).

Potential lecturers are invited to submit a one-page proposal for a tutorial that includes: the background of the lecturer (both in research and lecturing experience), and the abstract and contents of the proposed lecture (not limited by the above list), to:

Dr Xiheng Hu
DEE, University of Sydney
NSW 2006, Australia
Fax: +61(2) 351 3847 Phone: +61(2) 351 6475
E-mail: hxh at ee.su.oz.au

not later than 15 September 1995. Lecturers are responsible for preparation and delivery of their lectures as well as preparation of a quality handout (such as copies of lecture overheads). Lecturers have FREE REGISTRATION for FLAMOC'96.
EXHIBITION

For information about the exhibition and space reservation, contact:

Mr Kalevi Kopra
6 Boree Rd, Forestville 2087, Australia
Fax: +61(2) 975 1943 Phone: +61(2) 451 5728

Proposals for the exhibition are to be sent not later than 15 September 1995. FEE FOR EXHIBITION SPACE (for companies): AUS$2000.

SOCIAL EVENTS

You may want to participate in our golf tournament on 17 January 1996 (Fee: AUS$50) with fuzzy logic experts from around the world, or enjoy a Conference Dinner while cruising in the beautiful Sydney Harbour (Fee: AUS$100), or join a post-conference tour to the Great Barrier Reef, Blue Mountains, etc.

For overall information about attending (and participating in the program of) FLAMOC'96, contact:

Mrs Judith Dimitrov
POBox 91, Richmond 2753, Australia
Fax: +61(47) 761616 Phone: +61(47) 761514

ACCOMMODATION

AUS$105 per room (single or shared by two or three persons) in:
(1) The Golden Gate Hotel (Reservations: POBox K401, Haymarket, NSW 2000, Australia; Fax +61(2) 281 2213; Phone +61(2) 281 6888).
(2) Country Comfort Hotel, Sydney Central (Reservations: POBox K963, Haymarket; Fax +61(2) 281 3794; Phone +61(2) 212 2544).

Participants must book their stay directly with one of the above hotels as soon as possible. Both hotels are within walking distance of the Graduate School of Business, University of Technology, Sydney (1-59 Quay St., Haymarket), where FLAMOC'96 will take place, and of all major Sydney tourist attractions.

____________________________________________________________________________

REGISTRATION FORM FLAMOC'96
(to be sent to: FLAMOC'96, POBox 91, Richmond 2753, AUSTRALIA; Fax: +61(47) 761616)

Name .......................................................................
Address ....................................................................
Company ....................................................................
Mailing Address ............................................................
Phone ............... Fax ............... E-mail ...........................

[ ] I shall participate in FLAMOC'96. Enclosed is my payment of the Registration Fee (AUS$450 before 15.11.1995; AUS$500 after 15.11.1995; AUS$100 for students).

I shall participate in tutorials:
[ ] one selected tutorial only: (please indicate the number of the tutorial from the above list of tutorials)
[ ] two selected tutorials: (please indicate the numbers of the tutorials from the above list of tutorials)
[ ] three selected tutorials: (please indicate the corresponding numbers)
[ ] four or more tutorials.
Enclosed is my payment for the tutorials (AUS$100 for one tutorial; AUS$180 for two; AUS$250 for three; AUS$300 for four or more; [ ] Students' fee: AUS$50 full day).

[ ] I shall exhibit fuzzy products. Enclosed is my payment for the exhibition space: AUS$2000.

I shall participate in:
[ ] Golf tournament (AUS$50)
[ ] Conference Dinner & Harbour Cruise (AUS$100)
Enclosed is my payment for the golf tournament and/or the Harbour Cruise.

I am interested in Post Conference Activities such as:
[ ] Great Barrier Reef Tour
[ ] Blue Mountains Excursion
[ ] Others: ______________________________________________

PAYMENT

1. CREDIT CARD ____ Visa ____ Master Card ____ American Express
Card No. ...................................................................
Expiration Date ............................................................
Name of the cardholder: ....................................................
Signature of the cardholder: ...............................................
Please mail or fax the above form to:

FLAMOC'96
POBox 91, Richmond 2753, AUSTRALIA
Fax: +61(47) 761616

2. CHEQUE - made payable in Australian Dollars to: FLAMOC'96, The University of Sydney - sent to: POBox 91, Richmond 2753, AUSTRALIA

____________________________________________________________________________

From rwp at eng.cam.ac.uk Tue Aug 15 10:38:19 1995
From: rwp at eng.cam.ac.uk (Richard Prager)
Date: Tue, 15 Aug 1995 10:38:19 BST
Subject: Research Position for 1 Year Cambridge UK
Message-ID: <199508150938.6421@dsl.eng.cam.ac.uk>

Euro-PUNCH Project Research Assistant Position for One Year

Under the terms of a grant recently awarded to the Euro-PUNCH Project we expect to offer a one-year Research Assistant position in Cambridge to investigate the use of:

Neural Networks in the Prediction of Risk in Pregnancy

Euro-PUNCH is a collaborative Project funded by the Human Capital and Mobility Programme of the Commission of the European Communities. Thus, the post is available only to a citizen of a European Union Member State (but not British), who wishes to come to work in the United Kingdom.

From the obstetrical point of view, the principal focus of the Euro-PUNCH Project lies in the use of patient-specific measurements of an epidemiological nature (such as maternal age, past obstetrical history, etc.) as well as fetal heart rate recordings, in the forecasting of a number of specific Adverse Pregnancy Outcomes.

From the neural network point of view, the Project involves the design of pattern-processing and classification systems which can be trained to forecast problems in pregnancy. This will involve continuation of work on pattern-classification and regression analysis, using neural networks operating on a very large database of about 1.2 million pregnancies from various European countries.
Challenging components of the project include dealing with missing and uncertain variables, sensitivity analysis, variable selection procedures and cluster analysis. Many leading European obstetrical centres are involved in the Euro-PUNCH project, and close collaboration with a number of these will be an essential component of the post offered.

Candidates for this post are expected to have a good first degree and preferably a post-graduate degree in a relevant discipline. Some familiarity with medical statistics and neural networks is desirable but not essential. Salary (on the RA scale) will depend on age and experience, and is likely to be in the range of GBP14,317 to GBP15,986 per annum. Appointment would be subject to satisfactory health screening.

Applications will close on 23rd August 1995. Interviews will be held in Cambridge and are likely to be on Wednesday 30th August 1995. Applications (naming two referees) should be submitted to:

Dr Kevin J Dalton PhD FRCOG
Division of Materno-Fetal Medicine, Dept. Obstetrics & Gynaecology
University of Cambridge, Addenbrooke's Hospital
Cambridge CB2 2QQ
Tel: +44-1223-410250 Fax: +44-1223-336873 or 215327
e-mail: kjd5 at cus.cam.ac.uk

Informal enquiries about the project should be directed to:
(Obstetric side) Dr Kevin Dalton kjd5 at cus.cam.ac.uk
(Engineering side) Dr Niranjan niranjan at eng.cam.ac.uk
(Engineering side) Dr Richard Prager rwp at eng.cam.ac.uk

From kevin at research.nj.nec.com Tue Aug 15 14:49:12 1995
From: kevin at research.nj.nec.com (Kevin Lang)
Date: Tue, 15 Aug 95 14:49:12 EDT
Subject: learning 5-bit parity: RMHC vs. GP vs. MLP
Message-ID: <9508151849.AA20840@doghein>

Comments on "A Response to ..." [Ko95]

Kevin J. Lang
NEC Research Institute
kevin at research.nj.nec.com
July 1995

These remarks constitute a brief technical reply to [Ko95], which was handed out at the ML-95 and ICGA-95 conferences.
SCALING FROM 3-BIT TO 5-BIT PARITY: hill climbing still wins

[Ko95, section 8] extends the experimental comparison in [La95] of the search efficiency of random mutation hill climbing and genetic search to the task of learning 5-bit parity. It is shown that any given run of RMHC is four times less likely than genetic search to find a circuit which computes 5-bit parity. However, when RMHC manages to find a solution, it does so about fifty times faster than genetic search. Hence, by using RMHC in an iterated manner (i.e. multiple independent runs), it is possible to generate solutions much more cheaply than with genetic search. For example, by restarting each run of RMHC after 75,000 candidates one could obtain a candidate-to-solution ratio of about 300,000. This compares well with the value of 2,913,583 candidates per solution reported for genetic search.

The above argument could be formalized by calculating performance curves for the two algorithms, as discussed in [Ko92, chapter 8]. [Ko95] asserts that these curves would have permitted a meaningful comparison of the algorithms to be made, but does not provide them, citing the large computational expense that would supposedly have to be incurred. Actually, the relevant portion of the I(M,i,z) curve for RMHC could be estimated in one day by doing fifty runs out to 100,000 candidates. Since this is roughly the same amount of work as a single run of genetic search on this problem, estimating the performance curve for genetic search _would_ be a daunting task.

DON'T USE RMHC: the power of multi-layer perceptrons

I would like to emphasize that the hill climbing algorithm described in [La95] was not intended to be either novel or good, and that I do NOT advocate its use. By deliberately using a bad version of an old algorithm, I sought to underline the negative character of my results. My positive advice is this: when learning functions, use multi-layer perceptrons.
By adopting this representation for hypotheses one can exploit the powerful gradient-based search procedures that have been developed by the numerical analysis community. To illustrate the advantages of this approach, I invested 7 seconds of computer time in ten runs of conjugate gradient search for MLP weights to compute 5-bit parity. The resulting candidate-to-solution ratio was 393. This is roughly 750 times better than RMHC on boolean circuits, and 7400 times better than genetic search on boolean circuits.

candidates    found a
examined      solution?
----------    ----------
    31        yes
    43        yes
    62        yes
   151        yes
   196        no, stuck in local optimum
   239        no, stuck in local optimum
   271        no, stuck in local optimum
   274        no, stuck in local optimum
   308        no, stuck in local optimum
   392        yes

Unlike RMHC and genetic search, the conjugate gradient search procedure used here has no control parameters to tweak, and should yield good results in the hands of any user. Also, it detects when it is stuck in a local optimum, thus permitting an immediate restart to be made from random weights. This transforms the iterated methodology from an after-the-fact accounting system into a truly useful algorithm.

Notes on the MLP network used in the above experiment:
- 5 input units
- 5 hidden units computing tanh
- 1 output unit computing tanh
- random initial weights drawn uniformly from the interval [-1,+1]
- true and false encoded by +1 and -1

REFERENCES

[Ko95] John Koza, "A Response to the ML-95 Paper entitled 'Hill Climbing Beats Genetic Search on a Boolean Circuit Synthesis Task of Koza's'", informally published and distributed document.

[La95] Kevin Lang, "Hill Climbing Beats Genetic Search on a Boolean Circuit Synthesis Task of Koza's", The Twelfth International Conference on Machine Learning, pp. 340-343, 1995. Note: a copy of this paper can be obtained by anonymous ftp from ftp.nj.nec.com:/pub/kevin/lang-ml95.ps

[Ko92] John Koza, "Genetic Programming", MIT Press, 1992, pp. 205-236.
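[Editorial aside on capacity: the 5-5-1 tanh architecture in the notes above can represent 5-bit parity exactly. The weights below are hand-constructed for illustration only, not weights found by Lang's conjugate gradient runs: the hidden units form a staircase in the sum of the +/-1 inputs, and alternating output weights flip the output sign at each step.]

```python
import itertools
import math

def net(x, gain=10.0):
    """5-5-1 tanh MLP with hand-picked weights that computes 5-bit parity."""
    # Hidden unit i fires tanh(gain * (sum(x) - t_i)); the thresholds sit
    # between the six possible input sums -5, -3, -1, 1, 3, 5.
    thresholds = [-4, -2, 0, 2, 4]
    s = sum(x)
    h = [math.tanh(gain * (s - t)) for t in thresholds]
    # Alternating hidden-to-output weights flip the output's sign each time
    # one more hidden unit switches on, i.e. each time one more input is +1.
    c = [1.0, -1.0, 1.0, -1.0, 1.0]
    return math.tanh(sum(ci * hi for ci, hi in zip(c, h)))

# Verify against all 32 patterns: output is positive iff parity is odd.
for x in itertools.product([-1, 1], repeat=5):
    parity = 1 if x.count(1) % 2 == 1 else -1
    assert (net(x) > 0) == (parity > 0)
print("5-5-1 tanh net computes 5-bit parity on all 32 patterns")
```

This only shows the architecture can express the target; finding such weights by gradient search is the part Lang's experiment measures.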
From walter.fetter at mandic.com.br Wed Aug 16 02:22:00 1995
From: walter.fetter at mandic.com.br (WALTER FETTER)
Date: Wed, 16 Aug 95 03:22:00 -0300
Subject: subscription
References: <8ACB049.012C0082FD.uuout@mandic.com.br>
Message-ID: <8AF40CA.012C010C4D.uuout@mandic.com.br>

From jbower at bbb.caltech.edu Tue Aug 15 21:44:07 1995
From: jbower at bbb.caltech.edu (Jim Bower)
Date: Tue, 15 Aug 95 18:44:07 PDT
Subject: GENESIS 2.0
Message-ID: <9508160144.AA06021@bbb.caltech.edu>

------------------------------------------------------------------------

This is to announce the release of GENESIS 2.0, a major revision of the GENESIS simulator. GENESIS is a general purpose simulation platform which was developed to support the simulation of neural systems ranging from complex models of single neurons to simulations of large networks made up of more abstract neuronal components. GENESIS has provided the basis for laboratory courses in neural simulation at both Caltech and the Marine Biological Laboratory in Woods Hole, MA, as well as several other institutions. Most current GENESIS applications involve realistic simulations of biological neural systems. Although the software can also model more abstract networks, other simulators are more suitable for backpropagation and similar connectionist modeling.

The current version of GENESIS and its graphical front-end XODUS are written in C and run under UNIX on Sun (SunOS 4.x or Solaris 2.x), DECstation (Ultrix), Silicon Graphics (Irix 4.0.1 and up) or x86 PC (Linux or FreeBSD) machines with X-windows (versions X11R4, X11R5, and X11R6). Within the next two months, we expect to complete the port of GENESIS to DEC Alphas (OSF1 v2 and v3), IBM RS6000s (AIX) and HPs (HPUX).
Other platforms may be capable of running GENESIS, but the software has not been tested by Caltech outside of these environments. The GENESIS ftp site also contains the first release of Parallel GENESIS, designed for networks of workstations (NOW), symmetric multiprocessors (SMP) and massively parallel processors (MPP). This release is known to run on SGI/Irix and has run on these workstations: Sun4/Solaris, Alpha/OSF1.3, DecStation/Ultrix4.3, Sun4/SunOS. It has run on these SMPs: SGI-Challenge/Irix 5.3, Sun4MP/Solaris. It will soon be ported to the Cray T3D/E MPP and later to Intel Paragon and IBM SP2 MPPs. GENESIS 2.0 also now runs on 486 and Pentium PCs under Linux and FreeBSD.

In addition to these extensions, 2.0 includes a number of changes and improvements in the source code for portability, stability, and consistency. Numerous changes in the GENESIS objects also add flexibility, especially when constructing network simulations. In addition, the XODUS graphical interface has been completely rewritten and is now independent of the Athena widget set. This allows greater interaction with simulations using the mouse (rescaling of graphs by click and drag, restructuring of simulation elements with drag and drop operations, etc.). A script converter is included which translates GENESIS 1 scripts to GENESIS 2. The GENESIS 2.0 release also includes updates of the simulation scripts for all tutorials and examples used in ``The Book of GENESIS'' (see below). This release includes a completely revised and expanded manual and on-line help.

Acquiring GENESIS via free FTP distribution:

We have made the current release of GENESIS (ver. 2.0, August 1995) available via FTP from genesis.bbb.caltech.edu (131.215.5.249). The distributed compressed tar file is about 3 MB in size. The current distribution includes full source code and documentation for both GENESIS and XODUS as well as fourteen tutorial simulations.
Documentation for these tutorials is included along with online GENESIS help files and postscript files for generating the newly revised printed manual. To acquire the software use 'ftp' to connect to genesis.bbb.caltech.edu and login as the user "anonymous", giving your full email address as the password. You can then 'cd /pub/genesis' and download the software. Be sure to download the files LATEST.NEWS and README for information about new features of the current GENESIS version, the files on the system, and installation instructions. A detailed guide to the GENESIS neuroscience tutorials and to the construction of GENESIS simulations is given in: The Book of GENESIS: Exploring Realistic Neural Models with the GEneral NEural SImulation System, by James M. Bower and David Beeman, published by TELOS/Springer-Verlag -- ISBN 0-387-94019-7 For ordering information, contact info at telospub.com, or phone (in the US) 1-800-777-4643. BABEL - GENESIS users group Serious users of GENESIS are advised to join the users group, BABEL. Members of BABEL are entitled to access the BABEL directories and email newsgroup. These are used as a repository for the latest contributions by GENESIS users and developers. These include new simulations, libraries of cells and channels, additional simulator components, new documentation and tutorials, bug reports and fixes, and the posting of questions and hints for setting up GENESIS simulations. As the results of GENESIS research simulations are published, many of these simulations are being made available through BABEL. New developments are announced in a newsletter which is sent by email to all members. Members are able to access the BABEL directories and transfer files to and from their host machines using a passworded ftp account. Inquiries concerning GENESIS should be addressed to genesis at bbb.caltech.edu. Inquiries concerning BABEL memberships should be sent to babel at bbb.caltech.edu. 
Other information concerning GENESIS, including "snapshots" of GENESIS simulations and descriptions of research which has been conducted with GENESIS, may be found on the GENESIS World Wide Web Server: http://www.bbb.caltech.edu/GENESIS

From wgm at santafe.edu Wed Aug 16 16:03:37 1995
From: wgm at santafe.edu (Bill Macready)
Date: Wed, 16 Aug 95 14:03:37 MDT
Subject: No subject
Message-ID: <9508162003.AA20822@sfi.santafe.edu>

In his recent posting, Kevin Lang continues a contest with John Koza of the "my search algorithm beats your search algorithm according to the following performance measure for the following (contrived) fitness function" variety. Interested readers of connectionists should know that over the space of all fitness functions, no matter what the performance measure, any two search algorithms have exactly the same expected performance. This is discussed in the paper mentioned below. That paper goes on to discuss other more interesting issues, like head-to-head minimax distinctions between search algorithms, the information theoretic and geometric aspects of search, time-varying fitness functions, etc.

Interested readers should also know that there was a (very) lengthy thread on this topic on the ga-list several months ago. In particular, some articles following up on the paper mentioned below are announced there. For example, an article by Radcliffe and Surry is announced there, as is an article by Macready and Wolpert on intrinsically hard fitness functions (as opposed to functions that are hard with respect to some particular search algorithm).

***

We are also currently involved in research with a student to exhaustively characterize the set of fitness functions for which search algorithm A does much better than algorithm B, and how that set differs from the set for which B outperforms A. In particular, we are doing this for the case where the algorithms are hill-climbers and GA's, as in Lang's debate with Koza.
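[Editorial aside: the equal-expected-performance claim can be checked by brute force on a toy space. The sketch below, not taken from the paper, enumerates every fitness function on a three-point domain with binary values and shows that a fixed-order search and a value-adaptive search see exactly the same distribution of fitness-value sequences when aggregated over all functions.]

```python
from collections import Counter
from itertools import product

X = [0, 1, 2]   # a tiny search space
YS = [0, 1]     # possible fitness values

def run(algorithm, f):
    """Run a non-retracing search algorithm on fitness function f,
    returning the sequence of fitness values it observes."""
    history = []
    for _ in X:
        x = algorithm(history)
        history.append((x, f[x]))
    return tuple(y for _, y in history)

def fixed_order(history):
    # Visit points left to right, ignoring observed values.
    return len(history)

def adaptive(history):
    # A value-dependent rule: after seeing a 1, jump to the highest
    # unvisited point; otherwise take the lowest unvisited point.
    visited = {x for x, _ in history}
    unvisited = [x for x in X if x not in visited]
    if history and history[-1][1] == 1:
        return max(unvisited)
    return min(unvisited)

# Aggregate the observed-value sequences over ALL 2^3 = 8 fitness functions.
all_functions = [dict(zip(X, ys)) for ys in product(YS, repeat=len(X))]
traces_a = Counter(run(fixed_order, f) for f in all_functions)
traces_b = Counter(run(adaptive, f) for f in all_functions)
assert traces_a == traces_b   # identical distributions: no free lunch
print("Both algorithms see every value sequence equally often:", traces_a)
```

The reason is a simple bijection: for any deterministic non-retracing algorithm, each possible value sequence is produced by exactly one fitness function, so averaging over all functions erases any difference between algorithms.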
In addition, search is (from a formal perspective) almost identical to active learning, optimal experimental design, and (a less exact match) control theory. We are also currently investigating what those other fields have to offer for search. Anyone interested in receiving the results of that work as it gets written up can email us at wgm at santafe.edu or dhw at santafe.edu Bill Macready *** Here are the original paper announcements: ftp-file-name: nfl.ps No Free Lunch Theorems for Search D.H. Wolpert, W.G. Macready We show that all algorithms that search for an extremum of a cost function perform exactly the same, when averaged over all possible cost functions. In particular, if algorithm A outperforms algorithm B on some cost functions, then loosely speaking there must exist exactly as many other functions where B outperforms A. Starting from this we analyze a number of the other a priori characteristics of the search problem, like its geometry and its information-theoretic aspects. This analysis allows us to derive mathematical benchmarks for assessing a particular search algorithm's performance. We also investigate minimax aspects of the search problem, the validity of using characteristics of a partial search over a cost function to predict future behavior of the search algorithm on that cost function, and time-varying cost functions. We conclude with some discussion of the justifiability of biologically-inspired search methods. ------------------------------------------------------------- ftp-file-name: hard.ps What Makes an Optimization Problem Hard? W.G. Macready, D.H. Wolpert We address the question, ``Are some classes of combinatorial optimization problems intrinsically harder than others, without regard to the algorithm one uses, or can difficulty only be assessed relative to particular algorithms?'' We provide a measure of the hardness of a particular optimization problem for a particular optimization algorithm. 
We then present two algorithm-independent quantities that use this measure to provide answers to our question. In the first of these we average hardness over all possible algorithms for the optimization problem at hand. We show that according to this quantity, there is no distinction between optimization problems, and in this sense no problems are intrinsically harder than others. For the second quantity, rather than average over all algorithms we consider the level of hardness of a problem (or class of problems) for the algorithm that is optimal for that problem (or class of problems). Here there are classes of problems that are intrinsically harder than others. To obtain an electronic copy of these papers:

ftp ftp.santafe.edu
login: anonymous
password:
cd /pub/wgm
get
quit

Then at your system: lpr -P

If you have trouble getting any of these papers electronically, you can request a hard copy from publications (wp at santafe.edu), Santa Fe Institute, 1399 Hyde Park Road, Santa Fe, NM, USA, 87501. From bap at scr.siemens.com Thu Aug 17 01:13:12 1995 From: bap at scr.siemens.com (Barak Pearlmutter) Date: Thu, 17 Aug 1995 01:13:12 -0400 Subject: Strawman: 2, GP: 0 In-Reply-To: <9508162003.AA20822@sfi.santafe.edu> (message from Bill Macready on Wed, 16 Aug 95 14:03:37 MDT) Message-ID: <199508170513.BAA21287@gull.scr.siemens.com> It might be true that in a universe where everything was equally likely, all search algorithms would be equally awful. But that does not appear to be the universe that we live in, so it is not unreasonable to ask whether some particular search algorithm performs well in practice. But this point is somewhat moot, because Kevin Lang's recent post was not claiming that "his" "new" algorithm was better than current algorithms. 
Lang published a well-reasoned and careful scientific paper, which cast doubt on the practicality and importance of a particular algorithm (GP) by showing that on a very simple problem *taken from the GP book* it compares unfavorably to a silly strawman algorithm. This paper passed scientific muster, and was accepted at ML, a respected refereed conference. John Koza responded by distributing a lengthy unrefereed screed, the bulk of which consisted of vicious invective and distorted half-truths. A small part of Koza's monograph had some actual technical content: it gave some new numbers on a scaled-up version of the problem in question, and interpreted them as showing that GP scaled better than the algorithm Lang had compared it to. However, Koza's interpretation was wrong: looking carefully at the raw numbers shows that even in Koza's hands, GP is scaling horribly worse than the silly strawman algorithm Lang had compared it to. That is the point of Lang's recent post. From workshop at Physik.Uni-Wuerzburg.DE Thu Aug 17 15:55:33 1995 From: workshop at Physik.Uni-Wuerzburg.DE (workshop) Date: Thu, 17 Aug 95 15:55:33 MESZ Subject: workshop and autumn school in Wuerzburg/Germany Message-ID: <199508171355.PAA17261@wptx14.physik.uni-wuerzburg.de> Second Announcement and Call for Abstracts INTERDISCIPLINARY AUTUMN SCHOOL AND WORKSHOP ON NEURAL NETWORKS: APPLICATION, BIOLOGY, AND THEORY October 12-14 (school) and 16-18 (workshop), 1995 W"urzburg, Germany INVITED SPEAKERS INCLUDE: M. Abeles, Jerusalem A. Aertsen, Rehovot J.K. Anlauf, Siemens AG J.P. Aubin, Paris M. Biehl, W"urzburg C. v.d. Broeck, Diepenbeek M. Cottrell, Paris G. Deco, Siemens AG B. Fritzke, Bochum Th. Fritsch, W"urzburg J. G"oppert, T"ubingen L.K. Hansen, Lyngby M. Hemberger, Daimler Benz L. van Hemmen, M"unchen J.A. Hertz, Copenhagen J. Hopfield, Pasadena I. Kanter, Ramat-Gan P. Kraus, Bochum B. Lautrup, Copenhagen W. Maass, Graz Th. Martinetz, Siemens AG M. Opper, Santa Cruz H. Scheich, Magdeburg S. 
Seung, AT&T Bell-Lab. W. Singer, Frankfurt S.A. Solla, Copenhagen H. Sompolinsky, Jerusalem M. Stemmler, Pasadena F. Varela, Paris A. Weigend, Boulder AUTUMN SCHOOL, Oct. 12-14: Introductory lectures on theory and applications of neural nets for graduate students and interested postgraduates in biology, medicine, mathematics, physics, computer science, and other related disciplines. Topics include neuronal modelling, statistical physics, hardware and application of neural nets, e.g. in telecommunication and biological data analysis. WORKSHOP, Oct. 16-18: Biology, theory, and applications of neural networks with particular emphasis on the interdisciplinary aspects of the field. There will be only invited lectures with ample time for discussion. In addition, poster sessions will be scheduled. REGISTRATION: Recommended before AUGUST 31, per FAX or (E-)MAIL to Workshop on Neural Networks, Inst. f"ur Theor. Physik, Julius-Maximilians-Universit"at Am Hubland, D-97074 W"urzburg, Germany Fax: +49 931 888 5141 E-mail: workshop at physik.uni-wuerzburg.de The registration fee is DM 150,- for the Autumn school and DM 150,- for the Workshop, due upon arrival (cash only). Students pay DM 80,- for each event (student ID required). ABSTRACTS: Participants who wish to present a poster should submit title and abstract together with their registration, preferably by E-mail. Deadline for the registration of poster contributions is AUGUST 31. ACCOMMODATION: Please use the appended form to contact the W"urzburg Tourist Office directly for room reservations (Fax +49 931 37652). NOTE THAT THIS NUMBER WAS WRONG IN THE FIRST ANNOUNCEMENT! We strongly recommend arranging accommodation as soon as possible, as various other conferences are scheduled for the same period of time. FTP-SERVER: Updated information (program, abstracts etc.) is available via anonymous ftp from the site ftp.physik.uni-wuerzburg.de, directory /pub/workshop/. Retrieve file README for further instructions. 
ORGANIZING COMMITTEE: M. Biehl, Th. Fritsch, W. Kinzel, Univ. W"urzburg. SCIENTIFIC ADVISORY COUNCIL: D. Flockerzi, K.-D. Kniffki, W. Knobloch, M. Meesmann, T. Nowak, F. Schneider, P. Tran-Gia, Universit"at W"urzburg. SPONSORS: Peter Beate Heller-Stiftung im Stifterverband f. die Deutsche Wissenschaft, Research Center of Daimler Benz AG, Stiftung der St"adt. Sparkasse W"urzburg.

--------------------------------cut here----------------------------------
Registration Form

Please return to:
Workshop on Neural Networks
Institut f"ur Theoretische Physik
Julius-Maximilians-Universit"at
Am Hubland
D-97074 W"urzburg, Germany
Fax: +49 931 888 5141
E-mail: workshop at physik.uni-wuerzburg.de

I will attend the
  Autumn School Oct. 12-14 [ ] *  (Reg. fee DM 150,- [ ] / 80,- [ ] due upon arrival)
  Workshop Oct. 16-18 [ ] *       (Reg. fee DM 150,- [ ] / 80,- [ ] due upon arrival)
* Please mark; the reduced fee applies only for participants with a valid student ID.

I wish to present a poster [ ]
(If yes, please send a title page with a 10-line abstract!)

Name:
Affiliation:
Address:
Phone:
Fax:
E-mail:
(please provide full postal address in any case!)
Signature:

--------------------------cut here, print out and fill in ------------------
To the Congress and Tourismus Zentrale
Am Congress Centrum
D-97079 Wuerzburg
Fax: (+49) 931 37652

PLEASE NOTE: In case of room reservations the Tourist Office acts only as an agent. Your request should arrive early enough to allow us to accommodate you and send you a reply (allow about one week). For this reservation we will charge you DM 5,-, which you will have to pay with your hotel bill.

REF: Workshop on Neural Networks
Institut fuer Theoretische Physik
Universitaet Wuerzburg
Am Hubland
D-97074 Wuerzburg
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
I hereby reserve ....... single rooms ....... 
double rooms

LOCATION of the accommodation: [ ] inner city  [ ] city  [ ] suburbs

PRICE (per night, including breakfast)
                            single             double
room with bath/shower/WC    [ ] from DM 90,-   [ ] from DM 130,-
room with bath/shower/WC    [ ] from DM 130,-  [ ] from DM 180,-
room with bath/shower/WC    [ ] from DM 180,-  [ ] from DM 250,-

DATE of arrival ............. for ........... night(s)
TIME of arrival (approximately) .............. h by car/train
SIGNATURE and date:

SENDER: (print letters)
Company/institute:
Mrs/Mr:
Street:
Zip Code:
City:
Phone:
Fax:

From kruschke at croton.psych.indiana.edu Thu Aug 17 10:16:29 1995 From: kruschke at croton.psych.indiana.edu (John Kruschke) Date: Thu, 17 Aug 1995 09:16:29 -0500 (EST) Subject: report announcement: Five principles of category learning Message-ID: <9508171416.AA12437@croton.psych.indiana.edu> ========================================================== Kruschke, J. K. & Erickson, M. A. (to appear). Five principles for models of category learning. Invited chapter in: Z. Dienes (ed.), Connectionism and Human Learning. Oxford, England: Oxford University Press. ABSTRACT: The primary goal of this chapter is to report a connectionist model that integrates five principles of category learning previously implemented separately. Previous work by Kruschke (1995) modeled the generalization phase of the ``inverse base-rate effect'' (Medin and Edelson, 1988), but did not address performance in the learning phase. That work emphasized the principles of rapid attention shifts and consistent use of base rate knowledge. Subsequent work by Kruschke and Bradley (1995) addressed the learning phase of a simpler categorization task, and emphasized the principles of short-term memory and strategic guessing. The present chapter integrates principles from both previous reports, and applies the integrated model to both the learning and generalization phases of the inverse base-rate effect. 
PostScript for this chapter, and for the previous papers cited in the abstract, may be retrieved from the Research section of my Web page, which has the address (URL) listed below.

John K. Kruschke                 e-mail: kruschke at indiana.edu
Dept. of Psychology              office: (812) 855-3192
Indiana University               lab:    (812) 855-9613
Bloomington, IN 47405-1301 USA   fax:    (812) 855-4691
URL: http://silver.ucs.indiana.edu/~kruschke/home.html

========================================================== From bert at mbfys.kun.nl Thu Aug 17 11:05:13 1995 From: bert at mbfys.kun.nl (Bert Kappen) Date: Thu, 17 Aug 1995 17:05:13 +0200 Subject: Paper dynamic linking, paper active vision Message-ID: <199508171505.RAA14729@septimius.mbfys.kun.nl> The following two papers are available by anonymous FTP. DO NOT FORWARD TO OTHER GROUPS No hardcopies available Dynamic linking in Stochastic Networks Hilbert J. Kappen, Marcel J. Nijman To be presented at the International Conference on Brain Processes, Theories and Models to take place in Las Palmas de Gran Canaria, Spain, Nov 12-17, 1995. FTP-host: ftp.mbfys.kun.nl FTP-file: /snn/pub/reports/Kappen.Dyn.Link.ps.Z Abstract: It is well established that cortical neurons display synchronous firing for some stimuli and not for others. The resulting synchronous subpopulation of neurons is thought to form the basis of object perception. In this paper this 'binding' problem is formulated for Boltzmann Machines. Feed-forward connections implement feature detectors and lateral connections implement memory traces or cell assemblies. We show that dynamic linking can be solved in the Ising model where sensory input provides local evidence. The lateral connections in the hidden layer provide global correlations between features that belong to the same stimulus and no correlations between features from different stimuli. Learning Active Vision Hilbert J. Kappen, Marcel J. 
Nijman, Tonnie van Moorsel To be presented at ICANN'95, October 1995, Paris FTP-host: ftp.mbfys.kun.nl FTP-file: snn/pub/reports/Kappen.Active.Vision.ps.Z Abstract: In this paper we introduce a new type of problem which we call active decision. It consists of finding the optimal subsequent action, based both on partial observation and on previously learned knowledge. We propose a method of solution, based on Boltzmann Machine learning of joint input-output probabilities and on an entropy minimization criterion. We show how the method provides a basic mechanism for active vision tasks such as saccadic eye movements. From byuhas at ucs.att.com Thu Aug 17 15:14:18 1995 From: byuhas at ucs.att.com (byuhas@ucs.att.com) Date: Thu, 17 Aug 95 14:14:18 EST Subject: No subject Message-ID: <9507178086.AA808665603@netlink.ucs.att.com> ********************************************************************** ************ PLEASE DO NOT REPOST ************** ********************************************************************** JOB OPPORTUNITY: NEURAL NETWORK FINANCIAL APPLICATIONS An opportunity exists for someone interested in applying neural networks to problems in the credit-card industry, including: risk modeling, marketing, fraud detection and related areas. This neural network effort will run in parallel with two other groups: one using more traditional statistical approaches and the other using AI and expert systems. Our goal is to combine and compare these methods to achieve the best possible results. Many of the problems we are interested in have two characteristics:
1) two-class discrimination is often the primary goal, with one class being significantly more prevalent than the other
2) huge amounts of data are available for modeling
To be successful, candidates should have significant experience applying neural networks and be able to develop and implement novel algorithms, specialized error criteria and clever learning strategies to attack these problems. 
Qualifications of interest are:
* a Master's degree, with a Ph.D. preferred
* knowledge and experience with neural network modeling
* 'C' programming in a UNIX environment
* experience with SAS or S-Plus

Compensation is competitive if not generous. ATT-UCS is located about 12 miles from the beach, just south of Jacksonville, FL. (The #1 medium-size city in Money's best-places-to-live list, out this week.) If you have any questions or want to apply, you can contact me by:
email: byuhas at ucs.att.com
phone: 904-954-7376
fax: 904-954-8718
mail: Ben Yuhas, ATT-UCS, 8787 Baypine Road, Jacksonville, FL 32256

From esann at dice.ucl.ac.be Thu Aug 17 14:57:26 1995 From: esann at dice.ucl.ac.be (esann@dice.ucl.ac.be) Date: Thu, 17 Aug 1995 20:57:26 +0200 Subject: Neural Processing Letters Vol.2 No.4 Message-ID: <199508171855.UAA05881@ns1.dice.ucl.ac.be> Neural Processing Letters: new issue Vol.2 No.4 ----------------------------------------------- You will find enclosed the table of contents of the July 1995 issue of "Neural Processing Letters" (Vol.2 No.4). The abstracts of these papers are contained on the FTP and WWW servers mentioned below. We also inform you that subscription to the journal is now possible by credit card. 
All necessary information is contained on the following servers: - FTP server: ftp.dice.ucl.ac.be directory: /pub/neural-nets/NPL - WWW server: http://www.dice.ucl.ac.be/neural-nets/NPL/NPL.html If you have no access to these servers, or for any other information (subscriptions, instructions for authors, free sample copies,...), please don't hesitate to contact the publisher directly: D facto publications 45 rue Masui B-1210 Brussels Belgium Phone: + 32 2 245 43 63 Fax: + 32 2 245 46 94

Neural Processing Letters, Vol.2, No.4, July 1995
_________________________________________________
- Evolving neural networks with iterative learning scheme for associative memory
  Shigetaka Fujita, Haruhiko Nishimura
- Combining Singular-Spectrum Analysis and neural networks for time series forecasting
  F. Lisi, O. Nicolis, Marco Sandri
- A simplified MMC model for the control of an arm with redundant degrees of freedom
  U. Steinkühler, W.-J. Beyn, H. Cruse
- On the statistical physics of radial basis function networks
  Sean B. Holden, Mahesan Niranjan
- Accelerated training algorithm for feedforward neural networks based on least squares method
  Y.F. Yam and Tommy W.S. Chow
- On the search for new learning rules for ANNs
  Samy Bengio, Yoshua Bengio, Jocelyn Cloutier
_____________________________
D facto publications - conference services 45 rue Masui 1210 Brussels Belgium tel: +32 2 245 43 63 fax: +32 2 245 46 94
_____________________________
From mkearns at research.att.com Thu Aug 17 16:13:00 1995 From: mkearns at research.att.com (Michael J. 
Kearns) Date: Thu, 17 Aug 95 16:13 EDT Subject: COLT '96 Call for Papers Message-ID: ______________________________________________________________________ PRELIMINARY CALL FOR PAPERS---COLT '96 Ninth Conference on Computational Learning Theory Desenzano del Garda, Italy June 28 -- July 1, 1996 ______________________________________________________________________ The Ninth Conference on Computational Learning Theory (COLT '96) will be held in the town of Desenzano del Garda, Italy, from Friday, June 28, through Monday, July 1, 1996. COLT '96 is sponsored by the Universita` degli Studi di Milano. We invite papers in all areas that relate directly to the analysis of learning algorithms and the theory of machine learning, including neural networks, statistics, statistical physics, Bayesian/MDL estimation, reinforcement learning, inductive inference, knowledge discovery in databases, robotics, and pattern recognition. We also encourage the submission of papers describing experimental results that are supported by theoretical analysis. ABSTRACT SUBMISSION. Authors should submit fifteen copies (preferably two-sided) of an extended abstract to:

Michael Kearns --- COLT '96
AT&T Bell Laboratories, Room 2A-423
600 Mountain Avenue
Murray Hill, New Jersey 07974-0636
Telephone (for overnight mail): (908) 582-4017

Abstracts must be RECEIVED by FRIDAY JANUARY 12, 1996. This deadline is firm. We also anticipate allowing electronic submissions. Details of the electronic submission procedure will be provided in an updated Call for Papers, and will be available from the web site http://www.cs.cmu.edu/~avrim/colt96.html which will also be used to provide other program-related information. Authors will be notified of acceptance or rejection on or before Friday, March 15, 1996. Final camera-ready papers will be due by Friday, April 5. Papers that have appeared in journals or other conferences, or that are being submitted to other conferences, are not appropriate for submission to COLT. 
An exception to this policy is that COLT and STOC have agreed that a paper can be submitted to both conferences, with the understanding that a paper will be automatically withdrawn from COLT if accepted to STOC. ABSTRACT FORMAT. The extended abstract should include a clear definition of the theoretical model used and a clear description of the results, as well as a discussion of their significance, including comparison to other work. Proofs or proof sketches should be included. If the abstract exceeds 10 pages, only the first 10 pages may be examined. A cover letter specifying the contact author and his or her email address should accompany the abstract. PROGRAM FORMAT. At the discretion of the program committee, the program may consist of both long and short talks, corresponding to longer and shorter papers in the proceedings. The short talks will also be coupled with a poster presentation. PROGRAM CHAIRS. Avrim Blum (Carnegie Mellon University) and Michael Kearns (AT&T Bell Laboratories). CONFERENCE AND LOCAL ARRANGEMENTS CHAIRS. Nicolo` Cesa-Bianchi (Universita` di Milano) and Giancarlo Mauri (Universita` di Milano). PROGRAM COMMITTEE. Martin Anthony (London School of Economics), Avrim Blum (Carnegie Mellon University), Bill Gasarch (University of Maryland), Lisa Hellerstein (Northwestern University), Robert Holte (University of Ottawa), Sanjay Jain (National University of Singapore), Michael Kearns (AT&T Bell Laboratories), Nick Littlestone (NEC Research Institute), Yishay Mansour (Tel Aviv University), Steve Omohundro (NEC Research Institute), Manfred Opper (University of Wuerzburg), Lenny Pitt (University of Illinois), Dana Ron (Massachusetts Institute of Technology), Rich Sutton (University of Massachusetts) COLT, ML, AND EUROCOLT. The Thirteenth International Conference on Machine Learning (ML '96) will be held right after COLT '96, on July 3--7 in Bari, Italy. In cooperation with COLT, the EuroCOLT conference will not be held in 1996. STUDENT TRAVEL. 
We anticipate some funds will be available to partially support travel by student authors. Details will be distributed as they become available. From lorincz at iserv.iki.kfki.hu Fri Aug 18 05:22:36 1995 From: lorincz at iserv.iki.kfki.hu (Andras Lorincz) Date: Fri, 18 Aug 1995 11:22:36 +0200 Subject: No subject Message-ID: <9508180922.AA17280@iserv.iki.kfki.hu> The following recent publications of the Adaptive Systems Lab of the Attila Jozsef University of Szeged and the Hungarian Academy of Sciences can be accessed on WWW via the URL http://iserv.iki.kfki.hu/asl-publs.html and http://iserv.iki.kfki.hu/qua-publs.html. Olah, M. and Lorincz, A. (1995) Analog VLSI Implementation of Grassfire Transformation for Generalized Skeleton Formation ICANN'95 Paris, accepted Abstract A novel analog VLSI circuit is proposed that implements the grassfire transformation for calculating the generalized skeleton of a planar shape. The fully parallel VLSI circuit can perform the calculation in real time. The algorithm, based on an activation spreading process on an artificial neural network, can be implemented by an extended nonlinear resistive network. Architecture and building blocks are outlined, and the feasibility of this technique is investigated. Szepesvari, Cs. (1995) General Framework for Reinforcement Learning, ICANN'95 Paris, accepted Abstract We set up a general framework for the investigation of decision processes. The framework is based on the so-called one-step look-ahead (OLA) cost mapping. The cost of a policy is defined by the successive application of the OLA mapping. In this way various decision criteria (e.g. the expected-value criterion or the worst-case criterion) can be treated in a unified way. The main theorem of this article says that under minimal conditions optimal stationary policies are greedy w.r.t. the optimal cost function and vice versa. 
Based on this result we hope that previous results on reinforcement learning can be generalized to other decision criteria that fit the proposed framework. Marczell, Zs., Kalmar, Zs. and Lorincz, A. (1995) Generalized skeleton formation for texture segmentation Neural Network World, accepted Abstract An algorithm and an artificial neural architecture that approximates the algorithm are proposed for the formation of generalized skeleton transformations. The algorithm includes the original grassfire proposal of Blum and is extended with an integrative on-center off-surround detector system. It is shown that the algorithm can elicit textons by skeletonization. Slight modification of the architecture corresponds to the Laplace transformation followed by full-wave rectification, another algorithm for texture discrimination proposed by Bergen and Adelson. Szepesvari, Cs. (1995) Perfect Dynamics for Neural Networks Talk presented at Mathematics of Neural Networks and Applications Lady Margaret Hall, Oxford, July, 1995 Abstract The vast majority of artificial neural network (ANN) algorithms may be viewed as massively parallel, non-linear numerical procedures that solve certain kinds of fixed-point equations in an iterative manner. We consider the use of such recurrent ANNs to solve optimization problems. The dynamics of such networks is often based on the minimization of an energy-like function. An important (and presumably hard) problem is to exclude the possibility of "spurious local minima". In this article we take another starting point, namely to consider perfect dynamics. We say that a recurrent ANN admits perfect dynamics if the dynamical system given by the update operator of the network has an attractor whose basin of attraction covers the set of all possible initial solution candidates. One may wonder whether neural networks that admit perfect dynamics can be interesting in applications. 
In this article we show that there exists a family of such networks (or dynamics). We introduce Generalised Dynamic Programming (GenDyP) to govern the dynamics. Roughly speaking, a GenDyP problem is a 3-tuple (X,A,Q), where X and A are arbitrary sets and Q maps functions of X into functions of X x A. GenDyP problems derive from sequential decision problems and dynamic programming (DP). Traditional DP procedures correspond to special selections of the mapping Q. We show that if Q is monotone and satisfies other reasonable (continuity and boundedness) conditions, then the above iteration converges to a distinguished solution of the functional equation u(x)=inf_a (Q u)(x,a), which is the generalization of the well-known Bellman Optimality Equation. The proof relies on the relation between the above dynamics and the optimal solutions of generalized multi-stage decision problems (we give the definitions in the next section). Kalmar, Zs., Szepesvari, Cs. and Lorincz, A. (1995) Generalized Dynamic Concept Model as a Route to Construct Adaptive Autonomous Agents Neural Network World 3:353--360 Abstract A model of adaptive autonomous agents that (i) builds internal representation of events and event relations, (ii) utilizes activation spreading for building dynamic concepts and (iii) makes use of the winner-take-all paradigm to come to a decision is extended by introducing generalization into the model. The generalization reduces memory requirements and improves performance in unseen scenes, as indicated by computer simulations. J.Toth, G., Kovacs, Sz. and Lorincz, A. (1995) Genetic algorithm with alphabet optimization Biological Cybernetics 73:61-68 Abstract In recent years the genetic algorithm (GA) has been used successfully to solve many optimization problems. One of the most difficult questions of applying GA to a particular problem is that of coding. In this paper a scheme is derived to optimize one aspect of the coding in an automatic fashion. 
This is done by using a high-cardinality alphabet and optimizing the meaning of the letters. The scheme is especially well suited in cases where a number of similar problems need to be solved. The use of the scheme is demonstrated on such a group of problems: the simplified problem of navigating a `robot' in a `room'. It is shown that for the sample problem family the proposed algorithm is superior to the canonical GA. J.Toth, G. and Lorincz, A. (1995) Genetic algorithm with migration on topology conserving maps Neural Network World 2:171--181 Abstract The genetic algorithm (GA) is extended to solve a family of optimization problems in a self-organizing fashion. The continuous world of inputs is discretized in an optimal fashion with the help of a topology conserving neural network. The GA is generalized to organize individuals into subpopulations associated with the neurons. Interneuron topology connections are used to allow gene migration to neighboring sites. The method speeds up the GA by allowing small subpopulations, while still preserving diversity with the help of migration. Within a subpopulation the original GA was applied as the means of evolution. To illustrate this modified GA, the optimal control of a simulated robot-arm is treated: a falling ping-pong ball has to be caught by a bat without bouncing. It is demonstrated that the simultaneous optimization for an interval of height can be solved, and that migration can considerably reduce computation time. Other aspects of the algorithm are outlined. Amstrup, B., J.Toth, G., Szabo, G., Rabitz, H., and Lorincz, A. 
(1995) Genetic algorithm with migration on topology conserving maps for optimal control of quantum systems Journal of Physical Chemistry, 99, 5206-5213 Abstract The laboratory implementation of molecular optimal control has to overcome the problem caused by changing environmental parameters, such as the temperature of the laser rod, the resonator parameters, the mechanical parameters of the laboratory equipment, and other dependent parameters such as the time delay between pulses or the pulse amplitudes. In this paper a solution is proposed: instead of trying to set the parameter(s) with very high precision, their changes are monitored and the control is adjusted to the current values. The optimization in the laboratory can then be run at several values of the parameter(s) with an extended genetic algorithm (GA) which is tailored to such parametric optimization. The extended GA does not presuppose, but can take advantage of and in fact explores, whether the mapping from the parameter(s) to the optimal control field is continuous. Then the optimization for the different values of the parameter(s) is done cooperatively, which reduces the optimization time. A further advantage of the method is its full adaptiveness; i.e., in the best circumstances no information on the system or laboratory equipment is required, and only the success of the control needs to be measured. The method is demonstrated on a model problem: a pump-and-dump type model experiment on CsI. The WWW pages also contain pointers to other older publications that are available on line. Papers are also available by anonymous ftp through iserv.iki.kfki.hu/pub/papers Best regards, Andras Lorincz Department of Adaptive Systems also Department of Photophysics Attila Jozsef University of Szeged Institute of Isotopes Dom ter 9 Hungarian Academy of Sciences Szeged Konkoly-Thege 29-33 Hungary, H-6720 Budapest, P.O.B. 
77 Hungary, H-1525 email: lorincz at iserv.iki.kfki.hu From wgm at santafe.edu Fri Aug 18 13:42:01 1995 From: wgm at santafe.edu (wgm@santafe.edu) Date: Fri, 18 Aug 95 11:42:01 MDT Subject: No subject Message-ID: <9508181742.AA00176@yaqui> In response to the recent postings of Kevin Lang and Bill Macready and David Wolpert, Barak Pearlmutter writes: >>> But this point is somewhat moot, because Kevin Lang's recent post was not claiming that "his" "new" algorithm was better than current algorithms. >>> Nor did we ever assume he was making this claim. We simply wanted to point to our work that resulted from many of the same motivations and investigates such issues from the most general perspective. >>> Lang published a well reasoned and careful scientific paper, which cast doubt on the practicality and importance of a particular algorithm (GP) by showing that on a very simple problem *taken from the GP book* it compares unfavorably to a silly strawman algorithm. This paper passed scientific muster, and was accepted into ML, a respected refereed conference. >>> Again we want to stress we were not attacking Lang's work *at all*. We couldn't agree more that the GP community needs more comparisons of its results to other techniques. For those interested in such comparisons I would also point to the papers of Una-May O'Reilly. They can be found at http://www.santafe.edu/sfi/publications/94wplist.html *** As regards No Free Lunch issues in optimization and supervised learning, >>> It might be true that in a universe where everything was equally likely, all search algorithms would be equally awful. >>> It is NOT true that "in a universe where everything was equally likely, all search algorithms would be equally awful." For example, there are major head-to-head minimax distinctions between algorithms. (The true magnitude of those distinctions is coming to light in the work with our student we mentioned in our previous post.) This is a crucially important point. 
There is a huge amount of structure distinguishing algorithms even in "a universe where everything was equally likely". All that would be "equally awful" in such a universe is expected performance. And of course in supervised learning there are other major a priori distinctions between algorithms. E.g., for quadratic loss, if you give me a learning algorithm with *any* random component (backprop with random initial weights anyone?), I can always come up with an algorithm with assuredly (!) superior performance, *independent of the target*. (And therefore independent of the "universe that we live in".) Bill Macready and David Wolpert From bap at scr.siemens.com Fri Aug 18 14:45:14 1995 From: bap at scr.siemens.com (Barak Pearlmutter) Date: Fri, 18 Aug 1995 14:45:14 -0400 Subject: paper announcement Message-ID: <199508181845.OAA22582@gull.scr.siemens.com> The following paper, to appear in Neural Computation, is available via ftp to archive.cis.ohio-state.edu:/pub/neuroprose/pearlmutter.vcdifn.ps.Z.

VC Dimension of an Integrate-and-Fire Neuron Model

Anthony M. Zador               Barak A. Pearlmutter
Salk Institute                 Siemens Corporate Research
10010 N. Torrey Pines Rd.      755 College Road East
La Jolla, CA 92037             Princeton, NJ 08540
zador at salk.edu             bap at scr.siemens.com

ABSTRACT We compute the VC dimension of a leaky integrate-and-fire neuron model. The VC dimension quantifies the ability of a function class to partition an input pattern space, and can be considered a measure of computational capacity. In this case, the function class is the class of integrate-and-fire models generated by varying the integration time constant and the threshold, the input space they partition is the space of continuous-time signals, and the binary partition is specified by whether or not the model reaches threshold at some specified time. We show that the VC dimension diverges only logarithmically with the input signal bandwidth. We also extend this approach to arbitrary passive dendritic trees. 
The main contributions of this work are (1) it offers a formal treatment of the computational capacity of a dynamical system; and (2) it provides a framework for analyzing the computational capabilities of the dynamical systems defined by networks of real neurons. ---------------------------------------------------------------- Thanks to Jordan Pollack for maintaining the neuroprose archive. From terry at salk.edu Sat Aug 19 15:04:51 1995 From: terry at salk.edu (Terry Sejnowski) Date: Sat, 19 Aug 95 12:04:51 PDT Subject: Neural Computation 7:5 Message-ID: <9508191904.AA10190@salk.edu> NEURAL COMPUTATION September 1995 Volume 7 Number 5 Review: Methods for Combining Experts' Probability Assessments Robert A. Jacobs Letters: The Helmholtz Machine Peter Dayan, Geoffrey E. Hinton, Radford M. Neal, and Richard S. Zemel Spontaneous Excitations in the Visual Cortex: Stripes, Spirals, Rings and Collective Bursts Corinna Fohlmeister, Wulfram Gerstner, Raphael Ritz, and J. Leo van Hemmen Time-Domain Solutions of Oja's Equations J. L. Wyatt, Jr. and I. M. Elfadel Learning the Initial State of a Second-Order Recurrent Neural Network during Regular-Language Inference Mikel L. Forcada and Rafael C. Carrasco An Algebraic Framework to Represent Finite State Machines in Single-Layer Recurrent Neural Networks R. Alquezar and A. Sanfeliu Local And Global Optimization Algorithms For Generalized Learning Automata V. V. Phansalkar and M. A. L. Thathachar From lbl at nagoya.bmc.riken.go.jp Sat Aug 19 23:32:26 1995 From: lbl at nagoya.bmc.riken.go.jp (Bao-Liang Lu) Date: Sun, 20 Aug 1995 12:32:26 +0900 Subject: Paper available: Inverse Kinematics via Network Inversions Message-ID: <9508200332.AA08838@xian> The following paper, to appear in Proc. of IEEE ICNN'95, Perth, Australia, is available via FTP.
FTP-host:ftp.bmc.riken.go.jp FTP-file:pub/publish/Lu/lu-ieee-icnn95.ps.Z ========================================================================== TITLE: Regularization of Inverse Kinematics for Redundant Manipulators Using Neural Network Inversions AUTHORS: Bao-Liang Lu (1) Koji Ito (1,2) ORGANISATIONS: (1) The Institute of Physical and Chemical Research (2) Toyohashi University of Technology ABSTRACT: This paper presents a new approach to regularizing the inverse kinematics problem for redundant manipulators using neural network inversions. This approach is a four-phase procedure. In the first phase, the configuration space and associated workspace are partitioned into a set of regions. In the second phase, a set of modular neural networks is trained on associated training data sets sampled over these regions to learn the forward kinematic function. In the third phase, the multiple inverse kinematic solutions for a desired end-effector position are obtained by inverting the corresponding modular neural networks. In the fourth phase, an ``optimal'' inverse kinematic solution is selected from the multiple solutions according to a given criterion. An important feature of this approach, in comparison with existing methods, is that it finds both inverse kinematic solutions lying on different solution branches and multiple solutions belonging to the same branch. As a result, the manipulator can be controlled better with the optimal solution than with an ordinary one. This approach is illustrated with a three-joint planar arm. (6 pages. No hard copies available.)
Bao-Liang Lu --------------------------------------------- Bio-Mimetic Control Research Center, The Institute of Physical and Chemical Research (RIKEN) 3-8-31 Rokuban, Atsuta-ku, Nagoya 456, Japan Phone: +81-52-654-9137 Fax: +81-52-654-9138 Email: lbl at nagoya.bmc.riken.go.jp From M.Q.Brown at ecs.soton.ac.uk Mon Aug 21 12:05:09 1995 From: M.Q.Brown at ecs.soton.ac.uk (martin brown) Date: Mon, 21 Aug 1995 17:05:09 +0100 Subject: ISIS web entry Message-ID: <17225.199508211605@diana.ecs.soton.ac.uk> The Image, Speech and Intelligent Systems (ISIS) research group now has an entry at: http://www-isis.ecs.soton.ac.uk/ We've been doing research into many different aspects of neural and neurofuzzy theory for the past 6 years, especially as applied to classification, identification, and control theory. A lot of work has been done on neurofuzzy networks and the development of automatic construction algorithms. These pages include: 1) Publication details together with abstracts. 2) Project details. 3) Personnel details. 4) Neurofuzzy information service (conferences, software, other homepages etc.) Martin Brown ------------------------------------------------------ ISIS research group, Email: mqb at ecs.soton.ac.uk Room 3025, Zepler Building, Tel: +44 (0)1703 594984 Dept.
of Electronics and Computer Science, Fax: +44 (0)1703 594498 University of Southampton, WWW: http://www-isis.ecs.soton.ac.uk/ Highfield, Southampton, SO17 1BJ, UK From charles at playfair.Stanford.EDU Tue Aug 22 13:47:05 1995 From: charles at playfair.Stanford.EDU (charles@playfair.Stanford.EDU) Date: Tue, 22 Aug 95 10:47:05 -0700 Subject: Paper available: Visualization of High-Dimensional Functions Message-ID: <199508221747.KAA06364@tukey.Stanford.EDU> The following doctoral dissertation (168 pages) is now available electronically: ftp://playfair.stanford.edu/pub/roosen/thesis.ps.Z A short (6 page) proceedings paper on the same material is also available: ftp://playfair.stanford.edu/pub/roosen/asa95.ps.Z Charles Roosen charles at playfair.stanford.edu Department of Statistics http://playfair.stanford.edu/~roosen/ Stanford University Title ----- Visualization and Exploration of High-Dimensional Functions Using the Functional ANOVA Decomposition Abstract -------- In recent years the statistical and engineering communities have developed many high-dimensional methods for regression (e.g. MARS, feedforward neural networks, projection pursuit). Users of these methods often wish to explore how particular predictors affect the response. One way to do so is by decomposing the model into low-order components through a functional ANOVA decomposition and then visualizing the components. Such a decomposition, with the corresponding variance decomposition, also provides information on the importance of each predictor to the model, the importance of interactions, and the degree to which the model may be represented by first- and second-order components. This manuscript develops techniques for constructing and exploring such a decomposition.
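As a sketch of the decomposition the abstract refers to, written in the notation standard for functional ANOVA (the thesis's exact conventions may differ): the fitted model is expanded into a constant, main effects, and interactions, and, under suitable orthogonality (side) conditions on the components, the variance splits the same way:

```latex
f(x_1,\dots,x_d) = f_0 + \sum_{j} f_j(x_j) + \sum_{j<k} f_{jk}(x_j, x_k) + \cdots ,
\qquad
\operatorname{Var} f = \sum_{j} \operatorname{Var} f_j
 + \sum_{j<k} \operatorname{Var} f_{jk} + \cdots
```

The variance terms are what give the "importance of each predictor" and "importance of interactions" mentioned above, and truncating after the second sum measures how well first- and second-order components represent the model.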
It begins by suggesting techniques for constructing the decomposition either numerically or analytically, proceeds to describe approaches to plotting and interpreting effects, and then develops methodology for rough inference and model selection. Extensions to the GLM framework are discussed briefly. From mw at isds.Duke.EDU Tue Aug 22 15:30:51 1995 From: mw at isds.Duke.EDU (Mike West) Date: Tue, 22 Aug 1995 15:30:51 -0400 Subject: No subject Message-ID: 1996 JOINT STATISTICAL MEETINGS Chicago, August 4-8, 1996 ********************************************** ASA Section on BAYESIAN STATISTICAL SCIENCES ********************************************** CALL FOR PROPOSALS: SPECIAL CONTRIBUTED PAPER SESSIONS This is an initial call for proposals for Special Contributed Paper Sessions for the Section on Bayesian Statistical Sciences program at the 1996 Joint Meetings. We already have several proposals, at least in embryo, and are looking for many more. Please consider putting together a session, and alert others who may be interested in organising a session. We have a few months before formal sessions, tentative titles, etc., are needed, but it is not too soon to begin to think about session themes and to talk to potential participants. Those of you who have already contacted me with ideas should now be going the next step to confirm your topics and speakers, and let me have the revised session proposal. At this stage, suggestions and ideas for sessions need not identify a full list of speakers and discussants, but you should provide a general idea of the topic and focus, and some names at least tentatively agreed. The ASA theme for the 1996 meetings is "Challenging the Frontiers of Knowledge Using Statistical Science" and is intended to highlight new statistical developments at the forefront of the discipline -- theory, methods, applications, and cross-disciplinary activities. Accordingly, ASA will be designating selected sessions as "Theme" sessions. 
Suggestions for SBSS sessions obviously in tune with this theme, involving topics of real novelty and importance, new directions of development in Bayesian statistics, and reflecting the current vibrancy of the discipline, are particularly encouraged. And remember that other ASA sections can co-sponsor sessions; SBSS will be vigorously seeking co-sponsorship in many cases. *SPECIAL REQUEST* The Chicago meetings will likely attract well over 4,000 participants. Many will not be seriously attracted by many of the Bayesian talks, not because of Bayesian emphases, but because of technical/mathematical level and orientation. For example, there will be many participants from various corners of industry/commerce/government that do not have PhDs. We would like to reach these kinds of people. One way of attempting this is to have one or more sessions more or less targeted at the practising "applied" statistician and at a lower technical level than typical. One possibility is a panel discussion, involving participants with strong industrial/consulting/non-academic backgrounds who will discuss issues and experiences of practical Bayesian statistics in their areas. A variant would have several speakers making "case study" presentations. Your inputs are solicited. FORMAT: A Special Contributed Paper Session typically involves a collection of either five short talks (20 mins), or four talks plus a discussion, on a specific theme. One alternative is to have a panel discussion involving up to five panelists. All sessions need a chair. Please contact me with suggestions and ideas for themes and speakers.
Mike West, 1996 SBSS Program Chair ISDS, Duke University, Durham, NC 27708-0251 mw at isds.duke.edu http://www.isds.duke.edu From hochreit at informatik.tu-muenchen.de Wed Aug 23 05:15:10 1995 From: hochreit at informatik.tu-muenchen.de (Josef Hochreiter) Date: Wed, 23 Aug 1995 11:15:10 +0200 Subject: TR announcement: Long Short Term Memory Message-ID: <95Aug23.111512+0200_met_dst.116236+322@papa.informatik.tu-muenchen.de> FTP-host: flop.informatik.tu-muenchen.de (131.159.8.35) FTP-filename: /pub/fki/fki-207-95.ps.gz or FTP-host: fava.idsia.ch (192.132.252.1) FTP-filename: /pub/juergen/fki-207-95.ps.gz or something like netscape http://www.idsia.ch/~juergen LONG SHORT TERM MEMORY Technical Report FKI-207-95 (8 pages, 50 K) Sepp Hochreiter Juergen Schmidhuber Fakultaet fuer Informatik IDSIA Technische Universitaet Muenchen Corso Elvezia 36 80290 Muenchen, Germany 6900 Lugano, Switzerland ``Recurrent backprop'' for learning to store information over extended time periods takes too long. The main reason is insufficient, decaying error back flow. We describe a novel, efficient ``Long Short Term Memory'' (LSTM) that overcomes this and related problems. Unlike previous approaches, LSTM can learn to bridge arbitrary time lags by enforcing constant error flow. Using gradient descent, LSTM explicitly learns when to store information and when to access it. In experimental comparisons with ``Real-Time Recurrent Learning'', ``Recurrent Cascade-Correlation'', ``Elman nets'', and ``Neural Sequence Chunking'', LSTM leads to many more successful runs, and learns much faster. Unlike its competitors, LSTM can solve tasks involving minimal time lags of more than 1000 time steps, even in noisy environments. If you don't have gzip/gunzip, we can mail you an uncompressed postscript version (as a last resort). Comments welcome. 
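For readers who want a concrete picture of the "constant error flow" the abstract describes, here is a minimal NumPy sketch of a gated memory-cell layer. The dimensions and weights are invented for illustration, and the TR's exact architecture may differ from this simplification; the key idea shown is that the cell state is updated purely additively (c <- c + i*g), so it is never repeatedly squashed or decayed, while the input and output gates learn when to store and when to access the stored information.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical sizes and randomly initialised weights (illustration only).
n_in, n_hid = 4, 8
Wg, Wi, Wo = (rng.normal(0, 0.1, (n_hid, n_in + n_hid)) for _ in range(3))

def lstm_step(x, h, c):
    """One step of a memory-cell layer in the spirit of the TR."""
    z = np.concatenate([x, h])
    g = np.tanh(Wg @ z)      # candidate content to store
    i = sigmoid(Wi @ z)      # input gate: when to store
    o = sigmoid(Wo @ z)      # output gate: when to access
    c = c + i * g            # additive "carousel": no squashing of the state,
                             # so error flowing back through c is not attenuated
    h = o * np.tanh(c)
    return h, c

# Run across a long lag: the cell state persists over 1000 steps.
h, c = np.zeros(n_hid), np.zeros(n_hid)
for t in range(1000):
    x = rng.normal(size=n_in)
    h, c = lstm_step(x, h, c)
```

Contrast this with an ordinary recurrent unit, where the state passes through a squashing nonlinearity (and its derivative) at every step, which is exactly the decaying error back flow the abstract blames for the failure on long time lags.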
Sepp Hochreiter Juergen Schmidhuber From Voz at dice.ucl.ac.be Wed Aug 23 12:58:09 1995 From: Voz at dice.ucl.ac.be (Jean-Luc Voz) Date: Wed, 23 Aug 1995 18:58:09 +0200 Subject: ELENA Classification databases and technical reports available Message-ID: <199508231658.SAA11205@ns1.dice.ucl.ac.be> Dear colleagues, The partners of the Elena project are pleased to announce the availability of several databases related to classification, together with two technical reports. ELENA is an ESPRIT III Basic Research Action project (No. 6891) From S.Renals at dcs.shef.ac.uk Wed Aug 23 14:11:06 1995 From: S.Renals at dcs.shef.ac.uk (S.Renals@dcs.shef.ac.uk) Date: Wed, 23 Aug 1995 19:11:06 +0100 Subject: AbbotDemo: Continuous Speech Recognition Available by FTP Message-ID: <199508231811.TAA11146@elvis.dcs.shef.ac.uk> AbbotDemo is a near-real-time speaker-independent continuous speech recognition system for American and British accented English. The vocabulary size of this demonstration system is 5,000 words. The system uses a hybrid recurrent network/hidden Markov model acoustic model and a trigram language model. The recurrent network (which contains around 100,000 weights) was trained as a phone probability estimator using back-propagation through time. It is available by FTP from svr-ftp.eng.cam.ac.uk in directory /pub/comp.speech/binaries. The file AbbotDemo.README gives more information on this system and the remainder of the files provide the executables for various flavours of UNIX (Linux, SunOS4, HP-UX, IRIX). A 16-bit soundcard and a reasonable microphone are required. The Linux version is also available by FTP from sunsite.unc.edu (and mirrors) in directory /pub/Linux/apps/sound/speech. Sorry, but at this stage no sources and only limited documentation are provided. Although the task domain is focused on noise-free read speech from a North American business newspaper (e.g.
the Wall Street Journal), we hope that this system provides a fair representation of the state of the art in large vocabulary speech recognition and that it will encourage the creation of novel applications. Tony Robinson (Cambridge University) Mike Hochberg (Cambridge University) Steve Renals (Sheffield University) and many many more. AbbotDemo URLs: ftp://svr-ftp.eng.cam.ac.uk/pub/comp.speech/binaries/ ftp://sunsite.unc.edu/pub/Linux/apps/sound/speech/ From wahba at stat.wisc.edu Wed Aug 23 15:19:00 1995 From: wahba at stat.wisc.edu (Grace Wahba) Date: Wed, 23 Aug 95 14:19:00 -0500 Subject: spline/ai/met paper ancts Message-ID: <9508231919.AA12253@hera.stat.wisc.edu> Announcing new/revised manuscripts available on the web .... http://www.stat.wisc.edu/~wahba click on `TRLIST' also available by ftp in files listed at the end of this msg Luo, Z. and Wahba, G. "Hybrid Adaptive Splines" TR 947, June 1995, submitted. --- Combines ideas from smoothing splines and MARS to obtain spatial adaptivity, numerical efficiency. Numerical comparisons to wavelet simulations in Donoho et al, JRSSB 57, 1995. Approach parallels Orr, Neural Computation 7, 1995, which we just became aware of -- similar results on a different class of examples --. Wahba, G., Wang, Y., Gu, C., Klein, R. and Klein, B. "Smoothing Spline ANOVA for Exponential Families, with Application to the Wisconsin Epidemiological Study of Diabetic Retinopathy." May 1995, to appear, Annals of Statistics. --Expanded and *slightly revised* version of TR 940, December 1994. This paper was the basis for the Neyman Lecture given at the Annual Meeting of the Institute of Mathematical Statistics at Chapel Hill 1994, delivered by the first author. Collects results from SS-ANOVA, algorithm for choosing multiple smoothing parameters for Bernoulli data, implementing confidence intervals.
Applies results to the estimation of four-year risk of progression of diabetic retinopathy, using data from the Wisconsin Epidemiological Study of Diabetic Retinopathy. (data available in GRKPACK documentation above). ***A discussion of the backfitting algorithm and SS-ANOVA has been added***--. Xiang, D. and Wahba, G. "A Generalized Approximate Cross Validation for Smoothing Splines with Non-Gaussian Data." TR 930, September 1994, submitted. --Another look at choosing the regularization parameter for Bernoulli data. Parallels work of Moody, Liu in the NN literature. Wahba, G., Johnson, D. R., Gao, F. and Gong, J. "Adaptive tuning of numerical weather prediction models: Part I: randomized GCV and related methods in three and four dimensional data assimilation." TR 920, April 1994. *revised and shortened* version to appear, Monthly Weather Review --describes how to use the randomized trace method to implement GCV in very large problems in numerical weather prediction, approximate solution of pde's with discrete noisy observations. Also available by ftp in gzipped postscript in ftp.stat.wisc.edu/pub/wahba in the following files, respectively has.ps.gz exptl.ssanova.rev.ps.gz gacv.ps.gz tuning-nwp.rev.ps.gz From jstrout at UCSD.EDU Thu Aug 24 11:13:03 1995 From: jstrout at UCSD.EDU (Joseph Strout) Date: Thu, 24 Aug 1995 08:13:03 -0700 (PDT) Subject: Neuron Emulation List (ANNOUNCEMENT) Message-ID: ANNOUNCEMENT: Neuron Emulation List ----------------------------------- This message is to announce the opening of a new mailing list. Its purpose is the discussion of "neuron emulation" -- that is, the endeavor to recreate (in hardware or software) the functional properties of a particular neuron or small network of neurons. This is in contrast to the more common neural model, which recreates the general properties of an entire class of cells. Upon subscribing to the list, you will receive two copies of a document which is now out of date.
The revised versions will probably not be posted until next week, due to personnel difficulties [read: postmaster on vacation]. However, the latest versions can be obtained from the Web site (below). Also at the web site is an archive of previous messages; the list has been open privately for a few weeks while we worked out bugs with the listserver. It seems to be running smoothly now, and as interest has been spreading, it seemed prudent to make the formal announcement. The NEL Web Site can be accessed through the following URL: http://sdcc3.ucsd.edu/~jstrout/nel/ Full instructions on using the list server, as well as etiquette guidelines and a preliminary FAQ, are available at that site. If you do not have access to the World Wide Web, contact me directly and I will forward you the relevant files. ,------------------------------------------------------------------. | Joseph J. Strout Department of Neuroscience, UCSD | | jstrout at ucsd.edu http://sdcc3.ucsd.edu/~jstrout/ | `------------------------------------------------------------------' From steven.young at psy.ox.ac.uk Fri Aug 25 09:01:24 1995 From: steven.young at psy.ox.ac.uk (Steven Young) Date: Fri, 25 Aug 1995 14:01:24 +0100 (BST) Subject: Oxford Summer School on Connectionist Modelling Message-ID: <199508251301.OAA00656@cogsci1.psych.ox.ac.uk> This is a last-minute call for participation for the Oxford Summer School on Connectionist Modelling. (This is extremely short notice. Apologies) Full details are given below. There are only 3 places left for this year's Summer School. The Summer School is a 2-week residential summer school which provides an introduction to connectionist modelling. Please pass on this information to people you may know who would be interested. The initial call for participation follows, and further information is available via the World Wide Web at the following URL: http://cogsci1.psych.ox.ac.uk/summer-school/ (Please ignore the closing date in the initial call for participation.
If you know of people who might wish to attend then they could contact Sue King via email or phone -- details below). Here is the initial call for participation: Oxford Summer School on Connectionist Modelling Department of Experimental Psychology, University of Oxford 10-22 September 1995 Applications are invited for participation in a 2-week residential Summer School on techniques in connectionist modelling of cognitive and biological phenomena. The course is aimed primarily at researchers who wish to exploit neural network models in their teaching and/or research. It will provide a general introduction to connectionist modelling through lectures and exercises on Power PCs. The instructors with primary responsibility for teaching the course are Kim Plunkett and Edmund Rolls. No prior knowledge of computational modelling will be required though simple word processing skills will be assumed. Participants will be encouraged to start work on their own modelling projects during the Summer School. The cost of participation in the Summer School is 500 to include accommodation (bed and breakfast at St. John's College) and registration. Participants will be expected to cover their own travel and meal costs. A small number of graduate student scholarships may be available. Applicants should indicate whether they wish to be considered for a graduate student scholarship but are advised to seek their own funding as well, since in previous years the number of graduate student applications has far exceeded the number of scholarships available. If you are interested in participating in the Summer School, please contact: Mrs Sue King Department of Experimental Psychology South Parks Road University of Oxford Oxford OX1 3UD Tel: +44 (1865) 271 353 Email: sking at psy.ox.ac.uk Please send a brief description of your background with an explanation of why you would like to attend the Summer School (one page maximum) no later than 31st January 1995. 
Regards, Steven Young -- Facility for Computational Modelling in Cognitive Science McDonnell-Pew Centre for Cognitive Neuroscience, Oxford Department of Experimental Psychology, University of Oxford From stokely at atax.eng.uab.edu Fri Aug 25 16:02:51 1995 From: stokely at atax.eng.uab.edu (Ernest Stokely) Date: Fri, 25 Aug 1995 15:02:51 -0500 Subject: Faculty Position in Biomed. Engr. Message-ID: Tenure Track Faculty Position -------------------------- The Department of Biomedical Engineering at the University of Alabama at Birmingham has an opening for a tenure-track faculty member. The opening is being filled as part of a Whitaker Foundation Special Opportunities Award for a training and research program in functional and structural imaging of the brain. Candidates are particularly sought in neurosystems modelling, biological neural networks, computational neurobiology, or other multidisciplinary areas that combine neurobiology and imaging. More senior candidates and candidates with established research programs are of particular interest, in order that the UAB program can be quickly established. The person selected for this position will be expected to form active research collaborations with other units in the medical center part of the UAB campus. Candidates should have a Ph.D. degree in an appropriate field, and must be a U.S. citizen or have permanent residency in the U.S. The search will continue until the position is filled. UAB is an autonomous campus within the University of Alabama system. UAB faculty currently are involved in over $180 million of externally funded grants and contracts. A 4.1 Tesla clinical NMR facility for cardiovascular and brain research, several other small-bore MR systems, a Philips Gyroscan system, and a team of research scientists and engineers working in various aspects of MR imaging and spectroscopy are housed only 200 meters from the School of Engineering.
In addition, the brain imaging project will involve the opportunity for collaborations with members of the Neurobiology Research Center, a new Cognitive Science Graduate Program, as well as faculty members from the Departments of Neurology, Psychiatry, and Radiology. To apply, send a letter of application, a current curriculum vitae, and three letters of reference to Dr. Ernest Stokely, Department of Biomedical Engineering, BEC 256, University of Alabama at Birmingham, Birmingham, Alabama 35294-4461. The University of Alabama at Birmingham is an equal opportunity, affirmative action employer, and encourages applications from women and minorities. Ernest Stokely Chair, Department of Biomedical Engineering BEC 256 University of Alabama at Birmingham Birmingham, Alabama 35294-4461 Internet: stokely at atax.eng.uab.edu FAX: (205) 975-4919 Phone: (205) 934-8420 From baluja at GS93.SP.CS.CMU.EDU Tue Aug 29 00:08:39 1995 From: baluja at GS93.SP.CS.CMU.EDU (Shumeet Baluja) Date: Tue, 29 Aug 95 00:08:39 EDT Subject: Face Detection Test Set Available Message-ID: In response to our recently announced paper on face detection, we have received several requests for our test images. In order to facilitate accurate comparisons, we have made our images available on-line. *** We recommend that you use these images for testing purposes only. Training systems based on these images decreases the value of this database as an independent test set. *** Data Set Specifics: 42 Compuserve GIF format gray-scale images. 169 Labeled Face Locations. Our work concentrated on finding frontal views of faces in scenes, so the list of faces includes mainly those faces looking towards the camera; extreme side views are ignored. These faces vary in size from ~20 to several hundred pixels. Image lighting/noise/graininess/contrast also varies; images were taken from CCD cameras, scanned photographs, scanned magazine & newspaper pictures, and hand-drawn images.
To get the images: http://www.ius.cs.cmu.edu/IUS/dylan_usr0/har/faces/test/index.html A demo of our system, and copies of our paper, can be found in: http://www.cs.cmu.edu/~baluja http://www.cs.cmu.edu/~har If you use this database, please let us know. Send any questions/comments to: shumeet baluja (baluja at cs.cmu.edu) henry rowley (har at cs.cmu.edu) From dhw at santafe.edu Wed Aug 30 20:46:30 1995 From: dhw at santafe.edu (David Wolpert) Date: Wed, 30 Aug 95 18:46:30 MDT Subject: Paper announcement Message-ID: <9508310046.AA17730@sfi.santafe.edu> *** PAPER ANNOUNCEMENT *** ON BIAS PLUS VARIANCE by D. Wolpert Abstract: This paper presents a Bayesian additive "correction" to the familiar quadratic loss bias-plus-variance formula. It then discusses some other loss-function-specific aspects of supervised learning. It ends by presenting a version of the bias-plus-variance formula appropriate for log loss, and then the Bayesian additive correction to that formula. Both the quadratic loss and log loss correction terms are a covariance between the learning algorithm and the posterior distribution over targets. Accordingly, in the context in which those terms apply, there is not a "bias-variance trade-off", or a "bias-variance dilemma", as one often hears. Rather there is a bias-variance-covariance trade-off. This tech report is retrievable by anonymous ftp to ftp.santafe.edu. Go to pub/dhw_ftp, and retrieve either bias.plus.ps.Z or bias.plus.Z.encoded for compressed or uuencoded compressed postscript, respectively. Comments welcomed. From sschaal at hip.atr.co.jp Tue Aug 1 13:31:27 1995 From: sschaal at hip.atr.co.jp (Stefan Schaal) Date: Tue, 1 Aug 95 13:31:27 JST Subject: Paper available on incremental local learning Message-ID: <9508010431.AA05186@hsun06> The following paper is available by anonymous FTP or WWW: FROM ISOLATION TO COOPERATION: AN ALTERNATIVE VIEW OF A SYSTEM OF EXPERTS Stefan Schaal and Christopher G.
Atkeson submitted to NIPS'95 We introduce a constructive, incremental learning system for regression problems that models data by means of locally linear experts. In contrast to other approaches, the experts are trained independently and do not compete for data during learning. Only when a prediction for a query is required do the experts cooperate by blending their individual predictions. Each expert is trained by minimizing a penalized local cross validation error using second order methods. In this way, an expert is able to adjust the size and shape of the receptive field in which its predictions are valid, and also to adjust its bias on the importance of individual input dimensions. The size and shape adjustment corresponds to finding a local distance metric, while the bias adjustment accomplishes local dimensionality reduction. We derive asymptotic results for our method. In a variety of simulations we demonstrate the properties of the algorithm with respect to interference, learning speed, prediction accuracy, feature detection, and task oriented incremental learning. ------------------------- The paper is 8 pages long, requires 2.3 MB of memory (uncompressed), and is ftp-able as: ftp://ftp.cc.gatech.edu/people/sschaal/schaal-NIPS95.ps.gz or can be accessed through: http://www.cc.gatech.edu/fac/Stefan.Schaal/ http://www.hip.atr.co.jp/~sschaal/ Comments are most welcome. From malcolm%flash at newcastle.ac.uk Tue Aug 1 14:35:00 1995 From: malcolm%flash at newcastle.ac.uk (Malcolm Young) Date: Tue, 1 Aug 1995 12:35:00 -0600 Subject: Assistant Professorships in the U.K. Message-ID: <9508011235.ZM7279@flash.ncl.ac.uk> UNIVERSITY OF NEWCASTLE UPON TYNE DEPARTMENT OF PSYCHOLOGY Lecturers in Cognitive and Computational Neuroscience or Behavioural Ecology Applications from suitably qualified and experienced scientists are invited for two lectureships in this enthusiastic and expanding department. 
Candidates should have a good honours degree, a doctorate, knowledge of computing and multivariate data analysis, and a record of research activity in published papers and research support. Applicants who will contribute to three areas of research: (1) cognitive, sensory and systems neuroscience, (2) neural network computing, and (3) behavioural ecology are particularly encouraged. Excellent research facilities and support will be provided. Salary will be on the Lecturer Scale (to GBP26,430), at a position determined by qualifications and experience. The Department of Psychology at Newcastle is broadly-based and rapidly growing. All undergraduate courses offered by the Department are heavily over-subscribed and the Department's postgraduate research community is the fastest growing of any in the university. Staff have research programs in cognitive neuroscience, behavioural ecology, and applied, occupational, health, social and cognitive psychology. The Department is also the home of the Neural Systems Group, an interdisciplinary centre for research in sensory, systems, computational and cognitive neuroscience, and neural network computing. The members of the Neural Systems Group are drawn from the Departments of Psychology, Physiological Sciences and the School of Neurosciences. Further details of the department and the Neural Systems Group can be found on the WWW at http://york37.ncl.ac.uk/www/psychol.html No forms of application are to be issued. Applications should take the form of a covering letter and three copies of the applicant's full curriculum vitae, with details of present salary and the names and addresses of three referees. Informal enquiries may be made to Prof. Malcolm Young (0191-222-7525, mpy%crunch at ncl.ac.uk). Applications should be sent to: the Director of Personnel, University of Newcastle upon Tyne, NE1 7RU, UK. -- +------------------------------------------------------------------------+ + + + Professor Malcolm P. 
Young Tel: 0191 222 7525 + + Department of Psychology Fax: 0191 222 5622 + + Ridley Building + + University of Newcastle + + Newcastle upon Tyne + + NE1 7RU Email: malcolm at flash.ncl.ac.uk + + UK mpy at crunch.ncl.ac.uk + + nmpy at ncl.ac.uk + +------------------------------------------------------------------------+ From black at signal.dra.hmg.gb Tue Aug 1 12:03:09 1995 From: black at signal.dra.hmg.gb (John V. Black) Date: Tue, 01 Aug 95 17:03:09 +0100 Subject: Vacancies at the Defence Research Agency (Malvern, UK) Message-ID: VACANCIES IN PATTERN & INFORMATION PROCESSING AT THE DEFENCE RESEARCH AGENCY (DRA) IN MALVERN, WORCESTERSHIRE, ENGLAND Vacancies The Pattern & Information Processing (PIP) group within the Signal & Information Processing department has vacancies for research staff interested in Artificial Neural Networks or Statistical Pattern Recognition. Members of PIP invent, develop and provide advice on techniques which can be used to understand and interpret data/information in order to either aid decision-makers or automate decision-making. Techniques currently under investigation include novel self-organising networks; modifications to standard techniques to incorporate prior knowledge and known invariances, statistical pattern recognition techniques; multi-level information processing; fusion algorithms; tracking; classification; Bayesian inference; neural networks; decision making; and analogue information processing systems. Our aim is to research the underlying theoretical foundation of relevant techniques to create the knowledge and deep understanding required to provide advice for all DRA projects involving these techniques. Mathematical rigour with a methodical approach to experimental validation provides the necessary solid foundations for our work. 
The general emphasis is on techniques which are applicable to many domains, but support is also provided to specific collaborative projects which both demonstrate the usefulness of our techniques and stimulate further developments. All members of staff are encouraged to develop to their full potential via continuing professional development based upon a wide range of training opportunities, including international conferences. Initially, successful recruits may undertake work ranging from long-term research to shorter-term applications studies and consultancy, if appropriate. They will also be given every encouragement to initiate new research projects, develop or expand existing research themes, and liaise with other projects. Starting salary will be based upon relevant experience, in the range 10-26K Pounds Sterling (under review), or higher for exceptional candidates.

Qualifications & Experience

We have the flexibility to make appointments appropriate for applicants ranging from new graduates to experienced researchers. Successful candidates will probably have a good first degree in an appropriate scientific discipline. Proven research experience, typically a higher degree or equivalent, in one or more relevant areas such as information fusion, information processing, pattern recognition (including neural networks), decision support, statistical modelling or physical modelling, is preferable. It is also important that applicants be experienced in implementing algorithms, models and experiments in software. Knowledge of defence applications projects would be an advantage for more senior appointments. Please note that successful applicants will need to be security cleared by the UK Ministry of Defence. DRA is an equal opportunities employer.
Applications

Application forms can be obtained from:

CIS Recruitment Office
G Building
Defence Research Agency
St Andrews Road
Malvern
Worcestershire WR14 3PS
Telephone: (01684) 895642

Please mark any such applications for the attention of G D Whitaker, CIS5 department.

Further information

Further information about the DRA can be obtained on the world wide web (http://www.dra.hmg.gb/), although the PIP home page is still under construction and so not yet visible. Alternatively, e-mail gdw at signal.dra.hmg.gb with any specific questions or with CVs prior to submission of a formal application.

From wimw at mbfys.kun.nl Wed Aug 2 22:13:33 1995
From: wimw at mbfys.kun.nl (Wim Wiegerinck)
Date: Thu, 3 Aug 1995 04:13:33 +0200
Subject: SNN symposium Neural Networks and AI - Final Program
Message-ID: <199508030213.EAA13487@septimius.mbfys.kun.nl>

NEURAL NETWORKS AND ARTIFICIAL INTELLIGENCE
Scientific track of the 3rd SNN Neural Network Symposium
September 14-15, 1995
Nijmegen, the Netherlands
http://www.mbfys.kun.nl/SNN/Symposium/

////////////////////////////////////////////////////////////////////////////

On September 14 and 15, SNN will organize its annual Symposium on Neural Networks in the University Auditorium and Conference Centre of the University of Nijmegen. The topic of the conference is "Neural Networks and Artificial Intelligence". The term "Artificial Intelligence" is often associated with "traditional AI" methodology. Here it is used in its literal sense, indicating the problem of creating intelligence by artificial means, regardless of the method used. The aim of the conference is two-fold: to give an overview of new developments in neuro-biology and the cognitive sciences that may lead to novel computational paradigms, and to give an overview of recent achievements on some of the major conceptual problems of artificial intelligence and data modelling. The conference consists of four half-day, single-track sessions.
Each session will consist of 2 invited contributions and 2 or 3 contributions selected from the submitted papers. In addition there will be poster sessions. There are about 50 selected papers and we expect approximately 250 attendants. The proceedings will be published by Springer-Verlag.

////////////////////////////////////////////////////////////////////////////
PROGRAM
////////////////////////////////////////////////////////////////////////////

Thursday, 14 September

 8.50 - C.C.A.M. Gielen: Opening remarks

Neurobiology
------------
 9.00 - R. Eckhorn: Segmentation coding in the visual system: neural signals that possibly support scene segmentation
 9.30 - B. van Dijk: Synchrony and plasticity in the visual cortex
10.00 - K. Kopecz: Dynamic representations provide the gradual specification of movement parameters
10.20 - J. Krueger: Recording from foveal striate cortex while the monkey is looking at the stimulus
10.40 - Coffee & posters
11.10 - L. van Hemmen: What coherently spiking neurons are good for
11.40 - A. Herz: Rapid local synchronization from spiking cells to synfire webs
12.10 - M.-O. Gewaltig: Propagation of synfire activity in cortical networks: a statistical approach
12.30 - M. Arndt: Propagation of synfire activity in cortical networks: a dynamical systems approach
12.50 - Lunch & posters

Cognitive modeling and rule extraction
--------------------------------------
14.30 - F. Wiersma: The BB neural network rule extraction method
14.50 - N. Bakker: Implications of Hadley's definition of systematicity
15.10 - G.J. Dalenoort and P.H. de Vries: The binding problem in distributed systems
15.30 - Tea & posters
16.10 - J.A. Tepper: Integrating symbolic and subsymbolic architecture for parsing arithmetic expressions and natural language sentences
16.30 - L. Martignon: Bayesian strategies for machine learning: rule extraction and concept detection
17.00 - Reception & posters
18.30 - Closing

===========================================================================

Friday, 15 September

Robotics and vision
-------------------
 9.00 - F. Groen: A visually guided robot and a neural network join to grasp slanted objects
 9.30 - S. Thrun: The mobile robot Rhino
10.00 - R.P. Wurtz: Background invariant face recognition
10.20 - Coffee & posters
11.00 - B. Krose: Learning structure from motion: how to represent two-valued functions
11.20 - D. Roobaert: A natural object recognition system using Self-organizing Translation-Invariant Maps (STIMs)
11.40 - R. Geraets: Affine scale-space for discrete point sets
12.00 - Posters
12.30 - Lunch

Statistical pattern recognition
-------------------------------
13.30 - B. Ripley: Statistical ideas for selecting network architectures
14.00 - D. MacKay: Developments in probabilistic modelling with neural networks -- ensemble learning
14.30 - C. Campbell: A constructive algorithm for building a feed-forward neural network
14.50 - S.H. Lokerse: Density estimation using SOFM and adaptive kernels
15.10 - H. von Hasseln: Maximum likelihood estimates for Markov networks using inhomogeneous Markov chains
15.30 - Tea & posters

Applications of neural networks
-------------------------------
16.00 - E. Turkcan: Optimal training for neural networks applied to nuclear technology
16.20 - H.A.B. te Braake: Nonlinear predictive control with neural models
16.40 - G. Schram: System identification with orthogonal basis functions and neural networks
17.00 - Posters
17.30 - Closing

///////////////////////////////////////////////////////////////////////////////

Throughout the whole conference about 30 posters will be presented.
More information on the poster sessions can be found on WWW: http://www.mbfys.kun.nl/SNN/Symposium/posters.html

///////////////////////////////////////////////////////////////////////////////

PROGRAM COMMITTEE

Aertsen (Weizmann Institute, Israel), Amari (Tokyo University), Buhmann (University of Bonn), van Dijk (University of Amsterdam), Eckhorn (University of Marburg), Gielen (University of Nijmegen), van Hemmen (University of Munchen), Herz (University of Oxford), Heskes (University of Nijmegen), Kappen (University of Nijmegen), Krose (University of Amsterdam), Lopez (University of Madrid), Martinetz (Siemens Research, Munchen), MacKay (University of Cambridge), von Seelen (University of Bochum, Germany), Taylor (King's College, London)

INDUSTRIAL TRACK

There will be an industrial conference entitled NEURAL NETWORKS IN PRACTICE which runs concurrently with the scientific conference. A selection of the best working and 'money making' applications of neural networks in Europe will be presented. The industrial track is organized as part of the activities of the Esprit project SIENA, whose aim is to conduct a survey of successful and unsuccessful applications of neural networks in the European market. For additional information and a detailed program, please contact SNN at the address below.

VENUE

The conference will be held at the University Auditorium and Conference Centre of the University of Nijmegen. Instructions on how to get to the University Auditorium and Conference Centre will be sent to you with your conference registration.

CONFERENCE REGISTRATION

The registration fee for the industrial track is Hfl 450 for two days and includes coffee/tea and lunches. One day registration is Hfl 300. Industrial registration gives access to the industrial track presentations as well as the scientific oral and poster presentations, and includes a copy of the proceedings.
The registration fee for the scientific track is Hfl 300 for the two days and includes coffee/tea and lunches. One day registration is Hfl 200. Scientific registration gives access to the scientific oral and poster presentations, and includes a copy of the proceedings.

Full-time PhD or Master students may register at a reduced rate. The registration fee for students is Hfl 150 for the two days and includes coffee/tea and lunches. Student registration gives access to the scientific oral and poster presentations. Students must send a copy of their university registration card or a letter from their supervisor together with the registration form.

Methods of payment are outlined in the enclosed registration form. To those who have completed the registration form with remittance of the appropriate fees, a receipt will be sent. This receipt should be presented at the registration desk at the conference. Payment must have been received by us before the conference. If not, you will have to pay in cash or with personal cheques at the conference. At the conference site, credit cards will be accepted (e.g. American Express, Master Card and VISA).

CANCELLATION

Notification of cancellation must be sent in writing to the Conference Organizing Bureau of the University of Nijmegen (see address below). Cancellations will not be refunded, but the proceedings will be mailed.

ACCOMMODATIONS

Hotel reservations will be made for you as indicated on the registration form. Payment can be made on arrival at or departure from the hotel (depending on the hotel policy). All hotels are in central Nijmegen and within a ten minute bus ride from the University. The Foundation for Neural Networks and the Conference Organizing bureau cannot be held responsible for hotel reservations and related costs.

LIABILITY

SNN cannot be held liable for any personal accident or damage to the private property of participants during the conference.
////////////////////////////////////////////////////////////////////////////////////

REGISTRATION FORM

Send this registration form to:

University of Nijmegen
Conference organization bureau
POBox 9111
6500 HN Nijmegen, The Netherlands
tel: +31 80 615968 or 612184
fax: +31 80 567956

Name: ................................................. Mr/Mrs
Affiliation: ................................................
Address: ....................................................
Zipcode/City: ...............................................
Country: ....................................................

Conference Registration

() I will participate at the SNN symposium
                                                        Amount
() industrial registration 2 days                       Hfl 450,=
() industrial registration 1 day: 14/15 september*)     Hfl 300,=
() academic registration 2 days                         Hfl 300,=
() academic registration 1 day: 14/15 september*)       Hfl 200,=
() student registration                                 Hfl 150,=
*) please strike what is not applicable

() Bank transfer has been made (free of bank charges) to SNN conferences, Credit Lyonnais Nederland NV Nijmegen, on bank account number 637984838, swift code CRLIJNL2RS.
   Amount: .....................................
() Charge my credit card for the amount of .......................
   () VISA  () Master Card  () American Express
   Card no.:              Expiration date:              Signature:

() Please make hotel reservations in my name:
   Date of arrival:              Date of departure:
   Single/double room (strike what is not applicable)
                        single    double    (prices per night)
   () Hotel Mercure     145.00    165.00
   () Hotel Apollo       90.00    120.00
   () Hotel Catharina    57.50    115.00
   () Hotel Catharina    43.50     87.00   (without shower and toilet)

/////////////////////////////////////////////////////////////////////////////

Enquiries should be addressed to: Prof.dr. C. Gielen, Dr. H. Kappen, Mrs. E.
Burg
Foundation for Neural Networks (SNN)
University of Nijmegen
PObox 9101
6500 HB Nijmegen, The Netherlands
tel: +31 80 614245
fax: +31 80 541435
email: snn at mbfys.kun.nl

More information on both the scientific and the industrial track can be found on WWW: http://www.mbfys.kun.nl/SNN/Symposium/

Best regards, Wim Wiegerinck

From wimw at mbfys.kun.nl Thu Aug 3 05:25:05 1995
From: wimw at mbfys.kun.nl (Wim Wiegerinck)
Date: Thu, 3 Aug 1995 11:25:05 +0200 (MET DST)
Subject: SNN symposium Neural Networks in Practice - Final Program
Message-ID: <199508030925.LAA25549@anastasius.mbfys.kun.nl>

NEURAL NETWORKS IN PRACTICE
Applications of neural networks in business and industry
Industrial track of the 3rd SNN Neural Network Symposium
September 14-15, 1995
Nijmegen, the Netherlands
http://www.mbfys.kun.nl/SNN/Symposium/

///////////////////////////////////////////////////////////////////////////
PROGRAM
///////////////////////////////////////////////////////////////////////////

The ability of neural networks to learn solutions, rather than having to be programmed, makes them suitable for a wide range of applications in business and industry. A rapidly increasing number of organisations already benefit from neural computing. This symposium offers you the opportunity to see how your organisation too might benefit from the possibilities of neural networks.

WHAT DOES THE SYMPOSIUM OFFER?

At this two-day symposium a selection of the best neural network applications in diverse sectors of industry and business will be presented and discussed. The presentations are first-hand, given by people who work with these applications themselves. Moreover, there will be stands which can provide you with more information about possible applications of neural networks in your organisation. This gives you the opportunity to make contacts to help you realise these applications.

WHO SHOULD ATTEND?

This is not a technical or scientific symposium.
It is intended for managers from business and industry who want to know more about the relevance and applicability of neural networks in their organization.

LANGUAGE and PROCEEDINGS

The main language of this symposium will be Dutch. The presentations of the non-Dutch speakers will be in English. The proceedings (in English) will be published by Springer-Verlag.

ORGANISATION: Foundation for Neural Networks (SNN)
IN COLLABORATION WITH: InnovatieCentrum Midden- en Zuid-Gelderland (IC), Vereniging Artificiele Neurale Netwerken (VANN)

This symposium is organised as part of SIENA (Esprit project EP-9811).

//////////////////////////////////////////////////////////////////////////////

PROGRAM

Registration from 8.00

Thursday 14 September 1995: NEURAL NETWORKS in the PRODUCTION SECTOR

OPENING
 8.50 - B. Kappen (SNN, NL): Welcome
 9.00 - E. Au'ee (InnovatieCentrum, NL): Applicability of neural networks in small and medium-sized enterprises

CALCULATION
 9.30 - H. Brockmeijer (Smit Transformatoren, NL): Cost calculation for customer-specific transformers
10.00 - coffee

PROCESS CONTROL
11.00 - A. de Weyer (Akzo Nobel, NL): Modelling industrial processes with neural networks and genetic algorithms
11.30 - T. Martinetz (Siemens AG, D): Neural network control for steel rolling mills
12.00 - lunch
14.00 - J. MacIntyre (University of Sunderland / National Power, UK): Condition monitoring with National Power
14.30 - M. Collins (NCS, UK): Electronic nose for process control
15.00 - tea

QUALITY CONTROL
16.00 - H.-J. Kolb (MEDAV GmbH, D): Acoustic quality control of roofing tiles
16.30 - A. Timmermans (ATO-DLO, NL): Automatic sorting of potted plants
17.00 - reception

===============================================================================

Friday 15 September 1995: NEURAL NETWORKS in the SERVICES SECTOR

OPENING
 8.50 - B. Kappen (SNN, NL): Welcome

MARKETING
 9.00 - R. ter Heide (SMR, NL): Neural networks in direct marketing
 9.30 - R. J. Schüring (BrandmarC BV, NL): Modelling market dynamics in foods and consumer durables
10.00 - coffee

ADMINISTRATION
11.00 - A. Hogervorst (Document Access BV, NL): Handwriting recognition with neural networks

FINANCE
11.30 - H. G. Zimmermann (Siemens AG, D): Neural networks - the future of forecasting in finance?
12.00 - lunch

SAFETY
14.00 - G. Hesketh (AEA Technology, UK): Countermatch: a neural network approach to automatic signature verification
14.30 - J. Kopecz (Zentrum fur Neuroinformatik, D): Access control by recognition of human faces
15.00 - tea

TRANSPORT
16.00 - H. Wüst (Rijkswaterstaat, NL): Current forecasts for vessel traffic guidance at IJmuiden
17.00 - closing

////////////////////////////////////////////////////////////////////////////////

INFORMATION STANDS
==================

During the conference much time is reserved for visiting stands from e.g. technology suppliers. If your company would like to present itself with a stand during our symposium, please contact SNN at the address below.

FOUNDATION FOR NEURAL NETWORKS (SNN)
Geert Grooteplein Noord 21
6525 ZX Nijmegen
The Netherlands
tel: +31 80-614245
fax: +31 80-541435
email: snn at mbfys.kun.nl

///////////////////////////////////////////////////////////////////////////////

SCIENTIFIC TRACK

There will be a scientific track entitled NEURAL NETWORKS AND ARTIFICIAL INTELLIGENCE which runs concurrently with the industrial track. New fundamental and applied developments in the area of neural networks will be presented. Registration for the industrial track gives access to the scientific track. For additional information and a detailed program, please contact SNN at the address below.

VENUE

The conference will be held at the University Auditorium and Conference Centre of the University of Nijmegen. Instructions on how to get to the University Auditorium and Conference Centre will be sent to you with your conference registration.
CONFERENCE REGISTRATION

The registration fee for the industrial track is Hfl 450 for two days and includes coffee/tea and lunches. One day registration is Hfl 300. Industrial registration gives access to the industrial track presentations as well as the scientific oral and poster presentations, and includes a copy of the proceedings.

The registration fee for the scientific track is Hfl 300 for the two days and includes coffee/tea and lunches. One day registration is Hfl 200. Scientific registration gives access to the scientific oral and poster presentations, and includes a copy of the proceedings.

Full-time PhD or Master students may register at a reduced rate. The registration fee for students is Hfl 150 for the two days and includes coffee/tea and lunches. Student registration gives access to the scientific oral and poster presentations. Students must send a copy of their university registration card or a letter from their supervisor together with the registration form.

Methods of payment are outlined in the enclosed registration form. To those who have completed the registration form with remittance of the appropriate fees, a receipt will be sent. This receipt should be presented at the registration desk at the conference. Payment must have been received by us before the conference. If not, you will have to pay in cash or with personal cheques at the conference. At the conference site, credit cards will be accepted (e.g. American Express, Master Card and VISA).

CANCELLATION

Notification of cancellation must be sent in writing to the Conference Organizing Bureau of the University of Nijmegen (see address below). Cancellations will not be refunded, but the proceedings will be mailed.

ACCOMMODATIONS

Hotel reservations will be made for you as indicated on the registration form. Payment can be made on arrival at or departure from the hotel (depending on the hotel policy). All hotels are in central Nijmegen and within a ten minute bus ride from the University.
The Foundation for Neural Networks and the Conference Organizing bureau cannot be held responsible for hotel reservations and related costs.

DISCLAIMER

The organisation is not responsible for the contents of the presentations or the information supplied by the stands. The organisation cannot be held liable for any personal accident or damage to the private property of participants during the conference. In unforeseen circumstances, the organisation reserves the right to change the program.

/////////////////////////////////////////////////////////////////////////////

REGISTRATION FORM

Send this registration form to:

University of Nijmegen
Conference organization bureau
POBox 9111
6500 HN Nijmegen, The Netherlands
tel: +31 80 615968 or 612184
fax: +31 80 567956

Name: ............................................................... Mr/Mrs
Affiliation: ..............................................................
Address: ..................................................................
Zipcode/City: .............................................................
Country: ..................................................................

Conference Registration

() I will participate at the SNN symposium
                                                        Amount
() industrial registration 2 days                       Hfl 450,=
() industrial registration 1 day: 14/15 september*)     Hfl 300,=
() academic registration 2 days                         Hfl 300,=
() academic registration 1 day: 14/15 september*)       Hfl 200,=
() student registration                                 Hfl 150,=
*) please strike what is not applicable

() Bank transfer has been made (free of bank charges) to SNN conferences, Credit Lyonnais Nederland NV Nijmegen, on bank account number 637984838, swift code CRLIJNL2RS.
   Amount: .....................................
() Charge my credit card for the amount of .......................
   () VISA  () Master Card  () American Express
   Card no.:              Expiration date:              Signature:

() Please make hotel reservations in my name:
   Date of arrival:              Date of departure:
   Single/double room (strike what is not applicable)
                        single    double    (prices per night)
   () Hotel Mercure     145.00    165.00
   () Hotel Apollo       90.00    120.00
   () Hotel Catharina    57.50    115.00
   () Hotel Catharina    43.50     87.00   (without shower and toilet)

/////////////////////////////////////////////////////////////////////////////

For enquiries, please contact:

Prof.dr. C. Gielen, Dr. H. Kappen, Mrs. E. Burg
Foundation for Neural Networks (SNN)
University of Nijmegen
PObox 9101
6500 HB Nijmegen, The Netherlands
tel: +31 80 614245
fax: +31 80 541435
email: snn at mbfys.kun.nl

More information on both the industrial and the scientific track can be found on WWW: http://www.mbfys.kun.nl/SNN/Symposium/

Best regards, Wim Wiegerinck

From richardc at sedal.su.oz.au Fri Aug 4 02:31:24 1995
From: richardc at sedal.su.oz.au (Richard Coggins)
Date: Fri, 4 Aug 1995 16:31:24 +1000
Subject: Position announcement at Sydney University
Message-ID: <199508040631.QAA04568@sedal.sedal.su.OZ.AU>

Systems Engineering and Design Automation Laboratory
Sydney University Electrical Engineering

RENEWABLE Research Engineer
Time Series Prediction and Signal Compression

Applications are invited for the position of Research Engineer with the Systems Engineering and Design Automation Laboratory (SEDAL) at the University of Sydney. SEDAL has a number of collaborative projects in the area of time series forecasting and signal compression, spanning the applications of financial forecasting, electricity load forecasting and heart signal recognition and compression. The applicant will be expected to contribute to these projects by developing novel forecasting and coding software models based on neural networks and other statistical techniques.
The project collaborators include a major international bank, a major utility supplier and a world leader in implantable cardiac pacemakers.

For appointment at HEO Level 6, applicants must have a Degree in Electrical Engineering or Mathematics as well as knowledge and experience of C programming (preferably in a Unix environment), time series forecasting and/or source coding. The successful candidate will be encouraged to enrol in one of these disciplines if applicable.

For appointment at Level 7, applicants must, in addition to the above, have extensive experience in the area of Neural Computing, Information Theory or Statistics, or in the design of signal processing systems and their implementation in either software or hardware. Experience in development and implementation of data compression systems, and the ability to supervise final year students and junior research staff, is also essential. Experience in integrated circuit design and/or biomedical signal processing, and a master's degree or higher in Neural Computing, Information Theory or Statistics, are desirable.

The position is available for an initial period of one and a half years and is renewable for up to a further 3 years subject to progress and funding. Applicants must be able to take up the position within 2 months of receiving an offer.

Salary: Higher Education Officer Level 6, $34,026 to $36,842 depending on experience. Higher Education Officer Level 7, $37,546 to $41,065 depending on experience and qualifications.

Closing Date: 31st August, 1995.
Reference Number: B29/10

For further information please contact: Dr.
Marwan Jabri
Tel: +61 2 351 2240
Fax: +61 2 660 1228
Email: marwan at sedal.su.oz.au

From JCONNOR at lbs.lon.ac.uk Fri Aug 4 19:21:38 1995
From: JCONNOR at lbs.lon.ac.uk (Jerry Connor)
Date: Fri, 4 Aug 1995 19:21:38 BST
Subject: NNCM-95 PRELIMINARY PROGRAMME
Message-ID:

NNCM 95 - PRELIMINARY PROGRAMME
October 11, 12, and 13 1995
WWW site http://www.lbs.lon.ac.uk/desci/nncm.htm

This year's Neural Networks in the Capital Markets Conference will be held in two parts. The Tutorials day offers two tracks. The Finance track is a series of 2 two-hour sessions designed to give engineers, mathematicians, and neural network practitioners an overview of the financial markets, pricing models for derivative securities, and the dynamics of price behaviour in high frequency markets. The Statistics & Neural Networks track is a series of 2 two-hour sessions designed to give finance professionals an overview of nonlinear techniques and mathematics that can be applied to data analysis, predictive modelling and analysis of financial markets. It will be held at London Business School, Sussex Place, London NW1 4SA on Wednesday 11 October.

The Main conference will offer eight plenary sessions with invited speakers and original research contributions on Derivative & Term Structure models, Equity & Commodity models, Foreign Exchange, Corporate Distress & Risk Models, Macroeconomic & Retail Finance applications, and two sessions on Advances in Methodology. Overall, the Main conference includes over 50 oral and poster paper presentations. It will be held on Thursday and Friday October 12-13 at the Langham Hilton, 1 Portland Place, London W1N 4JA, which is a short walk from London Business School.

TUTORIAL SESSIONS, Wednesday, October 11, 1995

08:30 - 10:30  Finance Tutorial I
               "Pricing Models for Derivative Securities"
               Prof. Stewart Hodges, Warwick University, Warwick

11.00 - 13.00  Finance Tutorial II
               "Price Behavior and Models for High Frequency Data in Finance"
               Dr.
Michel Dacorogna, Olsen & Associates

14.00 - 16.00  Statistics & Neural Nets Tutorial I
               "Statistical Inference & Nonparametric Models: Linear Lessons about Nonlinear Prediction"
               Prof. Leo Breiman, University of California, Berkeley

16.30 - 18.30  Statistics & Neural Nets Tutorial II
               "Neural Networks for Time Series and Trading"
               Prof. Andreas Weigend, University of Colorado, Boulder

MAIN CONFERENCE, THURSDAY, October 12, 1995

08.30 - 10.30  Oral Presentations: Derivative & Term Structure Models I
               Invited speaker: "Option Pricing and Artificial Neural Networks", Prof. Hal White, UCSD
11.00 - 13.00  Oral Presentations: Corporate Distress & Risk Models
13.00 - 15.00  Lunch & Poster Sessions
15.00 - 17.00  Oral Presentations: Foreign Exchange
               Invited Speaker (to be confirmed): "High Frequency Data in Financial Models, Issues & Applications", Prof. Charles Goodhart, LSE
17.30 - 19.30  Oral Presentations: Macroeconomics & Retail Finance

MAIN CONFERENCE, FRIDAY, October 13, 1995

08.30 - 10.30  Oral Presentations: Derivative & Term Structure Models II
               Invited speaker: "Validation of Volatility Models", Prof. Yaser Abu-Mostafa, Caltech
11.00 - 13.00  Oral Presentations: Advances in Methodology I
13.00 - 15.00  Lunch & Poster Sessions
15.00 - 17.00  Oral Presentations: Equities & Commodities
               Invited speaker: "Towards Minimal Risk: Model Selection Strategies for Time Series Prediction and Trading Strategies", Prof. John Moody, Oregon Graduate Institute
17.30 - 19.30  Oral Presentations: Advances in Methodology II

ACCEPTED PAPERS

DERIVATIVE AND TERM STRUCTURE MODELS
- Neural Networks for Contingent Claim Pricing via the Galerkin Method (E. Barucci)
- Futures Trading Using Artificial Neural Networks (H. Pi & T. Rognvaldsson)
- Modelling the Term Structure of Interbank Interest Rates (B. Dasgupta & D. Wood)
- Modelling Non-Linear Cointegration in European Equity Index Futures (A. N. Burgess)
- Neural Network Pricing of All Ordinaries SPI Options on Futures (P. Lajbcygier)
- A Comparative Study of Forecasting Models for Foreign Exchange Rates Using ANN and Option Prices (G. S. Maddala & M. Qi)
- Neural Networks in Derivative Securities Pricing Forecasting in Brazilian Capital Markets (L. A. R. Gaspar & G. Lacitermacher)
- Statistical Yield Curve Arbitrage in Eurodollar Futures using Neural Networks (A. N. Burgess)

EQUITIES AND COMMODITIES
- The Use of Neural Networks For Property Investment Forecasting (G. Clarke)
- The Predictability of Security Returns with Simple Technical Trading Rules (R. Gencay)
- On-Line Learning for Multi-Layered Neural Network: Application to S&P-500 Prediction (C. E. Pedreira)
- Applying Neural Networks in Copper Trading: A Technical Analysis Simulation (C. Naylor)
- The Predictability of Stock Returns with Local Versus Global Nonparametric Estimators (R. Gencay)
- A 'World' Model of Integrated Financial Markets Using Artificial Neural Networks (T. Poddig)
- Stock Price Prediction Using an Integrated Pattern Recognition Paradigm (D. H. Kil)
- Applications of Artificial Neural Networks in Emerging Financial Markets (C. Siriopoulos)
- Stock Selection Using Recon (G. H. John, P. Miller & R. Kerber)
- An Embedded Fuzzy Knowledge Base for Technical Analysis of Stocks (K. P. Lam)
- Stock Price Predictions by Recurrent Multilayer Neural Network Architectures (D. F. Bassi)
- Equity Forecasting: A Case Study in the KLSE Index (J. Yao)
- Use of Neural Networks and Expert/Knowledgebase Systems for Stock Market Analysis, Prediction and Trading (A. Chartzaniotis)
- Applying Neural Networks in Copper Trading: Weekly Transaction ANN Model with a Linear Filter (C. Naylor)
- Short Term Forecasts of Financial Time Series Using Artificial Neural Networks (S. Avouyi-Dovi)

FOREIGN EXCHANGE
- An Artificial Neural Network Based Trade Forecasting System for Capital Markets (B. G. Flower)
- Applying Neural Networks to Currency Trading - A Case Study (C. Lee)
- Exchange Rate Forecasting Comparison: Neural Networks, Symbolic Machine Learning and Linear Models (E. Steurer)
- Identification of FX Arbitrage Opportunities with a Non-Linear Multivariate Kalman Filter (P. J. Bolland & J. T. Connor)
- Predicting Returns on Canadian Exchange Rates with Artificial Neural Networks and EGARCH-M Models (A. Episcopos)
- Genetic Programming of Fuzzy Logic Production Rules with Application to Financial Trading (A. Edmonds)
- Forecasting Foreign Exchange Rates: Bayesian Model Comparison and Non-Gaussian Distributions (S. Butlin & J. T. Connor)
- Short-term FX Market Analysis and Prediction (H. Beran)

CORPORATE DISTRESS AND RISK MODELS
- Neural Networks in Corporate Failure Prediction: The UK Experience (Y. Alici)
- Corporate Distress Diagnosis - An International Comparison (M. Kerling)
- Assessing Financial Distress With Probabilistic Neural Networks (E. Tyree & J. A. Long)
- Connectivity & Financial Network Shutdown (L. Eisenberg)

MACRO-ECONOMICS AND RETAIL FINANCE
- Minimising the Cost of Money in Branch Offices (F. Avila)
- Exploratory Data Analysis by the Self-Organizing Map: Structures of Welfare and Poverty in the World (S. Kaski & T. Kohonen)
- Commercial Mortgage Default: A Comparison of the Proportional Hazard Model with Artificial Neural Networks (A. Episcopos)
- Empirical Regularities and The Forecasting of Industrial Production (C. Hafke)
- Loan Risk Analysis Using Neural Networks (A. N. Burgess)

ADVANCES IN METHODOLOGY
- Clearning (A. S. Weigend & H.-G. Zimmermann)
- Ordinal Models for Neural Networks (M. Mathieson)
- R/S Analysis and Hurst Exponents: A Critique (J. Moody & L. Wu)
- Combination of Buffered Back-Propagation And RPCL-CLP By Mixture of Experts Model for Foreign Exchange Rate Forecasting (L. Xu, Y. M. Cheung & W. M. Leung)
- Prediction with Robustness Towards Outliers, Trends, and Level Shifts (J. T. Connor, D. Martin & A. Bruce)
- Avoiding Overfitting by Locally Matching the Noise Level of the Data (A. S. Weigend & M. Mangeas)
- Trading Using Committees (J. Moody, S. Rehfuss & L. Wu)
- Reliable Neural Network Predictions in the Presence of Outliers and Non-Constant Variances (D. Ormoneit)
- An Analysis of Stops and Profit Objectives in Trading Systems (A. Atiya)
- The Meaning of the Mean (C. Hafke)
- An Interval Neural Network Architecture for Time Series Prediction (M. Fialho & C. E. Pedreira)

REGISTRATION

To register, complete the registration form and mail it to the secretariat. Please note that attendance is limited and will be allocated on a "first-come, first-served" basis.

SECRETARIAT: For further information, please contact the NNCM-95 secretariat: Ms Busola Oguntula, London Business School, Sussex Place, Regent's Park, London NW1 4SA, UK. e-mail: boguntula at lbs.lon.ac.uk, phone (+44) (0171) 262 50 50, fax (+44) (0171) 724 78 75.

LOCATION: The main conference will be held at The Langham Hilton, which is situated near Regent's Park and is a short walk from Baker Street Underground Station. Further directions including a map will be sent to all registrants.

PROGRAMME COMMITTEE
Dr A. Refenes, London Business School (Chairman)
Dr Y. Abu-Mostafa, Caltech
Dr A. Atiya, Cairo University
Dr N. Biggs, London School of Economics
Dr D. Bunn, London Business School
Dr M. Jabri, University of Sydney
Dr B. LeBaron, University of Wisconsin
Dr A. Lo, MIT Sloan School
Dr J. Moody, Oregon Graduate Institute
Dr C. Pedreira, Catholic University PUC-Rio
Dr M. Steiner, Universitaet Munster
Dr A. Timmermann, University of California, San Diego
Dr A. Weigend, University of Colorado
Dr H.
White, University of California, San Diego HOTEL ACCOMMODATION: Convenient hotels include: The Langham Hilton 1 Portland Place London W1N 4JA Tel: (+44) (0171) 636 10 00 Fax: (+44) (0171) 323 23 40 Sherlock Holmes Hotel 108 Baker Street, London NW1 1LB Tel: (+44) (0171) 486 61 61 Fax: (+44) (0171) 486 08 84 The White House Hotel Albany St., Regent's Park, London NW1 Tel: (+44) (0171) 387 12 00 Fax: (+44) (0171) 388 00 91 --------------------------REGISTRATION FORM -------------------------- -- NNCM-95 Registration Form Third International Conference on Neural Networks in the Capital Markets October 12-13 1995 Name:____________________________________________________ Affiliation:_____________________________________________ Mailing Address: ________________________________________ _________________________________________________________ Telephone:_______________________________________________ ****Please circle the applicable fees and write the total below**** Main Conference (October 12-13): (British Pounds) Registration fee 450 Discounted fee for academicians 250 (letter on university letterhead required) Discounted fee for full-time students 100 (letter from registrar or faculty advisor required) Tutorials (October 11): You must be registered for the main conference in order to register for the tutorials. 
(British Pounds)
Morning Session Only 100
Afternoon Session Only 100
Both Sessions 150
Full-time students 50 (letter from registrar or faculty advisor required)

TOTAL: _________

Payment may be made by: (please tick)
____ Check payable to London Business School
____ VISA ____ Access ____ American Express
Card Number:___________________________________

From terry at salk.edu Fri Aug 4 19:11:14 1995 From: terry at salk.edu (Terry Sejnowski) Date: Fri, 4 Aug 95 16:11:14 PDT Subject: Neural Net Job in San Diego Message-ID: <9508042311.AA23955@salk.edu> From: Janice Holstein Status: R

Pam Surko at SAIC is looking for a full-time neural net programmer to start as soon as possible, preferably a bachelor's- or master's-level person with a good practical knowledge of neural nets, a nose for data analysis, and their feet on the ground, who would be happy solving very applied problems. The candidate should be smart and creative, and able to communicate with customers on occasion. The group works in UNIX land, on Sun and HP, and mostly does its own system administration, so the successful candidate would have to be a mid-level UNIX guru. Candidates can email Pam ascii or postscript resumes, or send paper by snail mail or sneaker express to:

Dr. Pamela Surko
Science Applications Int'l Corporation
Pamela.T.Surko at cpmx.saic.com
10260 Campus Pt.
Dr., M/S C2, San Diego, CA 92121, USA. Phone: 619-546-6386; Fax: 619-546-6360.

Janice

From piuri at elet.polimi.it Fri Aug 4 19:43:08 1995 From: piuri at elet.polimi.it (Vincenzo Piuri) Date: Sat, 5 Aug 1995 00:43:08 +0100 Subject: call for papers Message-ID: <9508042343.AA27079@ipmel2.elet.polimi.it>

================================================================
ISCAS'96 IEEE International Symposium on Circuits and Systems
Atlanta, Georgia, USA - May 12-15, 1996
================================================================
Call for Papers for the Special Session on
"Neural Technologies for Prediction, Identification and Control"
================================================================

This special session will discuss several aspects of the use of neural technologies for modelling complex non-linear dynamic systems. The theoretical foundations of these methodologies will be treated as the basis for system prediction, identification and control. Presentations of some real applications will allow attendees to evaluate the effectiveness of this approach with respect to traditional techniques. Papers are solicited on all aspects of neural technologies concerning system identification, prediction, and control: theory, design methodologies, realizations, case studies, and applications are welcome. Prospective authors are invited to send an abstract (1-2 pages) to Prof. Vincenzo Piuri (email and fax submissions are strongly encouraged) by August 31, 1995. It should contain the paper's title, the authors' names and affiliations, and the contact person with complete address, phone, fax and email. Acceptance/rejection will be mailed by December 4, 1995. The final version of the paper will be due by January 15, 1996. Prof.
Vincenzo Piuri
Organizer of the Special Session on "Neural Technologies for Prediction, Identification and Control"
Department of Electronics and Information, Politecnico di Milano, Italy
phone +39-2-2399-3606 secretary +39-2-2399-3623 fax +39-2-2399-3411
email piuri at elet.polimi.it
================================================================

From hali at nada.kth.se Mon Aug 7 22:17:53 1995 From: hali at nada.kth.se (Hans Liljenström) Date: Tue, 8 Aug 1995 04:17:53 +0200 Subject: Workshop on Fluctuations in Biology Message-ID: <199508080217.EAA25981@sanscalc.nada.kth.se>

International Workshop on THE ROLE AND CONTROL OF RANDOM EVENTS IN BIOLOGICAL SYSTEMS, Sigtuna, Sweden, 4-9 September 1995

At this interdisciplinary workshop we intend to discuss the relevance of chaos and noise in biological systems in general, but with a focus on neural systems. The approach will be theoretical as well as experimental, and the meeting is intended to attract participants from various fields, such as biology, physics, and computer science. Invited talks will be given by L. Agnati, A. Babloyantz, A. Bulsara, R. Cotterill, W. Freeman, H. Haken, D. Hansel, K. Hepp, J. Hopfield, S. Kelso, C. Koch, W. Levy, L. Liebovitch, M. Mackey, F. Moss, S. Pogun, R. de Ruyter, and I. Tsuda. A few places are still available, and to get the desired balance between disciplines and between theory and experiment we welcome, in particular, further participants with experience in experimental (neuro)biology. For more information, see our WWW page, http://www.nada.kth.se/~hali/workshop.html, or contact Hans Liljenström Dept.
of Numerical Analysis and Computing Science, Royal Institute of Technology, S-100 44 Stockholm, SWEDEN. Email: hali at nada.kth.se Phone: +46-(0)8-790 6909 Fax: +46-(0)8-790 0930

From linster at katla.harvard.edu Tue Aug 8 19:25:53 1995 From: linster at katla.harvard.edu (Christiane Linster) Date: Tue, 8 Aug 1995 19:25:53 -0400 (EDT) Subject: No subject Message-ID: The following two papers are now available by ftp:

TOWARDS A COGNITIVE UNDERSTANDING OF ODOR DISCRIMINATION: COMBINING EXPERIMENTAL AND THEORETICAL APPROACHES

Claudine Masson (1) and Christiane Linster (2)
(1) Laboratoire de Neurobiologie Comparée des Invertébrés, INRA-CNRS (URA 1190), BP 23, 91440 Bures-sur-Yvette, France. tel, fax: 69 07 20 59 e-mail: masson at jouy.inra.fr
(2) Laboratoire d'Electronique, ESPCI, 10 rue Vauquelin, 75005 Paris. tel: 40 79 44 61 e-mail: linster at neurons.fr

Key words: feature extraction; modeling; odor discrimination; olfactory processing.

Abstract
In response to changes in odorous environmental conditions, most species (ranging from lower invertebrates to mammals) demonstrate highly adaptive behavioral performance. Complex natural chemical signals (i.e. odorous blends involved in food search) are particularly unstable, fluctuating in quality, space and time. Nevertheless, adapted behavioral responses related to meaningful odor signals can be observed even in complex natural odorous environments, demonstrating that the underlying olfactory neural network is a very dynamic pattern recognition device. In the honeybee, a large amount of experimental data has been collected at different levels of observation within the olfactory system, from signal processing to behavior, including cellular and molecular properties. However, no set of data considered by itself can give insight into the mechanisms underlying odor discrimination and pattern recognition.
Here, by concentrating on deciphering the neural mechanisms underlying encoding and decoding of the olfactory signal in the first two layers of the neural network, we illustrate how a theoretical approach helps us to integrate the different experimental data and to extract relevant parameters (features) which might be selected and used to store an odor representation in a behavioral context.

To appear in Behavioral Processes, Special Edition on Cognition and Evolution

A NEURAL MODEL OF OLFACTORY SENSORY MEMORY IN THE HONEYBEE ANTENNAL LOBE

Christiane Linster
Laboratoire d'Electronique, ESPCI, 10, Rue Vauquelin, 75005 Paris
linster at neurones.espci.fr

Claudine Masson
Neurobiologie Comparée des Invertébrés, INRA-CNRS (URA 1190), 91440 Bures-sur-Yvette, France
masson at jouy.inra.fr

Abstract
We present a neural model for olfactory sensory memory in the honeybee's antennal lobe. In order to investigate the neural mechanisms underlying odor discrimination and memorization, we exploit a variety of morphological, physiological and behavioral data. The model allows us to study the computational capacities of the known neural circuitry, and to interpret in a new light experimental data on the cellular as well as on the neuronal assembly level. We propose a scheme for memorization of the neural activity pattern after stimulus offset by changing the local balance between excitation and inhibition. This modulation is achieved by changing the intrinsic parameters of local inhibitory neurons or synapses.

To appear in Neural Computation, Volume 8(1)

Both papers can be obtained as postscript files by ftp:
ftp katla.harvard.edu
login: ftp
password: your email address
cd linster
get filename.ps

*********************************************************** * Christiane Linster * * * * Dept.
of Psychology 920 Tel: 1 617 495 3875 * * Harvard University Fax: 1 617 495 3728 * * Cambridge MA 02138 linster at katla.harvard.edu * ***********************************************************

From ntl at bbcnc.org.uk Wed Aug 9 05:55:24 1995 From: ntl at bbcnc.org.uk (Neural Technologies Limited) Date: Wed, 09 Aug 1995 10:55:24 +0100 Subject: Urgent Vacancies Message-ID: <9508090955.aa26952@auntie.bbcnc.org.uk>

Neural Technologies Limited is the leading UK company working in the application and exploitation of neural computing across a wide range of industrial and commercial environments. Our continued growth has led to various new openings for people to help in the development and deployment of working neural computing solutions. You will be working within a small and highly motivated team in our laboratories in Petersfield, Hampshire. Urgent vacancies exist for: Neural Scientists, Neural Engineers, Information Analysts, Team Leaders. Details on each below...

Contact: (Fax or send CV)
Simon Hancock, Project Manager
Neural Technologies Limited, Bedford Road, Petersfield, Hampshire GU32 3QA
(or call): Phone: 01730 260256 Fax: 01730 260466 E-mail: ntl at bbcnc.org.uk

NEURAL SCIENTIST
Required skills:
- well versed in neural network algorithm development and their practical application
- MUST be fluent in C or C++ within the PC environment
- good general presentation skills
- experience in the following would also be appreciated:
  - knowledge of conventional statistics
  - signal processing techniques [speech, vision. . .]
  - application domains [data base analysis, machine health monitoring . . .]
  - AI techniques and KBS
  - system level integration [DSP, neural accelerators. . .]
  - Visual C++ and MFC for Windows
All candidates should be working at a practical research level or have extensive industrial experience. A keen view to the commercial realities of working within a small, but fast growing, company is required.

Neural Engineer
Required skills:
- knowledge of practical application of neural networks and general data analysis
- MUST be fluent in C or C++ within the PC environment
- good general communication and presentation skills
- experience in the following would also be appreciated:
  - knowledge of conventional statistics
  - signal processing techniques [speech, vision. . .]
  - application domains [data base analysis, machine health monitoring . . .]
  - AI techniques and KBS
  - system level integration [DSP, neural accelerators. . .]
  - Visual C++ and MFC for Windows
All candidates should have practical experience of using a variety of neural computing techniques applied to real problem domains. A keen view to the commercial realities of working within a small, but fast growing, company is required.

Software Engineer
Required skills:
- mature development skills in the PC Windows environment: Visual C++ with MFC for Windows
- experience in working to rigorous quality procedures (ISO9000 or equivalent)
- experience of the following areas would also be beneficial:
  - Graphics Server
  - Spread/VBX++
  - HDK (on-line help compiler)
All candidates should have been working within the software industry for a period of at least 1-2 years and be able to demonstrate skills and successes in Windows software development.

Senior Data Analyst / Team Leader
Our continued growth has led to the requirement for a Senior Data Analyst / Team Leader to help in both internal project development and customer bespoke solutions.
Required skills:
- statistical background: e.g. regression analysis; neural methods useful
- team leadership on large projects utilising up-to-date development methods, ideally with the experience of 2-3 engineers reporting directly to you, working on a number of projects simultaneously
- experience in working to rigorous quality procedures (ISO9000 or equivalent)
- experience of the following areas would also be essential:
  - report writing
  - presentation preparation
  - excellent inter-personal skills
Experience of the following areas would also be beneficial:
- training of personnel
All candidates should have been working within the IT industry for a period of 3-5 years and be able to demonstrate skills and successes in both data analysis and leadership of project teams.

--------------------------------------------
Dr. G.R. Bolt, Neural Scientist
Neural Technologies Limited, Ideal House, Petersfield, Hampshire GU32 3QA U.K.
Tel: (0) 1730 260256 Fax: (0) 1730 260466

From radford at cs.toronto.edu Thu Aug 10 13:04:42 1995 From: radford at cs.toronto.edu (Radford Neal) Date: Thu, 10 Aug 1995 13:04:42 -0400 Subject: Software for Bayesian learning available Message-ID: <95Aug10.130447edt.859@neuron.ai.toronto.edu>

Announcing Software for BAYESIAN LEARNING FOR NEURAL NETWORKS

Radford Neal, University of Toronto

Software for Bayesian learning of models based on multilayer perceptron networks, using Markov chain Monte Carlo methods, is now available by ftp. This software implements the methods described in my Ph.D. thesis, "Bayesian Learning for Neural Networks". Use of the software is free for research and educational purposes. The software supports models for regression and classification problems based on networks with any number of hidden layers, using a wide variety of prior distributions for network parameters and hyperparameters. The advantages of Bayesian learning include the automatic determination of "regularization" parameters, without the need for a validation set, avoidance of overfitting when using large networks, and quantification of the uncertainty in predictions. The software implements the Automatic Relevance Determination (ARD) approach to handling inputs that may turn out to be irrelevant (developed with David MacKay).
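[Moderator's illustration: the announced software uses sophisticated Markov chain Monte Carlo methods; as a much-simplified sketch of the underlying idea — drawing network weights from the posterior and averaging predictions over the samples — here is a toy random-walk Metropolis sampler for a tiny regression net. The network size, priors, step size and data are arbitrary illustrative choices, not the package's defaults.]

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x) + noise (illustrative, not from the package)
X = np.linspace(-3, 3, 40)[:, None]
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)

D, H = 1, 8                       # inputs, hidden units (arbitrary)
n_w = D * H + H + H + 1           # all weights and biases, flattened

def unpack(w):
    W1 = w[:D * H].reshape(D, H)
    b1 = w[D * H:D * H + H]
    W2 = w[D * H + H:D * H + 2 * H]
    b2 = w[-1]
    return W1, b1, W2, b2

def predict(w, X):
    W1, b1, W2, b2 = unpack(w)
    return np.tanh(X @ W1 + b1) @ W2 + b2

def log_post(w, noise_sd=0.1, prior_sd=1.0):
    # log posterior = Gaussian log likelihood + Gaussian log prior (up to a constant)
    resid = y - predict(w, X)
    return (-0.5 * np.sum(resid ** 2) / noise_sd ** 2
            - 0.5 * np.sum(w ** 2) / prior_sd ** 2)

# Random-walk Metropolis over the flattened weight vector
w = 0.1 * rng.standard_normal(n_w)
lp = log_post(w)
samples = []
for step in range(20_000):
    prop = w + 0.02 * rng.standard_normal(n_w)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
        w, lp = prop, lp_prop
    if step >= 10_000 and step % 100 == 0:     # keep thinned post-burn-in samples
        samples.append(w.copy())

# The Bayesian prediction averages network outputs over posterior samples,
# rather than committing to a single trained weight vector
y_mean = np.mean([predict(s, X) for s in samples], axis=0)
print("averaged predictions over", len(samples), "posterior samples")
```

This illustrates why no validation set is needed: the prior width plays the role of the regularization parameter, and averaging over the posterior controls overfitting directly.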
For problems and networks of moderate size (eg, 200 training cases, 10 inputs, 20 hidden units), full training (to the point where one can be reasonably sure that the correct Bayesian answer has been found) typically takes several hours to a day on our SGI machine. However, quite good results, competitive with other methods, are often obtained after training for under an hour. (Of course, your machine may not be as fast as ours!) The software is written in ANSI C, and has been tested on SGI and Sun machines. Full source code is included. Both the software and my thesis can be obtained by anonymous ftp, or via the World Wide Web, starting at my home page. It is essential for you to have read the thesis before trying to use the software. The URL of my home page is http://www.cs.toronto.edu/~radford. If for some reason this doesn't work, you can get to the same place using the URL ftp://ftp.cs.toronto.edu/pub/radford/www/homepage.html. From the home page, you will be able to get both the thesis and the software. To get the thesis and the software by anonymous ftp, use the host name ftp.cs.toronto.edu, or one of the addresses 128.100.3.6 or 128.100.1.105. After logging in as "anonymous", with your e-mail address as the password, change to directory pub/radford, make sure you are in "binary" mode, and get the files thesis.ps.Z and bnn.tar.Z. The file bnn.doc contains just the documentation for the software, but this is included in bnn.tar.Z, so you will need it only if you need to read how to unpack a tar archive, or don't want to transfer the whole thing. The files ending in .Z should be uncompressed with the "uncompress" command. The thesis may be printed on a Postscript printer, or viewed with ghostview. If you have any problems obtaining the thesis or the software, please contact me at one of the addresses below. --------------------------------------------------------------------------- Radford M. Neal radford at cs.toronto.edu Dept. of Statistics and Dept. 
of Computer Science radford at stat.toronto.edu University of Toronto http://www.cs.toronto.edu/~radford --------------------------------------------------------------------------- From eric at research.nj.nec.com Thu Aug 10 14:41:24 1995 From: eric at research.nj.nec.com (Eric B. Baum) Date: Thu, 10 Aug 1995 14:41:24 -0400 Subject: Job Announcement Message-ID: <199508101841.OAA10568@yin> I recently posted a job ad. Below is a revised version of this ad. Note especially the new entry entitled "Term" as well as some more details of experiments. ---------------------------------------------------------------------------- Research Programmer Wanted. Prerequisites: Experience in getting large programs to work. Some mathematical sophistication, e.g. at least the equivalent of a good undergraduate degree in math, physics, theoretical computer science or a related field. Salary: Depends on experience. Job: Implementing various novel algorithms. The previous holder of this position (Charles Garrett) implemented our new Bayesian approach to games, with striking success. We are now engaged in an effort to produce a world championship chess program based on these methods and several new ideas regarding learning. This is an experimental learning project as well as a game-playing project. The chess program is being written in Modula 3. Experience in Modula 3 is useful but not essential so long as you are willing to learn it. Other projects may include TD learning, GA's, DNA computing, etc. To access papers on our approach to games, and to get some idea of the general nature of other projects (e.g. a paper on GA's Garrett worked on), see my home page http://www.neci.nj.nec.com:80/homepages/eric/eric.html A paper on a classifier-like learning system with Garrett will appear there RSN (but don't wait to apply). This system trains a complex `mind' formed of multiple agents interacting in an economic model and is currently being tested on Blocks World planning problems.
This project will continue, and running experiments on it will be part of the job. The successful applicant will (a) have experience getting *large* programs to *work*, and (b) be able to understand the papers on my home page and convert them to computer experiments. These projects are at the leading edge of basic research in algorithms/cognition/learning, so expect the work to be both interesting and challenging. Term: The initial job offer will be for a short provisional period. If performance is satisfactory a longer term will be offered. There is a strong possibility of conversion to permanent status if performance is highly satisfactory. To apply please send cv, cover letter and list of references to: eric at research.nj.nec.com .ps or plain text please! NOTE- EMAIL ONLY. Hardcopy, e.g. US mail or Fedex etc, will not be opened. Equal Opportunity Employer M/F/D/V ------------------------------------- Eric Baum NEC Research Institute, 4 Independence Way, Princeton NJ 08540 PHONE:(609) 951-2712, FAX:(609) 951-2482, Inet:eric at research.nj.nec.com From biehl at Physik.Uni-Wuerzburg.DE Fri Aug 11 14:49:20 1995 From: biehl at Physik.Uni-Wuerzburg.DE (Michael Biehl) Date: Fri, 11 Aug 95 14:49:20 MESZ Subject: paper available: on-line backpropagation Message-ID: <199508111249.OAA08313@wptx08.physik.uni-wuerzburg.de> FTP-host: ftp.physik.uni-wuerzburg.de FTP-filename: /pub/preprint/WUE-ITP-95-018.ps.gz The following paper is now available via anonymous ftp: (See below for the retrieval procedure) ------------------------------------------------------------------ "On-line backpropagation in two-layered neural networks" Peter Riegler and Michael Biehl Ref. WUE-ITP-95-018 Abstract We present an exact analysis of learning a rule by on-line gradient descent in a two-layered neural network with adjustable hidden-to-output weights (backpropagation of error). Results are compared with the training of networks having the same architecture but fixed weights in the second layer.
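[Moderator's illustration: the on-line setting analysed in the paper — one fresh example per gradient step, with the hidden-to-output weights trained along with the input-to-hidden weights — can be sketched as a student-teacher simulation. All dimensions, learning rates, and the teacher network below are invented for illustration and are not the paper's setup.]

```python
import numpy as np

rng = np.random.default_rng(1)
N, K = 50, 3                            # input dimension, hidden units (illustrative)

# Teacher: a fixed two-layer network defining the rule to be learned
Wt = rng.standard_normal((K, N)) / np.sqrt(N)
vt = np.ones(K)

def teacher(x):
    return vt @ np.tanh(Wt @ x)

# Student: same architecture, with *adjustable* hidden-to-output weights v
W = rng.standard_normal((K, N)) / np.sqrt(N)
v = rng.standard_normal(K)
eta = 0.1

for _ in range(100_000):
    x = rng.standard_normal(N)          # a fresh random example each step: on-line learning
    h = np.tanh(W @ x)
    err = teacher(x) - v @ h            # scalar output error
    # Backpropagation of error: gradient step on both layers
    v += eta / N * err * h
    W += eta / N * np.outer(err * v * (1 - h ** 2), x)

# Monte Carlo estimate of the generalization error on fresh inputs
test_x = rng.standard_normal((2000, N))
eg = np.mean([(teacher(x) - v @ np.tanh(W @ x)) ** 2 for x in test_x]) / 2
print("estimated generalization error:", eg)
```

Freezing the `v += ...` line reproduces the fixed-second-layer comparison case mentioned in the abstract.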
--------------------------------------------------------------------- Retrieval procedure: unix> ftp ftp.physik.uni-wuerzburg.de Name: anonymous Password: {your e-mail address} ftp> cd pub/preprint ftp> get WUE-ITP-95-018.ps.gz (*) ftp> quit unix> gunzip WUE-ITP-95-018.ps.gz e.g. unix> lp WUE-ITP-95-018.ps [8 pages] (*) can be replaced by "get WUE-ITP-95-018.ps". The file will then be uncompressed before transmission (slow!). _____________________________________________________________________ -- Michael Biehl Institut fuer Theoretische Physik Julius-Maximilians-Universitaet Wuerzburg Am Hubland D-97074 Wuerzburg email: biehl at physik.uni-wuerzburg.de Tel.: (+49) (0)931 888 5865 " " " 5131 Fax : (+49) (0)931 888 5141 From mao at almaden.ibm.com Thu Aug 10 23:54:04 1995 From: mao at almaden.ibm.com (Jianchang Mao 927-1932) Date: Thu, 10 Aug 95 19:54:04 -0800 Subject: call for papers, IEEE Journal on SELECTED AREAS IN COMMUNICATIONS Message-ID: <9508110254.AA24519@powerocr.almaden.ibm.com> ==================================================================== CALL FOR PAPERS IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS COMPUTATIONAL AND ARTIFICIAL INTELLIGENCE IN HIGH SPEED NETWORKS Recent research in high speed networks has resulted in key architectural trends which are likely to fundamentally influence all facets of the communications infrastructure. A major opportunity is now the integration of diverse services on these networks. Unlike traditional teletraffic, many of these current and emerging services have poorly understood traffic parameters and user behaviors. These networks must be self-managing and self-healing and be able to maintain their quality of service, deal with congestion and failures, and allow dynamic reconfiguration with minimal intervention. This has led many researchers to investigate algorithms that have adaptive and even learning behaviors. 
There is a consensus among many researchers that to manage these new networks and their workloads, a class of techniques exhibiting some form of computational intelligence will be needed. Despite some progress, key challenges remain. One problem is that these resource management algorithms need to be able to respond anticipatively or preventatively to problems, since the time-bandwidth product of these networks does not always allow for reactive behavior. A second problem is that in some cases, decisions must be made on a very rapid (sometimes submicrosecond) time scale in order to optimize switching behavior. Thirdly, while the network adapts, its traffic sources and sinks are also adapting intelligently to the network's behavior, making for added complexity. Computational intelligence encompasses the information processing paradigms of adaptive systems such as neural networks and fuzzy logic. Examples of artificial intelligence include expert systems and search techniques. Computational intelligence paradigms have the ability to learn from experience and to predict future behaviors. In some cases these learning rules are explicit, but in other cases the learning algorithms are implicit in a more general structure such as a neural network. In particular, neural networks have been shown to have properties that can help in managing congestion in networks, dealing with changing workloads, etc. Analog circuits that implement neural networks have been shown to be capable of solving optimization problems in submicrosecond timescales, fast enough to make on-the-fly switch routing decisions.
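[Moderator's illustration: as a toy version of the neural optimization mentioned above, the sketch below uses Hopfield-style asynchronous energy descent to build a conflict-free schedule for a small crossbar switch. The request matrix, energy coefficients and switch size are invented for illustration; real schedulers operate under far tighter constraints.]

```python
import numpy as np

# Hypothetical 4x4 crossbar: R[i, j] = 1 if input i has a cell queued for output j
R = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1],
              [1, 0, 0, 1]])
n = R.shape[0]

A, B = 1.0, 2.0          # reward for serving a request vs. penalty per conflict
S = np.zeros_like(R)     # schedule: S[i, j] = 1 connects input i to output j

def gain(i, j):
    # Net energy decrease from setting S[i, j] = 1:
    # +A for serving a request, -B per row/column conflict it would create
    row = S[i].sum() - S[i, j]
    col = S[:, j].sum() - S[i, j]
    return A * R[i, j] - B * (row + col)

# Asynchronous Hopfield-style updates until a stable, conflict-free state
changed = True
while changed:
    changed = False
    for i in range(n):
        for j in range(n):
            new = 1 if gain(i, j) > 0 else 0
            if new != S[i, j]:
                S[i, j] = new
                changed = True

served = int((S * R).sum())
print("conflict-free schedule serves", served, "of", int(R.sum()), "requests")
```

Because every flip strictly lowers the energy, the descent terminates; in hardware each "flip" is an analog neuron settling, which is where the submicrosecond claim comes from.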
Original papers are solicited on the application of any technique in computational or artificial intelligence to the following topics (but not limited to):
- Fast packet switching
- Fault tolerant and dynamic routing
- Multimedia source measurement and modeling
- Call admission control
- Traffic policing
- Congestion and flow control
- Estimation of quality of service parameters
- Wireless and mobile networks
- Disconnectible terminal management

Authors wishing to submit papers should send six copies to Prof. Ibrahim Habib at the address below. The following schedule shall apply:
Submission Deadline: January 15, 1996
Notification of acceptance: May 15, 1996
Final manuscript due: July 1, 1996
Publication: First Quarter, 1997

GUEST EDITORS:
Prof. Ibrahim Habib, Department of Electrical Engineering, City University of New York, City College, 137 street at Convent Avenue, New York, N.Y. 10031. email: ibhcc at cunyvm.cuny.edu
Dr. Robert Morris, Manager, Data Systems Technology, IBM Almaden Research Center, San Jose, CA 95120. email: rjtm at almaden.ibm.com
Dr. Hiroshi Saito, Distinguished Technical Member, NTT Telecommunications Networks Laboratories, 3-9-11, Midori-chi, Musashino-shi, Tokyo 180, Japan. email: saito at hashi.ntt.jp
Prof. Bjorn Pehrson, Royal Institute of Technology, Electrum 204, S-164 Kista, Sweden. email: bjorn at it.kth.se

From v.dimitrov at uws.edu.au Tue Aug 15 05:49:17 1995 From: v.dimitrov at uws.edu.au (Vladimir Dimitrov) Date: Tue, 15 Aug 1995 19:49:17 +1000 Subject: FLAMOC'96 Message-ID: <199508150948.AA02462@hotel.uws.EDU.AU>

International Discourse on FUZZY LOGIC AND THE MANAGEMENT OF COMPLEXITY (FLAMOC'96)
Sydney, 15-18 January, 1996
SECOND ANNOUNCEMENT AND FINAL CALL FOR PAPERS

FLAMOC'96 is an International Discourse targeted at the growing use of Fuzzy Logic when dealing with Complexity in various fields of application (Industry, Business, Finance, Management, Ecology, Medicine, Social Science, etc.).
FLAMOC'96 intends to contribute insight and foresight regarding the creation of innovative and practically efficient ways of implementing Fuzzy Logic in problem situations impregnated with Uncertainty, Intricacy, and Hazard. In an age of increasing Technological, Environmental and Social Complexity, FLAMOC'96 emphasises the synergy between Fuzzy Logic and Neural Networks, Genetic Algorithms, Non-linear Simulation Techniques, and Fractal and Chaos Theory, not only in fuzzy engineering practice but also in the search for a better understanding of the changes around and in us: learning how to handle paradoxes and risk, how to avoid conflicts and look for collaboration and consensus, and how to improve our personal and organizational achievements by integrating many diverse and contradictory requirements into coherent, complementary, and useful outputs. FLAMOC'96 provides a rich tutorial program and an excellent opportunity to meet and talk with world-renowned experts in Fuzzy Logic and Soft Computing. FLAMOC'96 is the only discourse of its kind that intends to explore the diversity of practical and theoretical applications of Fuzzy Logic in managing real-life Complexity, using Fuzzy Thinking as a bridge between science and the humanities.

THE PROGRAM
Keynote speaker: Prof. Lotfi Zadeh
Invited speakers: Prof. George Klir, Prof. Michio Sugeno

Three basic streams build the conceptual framework of this discourse:
- Soft Computing Technology (Fuzzy Logic, Neural Networks and Genetic Algorithms): with applications in Process Control, Intelligent Manufacturing Systems, Artificial Intelligence (Expert Systems, Decision Support Systems, Knowledge-Based Learning Systems, Robotics), Fuzzy Mathematics, Data Analysis, Linguistics, Biomedical Engineering, Medical Informatics.
- Fuzzy Logic in Organising Systems: with applications in Social Science (Consensus Seeking, Conflict Analysis, Human Decision Making, Public Participation, Qualitative Reasoning, Education), Philosophy (Post-Aristotelian Logic, Postmodernism), Psychology, Economics (Stock Market and Financial Analysis).
- Approximate Reasoning in Environmental Applications: Cleaner Production, Environmental Management, Mass Load Analysis, Risk Management, Sustainable Development Practice, Ecology, Ecocybernetics.

Participants of FLAMOC'96 are free to suggest other topics that might be of interest to attendees. FLAMOC'96 will include open discussions on issues selected by participants, covering Soft Computing and Fuzzy Set Theory, System Science and Sustainable Development, and Chaos and Complexity, as well as introductory and advanced tutorials, hands-on demonstrations, an international exhibition of fuzzy products, lectures, visual summaries (poster sessions), and multi-paper sessions. FLAMOC'96 provides a special day for business people and managers ("FLAMOC Business Day"), with presentations delivered by leading experts and tutorials on how to apply Fuzzy Logic effectively in business, marketing, finance, and management. Participants are invited to submit proposals for presentation in the form that is most appropriate to the subject matter they would cover (e.g. paper, poster presentation, tutorial, lecture, computer demonstration, discussion, exhibit of a fuzzy product, performance on a fuzzy music system, etc.)

THE ATTENDEES
FLAMOC'96 is aimed at system and complexity scientists and researchers, control engineers, social scientists, managers in business, industry and government, academics, conflict resolution practitioners and facilitators, bio-medical engineers, environmental managers and environmentalists, fuzzy soft- and hardware specialists, information science professionals and students.

INTERNATIONAL PROGRAM COMMITTEE
L. Zadeh (USA) - Honorary Chairman
V. Dimitrov (Univ.
Western Sydney, Hawkesbury, Australia) - Coordinator J. Bezdek (Univ. West Florida, USA) H. Berenji (NASA, USA) Z. Bien (Advanced Inst. Science and Technology, Korea) C. Carlsson (Abo Academy, Finland) E. Cox (Metus Systems Group, USA) J. Dimitrov (Univ. Western Sydney, Nepean, Australia) D. Driankov (Univ. Linkoeping, Sweden) S. Dyer (OrgMetrics, USA) D. Filev (Ford, USA) T. Gedeon (Univ. NSW, Australia) H. Guesgen (Univ. Auckland, New Zealand) M. Gupta (Univ. Saskatchewan, Canada) J. Kacprzyk (Academy of Sciences, Poland) N. Kasabov (Univ. Otago, New Zealand) D. Keweley (Defence Science and Technology Organisation, Australia) P. Kloeden (Deakin Univ., Australia) G. Klir (State Univ. New York, USA) L. Koczy (Techn. Univ. Budapest, Hungary) R. Kowalczyk (CRA, Advanced Technical Development, Australia) V. Kreinovich (Univ. Texas, USA) K. Leung (The Chinese Univ. Hong Kong) D. Lakov (Academy of Sciences, Bulgaria) M. Mizumoto (Osaka EC Univ., Japan) S. Murugesan (Univ. Western Sydney, Macarthur, Australia) A. Patki (Department of Electronics, India) L. Reznik (Victoria Univ. Technology, Australia) A. Ramer (Univ. NSW, Australia) B. Rieger (Univ. Trier, Germany) A. Salski (Univ. Kiel, Germany) M. Smithson (James Cook Univ., Australia) M. Sugeno (Tokyo Inst. Technology, Japan) T. Terano (Tokyo Inst. Technology, Japan) T. Vladimirova (Univ. Surrey, UK) X. Yao (Defence Force Academy, Australia) J. Yen (Texas A&M Univ., USA)

ORGANIZING/MANAGEMENT COMMITTEE

J. Dimitrov (Univ. Western Sydney, Nepean) - Chair V. Dimitrov (Univ. Western Sydney, Hawkesbury) X. Hu (Univ. Sydney) G. Jaros (Univ. Sydney) K. Kopra (Univ. Western Sydney, Hawkesbury) J. Peperides (Omron Electronics) G. Sheather (Univ. Technology, Sydney) D. Tayler (Hawkesbury Technology)

IMPORTANT DATES

15 September 1995 Deadline for Extended Abstract (1-2 pages) Submission.
Address for submission: Dr Vladimir Dimitrov School of Social Ecology, UWS-Hawkesbury, Richmond 2753, Australia Fax: +61(45) 701901 Phone: +61(47) 701903 E-mail: v.dimitrov at uws.edu.au

15 October 1995 Preliminary Acceptance
30 November 1995 Deadline for Camera Ready Copy of Full Paper (5 pages)

All accepted papers will be published in the FLAMOC'96 Proceedings: "Fuzzy Logic and the Management of Complexity". After the discourse, selected papers will be published in a separate volume.

REGISTRATION FEE

AUS$450 (before 15.11.1995) and AUS$500 (after 15.11.1995). Students' Fee: AUS$100.

TUTORIALS

Tutorials include the following topics: (1) Introduction to Fuzzy Logic (FL) and its Applications. (2) Industrial Applications of FL. (3) Applications of FL in Management Practice. (4) Applications of FL in Business and Financial Forecasting. (5) Clinical Applications of FL. (6) Applications of FL in Environmental Management. (7) Advanced Design Methodology of Fuzzy Systems: Neuro-Fuzzy and Fuzzy-Genetic Systems. (8) Fuzzy Semantics. (9) Approximate Reasoning. (10) Research Topics in Soft Computing.

FEE FOR TUTORIALS: AUS$100 for one selected tutorial. AUS$180 for two selected tutorials. AUS$250 for three selected tutorials. AUS$300 for four or more tutorials. Students' fee: AUS$50 (full day tutorials).

Potential lecturers are invited to submit a one-page proposal for a tutorial that includes the background of the lecturer (both research and lecturing experience) and the abstract and contents of the proposed tutorial (not limited to the above list) to: Dr Xiheng Hu DEE, University of Sydney NSW 2006, Australia Fax: +61(2) 351 3847 Phone: +61(2) 351 6475 E-mail: hxh at ee.su.oz.au not later than 15 September 1995. Lecturers are responsible for the preparation and delivery of their tutorials as well as the preparation of a quality handout (such as copies of lecture overheads). Lecturers have FREE REGISTRATION for FLAMOC'96.
EXHIBITION

For information about the exhibition and space reservation, contact: Mr Kalevi Kopra 6 Boree Rd, Forestville 2087 Australia Fax: +61(2) 975 1943 Phone: +61(2) 451 5728 Proposals for the exhibition should be sent not later than 15 September 1995. FEE FOR EXHIBITION SPACE (for companies): AUS$2000

SOCIAL EVENTS

You may want to participate in our golf tournament on 17 January 1996 (Fee: AUS$50) with fuzzy logic experts from around the world, enjoy a Conference Dinner while cruising in the beautiful Sydney Harbour (Fee: AUS$100), or join a post-conference tour to the Great Barrier Reef, Blue Mountains, etc. For overall information about attending (and participating in the program of) FLAMOC'96, contact: Mrs Judith Dimitrov POBox 91, Richmond 2753 Australia Fax: +61(47) 761616 Phone: +61(47) 761514

ACCOMMODATION

AUS$105 per room (single or shared by two or three persons) in (1) The Golden Gate Hotel (Reservations: POBox K401, Haymarket, NSW 2000, Australia; Fax +61(2) 281 2213; Phone +61(2) 281 6888). (2) Country Comfort Hotel, Sydney Central (Reservations: POBox K963, Haymarket; Fax +61(2) 281 3794; Phone +61(2) 212 2544). Participants must book their stay directly with one of the above hotels as soon as possible. Both hotels are within walking distance of the Graduate School of Business, University of Technology, Sydney (1-59 Quay St., Haymarket), where FLAMOC'96 will take place, and of all major Sydney tourist attractions.

____________________________________________________________________________

REGISTRATION FORM FLAMOC'96 (to be sent to: FLAMOC'96, POBox 91, Richmond 2753, AUSTRALIA; Fax: +61(47)761 616)

Name.........................................................................
Address......................................................................
Company......................................................................
Mailing Address..............................................................
Phone            Fax            E-mail

[ ] I shall participate in FLAMOC'96. Enclosed is my payment of the Registration Fee (AUS$450 before 15.11.1995; AUS$500 after 15.11.1995; AUS$100 for students).

I shall participate in tutorials:
[ ] one selected tutorial only: (please indicate the number of the tutorial from the above list of tutorials)
[ ] two selected tutorials: (please indicate the numbers of the tutorials from the above list of tutorials)
[ ] three selected tutorials: (please indicate the corresponding numbers)
[ ] four or more tutorials.
Enclosed is my payment for the tutorials (AUS$100 for one tutorial; AUS$180 for two tutorials; AUS$250 for three tutorials; AUS$300 for four or more tutorials; [ ] Students' fee: AUS$50 full day)

[ ] I shall exhibit fuzzy products. Enclosed is my payment for the exhibition space: AUS$2000.

I shall participate in:
[ ] Golf tournament (AUS$50)
[ ] Conference Dinner & Harbour cruise (AUS$100)
Enclosed is my payment for the golf tournament and/or the Harbour cruise.

I am interested in Post Conference Activities such as:
[ ] Great Barrier Reef Tour
[ ] Blue Mountains Excursion
[ ] Others: ____________________________________________________________

PAYMENT

1. CREDIT CARD ________ Visa ________ Master Card ________ American Express
Card No......................................................................
Expiration Date..............................................................
Name of the cardholder:......................................................
Signature of the cardholder:.................................................
Please mail or fax the above form to: FLAMOC'96 POBox 91, Richmond 2753, AUSTRALIA Fax: +61(47) 761616

2. CHEQUE - made payable in Australian Dollars to: FLAMOC'96 The University of Sydney - sent to: POBox 91, Richmond 2753, AUSTRALIA

____________________________________________________________________________

From rwp at eng.cam.ac.uk Tue Aug 15 10:38:19 1995 From: rwp at eng.cam.ac.uk (Richard Prager) Date: Tue, 15 Aug 1995 10:38:19 BST Subject: Research Position for 1 Year Cambridge UK Message-ID: <199508150938.6421@dsl.eng.cam.ac.uk>

Euro-PUNCH Project Research Assistant Position for One Year

Under the terms of a grant recently awarded to the Euro-PUNCH Project we expect to offer a one-year Research Assistant position in Cambridge to investigate the use of: Neural Networks in the Prediction of Risk in Pregnancy

Euro-PUNCH is a collaborative Project funded by the Human Capital and Mobility Programme of the Commission of the European Communities. Thus, the post is available only to a citizen of a European Union Member State (but not British), who wishes to come to work in the United Kingdom.

From the obstetrical point of view, the principal focus of the Euro-PUNCH Project lies in the use of patient-specific measurements of an epidemiological nature (such as maternal age, past obstetrical history, etc.) as well as fetal heart rate recordings, in the forecasting of a number of specific Adverse Pregnancy Outcomes.

From the neural network point of view, the Project involves the design of pattern-processing and classification systems which can be trained to forecast problems in pregnancy. This will involve continuation of work on pattern-classification and regression analysis, using neural networks operating on a very large database of about 1.2 million pregnancies from various European countries.
Challenging components of the project include dealing with missing and uncertain variables, sensitivity analysis, variable selection procedures and cluster analysis. Many leading European obstetrical centres are involved in the Euro-PUNCH project, and close collaboration with a number of these will be an essential component of the post offered. Candidates for this post are expected to have a good first degree and preferably a post-graduate degree in a relevant discipline. Some familiarity with medical statistics and neural networks is desirable but not essential. Salary (on the RA scale) will depend on age and experience, and is likely to be in the range of GBP 14,317 to GBP 15,986 per annum. Appointment would be subject to satisfactory health screening. Applications will close on 23rd August 1995. Interviews will be held in Cambridge and are likely to be on Wednesday 30th August 1995.

Applications (naming two referees) should be submitted to: Dr Kevin J Dalton PhD FRCOG Division of Materno-Fetal Medicine, Dept. Obstetrics & Gynaecology University of Cambridge, Addenbrooke's Hospital Cambridge CB2 2QQ Tel: +44-1223-410250 Fax: +44-1223-336873 or 215327 e-mail: kjd5 at cus.cam.ac.uk

Informal enquiries about the project should be directed to:
(Obstetric side) Dr Kevin Dalton kjd5 at cus.cam.ac.uk
(Engineering side) Dr Niranjan niranjan at eng.cam.ac.uk
(Engineering side) Dr Richard Prager rwp at eng.cam.ac.uk

From kevin at research.nj.nec.com Tue Aug 15 14:49:12 1995 From: kevin at research.nj.nec.com (Kevin Lang) Date: Tue, 15 Aug 95 14:49:12 EDT Subject: learning 5-bit parity: RMHC vs. GP vs. MLP Message-ID: <9508151849.AA20840@doghein>

Comments on "A Response to ..." [Ko95]

Kevin J. Lang NEC Research Institute kevin at research.nj.nec.com July 1995

These remarks constitute a brief technical reply to [Ko95], which was handed out at the ML-95 and ICGA-95 conferences.
SCALING FROM 3-BIT TO 5-BIT PARITY: hill climbing still wins

[Ko95, section 8] extends the experimental comparison in [La95] of the search efficiency of random mutation hill climbing and genetic search to the task of learning 5-bit parity. It is shown that any given run of RMHC is four times less likely than genetic search to find a circuit which computes 5-bit parity. However, when RMHC does manage to find a solution, it does so about fifty times faster than genetic search. Hence, by using RMHC in an iterated manner (i.e. multiple independent runs), it is possible to generate solutions much more cheaply than with genetic search. For example, by restarting each run of RMHC after 75,000 candidates one could obtain a candidate to solution ratio of about 300,000 (with per-run success probability p, iterated restarts cost an expected 75,000/p candidates per solution, so the reported ratio corresponds to p of roughly 1/4). This compares well with the value of 2,913,583 candidates per solution reported for genetic search.

The above argument could be formalized by calculating performance curves for the two algorithms, as discussed in [Ko92, chapter 8]. [Ko95] asserts that these curves would have permitted a meaningful comparison of the algorithms to be made, but does not provide them, citing the large computational expense that would supposedly have to be incurred. Actually, the relevant portion of the I(M,i,z) curve for RMHC could be estimated in one day by doing fifty runs out to 100,000 candidates. Since this is roughly the same amount of work as a single run of genetic search on this problem, estimating the performance curve for genetic search _would_ be a daunting task.

DON'T USE RMHC: the power of multi-layer perceptrons

I would like to emphasize that the hill climbing algorithm described in [La95] was not intended to be either novel or good, and that I do NOT advocate its use. By deliberately using a bad version of an old algorithm, I sought to underline the negative character of my results. My positive advice is this: when learning functions, use multi-layer perceptrons.
By adopting this representation for hypotheses one can exploit the powerful gradient-based search procedures that have been developed by the numerical analysis community. To illustrate the advantages of this approach, I invested 7 seconds of computer time in ten runs of conjugate gradient search for MLP weights to compute 5-bit parity. The resulting candidate to solution ratio was 393. This is roughly 750 times better than RMHC on boolean circuits, and 7400 times better than genetic search on boolean circuits.

    candidates   found a
    examined     solution?
    ----------   ----------
        31       yes
        43       yes
        62       yes
       151       yes
       196       no, stuck in local optimum
       239       no, stuck in local optimum
       271       no, stuck in local optimum
       274       no, stuck in local optimum
       308       no, stuck in local optimum
       392       yes

Unlike RMHC and genetic search, the conjugate gradient search procedure used here has no control parameters to tweak, and should yield good results in the hands of any user. Also, it detects when it is stuck in a local optimum, thus permitting an immediate restart to be made from random weights. This transforms the iterated methodology from an after-the-fact accounting system into a truly useful algorithm.

Notes on the MLP network used in the above experiment:
    5 input units
    5 hidden units computing tanh
    1 output unit computing tanh
    random initial weights drawn uniformly from the interval [-1,+1]
    true and false encoded by +1 and -1

REFERENCES

[Ko95] John Koza, "A Response to the ML-95 Paper entitled "Hill Climbing Beats Genetic Search on a Boolean Circuit Synthesis Task of Koza's"", informally published and distributed document.

[La95] Kevin Lang, "Hill Climbing Beats Genetic Search on a Boolean Circuit Synthesis Task of Koza's", The Twelfth International Conference on Machine Learning, pp. 340-343, 1995. Note: a copy of this paper can be obtained by anonymous ftp from ftp.nj.nec.com:/pub/kevin/lang-ml95.ps

[Ko92] John Koza, "Genetic Programming", MIT Press, 1992, pp. 205-236.
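[Editorial sketch] The experiment described in this message is easy to reproduce in outline. The sketch below is not Lang's code: it uses plain batch gradient descent rather than conjugate gradient search, but it follows the network notes above (a 5-5-1 tanh architecture, initial weights drawn uniformly from [-1,+1], true/false encoded as +1/-1). The function names, learning rate, and epoch count are illustrative assumptions, not figures from the paper.

```python
import math
import random

def parity_data():
    """All 32 patterns of 5-bit parity, with bits and targets in {-1, +1}."""
    data = []
    for n in range(32):
        bits = [1.0 if (n >> i) & 1 else -1.0 for i in range(5)]
        target = 1.0 if sum(b > 0 for b in bits) % 2 == 1 else -1.0
        data.append((bits, target))
    return data

def mse(W1, W2, data):
    """Mean squared error of the 5-5-1 tanh network over the data set."""
    total = 0.0
    for x, t in data:
        xa = x + [1.0]                                   # append bias input
        h = [math.tanh(sum(w * v for w, v in zip(row, xa))) for row in W1]
        y = math.tanh(sum(w * v for w, v in zip(W2, h + [1.0])))
        total += (y - t) ** 2
    return total / len(data)

def train(epochs=300, lr=0.01, seed=0):
    """Batch gradient descent on 5-bit parity; returns (initial, final) MSE."""
    rng = random.Random(seed)
    W1 = [[rng.uniform(-1, 1) for _ in range(6)] for _ in range(5)]  # 5x(5+bias)
    W2 = [rng.uniform(-1, 1) for _ in range(6)]                      # 5+bias
    data = parity_data()
    before = mse(W1, W2, data)
    for _ in range(epochs):
        gW1 = [[0.0] * 6 for _ in range(5)]
        gW2 = [0.0] * 6
        for x, t in data:
            xa = x + [1.0]
            h = [math.tanh(sum(w * v for w, v in zip(row, xa))) for row in W1]
            ha = h + [1.0]
            y = math.tanh(sum(w * v for w, v in zip(W2, ha)))
            dy = (y - t) * (1.0 - y * y)                 # output-unit delta
            for j in range(6):
                gW2[j] += dy * ha[j]
            for i in range(5):
                dh = dy * W2[i] * (1.0 - h[i] * h[i])    # hidden-unit delta
                for j in range(6):
                    gW1[i][j] += dh * xa[j]
        for j in range(6):
            W2[j] -= lr * gW2[j]
        for i in range(5):
            for j in range(6):
                W1[i][j] -= lr * gW1[i][j]
    return before, mse(W1, W2, data)
```

With a fixed learning rate such a run can stall in exactly the kind of local optimum tallied in the table above; conjugate gradient methods choose step sizes automatically, which is part of the point about having no control parameters to tweak.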
From walter.fetter at mandic.com.br Wed Aug 16 02:22:00 1995 From: walter.fetter at mandic.com.br (WALTER FETTER) Date: Wed, 16 Aug 95 03:22:00 -0300 Subject: subscription References: <8ACB049.012C0082FD.uuout@mandic.com.br> Message-ID: <8AF40CA.012C010C4D.uuout@mandic.com.br>

A non-text attachment was scrubbed...

From jbower at bbb.caltech.edu Tue Aug 15 21:44:07 1995 From: jbower at bbb.caltech.edu (Jim Bower) Date: Tue, 15 Aug 95 18:44:07 PDT Subject: GENESIS 2.0 Message-ID: <9508160144.AA06021@bbb.caltech.edu>

------------------------------------------------------------------------

This is to announce the release of GENESIS 2.0, a major revision of the GENESIS simulator. GENESIS is a general purpose simulation platform which was developed to support the simulation of neural systems ranging from complex models of single neurons to simulations of large networks made up of more abstract neuronal components. GENESIS has provided the basis for laboratory courses in neural simulation at both Caltech and the Marine Biological Laboratory in Woods Hole, MA, as well as several other institutions. Most current GENESIS applications involve realistic simulations of biological neural systems. Although the software can also model more abstract networks, other simulators are more suitable for backpropagation and similar connectionist modeling.

The current version of GENESIS and its graphical front-end XODUS are written in C and run under UNIX on Sun (SunOS 4.x or Solaris 2.x), DECstation (Ultrix), Silicon Graphics (Irix 4.0.1 and up) or x86 PC (Linux or FreeBSD) machines with X-windows (versions X11R4, X11R5, and X11R6). Within the next two months, we expect to complete the port of GENESIS to DEC Alphas (OSF1 v2 and v3), IBM RS6000s (AIX) and HPs (HPUX).
Other platforms may be capable of running GENESIS, but the software has not been tested by Caltech outside of these environments.

The GENESIS ftp site also contains the first release of Parallel GENESIS, designed for networks of workstations (NOW), symmetric multiprocessors (SMP) and massively parallel processors (MPP). This release is known to run on SGI/Irix and has run on these workstations: Sun4/Solaris, Alpha/OSF1.3, DecStation/Ultrix4.3, Sun4/SunOS. It has run on these SMPs: SGI-Challenge/Irix 5.3, Sun4MP/Solaris. It will soon be ported to the Cray T3D/E MPP and later to Intel Paragon and IBM SP2 MPPs. GENESIS 2.0 also now runs on 486 and Pentium PCs under Linux and FreeBSD.

In addition to these extensions, 2.0 includes a number of changes and improvements in the source code for portability, stability, and consistency. Numerous changes in the GENESIS objects also add flexibility, especially when constructing network simulations. In addition, the XODUS graphical interface has been completely rewritten and is now independent of the Athena widget set. This allows greater interaction with simulations using the mouse (rescaling of graphs by click and drag, restructuring of simulation elements with drag and drop operations, etc.). A script converter is included which translates GENESIS 1 scripts to GENESIS 2. The GENESIS 2.0 release also includes updates of the simulation scripts for all tutorials and examples used in ``The Book of GENESIS'' (see below). This release includes a completely revised and expanded manual and on-line help.

Acquiring GENESIS via free FTP distribution:

We have made the current release of GENESIS (ver. 2.0, August 1995) available via FTP from genesis.bbb.caltech.edu (131.215.5.249). The distributed compressed tar file is about 3 MB in size. The current distribution includes full source code and documentation for both GENESIS and XODUS as well as fourteen tutorial simulations.
Documentation for these tutorials is included along with online GENESIS help files and postscript files for generating the newly revised printed manual. To acquire the software use 'ftp' to connect to genesis.bbb.caltech.edu and login as the user "anonymous", giving your full email address as the password. You can then 'cd /pub/genesis' and download the software. Be sure to download the files LATEST.NEWS and README for information about new features of the current GENESIS version, the files on the system, and installation instructions. A detailed guide to the GENESIS neuroscience tutorials and to the construction of GENESIS simulations is given in: The Book of GENESIS: Exploring Realistic Neural Models with the GEneral NEural SImulation System, by James M. Bower and David Beeman, published by TELOS/Springer-Verlag -- ISBN 0-387-94019-7 For ordering information, contact info at telospub.com, or phone (in the US) 1-800-777-4643. BABEL - GENESIS users group Serious users of GENESIS are advised to join the users group, BABEL. Members of BABEL are entitled to access the BABEL directories and email newsgroup. These are used as a repository for the latest contributions by GENESIS users and developers. These include new simulations, libraries of cells and channels, additional simulator components, new documentation and tutorials, bug reports and fixes, and the posting of questions and hints for setting up GENESIS simulations. As the results of GENESIS research simulations are published, many of these simulations are being made available through BABEL. New developments are announced in a newsletter which is sent by email to all members. Members are able to access the BABEL directories and transfer files to and from their host machines using a passworded ftp account. Inquiries concerning GENESIS should be addressed to genesis at bbb.caltech.edu. Inquiries concerning BABEL memberships should be sent to babel at bbb.caltech.edu. 
Other information concerning GENESIS, including "snapshots" of GENESIS simulations and descriptions of research which has been conducted with GENESIS, may be found on the GENESIS World Wide Web Server: http://www.bbb.caltech.edu/GENESIS

From wgm at santafe.edu Wed Aug 16 16:03:37 1995 From: wgm at santafe.edu (Bill Macready) Date: Wed, 16 Aug 95 14:03:37 MDT Subject: No subject Message-ID: <9508162003.AA20822@sfi.santafe.edu>

In his recent posting, Kevin Lang continues a contest with John Koza of the "my search algorithm beats your search algorithm according to the following performance measure for the following (contrived) fitness function" variety. Interested readers of connectionists should know that over the space of all fitness functions, no matter what the performance measure, any two search algorithms have exactly the same expected performance. This is discussed in the paper mentioned below. That paper goes on to discuss other more interesting issues, like head-to-head minimax distinctions between search algorithms, the information theoretic and geometric aspects of search, time-varying fitness functions, etc.

Interested readers should also know that there was a (very) lengthy thread on this topic on the ga-list several months ago. In particular, some articles following up on the paper mentioned below are announced there. For example, an article by Radcliffe and Surry is announced there, as is an article by Macready and Wolpert on intrinsically hard fitness functions (as opposed to functions that are hard with respect to some particular search algorithm).

***

We are also currently involved in research with a student to exhaustively characterize the set of fitness functions for which search algorithm A does much better than algorithm B, and how that set differs from the set for which B outperforms A. In particular, we are doing this for the case where the algorithms are hill-climbers and GAs, as in Lang's debate with Koza.
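[Editorial sketch] The averaging claim above can be checked exhaustively on a toy instance. The following is an illustration, not code from the paper: it assumes minimization over all 2^3 fitness functions mapping a 3-point domain to {0,1}, compares a fixed query order against an adaptive algorithm, and takes best-value-found-after-two-queries as the performance measure. Over the whole function space, the two algorithms produce identical histograms of outcomes.

```python
from collections import Counter
from itertools import product

def run(alg, f, m=2):
    """Run a search algorithm for m queries on fitness function f
    (a dict mapping points to values); return the best value seen."""
    history = []                       # list of (point, value) observations
    for _ in range(m):
        x = alg(history)
        history.append((x, f[x]))
    return min(v for _, v in history)

def fixed_order(history):
    """Oblivious algorithm: always queries point 0, then point 1."""
    return [0, 1][len(history)]

def adaptive(history):
    """Adaptive algorithm: queries point 0, then branches on what it saw."""
    if not history:
        return 0
    return 1 if history[0][1] == 0 else 2

def outcome_histogram(alg):
    """Distribution of best-found values over ALL functions {0,1,2} -> {0,1}."""
    return Counter(run(alg, dict(enumerate(values)))
                   for values in product((0, 1), repeat=3))

print(outcome_histogram(fixed_order))  # the same multiset of outcomes ...
print(outcome_histogram(adaptive))     # ... for both algorithms
```

On any single one of the eight functions the two algorithms may of course differ; the theorem constrains only the aggregate over all functions.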
In addition, search is (from a formal perspective) almost identical to active learning, optimal experimental design, and (a less exact match) control theory. We are also currently investigating what those other fields have to offer for search. Anyone interested in receiving the results of that work as it gets written up can email us at wgm at santafe.edu or dhw at santafe.edu Bill Macready *** Here are the original paper announcements: ftp-file-name: nfl.ps No Free Lunch Theorems for Search D.H. Wolpert, W.G. Macready We show that all algorithms that search for an extremum of a cost function perform exactly the same, when averaged over all possible cost functions. In particular, if algorithm A outperforms algorithm B on some cost functions, then loosely speaking there must exist exactly as many other functions where B outperforms A. Starting from this we analyze a number of the other a priori characteristics of the search problem, like its geometry and its information-theoretic aspects. This analysis allows us to derive mathematical benchmarks for assessing a particular search algorithm's performance. We also investigate minimax aspects of the search problem, the validity of using characteristics of a partial search over a cost function to predict future behavior of the search algorithm on that cost function, and time-varying cost functions. We conclude with some discussion of the justifiability of biologically-inspired search methods. ------------------------------------------------------------- ftp-file-name: hard.ps What Makes an Optimization Problem Hard? W.G. Macready, D.H. Wolpert We address the question, ``Are some classes of combinatorial optimization problems intrinsically harder than others, without regard to the algorithm one uses, or can difficulty only be assessed relative to particular algorithms?'' We provide a measure of the hardness of a particular optimization problem for a particular optimization algorithm. 
We then present two algorithm-independent quantities that use this measure to provide answers to our question. In the first of these we average hardness over all possible algorithms for the optimization problem at hand. We show that according to this quantity, there is no distinction between optimization problems, and in this sense no problems are intrinsically harder than others. For the second quantity, rather than average over all algorithms we consider the level of hardness of a problem (or class of problems) for the algorithm that is optimal for that problem (or class of problems). Here there are classes of problems that are intrinsically harder than others.

To obtain an electronic copy of these papers:

    ftp ftp.santafe.edu
    login: anonymous
    password: <your e-mail address>
    cd /pub/wgm
    get <ftp-file-name>
    quit

Then at your system: lpr -P<printer> <ftp-file-name>

If you have trouble getting any of these papers electronically, you can request a hard copy from publications (wp at santafe.edu), Santa Fe Institute, 1399 Hyde Park Road, Santa Fe, NM, USA, 87501.

From bap at scr.siemens.com Thu Aug 17 01:13:12 1995 From: bap at scr.siemens.com (Barak Pearlmutter) Date: Thu, 17 Aug 1995 01:13:12 -0400 Subject: Strawman: 2, GP: 0 In-Reply-To: <9508162003.AA20822@sfi.santafe.edu> (message from Bill Macready on Wed, 16 Aug 95 14:03:37 MDT) Message-ID: <199508170513.BAA21287@gull.scr.siemens.com>

It might be true that in a universe where everything was equally likely, all search algorithms would be equally awful. But that does not appear to be the universe that we live in, so it is not unreasonable to ask whether some particular search algorithm performs well in practice. But this point is somewhat moot, because Kevin Lang's recent post was not claiming that "his" "new" algorithm was better than current algorithms.
Lang published a well reasoned and careful scientific paper, which cast doubt on the practicality and importance of a particular algorithm (GP) by showing that on a very simple problem *taken from the GP book* it compares unfavorably to a silly strawman algorithm. This paper passed scientific muster, and was accepted into ML, a respected refereed conference. John Koza responded by distributing a lengthy unrefereed screed, the bulk of which consisted of vicious invective and distorted half-truths. A small part of Koza's monograph had some actual technical content: it gave some new numbers on a scaled-up version of the problem in question, and interpreted them as showing that GP scaled better than the algorithm Lang had compared it to. However, Koza's interpretation was wrong: looking carefully at the raw numbers shows that even in Koza's hands, GP is scaling horribly worse than the silly strawman algorithm Lang had compared it to. That is the point of Lang's recent post. From workshop at Physik.Uni-Wuerzburg.DE Thu Aug 17 15:55:33 1995 From: workshop at Physik.Uni-Wuerzburg.DE (workshop) Date: Thu, 17 Aug 95 15:55:33 MESZ Subject: workshop and autumn school in Wuerzburg/Germany Message-ID: <199508171355.PAA17261@wptx14.physik.uni-wuerzburg.de> Second Announcement and Call for Abstracts INTERDISCIPLINARY AUTUMN SCHOOL AND WORKSHOP ON NEURAL NETWORKS: APPLICATION, BIOLOGY, AND THEORY October 12-14 (school) and 16-18 (workshop), 1995 W"urzburg, Germany INVITED SPEAKERS INCLUDE: M. Abeles, Jerusalem A. Aertsen, Rehovot J.K. Anlauf, Siemens AG J.P. Aubin, Paris M. Biehl, W"urzburg C. v.d. Broeck, Diepenbeek M. Cottrell, Paris G. Deco, Siemens AG B. Fritzke, Bochum Th. Fritsch, W"urzburg J. G"oppert, T"ubingen L.K.Hansen, Lyngby M. Hemberger,Daimler Benz L.van Hemmen, M"unchen J.A. Hertz, Copenhagen J. Hopfield, Pasadena I. Kanter, Ramat-Gan P. Kraus, Bochum B. Lautrup, Copenhagen W. Maass, Graz Th. Martinetz, Siemens AG M. Opper, Santa Cruz H. Scheich, Magdeburg S. 
Seung, AT&T Bell-Lab. W. Singer, Frankfurt S.A. Solla, Copenhagen H. Sompolinsky, Jerusalem M. Stemmler, Pasadena F. Varela, Paris A. Weigend, Boulder

AUTUMN SCHOOL, Oct. 12-14: Introductory lectures on the theory and applications of neural nets for graduate students and interested postgraduates in biology, medicine, mathematics, physics, computer science, and other related disciplines. Topics include neuronal modelling, statistical physics, and hardware and applications of neural nets, e.g. in telecommunication and biological data analysis.

WORKSHOP, Oct. 16-18: Biology, theory, and applications of neural networks with particular emphasis on the interdisciplinary aspects of the field. There will be only invited lectures with ample time for discussion. In addition, poster sessions will be scheduled.

REGISTRATION: Recommended before AUGUST 31, per FAX or (E-)MAIL to Workshop on Neural Networks, Inst. f"ur Theor. Physik, Julius-Maximilians-Universit"at, Am Hubland, D-97074 W"urzburg, Germany Fax: +49 931 888 5141 E-mail: workshop at physik.uni-wuerzburg.de The registration fee is DM 150,- for the Autumn School and DM 150,- for the Workshop, due upon arrival (cash only). Students pay DM 80,- for each event (student ID required).

ABSTRACTS: Participants who wish to present a poster should submit a title and abstract together with their registration, preferably by E-mail. The deadline for the registration of poster contributions is AUGUST 31.

ACCOMMODATION: Please use the appended form to contact the W"urzburg Tourist Office directly for room reservations (Fax +49 931 37652). NOTE THAT THIS NUMBER WAS WRONG IN THE FIRST ANNOUNCEMENT! We strongly recommend arranging accommodation as soon as possible, as various other conferences are scheduled for the same period of time.

FTP-SERVER: Updated information (program, abstracts etc.) is available via anonymous ftp from the site ftp.physik.uni-wuerzburg.de, directory /pub/workshop/. Retrieve the file README for further instructions.
ORGANIZING COMMITTEE: M. Biehl, Th. Fritsch, W. Kinzel, Univ. W"urzburg.
SCIENTIFIC ADVISORY COUNCIL: D. Flockerzi, K.-D. Kniffki, W. Knobloch, M. Meesmann, T. Nowak, F. Schneider, P. Tran-Gia, Universit"at W"urzburg.
SPONSORS: Peter Beate Heller-Stiftung im Stifterverband f. die Deutsche Wissenschaft, Research Center of Daimler Benz AG, Stiftung der St"adt. Sparkasse W"urzburg.

--------------------------------cut here----------------------------------

Registration Form

Please return to: Workshop on Neural Networks Institut f"ur Theoretische Physik Julius-Maximilians-Universit"at Am Hubland D-97074 W"urzburg, Germany Fax: +49 931 888 5141 E-mail: workshop at physik.uni-wuerzburg.de

I will attend the
  Autumn School Oct. 12-14 [ ] * (Reg. fee DM 150,- [ ] / 80,- [ ] due upon arrival) *
  Workshop Oct. 16-18 [ ] * (Reg. fee DM 150,- [ ] / 80,- [ ] due upon arrival) *
* Please mark; the reduced fee applies only for participants with a valid student ID.

I wish to present a poster [ ] (If yes, please send a title page with a 10-line abstract!)

Name:
Affiliation:
Address:
Phone:
Fax:
E-mail:
(please provide full postal address in any case!)
Signature:

--------------------------cut here, print out and fill in ------------------

To the Congress and Tourismus Zentrale Am Congress Centrum D-97079 Wuerzburg Fax: (+49) 931 37652

PLEASE NOTE: In case of room reservations the Tourist Office acts only as an agent. Your request should arrive early enough to allow us to accommodate you and send you a reply (about one week). For this reservation we will charge you DM 5,- which you will have to pay with your hotel bill.

REF: Workshop on Neural Networks Institut fuer Theoretische Physik Universitaet Wuerzburg Am Hubland D-97074 Wuerzburg

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

I hereby reserve ....... single rooms .......
double rooms

LOCATION of the accommodation: [ ] inner city [ ] city [ ] suburbs

PRICE (per night, including breakfast): single / double
room with bath/shower/WC [ ] from DM 90,- / [ ] from DM 130,-
room with bath/shower/WC [ ] from DM 130,- / [ ] from DM 180,-
room with bath/shower/WC [ ] from DM 180,- / [ ] from DM 250,-

DATE of arrival ............. for ........... night(s)
TIME of arrival (approximately) .............. h by car/train

SIGNATURE and date:

SENDER: (print letters) Company/institute: Mrs/Mr: Street: Zip Code: City: Phone: Fax:

From kruschke at croton.psych.indiana.edu Thu Aug 17 10:16:29 1995 From: kruschke at croton.psych.indiana.edu (John Kruschke) Date: Thu, 17 Aug 1995 09:16:29 -0500 (EST) Subject: report announcement: Five principles of category learning Message-ID: <9508171416.AA12437@croton.psych.indiana.edu> ==========================================================

Kruschke, J. K. & Erickson, M. A. (to appear). Five principles for models of category learning. Invited chapter in: Z. Dienes (ed.), Connectionism and Human Learning. Oxford, England: Oxford University Press.

ABSTRACT: The primary goal of this chapter is to report a connectionist model that integrates five principles of category learning previously implemented separately. Previous work by Kruschke (1995) modeled the generalization phase of the ``inverse base-rate effect'' (Medin and Edelson, 1988), but did not address performance in the learning phase. That work emphasized the principles of rapid attention shifts and consistent use of base rate knowledge. Subsequent work by Kruschke and Bradley (1995) addressed the learning phase of a simpler categorization task, and emphasized the principles of short-term memory and strategic guessing. The present chapter integrates principles from both previous reports, and applies the integrated model to both the learning and generalization phases of the inverse base-rate effect.
PostScript for this chapter, and for the previous papers cited in the abstract, may be retrieved from the Research section of my Web page, whose address (URL) is listed below.

John K. Kruschke e-mail: kruschke at indiana.edu Dept. of Psychology office: (812) 855-3192 Indiana University lab: (812) 855-9613 Bloomington, IN 47405-1301 USA fax: (812) 855-4691 URL= http://silver.ucs.indiana.edu/~kruschke/home.html ==========================================================

From bert at mbfys.kun.nl Thu Aug 17 11:05:13 1995 From: bert at mbfys.kun.nl (Bert Kappen) Date: Thu, 17 Aug 1995 17:05:13 +0200 Subject: Paper dynamic linking, paper active vision Message-ID: <199508171505.RAA14729@septimius.mbfys.kun.nl> The following two papers are available by anonymous FTP. DO NOT FORWARD TO OTHER GROUPS No hardcopies available

Dynamic linking in Stochastic Networks Hilbert J. Kappen, Marcel J. Nijman To be presented at the International Conference on Brain Processes, Theories and Models, to take place in Las Palmas de Gran Canaria, Spain, Nov 12-17, 1995. FTP-host: ftp.mbfys.kun.nl FTP-file: /snn/pub/reports/Kappen.Dyn.Link.ps.Z

Abstract: It is well established that cortical neurons display synchronous firing for some stimuli and not for others. The resulting synchronous subpopulation of neurons is thought to form the basis of object perception. In this paper this 'binding' problem is formulated for Boltzmann Machines. Feed-forward connections implement feature detectors and lateral connections implement memory traces or cell assemblies. We show that dynamic linking can be solved in the Ising model where sensory input provides local evidence. The lateral connections in the hidden layer provide global correlations between features that belong to the same stimulus and no correlations between features from different stimuli.

Learning Active Vision Hilbert J. Kappen, Marcel J.
Nijman, Tonnie van Moorsel To be presented at ICANN'95, October 1995, Paris FTP-host: ftp.mbfys.kun.nl FTP-file: snn/pub/reports/Kappen.Active.Vision.ps.Z

Abstract: In this paper we introduce a new type of problem which we call active decision. It consists of finding the optimal subsequent action, based both on partial observation and on previously learned knowledge. We propose a method for solution, based on Boltzmann Machine learning of joint input-output probabilities and on an entropy minimization criterion. We show how the method provides a basic mechanism for active vision tasks such as saccadic eye movements.

From byuhas at ucs.att.com Thu Aug 17 15:14:18 1995 From: byuhas at ucs.att.com (byuhas@ucs.att.com) Date: Thu, 17 Aug 95 14:14:18 EST Subject: No subject Message-ID: <9507178086.AA808665603@netlink.ucs.att.com> ********************************************************************** ************ PLEASE DO NOT REPOST ************** **********************************************************************

JOB OPPORTUNITY: NEURAL NETWORK FINANCIAL APPLICATIONS

An opportunity exists for someone interested in applying neural networks to problems in the credit-card industry, including: risk modeling, marketing, fraud detection and related areas. This neural network effort will proceed alongside two other groups: one using more traditional statistical approaches and the other using AI and expert systems. Our goal is to combine and compare these methods to achieve the best possible results. Many of the problems we are interested in have two characteristics: 1) two-class discrimination is often the primary goal, with one class being significantly more prevalent than the other; 2) huge amounts of data are available for modeling. To be successful, candidates should have significant experience applying neural networks and be able to develop and implement novel algorithms, specialized error criteria and clever learning strategies to attack these problems.
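For the first of the two problem characteristics above, heavily imbalanced two-class discrimination, one common form of specialized error criterion simply re-weights the rare class. The sketch below is a generic illustration of that idea only; the weighting scheme and all names in it are illustrative assumptions, not a description of the group's actual methods:

```python
import numpy as np

def weighted_squared_error(y_true, y_pred, pos_weight):
    """Squared error that up-weights the rare (positive) class.

    Choosing pos_weight ~ (#negatives / #positives) makes the two
    classes contribute roughly equally to the total loss.
    """
    w = np.where(y_true == 1.0, pos_weight, 1.0)
    return float(np.mean(w * (y_true - y_pred) ** 2))

# Toy data: ~1% positives, the kind of imbalance seen in fraud detection.
rng = np.random.default_rng(0)
y = (rng.random(10_000) < 0.01).astype(float)
pos_weight = (y == 0).sum() / max((y == 1).sum(), 1)

# The trivial "always predict 0" model looks excellent under plain MSE
# (its error equals the positive base rate) but poor once re-weighted.
plain = float(np.mean((y - 0.0) ** 2))
weighted = weighted_squared_error(y, 0.0, pos_weight)
```

Under an unweighted criterion the all-zero predictor scores near the positive base rate, which is why plain MSE can reward ignoring the rare class entirely.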
Qualifications of interest are: * a Master's degree, with a Ph.D. preferred * knowledge and experience with neural network modeling * 'C' programming in a UNIX environment * experience with SAS or S-Plus

Compensation is competitive if not generous. ATT-UCS is located about 12 miles from the beach, just south of Jacksonville, FL. (The #1 medium-size city in Money's best-places-to-live list, out this week.) If you have any questions or want to apply, you can contact me by e-mail (byuhas at ucs.att.com), phone (904-954-7376), or fax (904-954-8718), or write to: Ben Yuhas ATT-UCS 8787 Baypine Road Jacksonville, FL 32256

From esann at dice.ucl.ac.be Thu Aug 17 14:57:26 1995 From: esann at dice.ucl.ac.be (esann@dice.ucl.ac.be) Date: Thu, 17 Aug 1995 20:57:26 +0200 Subject: Neural Processing Letters Vol.2 No.4 Message-ID: <199508171855.UAA05881@ns1.dice.ucl.ac.be> Neural Processing Letters: new issue Vol.2 No.4 ----------------------------------------------- You will find enclosed the table of contents of the July 1995 issue of "Neural Processing Letters" (Vol.2 No.4). The abstracts of these papers are available on the FTP and WWW servers mentioned below. We also inform you that subscription to the journal is now possible by credit card.
All necessary information is contained on the following servers: - FTP server: ftp.dice.ucl.ac.be directory: /pub/neural-nets/NPL - WWW server: http://www.dice.ucl.ac.be/neural-nets/NPL/NPL.html If you have no access to these servers, or for any other information (subscriptions, instructions for authors, free sample copies,...), please don't hesitate to contact the publisher directly: D facto publications 45 rue Masui B-1210 Brussels Belgium Phone: + 32 2 245 43 63 Fax: + 32 2 245 46 94

Neural Processing Letters, Vol.2, No.4, July 1995 _________________________________________________ - Evolving neural networks with iterative learning scheme for associative memory Shigetaka Fujita, Haruhiko Nishimura - Combining Singular-Spectrum Analysis and neural networks for time series forecasting F. Lisi, O. Nicolis, Marco Sandri - A simplified MMC model for the control of an arm with redundant degrees of freedom U. Steink"uhler, W.-J. Beyn, H. Cruse - On the statistical physics of radial basis function networks Sean B. Holden, Mahesan Niranjan - Accelerated training algorithm for feedforward neural networks based on least squares method Y. F. Yam and Tommy W. S. Chow - On the search for new learning rules for ANNs Samy Bengio, Yoshua Bengio, Jocelyn Cloutier _____________________________ D facto publications - conference services 45 rue Masui 1210 Brussels Belgium tel: +32 2 245 43 63 fax: +32 2 245 46 94 _____________________________

From mkearns at research.att.com Thu Aug 17 16:13:00 1995 From: mkearns at research.att.com (Michael J.
Kearns) Date: Thu, 17 Aug 95 16:13 EDT Subject: COLT '96 Call for Papers Message-ID: ______________________________________________________________________ PRELIMINARY CALL FOR PAPERS---COLT '96 Ninth Conference on Computational Learning Theory Desenzano del Garda, Italy June 28 -- July 1, 1996 ______________________________________________________________________

The Ninth Conference on Computational Learning Theory (COLT '96) will be held in the town of Desenzano del Garda, Italy, from Friday, June 28, through Monday, July 1, 1996. COLT '96 is sponsored by the Universita` degli Studi di Milano. We invite papers in all areas that relate directly to the analysis of learning algorithms and the theory of machine learning, including neural networks, statistics, statistical physics, Bayesian/MDL estimation, reinforcement learning, inductive inference, knowledge discovery in databases, robotics, and pattern recognition. We also encourage the submission of papers describing experimental results that are supported by theoretical analysis.

ABSTRACT SUBMISSION. Authors should submit fifteen copies (preferably two-sided) of an extended abstract to: Michael Kearns --- COLT '96 AT&T Bell Laboratories, Room 2A-423 600 Mountain Avenue Murray Hill, New Jersey 07974-0636 Telephone (for overnight mail): (908) 582-4017 Abstracts must be RECEIVED by FRIDAY, JANUARY 12, 1996. This deadline is firm. We also anticipate allowing electronic submissions. Details of the electronic submission procedure will be provided in an updated Call for Papers, and will be available from the web site http://www.cs.cmu.edu/~avrim/colt96.html which will also be used to provide other program-related information. Authors will be notified of acceptance or rejection on or before Friday, March 15, 1996. Final camera-ready papers will be due by Friday, April 5. Papers that have appeared in journals or other conferences, or that are being submitted to other conferences, are not appropriate for submission to COLT.
An exception to this policy is that COLT and STOC have agreed that a paper can be submitted to both conferences, with the understanding that a paper will be automatically withdrawn from COLT if accepted to STOC. ABSTRACT FORMAT. The extended abstract should include a clear definition of the theoretical model used and a clear description of the results, as well as a discussion of their significance, including comparison to other work. Proofs or proof sketches should be included. If the abstract exceeds 10 pages, only the first 10 pages may be examined. A cover letter specifying the contact author and his or her email address should accompany the abstract. PROGRAM FORMAT. At the discretion of the program committee, the program may consist of both long and short talks, corresponding to longer and shorter papers in the proceedings. The short talks will also be coupled with a poster presentation. PROGRAM CHAIRS. Avrim Blum (Carnegie Mellon University) and Michael Kearns (AT&T Bell Laboratories). CONFERENCE AND LOCAL ARRANGEMENTS CHAIRS. Nicolo` Cesa-Bianchi (Universita` di Milano) and Giancarlo Mauri (Universita` di Milano). PROGRAM COMMITTEE. Martin Anthony (London School of Economics), Avrim Blum (Carnegie Mellon University), Bill Gasarch (University of Maryland), Lisa Hellerstein (Northwestern University), Robert Holte (University of Ottawa), Sanjay Jain (National University of Singapore), Michael Kearns (AT&T Bell Laboratories), Nick Littlestone (NEC Research Institute), Yishay Mansour (Tel Aviv University), Steve Omohundro (NEC Research Institute), Manfred Opper (University of Wuerzburg), Lenny Pitt (University of Illinois), Dana Ron (Massachusetts Institute of Technology), Rich Sutton (University of Massachusetts) COLT, ML, AND EUROCOLT. The Thirteenth International Conference on Machine Learning (ML '96) will be held right after COLT '96, on July 3--7 in Bari, Italy. In cooperation with COLT, the EuroCOLT conference will not be held in 1996. STUDENT TRAVEL. 
We anticipate some funds will be available to partially support travel by student authors. Details will be distributed as they become available.

From lorincz at iserv.iki.kfki.hu Fri Aug 18 05:22:36 1995 From: lorincz at iserv.iki.kfki.hu (Andras Lorincz) Date: Fri, 18 Aug 1995 11:22:36 +0200 Subject: No subject Message-ID: <9508180922.AA17280@iserv.iki.kfki.hu> The following recent publications of the Adaptive Systems Lab of the Attila Jozsef University of Szeged and the Hungarian Academy of Sciences can be accessed on WWW via the URL http://iserv.iki.kfki.hu/asl-publs.html and http://iserv.iki.kfki.hu/qua-publs.html.

Olah, M. and Lorincz, A. (1995) Analog VLSI Implementation of Grassfire Transformation for Generalized Skeleton Formation ICANN'95 Paris, accepted Abstract A novel analog VLSI circuit is proposed that implements the grassfire transformation for calculating the generalized skeleton of a planar shape. The fully parallel VLSI circuit can perform the calculation in real time. The algorithm, based on an activation spreading process on an artificial neural network, can be implemented by an extended nonlinear resistive network. The architecture and building blocks are outlined, and the feasibility of this technique is investigated.

Szepesvari, Cs. (1995) General Framework for Reinforcement Learning, ICANN'95 Paris, accepted Abstract We set up a general framework for the investigation of decision processes. The framework is based on the so-called one-step look-ahead (OLA) cost mapping. The cost of a policy is defined by successive application of the OLA mapping. In this way various decision criteria (e.g. the expected-value criterion or the worst-case criterion) can be treated in a unified way. The main theorem of this article says that under minimal conditions optimal stationary policies are greedy w.r.t. the optimal cost function and vice versa.
Based on this result we hope that previous results on reinforcement learning can be generalized to other decision criteria that fit the proposed framework.

Marczell, Zs., Kalmar, Zs. and Lorincz, A. (1995) Generalized skeleton formation for texture segmentation Neural Network World, accepted Abstract An algorithm, and an artificial neural architecture that approximates the algorithm, are proposed for the formation of generalized skeleton transformations. The algorithm includes the original grassfire proposal of Blum and is extended with an integrative on-center off-surround detector system. It is shown that the algorithm can elicit textons by skeletonization. A slight modification of the architecture corresponds to the Laplace transformation followed by full-wave rectification, another algorithm for texture discrimination proposed by Bergen and Adelson.

Szepesvari, Cs. (1995) Perfect Dynamics for Neural Networks Talk presented at Mathematics of Neural Networks and Applications, Lady Margaret Hall, Oxford, July, 1995 Abstract The vast majority of artificial neural network (ANN) algorithms may be viewed as massively parallel, non-linear numerical procedures that solve certain kinds of fixed-point equations in an iterative manner. We consider the use of such recurrent ANNs to solve optimization problems. The dynamics of such networks is often based on the minimization of an energy-like function. An important (and presumably hard) problem is to exclude the possibility of "spurious local minima". In this article we take another starting point, namely to consider perfect dynamics. We say that a recurrent ANN admits perfect dynamics if the dynamical system given by the update operator of the network has an attractor whose basin of attraction covers the set of all possible initial solution candidates. One may wonder whether neural networks that admit perfect dynamics can be interesting in applications.
In this article we show that there exists a family of such networks (or dynamics). We introduce Generalised Dynamic Programming (GenDyP) to govern the dynamics. Roughly speaking, a GenDyP problem is a 3-tuple (X,A,Q), where X and A are arbitrary sets and Q maps functions of X into functions of X x A. GenDyP problems derive from sequential decision problems and dynamic programming (DP); traditional DP procedures correspond to special selections of the mapping Q. We show that if Q is monotone and satisfies other reasonable (continuity and boundedness) conditions, then the above iteration converges to a distinguished solution of the functional equation u(x) = inf_a (Q u)(x,a), which generalizes the well-known Bellman Optimality Equation. The proofs rely on the relation between the above dynamics and the optimal solutions of generalized multi-stage decision problems (we give the definitions in the next section).

Kalmar, Zs., Szepesvari, Cs. and Lorincz, A. (1995) Generalized Dynamic Concept Model as a Route to Construct Adaptive Autonomous Agents Neural Network World 3:353--360 Abstract A model of adaptive autonomous agents that (i) builds an internal representation of events and event relations, (ii) utilizes activation spreading for building dynamic concepts and (iii) makes use of the winner-take-all paradigm to come to a decision is extended by introducing generalization into the model. The generalization reduces memory requirements and improves performance in unseen scenes, as indicated by computer simulations.

J.Toth, G., Kovacs, Sz. and Lorincz, A. (1995) Genetic algorithm with alphabet optimization Biological Cybernetics 73:61-68 Abstract In recent years the genetic algorithm (GA) has been used successfully to solve many optimization problems. One of the most difficult questions of applying a GA to a particular problem is that of coding. In this paper a scheme is derived to optimize one aspect of the coding in an automatic fashion.
This is done by using a high-cardinality alphabet and optimizing the meaning of the letters. The scheme is especially well suited to cases where a number of similar problems need to be solved. The use of the scheme is demonstrated on such a group of problems: the simplified problem of navigating a `robot' in a `room'. It is shown that for the sample problem family the proposed algorithm is superior to the canonical GA.

J.Toth, G. and Lorincz, A. (1995) Genetic algorithm with migration on topology conserving maps Neural Network World 2:171--181 Abstract The genetic algorithm (GA) is extended to solve a family of optimization problems in a self-organizing fashion. The continuous world of inputs is discretized in an optimal fashion with the help of a topology conserving neural network. The GA is generalized to organize individuals into subpopulations associated with the neurons. Interneuron topology connections are used to allow gene migration to neighboring sites. The method speeds up the GA by allowing small subpopulations, while still preserving diversity with the help of migration. Within a subpopulation the original GA was applied as the means of evolution. To illustrate this modified GA, the optimal control of a simulated robot arm is treated: a falling ping-pong ball has to be caught by a bat without bouncing. It is demonstrated that the simultaneous optimization for an interval of height can be solved, and that migration can considerably reduce computation time. Other aspects of the algorithm are outlined.

Amstrup, B., J.Toth, G., Szabo, G., Rabitz, H., and Lorincz, A.
(1995) Genetic algorithm with migration on topology conserving maps for optimal control of quantum systems Journal of Physical Chemistry, 99, 5206-5213 Abstract The laboratory implementation of molecular optimal control has to overcome the problem caused by the changing environmental parameters, such as the temperature of the laser rod, the resonator parameters, the mechanical parameters of the laboratory equipment, and other dependent parameters such as the time delay between pulses or the pulse amplitudes. In this paper a solution is proposed: instead of trying to set the parameter(s) with very high precision, their changes are monitored and the control is adjusted to the current values. The optimization in the laboratory can then be run at several values of the parameter(s) with an extended genetic algorithm (GA) which is tailored to such parametric optimization. The extended GA does not presuppose, but can take advantage of and in fact explores, whether the mapping from the parameter(s) to the optimal control field is continuous. Then the optimization for the different values of the parameter(s) is done cooperatively, which reduces the optimization time. A further advantage of the method is its full adaptiveness; i.e., in the best circumstances no information on the system or laboratory equipment is required, and only the success of the control needs to be measured. The method is demonstrated on a model problem: a pump-and-dump type model experiment on CsI.

The WWW pages also contain pointers to other, older publications that are available on line. Papers are also available by anonymous ftp through iserv.iki.kfki.hu/pub/papers

Best regards, Andras Lorincz
Department of Adaptive Systems, Attila Jozsef University of Szeged, Dom ter 9, Szeged, Hungary, H-6720
also: Department of Photophysics, Institute of Isotopes, Hungarian Academy of Sciences, Konkoly-Thege 29-33, Budapest, P.O.B. 77, Hungary, H-1525
email: lorincz at iserv.iki.kfki.hu

From wgm at santafe.edu Fri Aug 18 13:42:01 1995 From: wgm at santafe.edu (wgm@santafe.edu) Date: Fri, 18 Aug 95 11:42:01 MDT Subject: No subject Message-ID: <9508181742.AA00176@yaqui> In response to the recent postings of Kevin Lang and of Bill Macready and David Wolpert, Barak Pearlmutter writes:

>>> But this point is somewhat moot, because Kevin Lang's recent post was not claiming that "his" "new" algorithm was better than current algorithms. >>>

Nor did we ever assume he was making this claim. We simply wanted to point to our work that resulted from many of the same motivations and investigates such issues from the most general perspective.

>>> Lang published a well reasoned and careful scientific paper, which cast doubt on the practicality and importance of a particular algorithm (GP) by showing that on a very simple problem *taken from the GP book* it compares unfavorably to a silly strawman algorithm. This paper passed scientific muster, and was accepted into ML, a respected refereed conference. >>>

Again we want to stress we were not attacking Lang's work *at all*. We couldn't agree more that the GP community needs more comparisons of its results to other techniques. For those interested in such comparisons I would also point to the papers of Una-May O'Reilly. They can be found at http://www.santafe.edu/sfi/publications/94wplist.html

*** As regards No Free Lunch issues in optimization and supervised learning,

>>> It might be true that in a universe where everything was equally likely, all search algorithms would be equally awful. >>>

It is NOT true that "in a universe where everything was equally likely, all search algorithms would be equally awful." For example, there are major head-to-head minimax distinctions between algorithms. (The true magnitude of those distinctions is coming to light in the work with our student we mentioned in our previous post.) This is a crucially important point.
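The expected-performance claim is easy to verify exhaustively at toy scale. The sketch below (an illustration of the point, not code from either post) enumerates every function from a four-point domain to {0,1} and averages the best value found by two different fixed search orders under a two-evaluation budget: the averages agree exactly, even though the two searchers differ on individual functions.

```python
from itertools import product

X = range(4)        # a four-point search space
BUDGET = 2          # each algorithm may evaluate two points

def best_found(order, f):
    """Best (minimum) value a fixed, non-revisiting search order sees."""
    return min(f[x] for x in order[:BUDGET])

alg_a = [0, 1, 2, 3]    # two deterministic search orders
alg_b = [3, 2, 1, 0]

# "A universe where everything is equally likely": weight all 2^4
# binary objective functions on X uniformly.
funcs = [dict(zip(X, bits)) for bits in product([0, 1], repeat=len(X))]

avg_a = sum(best_found(alg_a, f) for f in funcs) / len(funcs)
avg_b = sum(best_found(alg_b, f) for f in funcs) / len(funcs)

# Expected performance is identical, yet on individual functions the
# algorithms differ head-to-head -- the kind of structure the post refers to.
wins_a = sum(best_found(alg_a, f) < best_found(alg_b, f) for f in funcs)
wins_b = sum(best_found(alg_b, f) < best_found(alg_a, f) for f in funcs)
```

Here each searcher wins outright on some functions while the uniform averages coincide, which is exactly the distinction between expected performance and finer-grained comparisons.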
There is a huge amount of structure distinguishing algorithms even in "a universe where everything was equally likely". All that would be "equally awful" in such a universe is expected performance. And of course in supervised learning there are other major a priori distinctions between algorithms. E.g., for quadratic loss, if you give me a learning algorithm with *any* random component (backprop with random initial weights anyone?), I can always come up with an algorithm with assuredly (!) superior performance, *independent of the target*. (And therefore independent of the "universe that we live in".) Bill Macready and David Wolpert

From bap at scr.siemens.com Fri Aug 18 14:45:14 1995 From: bap at scr.siemens.com (Barak Pearlmutter) Date: Fri, 18 Aug 1995 14:45:14 -0400 Subject: paper announcement Message-ID: <199508181845.OAA22582@gull.scr.siemens.com> The following paper, to appear in Neural Computation, is available via ftp to archive.cis.ohio-state.edu:/pub/neuroprose/pearlmutter.vcdifn.ps.Z.

VC Dimension of an Integrate-and-Fire Neuron Model

Anthony M. Zador, Salk Institute, 10010 N. Torrey Pines Rd., La Jolla, CA 92037, zador at salk.edu
Barak A. Pearlmutter, Siemens Corporate Research, 755 College Road East, Princeton, NJ 08540, bap at scr.siemens.com

ABSTRACT We compute the VC dimension of a leaky integrate-and-fire neuron model. The VC dimension quantifies the ability of a function class to partition an input pattern space, and can be considered a measure of computational capacity. In this case, the function class is the class of integrate-and-fire models generated by varying the integration time constant and the threshold, the input space they partition is the space of continuous-time signals, and the binary partition is specified by whether or not the model reaches threshold at some specified time. We show that the VC dimension diverges only logarithmically with the input signal bandwidth. We also extend this approach to arbitrary passive dendritic trees.
The main contributions of this work are (1) it offers a formal treatment of the computational capacity of a dynamical system; and (2) it provides a framework for analyzing the computational capabilities of the dynamical systems defined by networks of real neurons. ----------------------------------------------------------------

Thanks to Jordan Pollack for maintaining the neuroprose archive.

From terry at salk.edu Sat Aug 19 15:04:51 1995 From: terry at salk.edu (Terry Sejnowski) Date: Sat, 19 Aug 95 12:04:51 PDT Subject: Neural Computation 7:5 Message-ID: <9508191904.AA10190@salk.edu> NEURAL COMPUTATION September 1995 Volume 7 Number 5

Review: Methods for Combining Experts' Probability Assessments Robert A. Jacobs

Letters: The Helmholtz Machine Peter Dayan, Geoffrey E. Hinton, Radford M. Neal, and Richard S. Zemel

Spontaneous Excitations in the Visual Cortex: Stripes, Spirals, Rings and Collective Bursts Corinna Fohlmeister, Wulfram Gerstner, Raphael Ritz, and J. Leo van Hemmen

Time-Domain Solutions of Oja's Equations J. L. Wyatt, Jr. and I. M. Elfadel

Learning the Initial State of a Second-Order Recurrent Neural Network during Regular-Language Inference Mikel L. Forcada and Rafael C. Carrasco

An Algebraic Framework to Represent Finite State Machines in Single-Layer Recurrent Neural Networks R. Alquezar and A. Sanfeliu

Local and Global Optimization Algorithms for Generalized Learning Automata V. V. Phansalkar and M. A. L. Thathachar

From lbl at nagoya.bmc.riken.go.jp Sat Aug 19 23:32:26 1995 From: lbl at nagoya.bmc.riken.go.jp (Bao-Liang Lu) Date: Sun, 20 Aug 1995 12:32:26 +0900 Subject: Paper available: Inverse Kinematics via Network Inversions Message-ID: <9508200332.AA08838@xian> The following paper, to appear in Proc. of IEEE ICNN'95, Perth, Australia, is available via FTP.
FTP-host: ftp.bmc.riken.go.jp FTP-file: pub/publish/Lu/lu-ieee-icnn95.ps.Z ==========================================================================

TITLE: Regularization of Inverse Kinematics for Redundant Manipulators Using Neural Network Inversions AUTHORS: Bao-Liang Lu (1) Koji Ito (1,2) ORGANISATIONS: (1) The Institute of Physical and Chemical Research (2) Toyohashi University of Technology

ABSTRACT: This paper presents a new approach to regularizing the inverse kinematics problem for redundant manipulators using neural network inversions. This approach is a four-phase procedure. In the first phase, the configuration space and associated workspace are partitioned into a set of regions. In the second phase, a set of modular neural networks is trained on associated training data sets sampled over these regions to learn the forward kinematic function. In the third phase, the multiple inverse kinematic solutions for a desired end-effector position are obtained by inverting the corresponding modular neural networks. In the fourth phase, an "optimal" inverse kinematic solution is selected from the multiple solutions according to a given criterion. An important feature of this approach, in comparison with existing methods, is that it can find both inverse kinematic solutions lying on different solution branches and solutions belonging to the same branch. As a result, the manipulator can be controlled better using the optimal solution than using an ordinary one. This approach is illustrated with a three-joint planar arm. (6 pages. No hard copies available.)
Bao-Liang Lu --------------------------------------------- Bio-Mimetic Control Research Center, The Institute of Physical and Chemical Research (RIKEN) 3-8-31 Rokuban, Atsuta-ku, Nagoya 456, Japan Phone: +81-52-654-9137 Fax: +81-52-654-9138 Email: lbl at nagoya.bmc.riken.go.jp

From M.Q.Brown at ecs.soton.ac.uk Mon Aug 21 12:05:09 1995 From: M.Q.Brown at ecs.soton.ac.uk (martin brown) Date: Mon, 21 Aug 1995 17:05:09 +0100 Subject: ISIS web entry Message-ID: <17225.199508211605@diana.ecs.soton.ac.uk> The Image, Speech and Intelligent Systems (ISIS) research group now has an entry at: http://www-isis.ecs.soton.ac.uk/ We've been doing research into many different aspects of neural and neurofuzzy theory for the past 6 years, especially as applied to classification, identification, and control. A lot of work has been done on neurofuzzy networks and the development of automatic construction algorithms. These pages include: 1) Publication details together with abstracts. 2) Project details. 3) Personnel details. 4) Neurofuzzy information service (conferences, software, other homepages, etc.) Martin Brown ------------------------------------------------------ ISIS research group, Email: mqb at ecs.soton.ac.uk Room 3025, Zepler Building, Tel: +44 (0)1703 594984 Dept.
of Electronics and Computer Science, Fax: +44 (0)1703 594498 University of Southampton, WWW: http://www-isis.ecs.soton.ac.uk/ Highfield, Southampton, SO17 1BJ, UK

From charles at playfair.Stanford.EDU Tue Aug 22 13:47:05 1995 From: charles at playfair.Stanford.EDU (charles@playfair.Stanford.EDU) Date: Tue, 22 Aug 95 10:47:05 -0700 Subject: Paper available: Visualization of High-Dimensional Functions Message-ID: <199508221747.KAA06364@tukey.Stanford.EDU> The following doctoral dissertation (168 pages) is now available electronically: ftp://playfair.stanford.edu/pub/roosen/thesis.ps.Z A short (6 page) proceedings paper on the same material is also available: ftp://playfair.stanford.edu/pub/roosen/asa95.ps.Z Charles Roosen charles at playfair.stanford.edu Department of Statistics http://playfair.stanford.edu/~roosen/ Stanford University

Title ----- Visualization and Exploration of High-Dimensional Functions Using the Functional ANOVA Decomposition

Abstract -------- In recent years the statistical and engineering communities have developed many high-dimensional methods for regression (e.g. MARS, feedforward neural networks, projection pursuit). Users of these methods often wish to explore how particular predictors affect the response. One way to do so is by decomposing the model into low-order components through a functional ANOVA decomposition and then visualizing the components. Such a decomposition, with the corresponding variance decomposition, also provides information on the importance of each predictor to the model, the importance of interactions, and the degree to which the model may be represented by first- and second-order components. This manuscript develops techniques for constructing and exploring such a decomposition.
It begins by suggesting techniques for constructing the decomposition either numerically or analytically, proceeds to describe approaches to plotting and interpreting effects, and then develops methodology for rough inference and model selection. Extensions to the GLM framework are discussed briefly. From mw at isds.Duke.EDU Tue Aug 22 15:30:51 1995 From: mw at isds.Duke.EDU (Mike West) Date: Tue, 22 Aug 1995 15:30:51 -0400 Subject: No subject Message-ID: 1996 JOINT STATISTICAL MEETINGS Chicago, August 4-8, 1996 ********************************************** ASA Section on BAYESIAN STATISTICAL SCIENCES ********************************************** CALL FOR PROPOSALS: SPECIAL CONTRIBUTED PAPER SESSIONS This is an initial call for proposals for Special Contributed Paper Sessions for the Section on Bayesian Statistical Sciences program at the 1996 Joint Meetings. We already have several proposals, at least in embryo, and are looking for many more. Please consider putting together a session, and alert others who may be interested in organising a session. We have a few months before formal sessions, tentative titles, etc., are needed, but it is not too soon to begin to think about session themes and to talk to potential participants. Those of you who have already contacted me with ideas should now be going the next step to confirm your topics and speakers, and let me have the revised session proposal. At this stage, suggestions and ideas for sessions need not identify a full list of speakers and discussants, but you should provide a general idea of the topic and focus, and some names at least tentatively agreed. The ASA theme for the 1996 meetings is "Challenging the Frontiers of Knowledge Using Statistical Science" and is intended to highlight new statistical developments at the forefront of the discipline -- theory, methods, applications, and cross-disciplinary activities. Accordingly, ASA will be designating selected sessions as "Theme" sessions. 
Suggestions for SBSS sessions obviously in tune with this theme, involving topics of real novelty and importance, new directions of development in Bayesian statistics, and reflecting the current vibrancy of the discipline, are particularly encouraged. And remember that other ASA sections can co-sponsor sessions; SBSS will be vigorously seeking co-sponsorship in many cases. *SPECIAL REQUEST* The Chicago meetings will likely attract well over 4,000 participants. Many will not be seriously attracted by many of the Bayesian talks; not because of Bayesian emphases, but because of technical/mathematical level and orientation. For example, there will be many participants from various corners of industry/commerce/government that do not have PhDs. We would like to reach these kinds of people. One way of attempting this is to have one or more sessions more or less targeted at the practising "applied" statistician and at a lower technical level than typical. One possibility is a panel discussion, involving participants with strong industrial/consulting/non-academic backgrounds who will discuss issues and experiences of practical Bayesian statistics in their areas. A variant would have several speakers making "case study" presentations. Your inputs are solicited. FORMAT: A Special Contributed Paper Session typically involves a collection of either five short talks (20mins), or four talks plus a discussion, on a specific theme. One alternative is to have a panel discussion involving up to five panelists. All sessions need a chair. Please contact me with suggestions and ideas for themes and speakers.
Mike West, 1996 SBSS Program Chair ISDS, Duke University, Durham, NC 27708-0251 mw at isds.duke.edu http://www.isds.duke.edu From hochreit at informatik.tu-muenchen.de Wed Aug 23 05:15:10 1995 From: hochreit at informatik.tu-muenchen.de (Josef Hochreiter) Date: Wed, 23 Aug 1995 11:15:10 +0200 Subject: TR announcement: Long Short Term Memory Message-ID: <95Aug23.111512+0200_met_dst.116236+322@papa.informatik.tu-muenchen.de> FTP-host: flop.informatik.tu-muenchen.de (131.159.8.35) FTP-filename: /pub/fki/fki-207-95.ps.gz or FTP-host: fava.idsia.ch (192.132.252.1) FTP-filename: /pub/juergen/fki-207-95.ps.gz or something like netscape http://www.idsia.ch/~juergen LONG SHORT TERM MEMORY Technical Report FKI-207-95 (8 pages, 50 K) Sepp Hochreiter (Fakultaet fuer Informatik, Technische Universitaet Muenchen, 80290 Muenchen, Germany) Juergen Schmidhuber (IDSIA, Corso Elvezia 36, 6900 Lugano, Switzerland) ``Recurrent backprop'' for learning to store information over extended time periods takes too long. The main reason is insufficient, decaying error back flow. We describe a novel, efficient ``Long Short Term Memory'' (LSTM) that overcomes this and related problems. Unlike previous approaches, LSTM can learn to bridge arbitrary time lags by enforcing constant error flow. Using gradient descent, LSTM explicitly learns when to store information and when to access it. In experimental comparisons with ``Real-Time Recurrent Learning'', ``Recurrent Cascade-Correlation'', ``Elman nets'', and ``Neural Sequence Chunking'', LSTM leads to many more successful runs, and learns much faster. Unlike its competitors, LSTM can solve tasks involving minimal time lags of more than 1000 time steps, even in noisy environments. If you don't have gzip/gunzip, we can mail you an uncompressed postscript version (as a last resort). Comments welcome.
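To make the memory-cell idea in this abstract concrete, here is a hedged numpy sketch of one forward pass through a small layer of gated cells: the state is updated additively (the "constant error flow" the TR enforces), an input gate decides when to store, and an output gate decides when to access. Sizes, weight names, and wiring are illustrative assumptions, not the exact FKI-207-95 architecture, and training is omitted.

```python
# Hedged sketch of an LSTM-style memory cell forward pass (numpy).
# Illustrative only: sizes, weights, and wiring are assumptions,
# not the exact FKI-207-95 architecture.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_cell = 4, 3
Wg = rng.normal(scale=0.1, size=(n_cell, n_in))  # cell-input weights
Wi = rng.normal(scale=0.1, size=(n_cell, n_in))  # input-gate weights
Wo = rng.normal(scale=0.1, size=(n_cell, n_in))  # output-gate weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def step(x, c):
    """One time step: gate the candidate input, update the state
    additively, gate the exposed output."""
    i = sigmoid(Wi @ x)      # input gate: learns when to store
    o = sigmoid(Wo @ x)      # output gate: learns when to access
    g = np.tanh(Wg @ x)      # candidate cell input
    c = c + i * g            # additive update -> non-decaying error path
    h = o * np.tanh(c)       # cell output
    return h, c

c = np.zeros(n_cell)
for _ in range(1000):        # state persists across a long lag
    x = rng.normal(size=n_in)
    h, c = step(x, c)
print(h.shape, c.shape)
```

Because the state enters each update through plain addition, backpropagated error along the state path neither vanishes nor explodes, which is what allows gradient descent to bridge lags of the length mentioned in the abstract.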
Sepp Hochreiter Juergen Schmidhuber From Voz at dice.ucl.ac.be Wed Aug 23 12:58:09 1995 From: Voz at dice.ucl.ac.be (Jean-Luc Voz) Date: Wed, 23 Aug 1995 18:58:09 +0200 Subject: ELENA Classification databases and technical reports available Message-ID: <199508231658.SAA11205@ns1.dice.ucl.ac.be> Dear colleagues, The partners of the Elena project are pleased to announce the availability of several databases related to classification together with two technical reports. ELENA is an ESPRIT III Basic Research Action project (No. 6891) From S.Renals at dcs.shef.ac.uk Wed Aug 23 14:11:06 1995 From: S.Renals at dcs.shef.ac.uk (S.Renals@dcs.shef.ac.uk) Date: Wed, 23 Aug 1995 19:11:06 +0100 Subject: AbbotDemo: Continuous Speech Recognition Available by FTP Message-ID: <199508231811.TAA11146@elvis.dcs.shef.ac.uk> AbbotDemo is a near-real-time speaker-independent continuous speech recognition system for American and British accented English. The vocabulary size of this demonstration system is 5,000 words. The system uses a hybrid recurrent network/hidden Markov model acoustic model and a trigram language model. The recurrent network (which contains around 100,000 weights) was trained as a phone probability estimator using back-propagation through time. It is available by FTP from svr-ftp.eng.cam.ac.uk in directory /pub/comp.speech/binaries. The file AbbotDemo.README gives more information on this system and the remainder of the files provide the executables for various flavours of UNIX (Linux, SunOS4, HP-UX, IRIX). A 16 bit soundcard and a reasonable microphone are required. The Linux version is also available by FTP from sunsite.unc.edu (and mirrors) in directory /pub/Linux/apps/sound/speech. Sorry, but at this stage no sources and only limited documentation are provided. Although the task domain is focused on noise-free read speech from a North American business newspaper (e.g.
the Wall Street Journal) we hope that this system provides a fair representation of the state of the art in large vocabulary speech recognition and that it will encourage the creation of novel applications. Tony Robinson (Cambridge University) Mike Hochberg (Cambridge University) Steve Renals (Sheffield University) and many many more. AbbotDemo URLs: ftp://svr-ftp.eng.cam.ac.uk/pub/comp.speech/binaries/ ftp://sunsite.unc.edu/pub/Linux/apps/sound/speech/ From wahba at stat.wisc.edu Wed Aug 23 15:19:00 1995 From: wahba at stat.wisc.edu (Grace Wahba) Date: Wed, 23 Aug 95 14:19:00 -0500 Subject: spline/ai/met paper ancts Message-ID: <9508231919.AA12253@hera.stat.wisc.edu> Announcing new/revised manuscripts available on the web .... http://www.stat.wisc.edu/~wahba click on `TRLIST' also available by ftp in files listed at the end of this msg Luo, Z. and Wahba, G. "Hybrid Adaptive Splines" TR 947, June 1995, submitted. --- Combines ideas from smoothing splines and MARS to obtain spatial adaptivity, numerical efficiency. Numerical comparisons to wavelet simulations in Donoho et al, JRSSB 57, 1995. Approach parallels Orr, Neural Computation 7, 1995, which we just became aware of; similar results on a different class of examples --. Wahba, G., Wang, Y., Gu, C., Klein, R. and Klein, B. "Smoothing Spline ANOVA for Exponential Families, with Application to the Wisconsin Epidemiological Study of Diabetic Retinopathy." May 1995, to appear, Annals of Statistics. --Expanded and *slightly revised* version of TR 940, December 1994. This paper was the basis for the Neyman Lecture given at the Annual Meeting of the Institute of Mathematical Statistics at Chapel Hill 1994, delivered by the first author. Collects results from SS-ANOVA, algorithm for choosing multiple smoothing parameters for Bernoulli data, implementing confidence intervals.
Applies results to the estimation of four-year risk of progression of diabetic retinopathy, using data from the Wisconsin Epidemiological Study of Diabetic Retinopathy. (data available in GRKPACK documentation above). ***A discussion of the backfitting algorithm and SS-ANOVA has been added***--. Xiang, D. and Wahba, G. "A Generalized Approximate Cross Validation for Smoothing Splines with Non-Gaussian Data." TR 930, September 1994, submitted. --Another look at choosing the regularization parameter for Bernoulli data. Parallels work of Moody, Liu in the NN literature. Wahba, G., Johnson, D. R., Gao, F. and Gong, J. "Adaptive tuning of numerical weather prediction models: Part I: randomized GCV and related methods in three and four dimensional data assimilation." TR 920, April 1994. *revised and shortened* version to appear, Monthly Weather Review --describes how to use the randomized trace method to implement GCV in very large problems in numerical weather prediction, approximate solution of pde's with discrete noisy observations. Also available by ftp in gzipped postscript in ftp.stat.wisc.edu/pub/wahba in the following files, respectively has.ps.gz exptl.ssanova.rev.ps.gz gacv.ps.gz tuning-nwp.rev.ps.gz From jstrout at UCSD.EDU Thu Aug 24 11:13:03 1995 From: jstrout at UCSD.EDU (Joseph Strout) Date: Thu, 24 Aug 1995 08:13:03 -0700 (PDT) Subject: Neuron Emulation List (ANNOUNCEMENT) Message-ID: ANNOUNCEMENT: Neuron Emulation List ----------------------------------- This message is to announce the opening of a new mailing list. Its purpose is the discussion of "neuron emulation" -- that is, the endeavor to recreate (in hardware or software) the functional properties of a particular neuron or small network of neurons. This is in contrast to the more common neural model, which recreates the general properties of an entire class of cells. Upon subscribing to the list, you will receive two copies of a document which is now out of date.
The revised versions will probably not be posted until next week, due to personnel difficulties [read: postmaster on vacation]. However, the latest versions can be obtained from the Web site (below). Also at the web site is an archive of previous messages; the list has been open privately for a few weeks while we worked out bugs with the listserver. It seems to be running smoothly now, and as interest has been spreading, it seemed prudent to make the formal announcement. The NEL Web Site can be accessed through the following URL: http://sdcc3.ucsd.edu/~jstrout/nel/ Full instructions on using the list server, as well as etiquette guidelines and a preliminary FAQ, are available at that site. If you do not have access to the World Wide Web, contact me directly and I will forward you the relevant files. ,------------------------------------------------------------------. | Joseph J. Strout Department of Neuroscience, UCSD | | jstrout at ucsd.edu http://sdcc3.ucsd.edu/~jstrout/ | `------------------------------------------------------------------' From steven.young at psy.ox.ac.uk Fri Aug 25 09:01:24 1995 From: steven.young at psy.ox.ac.uk (Steven Young) Date: Fri, 25 Aug 1995 14:01:24 +0100 (BST) Subject: Oxford Summer School on Connectionist Modelling Message-ID: <199508251301.OAA00656@cogsci1.psych.ox.ac.uk> This is a last minute call for participation for the Oxford Summer School on Connectionist Modelling. (This is extremely short notice. Apologies) Full details are given below. There are only 3 places left for this year's Summer School. The Summer School is a 2-week residential summer school which provides an introduction to connectionist modelling. Please pass on this information to people you may know who would be interested. The initial call for participation follows, and further information is available via the World Wide Web at the following URL: http://cogsci1.psych.ox.ac.uk/summer-school/ (Please ignore the closing date in the initial call for participation.
If you know of people who might wish to attend then they could contact Sue King via email or phone -- details below). Here is the initial call for participation: Oxford Summer School on Connectionist Modelling Department of Experimental Psychology, University of Oxford 10-22 September 1995 Applications are invited for participation in a 2-week residential Summer School on techniques in connectionist modelling of cognitive and biological phenomena. The course is aimed primarily at researchers who wish to exploit neural network models in their teaching and/or research. It will provide a general introduction to connectionist modelling through lectures and exercises on Power PCs. The instructors with primary responsibility for teaching the course are Kim Plunkett and Edmund Rolls. No prior knowledge of computational modelling will be required though simple word processing skills will be assumed. Participants will be encouraged to start work on their own modelling projects during the Summer School. The cost of participation in the Summer School is GBP 500, to include accommodation (bed and breakfast at St. John's College) and registration. Participants will be expected to cover their own travel and meal costs. A small number of graduate student scholarships may be available. Applicants should indicate whether they wish to be considered for a graduate student scholarship but are advised to seek their own funding as well, since in previous years the number of graduate student applications has far exceeded the number of scholarships available. If you are interested in participating in the Summer School, please contact: Mrs Sue King Department of Experimental Psychology South Parks Road University of Oxford Oxford OX1 3UD Tel: +44 (1865) 271 353 Email: sking at psy.ox.ac.uk Please send a brief description of your background with an explanation of why you would like to attend the Summer School (one page maximum) no later than 31st January 1995.
Regards, Steven Young -- Facility for Computational Modelling in Cognitive Science McDonnell-Pew Centre for Cognitive Neuroscience, Oxford Department of Experimental Psychology, University of Oxford From stokely at atax.eng.uab.edu Fri Aug 25 16:02:51 1995 From: stokely at atax.eng.uab.edu (Ernest Stokely) Date: Fri, 25 Aug 1995 15:02:51 -0500 Subject: Faculty Position in Biomed. Engr. Message-ID: Tenure Track Faculty Position -------------------------- The Department of Biomedical Engineering at the University of Alabama at Birmingham has an opening for a tenure-track faculty member. The opening is being filled as part of a Whitaker Foundation Special Opportunities Award for a training and research program in functional and structural imaging of the brain. Candidates are particularly sought in neurosystems modelling, biological neural networks, computational neurobiology, or other multidisciplinary areas that combine neurobiology and imaging. More senior candidates and candidates with established research programs are of particular interest, in order that the UAB program can be quickly established. The person selected for this position will be expected to form active research collaborations with other units in the medical center part of the UAB campus. Candidates should have a Ph.D. degree in an appropriate field, and must be a U.S. citizen or have permanent residency in the U.S. The search will be continued until the position is filled. UAB is an autonomous campus within the University of Alabama system. UAB faculty currently are involved in over $180 million of externally funded grants and contracts. A 4.1 Tesla clinical NMR facility for cardiovascular and brain research, several other small-bore MR systems, a Philips Gyroscan system, and a team of research scientists and engineers working in various aspects of MR imaging and spectroscopy are housed only 200 meters from the School of Engineering.
In addition, the brain imaging project will involve the opportunity for collaborations with members of the Neurobiology Research Center, a new Cognitive Science Graduate Program, as well as faculty members from the Departments of Neurology, Psychiatry, and Radiology. To apply, send a letter of application, a current curriculum vitae, and three letters of reference to Dr. Ernest Stokely, Department of Biomedical Engineering, BEC 256, University of Alabama at Birmingham, Birmingham, Alabama 35294-4461. The University of Alabama at Birmingham is an equal opportunity, affirmative action employer, and encourages applications from women and minorities. Ernest Stokely Chair, Department of Biomedical Engineering BEC 256 University of Alabama at Birmingham Birmingham, Alabama 35294-4461 Internet: stokely at atax.eng.uab.edu FAX: (205) 975-4919 Phone: (205) 934-8420 From baluja at GS93.SP.CS.CMU.EDU Tue Aug 29 00:08:39 1995 From: baluja at GS93.SP.CS.CMU.EDU (Shumeet Baluja) Date: Tue, 29 Aug 95 00:08:39 EDT Subject: Face Detection Test Set Available Message-ID: In response to our recently announced paper on face detection, we have received several requests for our test images. In order to facilitate accurate comparisons, we have made our images available on-line. *** We recommend that you use these images for testing purposes only. Training systems based on these images decreases the value of this database as an independent test set. *** Data Set Specifics: 42 Compuserve GIF format gray-scale images. 169 Labeled Face Locations. Our work concentrated on finding frontal views of faces in scenes, so the list of faces includes mainly those faces looking towards the camera; extreme side views are ignored. These faces vary in size from ~20 to several hundred pixels. Image lighting/noise/graininess/contrast also varies; images were taken from CCD cameras, scanned photographs, scanned magazine & newspaper pictures, and hand-drawn images.
To get the images: http://www.ius.cs.cmu.edu/IUS/dylan_usr0/har/faces/test/index.html A demo of our system, and copies of our paper, can be found at: http://www.cs.cmu.edu/~baluja http://www.cs.cmu.edu/~har If you use this database, please let us know. Send any questions/comments to: shumeet baluja (baluja at cs.cmu.edu) henry rowley (har at cs.cmu.edu) From dhw at santafe.edu Wed Aug 30 20:46:30 1995 From: dhw at santafe.edu (David Wolpert) Date: Wed, 30 Aug 95 18:46:30 MDT Subject: Paper announcement Message-ID: <9508310046.AA17730@sfi.santafe.edu> *** PAPER ANNOUNCEMENT *** ON BIAS PLUS VARIANCE by D. Wolpert Abstract: This paper presents a Bayesian additive "correction" to the familiar quadratic loss bias-plus-variance formula. It then discusses some other loss-function-specific aspects of supervised learning. It ends by presenting a version of the bias-plus-variance formula appropriate for log loss, and then the Bayesian additive correction to that formula. Both the quadratic loss and log loss correction terms are a covariance, between the learning algorithm and the posterior distribution over targets. Accordingly, in the context in which those terms apply, there is not a "bias-variance trade-off", or a "bias-variance dilemma", as one often hears. Rather there is a bias-variance-covariance trade-off. This tech report is retrievable by anonymous ftp to ftp.santafe.edu. Go to pub/dhw_ftp, and retrieve either bias.plus.ps.Z or bias.plus.Z.encoded for compressed or uuencoded compressed postscript, respectively. Comments welcomed.
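As background to the abstract above, the familiar quadratic-loss decomposition that the paper corrects can be checked numerically. This sketch verifies E[(yhat - f)^2] = bias^2 + variance for a toy shrinkage estimator of a fixed target; the setup and all names are illustrative assumptions, and the paper's Bayesian covariance term is not reproduced here.

```python
# Monte Carlo check of the quadratic-loss bias-plus-variance identity
# for a toy shrinkage estimator. Illustrative setup, not from the paper.
import numpy as np

rng = np.random.default_rng(1)
f_true = 2.0      # fixed target value
sigma = 1.0       # observation noise level
shrink = 0.7      # estimator shrinks the observation toward zero

n = 200_000
y = f_true + sigma * rng.normal(size=n)  # repeated noisy observations
yhat = shrink * y                        # the estimator, one per draw

mse = np.mean((yhat - f_true) ** 2)
bias = np.mean(yhat) - f_true            # approx (shrink - 1) * f_true
var = np.var(yhat)
print(mse, bias**2 + var)                # the two agree up to float error
```

For quadratic loss the identity holds exactly in expectation; Wolpert's point is that in the Bayesian setting an additional covariance term between the learning algorithm and the posterior over targets enters alongside these two.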