From ZECCHINA at to.infn.it Mon Jul 1 06:10:42 1996 From: ZECCHINA at to.infn.it (Riccardo Zecchina - tel.11-5647358, fax. 11-5647399) Date: Mon, 1 Jul 1996 12:10:42 +0200 (MET-DST) Subject: Paper available on the Random K-Satisfiability Problem Message-ID: <960701121042.28201d6e@to.infn.it> The following paper on algorithmic complexity is available by FTP. STATISTICAL MECHANICS OF THE RANDOM K-SATISFIABILITY PROBLEM by Remi Monasson and Riccardo Zecchina. ABSTRACT: The Random K-Satisfiability Problem, consisting in verifying the existence of an assignment of $N$ Boolean variables that satisfies a set of $M=\alpha N$ random logical clauses containing $K$ variables each, is studied using the replica symmetric framework of disordered systems. The detailed structure of the analytical solution is discussed for the different cases of interest $K=2$, $K\ge 3$ and $K\gg 1$. We present an iterative scheme allowing one to obtain exact and systematically improved solutions for the replica symmetric functional order parameter. The calculation of the number of solutions, which allowed us [Phys. Rev. Lett. 76, 3881 (1996)] to predict a first-order jump at the threshold where the Boolean expressions become unsatisfiable with probability one, is presented in detail. In the case $K=2$, the (rigorously known) critical value ($=1$) of the number of clauses per Boolean variable is recovered, while for $K\ge 3$ we show that the system exhibits a replica symmetry breaking transition. The annealed approximation is proven to be exact for large $K$. (30 pages + 8 figures) Retrieval information: FTP-host: ftp.polito.it FTP-pathname: /pub/people/zecchina/tarksat.gz URL: ftp://ftp.polito.it/pub/people/zecchina From matteo at nwu.edu Mon Jul 1 16:56:26 1996 From: matteo at nwu.edu (Matteo Carandini) Date: Mon, 1 Jul 1996 15:56:26 -0500 Subject: Symposium on Orientation Selectivity Message-ID: A Satellite Symposium to the 1996 Computation and Neural Systems CNS*96 Conference **** ORIENTATION SELECTIVITY IN V1. IS AN AGREEMENT POSSIBLE? ***** Wednesday, July 17, 7-10 pm Bldg. E 25, Room 401 Department of Brain and Cognitive Sciences, MIT Organized by Matteo Carandini (Northwestern) and David Somers (MIT) There is currently no agreement over whether the orientation selectivity of cells in the primary visual cortex results from the feed-forward arrangement of subcortical inputs or from intracortical feedback. This debate has gone on for about 30 years, and has recently been heated by the modeling work of Somers et al. (J Neurosci 95), Douglas et al. (Science 95), Suarez et al. (J Neurosci 95) and Ben-Yishai et al. (PNAS 95), and somewhat cooled by the experimental results of Ferster et al. (Nature 96) and of Reid and Alonso (Nature 95). We encourage those with active research interests in this topic to attend, but also welcome those with more casual interest. We aim to stimulate discussion on the following issues: - The available evidence. The two sides in the debate sometimes cite the same references for opposite reasons. Let's discuss the evidence and decide what models are consistent with it. - The level of modeling. The existing models range from the very detailed (e.g. Somers, Suarez) to the very simplified (e.g. Douglas, Ben-Yishai). What level of complexity should be achieved by a satisfactory model of orientation selectivity? - The ideal evidence. What would constitute unequivocal evidence against one of the two views? Is this evidence available or should new experiments be designed?
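As a crude point of reference for the first two issues, the toy Python sketch below contrasts a broad, purely feed-forward tuning curve with the same feed-forward drive sharpened by recurrent ring-type interactions. It is a caricature with entirely arbitrary, assumed parameters, and it does not implement any of the models cited above.

import numpy as np

# Caricature of the two positions in the debate: the same broad feed-forward
# drive is read out directly, or after recurrent "ring" interactions.
# All parameter values here are arbitrary assumptions.

thetas = np.linspace(-90.0, 90.0, 181)          # preferred orientations (deg)
ff = np.exp(-0.5 * (thetas / 35.0) ** 2)        # broad feed-forward tuning

def ring_weights(sigma_e=15.0, sigma_i=60.0, w_e=1.2, w_i=1.0):
    d = thetas[:, None] - thetas[None, :]
    d = (d + 90.0) % 180.0 - 90.0               # wrap orientation differences
    k = w_e * np.exp(-0.5 * (d / sigma_e) ** 2) - w_i * np.exp(-0.5 * (d / sigma_i) ** 2)
    return k / thetas.size

W = ring_weights()
r = ff.copy()
for _ in range(300):                            # relax to a fixed point
    r = np.maximum(0.0, ff + W @ r)             # rectified linear dynamics

def half_width(resp):
    above = thetas[resp > 0.5 * resp.max()]
    return (above.max() - above.min()) / 2.0

print("half-width at half-height, feed-forward only:", half_width(ff))
print("half-width at half-height, with recurrence  :", half_width(r))

Running the sketch merely shows the tuning half-width shrinking once recurrence is added; which description actually fits V1 is the empirical question to be debated.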
*** Maximal participation by the audience will be encouraged. *** We have invited speakers with widely different opinions: - David Ferster (Northwestern) - Gary Holt (Caltech) - Xing Pei (Missouri) - Clay Reid (Harvard) - Dario Ringach (NYU) - Haim Sompolinsky (Hebrew U) A sizeable portion of the time will be spent in a free discussion. People with a strong interest in the topic, such as Bob Shapley (NYU), Mriganka Sur (MIT), and Sacha Nelson (Brandeis), are expected to participate. --------------------------------------------------------------------------- Info on CNS*96 at http://www.bbb.caltech.edu/cns96/cns96.html From mccallum at cs.rochester.edu Mon Jul 1 19:29:55 1996 From: mccallum at cs.rochester.edu (Andrew McCallum) Date: Mon, 01 Jul 1996 19:29:55 -0400 Subject: Paper on RL, exploration, hidden state Message-ID: <199607012329.TAA06593@slate.cs.rochester.edu> The following paper on reinforcement learning, hidden state and exploration is available by FTP. Comments and suggestions are welcome. "Efficient Exploration in Reinforcement Learning with Hidden State" Andrew Kachites McCallum (submitted to NIPS) Abstract Undoubtedly, efficient exploration is crucial for the success of a learning agent. Previous approaches to directed exploration in reinforcement learning exclusively address exploration in Markovian domains, i.e. domains in which the state of the environment is fully observable. If the environment is only partially observable, they cease to work because exploration statistics are confounded between aliased world states. This paper presents Fringe Exploration, a technique for efficient exploration in partially observable domains. The key idea (applicable to many exploration techniques) is to keep statistics in the space of possible short-term memories, instead of in the agent's current state space. Experimental results in a partially observable maze and in a difficult driving task with visual routines show dramatic performance improvements. Retrieval information: FTP-host: ftp.cs.rochester.edu FTP-pathname: /pub/papers/robotics/96.mccallum-nips.ps.gz URL: ftp://ftp.cs.rochester.edu/pub/papers/robotics/96.mccallum-nips.ps.gz From ndxdpran at rrzn-user.uni-hannover.de Tue Jul 2 05:36:47 1996 From: ndxdpran at rrzn-user.uni-hannover.de (ndxdpran@rrzn-user.uni-hannover.de) Date: Tue, 2 Jul 1996 11:36:47 +0200 (MET DST) Subject: ICOBIP'97 announcement Message-ID: <199607020936.LAA06163@sun1.rrzn-user.uni-hannover.de> A non-text attachment was scrubbed... Name: not available Type: text Size: 1157 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/02420ef8/attachment.ksh From pierre at mbfys.kun.nl Tue Jul 2 11:04:50 1996 From: pierre at mbfys.kun.nl (Piërre van de Laar) Date: Tue, 02 Jul 1996 17:04:50 +0200 Subject: sensitivity analysis and relevance Message-ID: <31D93A92.15FB7483@mbfys.kun.nl> Dear Connectionists, On my request for references to methods which perform sensitivity analysis and/or relevance determination of input fields, and especially methods which use neural networks, I received a large number of reactions with an even larger number of references. Due to the large size of the resulting list of references, I will not post it. People interested in this list of references can download it in bibtex, refer, or html format from ftp.mbfys.kun.nl in the directory snn/pub/pierre as the files connectionists.bib, connectionists.refer, or connectionists.html, respectively.
The URL for the HTML format is thus ftp://ftp.mbfys.kun.nl/snn/pub/pierre/connectionists.html Once again, I would like to thank all people who sent their references about these topics to me. Greetings, -- Piërre van de Laar Department of Medical Physics and Biophysics, University of Nijmegen, The Netherlands http://www.mbfys.kun.nl/~pierre/ mailto:pierre at mbfys.kun.nl P.S. New references are, of course, still welcome. From psarroa at westminster.ac.uk Tue Jul 2 04:41:30 1996 From: psarroa at westminster.ac.uk (Alexandra Psarrou) Date: Tue, 2 Jul 1996 09:41:30 +0100 (BST) Subject: Research post in Face Recognition Message-ID: <199607020841.JAA27566@jaguar.wmin.ac.uk> Research Post in Face Recognition CENTRE FOR ARTIFICIAL INTELLIGENCE RESEARCH Sir George Cayley Research Institute University of Westminster Applications are invited for the position of a Research Assistant in the Centre for AI Research of the University of Westminster to work in a one-year research project in Machine Vision and Neural Networks. The successful candidate will undertake research in the area of Dynamic Face Recognition. The aim of this project is to exploit existing machine vision and neural network techniques for developing a framework for dynamic face recognition based on photometric representations. Applicants for this post should be educated to degree level within a relevant discipline (preferably computer science), and possess a working knowledge of C/C++/X-Windows on Unix platforms. Knowledge of image processing and neural network techniques will be an advantage. The post is available from July 1996 and the person appointed will be expected to start as soon as possible. Salary scales: Research A: 11,388 - 15,026 pounds sterling p.a. (including London allowance) - for more information about the project phone/email or send your CV to: Dr. Alexandra Psarrou School of Computer Science and Information Systems Engineering University of Westminster 115 New Cavendish Str London W1M 8JS Tel: (+44) - 171-911-5000 ext 3599 Fax: (+44) - 171-911-5089 Email: psarroa at westminster.ac.uk - for more information about the AI Research centre check our web page: http://www.scsise.wmin.ac.uk/AI/AI_Division.html From ndxdpran at rrzn-user.uni-hannover.de Wed Jul 3 05:49:54 1996 From: ndxdpran at rrzn-user.uni-hannover.de (ndxdpran@rrzn-user.uni-hannover.de) Date: Wed, 3 Jul 1996 11:49:54 +0200 (MET DST) Subject: ICOBIP'97 - correction of WWW home page Message-ID: <199607030949.LAA18829@sun1.rrzn-user.uni-hannover.de> A non-text attachment was scrubbed... Name: not available Type: text Size: 1160 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/3d993bec/attachment.ksh From c.k.i.williams at aston.ac.uk Wed Jul 3 09:53:50 1996 From: c.k.i.williams at aston.ac.uk (Chris Williams) Date: Wed, 03 Jul 1996 15:53:50 +0200 Subject: Post-doc research position at Aston University, England Message-ID: <10110.199607031353@sun.aston.ac.uk> Postdoctoral Research Fellowship at Aston University, England Combining Spatially-Distributed Predictions from Neural Networks The Neural Computing Research Group at Aston is looking for a highly motivated individual for a 2-year postdoctoral research position in the area of "Combining Spatially-Distributed Predictions from Neural Networks", working with Dr. Chris Williams and Dr. Ian Nabney.
The aim of this project is to develop methods for the fusion of spatially-distributed predictions from neural networks with prior knowledge about possible spatial patterns. This post is funded by a grant from the Engineering and Physical Sciences Research Council (UK), in collaboration with British Aerospace and the Meteorological Office. Potential candidates should have strong mathematical and computational skills, with a background in one or more of neural networks, Bayesian belief networks, and statistical Markov chain Monte Carlo computation. Neural networks have been used very successfully in a wide variety of domains for performing classification or regression tasks. A characteristic of most currently successful applications is that the input patterns are either independent (as in static pattern classification) or related over time, rather than being spatially distributed. To extend the use of neural networks to spatially distributed tasks, such as the prediction of a wind vector-field from remote-sensing data, typically it is necessary to combine local bottom-up predictions (wind vector predictions on a pixel-by-pixel basis) with global prior knowledge (typical wind-field configurations, including weather fronts). This combination can be achieved by using Bayes' theorem to obtain the posterior distribution for the features of interest (the wind-field). The project will apply this framework in the areas of remote sensing, the segmentation of images, and object recognition. Closing date: 29 July, 1996. Informal enquiries can be made by email to Chris Williams (C.K.I.Williams at aston.ac.uk) or to Ian Nabney (I.T.Nabney at aston.ac.uk). The target start date is October 1996, although this may be somewhat flexible. More information on the Neural Computing Research Group and the postdoc position is available from http://www.ncrg.aston.ac.uk/ Salaries will be up to point 6 on the RA 1A scale, currently 15,986 UK pounds. These salary scales are subject to annual increments. If you wish to be considered for this position, please send a full CV and publications list, together with the names of 3 referees, to: Dr. Chris Williams Neural Computing Research Group Department of Computer Science and Applied Mathematics Aston University Birmingham B4 7ET, U.K. Tel: +44 121 333 4631 Fax: +44 121 333 4586 e-mail: C.K.I.Williams at aston.ac.uk (email submission of postscript files is welcome) From mrj at dcs.ed.ac.uk Wed Jul 3 10:43:08 1996 From: mrj at dcs.ed.ac.uk (Mark Jerrum) Date: Wed, 3 Jul 1996 15:43:08 +0100 Subject: ICMS Workshop on the Vapnik-Chervonenkis Dimension, Edinburgh Message-ID: <17667.9607031443@ox.dcs.ed.ac.uk> [For distribution on the connectionist mailing list: thanks!] ************************************************************** *** *** *** ICMS WORKSHOP on the VAPNIK-CHERVONENKIS DIMENSION *** *** Edinburgh, 9th--13th September 1996 *** *** *** *** An interdisciplinary meeting of interest to *** *** probabilists, statisticians, theoretical computer *** *** scientists, and the machine learning community *** *** *** ************************************************************** The International Centre for Mathematical Sciences (ICMS) at Edinburgh will hold a Workshop on the Vapnik-Chervonenkis Dimension(*) in the week 9th--13th September 1996. The workshop will take place at the ICMS's headquarters at 14 India Street, Edinburgh, the birthplace of James Clerk Maxwell, which has recently been adapted to support meetings with about 50 participants.
We (the organisers of the workshop) envisage a multidisciplinary meeting covering the topic in all its aspects: probability and statistics, computational learning theory, geometry, and applications in computer science. The following invited speakers have agreed to participate: Shai Ben-David, Technion, Haifa, Israel; David Haussler, University of California at Santa Cruz, USA; Jiri Matousek, Charles University, Prague, Czech Republic; V. N. Vapnik, AT&T Bell Laboratories, Holmdel, NJ, USA. A registration form is available from the workshop's WWW page at http://www.dcs.ed.ac.uk/~mrj/VCWorkshop/ (also accessible from the ICMS home page). Alternatively, intending participants may contact the ICMS by post or e-mail: Margaret Cook ICMS 14 India Street Edinburgh EH3 6EZ Scotland Phone: +44 (0)131-220-1777 Fax: +44 (0)131-220-1053 E-mail: icms at maths.ed.ac.uk Those interested in participating should return the registration form as soon as possible, as the total number of places is limited by the size of the venue. There will be ample scope for contributed talks. Mark Jerrum, Angus MacIntyre, and John Shawe-Taylor (Workshop organisers) (*) The Vapnik-Chervonenkis (VC) dimension is a combinatorial parameter of a set system (equivalently, of a class of predicates) which, informally, can be said to characterise the expressibility of the class. This parameter is of great significance in a wide range of applications: in statistics, theoretical computer science, and machine learning, for example. In statistics, one may identify ``set'' with ``event,'' in which case finite VC dimension entails a _uniform_ analogue of the strong law of large numbers for the class of events in question. (This is the situation described by the phrase ``uniform convergence of empirical measure.'') In learning theory (the mathematical theory of inductive inference), one may identify ``set'' with ``concept,'' in which case the VC dimension of the concept class gives quite tight bounds on the sample size that is necessary and sufficient for a learner to form an accurate hypothesis from classified examples. From chandler at kryton.ntu.ac.uk Thu Jul 4 06:03:27 1996 From: chandler at kryton.ntu.ac.uk (chandler) Date: Thu, 4 Jul 1996 10:03:27 +0000 Subject: PhD Research Positions available, Nottingham England Message-ID: <9607040903.AA01426@kryton.ntu.ac.uk> VACANCIES Research (2 Bursary Students) Object Recognition for Assembly Human Centred Assembly Introduction The Manufacturing Automation Research Group (MARG) has been working on the development of Artificial Intelligence techniques for the recognition of 3-D objects. The aim of this work is to develop a system for the recognition of solid objects independent of their position and orientation within the work domain. This is to aid in the manipulation of objects within a robotic cell, particularly for the processes of assembly and other manipulative tasks. In the course of this work, a novel method using ANN, with parallels to the processing within the primate visual system, has been used. The Programme Object Recognition for Assembly The aim of this project is to improve the fundamental understanding of object recognition for use in assembly processes. The major area of the research will be the implementation of novel techniques of object recognition, providing position and rotation parameters to enable assembly tasks to be executed.
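By way of background to the position and rotation parameters mentioned above, descriptors that are independent of an object's position and orientation can be illustrated with classical moment invariants. The Python sketch below is generic textbook material applied to hypothetical toy images; it is not the ANN-based method developed by the group.

import numpy as np

# First two Hu moment invariants: translation-, scale- and rotation-invariant
# shape descriptors computed from a 2-D intensity image (textbook material,
# illustrative only; the toy images below are made up).

def hu_first_two(image):
    ys, xs = np.mgrid[0:image.shape[0], 0:image.shape[1]]
    m00 = image.sum()
    xc, yc = (xs * image).sum() / m00, (ys * image).sum() / m00

    def mu(p, q):                      # central moments (translation-invariant)
        return ((xs - xc) ** p * (ys - yc) ** q * image).sum()

    def eta(p, q):                     # normalised central moments (scale-invariant)
        return mu(p, q) / m00 ** (1 + (p + q) / 2.0)

    phi1 = eta(2, 0) + eta(0, 2)       # rotation-invariant combinations
    phi2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4.0 * eta(1, 1) ** 2
    return phi1, phi2

bar = np.zeros((64, 64)); bar[20:44, 30:34] = 1.0          # vertical bar
rot = np.zeros((64, 64)); rot[10:14, 5:29] = 1.0           # shifted, rotated copy
print(hu_first_two(bar))
print(hu_first_two(rot))

The two print statements give numerically identical values for the bar and for its shifted, 90-degree-rotated copy, which is exactly the property an invariant recognition front end relies on.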
The ANN techniques already developed in-house will be extended and integrated with the robot, providing invariant object recognition capability to the system. Additionally, the geometric descriptors will be assessed for their validity/accuracy. Task-level robotic operations can then use these descriptors as a basis for further actions. Human Centred Assembly Whilst the sections of the research programme described above will provide both novel and effective robotic assembly, this section of the work seeks to draw the maximum knowledge from existing manual methods. It therefore provides an effective link, drawing knowledge from the manual operation and contributing to the learning of a manipulative skill by a machine. The work will analyse human-centred assembly strategies and contrast them with automation techniques. Methods which are applicable to sensory-challenged human assembly will be interpreted and applied to the sensor-equipped robot. Applications are invited from Graduates (Engineering, Science) with good classifications. Bursary: 6,000 UKP per annum for 3 years. Applicants will be expected to register for a PhD programme. References 1) Keat J, Balendran V, Sivayoganathan K. 1995. Invariant Object Recognition with a Neurobiological Slant, Proceedings of the Fourth IEE International Conference on Artificial Neural Networks, Cambridge. 2) Keat J, Balendran V, Sivayoganathan K, Sackfield A. "IvOR: A Neurobiologically slanted approach to PSRI Object recognition", 5th Irish Neural Network Conference - INNC95, September 11-13, 1995, pp. 30-37, Maynooth, Ireland. 3) Howarth M, Sivayoganathan K, Thomas P, Gentle C.R., "Robotic task level programming using neural networks", 4th Int. Conf. on Artificial Neural Networks, 26-28 June 1995, pp. 262-267, Churchill College, Cambridge. 4) Balendran V, Sivayoganathan K, Al-Dabass D. 1989. Detection of flaws on slowly varying surfaces, Proceedings of the Fifth National Conference on Production Research, London, Kogan Press, pp. 82-85. 5) Keat J, Balendran V, Sivayoganathan K, Sackfield A. 1994. 3-D data collection for object recognition, Advances in Manufacturing Technology VIII, Proceedings of the Tenth National Conference on Manufacturing Research, Loughborough, pp. 648-652. Contact: Dr. K. Sivayoganathan man3sivayk at ntu.ac.uk tel: +44(0)115 941 8418 ex 4112. Dr. S. Kennedy man3kennesj at ntu.ac.uk tel: +44(0)115 941 8418 ex 4106. Mr. M. Howarth m.howarth at marg.ntu.ac.uk tel: +44(0)115 941 8418 ex 4110. Manufacturing Automation Research Group Department of Manufacturing Engineering, Burton Street, Nottingham, NG1 4BU. fax: +44(0)115 941 4024 From kevin.swingler at psych.stir.ac.uk Thu Jul 4 12:42:54 1996 From: kevin.swingler at psych.stir.ac.uk (Kevin Swingler) Date: Thu, 4 Jul 96 12:42:54 BST Subject: New Book Applying Neural Networks Message-ID: <9607041142.AA21529@nevis.stir.ac.uk> *********************************************************************** NEW BOOK ANNOUNCEMENT Applying Neural Networks A Practical Guide Kevin Swingler Academic Press. ISBN: 0126791708 *********************************************************************** Description This book takes the most common neural network architecture--the multi-layer perceptron--and leads the reader through every step of the development of a trained network. Chapters cover data collection, quantity, quality, validation, preparation and encoding; network architecture, size and training; error analysis; network validation, confidence limits, sensitivity measures and rule derivation.
The book also covers novelty detection and time series analysis. The book presents a set of procedures designed to ensure successful network development and concludes with a set of demonstration chapters on the application of neural networks to signal processing, financial analysis and process control. Each chapter is divided into three sections: a general discussion without equations, a how-to-do-it section where equations and algorithms are laid out, and a set of worked examples. The book also comes with a disk of C and C++ programs which implement the techniques discussed. Ordering Applying Neural Networks may be ordered directly from Academic Press or from your usual retail outlets. From j.b.rogers at ic.ac.uk Fri Jul 5 05:43:07 1996 From: j.b.rogers at ic.ac.uk (j.b.rogers@ic.ac.uk) Date: Fri, 5 Jul 1996 10:43:07 +0100 (BST) Subject: NN course announcement Message-ID: <27816.9607050943@london.ee.ic.ac.uk.ee.ic.ac.uk> A non-text attachment was scrubbed... Name: not available Type: text Size: 2452 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/92a2d868/attachment.ksh From srx014 at coventry.ac.uk Mon Jul 8 07:57:04 1996 From: srx014 at coventry.ac.uk (Colin Reeves) Date: Mon, 8 Jul 1996 12:57:04 +0100 (BST) Subject: Research studentship at Coventry University In-Reply-To: <344.836193914@B.GP.CS.CMU.EDU> Message-ID: The following PhD studentship is available. Please send CVs either electronically or by snail-mail (address at foot of this message). ------------------------------------------------------------------------- Project ------- The application of artificial intelligence to the control of large, complex gas transmission systems. Control Theory and Applications Centre, and British Gas TransCo, Coventry University System Control, Hinckley Background ---------- British Gas TransCo is responsible for the transportation and storage of gas produced by offshore fields. Gas is delivered at 6 coastal terminals where it enters the TransCo pipeline system. System Control manages and controls the flow of gas from these terminals to 18 million consumers. Following a recent re-organisation, there are 4 Area Control Centres (ACCs), at each of which 6 teams of 4 engineers are responsible for the operation of their Local Delivery Zones (LDZs). In order to provide a secure and economic gas transportation service, the ACC engineers need to develop best practices and apply these consistently to the operation of the LDZs. This project will investigate the possible application of Artificial Intelligence techniques to support the operation of these large, complex gas transmission systems. Methodology ----------- The Control Theory and Applications Centre at Coventry University has successfully applied AI methods, including neural nets, fuzzy systems, genetic algorithms, etc., to complex control problems. It is intended in this project to extend these approaches to the gas pipeline systems of TransCo. This will involve firstly the selection (in close collaboration with TransCo System Control) of a suitable subsystem for a feasibility study. Historical records of this subsystem will be used to identify important variables, and then to extract knowledge in the form of rules. Knowledge elicitation from the experts (the operations engineers) will also be carried out. Probable techniques include the use of neuro-fuzzy methods and the application of genetic algorithms or other heuristics in mining the data archives.
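As an illustration of the kind of heuristic search mentioned in the last sentence, the Python sketch below uses a simple genetic algorithm to hunt for a small subset of recorded variables that predicts a target quantity well. The data, the least-squares scoring function and all parameters are hypothetical stand-ins assumed for the example; they are not the TransCo archives or the Centre's actual methods.

import numpy as np

rng = np.random.default_rng(0)

# Stand-in for historical records: 500 samples of 20 variables, with the
# target driven by variables 3, 7 and 12 plus noise (entirely made up).
n_samples, n_vars = 500, 20
X = rng.normal(size=(n_samples, n_vars))
y = X[:, 3] - 2.0 * X[:, 7] + 0.5 * X[:, 12] + 0.1 * rng.normal(size=n_samples)

def fitness(mask):
    """Score a variable subset by least-squares fit, penalising large subsets."""
    if not mask.any():
        return -np.inf
    coef, *_ = np.linalg.lstsq(X[:, mask], y, rcond=None)
    err = ((X[:, mask] @ coef - y) ** 2).mean()
    return -err - 0.01 * mask.sum()

pop = rng.random((30, n_vars)) < 0.5              # population of variable subsets
for generation in range(40):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-10:]]       # keep the fittest subsets
    children = []
    while len(children) < len(pop) - len(parents):
        a, b = parents[rng.integers(len(parents), size=2)]
        cut = rng.integers(1, n_vars)
        child = np.concatenate([a[:cut], b[cut:]])          # one-point crossover
        flip = rng.random(n_vars) < 0.05                    # mutation
        children.append(np.where(flip, ~child, child))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected variables:", np.flatnonzero(best))

In the project itself the scoring function would of course be replaced by whatever neuro-fuzzy or rule-extraction model is actually fitted to the selected variables.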
At appropriate times the student will be required to present a competent report to TransCo. In addition, there will be opportunities to present the work at conferences or to publish through technical journals. The successful candidate will register for a PhD, working under the supervision of Colin Reeves at Coventry University. Candidate profile ----------------- The student should have a first degree (at least a 2i) or MSc in an appropriate technological or engineering subject. Good general mathematical and computing skills are more important than the specific subject of the degree. It would be an advantage to have previous knowledge of AI methods such as those mentioned above. Knowledge of control systems and databases would also be useful, and the candidate needs to possess the appropriate inter-personal skills to work in an operational engineering environment. Timescale --------- The project will commence in September 1996 and will last for 3 years. ------------------------------------------------------------------------- ___________________________________________ | Colin Reeves | | School of Mathematical and Information | | Sciences | | Coventry University | | Priory St | | Coventry CV1 5FB | | tel :+44 (0)1203 838979 | | fax :+44 (0)1203 838585 | | email: CRReeves at coventry.ac.uk | |___________________________________________| From moeller at informatik.uni-bonn.de Tue Jul 9 04:06:27 1996 From: moeller at informatik.uni-bonn.de (Knut Moeller) Date: Tue, 9 Jul 1996 10:06:27 +0200 (MET DST) Subject: HeKoNN96-CfP Message-ID: <199607090806.KAA09957@macke.informatik.uni-bonn.de> This announcement was sent to various lists. Sorry if you received multiple copies. ----------------------------------------------------------------- WWW: http://set.gmd.de/AS/fg1.1.2/hekonn ----------------------------------------------------------------- CALL FOR PARTICIPATION ================================================================= = = = H e K o N N 9 6 = = = Autumn School in C o n n e c t i o n i s m and N e u r a l N e t w o r k s October 2-6, 1996 Muenster, Germany Conference Language: German ---------------------------------------------------------------- A comprehensive description of the Autumn School together with abstracts of the courses can be found at the following address: WWW: http://set.gmd.de/AS/fg1.1.2/hekonn = = = O V E R V I E W = = = Artificial neural networks (ANN's) have been discussed in many diverse areas, ranging from models of cortical learning to the control of industrial processes. The goal of the Autumn School in Connectionism and Neural Networks is to give a comprehensive introduction to connectionism and artificial neural networks (ANN's) and to provide an overview of the current state of the art. Courses will be offered in five thematic tracks. (The conference language is German.) The FOUNDATION track will introduce basic concepts (A. Zell, Univ. Stuttgart) and theoretical issues. Hardware aspects (U. Rueckert, Univ. Paderborn), Lifelong Learning (G. Paass, GMD St. Augustin), algorithmic complexity of learning procedures (M. Schmitt, TU Graz) and convergence properties of ANN's (K. Hornik, TU Vienna) are presented in further lectures. This year, a special track is devoted to BRAIN RESEARCH. Courses are offered on the simulation of biological neurons (R. Rojas, Univ. Halle), theoretical neurobiology (H. Gluender, LMU Munich), learning and memory (A. Bibbig, Univ. Ulm) and dynamical aspects of cortical information processing (H. Dinse, Univ. Bochum).
The track on SYMBOLIC CONNECTIONISM and COGNITIVE MODELLING consists of courses on procedures for extracting rules from ANN's (J. Diederich, QUT Brisbane), representation and cognitive models (G. Peschl, Univ. Vienna), autonomous agents and ANN's (R. Pfeiffer, ETH Zuerich), and hybrid systems (A. Ultsch, Univ. Marburg). APPLICATIONS of ANN's are covered by courses on image processing (H. Bischof, TU Vienna), evolution strategies and ANN's (J. Born, FU Berlin), ANN's and fuzzy logic (R. Kruse, Univ. Braunschweig), and on medical applications (T. Waschulzik, Univ. Bremen). In addition, there will be courses on PROGRAMMING and SIMULATORS. Participants will have the opportunity to work with the SNNS simulator (G. Mamier, A. Zell, Univ. Stuttgart) and the VieNet2/ECANSE simulation tool (G. Linhart, Univ. Vienna). DEADLINE for applications is August 1, 1996. For application or enquiries please contact: knepper at informatik.uni-bonn.de From terry at salk.edu Tue Jul 9 12:59:52 1996 From: terry at salk.edu (Terry Sejnowski) Date: Tue, 9 Jul 1996 09:59:52 -0700 (PDT) Subject: NEURAL COMPUTATION 8:5 Message-ID: <199607091659.JAA23546@helmholtz.salk.edu> Neural Computation - Contents Volume 8, Number 5 - July 1, 1996 Article Biologically Plausible Error-Driven Learning Using Local Activation Differences: The Generalized Recirculation Algorithm Randall C. O'Reilly Letters Effects Of Nonlinear Synapses On The Performance Of Multilayer Neural Networks G. Dundar, F-C. Hsu and K. Rose Modeling Slowly Bursting Neuron Via Calcium Store And Voltage-Independent Calcium Current Teresa Ree Chay Type 1 Membranes, Phase Resetting Curves, And Synchrony Bard Ermentrout On Neurodynamics With Limiter Function And Linsker's Developmental Model Jianfeng Feng, Hong Pan and Vwani P. Roychowdhury Effect Of Binocular Cortical Misalignment On Ocular Dominance And Orientation Selectivity Harel Shouval, Nathan Intrator, C. Charles Law, and Leon N Cooper A Novel Optimizing Network Architecture With Applications Anand Rangarajan, Steven Gold and Eric Mjolsness Gradient Projection Network: Analog Solver For Linearly Constrained Nonlinear Programming Kiichi Urahama Online Steepest Descent Yields Weights With Non-Normal Limiting Distribution Sayandev Mukherjee and Terrence L. Fine A Numerical Study On Learning Curves In Stochastic Multilayer Feedforward Networks K.-R. Muller, M. Finke, N. Murata, K. Schulten and S. Amari Rate Of Convergence In Density Estimation Using Neural Networks Dharmendra S. Modha and Elias Masry Modeling Conditional Probability Distributions For Periodic Variables Christopher M. Bishop and Ian T. Nabney ----- ABSTRACTS - http://www-mitpress.mit.edu/jrnls-catalog/neural.html SUBSCRIPTIONS - 1996 - VOLUME 8 - 8 ISSUES ______ $50 Student and Retired ______ $78 Individual ______ $220 Institution Add $28 for postage and handling outside USA (+7% GST for Canada). (Back issues from Volumes 1-7 are regularly available for $28 each to institutions and $14 each for individuals. Add $5 for postage per issue outside USA (+7% GST for Canada).) mitpress-orders at mit.edu MIT Press Journals, 55 Hayward Street, Cambridge, MA 02142.
Tel: (617) 253-2889 FAX: (617) 258-6779 ----- From torkkk at base.sps.mot.com Tue Jul 9 13:12:27 1996 From: torkkk at base.sps.mot.com (Kari Torkkola) Date: Tue, 09 Jul 1996 10:12:27 -0700 Subject: papers on blind source separation available in the neuroprose archive Message-ID: <31E292FB.5347@base.sps.mot.com> The following two related papers on blind source separation are available on-line in the neuroprose archive. Papers extend the information maximization approach of Bell and Sejnowski towards the separation of more realistic mixtures of signals. FTP-host: archive.cis.ohio-state.edu FTP-file: pub/neuroprose/torkkola.icassp96.ps.Z FTP-file: pub/neuroprose/torkkola.nnsp96.ps.Z ------------------------------------------------------------------ BLIND SEPARATION OF DELAYED SOURCES BASED ON INFORMATION MAXIMIZATION 4 pages, 341K compressed, 920K uncompressed @InProceedings{Torkkola96icassp, author = "Kari Torkkola", title = "Blind separation of delayed sources based on information maximization", booktitle = "Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing", address = "Atlanta, GA", month = "May 7-10", year = "1996", pages = "3510-3513", url = "ftp://archive.cis.ohio-state.edu/pub/neuroprose/torkkola.icassp96.ps.Z", } Abstract Recently, Bell and Sejnowski have presented an approach to blind source separation based on the information maximization principle. We extend this approach into more general cases where the sources may have been delayed with respect to each other. We present a network architecture capable of coping with such sources, and we derive the adaptation equations for the delays and the weights in the network by maximizing the information transferred through the network. ------------------------------------------------------------------ BLIND SEPARATION OF CONVOLVED SOURCES BASED ON INFORMATION MAXIMIZATION 10 pages, 147K compressed, 340K uncompressed @InProceedings{ Torkkola96nnsp, author = "Kari Torkkola", title = "Blind separation of convolved sources based on information maximization", booktitle = "Neural Networks for Signal Processing VI (Proceedings of the 1996 IEEE Workshop)", address = "Kyoto, Japan", month = "September 4-6", year = "1996", note = "(in press)", url = "ftp://archive.cis.ohio-state.edu/pub/neuroprose/torkkola.nnsp96.ps.Z", } Abstract Blind separation of independent sources from their convolutive mixtures is a problem in many real world multi-sensor applications. In this paper we present a solution to this problem based on the information maximization principle, which was recently proposed by Bell and Sejnowski for the case of blind separation of instantaneous mixtures. We present a feedback network architecture capable of coping with convolutive mixtures, and we derive the adaptation equations for the adaptive filters in the network by maximizing the information transferred through the network. Examples using speech signals are presented to illustrate the algorithm. ------------------------------------------------------------------ Kari Torkkola Motorola Phoenix Corporate Research phone: +1-602-4134129 Mail Drop EL 508 fax: +1-602-4137281 2100 E. Elliot Rd email: torkkk at base.sps.mot.com Tempe, AZ 85284 A540AA at email.mot.com From mackay at mrao.cam.ac.uk Fri Jul 12 15:44:00 1996 From: mackay at mrao.cam.ac.uk (David J.C. 
MacKay) Date: Fri, 12 Jul 96 15:44 BST Subject: postdoctoral vacancy in Cambridge Message-ID: An 18 month postdoc in neural networks is available in the Department of Physics, Cambridge, Great Britain. Start date --- 1 Nov 96. Title: Neural Network Process Modelling for the Microstructural management of forged components Please see this URL for further information: http://wol.ra.phy.cam.ac.uk/is/sitsvac.html Please feel free to pass on this announcement to interested colleagues. Many thanks, David MacKay =============================== new phone number at work: 339852 ========== David J.C. MacKay email: mackay at mrao.cam.ac.uk www: http://wol.ra.phy.cam.ac.uk/mackay/ Cavendish Laboratory, tel: (01223) 339852 fax: 354599 home: 276411 Madingley Road, international code: +44 1223 Cambridge CB3 0HE. U.K. home: 19 Thornton Road, Girton, Cambridge CB3 0NP From murdock at ukraine.corp.mot.com Fri Jul 12 18:04:01 1996 From: murdock at ukraine.corp.mot.com (Mike Murdock) Date: Fri, 12 Jul 1996 17:04:01 -0500 Subject: internship in NN speech recognition at Motorola Message-ID: <199607122204.RAA06125@palau.mot.com> Motorola's Chicago Corporate Research Laboratories is currently seeking a motivated individual to fill an internship position in the Speech Recognition Group in Schaumburg, Illinois. The internship will last at least three months. The Speech Recognition Group has developed innovative neural network and signal processing technology for continuous, small vocabulary robust recognition. The successful candidate will work on a component of an HMM/neural network hybrid speech recognizer. The duties of the position include applied research, software development, and conducting experiments with speech data sets. A high level of motivation is the standard for all members of the team. The individual should possess a BS or MS degree in EE, CS or a related discipline. Strong programming skills in C are required. Knowledge of neural networks, decision trees, and statistical techniques is highly desirable. Please send resume and cover letter to be considered for this position to Motorola Inc., Corporate Staffing Department, Attn: Hybrid-5375, 1303 E. Algonquin Rd., Schaumburg, IL 60196. Fax: 847-576-4959. Motorola is an equal opportunity/affirmative action employer. We welcome and encourage diversity in our workforce. From haussler at cse.ucsc.edu Mon Jul 15 22:19:38 1996 From: haussler at cse.ucsc.edu (David Haussler) Date: Mon, 15 Jul 1996 19:19:38 -0700 Subject: Postdoctoral Positions in Biosequence Analysis at UCSC Message-ID: <199607160219.TAA28950@arapaho.cse.ucsc.edu> Postdoctoral and Graduate Researcher Positions in Biosequence Analysis at UCSC We are looking for people with backgrounds in statistics and computer science (e.g. machine learning, hidden Markov models, neural networks, speech recognition) to design and implement algorithms to find new genes in human DNA sequences, and to predict the structure and function of newly discovered protein sequences. Working knowledge of molecular biology is desired but not required; we can teach you what you need to know to get started and then you can learn the rest while you are here. An introductory overview of our research and selection of recent papers can be found at http://www.cse.ucsc.edu/research/compbio. The URL http://science-mag.aaas.org/science/scripts/display/short/272/5269/1730.html contains a recent article in Science about how hot this field is. Our biosequence analysis group currently consists of 3 faculty: D. 
Haussler (Computer Science), R. Hughey and K. Karplus (Computer Engineering), as well as 8 graduate students, 6 undergraduate students, and a postgraduate researcher. We have active collaborations with biologists at UCSC, as well as with groups at Lawrence Berkeley Labs Genome Center, the Sanger Centre, and the European Molecular Biology Laboratory. We have one postdoc position to start Jan. 1, 1997, and have the opportunity during this month (before Aug. 1, 1996) to apply for another postdoc position that will also start Jan. 1. In addition, we are trying to build our graduate program in this area, so we encourage post-baccalaureate students to apply to begin graduate study at UCSC in this area in Fall 1997. (In exceptional cases we will consider graduate applications for Jan. 1 or March 31, 1997 as well.) We support qualified graduate students with research assistantships. If you are interested in a postdoc position, send your vita and letters of reference asap to Lisa Pascal (lisa at cse.ucsc.edu, hardcopy: Lisa Pascal, Computer Science, University of California, Santa Cruz CA 95064). Applications and other information about our graduate program can be obtained on the web by starting at http://www.cse.ucsc.edu or by calling the graduate division at 408 459-2301 (email: gradadm at cats.ucsc.edu). For additional information about graduate study in computer science and computer engineering contact Carol Mullane (mullane at cse.ucsc.edu; 408 459-2576). From ling at cs.hku.hk Thu Jul 18 02:07:45 1996 From: ling at cs.hku.hk (Charles X. Ling) Date: Thu, 18 Jul 96 02:07:45 HKT Subject: Computational Cognitive Modeling: Source of the Power Message-ID: <9607171807.AA09004@stamina.cs.hku.hk> AAAI-96 Workshop Computational Cognitive Modeling Source of the Power One-day Workshop. August 5, 1996 (During UAI, KDD, AAAI, and IAAI. Portland, Oregon) Visit http://www.cs.hku.hk/~ling for updated info Program Committee: Charles Ling (co-chair), University of Hong Kong, ling at cs.hku.hk Ron Sun (co-chair), University of Alabama, rsun at cs.ua.edu Pat Langley, Stanford University Mike Pazzani, UC Irvine Tom Shultz, McGill University Paul Thagard, Univ. of Waterloo Kurt VanLehn, Univ. of Pittsburgh Invited speakers: Gary Cottrell, Jeff Elman, Denis Mareschal, Tom Shultz, Aaron Sloman, and Paul Thagard. Note: To attend the Workshop, you MUST register. To register, send e-mail to Charles Ling or Ron Sun. If you register for the AAAI main conference, Workshop registration is free (but you still need to register); otherwise, there is a fee of $150 per Workshop. The Workshop Program and Schedule 9:00 am: Welcome and Introduction. Charles Ling and Ron Sun (Co-Chairs) 9:10 am: Aaron Sloman, The University of Birmingham, UK What sort of architecture is required for a human-like agent? (invited talk) 9:40 am: Susan L. Epstein and Jack Gelfand, City University of New York, USA The creation of new problem solving agents from experience with visual features 10:00 am: Denis Mareschal, Exeter University, UK Models of Object Permanence: How and Why they Work (invited talk) 10:30 am: coffee break 11:00 am: Pat Langley, Stanford University, USA An abstract computational model of learning selective sensing skills 11:20 am: Craig S. Miller, Dickinson College, USA The source of graded performance in a symbolic rule-based model 11:40 am: Christian D. Schunn and Lynne M.
Reder, Carnegie Mellon University Modeling changes in strategy selections over time 12:00 pm: lunch break 1:30 pm: poster session 2:30 pm: Tom Shultz, McGill University, Montreal, Canada Generative Connectionist Models of Cognitive Development: Why They Work (invited talk) 3:00 pm: Garrison W. Cottrell, University of California, San Diego, USA Selective attention in the acquisition of the past tense (invited talk) 3:30 pm coffee break 4:00 pm: Jeff Elman, University of California, San Diego, USA States and stacks: Doing computation with a recurrent neural network (invited talk) 4:30 pm: Paul Thagard, Univ. of Waterloo, Canada Evaluating Computational Models of Cognition: Notes from the Analogy Wars (invited talk) 5:00 pm: Tony Veale, Barry Smyth, Diarmuid O'Donoghue, Mark Keane Representational myopia in cognitive mapping 5:20 pm: Panel and discussions Panelists: Charles Ling, Ron Sun, Pat Langley, Mike Pazzani. 6:30 pm: end From liaw at bmsr14.usc.edu Thu Jul 18 17:08:14 1996 From: liaw at bmsr14.usc.edu (Jim-Shih Liaw) Date: Thu, 18 Jul 1996 14:08:14 -0700 (PDT) Subject: Workshop on SENSORIMOTOR COORDINATION Message-ID: <199607182108.OAA07319@bmsr14.usc.edu> REGISTRATION INFORMATION Workshop on SENSORIMOTOR COORDINATION: AMPHIBIANS, MODELS, AND COMPARATIVE STUDIES Poco Diablo Resort, Sedona, Arizona, November 22-24, 1996 Co-Directors: Kiisa Nishikawa (Northern Arizona University, Flagstaff) and Michael Arbib (University of Southern California, Los Angeles). Local Arrangements Chair: Kiisa Nishikawa. E-mail enquiries may be addressed to Kiisa.Nishikawa at nau.edu or arbib at pollux.usc.edu. Further information may be found on our home page at http://www.nau.edu:80/~biology/vismot.html. Program Committee: Kiisa Nishikawa (Chair), Michael Arbib, Emilio Bizzi, Chris Comer, Peter Ewert, Simon Giszter, Mel Goodale, Ananda Weerasuriya, Walt Wilczynski, and Phil Zeigler. SCIENTIFIC PROGRAM The aim of this workshop is to study the neural mechanisms of sensorimotor coordination in amphibians and other model systems for their intrinsic interest, as a target for developments in computational neuroscience, and also as a basis for comparative and evolutionary studies. The list of subsidiary themes given below is meant to be representative of this comparative dimension, but is not intended to be exhaustive. The emphasis (but not the exclusive emphasis) will be on papers that encourage the dialog between modeling and experimentation. A decision as to whether or not to publish a proceedings is still pending. Central Theme: Sensorimotor Coordination in Amphibians and Other Model Systems Subsidiary Themes: Visuomotor Coordination: Comparative and Evolutionary Perspectives Reaching and Grasping in Frog, Pigeon, and Primate Cognitive Maps Motor Pattern Generators This workshop is the sequel to four earlier workshops on the general theme of "Visuomotor Coordination in Frog and Toad: Models and Experiments". The first two were organized by Rolando Lara and Michael Arbib at the University of Massachusetts, Amherst (1981) and Mexico City (1982). The next two were organized by Peter Ewert and Arbib in Kassel and Los Angeles, respectively, with the Proceedings published as follows: Ewert, J.-P. and M. A. Arbib (Eds.) 1989. Visuomotor Coordination: Amphibians, Comparisons, Models and Robots. New York: Plenum Press. Arbib, M.A. and J.-P. Ewert (Eds.) 1991. Visual Structures and Integrated Functions, Research Notes in Neural Computing 3. Heidelberg, New York: Springer Verlag. 
REGISTRATION INFORMATION Meeting Location and General Information The Workshop will be held at the Poco Diablo Resort in Sedona, Arizona (a beautiful small town set in dramatic red hills) immediately following the Society for Neuroscience meeting in 1996. The 1996 Neuroscience meeting ends on Thursday, November 21, so workshop participants can fly from Washington, DC to Phoenix, AZ that evening, meet Friday, Saturday, and Sunday, with a Workshop Banquet on Sunday evening, and fly home on Monday, November 25th. Paper sessions will be held all day on Friday, on Saturday afternoon, and all day on Sunday. Poster sessions will be held on Saturday afternoon and evening. A group field trip is planned for Saturday morning. Graduate Student and Postdoctoral Participation In order to encourage the participation of graduate students and postdoctorals, we have arranged for affordable housing, and in addition we are able to offer a reduced registration fee (see below) thanks to the generous contribution of the Office of the Associate Provost for Research and Graduate Studies at Northern Arizona University. Travel from Phoenix to Sedona Sedona, AZ is located approximately 100 miles north of Phoenix, where the nearest major airport (Sky Harbor) is located. Workshop attendees may wish to arrange their own transportation (e.g., car rental from Phoenix airport) from Phoenix to Sedona, or they may use the Workshop Shuttle (estimated round trip cost $20 US) to Sedona on 21 November, with a return to Phoenix on 25 November. If you plan to use the Workshop Shuttle, we will need to know your expected arrival time in Phoenix by 1 October 1996, to ensure that space is available for you at a convenient time. Lodging The following costs are for each night. Since many participants may want to extend their stay to further enjoy Arizona's scenic beauty, we have negotiated special rates for additional nights after the end of the workshop on November 24th. Attendees should make their own booking with the Poco Diablo Resort, by phone (800) 352-5710 or FAX (520) 282-9712. Thurs.-Fri. (and additional week nights before the workshop) per night: students $85 US + tax, faculty $105 + tax Sat.-Sun. (and additional week nights after the workshop) per night: students $69 + tax, faculty $89 + tax. The student room rates are for double occupancy. Thus, students willing to share a room may stay for half the stated rate. When you make your room reservations with the Poco Diablo Resort, please be sure to indicate the number of guests in your party. Graduate students and postdocs should be sure to indicate whether they want single or double occupancy. Registration Fees Students and postdoctorals $100; faculty, guests and others $200. The registration fee includes lunch Fri. - Sun., wine and cheese reception during the Saturday evening poster session, and a Farewell Dinner on Sunday evening. Registration fees should be paid by check in US funds, made payable to "Sensorimotor Coordination Workshop", and should be sent to Kiisa Nishikawa at the address listed below, together with the completed registration form that follows at the end of this announcement. Completed registration forms and fees must be received by 1 August, 1996. Late registration fees will be $150 for students and postdoctorals and $250 for faculty. REGISTRATION FORM NAME: ADDRESS: PHONE: FAX: EMAIL: STATUS: [ ] Faculty ($200); [ ] Postdoctoral ($100); [ ] Student ($100); [ ] Other ($200). 
(Postdocs and students: Please attach certification of your status signed by your supervisor.) TYPE OF PRESENTATION (paper vs. poster) ABSTRACT SUBMITTED (yes/no) AREAS OF INTEREST RELEVANT TO WORKSHOP: WILL YOU REQUIRE ANY SPECIAL AUDIOVISUAL EQUIPMENT FOR YOUR PRESENTATION? HAVE YOU MADE A RESERVATION WITH THE HOTEL? EXPECTED TIME OF ARRIVAL IN PHOENIX (ON NOVEMBER 21): EXPECTED TIME OF DEPARTURE FROM PHOENIX (ON NOVEMBER 25): DO YOU WISH TO USE THE WORKSHOP SHUTTLE TO TRAVEL FROM PHOENIX TO SEDONA? (If so, please be sure that we know your expected arrival time by 1 October!) DO YOU WISH TO PARTICIPATE IN A GROUP HIKE IN THE SEDONA AREA ON SATURDAY MORNING? Please make sure that your check (in US funds and payable to the Sensorimotor Coordination Workshop) is included with this form. If you plan to bring a guest with you to the Workshop, please add their name(s) to this form and enclose their registration fee along with your own. Mail to: Kiisa Nishikawa, Department of Biological Sciences, Northern Arizona University, Flagstaff, AZ 86011-5640. E-mail: Kiisa.Nishikawa at nau.edu. FAX: (520)523-7500. Phone: (520)523-9497. From philh at cogs.susx.ac.uk Fri Jul 19 10:43:43 1996 From: philh at cogs.susx.ac.uk (Phil Husbands) Date: Fri, 19 Jul 1996 15:43:43 +0100 (BST) Subject: 2 postdocs, comp. neuroscience and robotics Message-ID: Centre for Computational Neuroscience and Robotics University of Sussex TWO RESEARCH FELLOWSHIPS Neuroscience and AI are areas in which the University of Sussex is exceptionally strong. A new interdisciplinary Centre has recently been established to promote synergy between studies of artificial and biological nervous systems. Applications are invited for two post-doctoral fellowships to work in an exciting, interactive and unusual environment in this interdisciplinary Centre. POST 1 Computational Modelling of Biological Sensory Processing The aim is to understand the neural mechanisms underlying motion detection. The project will involve tight coupling of computer modelling and electrophysiological data. It will feed directly into our work on robot nervous systems. POST 2 Artificial Evolution of Nervous Systems for Robots The aim is to expand the evolutionary robotics field developed at Sussex by evolving robots with more complex behavioural capabilities, in particular to evolve minimally cognitive robots. Salary will be on the Research and Analogous Faculty Grade 1A scale 14,317 to 21,519 per annum. The posts are available immediately and will be for two years in the first instance with the possibility of renewal. Informal enquiries are encouraged. Enquiries about Post 1 should be made to Professor Michael O'Shea (Tel +44 (0)1273 678055; Fax +44 (0)1273 678535; email M.O-Shea at sussex.ac.uk) and for Post 2 to Dr Phil Husbands (Tel +44 (0)1273 678556; Fax +44 (0)1273 671320; email philh at cogs.susx.ac.uk). http://www.biols.sussex.ac.uk/Biols/IRC/ccnr.html Applications, (curriculum vitae, one or two sample publications, and the names and addresses of at least two referees) should be sent to the Administrator, Centre for Computational Neuroscience and Robotics, School of Biological Sciences, University of Sussex, Falmer, Brighton, BN1 9QG, UK. Closing date: 1st October 1996. 
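As general background to Post 1 above: correlation-type (Reichardt) detectors are a standard textbook starting point for modelling the neural mechanisms of motion detection. The Python sketch below is such a textbook toy with made-up parameters, not the Centre's model; its output merely changes sign with the direction of a drifting grating.

import numpy as np

# Toy Reichardt-style correlation detector: two sample points a fixed distance
# apart, each signal delayed and multiplied with the other; opponent
# subtraction yields a direction-selective output (illustrative only).

def moving_grating(n_steps, n_px, speed, wavelength=20.0):
    x = np.arange(n_px)
    return np.array([np.sin(2 * np.pi * (x - speed * t) / wavelength)
                     for t in range(n_steps)])            # shape: (time, space)

def reichardt(stim, sep=2, delay=3):
    left, right = stim[:, 10], stim[:, 10 + sep]          # two nearby inputs
    d_left = np.roll(left, delay); d_left[:delay] = 0.0   # delayed copies
    d_right = np.roll(right, delay); d_right[:delay] = 0.0
    return np.mean(d_left * right - d_right * left)       # opponent output

for v in (-2, -1, 1, 2):
    print(f"speed {v:+d}: detector output {reichardt(moving_grating(200, 64, v)):+.3f}")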
From jagota at ICSI.Berkeley.EDU Fri Jul 19 22:53:13 1996 From: jagota at ICSI.Berkeley.EDU (Arun Jagota) Date: Fri, 19 Jul 1996 19:53:13 -0700 Subject: Electronic Journal Announcement Message-ID: <199607200253.TAA08445@flapjack.ICSI.Berkeley.EDU> Dear Connectionists: This is a first announcement and CALL FOR SUBMISSIONS. Manuscripts are invited, beginning now. Arun Jagota ----------- ________________________ NEURAL COMPUTING SURVEYS ------------------------ A refereed electronic journal published on the World Wide Web http://www.icsi.berkeley.edu/~jagota/NCS MISSION One way to cope with the exponential increase in the number of articles published in recent years is to ignore most of them. A second, perhaps more satisfying, approach is to provide a forum that encourages the regular production -- and perusal -- of high-quality survey articles. This is especially useful in an inter-disciplinary, evolving field such as Neural Computing. This journal aims to bring the second approach to bear. It is intended to + encourage researchers to write good survey papers. + provide an efficiently searchable repository -- perhaps the first that comes to mind -- for researchers to look at in times of need. TOPICS COVERED All aspects of Neural Computing. See the web site for details. BOARD OF ADVISORS (to expand) Yaser Abu-Mostafa Caltech Michael Arbib USC Eric Baum NEC Research Institute Jeff Elman UCSD Scott Fahlman CMU Lee C. Giles NEC Research Institute Michael Jordan MIT Wolfgang Maass TU-Graz, Austria Eric Mjolsness UCSD Michael Mozer Colorado U. Eduardo Sontag Rutgers U. Lei Xu CUHK MANAGING EDITOR Arun Jagota UCSC & ICSI-Berkeley (vis) jagota at icsi.berkeley.edu EDITORIAL BOARD (to expand somewhat) Suzanna Becker McMaster U. Yoshua Bengio bengioy at iro.umontreal.ca Wray Buntine wray at buntine.brainstorm.net Joydeep Ghosh ghosh at ece.utexas.edu Zoubin Ghahramani zoubin at cs.toronto.edu Arun Jagota jagota at icsi.berkeley.edu Pascal Koiran koiran at lip.ens-lyon.fr Barak Pearlmutter barak.pearlmutter at alumni.cs.cmu.edu Michael Perrone mpp at watson.ibm.com Anand Rangarajan rangarajan-anand at cs.yale.edu Hava Siegelmann iehava at ie.technion.ac.il Yoram Singer singer at research.att.com Ron Sun rsun at cs.ua.edu John Shawe-Taylor john at dcs.rhbnc.ac.uk Sebastian Thrun thrun+ at cs.cmu.edu Xin Wang xwang at cs.ucla.edu Lei Xu Chinese U. of Hong Kong FEATURES AT A GLANCE Survey-only Postscript & Hypertext Attachments allowed Peer-reviewed No length restrictions No delays upon acceptance Free FORMS & ATTACHMENTS Survey papers may take the form of an ESSAY or a COMPENDIUM. ATTACHMENTS to an accepted paper -- optional for the authors -- allow use of the capabilities of the web to connect to several kinds of material closely related to the original paper. A clear separation is made between attachments (which are unrefereed and changeable) and the main paper (which is refereed and not changeable). See the web site for more details. HOW TO SUBMIT Electronic submissions (in postscript format) are strongly encouraged, to any one of the editors based on matching area (see web site for listing of areas of editors). A LaTeX style file will be available for the NCS format. See the web site for details. PRINT VERSION A print version, published by some well-known publisher as a "backup" to the electronic version, is being strongly considered.
FTP ACCESS Those without convenient access to the World Wide Web might use anonymous ftp as follows (see the README file there): ftp://ftp.icsi.berkeley.edu/pub/ai/jagota ______________________________________________________________________________ From hu at engr.wisc.edu Sun Jul 21 06:47:34 1996 From: hu at engr.wisc.edu (Hu, Yu Hen) Date: Sun, 21 Jul 96 10:47:34 0000 Subject: CFP: ISMIP'96 (New Submission Deadline) Message-ID: <199607211547.AA24785@eceserv0.ece.wisc.edu> ------------------------------------------------ CALL FOR PAPERS (Extension of Submission Deadline) ------------------------------------------------ 1996 International Symposium on Multi-Technology Information Processing A Joint Symposium of Artificial Neural Networks, Circuits and Systems, and Signal Processing December 16-18, 1996 Hsin-Chu, Taiwan, Republic of China ************************************************* NEW PAPER SUBMISSION DEADLINE: AUGUST 5, 1996 ************************************************* Call For Papers --------------- The International Symposium on Multi-Technology Information Processing (ISMIP'96), a joint symposium of artificial neural networks, circuits and systems, and signal processing, will be held at National Tsing Hua University, Hsin Chu, Taiwan, Republic of China. This conference is an expansion of the previous series of International Symposia on Artificial Neural Networks (ISANN). The main purpose of this conference is to offer a forum showcasing the latest advances in modern information processing technologies. It will include recent innovative research results on theories, algorithms, architectures, systems, and hardware implementations that lead to intelligent information processing. The technical program will feature opening keynote addresses, invited plenary talks, and technical presentations of refereed papers. The official language is English. Papers are solicited for, but not limited to, the following topics: 1. Associative Memory 2. Digital and Analog Neurocomputers 3. Fuzzy Neural Systems 4. Supervised/Un-supervised Learning 5. Robotics 6. Sensory/Motor Control 7. Image Processing 8. Pattern Recognition 9. Language/Speech Processing 10. Digital Signal Processing 11. VLSI Architectures 12. Non-linear Circuits 13. Multi-media Information Processing 14. Optimization 15. Mathematical Methods 16. Visual Signal Processing 17. Content-based Signal Processing 18. Applications Prospective authors are invited to submit 4 copies of extended summaries of no more than 4 pages. All manuscripts must be written in English, single-spaced, in a single column, on 8.5" by 11" white paper. The top of the first page of the paper should include a title, authors' names, affiliations, address, telephone/fax numbers, and email address if applicable. The indicated corresponding author will receive an acknowledgement of his/her submission. Camera-ready full papers of accepted manuscripts will be published in a hard-bound proceedings and distributed at the symposium. For more information, please consult the URL http://pierce.ee.washington.edu/~nnsp/ismip96.html Authors are invited to send submissions to one of the program co-chairs: ------------------- For submissions from USA and Europe Dr. C.-H. Lee Multimedia Communications Research Lab Bell Laboratories, Lucent Technologies 600 Mountain Ave. 2D-425 Murray Hill, NJ 07974-0636 USA Phone: 908-582-5226 fax: 908-582-7308 chl at research.bell-labs.com --------------------- For submissions from Asia and the rest of the world Prof. V. W. Soo Dept.
of Computer Science National Tsing Hua University Hsin Chu, Taiwan 30043, ROC Phone: 886-35-731068 FAX: 886-35-723694 soo at cs.nthu.edu.tw Schedule -------- Submission of full paper: August 5, 1996. Notification of acceptance: September 30, 1996. Submission of camera-ready paper: October 31, 1996. Advanced registration, before: November 15, 1996. Sponsored by National Tsing Hua University (NTHU), Ministry of Education, Taiwan R.O.C. National Science Council, Taiwan R.O.C. in Cooperation with IEEE Signal Processing Society, IEEE Circuits and Systems Society IEEE Neural Networks Council, IEEE Taiwan Section Taiwanese Association for Artificial Intelligence ORGANIZATION ------------ General Co-chairs: H. C. Wang, NTHU Y. H. Hu, U. of Wisconsin Advisory board Co-chairs: W. T. Chen, NTHU S. Y. Kung, Princeton U. Vice Co-chairs: H. C. Hu, NCTU J.N. Hwang, U. of Washington Program Co-chairs: V. W. Soo, NTHU Chin-Hui Lee AT&T From robert at physik.uni-wuerzburg.de Wed Jul 24 07:54:58 1996 From: robert at physik.uni-wuerzburg.de (Robert Urbanczik) Date: Wed, 24 Jul 1996 13:54:58 +0200 (MESZ) Subject: paper available: learning in a committee machine Message-ID: FTP-host: ftp.physik.uni-wuerzburg.de FTP-filename: /pub/preprint/WUE-ITP-96-013.ps.gz **DO NOT FORWARD TO OTHER GROUPS** The following paper (9 pages, to appear in Europhys.Letts.) is now available via anonymous ftp: (See below for the retrieval procedure) --------------------------------------------------------------------- Learning in a large committee machine: Worst case and average case by R. Urbanczik Abstract: Learning of realizable rules is studied for tree committee machines with continuous weights. No nontrivial upper bound exists for the generalization error of consistent students as the number of hidden units $K$ increases. However, numerical considerations show that consistent students with a value of the generalization error significantly higher than predicted by the average case analysis are extremely hard to find. An on-line learning algorithm is presented, for which the generalization error scales with the training set size as in the average case theory in the limit of large $K$. --------------------------------------------------------------------- Retrieval procedure: unix> ftp ftp.physik.uni-wuerzburg.de Name: anonymous Password: {your e-mail address} ftp> cd pub/preprint/1996 ftp> get WUE-ITP-96-013.ps.gz ftp> quit unix> gunzip WUE-ITP-96-013.ps.gz e.g. unix> lp WUE-ITP-96-013.ps _____________________________________________________________________ From gluck at pavlov.rutgers.edu Wed Jul 24 08:49:22 1996 From: gluck at pavlov.rutgers.edu (Mark Gluck) Date: Wed, 24 Jul 1996 08:49:22 -0400 Subject: Programmer/R.A. Position at Rutgers Univ, NJ, in Computational Neuro. Message-ID: <199607241249.IAA01635@james.rutgers.edu> SEEKING A PROGRAMMER/RESEARCH ASSISTANT TO WORK ON NEURAL-NETWORK BRAIN MODELS AT RUTGERS-NEWARK NEUROSCIENCE CENTER (GLUCK LAB). We are looking for a programmer/research assistant to work with us on testing computational models of cortico-hippocampal function in animal and human learning. The applicant must be able to work independently -- given a set of specifications, he/she should be able to optimize program performance to generate results, and also analyze system behavior. The ideal applicant would be someone recently out of college, who would like some research experience prior to future graduate work in psychology, neuroscience, cognitive science, or computer science. 
Required Skills: Strong C (or C++) programming Knowledge of Unix Commitment to at least 15 hours/week, for at least one year. Could also be a full time position. Preferred But Not Required Skills: Knowledge of Sun workstations Background in neural networks Background in premed, biology, or psychology. Salary: Commensurate with skill level and experience. For more information on our research, see our lab WWW page noted below. Contact Mark Gluck below with a cover letter and resume (preferably sent by email) to apply. _______________________________________________________________________________ Dr. Mark A. Gluck Center for Molecular & Behavioral Neuroscience Rutgers University 197 University Ave. Newark, New Jersey 07102 Phone: (201) 648-1080 (Ext. 3221) Fax: (201) 648-1272 Email: gluck at pavlov.rutgers.edu WWW Homepage: http://www.cmbn.rutgers.edu/cmbn/faculty/gluck.html _______________________________________________________________________________ From listerrj at helios.aston.ac.uk Wed Jul 24 09:23:07 1996 From: listerrj at helios.aston.ac.uk (Richard Lister) Date: Wed, 24 Jul 1996 14:23:07 +0100 Subject: Two Postdoctoral Research Fellowships Message-ID: <13421.199607241323@sun.aston.ac.uk> ---------------------------------------------------------------------- Neural Computing Research Group ------------------------------- Dept of Computer Science and Applied Mathematics Aston University, Birmingham, UK TWO POSTDOCTORAL RESEARCH FELLOWSHIPS ------------------------------------- *** Full details at http://www.ncrg.aston.ac.uk/ *** ---------------------------------------------------------------------- Analysis of On-Line Learning in Neural Networks ----------------------------------------------- The Neural Computing Research Group at Aston is looking for a highly motivated individual for a 2 year postdoctoral research position in the area of `Analysis of On-Line Learning in Neural Networks'. The emphasis of the research will be on applying a theoretically well-founded approach based on methods adopted from statistical mechanics to analyse learning in multilayer perceptrons in various learning scenarios. Potential candidates should have strong mathematical and computational skills, with a background in statistical mechanics and neural networks. Conditions of Service --------------------- Salaries will be up to point 6 on the RA 1A scale, currently 15,986 UK pounds. The salary scale is subject to annual increments. How to Apply ------------ If you wish to be considered for this Fellowship, please send a full CV and publications list, including full details and grades of academic qualifications, together with the names of 3 referees, to: Dr. David Saad Neural Computing Research Group Dept. of Computer Science and Applied Mathematics Aston University Birmingham B4 7ET, U.K. Tel: 0121 333 4631 Fax: 0121 333 6215 e-mail: D.Saad at aston.ac.uk e-mail submission of postscript files is welcome. Candidates who applied for the position `On-line Learning in Radial Basis Function Networks' will be automatically considered for this position as well. Closing date: 12 August, 1996.
---------------------------------------------------------------------- NEUROSAT: Processing of environment observing satellite data ------------------------------------------------------------ with Neural Networks -------------------- The Neural Computing Research Group at Aston is looking for a highly motivated individual for a 3 year postdoctoral research position in the area of processing environmental data from satellites with neural networks, working with Dr. Ian Nabney. The post is funded by a grant from the European Commission Directorate General XII in Environment and Climate. Candidates should have strong mathematical and computational skills, with a background in one or more of neural networks, Bayesian inference, or satellite data analysis. NEUROSAT is a three collaborative project whose objective is to contribute to an enhanced analysis of the real Earth climate driving forces. The consortium is led by Michel Crepon at the Institut Pierre-Simon-Laplace (IPSL) in Paris, and includes partners from the UK, France, Germany and Italy. Aston will be involved in two work packages: Assessment of Generic Techniques and Inferring Sea Surface Wind from Scatterometric Measurements. In the first of these, we are responsible for contributions to survey papers, for technology transfer to other partners, and for some feasibility studies in the use of neural networks in climatological problems. The second work package, which will be the main activity, involves developing some existing research on scatterometric data analysis to a state where it can be compared with the current operational system (AEOLUS) used at the Meteorological Office. The data that will be used comes from the ERS1 satellite. A two stage approach will be applied. The first stage is to improve the local modelling (i.e. the wind vector in a single cell), and the second stage is to improve the global modelling (i.e. the overall wind field). To improve the local modelling, the influence of sea state on the scatterometer signals will be studied. This will be done by using the ERS1 signal collocated with wind vector and sea state obtained from analysed fields of meteorological models and fused with in situ buoy observations. The purpose of this work is to understand the factors involved in the GMF so as to improve the inverse function modelling. Earlier work at Aston has used mixture density networks to model the conditional density of the inverse function (since this is typically multi-valued for wind direction), and this will make it easier to incorporate probabilistic information into the global model. Such information includes priors on model parameters, priors on data coming from weather stations and climatological information (e.g. long term weather trends). This will also allow the wind-field to be `seeded' with known values at specific locations. Techniques from optical flow (for example, div-curl splines) and Bayesian models will be investigated for their application to modelling the global wind field. This approach should lead to a self consistent, accurate and fast neural network procedure to retrieve the entire wind field. This information will be useful to meteorological centres to be assimilated into their prediction models. Dr. David Offiler (of the UK Meteorological Office) will act as a consultant to the project, assisting in the development of prior models and the assessment of the prediction methods. If we can improve on existing techniques, then there is every prospect of replacing them in operational use. 
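As background on the mixture density network approach mentioned above (this is only a sketch of the standard formulation of the general technique, with notation chosen here for illustration rather than a description of the project's actual model): the conditional density of a target vector $t$ (for example, the wind vector in a cell) given an input $x$ (for example, the scatterometer signal) is written as a mixture whose parameters are all outputs of a network,

$$ p(t \mid x) = \sum_{j=1}^{M} \alpha_j(x)\, \phi_j(t \mid x), \qquad \sum_{j=1}^{M} \alpha_j(x) = 1, \quad \alpha_j(x) \ge 0, $$

where the kernels $\phi_j$ are typically spherical Gaussians whose centres $\mu_j(x)$ and widths $\sigma_j(x)$ are also produced by the network. Because several mixing coefficients $\alpha_j(x)$ can be large at once, the model can assign probability mass to several candidate wind directions simultaneously, which is what makes this representation suited to a multi-valued inverse problem of this kind.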
Informal enquiries can be made to Ian Nabney (I.T.Nabney at aston.ac.uk). The target start date is September 1996, although this may be somewhat flexible. Conditions of Service --------------------- Salaries will be up to point 6 on the RA 1A scale, currently 15,986 UK pounds. These salary scales are subject to annual increments. How to Apply ------------ If you wish to be considered for this position, please send a full CV and publications list, together with the names of 3 referees, to: Dr. Ian Nabney Neural Computing Research Group Department of Computer Science and Applied Mathematics Aston University Birmingham B4 7ET, U.K. Tel: +44 121 333 4631 Fax: +44 121 333 4586 e-mail: I.T.Nabney at aston.ac.uk (email submission of postscript files is welcome) Closing date: 12 August, 1996. ---------------------------------------------------------------------- From mel at quake.usc.edu Wed Jul 24 03:15:23 1996 From: mel at quake.usc.edu (Bartlett Mel) Date: Wed, 24 Jul 1996 15:15:23 +0800 Subject: Workshop Announcement Message-ID: <9607242215.AA21433@quake.usc.edu> ************* CALL FOR WORKSHOP PARTICIPATION ************* Advanced Workshop on BIOLOGICAL AND ARTIFICIAL NEURAL NETWORKS: A SEARCH FOR SYNERGY sponsored by The USC Biomedical Simulations Resource and The National Center for Research Resources of NIH September 20-21, 1996 Summer House Inn, La Jolla, California This workshop will bring together investigators who share an interest in methodological issues relating to the combined study of biological and artificial neural networks, including innovative uses for artificial neural network techniques in the study of biological neural systems, and novel applications of neurobiologically inspired artificial neural network architectures. Workshop topics will emphasize: * methods for quantitative study of nonstationarities in neural systems * methods for analyzing high-dimensional spatio-temporal neural dynamics * methods for study of dynamic nonlinearities in neural systems Presentations will emphasize practical methodologies. The format will consists of brief invited and contributed presentations (15-20 minutes), followed by extended question/discussion periods emphasizing audience participation. The Workshop has been scheduled in space-time proximity to the World Congress on Neural Networks (San Diego, Sept. 15-19). Registration is free, but advance registration is required as space is limited. Limited funds are available for speakers upon request. To suggest yourself as a contributor, or for additional information, please contact Ms. Stephanie Braun at braun at bmsrs.usc.edu or (213)740-0342. ORGANIZERS: V.Z. Marmarelis, T.W. Berger From marney at ai.mit.edu Wed Jul 24 19:41:44 1996 From: marney at ai.mit.edu (Marney Smyth) Date: Wed, 24 Jul 1996 19:41:44 -0400 (EDT) Subject: Intensive Tutorial: Learning Methods for Prediction, Classification Message-ID: <9607242341.AA00934@motor-cortex.ai.mit.edu> ************************************************************** *** *** *** Learning Methods for Prediction, Classification, *** *** Novelty Detection and Time Series Analysis *** *** *** *** Cambridge, MA, September 20-21, 1996 *** *** Los Angeles, CA, December 14-15, 1996 *** *** *** *** Geoffrey Hinton, University of Toronto *** *** Michael Jordan, Massachusetts Inst. of Tech. 
*** *** *** ************************************************************** A two-day intensive Tutorial on Advanced Learning Methods will be held on September 20 and 21, 1996, at the Royal Sonesta Hotel, Cambridge, MA, and on December 14 and 15, 1996, at Lowe's Hotel, Santa Monica, CA. Space is available for up to 50 participants for each course. The course will provide an in-depth discussion of the large collection of new tools that have become available in recent years for developing autonomous learning systems and for aiding in the analysis of complex multivariate data. These tools include neural networks, hidden Markov models, belief networks, decision trees, memory-based methods, as well as increasingly sophisticated combinations of these architectures. Applications include prediction, classification, fault detection, time series analysis, diagnosis, optimization, system identification and control, exploratory data analysis and many other problems in statistics, machine learning and data mining. The course will be devoted equally to the conceptual foundations of recent developments in machine learning and to the deployment of these tools in applied settings. Case studies will be described to show how learning systems can be developed in real-world settings. Architectures and algorithms will be presented in some detail, but with a minimum of mathematical formalism and with a focus on intuitive understanding. Emphasis will be placed on using machine methods as tools that can be combined to solve the problem at hand. WHO SHOULD ATTEND THIS COURSE? The course is intended for engineers, data analysts, scientists, managers and others who would like to understand the basic principles underlying learning systems. The focus will be on neural network models and related graphical models such as mixture models, hidden Markov models, Kalman filters and belief networks. No previous exposure to machine learning algorithms is necessary although a degree in engineering or science (or equivalent experience) is desirable. Those attending can expect to gain an understanding of the current state-of-the-art in machine learning and be in a position to make informed decisions about whether this technology is relevant to specific problems in their area of interest. 
COURSE OUTLINE Overview of learning systems; LMS, perceptrons and support vectors; generalized linear models; multilayer networks; recurrent networks; weight decay, regularization and committees; optimization methods; active learning; applications to prediction, classification and control Graphical models: Markov random fields and Bayesian belief networks; junction trees and probabilistic message passing; calculating most probable configurations; Boltzmann machines; influence diagrams; structure learning algorithms; applications to diagnosis, density estimation, novelty detection and sensitivity analysis Clustering; mixture models; mixtures of experts models; the EM algorithm; decision trees; hidden Markov models; variations on hidden Markov models; applications to prediction, classification and time series modeling Subspace methods; mixtures of principal component modules; factor analysis and its relation to PCA; Kalman filtering; switching mixtures of Kalman filters; tree-structured Kalman filters; applications to novelty detection and system identification Approximate methods: sampling methods, variational methods; graphical models with sigmoid units and noisy-OR units; factorial HMMs; the Helmholtz machine; computationally efficient upper and lower bounds for graphical models REGISTRATION Standard Registration: $700 Student Registration: $400 Registration fee includes course materials, breakfast, coffee breaks, and lunch on Saturday. Those interested in participating should return the completed Registration Form and Fee as soon as possible, as the total number of places is limited by the size of the venue. ADDITIONAL INFORMATION A registration form is available from the course's WWW page at http://www.ai.mit.edu/projects/cbcl/web-pis/jordan/course/index.html Marney Smyth CBCL at MIT E25-201 45 Carleton Street Cambridge, MA 02142 USA Phone: 617 253-0547 Fax: 617 253-2964 E-mail: marney at ai.mit.edu From tvogl at wo.erim.org Fri Jul 12 17:03:07 1996 From: tvogl at wo.erim.org (Thomas P. Vogl) Date: Thu, 25 Jul 1996 09:03:07 +30000 Subject: Postdoc. Fellowship Available Message-ID: Postdoctoral Research Fellowship at George Mason University (GMU), Molecular Biosciences and Technology Institute (MBTI). Applications are invited for a postdoctoral Fellowship in the area of development of self-organizing pattern recognition algorithms based on biological information processing in visual and IT cortex. The aims of the project are (1) to develop neurobiologically plausible algorithms of visual pattern recognition which are computationally efficient and robust, and (2) to compare the performance of the resulting algorithms with human performance in order to develop hypotheses about information processing in the brain. Evaluation of algorithms is performed using real world problems (e.g. face recognition and optical character recognition), and by comparison to human observer pattern recognition performance. We are seeking an individual with a background in both neurobiology and computer science (good UNIX and C or C++ skills). Working knowledge of information theory or mathematical statistics is highly desirable but not required. The position is for one year, beginning October 1, 1996, with possible renewal for an additional three years. The initial stipend is $30,000/year plus fringe benefits. Our decade-old group currently consists of Drs. T.P. Vogl and K.T.
Blackwell, and two graduate students, all of whom are actively involved in ongoing collaboration among neuroscientists (electrophysiologists) at NINDS/NIH; members of the GMU faculty in MBTI, several Departments, and the Krasnow Institute at GMU; and the professional staff at the Environmental Research Institute of Michigan (ERIM), a not-for-profit R&D company associated with the University of Michigan. Research activities encompass computer modeling, particularly computational neurobiology at levels ranging from channel level modeling of learning in single neurons to network level models, and visual psychophysics. To apply for this position, send your curriculum vitae and at least two letters of reference (in ASCII or MIME attached PostScript formats only) before September 1, 1996, to Prof. Thomas P. Vogl email: tvogl at gmu.edu. snail-mail to: ERIM 1101 Wilson Blvd. Ste 1100 Arlington, VA 22209 From john at dcs.rhbnc.ac.uk Thu Jul 25 11:17:05 1996 From: john at dcs.rhbnc.ac.uk (John Shawe-Taylor) Date: Thu, 25 Jul 96 16:17:05 +0100 Subject: Technical Report Series in Neural and Computational Learning Message-ID: <199607251517.QAA12702@platon.cs.rhbnc.ac.uk> The European Community ESPRIT Working Group in Neural and Computational Learning Theory (NeuroCOLT) has produced a set of new Technical Reports available from the remote ftp site described below. They cover topics in real valued complexity theory, computational learning theory, and analysis of the computational power of continuous neural networks. Abstracts are included for the titles. *** Please note that the location of the files was changed at the beginning of the year, so that any copies you have of the previous instructions should be discarded. The new location and instructions are given at the end of the list. *** ---------------------------------------- NeuroCOLT Technical Report NC-TR-96-047: ---------------------------------------- A Graph-theoretic Generalization of the Sauer-Shelah Lemma by Nicol\`o Cesa-Bianchi, University of Milan, Italy David Haussler, University of California, Santa Cruz, USA Abstract: We show a natural graph-theoretic generalization of the Sauer-Shelah lemma. This result is applied to bound the $\ell_{\infty}$ and $L_1$ packing numbers of classes of functions whose range is an arbitrary, totally bounded metric space. ---------------------------------------- NeuroCOLT Technical Report NC-TR-96-048: ---------------------------------------- A Comparison between Cellular Encoding and Direct Encoding for Genetic Neural Networks by Fr\'ed\'eric Gruau, CWI, the Netherlands Darrell Whitley, Colorado State University, USA Abstract: This paper compares the efficiency of two encoding schemes for Artificial Neural Networks optimized by evolutionary algorithms. Direct Encoding encodes the weights for an a priori fixed neural network architecture. Cellular Encoding encodes both weights and the architecture of the neural network. In previous studies, Direct Encoding and Cellular Encoding have been used to create neural networks for balancing 1 and 2 poles attached to a cart on a fixed track. The poles are balanced by a controller that pushes the cart to the left or the right. In some cases velocity information about the pole and cart is provided as an input; in other cases the network must learn to balance a single pole without velocity information.
A careful study of the behavior of these systems suggests that it is possible to balance a single pole with velocity information as an input and without learning to compute the velocity. A new fitness function is introduced that forces ANN to compute the velocity. By using this new fitness function and tuning the syntactic constraints used with cellular encoding, we achieve a tenfold speedup over our previous study and solve a more difficult problem: balancing two poles when no information about the velocity is provided as input. -------------------------------------------------------------------- ***************** ACCESS INSTRUCTIONS ****************** The Report NC-TR-96-001 can be accessed and printed as follows % ftp ftp.dcs.rhbnc.ac.uk (134.219.96.1) Name: anonymous password: your full email address ftp> cd pub/neurocolt/tech_reports ftp> binary ftp> get nc-tr-96-001.ps.Z ftp> bye % zcat nc-tr-96-001.ps.Z | lpr -l Similarly for the other technical reports. Uncompressed versions of the postscript files have also been left for anyone not having an uncompress facility. In some cases there are two files available, for example, nc-tr-96-002-title.ps.Z nc-tr-96-002-body.ps.Z The first contains the title page while the second contains the body of the report. The single command, ftp> mget nc-tr-96-002* will prompt you for the files you require. A full list of the currently available Technical Reports in the Series is held in a file `abstracts' in the same directory. The files may also be accessed via WWW starting from the NeuroCOLT homepage: http://www.dcs.rhbnc.ac.uk/neural/neurocolt.html or directly to the archive: ftp://ftp.dcs.rhbnc.ac.uk/pub/neurocolt/tech_reports Best wishes John Shawe-Taylor From listerrj at helios.aston.ac.uk Thu Jul 25 12:18:37 1996 From: listerrj at helios.aston.ac.uk (Richard Lister) Date: Thu, 25 Jul 1996 17:18:37 +0100 Subject: Postdoctoral Research Fellowship Message-ID: <15572.199607251618@sun.aston.ac.uk> ---------------------------------------------------------------------------- Neural Computing Research Group ------------------------------- Dept of Computer Science and Applied Mathematics Aston University, Birmingham, UK POSTDOCTORAL RESEARCH FELLOWSHIP -------------------------------- *** Full details at http://www.ncrg.aston.ac.uk/ *** "Dynamical Systems and Information Geometric Approaches to Generalization" -------------------------------------------------------------------------- A mathematically-oriented researcher is required to work on a project examining a geometric view of generalization in neural networks. Geometry and topology occur at two levels in network models. A dynamical systems perspective is required to examine the behaviour of an individual neural network structure, and an information geometric perspective is required to examine the space of neural network models and perform inference. In both cases the role and geometrization of prior knowledge guides the generalization. The aim of this project is to investigate the issues of generalization in neural networks based on the geometric properties of dynamical systems, and information geometry spaces. This project forms part of a larger research activity on Validation and Verification of Neural Networks. Conditions of Service --------------------- Salaries will be up to point 6 on the RA 1A scale, currently 15,986 UK pounds. The salary scale is subject to annual increments. 
How to Apply ------------ If you wish to be considered for this Fellowship, please send a full CV and publications list, including full details and grades of academic qualifications, together with the names of 3 referees, to: Professor David Lowe Neural Computing Research Group Dept. of Computer Science and Applied Mathematics Aston University Birmingham B4 7ET, U.K. Tel: 0121 333 4631 Fax: 0121 333 6215 e-mail: D.Lowe at aston.ac.uk e-mail submission of postscript files is welcome. Candidates that applied for this Fellowship will also automatically be considered for the four other postdoctoral Fellowships currently offered by the Neural Computing Research Group. Closing date: 19 August, 1996. ---------------------------------------------------------------------------- From radford at cs.toronto.edu Thu Jul 25 17:03:11 1996 From: radford at cs.toronto.edu (Radford Neal) Date: Thu, 25 Jul 1996 17:03:11 -0400 Subject: TR on Factor Analysis Using Delta-Rule Wake-Sleep Learning Message-ID: <96Jul25.170319edt.1282@neuron.ai.toronto.edu> Technical Report Available FACTOR ANALYSIS USING DELTA-RULE WAKE-SLEEP LEARNING Radford M. Neal Dept. of Statistics and Dept. of Computer Science University of Toronto Peter Dayan Department of Brain and Cognitive Sciences Massachusetts Institute of Technology 24 July 1996 We describe a linear network that models correlations between real-valued visible variables using one or more real-valued hidden variables - a *factor analysis* model. This model can be seen as a linear version of the "Helmholtz machine", and its parameters can be learned using the "wake-sleep" method, in which learning of the primary "generative" model is assisted by a "recognition" model, whose role is to fill in the values of hidden variables based on the values of visible variables. The generative and recognition models are jointly learned in "wake" and "sleep" phases, using just the delta rule. This learning procedure is comparable in simplicity to Oja's version of Hebbian learning, which produces a somewhat different representation of correlations in terms of principal components. We argue that the simplicity of wake-sleep learning makes factor analysis a plausible alternative to Hebbian learning as a model of activity-dependent cortical plasticity. This technical report is available in compressed Postscript by ftp to the following URL: ftp://ftp.cs.toronto.edu/pub/radford/ws-fa.ps.Z ---------------------------------------------------------------------------- Radford M. Neal radford at cs.utoronto.ca Dept. of Statistics and Dept. of Computer Science radford at utstat.utoronto.ca University of Toronto http://www.cs.utoronto.ca/~radford ---------------------------------------------------------------------------- From tanig at burton.zfe.siemens.de Fri Jul 26 08:26:20 1996 From: tanig at burton.zfe.siemens.de (Michiaki Taniguchi) Date: Fri, 26 Jul 1996 14:26:20 +0200 Subject: NEuroNet Industrial Studentship Message-ID: <199607261226.OAA29944@burton.zfe.siemens.de> ************************************************************************ NEuroNet Industrial Studentship at the Neural Network Group, Siemens Corporate Research ************************************************************************ Two studentships in the amount of 800 ECU/month for six months are available from the NEuroNet program for EU students. A major objective of NEuroNet Industrial Studentship is to provide an opportunity for students to gain experience in the industrial environment. 
Siemens will provide supervision of two students for SIX months in the laboratory located in Munich/Germany. The students will be assigned to project-relevant tasks in the area of telecommunications. During this time the students will have an opportunity to become acquainted with real-world applications of Neural Networks in the industrial environment. Siemens is one of the largest companies in the electrical and electronics industry world-wide, with at present about 2000 staff members in its Corporate Research and Development Division. In the Neural Network Project, which was started in 1988, more than 20 scientists and about an equal number of graduate and Ph.D. students are working on the theory and on applications of Neural Networks, among others, in: - time-series forecasting - non-linear statistics and dynamics - non-linear modeling and control - development of a simulation environment for Neural Networks - telecommunications One of the main areas of activity of Siemens is telecommunication - an area where we see promising new applications of Neural Networks. By now, Neural Networks are established as a new technology for modeling and control of complex technical systems. We see a large potential to improve the current technology of telecommunication by neural-based approaches. Requirements: - programming experience (C, C++, Matlab,...) - basic knowledge in Neural Networks - capability of independent work - background in telecommunication would be desirable Applicants should send as soon as possible a cv (curriculum vitae) which states nationality, place of study, their research interests and, if available, a list of publications. They have to be registered for a university degree (undergraduate or graduate students). During the six months, the financial support by NEuroNet will be 800 ECU per month. Time-scale: The studentship should start autumn/winter this year and will last for 6 months. Applications should be sent as soon as possible to: Michiaki Taniguchi Address: ZFE T SN 4, Siemens AG Otto-Hahn-Ring 6 D-81739 München, Germany Phone: +49/89/636-49506 Fax: +49/89/636-49767 e-mail: Michiaki.Taniguchi at zfe.siemens.de ------------------------------------------------------------------------------ _/ Michiaki Taniguchi Phone: +49/89/636-49506 _/ _/ _/ ZFE T SN 4 Fax: +49/89/636-49767 _/_/ _/ _/ _/ _/_/_/ Siemens AG _/ _/ _/ _/ _/ Otto-Hahn-Ring 6 _/ _/ _/ _/ _/_/_/ 81730 Muenchen Germany e-mail: Michiaki.Taniguchi at zfe.siemens.de http://www.siemens.de/zfe_nn/homepage.html From N.Sharkey at dcs.shef.ac.uk Fri Jul 26 14:01:40 1996 From: N.Sharkey at dcs.shef.ac.uk (Noel Sharkey) Date: Fri, 26 Jul 96 14:01:40 BST Subject: ROBOT LEARNING: the new wave Message-ID: <9607261301.AA19506@dcs.shef.ac.uk> see FAQs below: ****** ROBOT LEARNING: THE NEW WAVE ****** Special Issue of Robotics and Autonomous Systems SPECIAL EDITOR Noel Sharkey (Sheffield) SPECIAL EDITORIAL BOARD Michael Arbib (USC) Ronald Arkin (GIT) George Bekey (USC) Randall Beer (Case Western) Bartlett Mel (USC) Maja Mataric (Brandeis) Carme Torras (Spain) Lina Massone (Northwestern) Lisa Meeden (Swarthmore) + large international REVIEW PANEL (see web page) A number of people have written to me with questions regarding the special issue. I thought that it would be better to forward the answers to everyone. FAQs Q. Where can I get information about the issue and instructions to authors? A. The full call and link to instructions can be found at www.dcs.shef.ac.uk/research/groups/nn/RASspecial.html Q.
I have nearly finished my paper but need an extension of the deadline; is that possible? A. Since I shall be on vacation for two weeks, authors can have an extension of two weeks (15th August). Q. The call for papers stressed a bias in favour of papers reporting implementations on real robots. We have a paper that is based on a very relevant simulation (or a review), is that acceptable? A. Our main objective is to publish high quality papers that reflect the state of the art in robot learning. The bias (given papers of equal quality) WILL be for real implementations. However, we realise that there is other important research that bears directly on robot learning problems. We will accept submission of any such relevant papers. noel From daphne at braindev.uoregon.edu Mon Jul 29 16:25:18 1996 From: daphne at braindev.uoregon.edu (Daphne Bavelier) Date: Mon, 29 Jul 1996 13:25:18 -0700 Subject: Postdoctoral Position in Cognitive Neuroscience Message-ID: <199607292025.NAA09568@braindev.uoregon.edu> Postdoctoral Position in Cognitive Neuroscience Institute of Computational and Cognitive Sciences Georgetown University, Washington DC A postdoctoral/research fellow position is available immediately in the brain and vision lab at the Institute of Computational and Cognitive Sciences at Georgetown University. Research focuses on the neural basis of visual cognition, combining behavioral and imaging techniques (fMRI, ERPs). Particular research areas within the lab include the mechanisms of visual attention and scene/object perception in normal adults as well as the study of plastic changes in the visual system after altered experience (either due to learning or altered sensory experience as in congenitally deaf individuals). Applicants should have a strong background and education in cognitive neuroscience. Special consideration will be given to candidates with prior experience in computational neuroscience and/or familiarity with imaging techniques. Appointment is due to start on August 1st, 1996 or as soon as possible thereafter. The institute offers a variety of laboratories in the field of cognitive neuroscience using investigation techniques such as single cell and optical recordings in behaving monkeys and bats, a 7T fMRI for animal studies and a 1.5T fMRI for human subjects. Applicants should send a CV, summary of relevant research experience and the names and addresses of at least two referees to: Dr. Daphne Bavelier. Institute for Computational and Cognitive Sciences. Georgetown University. New Research Building. 3970 Reservoir Road. Washington DC 20007-2197, or via email to daphne at braindev.uoregon.edu. From ling at cs.hku.hk Wed Jul 31 06:06:56 1996 From: ling at cs.hku.hk (Charles X. Ling) Date: Wed, 31 Jul 96 06:06:56 HKT Subject: Controversies at the Symposium of Computational Model of Development Message-ID: <9607302206.AA16412@stamina.cs.hku.hk> The 1996 Cognitive Science Conference was held from June 12 to 15 at UC San Diego. During the Conference, Kim Plunkett and Tom Shultz organized a symposium on Computational Models of Development, with four speakers: Jeff Elman, Denis Mareschal, Tom Shultz, and me. There were also two discussants, Jeff Shrager and Liz Bates. The symposium sparked heated debates among speakers, discussants, and the audience. Many good discussions were carried out after the Symposium. I have put a personal account of the event, the transparencies of my speech, and Jeff Shrager's commentary on my web page.
I am also setting up animations on how decision tree learning algorithms learn quasi-regular associations, demonstrate graceful degradation, and have graded effect of development. Please let me know any comments you may have. You can find all of the above at: http://www.cs.hku.hk/~ling Cheers, Charles From gmato at mu.ft.uam.es Wed Jul 31 10:56:23 1996 From: gmato at mu.ft.uam.es (German Mato) Date: Wed, 31 Jul 1996 16:56:23 +0200 Subject: paper on the dynamics of receptive fields Message-ID: <9607311456.AA10800@mu.ft.uam.es> The following paper on the dynamics of receptive fields in the visual cortex has been put into the Neuroprose repository. Comments and suggestions are welcome. Sorry, no hard copy available. German Mato and Nestor Parga gmato at delta.ft.uam.es parga at ccuam3.sdi.uam.es ***************************************************** FTP-host: archive.cis.ohio-state.edu FTP-filename: /pub/neuroprose/parga.dynrf.ps.Z 44 pages Abstract: In this work we study the dynamical changes of receptive fields in a system in which the input has a lesion. A two-layer architecture representing the retina and V1 is introduced and the values of the horizontal connections in the second layer are updated in such a way that the (two-point and higher) correlations between the activities of different cortical neurons are minimized. We find that this algorithm, with a simultaneous dynamics for the activities of cortical neurons and for their horizontal connections, predicts several experimental facts: the expansion of the receptive fields, the bias in feature localization experiments and the {\it filling-in} phenomenon (in which the lesion is "filled" with the surrounding pattern after a time of adaptation). We find that non-Hebbian terms in the updating rule for the horizontal connections are essential to account quantitatively for the experimental results. From JEROEN.vanDEUTEKOM at wkap.nl Wed Jul 31 05:49:07 1996 From: JEROEN.vanDEUTEKOM at wkap.nl (Jeroen van Deutekom) Date: Wed, 31 Jul 1996 11:49:07 +0200 Subject: Neural Processing Letters Message-ID: <3156481031071996/A02085/WKVAX5/11A7FAA82300*@MHS> Considering the background of this mailinglist it might be of interest to inform you about the journal NEURAL PROCESSING LETTERS Editors in Chief: Michel Verleysen Univ. Catholique de Louvain, Belgium Francois Blayo EERIE, Lyon, France This journal is published by Kluwer Academic Publishers. More and current information on the journal of Neural Processing Letters and table of contents can be obtained from the following sites: Gopher: gopher.wkap.nl WWW: ftp://gopher.wkap.nl/journal/nepl gopher://gopher.wkap.nl:70/00gopher_root1%3A%5Bjournal.nepl%5Dnepl.inf Information can also be obtained via: E-mail from North and South America: kluwer at wkap.com E-mail from the Rest of the World : services at wkap.nl
From psarroa at westminster.ac.uk Tue Jul 2 04:41:30 1996 From: psarroa at westminster.ac.uk (Alexandra Psarrou) Date: Tue, 2 Jul 1996 09:41:30 +0100 (BST) Subject: Research post in Face Recognition Message-ID: <199607020841.JAA27566@jaguar.wmin.ac.uk> Research Post in Face Recognition CENTRE FOR ARTIFICIAL INTELLIGENCE RESEARCH Sir George Cayley Research Institute University of Westminster Applications are invited for the position of a Research Assistant in the Centre for AI Research of the University of Westminster to work on a one-year research project in Machine Vision and Neural Networks. The successful candidate will undertake research in the area of Dynamic Face Recognition. The aim of this project is to exploit existing machine vision and neural network techniques for developing a framework for dynamic face recognition based on photometric representations. Applicants for this post should be educated to degree level within a relevant discipline (preferably computer science), and possess a working knowledge of C/C++/X-Windows on Unix platforms. Knowledge of image processing and neural network techniques will be an advantage. The post is available from July 1996 and the person appointed will be expected to start as soon as possible. Salary scales: Research A: 11,388 - 15,026 pounds sterling p.a. (including London allowance) - for more information about the project phone/email or send your CV to: Dr. Alexandra Psarrou School of Computer Science and Information Systems Engineering University of Westminster 115 New Cavendish Str London W1M 8JS Tel: (+44) - 171-911-5000 ext 3599 Fax: (+44) - 171-911-5089 Email: psarroa at westminster.ac.uk - for more information about the AI Research centre check our web page: http://www.scsise.wmin.ac.uk/AI/AI_Division.html From ndxdpran at rrzn-user.uni-hannover.de Wed Jul 3 05:49:54 1996 From: ndxdpran at rrzn-user.uni-hannover.de (ndxdpran@rrzn-user.uni-hannover.de) Date: Wed, 3 Jul 1996 11:49:54 +0200 (MET DST) Subject: ICOBIP'97 - correction of WWW home page Message-ID: <199607030949.LAA18829@sun1.rrzn-user.uni-hannover.de> A non-text attachment was scrubbed... Name: not available Type: text Size: 1160 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/3d993bec/attachment-0001.ksh From c.k.i.williams at aston.ac.uk Wed Jul 3 09:53:50 1996 From: c.k.i.williams at aston.ac.uk (Chris Williams) Date: Wed, 03 Jul 1996 15:53:50 +0200 Subject: Post-doc research position at Aston University, England Message-ID: <10110.199607031353@sun.aston.ac.uk> Postdoctoral Research Fellowship at Aston University, England Combining Spatially-Distributed Predictions from Neural Networks The Neural Computing Research Group at Aston is looking for a highly motivated individual for a 2 year postdoctoral research position in the area of "Combining Spatially-Distributed Predictions from Neural Networks", working with Dr. Chris Williams and Dr. Ian Nabney. The aim of this project is to develop methods for the fusion of spatially-distributed predictions from neural networks with prior knowledge about possible spatial patterns. This post is funded by a grant from the Engineering and Physical Sciences Research Council (UK), in collaboration with British Aerospace and the Meteorological Office. Potential candidates should have strong mathematical and computational skills, with a background in one or more of neural networks, Bayesian belief networks and statistical Markov chain Monte Carlo computation.
Neural networks have been used very successfully in a wide variety of domains for performing classification or regression tasks. A characteristic of most currently successful applications is that the input patterns are either independent (as in static pattern classification) or related over time, rather than being spatially distributed. To extend the use of neural networks to spatially distributed tasks, such as the prediction of a wind vector-field from remote-sensing data, typically it is necessary to combine local bottom-up predictions (wind vector predictions on a pixel-by-pixel basis) with global prior knowledge (typical wind-field configurations, including weather fronts). This combination can be achieved by using Bayes' theorem to obtain the posterior distribution for the features of interest (the wind-field). The project will apply this framework in the areas of remote sensing, the segmentation of images, and object recognition. Closing date: 29 July, 1996. Informal enquiries can be made by email to Chris Williams (C.K.I.Williams at aston.ac.uk) or to Ian Nabney (I.T.Nabney at aston.ac.uk). The target start date is October 1996, although this may be somewhat flexible, More information on the Neural Computing Research Group and the postdoc position is available from http://www.ncrg.aston.ac.uk/ Salaries will be up to point 6 on the RA 1A scale, currently 15,986 UK pounds. These salary scales are subject to annual increments. If you wish to be considered for this position, please send a full CV and publications list, together with the names of 3 referees, to: Dr. Chris Williams Neural Computing Research Group Department of Computer Science and Applied Mathematics Aston University Birmingham B4 7ET, U.K. Tel: +44 121 333 4631 Fax: +44 121 333 4586 e-mail: C.K.I.Williams at aston.ac.uk (email submission of postscript files is welcome) From mrj at dcs.ed.ac.uk Wed Jul 3 10:43:08 1996 From: mrj at dcs.ed.ac.uk (Mark Jerrum) Date: Wed, 3 Jul 1996 15:43:08 +0100 Subject: ICMS Workshop on the Vapnik-Chervonenkis Dimension, Edinburgh Message-ID: <17667.9607031443@ox.dcs.ed.ac.uk> [For distribution on the connectionist mailing list: thanks!] ************************************************************** *** *** *** ICMS WORKSHOP on the VAPNIK-CHERVONENKIS DIMENSION *** *** Edinburgh, 9th--13th September 1996 *** *** *** *** An interdisciplinary meeting of interest to *** *** probabilists, statisticians, theoretical computer *** *** scientists, and the machine learning community *** *** *** ************************************************************** The International Centre for Mathematical Sciences (ICMS) at Edinburgh will hold a Workshop on the Vapnik-Chervonenkis Dimension(*) in the week 9th--13th September 1996. The workshop will take place at the ICMS's headquarters at 14 India Street, Edinburgh, the birthplace of James Clerk Maxwell, which has recently been adapted to support meetings with about 50 participants. We (the organisers or the workshop) envisage a multidisciplinary meeting covering the topic in all its aspects: probability and statistics, computational learning theory, geometry, and applications in computer science. The following invited speakers have agreed to participate: Shai Ben-David, Technion, Haifa, Israel; David Haussler, University of California at Santa Cruz, USA; Jiri Matousek, Charles University, Prag, Czech Republic; V. N. Vapnik, AT&T Bell Laboratories, Holmdel, NJ, USA. 
A registration form is available from the workshop's WWW page at http://www.dcs.ed.ac.uk/~mrj/VCWorkshop/ (also accessible from the ICMS home page). Alternatively, intending participants may contact the ICMS by post or e-mail: Margaret Cook ICMS 14 India Street Edinburgh EH3 6EZ Scotland Phone: +44 (0)131-220-1777 Fax: +44 (0)131-220-1053 E-mail: icms at maths.ed.ac.uk Those interested in participating should return the registration form as soon as possible, as the total number of places is limited by the size of the venue. There will be ample scope for contributed talks. Mark Jerrum, Angus MacIntyre, and John Shawe-Taylor (Workshop organisers) (*) The Vapnik-Chervonenkis (VC) dimension is a combinatorial parameter of a set system (equivalently, of a class of predicates) which, informally, can be said to characterise the expressibility of the class. This parameter is of great significance in a wide range of applications: in statistics, theoretical computer science, and machine learning, for example. In statistics, one may identify ``set'' with ``event,'' in which case finite VC dimension entails a _uniform_ analogue of the strong law of large numbers for the class of events in question. (This is the situation described by the phrase ``uniform convergence of empirical measure.'') In learning theory (the mathematical theory of inductive inference), one may identify ``set'' with ``concept,'' in which case the VC dimension of the concept class gives quite tight bounds on the sample size that is necessary and sufficient for a learner to form an accurate hypothesis from classified examples. From chandler at kryton.ntu.ac.uk Thu Jul 4 06:03:27 1996 From: chandler at kryton.ntu.ac.uk (chandler) Date: Thu, 4 Jul 1996 10:03:27 +0000 Subject: PhD Research Positions available, Nottingham England Message-ID: <9607040903.AA01426@kryton.ntu.ac.uk> VACANCIES Research (2 Bursary Students) Object Recognition for Assembly Human Centred Assembly Introduction The Manufacturing Automation Research Group (MARG) has been working on the development of Artificial Intelligence techniques for the recognition of 3-D objects. The aim of this work is to develop a system for the recognition of solid objects independent of their position and orientation within the work domain. This is to aid in the manipulation of objects within a robotic cell, particularly for the processes of assembly and other manipulative tasks . In the formation of this work, a novel method using ANN with parallels to the processing within the primate visual system, has been used. The Programme Object Recognition for Assembly The aim of this project is to improve the fundamental understanding of object recognition for use in assembly process. The major area of the research will be the implementation of novel techniques of object recognition, and provide position and rotation parameters to enable assembly tasks to be executed. The ANN techniques already developed in-house will be extended and integrated with the robot, providing invariant object recognition capability to the system. Additionally the geometric descriptors will be assessed for their validity/accuracy. Task level robotic operations can then use these descriptors as a base for further actions. Human Centred Assembly Whilst the sections of the research programme described above will provide both novel and effective robotic assembly, this section of the work seeks to draw the maximum knowledge from existing manual methods. 
It therefore provides an effective link, drawing knowledge from the manual operation and contributing to the learning of a manipulative skill by a machine. The work will analyse human centred assembly strategies and contrast them with automation techniques. Methods which are applicable to sensory challenged human assembly will be interpreted and applied to the sensor equipped robot. Applications are invited from Graduates (Engineering, Science) with good classifications Bursary 6,000 UKP / annum for 3 years Applicants will be expected to register for a PhD programme. References 1) Keat J, Balendran V, Sivayoganathan K. 1995. Invariant Object Recognition with a Neurobiological Slant, Proceedings of the Fourth IEE International Conference on Artificial Neural Networks, Cambridge. 2) Keat J, Balendran V, Sivayoganathan K, Sackfield A. "IvOR: A Neurobiologically slanted approach to PSRI Object recognition", 5th Irish Neural Network Conference - INNC95, September 11-13, 1995, pp. 30-37, Maynooth, Ireland. 3) Howarth M, Sivayoganathan K, Thomas P, Gentle C.R., "Robotic task level programming using neural networks", 4th Int. Conf. on Artificial Neural Networks, 26-28 June 1995. pp 262-267. Churchill College, Cambridge. 4) Balendran V, Sivayoganathan K, Al-Dabass D. 1989. Detection of flaws on slowly varying surfaces, Proceedings of the Fifth National Conference on Production Research, London, Kogan Press, pp82-85. 5) Keat J, Balendran V, Sivayoganathan K, Sackfield A. 1994. 3-D data collection for object recognition, Advances in Manufacturing Technology VIII, Proceedings of the Tenth National Conference on Manufacturing Research, Loughborough, pp648-652. Contact: Dr. K. Sivayoganathan man3sivayk at ntu.ac.uk tel: +44(0)115 941 8418 ex 4112. Dr. S. Kennedy man3kennesj at ntu.ac.uk tel: +44(0)115 941 8418 ex 4106. Mr. M. Howarth m.howarth at marg.ntu.ac.uk tel: +44(0)115 941 8418 ex 4110. Manufacturing Automation Research Group Department of Manufacturing Engineering, Burton Street, Nottingham, NG1 4BU. fax: +44(0)115 941 4024 From kevin.swingler at psych.stir.ac.uk Thu Jul 4 12:42:54 1996 From: kevin.swingler at psych.stir.ac.uk (Kevin Swingler) Date: Thu, 4 Jul 96 12:42:54 BST Subject: New Book Applying Neural Networks Message-ID: <9607041142.AA21529@nevis.stir.ac.uk> *********************************************************************** NEW BOOK ANNOUNCEMENT Applying Neural Networks A Practical Guide Kevin Swingler Academic Press. ISBN: 0126791708 *********************************************************************** Description This book takes the most common neural network architecture--the multi-layer perceptron--and leads the reader through every step of the development of a trained network. Chapters cover data collection, quantity, quality, validation, preparation and encoding; network architecture, size and training; error analysis; network validation, confidence limits, sensitivity measures and rule derivation. The book also covers novelty detection and time series analysis. The book presents a set of procedures designed to ensure successful network development and is concluded with a set of demonstration chapters on the application of neural networks to signal processing, financial analysis and process control. Each chapter is divided into three sections: A general discussion without equations, a how-to-do-it section where equations and algorithms are laid out, and a set of worked examples. The book also comes with a disk of C and C++ programs which implement the techniques discussed.
Ordering Applying Neural Networks may be ordered directly from Academic Press or from your usual retail outlets. From j.b.rogers at ic.ac.uk Fri Jul 5 05:43:07 1996 From: j.b.rogers at ic.ac.uk (j.b.rogers@ic.ac.uk) Date: Fri, 5 Jul 1996 10:43:07 +0100 (BST) Subject: NN course announcement Message-ID: <27816.9607050943@london.ee.ic.ac.uk.ee.ic.ac.uk> A non-text attachment was scrubbed... Name: not available Type: text Size: 2452 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/92a2d868/attachment-0001.ksh From srx014 at coventry.ac.uk Mon Jul 8 07:57:04 1996 From: srx014 at coventry.ac.uk (Colin Reeves) Date: Mon, 8 Jul 1996 12:57:04 +0100 (BST) Subject: Research studentship at Coventry University In-Reply-To: <344.836193914@B.GP.CS.CMU.EDU> Message-ID: The following PhD studentship is available. Please send CVs either electronically or by snail-mail (address at foot of this message). ------------------------------------------------------------------------- Project ------- The application of artificial intelligence to the control of large, complex gas transmission systems. Control Theory and Applications Centre, and British Gas TransCo, Coventry University System Control, Hinckley Background ---------- British Gas TransCo is responsible for the transportation and storage of gas produced by offshore fields. Gas is delivered at 6 coastal terminals where it enters the TransCo pipeline system. System Control manages and controls the flow of gas from these terminals to 18 million consumers. Following a recent re -organisation, there are 4 Area Control Centres (ACCs), at each of which 6 teams of 4 engineers are responsible for the operation of their Local Delivery Zones (LDZs). In order to provide a secure and economic gas transportation service, the ACC engineers need to develop best practices and apply these consistently to the operation of the LDZs. This project will investigate the possible application of Artificial Intelligence techniques to support the operation of these large, complex gas transmission systems. Methodology ----------- The Control Theory and Applications Centre at Coventry University has successfully applied AI methods, including neural nets, fuzzy systems, genetic algorithms etc, to complex control problems. It is intended in this project to extend these approaches to the gas pipeline systems of TransCo. This will involve firstly the selection (in close collaboration with TransCo System Control) of a suitable subsystem for a feasibility study. Historical records of this subsystem will be used to identify important variables, and then to extract knowledge in the form of rules. Knowledge elicitation from the experts (the operations engineers ) will also be carried out. Probable techniques include the use of neuro-fuzzy methods and the application of genetic algorithms or other heuristics in mining the data archives. At appropriate times it is required that a competent report will be presented to TransCo. In addition there will be opportunities to present the work at conferences or to publish through technical journals. The successful candidate will register for a PhD, working under the supervision of Colin Reeves at Coventry University. Candidate profile ----------------- The student should have a first degree (at least a 2i) or MSc in an appropriate technological or engineering subject. Good general mathematical and computing skills are more important than the specific subject of the degree. 
It would be an advantage to have previous knowledge of AI methods such as those mentioned above. Knowledge of control systems and databases would also be useful, and the candidate needs to possess the appropriate inter-personal skills to work in an operational engineering environment. Timescale --------- The project will commence in September 1996 and will last for 3 years. ------------------------------------------------------------------------- ___________________________________________ | Colin Reeves | | School of Mathematical and Information | | Sciences | | Coventry University | | Priory St | | Coventry CV1 5FB | | tel :+44 (0)1203 838979 | | fax :+44 (0)1203 838585 | | email: CRReeves at coventry.ac.uk | |___________________________________________| From moeller at informatik.uni-bonn.de Tue Jul 9 04:06:27 1996 From: moeller at informatik.uni-bonn.de (Knut Moeller) Date: Tue, 9 Jul 1996 10:06:27 +0200 (MET DST) Subject: HeKoNN96-CfP Message-ID: <199607090806.KAA09957@macke.informatik.uni-bonn.de> This announcement was sent to various lists. Sorry if you received multiple copies. ----------------------------------------------------------------- WWW: http://set.gmd.de/AS/fg1.1.2/hekonn ----------------------------------------------------------------- CALL FOR PARTICIPATION ================================================================= = = = H e K o N N 9 6 = = = Autumn School in C o n n e c t i o n i s m and N e u r a l N e t w o r k s October 2-6, 1996 Muenster, Germany Conference Language: German ---------------------------------------------------------------- A comprehensive description of the Autumn School together with abstracts of the courses can be found at the following address: WWW: http://set.gmd.de/AS/fg1.1.2/hekonn = = = O V E R V I E W = = = Artificial neural networks (ANN's) have been discussed in many diverse areas, ranging from models of cortical learning to the control of industrial processes. The goal of the Autumn School in Connectionism and Neural Networks is to give a comprehensive introduction to connectionism and artificial neural networks (ANN's) and to provide an overview of the current state of the art. Courses will be offered in five thematic tracks. (The conference language is German.) The FOUNDATION track will introduce basic concepts (A. Zell, Univ. Stuttgart) and theoretical issues. Hardware aspects (U. Rueckert, Univ. Paderborn), Lifelong Learning (G. Paass, GMD St.Augustin), algorithmic complexity of learning procedures (M. Schmitt, TU Graz) and convergence properties of ANN's (K. Hornik, TU Vienna) are presented in further lectures. This year, a special track was devoted to BRAIN RESEARCH. Courses are offered about the simulation of biological neurons (R. Rojas, Univ. Halle), theoretical neurobiology (H. Gluender, LMU Munich), learning and memory (A. Bibbig, Univ. Ulm) and dynamical aspects of cortical information processing (H. Dinse, Univ. Bochum). The track on SYMBOLIC CONNECTIONISM and COGNITIVE MODELLING consists of courses on: procedures for extracting rules from ANN's (J. Diederich, QUT Brisbane), representation and cognitive models (G. Peschl, Univ. Vienna), autonomous agents and ANN's (R. Pfeiffer, ETH Zuerich) and hybrid systems (A. Ultsch, Univ. Marburg). APPLICATIONS of ANN's are covered by courses on image processing (H.Bischof, TU Vienna), evolution strategies and ANN's (J. Born, FU Berlin), ANN's and fuzzy logic (R. Kruse, Univ. Braunschweig), and on medical applications (T. Waschulzik, Univ. Bremen).
In addition, there will be courses on PROGRAMMING and SIMULATORS. Participants will have the opportunity to work with the SNNS simulator (G. Mamier, A. Zell, Univ. Stuttgart) and the VieNet2/ECANSE simulation tool (G. Linhart, Univ. Vienna). DEADLINE for applications is August 1,1996. For application or enquiries please contact: knepper at informatik.uni-bonn.de From terry at salk.edu Tue Jul 9 12:59:52 1996 From: terry at salk.edu (Terry Sejnowski) Date: Tue, 9 Jul 1996 09:59:52 -0700 (PDT) Subject: NEURAL COMPUTATION 8:5 Message-ID: <199607091659.JAA23546@helmholtz.salk.edu> Neural Computation - Contents Volume 8, Number 5 - July 1, 1996 Article Biologically Plausible Error-Driven Learning Using Local Activation Differences: The Generalized Recirculation Algorithm Randall C. O'Reilly Letters Effects Of Nonlinear Synapses On The Performance Of Multilayer Neural Networks G. Dundar, F-C. Hsu and K. Rose Modeling Slowly Bursting Neuron Via Calcium Store And Voltage-Independent Calcium Current Teresa Ree Chay Type 1 Membranes, Phase Resetting Curves, And Synchrony Bard Ermentrout On Neurodynamics With Limiter Function And Linsker's Developmental Model Jianfeng Feng, Hong Pan and Vwani P. Roychowdhury Effect Of Binocular Cortical Misalignment On Ocular Dominance And Orientation Selectivity Harel Shouval, Nathan Intrator, C. Charles Law, and Leon N Cooper A Novel Optimizing Network Architecture With Applications Anand Rangarajan, Steven Gold and Eric Mjolsness Gradient Projection Network: Analog Solver For Linearly Constrained Nonlinear Programming Kiichi Urahama Online Steepest Descent Yields Weights With Non-Normal Limiting Distribution Sayandev Mukherjee and Terrence L. Fine A Numerical Study On Learning Curves In Stochastic Multilayer Feedforward Networks K.-R. Muller, M. Finke, N. Murata, K. Schulten and S. Amari Rate Of Convergence In Density Estimation Using Neural Networks Dharmendra S. Modha and Elias Masry Modeling Conditional Probability Distributions For Periodic Variables Christopher M. Bishop and Ian T. Nabney ----- ABSTRACTS - http://www-mitpress.mit.edu/jrnls-catalog/neural.html SUBSCRIPTIONS - 1996 - VOLUME 8 - 8 ISSUES ______ $50 Student and Retired ______ $78 Individual ______ $220 Institution Add $28 for postage and handling outside USA (+7% GST for Canada). (Back issues from Volumes 1-7 are regularly available for $28 each to institutions and $14 each for individuals Add $5 for postage per issue outside USA (+7% GST for Canada) mitpress-orders at mit.edu MIT Press Journals, 55 Hayward Street, Cambridge, MA 02142. Tel: (617) 253-2889 FAX: (617) 258-6779 ----- From torkkk at base.sps.mot.com Tue Jul 9 13:12:27 1996 From: torkkk at base.sps.mot.com (Kari Torkkola) Date: Tue, 09 Jul 1996 10:12:27 -0700 Subject: papers on blind source separation available in the neuroprose archive Message-ID: <31E292FB.5347@base.sps.mot.com> The following two related papers on blind source separation are available on-line in the neuroprose archive. Papers extend the information maximization approach of Bell and Sejnowski towards the separation of more realistic mixtures of signals. 
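For readers who have not seen the baseline algorithm that these papers build on, the sketch below illustrates the Bell-Sejnowski information-maximization update for the simplest case of instantaneous mixtures; it is not taken from the papers above, and the function name, learning rate and use of Python/NumPy are illustrative assumptions only.

    import numpy as np

    def infomax_unmix(X, lr=0.01, epochs=50):
        # X: (n_sources x n_samples) matrix of instantaneously mixed signals.
        # Returns an unmixing matrix W; estimated sources are W @ X.
        n = X.shape[0]
        W = np.eye(n)                               # start from the identity
        for _ in range(epochs):
            for x in X.T:                           # one sample at a time
                x = x.reshape(-1, 1)
                y = 1.0 / (1.0 + np.exp(-(W @ x)))  # logistic output units
                # Bell-Sejnowski rule: dW proportional to (W^T)^-1 + (1 - 2y) x^T,
                # the gradient of the information transferred through the network.
                W += lr * (np.linalg.inv(W.T) + (1.0 - 2.0 * y) @ x.T)
        return W

As the abstracts below describe, the two papers generalize this idea by adapting delays between sources (ICASSP'96 paper) and adaptive filters in a feedback architecture (NNSP'96 paper), so that delayed and convolved mixtures can also be separated.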
FTP-host: archive.cis.ohio-state.edu FTP-file: pub/neuroprose/torkkola.icassp96.ps.Z FTP-file: pub/neuroprose/torkkola.nnsp96.ps.Z ------------------------------------------------------------------ BLIND SEPARATION OF DELAYED SOURCES BASED ON INFORMATION MAXIMIZATION 4 pages, 341K compressed, 920K uncompressed @InProceedings{Torkkola96icassp, author = "Kari Torkkola", title = "Blind separation of delayed sources based on information maximization", booktitle = "Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing", address = "Atlanta, GA", month = "May 7-10", year = "1996", pages = "3510-3513", url = "ftp://archive.cis.ohio-state.edu/pub/neuroprose/torkkola.icassp96.ps.Z", } Abstract Recently, Bell and Sejnowski have presented an approach to blind source separation based on the information maximization principle. We extend this approach into more general cases where the sources may have been delayed with respect to each other. We present a network architecture capable of coping with such sources, and we derive the adaptation equations for the delays and the weights in the network by maximizing the information transferred through the network. ------------------------------------------------------------------ BLIND SEPARATION OF CONVOLVED SOURCES BASED ON INFORMATION MAXIMIZATION 10 pages, 147K compressed, 340K uncompressed @InProceedings{ Torkkola96nnsp, author = "Kari Torkkola", title = "Blind separation of convolved sources based on information maximization", booktitle = "Neural Networks for Signal Processing VI (Proceedings of the 1996 IEEE Workshop)", address = "Kyoto, Japan", month = "September 4-6", year = "1996", note = "(in press)", url = "ftp://archive.cis.ohio-state.edu/pub/neuroprose/torkkola.nnsp96.ps.Z", } Abstract Blind separation of independent sources from their convolutive mixtures is a problem in many real world multi-sensor applications. In this paper we present a solution to this problem based on the information maximization principle, which was recently proposed by Bell and Sejnowski for the case of blind separation of instantaneous mixtures. We present a feedback network architecture capable of coping with convolutive mixtures, and we derive the adaptation equations for the adaptive filters in the network by maximizing the information transferred through the network. Examples using speech signals are presented to illustrate the algorithm. ------------------------------------------------------------------ Kari Torkkola Motorola Phoenix Corporate Research phone: +1-602-4134129 Mail Drop EL 508 fax: +1-602-4137281 2100 E. Elliot Rd email: torkkk at base.sps.mot.com Tempe, AZ 85284 A540AA at email.mot.com From mackay at mrao.cam.ac.uk Fri Jul 12 15:44:00 1996 From: mackay at mrao.cam.ac.uk (David J.C. MacKay) Date: Fri, 12 Jul 96 15:44 BST Subject: postdoctoral vacancy in Cambridge Message-ID: An 18 month postdoc in neural networks is available in the Department of Physics, Cambridge, Great Britain. Start date --- 1 Nov 96. Title: Neural Network Process Modelling for the Microstructural management of forged components Please see this URL for further information: http://wol.ra.phy.cam.ac.uk/is/sitsvac.html Please feel free to pass on this announcement to interested colleagues. Many thanks, David MacKay =============================== new phone number at work: 339852 ========== David J.C. 
MacKay email: mackay at mrao.cam.ac.uk www: http://wol.ra.phy.cam.ac.uk/mackay/ Cavendish Laboratory, tel: (01223) 339852 fax: 354599 home: 276411 Madingley Road, international code: +44 1223 Cambridge CB3 0HE. U.K. home: 19 Thornton Road, Girton, Cambridge CB3 0NP From murdock at ukraine.corp.mot.com Fri Jul 12 18:04:01 1996 From: murdock at ukraine.corp.mot.com (Mike Murdock) Date: Fri, 12 Jul 1996 17:04:01 -0500 Subject: internship in NN speech recognition at Motorola Message-ID: <199607122204.RAA06125@palau.mot.com> Motorola's Chicago Corporate Research Laboratories is currently seeking a motivated individual to fill an internship position in the Speech Recognition Group in Schaumburg, Illinois. The internship will last at least three months. The Speech Recognition Group has developed innovative neural network and signal processing technology for continuous, small vocabulary robust recognition. The successful candidate will work on a component of an HMM/neural network hybrid speech recognizer. The duties of the position include applied research, software development, and conducting experiments with speech data sets. A high level of motivation is the standard for all members of the team. The individual should possess a BS or MS degree in EE, CS or a related discipline. Strong programming skills in C are required. Knowledge of neural networks, decision trees, and statistical techniques is highly desirable. Please send resume and cover letter to be considered for this position to Motorola Inc., Corporate Staffing Department, Attn: Hybrid-5375, 1303 E. Algonquin Rd., Schaumburg, IL 60196. Fax: 847-576-4959. Motorola is an equal opportunity/affirmative action employer. We welcome and encourage diversity in our workforce. From haussler at cse.ucsc.edu Mon Jul 15 22:19:38 1996 From: haussler at cse.ucsc.edu (David Haussler) Date: Mon, 15 Jul 1996 19:19:38 -0700 Subject: Postdoctoral Positions in Biosequence Analysis at UCSC Message-ID: <199607160219.TAA28950@arapaho.cse.ucsc.edu> Postdoctoral and Graduate Researcher Positions in Biosequence Analysis at UCSC We are looking for people with backgrounds in statistics and computer science (e.g. machine learning, hidden Markov models, neural networks, speech recognition) to design and implement algorithms to find new genes in human DNA sequences, and to predict the structure and function of newly discovered protein sequences. Working knowledge of molecular biology is desired but not required; we can teach you what you need to know to get started and then you can learn the rest while you are here. An introductory overview of our research and selection of recent papers can be found at http://www.cse.ucsc.edu/research/compbio. The URL http://science-mag.aaas.org/science/scripts/display/short/272/5269/1730.html contains a recent article in Science about how hot this field is. Our biosequence analysis group currently consists of 3 faculty: D. Haussler (Computer Science), R. Hughey and K. Karplus (Computer Engineering), as well as 8 graduate students, 6 undergraduate students, and a postgraduate researcher. We have active collaborations with biologists at UCSC, as well as with groups at Lawrence Berkeley Labs Genome Center, the Sanger Centre, and the European Molecular Biology Laboratory. We have one postdoc position to start Jan. 1, 1997, and have the opportunity during this month (before Aug. 1, 1996) to apply for another postdoc position that will also start Jan. 1. 
In addition, we are trying to build our graduate program in this area, so we encourage post-baccalaureate students to apply to begin graduate study at UCSC in this area in Fall 1997. (In exceptional cases we will consider graduate applications for Jan. 1 or March 31, 1997 as well.) We support qualified graduate students with research assistantships. If you are interested in a postdoc position, send your vita and letters of reference asap to Lisa Pascal (lisa at cse.ucsc.edu, hardcopy: Lisa Pascal, Computer Science, University of California, Santa Cruz CA 95064). Applications and other information about our graduate program can be obtained on the web by starting at http://www.cse.ucsc.edu or by calling the graduate division at 408 459-2301 (email: gradadm at cats.ucsc.edu). For addition information about graduate study in computer science and computer engineering contact Carol Mullane (mullane at cse.ucsc.edu; 408 459-2576). From ling at cs.hku.hk Thu Jul 18 02:07:45 1996 From: ling at cs.hku.hk (Charles X. Ling) Date: Thu, 18 Jul 96 02:07:45 HKT Subject: Computational Cognitive Modeling: Source of the Power Message-ID: <9607171807.AA09004@stamina.cs.hku.hk> AAAI-96 Workshop Computational Cognitive Modeling Source of the Power One-day Workshop. August 5, 1996 (During UAI, KDD, AAAI, and IAAI. Portland, Oregon) Visit http://www.cs.hku.hk/~ling for updated info Program Committee: Charles Ling (co-chair), University of Hong Kong, ling at cs.hku.hk Ron Sun (co-chair), University of Alabama, rsun at cs.ua.edu Pat Langley, Stanford University Mike Pazzani, UC Irvine Tom Shultz, McGill University Paul Thagard, Univ. of Waterloo Kurt VanLehn, Univ. of Pittsburgh Invited speakers: Gary Cottrell, Jeff Elman, Denis Mareschal, Tom Shultz, Aaron Sloman, and Paul Thagard. Note: To attend the Workshop, you MUST register. To register, send e-mails to Charles Ling or Ron Sun. If you register the AAAI main conference, registration fee is free (but you still need to register); otherwise, there is a fee of $150 per Workshop. The Workshop Program and Schedule 9:00 am: Welcome and Introduction. Charles Ling and Ron Sun (Co-Chairs) 9:10 am: Aaron Sloman, The University of Birmingham, UK What sort of architecture is required for a human-like agent? (invited talk) 9:40 am: Susan L. Epstein and Jack Gelfand, City University of New York, USA The creation of new problem solving agents from experience with visual features 10:00 am: Denis Mareschal, Exeter University, UK Models of Object Permanence: How and Why they Work (invited talk) 10:30 am: coffee break 11:00 am: Pat Langley, Stanford University, USA An abstract computational model of learning selective sensing skills 11:20 am: Craig S. Miller, Dickinson College, USA The source of graded performance in a symbolic rule-based model 11:40 am: Christian D. Schunn and Lynne M. Reder, Carnegie Mellon University Modeling changes in strategy selections over time 12:00 pm: lunch break 1:30 pm: poster session 2:30 pm: Tom Shultz, McGill University, Montreal, Canada Generative Connectionist Models of Cognitive Development: Why They Work (invited talk) 3:00 pm: Garrison W. Cottrell, University of California, San Diego, USA Selective attention in the acquisition of the past tense (invited talk) 3:30 pm coffee break 4:00 pm: Jeff Elman, University of California, San Diego, USA States and stacks: Doing computation with a recurrent neural network (invited talk) 4:30 pm: Paul Thagard, Univ. 
of Waterloo, Canada Evaluating Computational Models of Cognition: Notes from the Analogy Wars (invited talk) 5:00 pm: Tony Veale, Barry Smyth, Diarmuid O'Donoghue, Mark Keane Representational myopia in cognitive mapping 5:20 pm: Panel and discussions Panelists: Charles Ling, Ron Sun, Pat Langley, Mike Pazzani. 6:30 pm: end From liaw at bmsr14.usc.edu Thu Jul 18 17:08:14 1996 From: liaw at bmsr14.usc.edu (Jim-Shih Liaw) Date: Thu, 18 Jul 1996 14:08:14 -0700 (PDT) Subject: Workshop on SENSORIMOTOR COORDINATION Message-ID: <199607182108.OAA07319@bmsr14.usc.edu> REGISTRATION INFORMATION Workshop on SENSORIMOTOR COORDINATION: AMPHIBIANS, MODELS, AND COMPARATIVE STUDIES Poco Diablo Resort, Sedona, Arizona, November 22-24, 1996 Co-Directors: Kiisa Nishikawa (Northern Arizona University, Flagstaff) and Michael Arbib (University of Southern California, Los Angeles). Local Arrangements Chair: Kiisa Nishikawa. E-mail enquiries may be addressed to Kiisa.Nishikawa at nau.edu or arbib at pollux.usc.edu. Further information may be found on our home page at http://www.nau.edu:80/~biology/vismot.html. Program Committee: Kiisa Nishikawa (Chair), Michael Arbib, Emilio Bizzi, Chris Comer, Peter Ewert, Simon Giszter, Mel Goodale, Ananda Weerasuriya, Walt Wilczynski, and Phil Zeigler. SCIENTIFIC PROGRAM The aim of this workshop is to study the neural mechanisms of sensorimotor coordination in amphibians and other model systems for their intrinsic interest, as a target for developments in computational neuroscience, and also as a basis for comparative and evolutionary studies. The list of subsidiary themes given below is meant to be representative of this comparative dimension, but is not intended to be exhaustive. The emphasis (but not the exclusive emphasis) will be on papers that encourage the dialog between modeling and experimentation. A decision as to whether or not to publish a proceedings is still pending. Central Theme: Sensorimotor Coordination in Amphibians and Other Model Systems Subsidiary Themes: Visuomotor Coordination: Comparative and Evolutionary Perspectives Reaching and Grasping in Frog, Pigeon, and Primate Cognitive Maps Motor Pattern Generators This workshop is the sequel to four earlier workshops on the general theme of "Visuomotor Coordination in Frog and Toad: Models and Experiments". The first two were organized by Rolando Lara and Michael Arbib at the University of Massachusetts, Amherst (1981) and Mexico City (1982). The next two were organized by Peter Ewert and Arbib in Kassel and Los Angeles, respectively, with the Proceedings published as follows: Ewert, J.-P. and M. A. Arbib (Eds.) 1989. Visuomotor Coordination: Amphibians, Comparisons, Models and Robots. New York: Plenum Press. Arbib, M.A. and J.-P. Ewert (Eds.) 1991. Visual Structures and Integrated Functions, Research Notes in Neural Computing 3. Heidelberg, New York: Springer Verlag. REGISTRATION INFORMATION Meeting Location and General Information The Workshop will be held at the Poco Diablo Resort in Sedona, Arizona (a beautiful small town set in dramatic red hills) immediately following the Society for Neuroscience meeting in 1996. The 1996 Neuroscience meeting ends on Thursday, November 21, so workshop participants can fly from Washington, DC to Phoenix, AZ that evening, meet Friday, Saturday, and Sunday, with a Workshop Banquet on Sunday evening, and fly home on Monday, November 25th. Paper sessions will be held all day on Friday, on Saturday afternoon, and all day on Sunday. 
Poster sessions will be held on Saturday afternoon and evening. A group field trip is planned for Saturday morning. Graduate Student and Postdoctoral Participation In order to encourage the participation of graduate students and postdoctorals, we have arranged for affordable housing, and in addition we are able to offer a reduced registration fee (see below) thanks to the generous contribution of the Office of the Associate Provost for Research and Graduate Studies at Northern Arizona University. Travel from Phoenix to Sedona Sedona, AZ is located approximately 100 miles north of Phoenix, where the nearest major airport (Sky Harbor) is located. Workshop attendees may wish to arrange their own transportation (e.g., car rental from Phoenix airport) from Phoenix to Sedona, or they may use the Workshop Shuttle (estimated round trip cost $20 US) to Sedona on 21 November, with a return to Phoenix on 25 November. If you plan to use the Workshop Shuttle, we will need to know your expected arrival time in Phoenix by 1 October 1996, to ensure that space is available for you at a convenient time. Lodging The following costs are for each night. Since many participants may want to extend their stay to further enjoy Arizona's scenic beauty, we have negotiated special rates for additional nights after the end of the workshop on November 24th. Attendees should make their own booking with the Poco Diablo Resort, by phone (800) 352-5710 or FAX (520) 282-9712. Thurs.-Fri. (and additional week nights before the workshop) per night: students $85 US + tax, faculty $105 + tax Sat.-Sun. (and additional week nights after the workshop) per night: students $69 + tax, faculty $89 + tax. The student room rates are for double occupancy. Thus, students willing to share a room may stay for half the stated rate. When you make your room reservations with the Poco Diablo Resort, please be sure to indicate the number of guests in your party. Graduate students and postdocs should be sure to indicate whether they want single or double occupancy. Registration Fees Students and postdoctorals $100; faculty, guests and others $200. The registration fee includes lunch Fri. - Sun., wine and cheese reception during the Saturday evening poster session, and a Farewell Dinner on Sunday evening. Registration fees should be paid by check in US funds, made payable to "Sensorimotor Coordination Workshop", and should be sent to Kiisa Nishikawa at the address listed below, together with the completed registration form that follows at the end of this announcement. Completed registration forms and fees must be received by 1 August, 1996. Late registration fees will be $150 for students and postdoctorals and $250 for faculty. REGISTRATION FORM NAME: ADDRESS: PHONE: FAX: EMAIL: STATUS: [ ] Faculty ($200); [ ] Postdoctoral ($100); [ ] Student ($100); [ ] Other ($200). (Postdocs and students: Please attach certification of your status signed by your supervisor.) TYPE OF PRESENTATION (paper vs. poster) ABSTRACT SUBMITTED (yes/no) AREAS OF INTEREST RELEVANT TO WORKSHOP: WILL YOU REQUIRE ANY SPECIAL AUDIOVISUAL EQUIPMENT FOR YOUR PRESENTATION? HAVE YOU MADE A RESERVATION WITH THE HOTEL? EXPECTED TIME OF ARRIVAL IN PHOENIX (ON NOVEMBER 21): EXPECTED TIME OF DEPARTURE FROM PHOENIX (ON NOVEMBER 25): DO YOU WISH TO USE THE WORKSHOP SHUTTLE TO TRAVEL FROM PHOENIX TO SEDONA? (If so, please be sure that we know your expected arrival time by 1 October!) DO YOU WISH TO PARTICIPATE IN A GROUP HIKE IN THE SEDONA AREA ON SATURDAY MORNING? 
Please make sure that your check (in US funds and payable to the Sensorimotor Coordination Workshop) is included with this form. If you plan to bring a guest with you to the Workshop, please add their name(s) to this form and enclose their registration fee along with your own. Mail to: Kiisa Nishikawa, Department of Biological Sciences, Northern Arizona University, Flagstaff, AZ 86011-5640. E-mail: Kiisa.Nishikawa at nau.edu. FAX: (520)523-7500. Phone: (520)523-9497. From philh at cogs.susx.ac.uk Fri Jul 19 10:43:43 1996 From: philh at cogs.susx.ac.uk (Phil Husbands) Date: Fri, 19 Jul 1996 15:43:43 +0100 (BST) Subject: 2 postdocs, comp. neuroscience and robotics Message-ID: Centre for Computational Neuroscience and Robotics University of Sussex TWO RESEARCH FELLOWSHIPS Neuroscience and AI are areas in which the University of Sussex is exceptionally strong. A new interdisciplinary Centre has recently been established to promote synergy between studies of artificial and biological nervous systems. Applications are invited for two post-doctoral fellowships to work in an exciting, interactive and unusual environment in this interdisciplinary Centre. POST 1 Computational Modelling of Biological Sensory Processing The aim is to understand the neural mechanisms underlying motion detection. The project will involve tight coupling of computer modelling and electrophysiological data. It will feed directly into our work on robot nervous systems. POST 2 Artificial Evolution of Nervous Systems for Robots The aim is to expand the evolutionary robotics field developed at Sussex by evolving robots with more complex behavioural capabilities, in particular to evolve minimally cognitive robots. Salary will be on the Research and Analogous Faculty Grade 1A scale 14,317 to 21,519 per annum. The posts are available immediately and will be for two years in the first instance with the possibility of renewal. Informal enquiries are encouraged. Enquiries about Post 1 should be made to Professor Michael O'Shea (Tel +44 (0)1273 678055; Fax +44 (0)1273 678535; email M.O-Shea at sussex.ac.uk) and for Post 2 to Dr Phil Husbands (Tel +44 (0)1273 678556; Fax +44 (0)1273 671320; email philh at cogs.susx.ac.uk). http://www.biols.sussex.ac.uk/Biols/IRC/ccnr.html Applications, (curriculum vitae, one or two sample publications, and the names and addresses of at least two referees) should be sent to the Administrator, Centre for Computational Neuroscience and Robotics, School of Biological Sciences, University of Sussex, Falmer, Brighton, BN1 9QG, UK. Closing date: 1st October 1996. From jagota at ICSI.Berkeley.EDU Fri Jul 19 22:53:13 1996 From: jagota at ICSI.Berkeley.EDU (Arun Jagota) Date: Fri, 19 Jul 1996 19:53:13 -0700 Subject: Electronic Journal Announcement Message-ID: <199607200253.TAA08445@flapjack.ICSI.Berkeley.EDU> Dear Connectionists: This is a first announcement and CALL FOR SUBMISSIONS. Manuscripts are invited, beginning now. Arun Jagota ----------- ________________________ NEURAL COMPUTING SURVEYS ------------------------ A refereed electronic journal published on the World Wide Web http://www.icsi.berkeley.edu/~jagota/NCS MISSION One way to cope with the exponential increase in the number of articles published in recent years is to ignore most of them. A second, perhaps more satisfying, approach is to provide a forum that encourages the regular production -- and perusal -- of high-quality survey articles. This is especially useful in an inter-disciplinary, evolving field such as Neural Computing. 
This journal aims to bring the second approach to bear. It is intended to + encourage researchers to write good survey papers. + provides an efficiently searchable repository -- perhaps the first that comes to mind -- for researchers to look at in times of need. TOPICS COVERED All aspects of Neural Computing. See the web site for details. BOARD OF ADVISORS (to expand) Yaser Abu-Mostafa Caltech Michael Arbib USC Eric Baum NEC Research Institute Jeff Elman UCSD Scott Fahlman CMU Lee C. Giles NEC Research Institute Michael Jordan MIT Wolfgang Maass TU-Graz, Austria Eric Mjolsness UCSD Michael Mozer Colorado U. Eduardo Sontag Rutgers U. Lei Xu CUHK MANAGING EDITOR Arun Jagota UCSC & ICSI-Berkeley (vis) jagota at icsi.berkeley.edu EDITORIAL BOARD (to expand somewhat) Suzanna Becker McMaster U. Yoshua Bengio bengioy at iro.umontreal.ca Wray Buntine wray at buntine.brainstorm.net Joydeep Ghosh ghosh at ece.utexas.edu Zoubin Ghahramani zoubin at cs.toronto.edu Arun Jagota jagota at icsi.berkeley.edu Pascal Koiran koiran at lip.ens-lyon.fr Barak Pearlmutter barak.pearlmutter at alumni.cs.cmu.edu Michael Perrone mpp at watson.ibm.com Anand Rangarajan rangarajan-anand at cs.yale.edu Hava Siegelmann iehava at ie.technion.ac.il Yoram Singer singer at research.att.com Ron Sun rsun at cs.ua.edu John-Shawe Taylor john at dcs.rhbnc.ac.uk Sebastian Thrun thrun+ at cs.cmu.edu Xin Wang xwang at cs.ucla.edu Lei Xu Chinese U. of Hong Kong FEATURES AT A GLANCE Survey-only Postscript & Hypertext Attachments allowed Peer-reviewed No length restrictions No delays upon acceptance Free FORMS & ATTACHMENTS Survey papers may take the form of an ESSAY or a COMPENDIUM. ATTACHMENTS to an accepted paper -- optional to the authors -- allow use of the capabilities of the web to connect to several kinds of material closely related to the original paper. A clear separation is made between attachments (which are unrefereed and changable) and the main paper (which is refereed and not changable). See the web site for more details. HOW TO SUBMIT Electronic submissions (in postscript format) are strongly encouraged, to any one of the editors based on matching area (see web site for listing of areas of editors). A Latex style file will be available for the NCS format. See the web site for details. PRINT VERSION A print version, published by some well-known publisher as a "backup" to the electronic version, is being strongly considered. 
FTP ACCESS Those without convenient access to the World Wide Web might use anonymous ftp as follows (see the README file there): ftp://ftp.icsi.berkeley.edu/pub/ai/jagota ______________________________________________________________________________ From hu at engr.wisc.edu Sun Jul 21 06:47:34 1996 From: hu at engr.wisc.edu (Hu, Yu Hen) Date: Sun, 21 Jul 96 10:47:34 0000 Subject: CFP: ISMIP'96 (New Submission Deadline) Message-ID: <199607211547.AA24785@eceserv0.ece.wisc.edu> ------------------------------------------------ CALL FOR PAPERS (Extension of Submission Deadline) ------------------------------------------------ 1996 International Symposium on Multi-Technology Information Processing A Joint Symposium of Artificial Neural Networks, Circuits and Systems, and Signal Processing December 16-18, 1996 Hsin-Chu, Taiwan, Republic of China ************************************************* NEW PAPER SUBMISSION DEADLINE: AUGUST 5, 1996 ************************************************* Call For Papers --------------- The International Symposium on Multi-Technology Information Processing (ISMIP'96), a joint symposium of artificial neural networks, circuits and systems, and signal processing, will be held in National Tsing Hua University, Hsin Chu, Taiwan, Republic of China. This conference is an expansion of the previous series of International Symposium of Artificial Neural Networks (ISANN). The main purpose of this conference is to offer a forum showcasing the latest advancement of modern information processing technologies. It will include recent innovative research results of theories, algorithms, architectures, systems, hardware implementations that lead to intelligent information processing. The technical program will feature opening keynote addresses, invited plenary talks, technical presentations of refereed papers. The official language is English. Papers are solicited for, but not limited to, the following topics: 1. Associative Memory 2. Digital and Analog Neurocomputers 3. Fuzzy Neural Systems 4. Supervised/Un-supervised Learning 5. Robotics 6. Sensory/Motor Control 7. Image Processing 8. Pattern Recognition 9. Language/ Speech Processing 10. Digital Signal Processing 11. VLSI Architectures 12. Non-linear Circuits 13. Multi-media information processing 14. Optimization 15. Mathematical Methods 16. Visual signal processing 17. Content based signal processing 18. Applications Prospective authors are invited to submit 4 copies of extended summaries of no more than 4 pages. All the manuscripts must be written in English in single-spaced, single column, on 8.5" by 11" white paper. The top of the first page of the paper should include a title, authors' names, affiliations, address, telephone/fax numbers, and email address if applicable. The indicated corresponding author will receive an acknowledgement of his/her submission. Camera-ready full papers of accepted manuscripts will be published in a hard-bound proceedings and distributed in the symposium. For more information, please consult the URL site http://pierce.ee.washington.edu/~nnsp/ismip96.html Authors are invited to send submissions to one of the program co-chairs: ------------------- For submissions from USA and Europe Dr. C.-H. Lee Multimedia Communications Research Lab Bell Laboratories, Lucent Technologies 600 Mountain Ave. 2D-425 Murray Hill, NJ 07974-0636 USA Phone: 908-582-5226 fax: 908-582-7308 chl at research.bell-labs.com --------------------- For submissions from Asia and the rest of the world Prof. V. W. Soo Dept.
of Computer Science National Tsing Hua University Hsin Chu, Taiwan 30043, ROC Phone: 886-35-731068 FAX: 886-35-723694 soo at cs.nthu.edu.tw Schedule -------- Submission of full paper: August 5, 1996. Notification of acceptance: September 30, 1996. Submission of camera-ready paper: October 31, 1996. Advanced registration, before: November 15, 1996. Sponsored by National Tsing Hua University (NTHU), Ministry of Education, Taiwan R.O.C. National Science Council, Taiwan R.O.C. in Cooperation with IEEE Signal Processing Society, IEEE Circuits and Systems Society IEEE Neural Networks Council, IEEE Taiwan Section Taiwanese Association for Artificial Intelligence ORGANIZATION ------------ General Co-chairs: H. C. Wang, NTHU Y. H. Hu, U. of Wisconsin Advisory board Co-chairs: W. T. Chen, NTHU S. Y. Kung, Princeton U. Vice Co-chairs: H. C. Hu, NCTU J.N. Hwang, U. of Washington Program Co-chairs: V. W. Soo, NTHU Chin-Hui Lee AT&T From robert at physik.uni-wuerzburg.de Wed Jul 24 07:54:58 1996 From: robert at physik.uni-wuerzburg.de (Robert Urbanczik) Date: Wed, 24 Jul 1996 13:54:58 +0200 (MESZ) Subject: paper available: learning in a committee machine Message-ID: FTP-host: ftp.physik.uni-wuerzburg.de FTP-filename: /pub/preprint/WUE-ITP-96-013.ps.gz **DO NOT FORWARD TO OTHER GROUPS** The following paper (9 pages, to appear in Europhys.Letts.) is now available via anonymous ftp: (See below for the retrieval procedure) --------------------------------------------------------------------- Learning in a large committee machine: Worst case and average case by R. Urbanczik Abstract: Learning of realizable rules is studied for tree committee machines with continuous weights. No nontrivial upper bound exists for the generalization error of consistent students as the number of hidden units $K$ increases. However, numerical considerations show that consistent students with a value of the generalization error significantly higher than predicted by the average case analysis are extremely hard to find. An on-line learning algorithm is presented, for which the generalization error scales with the training set size as in the average case theory in the limit of large $K$. --------------------------------------------------------------------- Retrieval procedure: unix> ftp ftp.physik.uni-wuerzburg.de Name: anonymous Password: {your e-mail address} ftp> cd pub/preprint/1996 ftp> get WUE-ITP-96-013.ps.gz ftp> quit unix> gunzip WUE-ITP-96-013.ps.gz e.g. unix> lp WUE-ITP-96-013.ps _____________________________________________________________________ From gluck at pavlov.rutgers.edu Wed Jul 24 08:49:22 1996 From: gluck at pavlov.rutgers.edu (Mark Gluck) Date: Wed, 24 Jul 1996 08:49:22 -0400 Subject: Programmer/R.A. Position at Rutgers Univ, NJ, in Computational Neuro. Message-ID: <199607241249.IAA01635@james.rutgers.edu> SEEKING A PROGRAMMER/RESEARCH ASSISTANT TO WORK ON NEURAL-NETWORK BRAIN MODELS AT RUTGERS-NEWARK NEUROSCIENCE CENTER (GLUCK LAB). We are looking for a programmer/research assistant to work with us on testing computational models of cortico-hippocampal function in animal and human learning. The applicant must be able to work independently -- given a set of specifications, he/she should be able to optimize program performance to generate results, and also analyze system behavior. The ideal applicant would be someone recently out of college, who would like some research experience prior to future graduate work in psychology, neuroscience, cognitive science, or computer science. 
Required Skills: Strong C (or C++) programming Knowledge of Unix Commitment to at least 15 hours/week, for at least one year. Could also be a full time position. Preferred But Not Required Skills: Knowledge of Sun workstations Background in neural networks Background in premed, biology, or psychology. Salary: Commensurate with skill level and experience. For more information on our research, see our lab WWW page noted below. Contact Mark Gluck below with a cover letter and resume (preferably sent by email) to apply. _______________________________________________________________________________ Dr. Mark A. Gluck Center for Molecular & Behavioral Neuroscience Rutgers University 197 University Ave. Newark, New Jersey 07102 Phone: (201) 648-1080 (Ext. 3221) Fax: (201) 648-1272 Email: gluck at pavlov.rutgers.edu WWW Homepage: http://www.cmbn.rutgers.edu/cmbn/faculty/gluck.html _______________________________________________________________________________ From listerrj at helios.aston.ac.uk Wed Jul 24 09:23:07 1996 From: listerrj at helios.aston.ac.uk (Richard Lister) Date: Wed, 24 Jul 1996 14:23:07 +0100 Subject: Two Postdoctoral Research Fellowships Message-ID: <13421.199607241323@sun.aston.ac.uk> ---------------------------------------------------------------------- Neural Computing Research Group ------------------------------- Dept of Computer Science and Applied Mathematics Aston University, Birmingham, UK TWO POSTDOCTORAL RESEARCH FELLOWSHIPS ------------------------------------- *** Full details at http://www.ncrg.aston.ac.uk/ *** ---------------------------------------------------------------------- Analysis of On-Line Learning in Neural Networks ----------------------------------------------- The Neural Computing Research Group at Aston is looking for a highly motivated individual for a 2 year postdoctoral research position in the area of `Analysis of On-Line Learning in Neural Networks'. The emphasis of the research will be on applying a theoretically well-founded approach based on methods adopted from statistical mechanics to analyse learning in multilayer perceptrons in various learning scenarios. Potential candidates should have strong mathematical and computational skills, with a background in statistical mechanics and neural networks. Conditions of Service --------------------- Salaries will be up to point 6 on the RA 1A scale, currently 15,986 UK pounds. The salary scale is subject to annual increments. How to Apply ------------ If you wish to be considered for this Fellowship, please send a full CV and publications list, including full details and grades of academic qualifications, together with the names of 3 referees, to: Dr. David Saad Neural Computing Research Group Dept. of Computer Science and Applied Mathematics Aston University Birmingham B4 7ET, U.K. Tel: 0121 333 4631 Fax: 0121 333 6215 e-mail: D.Saad at aston.ac.uk e-mail submission of postscript files is welcome. Candidates that applied for the position `On-line Learning in Radial Basis Function Networks' will be automatically considered for this position as well. Closing date: 12 August, 1996.
---------------------------------------------------------------------- NEUROSAT: Processing of environment observing satellite data ------------------------------------------------------------ with Neural Networks -------------------- The Neural Computing Research Group at Aston is looking for a highly motivated individual for a 3 year postdoctoral research position in the area of processing environmental data from satellites with neural networks, working with Dr. Ian Nabney. The post is funded by a grant from the European Commission Directorate General XII in Environment and Climate. Candidates should have strong mathematical and computational skills, with a background in one or more of neural networks, Bayesian inference, or satellite data analysis. NEUROSAT is a three-year collaborative project whose objective is to contribute to an enhanced analysis of the real Earth climate driving forces. The consortium is led by Michel Crepon at the Institut Pierre-Simon-Laplace (IPSL) in Paris, and includes partners from the UK, France, Germany and Italy. Aston will be involved in two work packages: Assessment of Generic Techniques and Inferring Sea Surface Wind from Scatterometric Measurements. In the first of these, we are responsible for contributions to survey papers, for technology transfer to other partners, and for some feasibility studies in the use of neural networks in climatological problems. The second work package, which will be the main activity, involves developing some existing research on scatterometric data analysis to a state where it can be compared with the current operational system (AEOLUS) used at the Meteorological Office. The data that will be used comes from the ERS1 satellite. A two stage approach will be applied. The first stage is to improve the local modelling (i.e. the wind vector in a single cell), and the second stage is to improve the global modelling (i.e. the overall wind field). To improve the local modelling, the influence of sea state on the scatterometer signals will be studied. This will be done by using the ERS1 signal collocated with wind vector and sea state obtained from analysed fields of meteorological models and fused with in situ buoy observations. The purpose of this work is to understand the factors involved in the GMF so as to improve the inverse function modelling. Earlier work at Aston has used mixture density networks to model the conditional density of the inverse function (since this is typically multi-valued for wind direction), and this will make it easier to incorporate probabilistic information into the global model. Such information includes priors on model parameters, priors on data coming from weather stations and climatological information (e.g. long term weather trends). This will also allow the wind-field to be `seeded' with known values at specific locations. Techniques from optical flow (for example, div-curl splines) and Bayesian models will be investigated for their application to modelling the global wind field. This approach should lead to a self consistent, accurate and fast neural network procedure to retrieve the entire wind field. This information will be useful to meteorological centres to be assimilated into their prediction models. Dr. David Offiler (of the UK Meteorological Office) will act as a consultant to the project, assisting in the development of prior models and the assessment of the prediction methods. If we can improve on existing techniques, then there is every prospect of replacing them in operational use.
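To make the mixture density network idea mentioned above concrete, the sketch below shows how a small network can output the parameters of a Gaussian mixture over a target variable, giving a full conditional density p(t|x) in which a multi-valued inverse (such as an ambiguous wind direction) appears as several modes. This is an illustrative sketch only; the layer sizes, variable names and NumPy implementation are assumptions and not the project's actual code.

    import numpy as np

    def mdn_forward(x, W1, b1, W2, b2, n_kernels):
        # Map an input vector x to the parameters of a Gaussian mixture over
        # a scalar target t, so the network represents p(t | x) rather than
        # a single point estimate.
        h = np.tanh(W1 @ x + b1)                    # hidden layer
        z = W2 @ h + b2                             # 3 * n_kernels raw outputs
        z_pi, mu, z_sigma = np.split(z, [n_kernels, 2 * n_kernels])
        pi = np.exp(z_pi) / np.sum(np.exp(z_pi))    # mixing coefficients (softmax)
        sigma = np.exp(z_sigma)                     # kernel widths, kept positive
        return pi, mu, sigma

    def mdn_density(t, pi, mu, sigma):
        # Conditional density p(t | x) as a sum of Gaussian kernels; an ambiguous
        # inverse problem shows up as two or more modes in this density.
        kernels = np.exp(-0.5 * ((t - mu) / sigma) ** 2) / (np.sqrt(2.0 * np.pi) * sigma)
        return float(np.sum(pi * kernels))

Training such a network would typically minimise the negative log of this density over a training set; in the scatterometer setting one would take x to be the backscatter measurements for a cell and t the local wind vector components, with the global wind-field model then used to choose between modes.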
Informal enquiries can be made to Ian Nabney (I.T.Nabney at aston.ac.uk). The target start date is September 1996, although this may be somewhat flexible. Conditions of Service --------------------- Salaries will be up to point 6 on the RA 1A scale, currently 15,986 UK pounds. These salary scales are subject to annual increments. How to Apply ------------ If you wish to be considered for this position, please send a full CV and publications list, together with the names of 3 referees, to: Dr. Ian Nabney Neural Computing Research Group Department of Computer Science and Applied Mathematics Aston University Birmingham B4 7ET, U.K. Tel: +44 121 333 4631 Fax: +44 121 333 4586 e-mail: I.T.Nabney at aston.ac.uk (email submission of postscript files is welcome) Closing date: 12 August, 1996. ---------------------------------------------------------------------- From mel at quake.usc.edu Wed Jul 24 03:15:23 1996 From: mel at quake.usc.edu (Bartlett Mel) Date: Wed, 24 Jul 1996 15:15:23 +0800 Subject: Workshop Announcement Message-ID: <9607242215.AA21433@quake.usc.edu> ************* CALL FOR WORKSHOP PARTICIPATION ************* Advanced Workshop on BIOLOGICAL AND ARTIFICIAL NEURAL NETWORKS: A SEARCH FOR SYNERGY sponsored by The USC Biomedical Simulations Resource and The National Center for Research Resources of NIH September 20-21, 1996 Summer House Inn, La Jolla, California This workshop will bring together investigators who share an interest in methodological issues relating to the combined study of biological and artificial neural networks, including innovative uses for artificial neural network techniques in the study of biological neural systems, and novel applications of neurobiologically inspired artificial neural network architectures. Workshop topics will emphasize: * methods for quantitative study of nonstationarities in neural systems * methods for analyzing high-dimensional spatio-temporal neural dynamics * methods for study of dynamic nonlinearities in neural systems Presentations will emphasize practical methodologies. The format will consists of brief invited and contributed presentations (15-20 minutes), followed by extended question/discussion periods emphasizing audience participation. The Workshop has been scheduled in space-time proximity to the World Congress on Neural Networks (San Diego, Sept. 15-19). Registration is free, but advance registration is required as space is limited. Limited funds are available for speakers upon request. To suggest yourself as a contributor, or for additional information, please contact Ms. Stephanie Braun at braun at bmsrs.usc.edu or (213)740-0342. ORGANIZERS: V.Z. Marmarelis, T.W. Berger From marney at ai.mit.edu Wed Jul 24 19:41:44 1996 From: marney at ai.mit.edu (Marney Smyth) Date: Wed, 24 Jul 1996 19:41:44 -0400 (EDT) Subject: Intensive Tutorial: Learning Methods for Prediction, Classification Message-ID: <9607242341.AA00934@motor-cortex.ai.mit.edu> ************************************************************** *** *** *** Learning Methods for Prediction, Classification, *** *** Novelty Detection and Time Series Analysis *** *** *** *** Cambridge, MA, September 20-21, 1996 *** *** Los Angeles, CA, December 14-15, 1996 *** *** *** *** Geoffrey Hinton, University of Toronto *** *** Michael Jordan, Massachusetts Inst. of Tech. 
*** *** *** ************************************************************** A two-day intensive Tutorial on Advanced Learning Methods will be held on September 20 and 21, 1996, at the Royal Sonesta Hotel, Cambridge, MA, and on December 14 and 15, 1996, at Lowe's Hotel, Santa Monica, CA. Space is available for up to 50 participants for each course. The course will provide an in-depth discussion of the large collection of new tools that have become available in recent years for developing autonomous learning systems and for aiding in the analysis of complex multivariate data. These tools include neural networks, hidden Markov models, belief networks, decision trees, memory-based methods, as well as increasingly sophisticated combinations of these architectures. Applications include prediction, classification, fault detection, time series analysis, diagnosis, optimization, system identification and control, exploratory data analysis and many other problems in statistics, machine learning and data mining. The course will be devoted equally to the conceptual foundations of recent developments in machine learning and to the deployment of these tools in applied settings. Case studies will be described to show how learning systems can be developed in real-world settings. Architectures and algorithms will be presented in some detail, but with a minimum of mathematical formalism and with a focus on intuitive understanding. Emphasis will be placed on using machine methods as tools that can be combined to solve the problem at hand. WHO SHOULD ATTEND THIS COURSE? The course is intended for engineers, data analysts, scientists, managers and others who would like to understand the basic principles underlying learning systems. The focus will be on neural network models and related graphical models such as mixture models, hidden Markov models, Kalman filters and belief networks. No previous exposure to machine learning algorithms is necessary although a degree in engineering or science (or equivalent experience) is desirable. Those attending can expect to gain an understanding of the current state-of-the-art in machine learning and be in a position to make informed decisions about whether this technology is relevant to specific problems in their area of interest. 
COURSE OUTLINE Overview of learning systems; LMS, perceptrons and support vectors; generalized linear models; multilayer networks; recurrent networks; weight decay, regularization and committees; optimization methods; active learning; applications to prediction, classification and control Graphical models: Markov random fields and Bayesian belief networks; junction trees and probabilistic message passing; calculating most probable configurations; Boltzmann machines; influence diagrams; structure learning algorithms; applications to diagnosis, density estimation, novelty detection and sensitivity analysis Clustering; mixture models; mixtures of experts models; the EM algorithm; decision trees; hidden Markov models; variations on hidden Markov models; applications to prediction, classification and time series modeling Subspace methods; mixtures of principal component modules; factor analysis and its relation to PCA; Kalman filtering; switching mixtures of Kalman filters; tree-structured Kalman filters; applications to novelty detection and system identification Approximate methods: sampling methods, variational methods; graphical models with sigmoid units and noisy-OR units; factorial HMMs; the Helmholtz machine; computationally efficient upper and lower bounds for graphical models REGISTRATION Standard Registration: $700 Student Registration: $400 Registration fee includes course materials, breakfast, coffee breaks, and lunch on Saturday. Those interested in participating should return the completed Registration Form and Fee as soon as possible, as the total number of places is limited by the size of the venue. ADDITIONAL INFORMATION A registration form is available from the course's WWW page at http://www.ai.mit.edu/projects/cbcl/web-pis/jordan/course/index.html Marney Smyth CBCL at MIT E25-201 45 Carleton Street Cambridge, MA 02142 USA Phone: 617 253-0547 Fax: 617 253-2964 E-mail: marney at ai.mit.edu From tvogl at wo.erim.org Fri Jul 12 17:03:07 1996 From: tvogl at wo.erim.org (Thomas P. Vogl) Date: Thu, 25 Jul 1996 09:03:07 +30000 Subject: Postdoc. Fellowship Available Message-ID: Postdoctoral Research Fellowship at George Mason University (GMU), Molecular Biosciences and Technology Institute (MBTI). Applications are invited for a postdoctoral Fellowship in the area of development of self-organizing pattern recognition algorithms based on biological information processing in visual and IT cortex. The aims of the project are (1) to develop neurobiologically plausible algorithms of visual pattern recognition which are computationally efficient and robust, and (2) to compare performance of the resulting algorithms with human performance in order to develop hypotheses about information processing in the brain. Evaluation of algorithms is performed using real world problems (e.g. face recognition and optical character recognition), and by comparison to human observer pattern recognition performance. We are seeking an individual with background in both neurobiology and computer science (good UNIX and C or C++ skills). Working knowledge of information theory or mathematical statistics is highly desirable but not required. The position is for one year, beginning October 1, 1996, with possible renewal for an additional three years. The initial stipend is $30,000/year plus fringe benefits. Our decade-old group currently consists of Drs. T.P. Vogl and K.T.
Blackwell, and two graduate students, all of whom are actively involved in ongoing collaboration among neuroscientists (electrophysiologists) at NINDS/NIH; members of the GMU faculty in MBTI, several Departments, and the Krasnow Institute at GMU; and the professional staff at the Environmental Research Institute of Michigan (ERIM), a not-for-profit R&D company associated with the University of Michigan. Research activities encompass computer modeling, particularly computational neurobiology at levels ranging from channel-level modeling of learning in single neurons to network-level models, and visual psychophysics.

To apply for this position, send your curriculum vitae and at least two letters of reference (in ASCII or MIME-attached PostScript formats only) before September 1, 1996, to Prof. Thomas P. Vogl, email: tvogl at gmu.edu.

snail-mail to:
ERIM
1101 Wilson Blvd. Ste 1100
Arlington, VA 22209

From john at dcs.rhbnc.ac.uk Thu Jul 25 11:17:05 1996
From: john at dcs.rhbnc.ac.uk (John Shawe-Taylor)
Date: Thu, 25 Jul 96 16:17:05 +0100
Subject: Technical Report Series in Neural and Computational Learning
Message-ID: <199607251517.QAA12702@platon.cs.rhbnc.ac.uk>

The European Community ESPRIT Working Group in Neural and Computational Learning Theory (NeuroCOLT) has produced a set of new Technical Reports available from the remote ftp site described below. They cover topics in real-valued complexity theory, computational learning theory, and analysis of the computational power of continuous neural networks. Abstracts are included for the titles.

*** Please note that the location of the files was changed at the beginning of the year, so that any copies you have of the previous instructions should be discarded. The new location and instructions are given at the end of the list.

----------------------------------------
NeuroCOLT Technical Report NC-TR-96-047:
----------------------------------------
A Graph-theoretic Generalization of the Sauer-Shelah Lemma
by Nicol\`o Cesa-Bianchi, University of Milan, Italy
   David Haussler, University of California, Santa Cruz, USA

Abstract: We show a natural graph-theoretic generalization of the Sauer-Shelah lemma. This result is applied to bound the $\ell_{\infty}$ and $L_1$ packing numbers of classes of functions whose range is an arbitrary, totally bounded metric space.

----------------------------------------
NeuroCOLT Technical Report NC-TR-96-048:
----------------------------------------
A Comparison between Cellular Encoding and Direct Encoding for Genetic Neural Networks
by Fr\'ed\'eric Gruau, CWI, the Netherlands
   Darrell Whitley, Colorado State University, USA

Abstract: This paper compares the efficiency of two encoding schemes for Artificial Neural Networks optimized by evolutionary algorithms. Direct Encoding encodes the weights for an a priori fixed neural network architecture. Cellular Encoding encodes both the weights and the architecture of the neural network. In previous studies, Direct Encoding and Cellular Encoding have been used to create neural networks for balancing 1 and 2 poles attached to a cart on a fixed track. The poles are balanced by a controller that pushes the cart to the left or the right. In some cases velocity information about the pole and cart is provided as an input; in other cases the network must learn to balance a single pole without velocity information.
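To make the distinction concrete, the short Python/NumPy sketch below shows roughly what Direct Encoding amounts to for this kind of task: the genome is nothing more than the flat weight vector of a controller whose architecture is fixed in advance, and fitness is the number of time steps the decoded controller keeps the pole balanced. The network sizes, the variable names and the textbook single-pole dynamics are assumptions made here for illustration only; they are not taken from the report.

import numpy as np

# Direct Encoding: the genome is the flat weight vector of a network whose
# architecture is fixed in advance. Sizes below are illustrative assumptions.
N_IN, N_HID = 4, 5                    # inputs: x, x_dot, theta, theta_dot
GENOME_LENGTH = N_IN * N_HID + N_HID  # input->hidden plus hidden->output weights

def decode(genome):
    """Map a flat genome onto the weights of the fixed architecture."""
    w1 = genome[:N_IN * N_HID].reshape(N_IN, N_HID)
    w2 = genome[N_IN * N_HID:]
    return w1, w2

def push(state, genome):
    """Decoded controller: force applied to the cart (+10 N or -10 N)."""
    w1, w2 = decode(genome)
    return 10.0 if float(np.tanh(state @ w1) @ w2) > 0.0 else -10.0

def step(state, force, tau=0.02):
    """One Euler step of the standard single-pole cart-pole dynamics
    (textbook equations, not the simulator used in the report)."""
    x, x_dot, theta, theta_dot = state
    g, m_cart, m_pole, length = 9.8, 1.0, 0.1, 0.5
    total = m_cart + m_pole
    temp = (force + m_pole * length * theta_dot ** 2 * np.sin(theta)) / total
    theta_acc = (g * np.sin(theta) - np.cos(theta) * temp) / (
        length * (4.0 / 3.0 - m_pole * np.cos(theta) ** 2 / total))
    x_acc = temp - m_pole * length * theta_acc * np.cos(theta) / total
    new = np.array([x + tau * x_dot, x_dot + tau * x_acc,
                    theta + tau * theta_dot, theta_dot + tau * theta_acc])
    failed = abs(new[0]) > 2.4 or abs(new[2]) > 0.21  # cart off track or pole down
    return new, failed

def fitness(genome, max_steps=1000):
    """Fitness = number of time steps the pole stays balanced."""
    state = np.array([0.0, 0.0, 0.05, 0.0])           # small initial pole angle
    for t in range(max_steps):
        state, failed = step(state, push(state, genome))
        if failed:
            return t
    return max_steps

# An evolutionary algorithm then searches over such genomes, e.g.
# population = [np.random.randn(GENOME_LENGTH) for _ in range(50)]
# scores = [fitness(genome) for genome in population]

Cellular Encoding, by contrast, evolves a grammar-like developmental program that grows both the topology and the weights, so the genome is not tied to a fixed number of connections.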
A careful study of the behavior of these systems suggests that it is possible to balance a single pole with velocity information as an input and without learning to compute the velocity. A new fitness function is introduced that forces the ANN to compute the velocity. By using this new fitness function and tuning the syntactic constraints used with cellular encoding, we achieve a tenfold speedup over our previous study and solve a more difficult problem: balancing two poles when no information about the velocity is provided as input.

--------------------------------------------------------------------
***************** ACCESS INSTRUCTIONS ******************

The Report NC-TR-96-001 can be accessed and printed as follows:

% ftp ftp.dcs.rhbnc.ac.uk  (134.219.96.1)
Name: anonymous
password: your full email address
ftp> cd pub/neurocolt/tech_reports
ftp> binary
ftp> get nc-tr-96-001.ps.Z
ftp> bye
% zcat nc-tr-96-001.ps.Z | lpr -l

Similarly for the other technical reports. Uncompressed versions of the PostScript files have also been left for anyone not having an uncompress facility. In some cases there are two files available, for example,

nc-tr-96-002-title.ps.Z
nc-tr-96-002-body.ps.Z

The first contains the title page while the second contains the body of the report. The single command

ftp> mget nc-tr-96-002*

will prompt you for the files you require.

A full list of the currently available Technical Reports in the Series is held in a file `abstracts' in the same directory. The files may also be accessed via WWW, starting from the NeuroCOLT homepage:

http://www.dcs.rhbnc.ac.uk/neural/neurocolt.html

or directly from the archive:

ftp://ftp.dcs.rhbnc.ac.uk/pub/neurocolt/tech_reports

Best wishes
John Shawe-Taylor

From listerrj at helios.aston.ac.uk Thu Jul 25 12:18:37 1996
From: listerrj at helios.aston.ac.uk (Richard Lister)
Date: Thu, 25 Jul 1996 17:18:37 +0100
Subject: Postdoctoral Research Fellowship
Message-ID: <15572.199607251618@sun.aston.ac.uk>

----------------------------------------------------------------------------
Neural Computing Research Group
-------------------------------
Dept of Computer Science and Applied Mathematics
Aston University, Birmingham, UK

POSTDOCTORAL RESEARCH FELLOWSHIP
--------------------------------

*** Full details at http://www.ncrg.aston.ac.uk/ ***

"Dynamical Systems and Information Geometric Approaches to Generalization"
--------------------------------------------------------------------------

A mathematically-oriented researcher is required to work on a project examining a geometric view of generalization in neural networks. Geometry and topology occur at two levels in network models. A dynamical systems perspective is required to examine the behaviour of an individual neural network structure, and an information geometric perspective is required to examine the space of neural network models and perform inference. In both cases the role and geometrization of prior knowledge guide generalization. The aim of this project is to investigate the issues of generalization in neural networks based on the geometric properties of dynamical systems and information geometry spaces. This project forms part of a larger research activity on Validation and Verification of Neural Networks.

Conditions of Service
---------------------

Salaries will be up to point 6 on the RA 1A scale, currently 15,986 UK pounds. The salary scale is subject to annual increments.
How to Apply
------------

If you wish to be considered for this Fellowship, please send a full CV and publications list, including full details and grades of academic qualifications, together with the names of three referees, to:

Professor David Lowe
Neural Computing Research Group
Dept. of Computer Science and Applied Mathematics
Aston University
Birmingham B4 7ET, U.K.
Tel: 0121 333 4631
Fax: 0121 333 6215
e-mail: D.Lowe at aston.ac.uk

E-mail submission of PostScript files is welcome.

Candidates who apply for this Fellowship will also automatically be considered for the four other postdoctoral Fellowships currently offered by the Neural Computing Research Group.

Closing date: 19 August, 1996.

----------------------------------------------------------------------------

From radford at cs.toronto.edu Thu Jul 25 17:03:11 1996
From: radford at cs.toronto.edu (Radford Neal)
Date: Thu, 25 Jul 1996 17:03:11 -0400
Subject: TR on Factor Analysis Using Delta-Rule Wake-Sleep Learning
Message-ID: <96Jul25.170319edt.1282@neuron.ai.toronto.edu>

Technical Report Available

FACTOR ANALYSIS USING DELTA-RULE WAKE-SLEEP LEARNING

Radford M. Neal
Dept. of Statistics and Dept. of Computer Science
University of Toronto

Peter Dayan
Department of Brain and Cognitive Sciences
Massachusetts Institute of Technology

24 July 1996

We describe a linear network that models correlations between real-valued visible variables using one or more real-valued hidden variables - a *factor analysis* model. This model can be seen as a linear version of the "Helmholtz machine", and its parameters can be learned using the "wake-sleep" method, in which learning of the primary "generative" model is assisted by a "recognition" model, whose role is to fill in the values of hidden variables based on the values of visible variables. The generative and recognition models are jointly learned in "wake" and "sleep" phases, using just the delta rule. This learning procedure is comparable in simplicity to Oja's version of Hebbian learning, which produces a somewhat different representation of correlations in terms of principal components. We argue that the simplicity of wake-sleep learning makes factor analysis a plausible alternative to Hebbian learning as a model of activity-dependent cortical plasticity.

This technical report is available in compressed PostScript by ftp at the following URL:

ftp://ftp.cs.toronto.edu/pub/radford/ws-fa.ps.Z

----------------------------------------------------------------------------
Radford M. Neal   radford at cs.utoronto.ca
Dept. of Statistics and Dept. of Computer Science   radford at utstat.utoronto.ca
University of Toronto   http://www.cs.utoronto.ca/~radford
----------------------------------------------------------------------------

From tanig at burton.zfe.siemens.de Fri Jul 26 08:26:20 1996
From: tanig at burton.zfe.siemens.de (Michiaki Taniguchi)
Date: Fri, 26 Jul 1996 14:26:20 +0200
Subject: NEuroNet Industrial Studentship
Message-ID: <199607261226.OAA29944@burton.zfe.siemens.de>

************************************************************************
NEuroNet Industrial Studentship at the Neural Network Group,
Siemens Corporate Research
************************************************************************

Two studentships in the amount of 800 ECU/month for six months are available from the NEuroNet program for EU students. A major objective of the NEuroNet Industrial Studentship is to provide an opportunity for students to gain experience in the industrial environment.
Siemens will provide supervision of two students for SIX months in its laboratory in Munich, Germany. The students will be assigned to project-relevant tasks in the area of telecommunications. During this time the students will have an opportunity to become acquainted with real-world applications of Neural Networks in the industrial environment.

Siemens is one of the largest companies in the electrical and electronics industry world-wide, with at present about 2000 staff members in its Corporate Research and Development Division. In the Neural Network Project, which was started in 1988, more than 20 scientists and about an equal number of graduate and Ph.D. students are working on the theory and on applications of Neural Networks, among others, in:

- time-series forecasting
- non-linear statistics and dynamics
- non-linear modeling and control
- development of simulation environments for Neural Networks
- telecommunications

One of the main areas of activity of Siemens is telecommunication - an area where we see promising new applications of Neural Networks. By now, Neural Networks are established as a new technology for modeling and control of complex technical systems. We see a large potential to improve the current technology of telecommunication by neural-network-based approaches.

Requirements:
- programming experience (C, C++, Matlab, ...)
- basic knowledge of Neural Networks
- capability for independent work
- background in telecommunication would be desirable

Applicants should send, as soon as possible, a CV (curriculum vitae) which states nationality, place of study, their research interests and, if available, a list of publications. They must be registered for a university degree (undergraduate or graduate students). During the six months, the financial support by NEuroNet will be 800 ECU per month.

Time-scale: The studentship should start autumn/winter this year and will last for six months.

Applications should be sent as soon as possible to:

Michiaki Taniguchi
Address: ZFE T SN 4, Siemens AG
Otto-Hahn-Ring 6
D-81739 München, Germany
Phone: +49/89/636-49506
Fax: +49/89/636-49767
e-mail: Michiaki.Taniguchi at zfe.siemens.de

------------------------------------------------------------------------------
Michiaki Taniguchi, ZFE T SN 4, Siemens AG
Otto-Hahn-Ring 6, 81730 Muenchen, Germany
Phone: +49/89/636-49506   Fax: +49/89/636-49767
e-mail: Michiaki.Taniguchi at zfe.siemens.de
http://www.siemens.de/zfe_nn/homepage.html

From N.Sharkey at dcs.shef.ac.uk Fri Jul 26 14:01:40 1996
From: N.Sharkey at dcs.shef.ac.uk (Noel Sharkey)
Date: Fri, 26 Jul 96 14:01:40 BST
Subject: ROBOT LEARNING: the new wave
Message-ID: <9607261301.AA19506@dcs.shef.ac.uk>

see FAQs below:

****** ROBOT LEARNING: THE NEW WAVE ******

Special Issue of Robotics and Autonomous Systems

SPECIAL EDITOR
Noel Sharkey (Sheffield)

SPECIAL EDITORIAL BOARD
Michael Arbib (USC)
Ronald Arkin (GIT)
George Bekey (USC)
Randall Beer (Case Western)
Bartlett Mel (USC)
Maja Mataric (Brandeis)
Carme Torras (Spain)
Lina Massone (Northwestern)
Lisa Meeden (Swarthmore)
+ large international REVIEW PANEL (see web page)

A number of people have been writing to me with questions regarding the special issue. I thought that it would be better to forward the answers to everyone.

FAQs

Q. Where can I get information about the issue and instructions to authors?

A. The full call and a link to the instructions can be found at
www.dcs.shef.ac.uk/research/groups/nn/RASspecial.html
Q. I have nearly finished my paper but need an extension of the deadline. Is that possible?

A. Since I shall be on vacation for two weeks, authors can have an extension of two weeks (15th August).

Q. The call for papers stressed a bias in favour of papers reporting implementations on real robots. We have a paper that is based on a very relevant simulation (or a review); is that acceptable?

A. Our main objective is to publish high-quality papers that reflect the state of the art in robot learning. The bias (given papers of equal quality) WILL be for real implementations. However, we realise that there is other important research that bears directly on robot learning problems. We will accept submission of any such relevant papers.

noel

From daphne at braindev.uoregon.edu Mon Jul 29 16:25:18 1996
From: daphne at braindev.uoregon.edu (Daphne Bavelier)
Date: Mon, 29 Jul 1996 13:25:18 -0700
Subject: Postdoctoral Position in Cognitive Neuroscience
Message-ID: <199607292025.NAA09568@braindev.uoregon.edu>

Postdoctoral Position in Cognitive Neuroscience
Institute of Computational and Cognitive Sciences
Georgetown University, Washington DC

A postdoctoral/research fellow position is available immediately in the brain and vision lab at the Institute of Computational and Cognitive Sciences at Georgetown University. Research focuses on the neural basis of visual cognition, combining behavioral and imaging techniques (fMRI, ERPs). Particular research areas within the lab include the mechanisms of visual attention and scene/object perception in normal adults, as well as the study of plastic changes in the visual system after altered experience (either due to learning or to altered sensory experience, as in congenitally deaf individuals).

Applicants should have a strong background and education in cognitive neuroscience. Special consideration will be given to candidates with prior experience in computational neuroscience and/or familiarity with imaging techniques. The appointment is due to start on August 1st, 1996, or as soon as possible thereafter.

The institute offers a variety of laboratories in the field of cognitive neuroscience, using investigation techniques such as single-cell and optical recordings in behaving monkeys and bats, a 7T fMRI for animal studies and a 1.5T fMRI for human subjects.

Applicants should send a CV, a summary of relevant research experience, and the names and addresses of at least two referees to: Dr. Daphne Bavelier, Institute for Computational and Cognitive Sciences, Georgetown University, New Research Building, 3970 Reservoir Road, Washington DC 20007-2197, or via email to daphne at braindev.uoregon.edu.

From ling at cs.hku.hk Wed Jul 31 06:06:56 1996
From: ling at cs.hku.hk (Charles X. Ling)
Date: Wed, 31 Jul 96 06:06:56 HKT
Subject: Controversies at the Symposium of Computational Model of Development
Message-ID: <9607302206.AA16412@stamina.cs.hku.hk>

The 1996 Cognitive Science Conference was held from June 12 to 15 at UC San Diego. During the Conference, Kim Plunkett and Tom Shultz organized a symposium on Computational Models of Development, with four speakers: Jeff Elman, Denis Mareschal, Tom Shultz, and me. There were also two discussants, Jeff Shrager and Liz Bates. The symposium sparked heated debates among speakers, discussants, and the audience, and many good discussions continued after the Symposium. I have put a personal account of the event, the transparencies of my speech, and Jeff Shrager's commentary on my web page.
I am also setting up animations on how decision tree learning algorithms learn quasi-regular associations, demonstrate graceful degradation, and show a graded effect of development. Please let me know any comments you may have. You can find all of the above at:

http://www.cs.hku.hk/~ling

Cheers,
Charles

From gmato at mu.ft.uam.es Wed Jul 31 10:56:23 1996
From: gmato at mu.ft.uam.es (German Mato)
Date: Wed, 31 Jul 1996 16:56:23 +0200
Subject: paper on the dynamics of receptive fields
Message-ID: <9607311456.AA10800@mu.ft.uam.es>

The following paper on the dynamics of receptive fields in the visual cortex has been put into the Neuroprose repository. Comments and suggestions are welcome. Sorry, no hard copy available.

German Mato and Nestor Parga
gmato at delta.ft.uam.es
parga at ccuam3.sdi.uam.es

*****************************************************
FTP-host: archive.cis.ohio-state.edu
FTP-filename: /pub/neuroprose/parga.dynrf.ps.Z
44 pages

Abstract: In this work we study the dynamical changes of receptive fields in a system in which the input has a lesion. A two-layer architecture representing the retina and V1 is introduced, and the values of the horizontal connections in the second layer are updated in such a way that the (two-point and higher) correlations between the activities of different cortical neurons are minimized. We find that this algorithm, with a simultaneous dynamics for the activities of cortical neurons and for their horizontal connections, predicts several experimental facts: the expansion of the receptive fields, the bias in feature localization experiments, and the *filling-in* phenomenon (in which the lesion is "filled" with the surrounding pattern after a time of adaptation). We find that non-Hebbian terms in the updating rule for the horizontal connections are essential to account quantitatively for the experimental results.

From JEROEN.vanDEUTEKOM at wkap.nl Wed Jul 31 05:49:07 1996
From: JEROEN.vanDEUTEKOM at wkap.nl (Jeroen van Deutekom)
Date: Wed, 31 Jul 1996 11:49:07 +0200
Subject: Neural Processing Letters
Message-ID: <3156481031071996/A02085/WKVAX5/11A7FAA82300*@MHS>

Considering the background of this mailing list, it might be of interest to inform you about the journal

NEURAL PROCESSING LETTERS

Editors in Chief:
Michel Verleysen, Univ. Catholique de Louvain, Belgium
Francois Blayo, EERIE, Lyon, France

This journal is published by Kluwer Academic Publishers. More information on Neural Processing Letters, including current tables of contents, can be obtained from the following sites:

Gopher: gopher.wkap.nl
WWW: ftp://gopher.wkap.nl/journal/nepl
gopher://gopher.wkap.nl:70/00gopher_root1%3A%5Bjournal.nepl%5Dnepl.inf

Information can also be obtained via:
E-mail from North and South America: kluwer at wkap.com
E-mail from the Rest of the World: services at wkap.nl