From dblank at comp.uark.edu Tue Dec 1 15:15:15 1998 From: dblank at comp.uark.edu (Douglas Blank) Date: Tue, 01 Dec 1998 14:15:15 -0600 Subject: PhD Fellowships, Fall 1999: Please Post Message-ID: <36644E53.1C470D51@comp.uark.edu> PhD Fellowships, Fall 1999 Department of Computer Science and Engineering Artificial Intelligence & Robotics Laboratory The University of Arkansas, Fayetteville, is now accepting applications for PhD fellowships. Students may study any of a wide range of computer science topics, including high-level connectionist modeling, intelligent robotic control, textual processing, software agency, data mining, and methods of emergent computation. Five fellowships are available starting Fall 1999, each providing $15k/year plus tuition and renewable for a total of 4 years. Candidates for these fellowships should be US citizens or permanent residents of the US. UA is located in the Ozark Mountains on the borders of Arkansas, Missouri, and Oklahoma. Applicants should apply to the Graduate School and for a Graduate Assistantship.
Application materials are available online at http://www.uark.edu/depts/gradinfo/ For more information, please contact: Graduate Advisor Department of Computer Science Science-Engineering 232 University of Arkansas Fayetteville, AR 72701 Voice: (501)575-6427 FAX: (501)575-3817 http://csci.uark.edu/ csci at cavern.uark.edu -- ===================================================================== dblank at comp.uark.edu Douglas Blank, University of Arkansas Assistant Professor Computer Science ==================== http://www.uark.edu/~dblank ==================== From torras at iri.upc.es Tue Dec 1 12:26:12 1998 From: torras at iri.upc.es (Carme Torras) Date: Tue, 1 Dec 1998 18:26:12 +0100 (MET) Subject: CFP: Special Issue on Adaptive Robots Message-ID: <199812011726.SAA00912@sibelius.upc.es> =============================================================================== Special Issue of the journal CONNECTION SCIENCE on *** ADAPTIVE ROBOTS *** CALL FOR PAPERS: Deadline March 15th, 1999 Adaptivity is the capability of self-modification that some agents have, which allows them to maintain a level of performance when facing environmental changes, or to improve it when confronted repeatedly with the same situation. This special issue is aimed at capturing the state of the art in the intricate task of endowing robots with adaptive capabilities, with a special emphasis on neural-based solutions. Thus, some examples of topics covered are:

- Adaptive sensing
- Adaptive gaits for walking robots
- Self-calibration of robot manipulators
- Adaptive dynamic control of flexible robot arms
- Acquiring fine manipulation skills
- Learning hand-eye coordination
- Exploration and reinforcement learning
- Improving robot navigation
- Adaptive multi-robot systems

The special issue will adhere to an engineering perspective, i.e.
the emphasis will be on solving practical robotic problems using adaptive techniques, disregarding their possible biological (or cognitive) inspiration or plausibility. Work on real robots is preferred, with special attention devoted to replicability of results, as well as to the discussion of the limitations (together with the advantages) of the proposed techniques.

Guest editor:
-------------
Carme TORRAS, CSIC-UPC (Spain)

Editorial board:
----------------
Rudiger DILLMANN, University of Karlsruhe (Germany)
Leslie P. KAELBLING, Brown University (USA)
Ben KRÖSE, University of Amsterdam (The Netherlands)
José R. MILLÁN, Joint Research Centre (Italy)
Helge RITTER, University of Bielefeld (Germany)
Shankar SASTRY, University of California at Berkeley (USA)
Noel SHARKEY, University of Sheffield (UK)
Tim SMITHERS, CEIT (Spain)
Tom ZIEMKE, University of Skövde (Sweden)

Submissions to this special issue should be sent by March 15th, 1999 to: Carme Torras, Institut de Robotica i Informatica Industrial (CSIC-UPC), Gran Capita 2-4 (edifici Nexus), 08034-Barcelona (Spain), e-mail: ctorras at iri.upc.es http://www-iri.upc.es/people/torras

SCHEDULE:
---------
December 98 - call for papers
March 15th, 99 - submission deadline
May 15th, 99 - information to authors
July 1st, 99 - deadline for final papers
October 99 - publication of the special issue

*** CONNECTION SCIENCE *** Journal of Neural Computing, Artificial Intelligence and Cognitive Research http://www.carfax.co.uk/cos-ad.htm =============================================================================== From marks at gizmo.usc.edu Wed Dec 2 12:42:49 1998 From: marks at gizmo.usc.edu (Mark S.
Seidenberg) Date: Wed, 2 Dec 1998 10:42:49 -0700 Subject: training at USC Message-ID: GRADUATE AND POSTDOCTORAL TRAINING IN COGNITIVE AND COMPUTATIONAL NEUROSCIENCE AT THE UNIVERSITY OF SOUTHERN CALIFORNIA This announcement concerns opportunities for pre- and post-doctoral training in cognitive neuroscience and computational neuroscience at USC. We seek to recruit outstanding individuals to be supported under a training grant funded by the National Institute of Mental Health. The goal of the training program is to develop future cognitive neuroscientists who will be able to advance the goal of understanding brain-behavior relationships, using computational modeling as a primary tool. USC has brought together a group of outstanding researchers in cognitive neuroscience, with a particular concentration of expertise in neural network modeling at different levels of abstraction. Our program focuses on four main areas: language, vision, learning and memory, and motor performance. Research activities involve behavioral studies of normal and brain-injured humans; psychophysiological and neuroimaging techniques; basic neuroscience methods such as single-cell recording and lesion studies in animals, coupled with computational modeling. For additional information contact Mark S. Seidenberg, Program Director (marks at neuro.usc.edu). The Neuroscience Doctoral program: http://www.usc.edu/dept/nbio/nibs.html Cognitive Neuroscience at USC: http://www.usc.edu/dept/nbio/nibs/cognit.html Computational Neuroscience at USC: http://www.usc.edu/dept/nbio/nibs/compute.html Application Procedure: http://www.usc.edu/dept/nbio/nibs/howto.html Training Program: http://siva.usc.edu/coglab/training.html ____________________________________ Mark S. 
Seidenberg Neuroscience Program University of Southern California 3614 Watt Way Los Angeles, CA 90089-2520 Phone: 213-740-9174 Fax: 213-740-5687 http://www-rcf.usc.edu/~seidenb http://siva.usc.edu/coglab ____________________________________ From thimm at idiap.ch Wed Dec 2 08:37:24 1998 From: thimm at idiap.ch (Georg Thimm) Date: Wed, 02 Dec 1998 14:37:24 +0100 Subject: Contents of Neurocomputing 23 (1998) Message-ID: <199812021337.OAA26541@rotondo.idiap.ch> Dear reader, Please find below a compilation of the contents for Neurocomputing and Scanning the Issue written by V. David Sanchez A. More information on the journal is available at the URL http://www.elsevier.nl/locate/jnlnr/05301 . The contents of this and other journals published by Elsevier are also distributed by the ContentsDirect service (see the URL http://www.elsevier.nl/locate/ContentsDirect). Please feel free to redistribute this message. My apologies if this message is inappropriate for this mailing list; I would appreciate feedback. With kindest regards, Georg Thimm

======================================================================
Journal : NEUROCOMPUTING
ISSN : 0925-2312
Vol./Iss. : 23 / 1-3

A comparative study of medium-weather-dependent load forecasting using enhanced artificial/fuzzy neural network and statistical techniques
Elkateb, M.M. pp.: 3-13

Prediction of iron losses of wound core distribution transformers based on artificial neural networks
Georgilakis, P.S. pp.: 15-29

Laboratory investigation of a digital recurrent network for transmission line directional protection
Sanaye-Pasand, M. pp.: 31-46

A neural network based estimator for electricity spot-pricing with particular reference to weekend and public holidays
Wang, A.J. pp.: 47-57

A neural network based protection technique for combined 275 kV/400 kV double circuit transmission lines
Xuan, Q.Y. pp.: 59-70

Artificial neural networks for short-term energy forecasting: Accuracy and economic value
Hobbs, Benjamin F. pp.: 71-84

Power system security boundary visualization using neural networks
McCalley, James D. pp.: 85-96

The use of artificial neural networks for condition monitoring of electrical power transformers
Booth, C. pp.: 97-109

Neural networks for power system condition monitoring and protection
Cannas, B. pp.: 111-123

Recurrent neural network for forecasting next 10 years loads of nine Japanese utilities
Kermanshahi, Bahman pp.: 125-133

Use of neural networks for customer tariff exploitation by means of short-term load forecasting
Verona, Francesco Bini pp.: 135-149

Topology-independent artificial neural network for overload screening
Riquelme, Jesús pp.: 151-160

A neural network-based tool for preventive control of voltage stability in multi-area power systems
Maiorano, A. pp.: 161-176

An incipient fault detection system based on the probabilistic radial basis function network: Application to the diagnosis of the condenser of a coal power plant
Muñoz, A. pp.: 177-194

Electric utility coal quality analysis using artificial neural network techniques
Salehfar, H. pp.: 195-206

A class of hybrid intelligent system for fault diagnosis in electric power systems
Jota, Patricia R.S. pp.: 207-224

Arcing fault detection using artificial neural networks
Sidhu, T.S. pp.: 225-241

The artificial neural-networks-based relay algorithm for the detection of stochastic high impedance faults
Snider, L.A. pp.: 243-254

Artificial neural network for reactive power optimization
El-Sayed, Mohamed A.H. pp.: 255-263

Evolving artificial neural networks for short term load forecasting
Srinivasan, Dipti pp.: 265-276

Simple recurrent neural network: A neural network structure for control systems
Hernández, Rafael Parra pp.: 277-289

======================================================================
Neurocomputing 23 (1998) vii-ix
Scanning the issue

A comparative study of medium-weather-dependent load forecasting using enhanced artificial/fuzzy neural network and statistical techniques is presented by M.M. Elkateb, K. Solaiman and Y. Al-Turki. The introduction of a time index feature significantly enhances the performance of the ANN and FNN techniques. On the conventional side, an AutoRegressive Integrated Moving Average (ARIMA) technique is used. P.S. Georgilakis, N.D. Hatziargyriou, N.D. Doulamis, A.D. Doulamis and S.D. Kollias describe the Prediction of iron losses of wound core distribution transformers based on artificial neural networks. Generation of training and test data, selection of candidate attributes and the generation of the neural network structure are discussed. Suitability for prediction and classification of individual core and transformer specific iron losses is confirmed. In Laboratory investigation of a digital recurrent network for transmission line directional protection M. Sanaye-Pasand and O.P. Malik describe a recurrent neural network based technique for identifying the direction of a fault on a transmission line. Experimental evaluation shows that the approach is accurate, fast, and robust. A.J. Wang and B. Ramsay present A neural network based estimator for electricity spot-pricing with particular reference to weekend and public holidays. The estimator consists of two parts, the front-end processor and the neural network based predictor. The estimator is tested on a real System Marginal Price (SMP) prediction problem.
A neural network based protection technique for combined 275 kV/400 kV double circuit transmission lines is introduced by Q.Y. Xuan, R.K. Aggarwal, A.T. Johns, R.W. Dunn and A. Bennett. The technique extracts in a pre-processing step the main features from the measured signals. The test results confirm that the adaptive protection technique works well for double-circuit lines with different voltage levels on the two circuits. B.F. Hobbs, U. Helman, S. Jitprapaikulsarn, S. Konda and D. Maratukulam describe Artificial neural networks for short-term energy forecasting: Accuracy and economic value. Eighteen electric utilities and five gas utilities are surveyed. The utilities report on the significant error reduction in daily electric load forecasts when using artificial neural networks. An average of $800K in savings per year and utility is estimated. J.D. McCalley, G. Zhou and V. Van Acker present Power system security boundary visualization using neural networks. The relationship between the precontingency operating parameters and the postcontingency performance measure is mapped using neural networks. The best set of operating parameters is selected using genetic algorithms. C. Booth and J.R. McDonald describe The use of artificial neural networks for condition monitoring of electrical power transformers. Artificial neural networks are used in this context for estimation, e.g. in the determination of transformer winding vibration levels, and for classification, e.g. in the automatic separation of healthy/unhealthy data. In Neural networks for power system condition monitoring and protection B. Cannas, G. Celli, M. Marchesi and F. Pilo propose a methodology based on a locally recurrent, globally feed-forward network and a neural state classifier. The accurate prediction of control variables and the fast recognition of abnormal events is demonstrated. In Recurrent neural network for forecasting next 10 years loads of nine Japanese utilities B.
Kermanshahi applies a Recurrent Neural Network (RNN) and a 3-layer Backpropagation (BP) network for long-term load forecasting. The RNNs forecast the loads one year ahead whereas the BP networks forecast the next five to ten years. In Use of neural networks for customer tariff exploitation by means of short-term load forecasting F.B. Verona and M. Ceraolo apply Radial Basis Function (RBF) networks trained with leave-one-out cross validation for electric load forecasting. A prototype system shows good performance allowing some load management. J. Riquelme, A. Gómez and J.L. Martínez describe Topology-independent artificial neural networks for overload screening. The Artificial Neural Networks (ANN) are capable of identifying the set of harmful contingencies. Experimental results from a real-size power network are presented. ANNs enhanced with bus power injections can handle topological changes in the power system. A. Maiorano and M. Trovato present A neural network-based tool for preventive control of voltage stability in multi-area power systems. The power system is decomposed into a number of areas, for each of which a trained neural network outputs an area-based voltage-stability index. The minimum among the index values characterizes the voltage stability of the whole power system. A. Muñoz and M.A. Sanz-Bobi describe An incipient fault detection system based on the probabilistic radial basis function network: Application to the diagnosis of the condenser of a coal power plant. The Probabilistic Radial Basis Function Network (PRBFN) is introduced. The faults are detected by comparing the actual plant behavior with its prediction. The prediction makes use of a model of normal condition operation. H. Salehfar and S.A. Benson describe the Electric utility coal quality analysis using artificial neural network techniques. Impurities and ash forming species in coal are determined using the neural network.
Results are compared to those using Computer-Controlled Scanning Electron Microscopy (CCSEM) methods and used to predict the deposition tendency and slagging behavior of ash under different operation conditions. P.R.S. Jota, S.M. Islam, T. Wu and G. Ledwich present A class of hybrid intelligent system for fault diagnosis in electric power systems. A hybrid intelligent system based on neuro-fuzzy, neuro-expert and fuzzy-expert algorithms is used to detect a number of faults in a range of electric power system equipment in Australia and Brazil. T.S. Sidhu, G. Singh and M.S. Sachdev describe a technique for Arcing fault detection using artificial neural networks. Acoustic radiation, infrared radiation and radio waves produced by arcing are recorded on a DSP-based data acquisition system. Classification is done using three-layer feedforward neural networks. Experimental results are reported. The artificial neural-networks-based relay algorithm for the detection of stochastic high impedance faults (HIF) is described by L.A. Snider and Y.S. Yuen. Low-order harmonics of residual quantities are used as the inputs to the artificial neural network. Arcs associated with high impedance faults distort the voltage and current waveforms and are modeled via simulation. The distortions are recognized by the algorithm. M.A.H. El-Sayed presents an Artificial neural network for reactive power optimization. Transmission losses are minimized using a neural network scheme. This scheme is enhanced by a rule-based approach when the network does not provide a feasible solution. Numerical results from a real power system provide confirmation of the applicability of the scheme. D. Srinivasan describes Evolving artificial neural networks for short term load forecasting. The Artificial Neural Networks (ANN) are generated using a genetic algorithm (GA) and forecast one-day ahead hourly electric loads. The approach avoids the use of large historical data sets and frequent retraining.
When compared with statistical methods to solve the same problem, the neural network approach shows better performance. R. Parra Hernández, J. Álvarez Gallegos and J.A. Hernández Reyes present Simple recurrent neural network (SRNN): A neural network structure for control systems. SRNNs are used to control linear and nonlinear dynamic systems. Results show that inverse modeling of dynamic systems is feasible using SRNNs and that only a few parameters are needed when using SRNNs to control dynamical systems. I appreciate the cooperation of all those who submitted their work for inclusion in this issue. V. David Sanchez A. Editor-in-Chief From jayanta at www.isical.ac.in Thu Dec 3 00:20:59 1998 From: jayanta at www.isical.ac.in (Jayanta Basak) Date: Thu, 3 Dec 1998 10:50:59 +0530 (IST) Subject: paper posting Message-ID: The following paper is available at http://xxx.lanl.gov/abs/cond-mat/9811403 Feedback regarding this article will be highly appreciated by the authors.

Title : Response of an Excitatory-Inhibitory Neural Network to External Stimulation: An Application to Image Segmentation
Authors : Sitabhra Sinha and Jayanta Basak
Abstract : Neural network models comprising elements which have exclusively excitatory or inhibitory synapses are capable of a wide range of dynamic behavior, including chaos. In this paper, a simple excitatory-inhibitory neural pair, which forms the building block of larger networks, is subjected to external stimulation. The response shows transitions between various types of dynamics, depending upon the magnitude of the stimulus. Coupling such pairs over a local neighborhood in a two-dimensional plane, the resultant network can achieve a satisfactory segmentation of an image into ``object'' and ``background''. Results for synthetic and ``real-life'' images are given.

Regards, Jayanta Basak Machine Intelligence Unit Indian Statistical Institute Calcutta 700 035, India.
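The excitatory-inhibitory pair in the abstract above can be illustrated with a small numerical sketch. To be clear, this is a generic toy model, not Sinha and Basak's actual system: the sigmoidal map, the weights (w_ee, w_ei, w_ie, w_ii), the gain, and the stimulus values are all invented for illustration.

```python
import math

def sigmoid(z):
    """Smooth threshold nonlinearity used for both units."""
    return 1.0 / (1.0 + math.exp(-z))

def ei_pair(x, y, stimulus, w_ee=1.5, w_ei=2.0, w_ie=2.5, w_ii=1.0, gain=10.0):
    """One synchronous update of an excitatory unit x coupled to an
    inhibitory unit y; both units also receive the external stimulus."""
    x_next = sigmoid(gain * (w_ee * x - w_ei * y + stimulus))
    y_next = sigmoid(gain * (w_ie * x - w_ii * y + stimulus))
    return x_next, y_next

def trajectory(stimulus, steps=200):
    """Excitatory activity over time for a fixed stimulus magnitude."""
    x, y = 0.1, 0.1
    out = []
    for _ in range(steps):
        x, y = ei_pair(x, y, stimulus)
        out.append(x)
    return out

# Different stimulus magnitudes drive the pair along different
# trajectories; the paper studies transitions of this kind.
for s in (0.0, 0.5):
    tail = trajectory(s)[-5:]
    print(f"stimulus={s}: last excitatory activities {[round(v, 3) for v in tail]}")
```

Sweeping the stimulus over a finer grid and inspecting the tail of `trajectory` would expose the kinds of regime changes (fixed points, oscillations, irregular dynamics) the abstract refers to.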
From ken at phy.ucsf.EDU Fri Dec 4 02:33:06 1998 From: ken at phy.ucsf.EDU (Ken Miller) Date: Thu, 3 Dec 1998 23:33:06 -0800 (PST) Subject: UCSF Postdoctoral and Graduate Fellowships in Theoretical Neurobiology Message-ID: <13927.36914.747091.701444@coltrane.ucsf.edu> FULL INFO: http://www.sloan.ucsf.edu/sloan/sloan-info.html PLEASE DO NOT USE 'REPLY'; FOR MORE INFO USE ABOVE WEB SITE OR CONTACT ADDRESSES GIVEN BELOW The Sloan Center for Theoretical Neurobiology at UCSF solicits applications for pre- and post-doctoral fellowships, with the goal of bringing theoretical approaches to bear on neuroscience. Applicants should have a strong background and education in mathematics, theoretical or experimental physics, or computer science, and commitment to a future research career in neuroscience. Prior biological or neuroscience training is not required. The Sloan Center offers opportunities to combine theoretical and experimental approaches to understanding the operation of the intact brain. Young scientists with strong theoretical backgrounds will receive scientific training in experimental approaches to understanding the operation of the intact brain. They will learn to integrate their theoretical abilities with these experimental approaches to form a mature research program in integrative neuroscience. The research undertaken by the trainees may be theoretical, experimental, or a combination. TO APPLY, please send a curriculum vitae, a statement of previous research and research goals, up to three relevant publications, and have two letters of recommendation sent to us. The application deadline is February 1, 1999. Send applications to: Steve Lisberger Sloan Center for Theoretical Neurobiology at UCSF Department of Physiology University of California 513 Parnassus Ave. San Francisco, CA 94143-0444 PRE-DOCTORAL applicants with strong theoretical training may seek admission into the UCSF Neuroscience Graduate Program as a first-year student. 
Applicants seeking such admission must apply by Jan. 5, 1999 to be considered for Fall 1999 admission. Application materials for the UCSF Neuroscience Program may be obtained from Cindy Kelly Neuroscience Graduate Program Department of Physiology University of California San Francisco San Francisco, CA 94143-0444 neuroscience at phy.ucsf.edu Be sure to include your surface-mail address. The procedure is: make a normal application to the UCSF Neuroscience program, but also alert the Sloan Center of your application by writing to Steve Lisberger at the address given above. If you need more information:

-- Consult the Sloan Center WWW Home Page: http://www.sloan.ucsf.edu/sloan
-- Send e-mail to sloan-info at phy.ucsf.edu
-- See also the home page for the W.M. Keck Foundation Center for Integrative Neuroscience, in which the Sloan Center is housed: http://www.keck.ucsf.edu/

From Kim.Plunkett at psy.ox.ac.uk Fri Dec 4 09:33:08 1998 From: Kim.Plunkett at psy.ox.ac.uk (Kim Plunkett) Date: Fri, 4 Dec 1998 14:33:08 GMT Subject: No subject Message-ID: <199812041433.OAA21282@pegasus.psych.ox.ac.uk> UNIVERSITY OF OXFORD OXFORD SUMMER SCHOOL ON CONNECTIONIST MODELLING Department of Experimental Psychology University of Oxford 18th - 30th July 1999 Applications are invited for participation in a 2-week residential Summer School on techniques in connectionist modelling. The course is aimed primarily at researchers who wish to exploit neural network models in their teaching and/or research, and it will provide a general introduction to connectionist modelling, biologically plausible neural networks and brain function through lectures and exercises on Macintoshes and PCs. The course is interdisciplinary in content, though many of the illustrative examples are taken from cognitive and developmental psychology, and cognitive neuroscience. The instructors with primary responsibility for teaching the course are Kim Plunkett and Edmund Rolls.
No prior knowledge of computational modelling will be required, though simple word processing skills will be assumed. Participants will be encouraged to start work on their own modelling projects during the Summer School. The cost of participation in the Summer School is £950. This figure covers the cost of accommodation (bed and breakfast at St. John's College), registration and all literature required for the Summer School. Participants will be expected to cover their own travel and meal costs. A number of partial bursaries will be available for graduate students. Applicants should indicate whether they wish to be considered for a graduate student scholarship but are advised to seek further funding as well, since in previous years the number of graduate student applications has far exceeded the number of scholarships available. There is a Summer School World Wide Web page describing the contents of the 1999 Summer School available at: http://www-cogsci.psych.ox.ac.uk/summer-school/ Further information about the contents of the course can be obtained from Steven.Young at psy.ox.ac.uk If you are interested in participating in the Summer School, please contact: Mrs Sue King Department of Experimental Psychology University of Oxford South Parks Road Oxford OX1 3UD Tel: (01865) 271353 Email: susan.king at psy.oxford.ac.uk Please send a brief description of your background with an explanation of why you would like to attend the Summer School (one page maximum) no later than 31st January 1999. From greiner at cs.ualberta.ca Fri Dec 4 15:55:22 1998 From: greiner at cs.ualberta.ca (Russ Greiner) Date: Fri, 4 Dec 1998 13:55:22 -0700 Subject: Query Distribution Message-ID: <19981204205525Z13557-6723+317@scapa.cs.ualberta.ca> Dear Colleagues, There are now a number of deployed systems that use belief nets (aka bayesian nets, probability nets, ...) to answer queries -- ie, to compute the posterior probability of some variable(s), based on some specified set of evidence.
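For readers unfamiliar with such queries, here is a minimal sketch of what "computing a posterior probability given evidence" means, using an invented two-node net (Cancer -> Fever). The variable names and probabilities are hypothetical, echoing the examples below; they are not from any deployed system mentioned here.

```python
# Hypothetical two-node belief net, Cancer -> Fever (numbers invented):
P_CANCER = 0.01                          # prior P(Cancer=T)
P_FEVER_GIVEN = {True: 0.8, False: 0.1}  # P(Fever=T | Cancer)

def posterior_cancer(fever_observed=True):
    """P(Cancer=T | Fever), by enumerating both values of Cancer and
    renormalizing -- the computation a belief-net query performs."""
    def joint(cancer):
        p_c = P_CANCER if cancer else 1.0 - P_CANCER
        p_f = P_FEVER_GIVEN[cancer]
        if not fever_observed:
            p_f = 1.0 - p_f
        return p_c * p_f
    return joint(True) / (joint(True) + joint(False))

# Bayes' rule: 0.01*0.8 / (0.01*0.8 + 0.99*0.1)
print(round(posterior_cancer(True), 4))  # -> 0.0748
```

Real query engines use far more efficient inference than this brute-force enumeration, but the quantity computed is the same.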
It would be very useful to know the actual distribution of queries posed to such real-world systems; eg, how often the user asks "What is the probability of cancer, given Fever=T and Age>42 ?", vs "What is the probability of cancer, given Fever=F, lump=F and Gender=M ?" vs "What is the prior probability of hepatitis ?" etc etc etc. We could then use this "query distribution" to evaluate our learning algorithms, by computing (perhaps) the *average (sum-squared) accuracy* of the belief net it returns, where the "average" is wrt this real-world distribution (cf, [Greiner/Grove/Schuurmans, "Learning Bayesian Nets that Perform Well", UAI-97]). We are therefore looking for some real-world *query distributions*. Please let me know if you can provide this information -- perhaps in the form of the set of queries actually posed to a real system, or a set of session transcripts or log files of a system's interactions with its users, or ... To avoid confusion, note that this QUERY DISTRIBUTION cannot necessarily be inferred from the given belief net B, as the query distribution might be completely unrelated to the "NATURAL DISTRIBUTION" of events (encoded by B). Eg, we may ask many queries about low probability events --- the probability of the QUERY "What is the probability of cancer?" may be very high, even though the actual probability of Cancer is very low. Thank you.

| Russell Greiner Phone: (403) 492-5461 |
| Dep't of Computing Science FAX: (403) 492-1071 |
| University of Alberta Email: greiner at cs.ualberta.ca |
| Edmonton, AB T6G 2H1 Canada http://www.cs.ualberta.ca/~greiner/ |

From greiner at cs.ualberta.ca Fri Dec 4 15:44:31 1998 From: greiner at cs.ualberta.ca (Russ Greiner) Date: Fri, 4 Dec 1998 13:44:31 -0700 Subject: SIGART/AAAI Doctoral Consortium Message-ID: <19981204204433Z13530-6722+256@scapa.cs.ualberta.ca> The SIGART/AAAI Doctoral Consortium is a great opportunity for PhD students to receive feedback on their research and network with people in the field.
Accepted participants will receive travel scholarships and free registration to AAAI-99. The call for participation is at: http://www.aaai.org/Conferences/National/1999/aaai99-dccall.html Note that submissions are due 5 February 1999. From jose at tractatus.rutgers.edu Sat Dec 5 11:26:40 1998 From: jose at tractatus.rutgers.edu (Stephen Jose Hanson) Date: Sat, 05 Dec 1998 12:26:40 -0400 Subject: POSTDOC in BRAIN IMAGING & COMPUTATION RDLDL & RUMBA Message-ID: <36695EC0.E84032E5@tractatus.rutgers.edu> The Rutgers RUMBA (Rutgers Mind/Brain Analysis) Center and Rutgers DLDL (Distributed Laboratories for Digital Libraries) have an immediate opening for a Postdoctoral Researcher in the area of Functional Brain Imaging and Neural Computation--see ad below: RUTGERS UNIVERSITY (RDLDL & RUMBA) is seeking a post-doctoral researcher with experience in brain imaging and computation (neural networks, etc.) to work on problems related to brain function and information-finding tasks. The underlying questions are: (1) does the brain of a person doing fundamental tasks, such as scanning for a given image, for a given term, or for terms or images related to a given concept, exhibit patterns of activation that distinguish these states from other states associated with other types of information processing? (2) if such states and patterns of activation exist, can knowledge of them be used to improve our understanding of the cognitive aspects of information finding and, ultimately, to improve the systems used to perform those tasks? The successful candidate will work closely with Profs. Martin-Bly, Hanson and Kantor in the UMDNJ-Rutgers (RUMBA) functional imaging laboratories. The starting time for the position is January 1999. Rutgers University is an equal opportunity/affirmative action employer. Qualified women and minority candidates are especially encouraged to apply. Send a CV, three letters of recommendation, and 2 reprints to Professor B.
Martin Bly, Department of Psychology, RDLDL/RUMBA SEARCH, Rutgers University, Newark, NJ 07102. Also see: http://diglib.rutgers.edu/RDLDL/ & http://www.psych.rutgers.edu/~RUMBA. Preferably send material through email: please send email with subject header RDLDL or RUMBA, with application material, to ben at psychology.rutgers.edu or jose at psychology.rutgers.edu. From Otto_Schnurr-A11505 at email.mot.com Wed Dec 9 18:51:09 1998 From: Otto_Schnurr-A11505 at email.mot.com (Otto Schnurr-A11505) Date: Wed, 09 Dec 1998 17:51:09 -0600 Subject: NN Text-To-Speech at Motorola: Papers and Audio Files Message-ID: <366F0CED.F4C06DC1@ccrl.mot.com> Dear Connectionists: Motorola has been developing text-to-speech technology that utilizes multiple cooperating neural networks, each specializing in a particular area of human language ability. Papers that describe this work are now available electronically at the LANL archive:

1998:
=====
Title : A High Quality Text-To-Speech System Composed of Multiple Neural Networks
Authors : Orhan Karaali, Gerald Corrigan, Noel Massey, Corey Miller, Otto Schnurr and Andrew Mackie
Abstract : http://xxx.lanl.gov/abs/cs/9812006
PostScript : http://xxx.lanl.gov/ps/cs/9812006.ps.gz
PS & Audio : http://xxx.lanl.gov/e-print/cs/9812006.tar.gz

1997:
=====
Title : Text-To-Speech Conversion with Neural Networks: A Recurrent TDNN Approach
Authors : Orhan Karaali, Gerald Corrigan, Ira Gerson and Noel Massey
Abstract : http://xxx.lanl.gov/abs/cs/9811032
PostScript : http://xxx.lanl.gov/ps/cs/9811032.ps.gz

Title : Generating Segment Durations in a Text-To-Speech System: A Hybrid Rule-Based/Neural Network Approach
Authors : Gerald Corrigan, Noel Massey and Orhan Karaali
Abstract : http://xxx.lanl.gov/abs/cs/9811030
PostScript : http://xxx.lanl.gov/ps/cs/9811030.ps.gz

Title : Variation and Synthetic Speech
Authors : Corey Miller, Orhan Karaali and Noel Massey
Abstract : http://xxx.lanl.gov/abs/cmp-lg/9711004
PostScript : http://xxx.lanl.gov/ps/cmp-lg/9711004.ps.gz
1996:
=====
Title : Speech Synthesis with Neural Networks
Authors : Orhan Karaali, Gerald Corrigan and Ira Gerson
Abstract : http://xxx.lanl.gov/abs/cs/9811031
PostScript : http://xxx.lanl.gov/ps/cs/9811031.ps.gz

Due to requests, we have submitted excerpts of speech generated by our text-to-speech system. These audio files demonstrate one female voice and two male voices and are available at http://xxx.lanl.gov/e-print/cs/9812006.tar.gz Note: If your system does not support Windows WAV files, try a tool like "sox" to translate the audio into a format of your choice. Regards, Otto Schnurr Speech Processing Research Lab Motorola otto_schnurr at email.mot.com From rsun at research.nj.nec.com Thu Dec 10 09:45:30 1998 From: rsun at research.nj.nec.com (Ron Sun) Date: Thu, 10 Dec 1998 09:45:30 -0500 Subject: AAAI'99 CFP in the neural/evolutionary/fuzzy computation areas Message-ID: <199812101445.JAA20138@pc-rsun.nj.nec.com> Solicitation for submissions from the neural/evolutionary/fuzzy communities to: Sixteenth National Conference on Artificial Intelligence Sponsored by the American Association for Artificial Intelligence (AAAI). July 18-22, 1999, Orlando, Florida We would like to encourage the submission of high-quality papers in the areas of neural/fuzzy/evolutionary computation to AAAI'99. The 1999 AAAI conference includes many program committee members in the neural/evolutionary/fuzzy areas. This year AAAI'99 expressly solicits top-quality submissions in the above-mentioned areas. Every consideration will be given to provide a fair review (albeit rigorous, in accordance with AAAI's long-standing high standard) to each paper. Your submission will be reviewed by experts in your area who understand and appreciate its contribution (or the lack of it) to the study of computational intelligence.
To highlight and showcase the advances and contributions that the neural/evolutionary/fuzzy communities are making in computational intelligence, in a broader context that goes beyond disciplinary and paradigmatic confines, we encourage you to submit your best work to this year's AAAI and share that work with others in the broader arena of artificial intelligence. *********************************** Some AAAI'99 program committee members whose expertise lies primarily in the neural/evolutionary/fuzzy areas: Ron Sun (senior program committee member) Lee Giles (senior program committee member) Hamid Berenji Kenneth DeJong Marco Gori Vasant Honavar John Koza Yann LeCun Risto Miikkulainen Darrell Whitley Ron Yager Jacek M. Zurada Timetable for Submission January 19, 1999: electronic submission of abstracts January 20, 1999: submission of six (6) paper copies to AAAI office (see the AAAI web page describing all caveats, formats and details). March 12, 1999: notification of acceptance or rejection. Please send papers to: AAAI-99 American Association for Artificial Intelligence 445 Burgess Drive Menlo Park, CA 94025-3442 For further details regarding submission, see http://www.aaai.org/Conferences/National/1999/aaai99-call.html *********************************** ------------ Dr. C. Lee Giles / NEC Research Institute / 4 Independence Way Princeton, NJ 08540 / 609-951-2642 / Fax 2482 ------------ Prof. Ron Sun Dept. 
CS, University of Alabama, Tuscaloosa, AL http://cs.ua.edu/~rsun rsun at cs.ua.edu, rsun at research.nj.nec.com ------------ From haith at ptolemy.arc.nasa.gov Fri Dec 11 15:45:52 1998 From: haith at ptolemy.arc.nasa.gov (Gary Haith) Date: Fri, 11 Dec 1998 12:45:52 -0800 Subject: Model of Retinogeniculate Development: Dissertation Available Online Message-ID: <199812112045.MAA24334@golem.arc.nasa.gov> MODELING ACTIVITY-DEPENDENT DEVELOPMENT IN THE RETINOGENICULATE PROJECTION can be downloaded at: (post-script document): http://ic-www.arc.nasa.gov/people/haith/diss.ps (gzipped post-script document): http://ic-www.arc.nasa.gov/people/haith/diss.ps.gz Gary Haith haith at ptolemy.arc.nasa.gov ############ Abstract: ############ In higher mammals, the primary visual pathway starts with the (``retinogeniculate'') projection from the retina to the dorsal lateral geniculate nucleus (dLGN) of the thalamus, which in turn projects to visual cortex. Although the retinal axons initially innervate the dLGN in a relatively disorganized manner, they are precisely arranged by maturity. Some dominant features of this organization emerge only under the influence of activity, yet these features are established before eye-opening or photoreceptor function. The crucial activity is supplied by spontaneous bursts of action potentials that propagate in waves across the immature retinal ganglion cell layer that projects to the dLGN. Under the influence of retinal activity, the retinal axons segregate into eye-specific layers, on/off sublayers, and precise retinotopic maps. This dissertation describes a formal computational framework for modeling and exploring the activity-dependent development of the retinogeniculate projection. The model is the first to support the development of layers, sublayers, and retinotopy in a unified framework. The model is constructed so as to be directly biologically interpretable and predictive. 
It refines based on realistic patterns of wave activity, retinal axon arbor change, and Hebbian synaptic weight change. In addition, the model is relatively tractable to formal analysis. This tractability makes the model relatively undemanding to simulate computationally and provides analytic insight into the dynamics of the model refinement. Several experimental predictions that follow directly from the model are described. From terry at salk.edu Fri Dec 11 22:57:32 1998 From: terry at salk.edu (Terry Sejnowski) Date: Fri, 11 Dec 1998 19:57:32 -0800 (PST) Subject: UCSD Computational Neurobiology Message-ID: <199812120357.TAA27939@helmholtz.salk.edu> Computational Neurobiology Graduate Program Department of Biology -- University of California, San Diego The Computational Neurobiology Graduate Program at UCSD will provide students with rigorous training in neuroscience, including experimental methods and modern mathematical methods to analyze and visualize data, as well as theoretical approaches to neuronal dynamics and computation. Candidates from a wide range of backgrounds are invited to apply, including Biology, Psychology, Computer Science, Physics and Mathematics. All students are expected to master a set of core courses to ensure a common body of knowledge and a common language. All students will take the "Advanced Neurobiology Laboratory", which presents state-of-the-art imaging and electrophysiological techniques, modern techniques in genetic and viral transformation for the study of neuronal function, and modern statistical and spectral methods for data analysis, and "Advanced Computational Methods and Dynamic Systems Theory", which includes training in nonlinear dynamics of single cells, the analysis of regularly spiking and bursting cells, as well as reduced models and their representation in phase space. 
Requests for application materials should be sent to the Graduate Admissions Office, Department of Biology 0348, 9500 Gilman Drive, UCSD, La Jolla, CA, 92093-0348: [gradprog at biology.ucsd.edu]. An initial pre-application form will be sent; applicants should indicate their interest in the Computational Neurobiology Graduate Program on this form. These forms will be screened and application forms will be sent to appropriate candidates. The deadline for completed application materials, including letters of reference, is January 8, 1999. More information about applying to the UCSD Biology Graduate Program: http://www-biology.ucsd.edu/sa/Admissions.html. The Biology Department home page is located at: http://www-biology.ucsd.edu/ Other inquiries about the Computational Neurobiology Graduate Program should be directed to: Terrence Sejnowski Institute for Neural Computation 0523 University of California, San Diego La Jolla, CA 92093 [tsejnowski at ucsd.edu]. Participating Faculty include: Henry Abarbanel (Physics): Nonlinear and oscillatory dynamics; modeling central pattern generators in the lobster stomatogastric ganglion. 
Thomas Albright (Salk Institute): Motion processing in primate visual cortex; linking the responses of single neurons to perception; functional Magnetic Resonance Imaging (fMRI) in awake, behaving monkeys; Darwin Berg (Biology): Regulation of synaptic components of neurons; how neurons become committed to synthesizing specific synaptic components, how the components are assembled and localized in the synaptic membrane, and how the function and long-term stability of the components are controlled; Mark Ellisman (Neurosciences): High resolution anatomy using electron microscopy and light microscopy; computational procedures for anatomical reconstructions; Robert Hecht-Nielsen (Electrical and Computer Engineering): Neural computation and the functional organization of the cerebral cortex; founder of Hecht-Nielsen Corporation; Harvey Karten (Neurosciences): Visual system function and organization; anatomical, physiological and computational studies of the retina and optic tectum of birds and squirrels; David Kleinfeld (Physics): Collective properties of neuronal assemblies; optical recording of electrical activity in cortex; analysis of large-scale activity in nervous systems; William Kristan (Biology): Neuroethology of the leech; functional and developmental studies of the leech nervous system, including computational studies of the bending reflex and locomotion; Herbert Levine (Physics): Nonlinear dynamics and pattern formation in physical and biological systems, including cardiac dynamics and the growth and form of bacterial colonies; Mu-ming Poo (Biology): Mechanisms for synaptic plasticity; synaptic learning rules underlying developmental plasticity and learning in nervous systems; development of sensory maps in lower vertebrate visual systems. 
Terrence Sejnowski (Salk Institute/Biology): Computational neurobiology; detailed biophysical and large-scale network models of nervous systems; physiological studies of neuronal reliability and synaptic mechanisms; Michael Rabinovich (Institute for Nonlinear Studies): Analysis of neural dynamics in the stomatogastric ganglion of the lobster and the olfactory system of insects. Martin Sereno (Cognitive Science): Organization of the visual system in primates and squirrels; computer models of neural systems and development of new techniques for studying human cognition with functional magnetic resonance imaging (fMRI). Nicholas Spitzer (Biology): Regulation of ionic channels and neurotransmitters in neurons; effects of electrical activity in developing neurons on neural function; Charles Stevens (Salk Institute): Synaptic physiology; physiological studies and biophysical models of synaptic plasticity in hippocampal neurons. Roger Tsien (Chemistry): Second messenger systems in neurons; development of new optical and MRI probes of neuron function, including calcium indicators and caged neurotransmitters. Mark Whitehead (Neurosurgery): Peripheral and central taste systems; anatomical and functional studies of regions in the caudal brainstem important for feeding behavior. Ruth Williams (Mathematics): Probability theory, stochastic processes and their applications, including learning in stochastic networks. Kent Wilson (Chemistry): Multi-photon techniques in scanning optical microscopy of living biological systems; multi-sensual use of computer visualization, sound and touch as tools in research and education. From tononi at nsi.edu Sat Dec 12 18:15:20 1998 From: tononi at nsi.edu (Giulio Tononi) Date: Sat, 12 Dec 1998 15:15:20 -0800 Subject: positions open Message-ID: <000001be2625$49c8e2a0$1bb985c6@spud.nsi.edu> THE NEUROSCIENCES INSTITUTE, SAN DIEGO The Neurosciences Institute is an independent, not-for-profit organization at the forefront of research on the brain. 
Research at the Institute spans levels from the molecular to the behavioral and from the computational to the cognitive. The Institute has a strong tradition in theoretical neurobiology and has recently established new experimental facilities. The Institute is also the home of the Neurosciences Research Program and serves as an international meeting place for neuroscientists. JUNIOR FELLOW, NEURAL BASIS OF CONSCIOUSNESS. The Institute has a strong tradition in the theoretical and experimental study of consciousness (see Science, 282:1846-1851). Applications are invited for positions as Junior Fellows to collaborate on experimental and theoretical studies of the neural correlates of conscious perception. Applicants should be at the Postdoctoral level with strong backgrounds in cognitive neuroscience, neuroimaging (including MEG, EEG, and fMRI), and theoretical neurobiology. JUNIOR FELLOW IN THEORETICAL NEUROBIOLOGY. Applications are invited for positions as Junior Fellows in Theoretical Neurobiology. Since 1987, the Institute has had a research program dedicated to developing biologically based, experimentally testable theoretical models of neural systems. Current projects include large-scale simulations of neuronal networks and the analysis of functional interactions among brain areas using information-theoretical approaches. Advanced computing facilities are available. Applicants should be at the Postdoctoral level with strong backgrounds in mathematics, statistics, and computer modeling. JUNIOR FELLOW IN EXPERIMENTAL NEUROBIOLOGY. Applications are invited for positions as Junior Fellow in Experimental Neurobiology. A current focus of the Institute is on the action and pharmacological manipulation of neuromodulatory systems with diffuse projections, such as the noradrenergic, serotoninergic, cholinergic, dopaminergic, and histaminergic systems. Another focus is on behavioral state control and the functions of sleep. 
Applicants should be at the Postdoctoral level with strong backgrounds in the above-mentioned areas. Fellows receive stipends and research support commensurate with qualifications and experience. Positions are now available. Applications for all positions listed should contain a short statement of research interests, a curriculum vitae, and the names of three references and should be sent to: Giulio Tononi, The Neurosciences Institute, 10640 John Jay Hopkins Drive, San Diego, California 92121; Email: tononi at nsi.edu; URL: http:// www.nsi.edu. From steve at cns.bu.edu Sun Dec 13 07:30:40 1998 From: steve at cns.bu.edu (Stephen Grossberg) Date: Sun, 13 Dec 1998 08:30:40 -0400 Subject: The Link Between Brain Learning, Attention, and Consciousness Message-ID: The following article can be accessed at http://cns-web.bu.edu/Profiles/Grossberg Paper copies can also be gotten by writing Ms. Diana Myers, Department of Cognitive and Neural Systems, Boston University, 677 Beacon Street, Boston, MA 02215 or diana at cns.bu.edu. Grossberg, S. (1998). The link between brain learning, attention, and consciousness. Consciousness and Cognition, in press. Preliminary version as Boston University Technical Report, CAS/CNS-TR-97-018. Available in gzip'ed postscript (170Kb). Abstract: The processes whereby our brains continue to learn about a changing world in a stable fashion throughout life are proposed to lead to conscious experiences. These processes include the learning of top-down expectations, the matching of these expectations against bottom-up data, the focusing of attention upon the expected clusters of information, and the development of resonant states between bottom-up and top-down processes as they reach an attentive consensus between what is expected and what is there in the outside world. It is suggested that all conscious states in the brain are resonant states, and that these resonant states trigger learning of sensory and cognitive representations. 
The models which summarize these concepts are therefore called Adaptive Resonance Theory, or ART, models. Psychophysical and neurobiological data in support of ART are presented from early vision, visual object recognition, auditory streaming, variable-rate speech perception, somatosensory perception, and cognitive-emotional interactions, among others. It is noted that ART mechanisms seem to be operative at all levels of the visual system, and it is proposed how these mechanisms are realized by known laminar circuits of visual cortex. It is predicted that the same circuit realization of ART mechanisms will be found in the laminar circuits of all sensory and cognitive neocortex. Concepts and data are summarized concerning how some visual percepts may be visibly, or modally, perceived, whereas amodal percepts may be consciously recognized even though they are perceptually invisible. It is also suggested that sensory and cognitive processing in the What processing stream of the brain obey top-down matching and learning laws that are often complementary to those used for spatial and motor processing in the brain's Where processing stream. This enables our sensory and cognitive representations to maintain their stability as we learn more about the world, while allowing spatial and motor representations to forget learned maps and gains that are no longer appropriate as our bodies develop and grow from infanthood to adulthood. Procedural memories are proposed to be unconscious because the inhibitory matching process that supports these spatial and motor processes cannot lead to resonance. 
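The match-and-reset cycle summarized in the abstract (bottom-up input, top-down expectation, attentive matching, resonance or reset) is easiest to see in the simplest member of the ART family, binary ART-1. Below is a heavily simplified illustrative sketch, not Grossberg's full model; `rho` is the vigilance parameter, and all function and variable names are our own:

```python
# Minimal binary ART-1 sketch -- an illustrative simplification, not the
# full ART model described above.  Inputs are nonzero binary vectors.
# The vigilance parameter rho sets how closely the learned top-down
# expectation must match the bottom-up input before resonance (and
# hence learning) is allowed; otherwise the category is reset.

def art1(inputs, rho=0.7, beta=1.0):
    protos = []   # one learned top-down expectation (prototype) per category
    labels = []
    for x in inputs:
        # Bottom-up choice: rank categories by overlap, normalised by size.
        order = sorted(range(len(protos)),
                       key=lambda j: -sum(a & b for a, b in zip(x, protos[j]))
                                      / (beta + sum(protos[j])))
        chosen = None
        for j in order:
            match = [a & b for a, b in zip(x, protos[j])]
            if sum(match) / sum(x) >= rho:   # vigilance (matching) test
                protos[j] = match            # resonance: learn matched pattern
                chosen = j
                break
        if chosen is None:                   # mismatch reset: new category
            protos.append(list(x))
            chosen = len(protos) - 1
        labels.append(chosen)
    return labels, protos

labels, protos = art1([[1, 1, 1, 0, 0],
                       [1, 1, 0, 0, 0],
                       [0, 0, 0, 1, 1]])
# The first two inputs resonate with one category (whose expectation is
# pruned to their intersection); the third triggers a reset and a new one.
```

Note how stability emerges: a category's expectation only ever shrinks toward the intersection of the patterns it has resonated with, so previously learned codes are not overwritten by dissimilar inputs.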
From robtag at dia.unisa.it Mon Dec 14 11:27:49 1998 From: robtag at dia.unisa.it (Tagliaferri Roberto) Date: Mon, 14 Dec 1998 17:27:49 +0100 Subject: WIRN 99 call for paper First announcement Message-ID: <9812141627.AA29612@udsab> ***************** CALL FOR PAPERS ***************** The 11-th Italian Workshop on Neural Nets WIRN VIETRI-99 May 20-22, 1999 Vietri Sul Mare, Salerno ITALY **************** FIRST ANNOUNCEMENT ***************** Organizing - Scientific Committee -------------------------------------------------- B. Apolloni (Univ. Milano) A. Bertoni ( Univ. Milano) N. A. Borghese ( CNR Milano) D. D. Caviglia ( Univ. Genova) P. Campadelli ( Univ. Milano) A. Colla (ELSAG Genova) A. Esposito ( I.I.A.S.S.) M. Frixione ( Univ. Salerno) C. Furlanello (ITC-IRST Trento) G. M. Guazzo ( I.I.A.S.S.) M. Gori ( Univ. Siena) F. Lauria ( Univ. Napoli) M. Marinaro ( Univ. Salerno) F. Masulli (Univ. Genova) C. Morabito ( Univ. Reggio Calabria) P. Morasso (Univ. Genova) G. Orlandi ( Univ. Roma) T. Parisini ( Politecnico Milano) E. Pasero ( Politecnico Torino) A. Petrosino ( I.I.A.S.S.) V. Piuri ( Politecnico Milano) M. Protasi ( Univ. Roma II) S. Rampone ( Univ. Sannio) R. Serra ( Centro Ricerche Ambientali Montecatini Ravenna) F. Sorbello ( Univ. Palermo) R. Tagliaferri ( Univ. Salerno) Topics ---------------------------------------------------- Mathematical Models Architectures and Algorithms Hardware and Software Design Hybrid Systems Pattern Recognition and Signal Processing Industrial and Commercial Applications Fuzzy Tecniques for Neural Networks Schedule ----------------------- Papers Due: January 31, 1999 Replies to Authors: March 31, 1999 Revised Papers Due: May 22, 1999 Sponsors ------------------------------------------------------------------------------ International Institute for Advanced Scientific Studies (IIASS) Dept. of Scienze Fisiche "E.R. Caianiello", University of Salerno Dept. of Matematica ed Informatica, University of Salerno Dept. 
of Scienze dell'Informazione, University of Milano Societa' Italiana Reti Neuroniche (SIREN) IEEE Neural Network Council INNS/SIG Italy Istituto Italiano per gli Studi Filosofici, Napoli The 11-th Italian Workshop on Neural Nets (WIRN VIETRI-99) will take place in Vietri Sul Mare, Salerno ITALY, May 20-22, 1999. The conference will bring together scientists who are studying several topics related to neural networks. The three-day conference, to be held in the I.I.A.S.S., will feature both introductory tutorials and original, refereed papers, to be published by an international publisher. In the 1999 edition there will be a special session, including tutorials, on Neural Networks in Economics. The official languages are Italian and English, but papers must be in English. Papers should be 6 pages, including title, figures, tables, and bibliography. The accompanying letter should give keywords, postal and electronic mailing addresses, and telephone and FAX numbers, indicating oral or poster presentation. Submit 3 copies and a 1-page abstract (containing keywords, postal and electronic mailing addresses, telephone and FAX numbers, and no more than 300 words) to the address shown (WIRN 99 c/o IIASS). An electronic copy of the abstract should be sent to the E-mail address below. Papers must be prepared in the camera-ready format of Springer, which can be retrieved by anonymous ftp from ftp.tex.ac.uk or from the SIREN www site: wicsadv.org wicsadv.tex wicsadv1.tex (file modified to insert the format instruction inside) wicsbook.org wicsbook.sty; authors who do not use LaTeX can retrieve the information from the SIREN www site. The publication of the proceedings is "under negotiation" with Springer. During the Workshop the "Premio E.R. Caianiello" will be awarded to the best Ph.D. thesis by an Italian researcher in the area of Neural Nets and related fields. The prize amounts to 2,000,000 Italian Lire. 
Interested researchers (who obtained the Ph.D. degree between 1996 and February 28, 1999) must send 3 copies of a c.v. and of the thesis to "Premio Caianiello" WIRN 99 c/o IIASS before February 28, 1999. It is possible to compete for the prize at most twice. For more information, contact the Secretary of I.I.A.S.S.: I.I.A.S.S., Via G. Pellegrino, 19, 84019 Vietri Sul Mare (SA) ITALY Tel. +39 89 761167 Fax +39 89 761189 E-Mail robtag at udsab.dia.unisa.it or the SIREN www pages at the address below: http://www-dsi.ing.unifi.it/neural ***************************************************************** From magnus at cs.man.ac.uk Mon Dec 14 12:14:27 1998 From: magnus at cs.man.ac.uk (Magnus Rattray) Date: Mon, 14 Dec 1998 17:14:27 +0000 Subject: PhD Studentship Message-ID: <36754773.148E1EB5@cs.man.ac.uk> --------------------------------------------------------------------- PhD studentship: Statistical Mechanics Analysis of Natural Gradient Learning --------------------------------------------------------------------- Applications are sought for a three-year PhD position to study natural gradient learning using the methods of statistical mechanics and stochastic dynamical systems. The position will be supported by an EPSRC studentship and based in the Computer Science Department at Manchester University, which is one of the largest and most successful computer science departments in the UK. Living expenses will be paid according to current EPSRC rates (19635 pounds over three years), with substantial extra funding available for participation in international conferences and workshops. Project description: Natural gradient learning was recently introduced as a principled algorithm for determining the parameters of a statistical model on-line. The algorithm has been applied to feed-forward neural networks, independent component analysis and deconvolution algorithms, often providing much improved performance over existing methods. 
The algorithm uses an underlying Riemannian parameter space to re-define the direction of steepest descent and respects certain invariances which should be observed by any consistent algorithm. Natural gradient learning is known to provide optimal asymptotic performance under certain restricted conditions but a good general understanding of the non-asymptotic learning performance is not yet available. This is really the regime which we expect to dominate the learning time and recent work by the project supervisor and co-workers [1,2] provides some quantification of the advantage which can be expected over other algorithms. This analysis involves a statistical mechanics formalism which allows an exact solution to learning dynamics for learning in a feed-forward neural network. The proposed project will build on these initial results in order to characterize the behaviour of natural gradient learning with greater generality. The project will also explore other applications of information geometry to probabilistic modelling. This project will touch on many interesting mathematical topics (information theory, differential geometry, statistical mechanics and stochastic dynamical systems) and application areas (optimization, neural networks, probabilistic modelling). Prospective candidates would ideally be interested in a number of these topics. A good first degree in physics, mathematics or a related subject is required. Contact: Magnus Rattray (magnus at cs.man.ac.uk) Computer Science Department, University of Manchester, Manchester M13 9PL, UK. Tel +44 161 275 6187. http://www.cs.man.ac.uk/~magnus/magnus.html References: [1] M Rattray, D Saad, S Amari, "Natural Gradient Descent for On-line Learning", Physical Review Letters 81, p5461 (1998). [2] M Rattray, D Saad, "Transients and Asymptotics of Natural Gradient Learning", Proceeding of ICANN 98, edited by L Niklasson, M Boden and T Ziemke (Springer-Verlag, London), p165 (1998). 
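The natural gradient idea described in the project can be made concrete in a toy setting. The sketch below is our own illustration, not the algorithm analysed in refs [1,2]: it fits the mean and variance of a Gaussian online, parameterised as (mu, s) with sigma = exp(s). For this two-parameter model the Fisher information matrix is diagonal, F = diag(1/sigma^2, 2), so premultiplying the ordinary gradient by F^{-1} is just a coordinatewise rescaling:

```python
import math
import random

# Toy natural-gradient sketch (our illustration, not the studentship's
# algorithm): fit a Gaussian N(mu, sigma^2) online, parameterised as
# (mu, s) with sigma = exp(s).  Per-sample log-likelihood gradients:
#   d log p / d mu = (x - mu) / sigma^2
#   d log p / d s  = -1 + (x - mu)^2 / sigma^2
# and the Fisher information is F = diag(1/sigma^2, 2), so the natural
# gradient rescales each coordinate by the inverse Fisher entry.

def natural_gradient_fit(data, eta=0.02, mu=0.0, s=0.0):
    for x in data:
        sigma2 = math.exp(2 * s)
        g_mu = (x - mu) / sigma2                 # ordinary gradient, mu
        g_s = -1.0 + (x - mu) ** 2 / sigma2      # ordinary gradient, s
        mu += eta * sigma2 * g_mu                # F^-1 scaling: * sigma^2
        s += eta * g_s / 2.0                     # F^-1 scaling: / 2
    return mu, math.exp(s)

random.seed(0)
data = [random.gauss(3.0, 1.5) for _ in range(5000)]
mu, sigma = natural_gradient_fit(data)           # approaches (3.0, 1.5)
```

Note the invariance the text alludes to: after the Fisher rescaling, the update for mu becomes eta * (x - mu), independent of the current variance estimate, whereas the ordinary gradient step would be scaled by an arbitrary factor 1/sigma^2 depending on how the model happens to be parameterised.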
From jordan at CS.Berkeley.EDU Mon Dec 14 22:35:43 1998 From: jordan at CS.Berkeley.EDU (Michael Jordan) Date: Mon, 14 Dec 1998 19:35:43 -0800 (PST) Subject: faculty position in Statistics at UC Berkeley Message-ID: <199812150335.TAA00633@orvieto.CS.Berkeley.EDU> The enclosed announcement of a faculty position in Statistics at the University of California at Berkeley may be of interest to those of you whose research is statistics-oriented, either theoretical or applied. Mike Jordan ---------------------------------------------------------------------------- Announcement of a faculty position in the Department of Statistics University of California at Berkeley Applications are invited for tenured/tenure-track rank faculty position to begin 7/1/99. We will consider strong candidates in any area of theoretical and applied statistics, probability, and applied probability theory. The department is particularly interested in hearing from suitable qualified women or members of minorities currently under-represented in faculty positions. Send applications or inquiries (including resume and three names of references) by 1/19/99 to: Chair, University of California, Berkeley, Department of Statistics, 367 Evans Hall #3860, Berkeley, CA 94720-3860, Fax: (510) 642-7892; E-mail: recruit at stat.berkeley.edu. The University of California is an Affirmative Action/Equal Opportunity Employer. (see http://www.stat.berkeley.edu for additional information) From oreilly at grey.colorado.edu Tue Dec 15 16:58:22 1998 From: oreilly at grey.colorado.edu (Randall C. 
O'Reilly) Date: Tue, 15 Dec 1998 14:58:22 -0700 Subject: Graduate and Postdoctoral Training @ CU & DU Message-ID: <199812152158.OAA13863@grey.colorado.edu> Computational Cognitive Neuroscience University of Colorado Boulder and University of Denver Graduate and Postdoctoral Training Opportunities This is an invitation to apply for graduate and postdoctoral training in cognitive neuroscience and/or computational cognitive neuroscience at the University of Colorado Boulder (CU) Departments of Psychology and Computer Science (integrated with the Institute for Cognitive Science, ICS), and the University of Denver (DU) Department of Psychology and program in Developmental Cognitive Neuroscience (DCN). CU and DU have a strong common interest in computational cognitive neuroscience, and researchers in both universities are funded by the NSF, NIH, NIDCD, and the McDonnell Foundation to support both graduate students and postdoctoral researchers in cognitive neuroscience. Close collaborations exist across the two campuses. A major focus of interest here is in understanding working memory and the cognitive role of the prefrontal cortex (PFC). We apply converging cognitive neuroscience methodologies, including computational (neural network models), behavioral, developmental, and neuropsychological approaches. Other topics of interest include executive function, language, learning and memory and the roles of the hippocampus and the cortex, developmental dissociations and task-dependent behaviors, attention, and invariant object recognition. We have a strong nucleus of cognitive neuroscientists focused on developing mechanistic, computational frameworks for understanding how the brain performs cognitive functions, and exploring these ideas using a wide range of empirical methods. This environment provides a unique opportunity to interact closely with active scientists exploring problems at the forefront of cognitive neuroscience. 
The departments at CU and DU are highly regarded, and have a strong international reputation for high quality scientific research and training. The DU psych department is ranked No. 2 in the world in publication impact, and the CU psych department is consistently in the top 20 of the US News & World Report rankings. In addition to providing an exciting research environment and hosting the annual Neural Information Processing Systems conference, the greater Denver/Boulder area offers an exceptional quality of life. Spectacularly situated at the eastern edge of the Rockies, this area provides a wide variety of extraordinary outdoor activities, an average of 330 sunny days per year, and also affords a broad range of cultural activities. Graduate students should apply to the most appropriate department for their specific interests. Deadlines are Jan 1 for CU and Jan 15 for DU. For more information, full lists of associated faculty, and instructions on applying to the graduate programs, see the following web sites: CU Overview Web Page: http://www.cs.colorado.edu/~mozer/resgroup.html CU Psychology: http://psych-www.colorado.edu/ CU Computer Science: http://www.cs.colorado.edu/ CU ICS: http://psych-www.colorado.edu/ics/home.html DU Psychology: http://www.du.edu/psychology/ DU DCN: http://www.du.edu/psychology/DCNWHOLE.htm Postdoctoral applications should include a CV, representative publications, and a statement of research interests, and should be sent to the most appropriate of the following faculty member(s) listed below. Postdoc funding is available now, and applications will be considered until the positions are filled. 
Akira Miyake, CU Psych, miyake at psych.colorado.edu, http://psych-www.colorado.edu/faculty/miyake.html Michael Mozer, CU CS, mozer at cs.colorado.edu, http://www.cs.colorado.edu/~mozer/Home.html Yuko Munakata, DU Psych, munakata at kore.psy.du.edu, http://kore.psy.du.edu/munakata Randall O'Reilly, CU Psych, oreilly at psych.colorado.edu, http://psych.colorado.edu/~oreilly One or more of the above faculty should be contacted for any further information. From rrojas at ICSI.Berkeley.EDU Mon Dec 14 18:38:17 1998 From: rrojas at ICSI.Berkeley.EDU (Raul Rojas) Date: Mon, 14 Dec 1998 15:38:17 -0800 Subject: call for contributions on handwriting recognition Message-ID: <3675A169.BCE597F@icsi.berkeley.edu> [ Moderator's note: I don't usually accept calls for papers in some generic application area for a journal that is not primarily neural nets-oriented. But since neural networks are used so extensively in handwriting recognition research, this particular call for papers is an exception. -- Dave Touretzky, CONNECTIONISTS moderator ] ================================================================ Handwriting Recognition The journal "Kuenstliche Intelligenz" (Artificial Intelligence), organ of the German SIG on AI, will publish a special issue on handwriting recognition during 1999. We are looking for additional original contributions on the following topics: - on-line and off-line handwriting recognition - applications of handwriting recognition - handwriting recognition for PDAs - image processing for handwriting recognition We are looking for papers at an expository level and papers describing ongoing projects. Language: English or German Extension: 5-6 pages, including figures. Deadline: January 31, 1999. Short project summaries (a half-page with pointers to on-line materials) will be published together in a special section. There are no special format requirements, since we will reformat the papers using the source file. Files in LaTeX or MS-Word are acceptable. 
Electronic submission is encouraged. Guest editor: Raul Rojas Intl. Computer Science Institute 1947 Center St. Berkeley, CA 94704-1198 rrojas at icsi.berkeley.edu From biehl at physik.uni-wuerzburg.de Wed Dec 16 08:53:41 1998 From: biehl at physik.uni-wuerzburg.de (Michael Biehl) Date: Wed, 16 Dec 1998 14:53:41 +0100 (MET) Subject: three preprints available Message-ID: <199812161353.OAA12986@wptx38.physik.uni-wuerzburg.de> FTP-host: ftp.physik.uni-wuerzburg.de FTP-filename: /pub/preprint/1998/WUE-ITP-98-049.ps.gz FTP-filename: /pub/preprint/1998/WUE-ITP-98-055.ps.gz FTP-filename: /pub/preprint/1998/WUE-ITP-98-057.ps.gz The following (three) manuscripts are now available via anonymous ftp, see below for the retrieval procedure. More conveniently, they can be obtained from the Wuerzburg Theoretical Physics preprint server in the WWW: http://theorie.physik.uni-wuerzburg.de/~publications.shtml ------------------------------------------------------------------ 1) Ref. WUE-ITP-98-049 Receiver Operating Characteristics of Perceptrons: Influence of Sample Size and Prevalence A. Freking, M. Biehl, C. Braun, W. Kinzel, and M. Meesmann ABSTRACT In many practical classification problems it is important to distinguish false positive from false negative results when evaluating the performance of the classifier. This is of particular importance for medical diagnostic tests. In this context, receiver operating characteristic (ROC) curves have become a standard tool. Here we apply this concept to characterize the performance of a simple neural network. Investigating the binary classification of a perceptron we calculate analytically the shape of the corresponding ROC curves. The influence of the size of the training set and the prevalence of the quality considered are studied by means of a statistical-mechanics analysis. ------------------------------------------------------------------ 2) Ref. WUE-ITP-98-055 Optimisation of on-line principal component analysis E. Schl"osser, D. 
Saad, and M. Biehl ABSTRACT Various techniques used to optimise on-line principal component analysis are investigated by methods of statistical mechanics. These include local and global optimisation of node-dependent learning rates, which are shown to be very efficient in speeding up the learning process. They are investigated further for gaining insight into the learning rates' time-dependence, which is then employed for devising simple practical methods to improve training performance. Simulations demonstrate the benefit gained from using the new methods. ------------------------------------------------------------------- 3) Ref. WUE-ITP-98-057 Statistical physics and practical training of soft-committee machines M. Ahr, M. Biehl, and R. Urbanczik ABSTRACT Equilibrium states of large layered neural networks with differentiable activation function and a single, linear output unit are investigated using the replica formalism. The quenched free energy of a student network with a very large number of hidden units learning a rule of perfectly matching complexity is calculated analytically. The system undergoes a first order phase transition from unspecialized to specialized student configurations at a critical size of the training set. Computer simulations of learning by stochastic gradient descent from a fixed training set demonstrate that the equilibrium results describe quantitatively the plateau states which occur in practical training procedures at sufficiently small but finite learning rates. ------------------------------------------------------------------- ___________________________________________________________________ Retrieval procedure via anonymous ftp: unix> ftp ftp.physik.uni-wuerzburg.de Name: anonymous Password: {your e-mail address} ftp> cd pub/preprint/1998 ftp> binary ftp> get WUE-ITP-98-XXX.ps.gz (*) ftp> quit unix> gunzip WUE-ITP-98-XXX.ps.gz e.g. unix> lp -odouble WUE-ITP-98-XXX.ps (*) can be replaced by "get WUE-ITP-98-XXX.ps". 
The file will then be uncompressed before transmission (slow!). ___________________________________________________________________ Michael Biehl Institut fuer Theoretische Physik Julius-Maximilians-Universitaet Wuerzburg Am Hubland D-97074 Wuerzburg email: biehl at physik.uni-wuerzburg.de www: http://theorie.physik.uni-wuerzburg.de/~biehl Tel.: (+49) (0)931 888 5865 / 5131 Fax: (+49) (0)931 888 5141 From pelillo at dsi.unive.it Wed Dec 16 09:50:19 1998 From: pelillo at dsi.unive.it (Marcello Pelillo) Date: Wed, 16 Dec 1998 15:50:19 +0100 (MET) Subject: A Neural Computation paper on graph isomorphism Message-ID: The following paper, accepted for publication in Neural Computation, is accessible at the following www site: http://www.dsi.unive.it/~pelillo/papers/nc98.ps.gz A shorter version of it has just been presented at NIPS*98, and can be accessed at: http://www.dsi.unive.it/~pelillo/papers/nips.ps.gz (files are gzipped postscripts) Comments and suggestions are welcome! Best regards, Marcello Pelillo ======================== Replicator Equations, Maximal Cliques, and Graph Isomorphism Marcello Pelillo University of Venice, Italy ABSTRACT We present a new energy-minimization framework for the graph isomorphism problem which is based on an equivalent maximum clique formulation. The approach is centered around a fundamental result proved by Motzkin and Straus in the mid-1960s, and recently expanded in various ways, which allows us to formulate the maximum clique problem in terms of a standard quadratic program. The attractive feature of this formulation is that a clear one-to-one correspondence exists between the solutions of the quadratic program and those in the original, combinatorial problem. To solve the program we use the so-called ``replicator'' equations, a class of straightforward continuous- and discrete-time dynamical systems developed in various branches of theoretical biology. 
We show how, despite their inherent inability to escape from local solutions, they nevertheless provide experimental results which are competitive with those obtained using more elaborate mean-field annealing heuristics. ==================== ________________________________________________________________________ Marcello Pelillo Dipartimento di Informatica Universita' Ca' Foscari di Venezia Via Torino 155, 30172 Venezia Mestre, Italy Tel: (39) 41 2908.440 Fax: (39) 41 2908.419 E-mail: pelillo at dsi.unive.it URL: http://www.dsi.unive.it/~pelillo From danr at cs.uiuc.edu Wed Dec 16 15:21:54 1998 From: danr at cs.uiuc.edu (Dan Roth) Date: Wed, 16 Dec 1998 14:21:54 -0600 Subject: postdoctoral fellowship Message-ID: <36781662.C6648EC2@cs.uiuc.edu> Postdoctoral Fellows Program Beckman Institute for Advanced Science and Technology at the University of Illinois at Urbana-Champaign. This is an invitation to apply for postdoctoral fellowships at the Beckman Institute for Advanced Science and Technology at the University of Illinois at Urbana-Champaign. The Beckman Institute is an inter- and multidisciplinary research institute devoted to basic research in the physical sciences and engineering, and in the life and behavioral sciences. Its primary mission is to foster interdisciplinary work of the highest quality in an environment that transcends many of the limitations inherent in traditional university organizations and structures. Research at the Institute focuses on three broadly defined themes: biological intelligence, human-computer intelligent interaction, and molecular and electronic nanostructures. Eighteen research groups, composed of faculty and students from sixteen UIUC departments, work within and across these three areas. 
Many research areas that are of interest to readers of this list are relevant to Beckman Institute research in general and to the postdoctoral fellows program in particular. All aspects of the learning sciences (Computational, Biological, etc.), Natural Language (Computational aspects, Psycholinguistics), and many other topics in Biological and Artificial Intelligence are of interest. Please consult the Beckman Institute web page at http://www.beckman.uiuc.edu/ and the Beckman fellows program at http://www.beckman.uiuc.edu/outreach/fellowshome.html If interested, don't hesitate to contact me directly for further information. Note that the application deadline is January 8. Dan ---------------------------------------------------------------------- Dan Roth Department of Computer Science, University of Illinois, Urbana/Champaign 1304 W. Springfield Ave. Urbana IL 61801 Phone: (217) 244-7068 (217) 244-6813 (Sec) Fax: +(217) 244-6500 e-mail: danr at cs.uiuc.edu http://L2R.cs.uiuc.edu/~danr ---------------------------------------------------------------------- From Michael.Haft at mchp.siemens.de Thu Dec 17 04:39:55 1998 From: Michael.Haft at mchp.siemens.de (Michael Haft) Date: Thu, 17 Dec 1998 10:39:55 +0100 Subject: Paper available: Robust, `Topological' Codes ... Message-ID: <3678D16B.78C35184@mchp.siemens.de> The following paper recently appeared in Phys. Rev. Letters (Vol.81, Nr.18, Nov. 1998, 4016-4019): Robust, `Topological' Codes by Keeping Control of Internal Redundancy M. Haft Processing information under noisy conditions demands finding a tradeoff between coding a variety of different information and robust, {\em redundant} coding of important information. We illustrate this information-theoretic demand with some simple considerations. Following this, we set up information-theoretically plausible learning rules for a self-organizing network. 
Thereby, internal {\em redundancy} is controlled via anti-Hebbian learning based on an internal topology with a given correlation function. For a finite correlation length, and thus a finite amount of redundancy, a map-like representation of sensory information is shown to emerge. The original manuscript is available from ftp://flop.informatik.tu-muenchen.de/pub/hofmannr/topoCodes.ps.gz ------------------------------------------------------------------------ Dr. Michael Haft Siemens AG, ZT IK 4 Otto-Hahn-Ring 6, 81730 Muenchen Tel.: +49/89/636-47953 Fax.: +49/89/636-49767 email: Michael.Haft at mchp.siemens.de ------------------------------------------------------------------------ From psollich at mth.kcl.ac.uk Fri Dec 18 08:56:14 1998 From: psollich at mth.kcl.ac.uk (Peter Sollich) Date: Fri, 18 Dec 1998 13:56:14 +0000 (GMT) Subject: Papers on Gaussian processes and online learning Message-ID: Dear Connectionists, the following two papers, which I hope may be of interest, are now available from my web pages: -------------------------------------------------------------------------- Peter Sollich Learning curves for Gaussian processes http://www.mth.kcl.ac.uk/~psollich/papers/GaussianProcLearningCurveNIPSIX.ps.gz (or /~psollich/papers_uncompressed/GaussianProcLearningCurveNIPSIX.ps) I consider the problem of calculating learning curves (i.e., average generalization performance) of Gaussian processes used for regression. A simple expression for the generalization error in terms of the eigenvalue decomposition of the covariance function is derived, and used as the starting point for several approximation schemes. I identify where these become exact, and compare with existing bounds on learning curves; the new approximations, which can be used for any input space dimension, generally get substantially closer to the truth. (In M J Kearns, S A Solla, and D Cohn, editors, Advances in Neural Information Processing Systems 11, Cambridge, MA. 
MIT Press. In press.) -------------------------------------------------------------------------- H C Rae, P Sollich, and A C C Coolen On-Line Learning with Restricted Training Sets: Exact Solution as Benchmark for General Theories http://www.mth.kcl.ac.uk/~psollich/papers/HebbOnlineNIPSIX.ps.gz (or /~psollich/papers_uncompressed/HebbOnlineNIPSIX.ps) We solve the dynamics of on-line Hebbian learning in perceptrons exactly, for the regime where the size of the training set scales linearly with the number of inputs. We consider both noiseless and noisy teachers. Our calculation cannot be extended to non-Hebbian rules, but the solution provides a nice benchmark to test more general and advanced theories for solving the dynamics of learning with restricted training sets. (In M J Kearns, S A Solla, and D Cohn, editors, Advances in Neural Information Processing Systems 11, Cambridge, MA. MIT Press. In press.) -------------------------------------------------------------------------- Any comments and suggestions are welcome. For papers on related topics, you could also have a look at http://www.mth.kcl.ac.uk/~psollich/publications for my full publications list. Merry Christmas! Peter Sollich -------------------------------------------------------------------------- Peter Sollich Department of Mathematics Phone: +44 - (0)171 - 873 2875 King's College Fax: +44 - (0)171 - 873 2017 University of London E-mail: peter.sollich at kcl.ac.uk Strand WWW: http://www.mth.kcl.ac.uk/~psollich London WC2R 2LS, U.K. 
-------------------------------------------------------------------------- From NKasabov at infoscience.otago.ac.nz Sat Dec 19 20:25:53 1998 From: NKasabov at infoscience.otago.ac.nz (Nik Kasabov) Date: Sun, 20 Dec 1998 14:25:53 +1300 Subject: A new book on neuro-fuzzy computation Message-ID: "Neuro-Fuzzy Techniques for Intelligent Information Systems" Nikola Kasabov and Robert Kozma (eds) January 1999, Physica-Verlag (Springer Verlag), Berlin, Germany ISBN 3-7908-1187-4, DM 178-, fax: +49 (30) 8 2787301, email: orders at springer.de, http://www.springer.de/ P.O.Box 140201, D-14302 Berlin, Germany Abstract This edited volume comprises selected chapters that cover contemporary issues of the development and the application of neuro-fuzzy techniques. Developing and using neural networks, fuzzy logic systems, genetic algorithms and statistical methods as separate techniques, or in their combination, have been research topics in several areas such as Mathematics, Engineering, Computer Science, Physics, Economics and Finance. Here the latest results in this field are presented from both theoretical and practical points of view. The volume has four main parts. Part one presents generic techniques and theoretical issues, while parts two, three and four deal with practically oriented models, systems and implementations. Content: Part 1: Generic Neuro-Fuzzy and Hybrid Techniques Chapter 1. Analysis and Modelling of Complex Systems Using the Self-Organising Map (O.Simula, J.Vesanto, E.Alhoniemi, J.Hollmen) Chapter 2. Fuzzy Methods for Learning from Data (V.Cherkassky) Chapter 3. Uneven Allocation of Membership Functions for Fuzzy Modelling of Multi-Input Systems (K.Tachibana,T.Furuhashi) Chapter 4. Fuzzy Equivalence Relations and Fuzzy Partitions (B.Reusch) Chapter 5. Identifying Fuzzy Rule-Based Models utilising Neural Networks, Fuzzy Logic and Genetic Algorithms (A.Bastian) Chapter 6. 
Neuro-Genetic Information Processing for Optimisation and Adaptation in Intelligent Systems (M.Watts,N.Kasabov) Chapter 7. Evolving Connectionist and Fuzzy Connectionist Systems: Theory and Applications for Adaptive, On-line Intelligent Systems (N.Kasabov) Part 2. Neuro-Fuzzy Systems for Pattern Recognition, Image-, Speech- and Language Processing Chapter 8. Connectionist Approaches for Feature Analysis (N.Pal) Chapter 9. Pattern Classification and Feature Selection by Ellipsoidal Fuzzy Rules (S.Abe) Chapter 10. Printed Chinese Optical Character Recognition by Neural Networks (Y.Wu,M.Zhao) Chapter 11. Image Processing by Chaotic Neural Network Fuzzy Membership Functions (H.Szu,C.Hsu) Chapter 12. Fuzzy Learning Machine with Application to the Detection of Landmarks for Orthodontic Treatment (E.Uchino,T.Yamakawa) Chapter 13. Speech Data Analysis and Recognition Using Fuzzy Neural Networks and Self-Organising Maps (N.Kasabov, R.Kozma, R.Kilgour et al) Chapter 14. Connectionist Methods for Stylometric Analysis: A Hybrid Approach (D.Kassabova,P.Sallis) Part 3. Neuro-Fuzzy Systems for Information Retrieval and Socio-Economic Applications Chapter 15. Soft Information Retrieval: Applications of Fuzzy Set Theory and Neural Networks (F.Crestani,G.Pasi) Chapter 16. Modeling Consensus in Group Decision Making: a Fuzzy Dynamical Approach (Mario Fedrizzi, Michele Fedrizzi, R.A.M. Pereira) Chapter 17. Building Fuzzy Expert Systems (M.Negnevitsky) Chapter 18. A Neural Network for Fuzzy Dynamic Programming and Its Use in Socio-Economic Regional Development Planning (J.Kacprzyk,R.Francelin,F.Gomide) Chapter 19. Investment Maps for Emerging Markets (G.Deboeck) Chapter 20. Adaptive Fuzzy-Impedance Controller for Constrained Robot Motion (P.Petrovich,V.Milacic) Part 4. Specialised Hardware for Neuro-Fuzzy Intelligent Systems Chapter 21. Specialised Hardware for Computational Intelligence (G.Coghill) Chapter 22. Evolvable Hardware - The Coming Hardware Design Method? 
(J.Torrensen) ---------------------------------------------------------------------------- - Have an enjoyable and prosperous New Year! Nik Kasabov ----------------------------------- Professor Dr Nikola Kasabov Department of Information Science University of Otago, P.O.Box 56 Dunedin, New Zealand phone: +64 3 479 8319, fax: +64 3 479 8311 email: nkasabov at otago.ac.nz http://kel.otago.ac.nz/nik/ ----------------------------------- From eppler at hpe.fzk.de Tue Dec 22 05:35:25 1998 From: eppler at hpe.fzk.de (Wolfgang Eppler) Date: Tue, 22 Dec 1998 11:35:25 +0100 Subject: Session at EUFIT '99 Message-ID: <367F75ED.5E839277@hpe.fzk.de> Prof. Gemmeke asked me to announce the following call for papers: --------------------------------------------------------------------- Call for Papers Invited Session "Time-critical Applications with Neural Networks" at EUFIT '99, Aachen, Germany, September 13-16, 1999 The processing power of today's processors like the PENTIUM II or PowerPC is sufficient for most applications of neural networks. Generally, training a neural network takes much more time than recall of the trained knowledge, but for most (industrial) applications training is done only once, so there is no need to accelerate the learning process with additional hardware. On the other hand, some applications of neural networks run under time-critical conditions where the processing power of standard processors is not sufficient. In these cases, response times of a few microseconds are required. Typical applications are found in trigger experiments in high-energy physics or pattern recognition in medical applications. Therefore, special-purpose hardware such as fast DSPs, parallel processors or neural processor chips is used to fulfill the timing requirements. The invited session is concerned with applications that use special-purpose hardware to accelerate neural operations. 
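[Editorial note: the CFP's point that recall is computationally light while training is expensive can be made concrete with a minimal sketch, not taken from the session materials. The network sizes and random weights below are invented placeholders standing in for a trained network; recall is just a couple of matrix-vector products plus elementwise nonlinearities.]

```python
import numpy as np

# Invented layer sizes; the weights stand in for an already-trained network.
rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 64, 32, 4
W1 = rng.standard_normal((n_hidden, n_in))
b1 = rng.standard_normal(n_hidden)
W2 = rng.standard_normal((n_out, n_hidden))
b2 = rng.standard_normal(n_out)

def recall(x):
    """One recall (inference) step: roughly n_in*n_hidden + n_hidden*n_out
    multiply-adds, i.e. a few thousand flops for a network this size."""
    h = np.tanh(W1 @ x + b1)        # hidden layer: matrix-vector product
    return np.tanh(W2 @ h + b2)     # output layer: matrix-vector product

x = rng.standard_normal(n_in)
y = recall(x)
print(y.shape)  # (4,)
```

On a late-1990s CPU this is microseconds of work per pattern; the session's concern is exactly the regime where even this budget, or much larger networks, exceeds what a standard processor can deliver in time.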
Contributions on both PC-based applications using a PC acceleration card and stand-alone applications, e.g. using fast DSPs in combination with microcontrollers, are welcome. Deadline for abstract: January 31, 1999 Deadline for camera-ready paper: March 15, 1999 Address: Thomas Fischer Forschungszentrum Karlsruhe, FZK (Research Centre Karlsruhe) POB 3640 76021 Karlsruhe Germany Please contact: Thomas Fischer, Tel: ++49 7247 82 4042, email: fischer at hpe.fzk.de or Wolfgang Eppler, Tel: ++49 7247 82 5537, email: eppler at hpe.fzk.de --------------------------------------------------------------------- FORSCHUNGSZENTRUM KARLSRUHE Thomas Fischer Department HPE phone: +49 7247 82-4042 P.O. Box 3640, 76021 Karlsruhe, fax: +49 7247 82-3560 GERMANY email: fischer at hpe.fzk.de ---------------------------------------------------------------------- From lsaul at research.att.com Tue Dec 22 14:22:55 1998 From: lsaul at research.att.com (Lawrence K. Saul) Date: Tue, 22 Dec 1998 14:22:55 -0500 (EST) Subject: preprints available Message-ID: <199812221922.OAA14076@octavia.research.att.com> The following preprints are available at http://www.research.att.com/~lsaul. ============================================================================== ATTRACTOR DYNAMICS IN FEEDFORWARD NEURAL NETWORKS L. Saul and M. Jordan We study probabilistic generative models parameterized by feedforward neural networks. An attractor dynamics for probabilistic inference in these models is derived from a mean field approximation for large, layered sigmoidal networks. Fixed points of the dynamics correspond to solutions of the mean field equations, which relate the statistics of each unit to those of its Markov blanket. We establish global convergence of the dynamics by providing a Lyapunov function and show that the dynamics generate the signals required for unsupervised learning. 
Our results for feedforward networks provide a counterpart to those of Cohen-Grossberg and Hopfield for symmetric networks. ============================================================================== MARKOV PROCESSES ON CURVES FOR AUTOMATIC SPEECH RECOGNITION L. Saul and M. Rahim We investigate a probabilistic framework for automatic speech recognition based on the intrinsic geometric properties of curves. In particular, we analyze the setting in which two variables---one continuous (X), one discrete (S)---evolve jointly in time. We suppose that the vector X traces out a smooth multidimensional curve and that the variable S evolves stochastically as a function of the arc length traversed along this curve. Since arc length does not depend on the rate at which a curve is traversed, this gives rise to a family of Markov processes whose predictions, Pr[S|X], are invariant to nonlinear warpings of time. We describe the use of such models, known as Markov processes on curves (MPCs), for automatic speech recognition, where X are acoustic feature trajectories and S are phonetic transcriptions. On two tasks---recognizing New Jersey town names and connected alpha-digits---we find that MPCs yield lower word error rates than comparably trained hidden Markov models. ============================================================================== From jensen at volen.brandeis.edu Wed Dec 23 09:40:39 1998 From: jensen at volen.brandeis.edu (Ole Jensen) Date: Wed, 23 Dec 1998 09:40:39 -0500 (EST) Subject: reprint available: OscillatorSTM model Message-ID: The following reprint is available (PDF) at http://lucifer.ccs.brandeis.edu/~ojensen or http://www.jneurosci.org/current.shtml ------------------------------------------------------------------------- AN OSCILLATORY SHORT-TERM MEMORY BUFFER MODEL CAN ACCOUNT FOR DATA ON THE STERNBERG TASK Ole Jensen and John E. 
Lisman Journal of Neuroscience, 18:10688-10699, 1998 A limited number (7+/-2) of items can be held in human short-term memory (STM). We have previously suggested that observed dual (theta and gamma) oscillations could underlie a multiplexing mechanism that enables a single network to actively store up to 7 memories. Here we have asked whether models of this kind can account for the data on the Sternberg task, the most quantitative measurements of memory search available. We have found several variants of the oscillatory search model that account for the quantitative dependence of the reaction time distribution on the number of items (S) held in STM. The models differ on the issues of 1) whether theta frequency varies with S and 2) whether the phase of ongoing oscillations is reset by the probe. Using these models, the frequencies of dual oscillations can be derived from psychophysical data. The derived values (f_theta = 6-10 Hz, f_gamma = 45-60 Hz) are in reasonable agreement with experimental values. The exhaustive nature of the serial search that has been inferred from psychophysical measurements can be plausibly explained by these oscillatory models. One argument against exhaustive serial search has been the existence of serial position effects. We find that these effects can be explained by short-term repetition priming in the context of serial scanning models. Our results strengthen the case for serial processing and point to experiments that discriminate between variants of the serial scanning process. ------------------------------------------------------------------------------ Ole Jensen, Ph.D. Volen Center for Complex Systems Brandeis University Waltham, MA 02454-9110 USA Home phone: (617) 666 8274 Work phone: (781) 736 3146 Fax: (781) 736 2398 Phone in Denmark: (+45) 59517316 jensen at volen.brandeis.edu http://lucifer.ccs.brandeis.edu/~ojensen From sas at Glue.umd.edu Fri Dec 25 21:08:49 1998 From: sas at Glue.umd.edu (Shihab A. 
Shamma) Date: Fri, 25 Dec 1998 21:08:49 -0500 (EST) Subject: Neuromorphic Engineering Workshop Message-ID: "NEUROMORPHIC ENGINEERING WORKSHOP" JUNE 27 - JULY 17, 1999 TELLURIDE, COLORADO Deadline for application is February 1, 1999. Avis COHEN (University of Maryland) Rodney DOUGLAS (University of Zurich and ETH, Zurich/Switzerland) Christof KOCH (California Institute of Technology) Terrence SEJNOWSKI (Salk Institute and UCSD) Shihab SHAMMA (University of Maryland) We invite applications for a three-week summer workshop that will be held in Telluride, Colorado from Sunday, June 27 to Saturday, July 17, 1999. The 1998 summer workshop on "Neuromorphic Engineering", sponsored by the National Science Foundation, the Gatsby Foundation, NASA, the Office of Naval Research, and by the "Center for Neuromorphic Systems Engineering" at the California Institute of Technology, was an exciting event and a great success. A detailed report on the workshop is available at http://www.klab.caltech.edu/~timmer/telluride.html (or in Europe: http://www.ini.unizh.ch:80/telluride98/). We strongly encourage interested parties to browse through these reports and photo albums. GOALS: Carver Mead introduced the term "Neuromorphic Engineering" for a new field based on the design and fabrication of artificial neural systems, such as vision systems, head-eye systems, and roving robots, whose architecture and design principles are based on those of biological nervous systems. The goal of this workshop is to bring together young investigators and more established researchers from academia with their counterparts in industry and national laboratories, working on both neurobiological as well as engineering aspects of sensory systems and sensory-motor integration. The focus of the workshop will be on "active" participation, with demonstration systems and hands-on experience for all participants. 
Neuromorphic engineering has a wide range of applications from nonlinear adaptive control of complex systems to the design of smart sensors. Many of the fundamental principles in this field, such as the use of learning methods and the design of parallel hardware, are inspired by biological systems. However, existing applications are modest and the challenge of scaling up from small artificial neural networks and designing completely autonomous systems at the levels achieved by biological systems lies ahead. The assumption underlying this three-week workshop is that the next generation of neuromorphic systems would benefit from closer attention to the principles found through experimental and theoretical studies of real biological nervous systems as whole systems. FORMAT: The three-week summer workshop will include background lectures on systems neuroscience (in particular, sensory processing at peripheral and central levels, motor control of locomotion and oculomotor function, attention and learning), practical tutorials on analog VLSI design, small mobile robots (Koalas), hands-on projects, and special interest groups. Participants are required to take part in, and ideally complete, at least one of the proposed projects (soon to be defined). They are furthermore encouraged to become involved in as many of the other activities proposed as interest and time allow. There will be two lectures in the morning that cover issues that are important to the community in general. Because of the diverse range of backgrounds among the participants, the majority of these lectures will be tutorials, rather than detailed reports of current research. These lectures will be given by invited speakers. Participants will be free to explore and play with whatever they choose in the afternoon. Projects and interest groups meet in the late afternoons, and after dinner. 
The analog VLSI practical tutorials will cover all aspects of analog VLSI design, simulation, layout, and testing over the course of the three weeks. The first week covers basics of transistors, simple circuit design and simulation. This material is intended for participants who have no experience with analog VLSI. The second week will focus on design frames for silicon retinas, from the silicon compilation and layout of on-chip video scanners, to building the peripheral boards necessary for interfacing analog VLSI retinas to video output monitors. Retina chips will be provided. The third week will feature sessions on floating gates, including lectures on the physics of tunneling and injection, and on inter-chip communication systems. We will also feature a tutorial on the use of small, mobile robots, focusing on Koalas, as an ideal platform for vision, auditory and sensory-motor circuits. Projects that are carried out during the workshop will be centered in a number of groups, including active vision, audition, olfaction, motor control, central pattern generator, robotics, multichip communication, analog VLSI and learning. The "active perception" project group will emphasize vision and human sensory-motor coordination. Issues to be covered will include spatial localization and constancy, attention, motor planning, eye movements, and the use of visual motion information for motor control. Demonstrations will include a robot head active vision system consisting of a three degree-of-freedom binocular camera system that is fully programmable. The "central pattern generator" group will focus on small walking and undulating robots. It will look at characteristics and sources of parts for building robots, play with working examples of legged and segmented robots, and discuss CPGs and theories of nonlinear oscillators for locomotion. It will also explore the use of simple analog VLSI sensors for autonomous robots. 
The "robotics" group will use rovers and working digital vision boards as well as other possible sensors to investigate issues of sensorimotor integration, navigation and learning. The "audition" group aims to develop biologically plausible algorithms and aVLSI implementations of specific auditory tasks such as source localization and tracking, and sound pattern recognition. Projects will be integrated with visual and motor tasks in the context of a robot platform. The "multichip communication" project group will use existing interchip communication interfaces to program small networks of artificial neurons to exhibit particular behaviors such as amplification, oscillation, and associative memory. Issues in multichip communication will be discussed. LOCATION AND ARRANGEMENTS: The workshop will take place at the Telluride Elementary School located in the small town of Telluride, 9000 feet high in Southwest Colorado, about 6 hours away from Denver (350 miles). Continental and United Airlines provide daily flights directly into Telluride. All facilities within the beautifully renovated public school building are fully accessible to participants with disabilities. Participants will be housed in ski condominiums, within walking distance of the school. Participants are expected to share condominiums. No cars are required. Bring hiking boots, warm clothes and a backpack, since Telluride is surrounded by beautiful mountains. The workshop is intended to be very informal and hands-on. Participants are not required to have had previous experience in analog VLSI circuit design, computational or machine vision, systems level neurophysiology or modeling the brain at the systems level. However, we strongly encourage active researchers with relevant backgrounds from academia, industry and national laboratories to apply, in particular if they are prepared to work on specific projects, talk about their own work or bring demonstrations to Telluride (e.g. robots, chips, software). 
Internet access will be provided. Technical staff present throughout the workshops will assist with software and hardware issues. We will have a network of SUN workstations running UNIX, MACs and PCs running LINUX and Windows95. Unless otherwise arranged with one of the organizers, we expect participants to stay for the duration of this three-week workshop. FINANCIAL ARRANGEMENT: We have several funding requests pending to pay for most of the costs associated with this workshop. Unlike in previous years, after notifications of acceptance have been mailed out around March 15, 1999, participants are expected to pay a $275 workshop fee. In case of real hardship, this can be waived. Shared condominiums will be provided for all academic participants at no cost to them. We expect participants from National Laboratories and Industry to pay for these modestly priced condominiums. We expect to have funds to reimburse a small number of participants for travel (up to $500 for domestic travel and up to $800 for overseas travel). Please specify on the application whether such financial help is needed. HOW TO APPLY: The deadline for receipt of applications is February 1, 1999. Applicants should be at the level of graduate students or above (i.e. post-doctoral fellows, faculty, research and engineering staff and the equivalent positions in industry and national laboratories). We actively encourage qualified women and minority candidates to apply. Application should include: 1. Name, address, telephone, e-mail, FAX, and minority status (optional). 2. Curriculum Vitae. 3. One page summary of background and interests relevant to the workshop. 4. Description of special equipment needed for demonstrations that could be brought to the workshop. 5. Two letters of recommendation Complete applications should be sent to: Prof. 
Terrence Sejnowski The Salk Institute 10010 North Torrey Pines Road San Diego, CA 92037 email: terry at salk.edu FAX: (619) 587 0417 Applicants will be notified around March 15, 1999. From gyen at okway.okstate.edu Sat Dec 26 13:27:44 1998 From: gyen at okway.okstate.edu (Gary Yen) Date: Sat, 26 Dec 1998 12:27:44 -0600 Subject: IJCNN99- submission deadline extended Message-ID: <9812269146.AA914696654@okway.okstate.edu> Contributed by: Gary G. Yen gyen at master.ceat.okstate.edu CONFERENCE UPDATE: IJCNN'99 900 Submissions and Counting Deadline Extended for Email Submissions 1999 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS RENAISSANCE HOTEL WASHINGTON, D.C. JULY 10-16, 1999 We have 900 submissions in hand now; however, due to numerous requests, we are extending the submission deadline. Email submissions may still be made directly to the Conference General Chair, David Brown, at dgb at cdrh.fda.gov. Submissions received after January 22 will only be considered for poster presentation unless invited by session chairs. Details are on the web site www.cas.american.edu/~medsker/ijcnn99/ijcnn99.html (or via www.inns.org) as shown below: IJCNN'99 review and acceptance will be based on a one-page summary, which will be distributed to Conference participants in a summary book. Conference proceedings will be in CD form. Hard-copy proceedings may be available for an additional fee. Use of illustrations is encouraged in the summaries, within the single allowed page. Use of audio and video segments in the CD proceedings will be considered. Poster presentations will be encouraged, with "single-slide" poster presentations interspersed with regular oral sessions. Monetary awards will be presented for the best poster in several categories. Best student paper awards will also be given. INSTRUCTIONS FOR AUTHORS: Paper summary format information The summary must be accompanied by a cover sheet listing the following information: 1. Paper title 2. 
Author information - full names and affiliations as they will appear in the program 3. Mailing address, telephone, fax, and email for each author 4. Request for oral or poster presentation 5. Topic(s), selected from the list below: Biological foundations Neural systems Mathematical foundations Architectures Learning algorithms Intelligent control Artificial systems Data analysis Pattern recognition Hybrid systems Intelligent computation (including fuzzy and GA) Applications The one-page, camera-ready summary must conform to the following requirements: 1. All text and illustrations must appear within a 7x9 in. (178x229mm) area. For US standard size paper (8.5x11 in.), set margins to 0.75 in. (18mm) left and right and 1 in. top and bottom. For A4 size paper, set margins to 17mm left and right, 25mm top, and 45mm bottom. 2. Use 10-point Times Roman or equivalent typeface for the main text. Single-space all text, allowing extra space between paragraphs. 3. Title (16 pt. bold, centered); capitalize only the first word and proper names. 4. Authors' names and affiliations (12 pt., centered); omit titles or degrees. 5. Five section headings (11 pt. bold). The following five headings MUST be used: Purpose, Method, Results, New or breakthrough aspect of work, and Conclusions. Submit by email to dgb at cdrh.fda.gov Acceptance will be determined by February 8, 1999 in most cases, and complete papers are due in digital form for CD publication by May 3, 1999. From S.Singh at exeter.ac.uk Tue Dec 22 12:44:56 1998 From: S.Singh at exeter.ac.uk (Sameer Singh) Date: Tue, 22 Dec 1998 17:44:56 +0000 (GMT Standard Time) Subject: FELLOWSHIP Message-ID: PANN SENIOR RESEARCH FELLOWSHIP Pattern Analysis and Neural Networks research, Exeter University, UK http://www.dcs.exeter.ac.uk/research/PANN The Pattern Analysis and Applications research group at Exeter University has research interests in image processing, pattern recognition and neural networks. 
The research group provides, from time to time, funds to support faculty members from other research institutions to visit and work on projects of mutual interest. The 1999 Visiting Fellowship will provide for return airfares and full accommodation costs in Exeter for a period of up to one year. It is expected that senior academics who are taking a study leave or post-doctoral staff who wish to expand their research interests in our laboratory will find this appointment suitable. Normally, visiting scientists will receive partial or full salaries at their home institutions. The 1999 fellowship must be taken on or before 1 October 1999; the actual starting date is flexible and can be negotiated with the group Director. For the 1999 fellowship, we invite applications from candidates interested in the area of "Adaptive Image Analysis". An abstract of our current interests is appended at the end of this message. We are particularly interested in candidates who would pursue research in any of the areas mentioned in the abstract below. We would especially encourage those who complement the expertise available within our research group, for example those who have an interest in robotics and learning in image processing. The exact nature of research undertaken can be discussed through email. One of the key aims of the fellowship is to encourage collaboration between the PANN research group and other scientists working in similar areas. This is particularly useful for stimulating further research activity through research grants within the UK and European communities. Since the proposed project is of significance to defence, transportation and space agencies, this fellowship will provide excellent contacts with industry for further support in this research area. If you are interested in making an application, then please send a copy of your CV to Sameer Singh at s.singh at exeter.ac.uk at the earliest possible opportunity. 
Please note that the fellowship is only open at the post-doctoral level. Adaptive Scene Analysis We are currently interested in researching intelligent scene analysis on a number of sub-areas in collaboration with the Defence Evaluation and Research Agency. These research areas include adaptive classifier development, modelling feedback mechanisms in image processing at various levels, optimal feature selection, dynamic control, development of contextual, spatial and temporal awareness models, self-assessment, validation and learning mechanisms and building robotic systems that incorporate such vision infrastructure for navigation and control. ___ -------------------------------------------- Sameer Singh Director, PANN Research Department of Computer Science University of Exeter Exeter EX4 4PT UK tel: +44-1392-264053 fax: +44-1392-264067 email: s.singh at exeter.ac.uk web: http://www.dcs.exeter.ac.uk/academics/sameer -------------------------------------------- From torras at iri.upc.es Tue Dec 1 12:26:12 1998 From: torras at iri.upc.es (Carme Torras) Date: Tue, 1 Dec 1998 18:26:12 +0100 (MET) Subject: CFP: Special Issue on Adaptive Robots Message-ID: <199812011726.SAA00912@sibelius.upc.es> =============================================================================== Special Issue of the journal CONNECTION SCIENCE on *** ADAPTIVE ROBOTS *** CALL FOR PAPERS: Deadline March 15th, 1999 Adaptivity is the capability of self-modification that some agents have, which allows them to maintain a level of performance when facing environmental changes, or to improve it when confronted repeatedly with the same situation. This special issue is aimed at capturing the state of the art in the intricate task of endowing robots with adaptive capabilities, with a special emphasis on neural-based solutions. Thus, some examples of topics covered are: - Adaptive sensing - Adaptive gaits for walking robots - Self-calibration of robot manipulators - Adaptive dynamic control of flexible robot arms - Acquiring fine manipulation skills - Learning hand-eye coordination - Exploration and reinforcement learning - Improving robot navigation - Adaptive multi-robot systems The special issue will adhere to an engineering perspective, i.e. 
the emphasis will be on solving practical robotic problems using adaptive techniques, disregarding their possible biological (or cognitive) inspiration or plausibility. Work on real robots is preferred, with special attention being devoted to replicability of results, as well as to the discussion of the limitations (together with the advantages) of the proposed techniques. Guest editor: ------------- Carme TORRAS, CSIC-UPC (Spain) Editorial board: ---------------- Rudiger DILLMANN, University of Karlsruhe (Germany) Leslie P. KAELBLING, Brown University (USA) Ben KRÖSE, University of Amsterdam (The Netherlands) José R. MILLÁN, Joint Research Centre (Italy) Helge RITTER, University of Bielefeld (Germany) Shankar SASTRY, University of California at Berkeley (USA) Noel SHARKEY, University of Sheffield (UK) Tim SMITHERS, CEIT (Spain) Tom ZIEMKE, University of Skövde (Sweden) Submissions to this special issue should be sent by March 15th, 1999 to: Carme Torras Institut de Robotica i Informatica Industrial (CSIC-UPC) Gran Capita 2-4 (edifici Nexus) 08034-Barcelona (Spain) e-mail: ctorras at iri.upc.es http://www-iri.upc.es/people/torras SCHEDULE: --------- December 98 - call for papers March 15th, 99 - submission deadline May 15th, 99 - information to authors July 1st, 99 - deadline for final papers October 99 - publication of the special issue *** CONNECTION SCIENCE *** Journal of Neural Computing, Artificial Intelligence and Cognitive Research http://www.carfax.co.uk/cos-ad.htm =============================================================================== From marks at gizmo.usc.edu Wed Dec 2 12:42:49 1998 From: marks at gizmo.usc.edu (Mark S. 
Seidenberg) Date: Wed, 2 Dec 1998 10:42:49 -0700 Subject: training at USC Message-ID: GRADUATE AND POSTDOCTORAL TRAINING IN COGNITIVE AND COMPUTATIONAL NEUROSCIENCE AT THE UNIVERSITY OF SOUTHERN CALIFORNIA This announcement concerns opportunities for pre- and post-doctoral training in cognitive neuroscience and computational neuroscience at USC. We seek to recruit outstanding individuals to be supported under a training grant funded by the National Institute of Mental Health. The goal of the training program is to develop future cognitive neuroscientists who will be able to advance the goal of understanding brain-behavior relationships, using computational modeling as a primary tool. USC has brought together a group of outstanding researchers in cognitive neuroscience, with a particular concentration of expertise in neural network modeling at different levels of abstraction. Our program focuses on four main areas: language, vision, learning and memory, and motor performance. Research activities involve behavioral studies of normal and brain-injured humans; psychophysiological and neuroimaging techniques; basic neuroscience methods such as single-cell recording and lesion studies in animals, coupled with computational modeling. For additional information contact Mark S. Seidenberg, Program Director (marks at neuro.usc.edu). The Neuroscience Doctoral program: http://www.usc.edu/dept/nbio/nibs.html Cognitive Neuroscience at USC: http://www.usc.edu/dept/nbio/nibs/cognit.html Computational Neuroscience at USC: http://www.usc.edu/dept/nbio/nibs/compute.html Application Procedure: http://www.usc.edu/dept/nbio/nibs/howto.html Training Program: http://siva.usc.edu/coglab/training.html ____________________________________ Mark S. 
Seidenberg Neuroscience Program University of Southern California 3614 Watt Way Los Angeles, CA 90089-2520 Phone: 213-740-9174 Fax: 213-740-5687 http://www-rcf.usc.edu/~seidenb http://siva.usc.edu/coglab ____________________________________ From thimm at idiap.ch Wed Dec 2 08:37:24 1998 From: thimm at idiap.ch (Georg Thimm) Date: Wed, 02 Dec 1998 14:37:24 +0100 Subject: Contents of Neurocomputing 23 (1998) Message-ID: <199812021337.OAA26541@rotondo.idiap.ch> Dear reader, Please find below a compilation of the contents for Neurocomputing and Scanning the Issue written by V. David Sanchez A. More information on the journal is available at the URL http://www.elsevier.nl/locate/jnlnr/05301 . The contents of this and other journals published by Elsevier are also distributed by the ContentsDirect service (see the URL http://www.elsevier.nl/locate/ContentsDirect). Please feel free to redistribute this message. My apologies if this message is inappropriate for this mailing list; I would appreciate feedback. With kindest regards, Georg Thimm ====================================================================== Journal : NEUROCOMPUTING ISSN : 0925-2312 Vol./Iss. : 23 / 1-3 A comparative study of medium-weather-dependent load forecasting using enhanced artificial/fuzzy neural network and statistical techniques Elkateb , M.M. pp.: 3-13 Prediction of iron losses of wound core distribution transformers based on artificial neural networks Georgilakis , P.S. pp.: 15-29 Laboratory investigation of a digital recurrent network for transmission line directional protection Sanaye-Pasand , M. pp.: 31-46 A neural network based estimator for electricity spot-pricing with particular reference to weekend and public holidays Wang , A.J. pp.: 47-57 A neural network based protection technique for combined 275 kV/400 kV double circuit transmission lines Xuan , Q.Y. pp.: 59-70 Artificial neural networks for short-term energy forecasting: Accuracy and economic value Hobbs , Benjamin F. 
pp.: 71-84 Power system security boundary visualization using neural networks McCalley , James D. pp.: 85-96 The use of artificial neural networks for condition monitoring of electrical power transformers Booth , C. pp.: 97-109 Neural networks for power system condition monitoring and protection Cannas , B. pp.: 111-123 Recurrent neural network for forecasting next 10 years loads of nine Japanese utilities Kermanshahi , Bahman pp.: 125-133 Use of neural networks for customer tariff exploitation by means of short-term load forecasting Verona , Francesco Bini pp.: 135-149 Topology-independent artificial neural network for overload screening Riquelme , Jesús pp.: 151-160 A neural network-based tool for preventive control of voltage stability in multi-area power systems Maiorano , A. pp.: 161-176 An incipient fault detection system based on the probabilistic radial basis function network: Application to the diagnosis of the condenser of a coal power plant Muñoz , A. pp.: 177-194 Electric utility coal quality analysis using artificial neural network techniques Salehfar , H. pp.: 195-206 A class of hybrid intelligent system for fault diagnosis in electric power systems Jota , Patricia R.S. pp.: 207-224 Arcing fault detection using artificial neural networks Sidhu , T.S. pp.: 225-241 The artificial neural-networks-based relay algorithm for the detection of stochastic high impedance faults Snider , L.A. pp.: 243-254 Artificial neural network for reactive power optimization El-Sayed , Mohamed A.H. 
pp.: 255-263 Evolving artificial neural networks for short term load forecasting Srinivasan , Dipti pp.: 265-276 Simple recurrent neural network: A neural network structure for control systems Hernández , Rafael Parra pp.: 277-289 ====================================================================== Neurocomputing 23 (1998) vii-ix Scanning the issue A comparative study of medium-weather-dependent load forecasting using enhanced artificial/fuzzy neural network and statistical techniques is presented by M.M. Elkateb, K. Solaiman and Y. Al-Turki. The introduction of a time index feature significantly enhances the performance of the ANN and FNN techniques. On the conventional side, an AutoRegressive Integrated Moving Average (ARIMA) technique is used. P.S. Georgilakis, N.D. Hatziargyriou, N.D. Doulamis, A.D. Doulamis and S.D. Kollias describe the Prediction of iron losses of wound core distribution transformers based on artificial neural networks. Generation of training and test data, selection of candidate attributes and the generation of the neural network structure are discussed. Suitability for prediction and classification of individual core and transformer specific iron losses is confirmed. In Laboratory investigation of a digital recurrent network for transmission line directional protection M. Sanaye-Pasand and O.P. Malik describe a recurrent neural network based technique for identifying the direction of a fault on a transmission line. Experimental evaluation shows that the approach is accurate, fast, and robust. A.J. Wang and B. Ramsay present A neural network based estimator for electricity spot-pricing with particular reference to weekend and public holidays. The estimator consists of two parts, the front-end processor and the neural network based predictor. The estimator is tested on a real System Marginal Price (SMP) prediction problem. 
A neural network based protection technique for combined 275 kV/400 kV double circuit transmission lines is introduced by Q.Y. Xuan, R.K. Aggarwal, A.T. Johns, R.W. Dunn and A. Bennett. The technique extracts in a pre-processing step the main features from the measured signals. The test results confirm that the adaptive protection technique works well for double-circuit lines with different voltage levels on the two circuits. B.F. Hobbs, U. Helman, S. Jitprapaikulsarn, S. Konda and D. Maratukulam describe Artificial neural networks for short-term energy forecasting: Accuracy and economic value. Eighteen electric utilities and five gas utilities are surveyed. The utilities report on the significant error reduction in daily electric load forecasts when using artificial neural networks. An average of $800K in savings per year per utility is estimated. J.D. McCalley, G. Zhou and V. Van Acker present Power system security boundary visualization using neural networks. The relationship between the precontingency operating parameters and the postcontingency performance measure is mapped using neural networks. The best set of operating parameters is selected using genetic algorithms. C. Booth and J.R. McDonald describe The use of artificial neural networks for condition monitoring of electrical power transformers. Artificial neural networks are used in this context for estimation, e.g. in the determination of transformer winding vibration levels, and for classification, e.g. in the automatic separation of healthy/unhealthy data. In Neural networks for power system condition monitoring and protection B. Cannas, G. Celli, M. Marchesi and F. Pilo propose a methodology based on a locally recurrent, globally feed-forward network and a neural state classifier. The accurate prediction of control variables and the fast recognition of abnormal events is demonstrated. In Recurrent neural network for forecasting next 10 years loads of nine Japanese utilities B. 
Kermanshahi applies a Recurrent Neural Network (RNN) and a 3-layer Backpropagation (BP) network for long-term load forecasting. The RNNs forecast the loads one year ahead whereas the BP networks forecast the next five to ten years. In Use of neural networks for customer tariff exploitation by means of short-term load forecasting F.B. Verona and M. Ceraolo apply Radial Basis Function (RBF) networks trained with leave-one-out cross validation for electric load forecasting. A prototype system shows good performance allowing some load management. J. Riquelme, A. Gómez and J.L. Martínez describe Topology-independent artificial neural networks for overload screening. The Artificial Neural Networks (ANN) are capable of identifying the set of harmful contingencies. Experimental results from a real-size power network are presented. ANNs enhanced with bus power injections can handle topological changes in the power system. A. Maiorano and M. Trovato present A neural network-based tool for preventive control of voltage stability in multi-area power systems. The power system is decomposed into a number of areas for each of which a trained neural network outputs an area-based voltage-stability index. The minimum among the index values characterizes the voltage stability of the whole power system. A. Muñoz and M.A. Sanz-Bobi describe An incipient fault detection system based on the probabilistic radial basis function network: Application to the diagnosis of the condenser of a coal power plant. The Probabilistic Radial Basis Function Network (PRBFN) is introduced. The faults are detected by comparing the actual plant behavior with its prediction. The prediction makes use of a model of normal condition operation. H. Salehfar and S.A. Benson describe the Electric utility coal quality analysis using artificial neural network techniques. Impurities and ash forming species in coal are determined using the neural network. 
Results are compared to those using Computer-Controlled Scanning Electron Microscopy (CCSEM) methods and used to predict the deposition tendency and slagging behavior of ash under different operation conditions. P.R.S. Jota, S.M. Islam, T. Wu and G. Ledwich present A class of hybrid intelligent system for fault diagnosis in electric power systems. A hybrid intelligent system based on neuro-fuzzy, neuro-expert and fuzzy-expert algorithms is used to detect a number of faults in a range of electric power system equipment in Australia and Brazil. T.S. Sidhu, G. Singh and M.S. Sachdev describe a technique for Arcing fault detection using artificial neural networks. Acoustic radiation, infrared radiation and radio waves produced by arcing are recorded on a DSP-based data acquisition system. Classification is done using three-layer feedforward neural networks. Experimental results are reported. The artificial neural-networks-based relay algorithm for the detection of stochastic high impedance faults (HIF) is described by L.A. Snider and Y.S. Yuen. Low-order harmonics of residual quantities are used as the inputs to the artificial neural network. Arcs associated with high impedance faults distort the voltage and current waveforms and are modeled via simulation. The distortions are recognized by the algorithm. M.A.H. El-Sayed presents an Artificial neural network for reactive power optimization. Transmission losses are minimized using a neural network scheme. This scheme is enhanced by a rule-based approach when the network does not provide for a feasible solution. Numerical results from a real power system provide confirmation of the applicability of the scheme. D. Srinivasan describes Evolving artificial neural networks for short term load forecasting. The Artificial Neural Networks (ANN) are generated using a genetic algorithm (GA) and forecast one-day ahead hourly electric loads. The approach avoids the use of large historical data sets and frequent retraining. 
When compared with statistical methods to solve the same problem, the neural network approach shows better performance. R. Parra Hernández, J. Álvarez Gallegos and J.A. Hernández Reyes present Simple recurrent neural network (SRNN): A neural network structure for control systems. SRNNs are used to control linear and nonlinear dynamic systems. Results show that inverse modeling of dynamic systems is feasible using SRNNs and that only a few parameters are needed when using SRNNs to control dynamical systems. I appreciate the cooperation of all those who submitted their work for inclusion in this issue. V. David Sanchez A. Editor-in-Chief From jayanta at www.isical.ac.in Thu Dec 3 00:20:59 1998 From: jayanta at www.isical.ac.in (Jayanta Basak) Date: Thu, 3 Dec 1998 10:50:59 +0530 (IST) Subject: paper posting Message-ID: The following paper is available in http://xxx.lanl.gov/abs/cond-mat/9811403 Feedback regarding this article to the authors will be highly appreciated. Title : Response of an Excitatory-Inhibitory Neural Network to External Stimulation: An Application to Image Segmentation Authors : Sitabhra Sinha and Jayanta Basak Abstract : Neural network models comprising elements which have exclusively excitatory or inhibitory synapses are capable of a wide range of dynamic behavior, including chaos. In this paper, a simple excitatory-inhibitory neural pair, which forms the building block of larger networks, is subjected to external stimulation. The response shows transition between various types of dynamics, depending upon the magnitude of the stimulus. Coupling such pairs over a local neighborhood in a two-dimensional plane, the resultant network can achieve a satisfactory segmentation of an image into ``object'' and ``background''. Results for synthetic and ``real-life'' images are given. Regards, Jayanta Basak Machine Intelligence Unit Indian Statistical Institute Calcutta 700 035, India. 
From ken at phy.ucsf.EDU Fri Dec 4 02:33:06 1998 From: ken at phy.ucsf.EDU (Ken Miller) Date: Thu, 3 Dec 1998 23:33:06 -0800 (PST) Subject: UCSF Postdoctoral and Graduate Fellowships in Theoretical Neurobiology Message-ID: <13927.36914.747091.701444@coltrane.ucsf.edu> FULL INFO: http://www.sloan.ucsf.edu/sloan/sloan-info.html PLEASE DO NOT USE 'REPLY'; FOR MORE INFO USE ABOVE WEB SITE OR CONTACT ADDRESSES GIVEN BELOW The Sloan Center for Theoretical Neurobiology at UCSF solicits applications for pre- and post-doctoral fellowships, with the goal of bringing theoretical approaches to bear on neuroscience. Applicants should have a strong background and education in mathematics, theoretical or experimental physics, or computer science, and commitment to a future research career in neuroscience. Prior biological or neuroscience training is not required. The Sloan Center offers opportunities to combine theoretical and experimental approaches to understanding the operation of the intact brain. Young scientists with strong theoretical backgrounds will receive scientific training in experimental approaches to understanding the operation of the intact brain. They will learn to integrate their theoretical abilities with these experimental approaches to form a mature research program in integrative neuroscience. The research undertaken by the trainees may be theoretical, experimental, or a combination. TO APPLY, please send a curriculum vitae, a statement of previous research and research goals, up to three relevant publications, and have two letters of recommendation sent to us. The application deadline is February 1, 1999. Send applications to: Steve Lisberger Sloan Center for Theoretical Neurobiology at UCSF Department of Physiology University of California 513 Parnassus Ave. San Francisco, CA 94143-0444 PRE-DOCTORAL applicants with strong theoretical training may seek admission into the UCSF Neuroscience Graduate Program as a first-year student. 
Applicants seeking such admission must apply by Jan. 5, 1999 to be considered for Fall 1999 admission. Application materials for the UCSF Neuroscience Program may be obtained from Cindy Kelly Neuroscience Graduate Program Department of Physiology University of California San Francisco San Francisco, CA 94143-0444 neuroscience at phy.ucsf.edu Be sure to include your surface-mail address. The procedure is: make a normal application to the UCSF Neuroscience program; but also alert the Sloan Center of your application by writing to Steve Lisberger at the address given above. If you need more information: -- Consult the Sloan Center WWW Home Page: http://www.sloan.ucsf.edu/sloan -- Send e-mail to sloan-info at phy.ucsf.edu -- See also the home page for the W.M. Keck Foundation Center for Integrative Neuroscience, in which the Sloan Center is housed: http://www.keck.ucsf.edu/ From Kim.Plunkett at psy.ox.ac.uk Fri Dec 4 09:33:08 1998 From: Kim.Plunkett at psy.ox.ac.uk (Kim Plunkett) Date: Fri, 4 Dec 1998 14:33:08 GMT Subject: No subject Message-ID: <199812041433.OAA21282@pegasus.psych.ox.ac.uk> UNIVERSITY OF OXFORD OXFORD SUMMER SCHOOL ON CONNECTIONIST MODELLING Department of Experimental Psychology University of Oxford 18th - 30th July 1999 Applications are invited for participation in a 2-week residential Summer School on techniques in connectionist modelling. The course is aimed primarily at researchers who wish to exploit neural network models in their teaching and/or research and it will provide a general introduction to connectionist modelling, biologically plausible neural networks and brain function through lectures and exercises on Macintoshes and PCs. The course is interdisciplinary in content though many of the illustrative examples are taken from cognitive and developmental psychology, and cognitive neuroscience. The instructors with primary responsibility for teaching the course are Kim Plunkett and Edmund Rolls. 
No prior knowledge of computational modelling will be required though simple word processing skills will be assumed. Participants will be encouraged to start work on their own modelling projects during the Summer School. The cost of participation in the Summer School is £950. This figure covers the cost of accommodation (bed and breakfast at St. John's College), registration and all literature required for the Summer School. Participants will be expected to cover their own travel and meal costs. A number of partial bursaries will be available for graduate students. Applicants should indicate whether they wish to be considered for a graduate student scholarship but are advised to seek further funding as well, since in previous years the number of graduate student applications has far exceeded the number of scholarships available. There is a Summer School World Wide Web page describing the contents of the 1999 Summer School available on: http://www-cogsci.psych.ox.ac.uk/summer-school/ Further information about contents of the course can be obtained from Steven.Young at psy.ox.ac.uk If you are interested in participating in the Summer School, please contact: Mrs Sue King Department of Experimental Psychology University of Oxford South Parks Road Oxford OX1 3UD Tel: (01865) 271353 Email: susan.king at psy.oxford.ac.uk Please send a brief description of your background with an explanation of why you would like to attend the Summer School (one page maximum) no later than 31st January 1999. From greiner at cs.ualberta.ca Fri Dec 4 15:55:22 1998 From: greiner at cs.ualberta.ca (Russ Greiner) Date: Fri, 4 Dec 1998 13:55:22 -0700 Subject: Query Distribution Message-ID: <19981204205525Z13557-6723+317@scapa.cs.ualberta.ca> Dear Colleagues, There are now a number of deployed systems that use belief nets (aka Bayesian nets, probability nets, ...) to answer queries -- i.e., to compute the posterior probability of some variable(s), based on some specified set of evidence. 
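The kind of query and evaluation Greiner describes — a posterior computed from a belief net given evidence, then a learned net scored by its error averaged over the distribution of queries users actually pose — can be sketched in a few lines of Python. The two-variable net, its parameters, and the query weights below are purely illustrative assumptions, not taken from any deployed system mentioned in this message:

```python
# Hypothetical sketch: brute-force posteriors over a toy two-variable
# "belief net" (cancer, fever), and a learned net scored by squared
# posterior error weighted by a query distribution. All numbers invented.
from itertools import product

def make_joint(p_cancer, p_fever_given_cancer):
    """Joint P(cancer, fever) built from a prior and a conditional."""
    joint = {}
    for c, f in product((0, 1), repeat=2):
        pc = p_cancer if c else 1.0 - p_cancer
        pf = p_fever_given_cancer[c] if f else 1.0 - p_fever_given_cancer[c]
        joint[(c, f)] = pc * pf
    return joint

def posterior(joint, query_var, evidence):
    """P(query_var = 1 | evidence), evidence given as {var_index: value}."""
    num = den = 0.0
    for assignment, p in joint.items():
        if all(assignment[i] == v for i, v in evidence.items()):
            den += p
            if assignment[query_var] == 1:
                num += p
    return num / den

def avg_sq_error(true_joint, learned_joint, query_dist):
    """Squared posterior error, weighted by how often each query is asked."""
    return sum(
        weight * (posterior(true_joint, qv, dict(ev))
                  - posterior(learned_joint, qv, dict(ev))) ** 2
        for (qv, ev), weight in query_dist.items()
    )

true_net = make_joint(0.01, {0: 0.10, 1: 0.80})     # variable 0 = cancer, 1 = fever
learned_net = make_joint(0.02, {0: 0.12, 1: 0.70})

# A hypothetical query distribution: mostly queries about a rare event,
# so it need not resemble the net's own event distribution at all.
query_dist = {
    (0, ((1, 1),)): 0.7,  # "P(cancer | fever=T)?" -- 70% of queries
    (0, ()): 0.3,         # "P(cancer)?"           -- 30% of queries
}

print(avg_sq_error(true_net, learned_net, query_dist))
```

Note how the query distribution above concentrates on a low-probability event: the query "P(cancer | fever=T)?" can dominate the score even though cancer itself has prior probability 0.01, which is exactly the decoupling between query and event distributions that this message goes on to emphasize.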
It would be very useful to know the actual distribution of queries posed to such real-world systems; eg, how often the user asks "What is the probability of cancer, given Fever=T and Age>42 ?", vs "What is the probability of cancer, given Fever=F, lump=F and Gender=M ?" vs "What is the prior probability of hepatitis ?" etc etc etc. We could then use this "query distribution" to evaluate our learning algorithms, by computing (perhaps) the *average (sum-squared) accuracy* of the belief net it returns, where the "average" is wrt this real-world distribution (cf, [Greiner/Grove/Schuurmans, "Learning Bayesian Nets that Perform Well", UAI-97]). We are therefore looking for some real-world *query distributions*. Please let me know if you can provide this information -- perhaps in the form of the set of queries actually posed to a real system, or a set of session transcripts or log files of a system's interactions with its users, or ... To avoid confusion, note that this QUERY DISTRIBUTION cannot necessarily be inferred from the given belief net B, as the query distribution might be completely unrelated to the "NATURAL DISTRIBUTION" of events (encoded by B). Eg, we may ask many queries about low probability events --- the probability of the QUERY "What is the probability of cancer?" may be very high, even though the actual probability of Cancer is very low. Thank you. | Russell Greiner Phone: (403) 492-5461 | | Dep't of Computing Science FAX: (403) 492-1071 | | University of Alberta Email: greiner at cs.ualberta.ca | | Edmonton, AB T6G 2H1 Canada http://www.cs.ualberta.ca/~greiner/ | From greiner at cs.ualberta.ca Fri Dec 4 15:44:31 1998 From: greiner at cs.ualberta.ca (Russ Greiner) Date: Fri, 4 Dec 1998 13:44:31 -0700 Subject: SIGART/AAAI Doctoral Consortium Message-ID: <19981204204433Z13530-6722+256@scapa.cs.ualberta.ca> The SIGART/AAAI Doctoral Consortium is a great opportunity for PhD students to receive feedback on their research and network with people in the field. 
Accepted participants will receive travel scholarships and free registration to AAAI-99. The call for participation is at: http://www.aaai.org/Conferences/National/1999/aaai99-dccall.html Note that submissions are due 5 February 1999. From jose at tractatus.rutgers.edu Sat Dec 5 11:26:40 1998 From: jose at tractatus.rutgers.edu (Stephen Jose Hanson) Date: Sat, 05 Dec 1998 12:26:40 -0400 Subject: POSTDOC in BRAIN IMAGING & COMPUTATION RDLDL & RUMBA Message-ID: <36695EC0.E84032E5@tractatus.rutgers.edu> Rutgers RUMBA (Rutgers Mind/Brain Analysis) Center and Rutgers DLDL (Distributed Laboratories for Digital Libraries) have an immediate opening for a Postdoctoral Researcher in the area of Functional Brain Imaging and Neural Computation--see ad below: RUTGERS UNIVERSITY (RDLDL & RUMBA) is seeking a post-doctoral researcher with experience in brain imaging and computation (neural networks, etc.), to work on problems related to brain function and information finding tasks. The underlying questions are (1) does the brain of a person doing fundamental tasks such as scanning for a given image, for a given term, or for terms or images related to a given concept, exhibit patterns of activation that distinguish these states from other states associated with other types of information processing, and (2) if such states and patterns of activation exist, can knowledge of them be used to improve our understanding of the cognitive aspects of information finding, and ultimately, to improve the systems used to perform those tasks? The successful candidate will work closely with Profs. Martin-Bly, Hanson and Kantor in the UMDNJ-Rutgers (RUMBA) functional imaging laboratories. The starting time for the position is in January 1999. Rutgers University is an equal opportunity/affirmative action employer. Qualified women and minority candidates are especially encouraged to apply. Send CV and three letters of recommendation and 2 reprints to Professor B. 
Martin Bly, Department of Psychology, RDLDL/RUMBA SEARCH, Rutgers University, Newark, NJ 07102. Also see: http://diglib.rutgers.edu/RDLDL/ & http://www.psych.rutgers.edu/~RUMBA. Preferably send material through email: please send email with subject header RDLDL or RUMBA, with application material, to ben at psychology.rutgers.edu or jose at psychology.rutgers.edu. From Otto_Schnurr-A11505 at email.mot.com Wed Dec 9 18:51:09 1998 From: Otto_Schnurr-A11505 at email.mot.com (Otto Schnurr-A11505) Date: Wed, 09 Dec 1998 17:51:09 -0600 Subject: NN Text-To-Speech at Motorola: Papers and Audio Files Message-ID: <366F0CED.F4C06DC1@ccrl.mot.com> Dear Connectionists: Motorola has been developing text-to-speech technology that utilizes multiple cooperating neural networks, each specializing in a particular area of human language ability. Papers that describe this work are now available electronically at the LANL archive: 1998: ===== Title : A High Quality Text-To-Speech System Composed of Multiple Neural Networks Authors : Orhan Karaali, Gerald Corrigan, Noel Massey, Corey Miller, Otto Schnurr and Andrew Mackie Abstract : http://xxx.lanl.gov/abs/cs/9812006 PostScript : http://xxx.lanl.gov/ps/cs/9812006.ps.gz PS & Audio : http://xxx.lanl.gov/e-print/cs/9812006.tar.gz 1997: ===== Title : Text-To-Speech Conversion with Neural Networks: A Recurrent TDNN Approach Authors : Orhan Karaali, Gerald Corrigan, Ira Gerson and Noel Massey Abstract : http://xxx.lanl.gov/abs/cs/9811032 PostScript : http://xxx.lanl.gov/ps/cs/9811032.ps.gz Title : Generating Segment Durations in a Text-To-Speech System: A Hybrid Rule-Based/Neural Network Approach Authors : Gerald Corrigan, Noel Massey and Orhan Karaali Abstract : http://xxx.lanl.gov/abs/cs/9811030 PostScript : http://xxx.lanl.gov/ps/cs/9811030.ps.gz Title : Variation and Synthetic Speech Authors : Corey Miller, Orhan Karaali and Noel Massey Abstract : http://xxx.lanl.gov/abs/cmp-lg/9711004 PostScript : http://xxx.lanl.gov/ps/cmp-lg/9711004.ps.gz 
1996: ===== Title : Speech Synthesis with Neural Networks Authors : Orhan Karaali, Gerald Corrigan and Ira Gerson Abstract : http://xxx.lanl.gov/abs/cs/9811031 PostScript : http://xxx.lanl.gov/ps/cs/9811031.ps.gz Due to requests, we have submitted excerpts of speech generated from our text-to-speech system. These audio files demonstrate one female voice and two male voices and are available at http://xxx.lanl.gov/e-print/cs/9812006.tar.gz Note: If your system does not support Windows WAV files, try a tool like "sox" to translate the audio into a format of your choice. Regards, Otto Schnurr Speech Processing Research Lab Motorola otto_schnurr at email.mot.com From rsun at research.nj.nec.com Thu Dec 10 09:45:30 1998 From: rsun at research.nj.nec.com (Ron Sun) Date: Thu, 10 Dec 1998 09:45:30 -0500 Subject: AAAI'99 CFP in the neural/evolutionary/fuzzy computation areas Message-ID: <199812101445.JAA20138@pc-rsun.nj.nec.com> Solicitation for submissions from the neural/evolutionary/fuzzy communities to: Sixteenth National Conference on Artificial Intelligence Sponsored by the American Association for Artificial Intelligence (AAAI). July 18-22, 1999, Orlando, Florida We would like to encourage the submission of high quality papers in the areas of neural/fuzzy/evolutionary computation to AAAI'99. The 1999 AAAI conference includes many program committee members in the neural/evolutionary/fuzzy areas. This year AAAI'99 expressly solicits top-quality submissions in the above-mentioned areas. Every consideration will be given to provide a fair review (albeit rigorous, in accordance with AAAI's long-standing high standard) to each paper. Your submission will be reviewed by experts in your area who understand and appreciate its contribution (or the lack of it) to the study of computational intelligence. 
To highlight and showcase the advances and contributions that the neural/evolutionary/fuzzy communities are making in computational intelligence, in a broader context that goes beyond disciplinary and paradigmatic confines, we encourage you to submit your best work to this year's AAAI and share that work with others in the broader arena of artificial intelligence. *********************************** Some AAAI'99 program committee members whose expertise lies primarily in the neural/evolutionary/fuzzy areas: Ron Sun (senior program committee member) Lee Giles (senior program committee member) Hamid Berenji Kenneth DeJong Marco Gori Vasant Honavar John Koza Yann LeCun Risto Miikkulainen Darrell Whitley Ron Yager Jacek M. Zurada Timetable for Submission January 19, 1999: electronic submission of abstracts January 20, 1999: submission of six (6) paper copies to AAAI office (see the AAAI web page describing all caveats, formats and details). March 12, 1999: notification of acceptance or rejection. Please send papers to: AAAI-99 American Association for Artificial Intelligence 445 Burgess Drive Menlo Park, CA 94025-3442 For further details regarding submission, see http://www.aaai.org/Conferences/National/1999/aaai99-call.html *********************************** ------------ Dr. C. Lee Giles / NEC Research Institute / 4 Independence Way Princeton, NJ 08540 / 609-951-2642 / Fax 2482 ------------ Prof. Ron Sun Dept. 
CS, University of Alabama, Tuscaloosa, AL http://cs.ua.edu/~rsun rsun at cs.ua.edu, rsun at research.nj.nec.com ------------ From haith at ptolemy.arc.nasa.gov Fri Dec 11 15:45:52 1998 From: haith at ptolemy.arc.nasa.gov (Gary Haith) Date: Fri, 11 Dec 1998 12:45:52 -0800 Subject: Model of Retinogeniculate Development: Dissertation Available Online Message-ID: <199812112045.MAA24334@golem.arc.nasa.gov> MODELING ACTIVITY-DEPENDENT DEVELOPMENT IN THE RETINOGENICULATE PROJECTION can be downloaded at: (post-script document): http://ic-www.arc.nasa.gov/people/haith/diss.ps (gzipped post-script document): http://ic-www.arc.nasa.gov/people/haith/diss.ps.gz Gary Haith haith at ptolemy.arc.nasa.gov ############ Abstract: ############ In higher mammals, the primary visual pathway starts with the (``retinogeniculate'') projection from the retina to the dorsal lateral geniculate nucleus (dLGN) of the thalamus, which in turn projects to visual cortex. Although the retinal axons initially innervate the dLGN in a relatively disorganized manner, they are precisely arranged by maturity. Some dominant features of this organization emerge only under the influence of activity, yet these features are established before eye-opening or photoreceptor function. The crucial activity is supplied by spontaneous bursts of action potentials that propagate in waves across the immature retinal ganglion cell layer that projects to the dLGN. Under the influence of retinal activity, the retinal axons segregate into eye-specific layers, on/off sublayers, and precise retinotopic maps. This dissertation describes a formal computational framework for modeling and exploring the activity-dependent development of the retinogeniculate projection. The model is the first to support the development of layers, sublayers, and retinotopy in a unified framework. The model is constructed so as to be directly biologically interpretable and predictive. 
It refines based on realistic patterns of wave activity, retinal axon arbor change, and Hebbian synaptic weight change. In addition, the model is relatively tractable to formal analysis. This tractability makes the model relatively undemanding to simulate computationally and provides analytic insight into the dynamics of the model refinement. Several experimental predictions that follow directly from the model are described. From terry at salk.edu Fri Dec 11 22:57:32 1998 From: terry at salk.edu (Terry Sejnowski) Date: Fri, 11 Dec 1998 19:57:32 -0800 (PST) Subject: UCSD Computational Neurobiology Message-ID: <199812120357.TAA27939@helmholtz.salk.edu> Computational Neurobiology Graduate Program Department of Biology -- University of California, San Diego The Computational Neurobiology Graduate Program at UCSD will provide students with rigorous training in neuroscience including experimental methods and modern mathematical methods to analyze and visualize data as well as theoretical approaches to neuronal dynamics and computation. Candidates from a wide range of backgrounds are invited to apply, including Biology, Psychology, Computer Science, Physics and Mathematics. All students are expected to master a set of core courses to ensure a common set of knowledge and a common language. All students will take the "Advanced Neurobiology Laboratory", which presents state-of-the-art imaging and electrophysiological techniques, modern techniques in genetic and viral transformation for the study of neuronal function, and modern statistical and spectral methods for data analysis, and "Advanced Computational Methods and Dynamic Systems Theory", which includes training in nonlinear dynamics of single cells, the analysis of regularly spiking and bursting cells, as well as reduced models and their representation in phase space. 
Requests for application materials should be sent to the Graduate Admissions Office, Department of Biology 0348, 9500 Gilman Drive, UCSD, La Jolla, CA, 92093-0348: [gradprog at biology.ucsd.edu]. An initial pre-application form will be sent; applicants should indicate their interest in the Computational Neurobiology Graduate Program on this form. These forms will be screened and application forms will be sent to appropriate candidates. The deadline for completed application materials, including letters of reference, is January 8, 1999. More information about applying to the UCSD Biology Graduate Program: http://www-biology.ucsd.edu/sa/Admissions.html. The Biology Department home page is located at: http://www-biology.ucsd.edu/ Other inquiries about the Computational Neurobiology Graduate Program should be directed to: Terrence Sejnowski Institute for Neural Computation 0523 University of California, San Diego La Jolla, CA 92093 [tsejnowski at ucsd.edu]. Participating Faculty include: Henry Abarbanel (Physics): Nonlinear and oscillatory dynamics; modeling central pattern generators in the lobster stomatogastric ganglion. 
Thomas Albright (Salk Institute): Motion processing in primate visual cortex; linking the responses of single neurons to perception; functional Magnetic Resonance Imaging (fMRI) in awake, behaving monkeys. Darwin Berg (Biology): Regulation of synaptic components of neurons; how neurons become committed to synthesizing specific synaptic components, how the components are assembled and localized in the synaptic membrane, and how the function and long-term stability of the components are controlled. Mark Ellisman (Neurosciences): High resolution anatomy using electron microscopy and light microscopy; computational procedures for anatomical reconstructions. Robert Hecht-Nielsen (Electrical and Computer Engineering): Neural computation and the functional organization of the cerebral cortex; founder of Hecht-Nielsen Corporation. Harvey Karten (Neurosciences): Visual system function and organization; anatomical, physiological and computational studies of the retina and optic tectum of birds and squirrels. David Kleinfeld (Physics): Collective properties of neuronal assemblies; optical recording of electrical activity in cortex; analysis of large-scale activity in nervous systems. William Kristan (Biology): Neuroethology of the leech; functional and developmental studies of the leech nervous system, including computational studies of the bending reflex and locomotion. Herbert Levine (Physics): Nonlinear dynamics and pattern formation in physical and biological systems, including cardiac dynamics and the growth and form of bacterial colonies. Mu-ming Poo (Biology): Mechanisms for synaptic plasticity; synaptic learning rules underlying developmental plasticity and learning in nervous systems; development of sensory maps in lower vertebrate visual systems. 
Terrence Sejnowski (Salk Institute/Biology): Computational neurobiology; detailed biophysical and large-scale network models of nervous systems; physiological studies of neuronal reliability and synaptic mechanisms; Michael Rabinovich (Institute for Nonlinear Studies): Analysis of neural dynamics in the stomatogastric ganglion of the lobster and the olfactory system of insects. Martin Sereno (Cognitive Science): Organization of the visual system in primates and squirrels; computer models of neural systems and development of new techniques for studying human cognition with functional magnetic resonance imaging (fMRI). Nicholas Spitzer (Biology): Regulation of ionic channels and neurotransmitters in neurons; effects of electrical activity in developing neurons on neural function; Charles Stevens (Salk Institute): Synaptic physiology; physiological studies and biophysical models of synaptic plasticity in hippocampal neurons. Roger Tsien (Chemistry): Second messenger systems in neurons; development of new optical and MRI probes of neuron function, including calcium indicators and caged neurotransmitters. Mark Whitehead (Neurosurgery): Peripheral and central taste systems; anatomical and functional studies of regions in the caudal brainstem important for feeding behavior. Ruth Williams (Mathematics): Probability theory, stochastic processes and their applications, including learning in stochastic networks. Kent Wilson (Chemistry): Multi-photon techniques in scanning optical microscopy of living biological systems; multi-sensual use of computer visualization, sound and touch as tools in research and education. From tononi at nsi.edu Sat Dec 12 18:15:20 1998 From: tononi at nsi.edu (Giulio Tononi) Date: Sat, 12 Dec 1998 15:15:20 -0800 Subject: positions open Message-ID: <000001be2625$49c8e2a0$1bb985c6@spud.nsi.edu> THE NEUROSCIENCES INSTITUTE, SAN DIEGO The Neurosciences Institute is an independent, not-for-profit organization at the forefront of research on the brain. 
Research at the Institute spans levels from the molecular to the behavioral and from the computational to the cognitive. The Institute has a strong tradition in theoretical neurobiology and has recently established new experimental facilities. The Institute is also the home of the Neurosciences Research Program and serves as an international meeting place for neuroscientists. JUNIOR FELLOW, NEURAL BASIS OF CONSCIOUSNESS. The Institute has a strong tradition in the theoretical and experimental study of consciousness (see Science, 282:1846-1851). Applications are invited for positions as Junior Fellows to collaborate on experimental and theoretical studies of the neural correlates of conscious perception. Applicants should be at the Postdoctoral level with strong backgrounds in cognitive neuroscience, neuroimaging (including MEG, EEG, and fMRI), and theoretical neurobiology. JUNIOR FELLOW IN THEORETICAL NEUROBIOLOGY. Applications are invited for positions as Junior Fellows in Theoretical Neurobiology. Since 1987, the Institute has had a research program dedicated to developing biologically based, experimentally testable theoretical models of neural systems. Current projects include large-scale simulations of neuronal networks and the analysis of functional interactions among brain areas using information-theoretical approaches. Advanced computing facilities are available. Applicants should be at the Postdoctoral level with strong backgrounds in mathematics, statistics, and computer modeling. JUNIOR FELLOW IN EXPERIMENTAL NEUROBIOLOGY. Applications are invited for positions as Junior Fellows in Experimental Neurobiology. A current focus of the Institute is on the action and pharmacological manipulation of neuromodulatory systems with diffuse projections, such as the noradrenergic, serotoninergic, cholinergic, dopaminergic, and histaminergic systems. Another focus is on behavioral state control and the functions of sleep. 
Applicants should be at the Postdoctoral level with strong backgrounds in the above-mentioned areas. Fellows receive stipends and research support commensurate with qualifications and experience. Positions are now available. Applications for all positions listed should contain a short statement of research interests, a curriculum vitae, and the names of three references and should be sent to: Giulio Tononi, The Neurosciences Institute, 10640 John Jay Hopkins Drive, San Diego, California 92121; Email: tononi at nsi.edu; URL: http:// www.nsi.edu. From steve at cns.bu.edu Sun Dec 13 07:30:40 1998 From: steve at cns.bu.edu (Stephen Grossberg) Date: Sun, 13 Dec 1998 08:30:40 -0400 Subject: The Link Between Brain Learning, Attention, and Consciousness Message-ID: The following article can be accessed at http://cns-web.bu.edu/Profiles/Grossberg Paper copies can also be gotten by writing Ms. Diana Myers, Department of Cognitive and Neural Systems, Boston University, 677 Beacon Street, Boston, MA 02215 or diana at cns.bu.edu. Grossberg, S. (1998). The link between brain learning, attention, and consciousness. Consciousness and Cognition, in press. Preliminary version as Boston University Technical Report, CAS/CNS-TR-97-018. Available in gzip'ed postscript (170Kb). Abstract: The processes whereby our brains continue to learn about a changing world in a stable fashion throughout life are proposed to lead to conscious experiences. These processes include the learning of top-down expectations, the matching of these expectations against bottom-up data, the focusing of attention upon the expected clusters of information, and the development of resonant states between bottom-up and top-down processes as they reach an attentive consensus between what is expected and what is there in the outside world. It is suggested that all conscious states in the brain are resonant states, and that these resonant states trigger learning of sensory and cognitive representations. 
The models which summarize these concepts are therefore called Adaptive Resonance Theory, or ART, models. Psychophysical and neurobiological data in support of ART are presented from early vision, visual object recognition, auditory streaming, variable-rate speech perception, somatosensory perception, and cognitive-emotional interactions, among others. It is noted that ART mechanisms seem to be operative at all levels of the visual system, and it is proposed how these mechanisms are realized by known laminar circuits of visual cortex. It is predicted that the same circuit realization of ART mechanisms will be found in the laminar circuits of all sensory and cognitive neocortex. Concepts and data are summarized concerning how some visual percepts may be visibly, or modally, perceived, whereas amodal percepts may be consciously recognized even though they are perceptually invisible. It is also suggested that sensory and cognitive processing in the What processing stream of the brain obey top-down matching and learning laws that are often complementary to those used for spatial and motor processing in the brain's Where processing stream. This enables our sensory and cognitive representations to maintain their stability as we learn more about the world, while allowing spatial and motor representations to forget learned maps and gains that are no longer appropriate as our bodies develop and grow from infanthood to adulthood. Procedural memories are proposed to be unconscious because the inhibitory matching process that supports these spatial and motor processes cannot lead to resonance. 
From robtag at dia.unisa.it Mon Dec 14 11:27:49 1998 From: robtag at dia.unisa.it (Tagliaferri Roberto) Date: Mon, 14 Dec 1998 17:27:49 +0100 Subject: WIRN 99 call for paper First announcement Message-ID: <9812141627.AA29612@udsab> ***************** CALL FOR PAPERS ***************** The 11-th Italian Workshop on Neural Nets WIRN VIETRI-99 May 20-22, 1999 Vietri Sul Mare, Salerno ITALY **************** FIRST ANNOUNCEMENT ***************** Organizing - Scientific Committee -------------------------------------------------- B. Apolloni (Univ. Milano) A. Bertoni (Univ. Milano) N. A. Borghese (CNR Milano) D. D. Caviglia (Univ. Genova) P. Campadelli (Univ. Milano) A. Colla (ELSAG Genova) A. Esposito (I.I.A.S.S.) M. Frixione (Univ. Salerno) C. Furlanello (ITC-IRST Trento) G. M. Guazzo (I.I.A.S.S.) M. Gori (Univ. Siena) F. Lauria (Univ. Napoli) M. Marinaro (Univ. Salerno) F. Masulli (Univ. Genova) C. Morabito (Univ. Reggio Calabria) P. Morasso (Univ. Genova) G. Orlandi (Univ. Roma) T. Parisini (Politecnico Milano) E. Pasero (Politecnico Torino) A. Petrosino (I.I.A.S.S.) V. Piuri (Politecnico Milano) M. Protasi (Univ. Roma II) S. Rampone (Univ. Sannio) R. Serra (Centro Ricerche Ambientali Montecatini Ravenna) F. Sorbello (Univ. Palermo) R. Tagliaferri (Univ. Salerno) Topics ---------------------------------------------------- Mathematical Models Architectures and Algorithms Hardware and Software Design Hybrid Systems Pattern Recognition and Signal Processing Industrial and Commercial Applications Fuzzy Techniques for Neural Networks Schedule ----------------------- Papers Due: January 31, 1999 Replies to Authors: March 31, 1999 Revised Papers Due: May 22, 1999 Sponsors ------------------------------------------------------------------------------ International Institute for Advanced Scientific Studies (IIASS) Dept. of Scienze Fisiche "E.R. Caianiello", University of Salerno Dept. of Matematica ed Informatica, University of Salerno Dept. 
of Scienze dell'Informazione, University of Milano Societa' Italiana Reti Neuroniche (SIREN) IEEE Neural Network Council INNS/SIG Italy Istituto Italiano per gli Studi Filosofici, Napoli The 11-th Italian Workshop on Neural Nets (WIRN VIETRI-99) will take place in Vietri Sul Mare, Salerno ITALY, May 20-22, 1999. The conference will bring together scientists who are studying several topics related to neural networks. The three-day conference, to be held in the I.I.A.S.S., will feature both introductory tutorials and original, refereed papers, to be published by an international publisher. In the 1999 edition there will be a special session, including tutorials, on Neural Networks in Economics. Official languages are Italian and English, while papers must be in English. Papers should be 6 pages, including title, figures, tables, and bibliography. The accompanying letter should give keywords, postal and electronic mailing addresses, telephone and FAX numbers, indicating oral or poster presentation. Submit 3 copies and a 1 page abstract (containing keywords, postal and electronic mailing addresses, telephone, and FAX numbers with no more than 300 words) to the address shown (WIRN 99 c/o IIASS). An electronic copy of the abstract should be sent to the E-mail address below. The papers must be sent in the camera ready format of SPRINGER, which can be retrieved by anonymous ftp from ftp.tex.ac.uk or from the SIREN www site: wicsadv.org wicsadv.tex wicsadv1.tex (file modified to insert the format instruction inside) wicsbook.org wicsbook.sty; for authors who do not use latex, the information can be retrieved from the SIREN www site. The publication of the proceedings is "under negotiation" with Springer. During the Workshop the "Premio E.R. Caianiello" will be assigned to the best Ph.D. thesis by an Italian researcher in the area of Neural Nets and related fields. The amount is 2.000.000 Italian Lire. 
Interested researchers (who obtained the Ph.D. degree in 1996, 1997, 1998, or by February 28, 1999) must send 3 copies of a c.v. and of the thesis to "Premio Caianiello" WIRN 99 c/o IIASS before February 28, 1999. It is possible to compete for the prize at most twice. For more information, contact the Secretary of I.I.A.S.S. I.I.A.S.S Via G.Pellegrino, 19 84019 Vietri Sul Mare (SA) ITALY Tel. +39 89 761167 Fax +39 89 761189 E-Mail robtag at udsab.dia.unisa.it or the SIREN www pages at the address below: http://www-dsi.ing.unifi.it/neural ***************************************************************** From magnus at cs.man.ac.uk Mon Dec 14 12:14:27 1998 From: magnus at cs.man.ac.uk (Magnus Rattray) Date: Mon, 14 Dec 1998 17:14:27 +0000 Subject: PhD Studentship Message-ID: <36754773.148E1EB5@cs.man.ac.uk> --------------------------------------------------------------------- PhD studentship: Statistical Mechanics Analysis of Natural Gradient Learning --------------------------------------------------------------------- Applications are sought for a three year PhD position to study natural gradient learning using the methods of statistical mechanics and stochastic dynamical systems. The position will be supported by an EPSRC studentship and based in the computer science department at Manchester University, which is one of the largest and most successful computer science departments in the UK. Living expenses will be paid according to current EPSRC rates (19635 pounds over three years) with substantial extra funding available for participation at international conferences and workshops. Project description: Natural gradient learning was recently introduced as a principled algorithm for determining the parameters of a statistical model on-line. The algorithm has been applied to feed-forward neural networks, independent component analysis and deconvolution algorithms, often providing much improved performance over existing methods. 
The algorithm uses an underlying Riemannian parameter space to re-define the direction of steepest descent and respects certain invariances which should be observed by any consistent algorithm. Natural gradient learning is known to provide optimal asymptotic performance under certain restricted conditions but a good general understanding of the non-asymptotic learning performance is not yet available. This is really the regime which we expect to dominate the learning time, and recent work by the project supervisor and co-workers [1,2] provides some quantification of the advantage which can be expected over other algorithms. This analysis involves a statistical mechanics formalism which allows an exact solution of the learning dynamics in a feed-forward neural network. The proposed project will build on these initial results in order to characterize the behaviour of natural gradient learning with greater generality. The project will also explore other applications of information geometry to probabilistic modelling. This project will touch on many interesting mathematical topics (information theory, differential geometry, statistical mechanics and stochastic dynamical systems) and application areas (optimization, neural networks, probabilistic modelling). Prospective candidates would ideally be interested in a number of these topics. A good first degree in physics, mathematics or a related subject is required. Contact: Magnus Rattray (magnus at cs.man.ac.uk) Computer Science Department, University of Manchester, Manchester M13 9PL, UK. Tel +44 161 275 6187. http://www.cs.man.ac.uk/~magnus/magnus.html References: [1] M Rattray, D Saad, S Amari, "Natural Gradient Descent for On-line Learning", Physical Review Letters 81, p5461 (1998). [2] M Rattray, D Saad, "Transients and Asymptotics of Natural Gradient Learning", Proceedings of ICANN 98, edited by L Niklasson, M Boden and T Ziemke (Springer-Verlag, London), p165 (1998). 
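The core idea behind the studentship above, measuring steepest descent in the Fisher (Riemannian) metric rather than the Euclidean one, can be sketched for a toy one-parameter Bernoulli model. This is only an illustration of the general principle, not the project's actual feed-forward network analysis; the data mean, initial value, and step size below are assumptions:

```python
# Natural gradient step for a Bernoulli model p(x=1) = theta.
# Toy sketch of steepest descent in the Fisher metric; all numbers
# below are illustrative assumptions.

def grad_loglik(theta, mean_x):
    # d/dtheta of the average Bernoulli log-likelihood, given data mean
    return mean_x / theta - (1 - mean_x) / (1 - theta)

def fisher(theta):
    # Fisher information of the Bernoulli family: 1 / (theta (1 - theta))
    return 1.0 / (theta * (1 - theta))

mean_x = 0.8   # empirical mean of (hypothetical) data
theta = 0.2    # initial parameter estimate
eta = 1.0      # step size

# Natural gradient: precondition the ordinary gradient by F^{-1}.
# For Bernoulli this simplifies to exactly (mean_x - theta), so a full
# step lands on the maximum-likelihood estimate in one update.
natural_step = grad_loglik(theta, mean_x) / fisher(theta)
theta += eta * natural_step
print(theta)
```

Note the contrast with the ordinary gradient: at theta = 0.2 the raw gradient is 3.75, so a plain step of the same size would leave the valid parameter range entirely, while the Fisher-preconditioned step is invariant to how the parameter is coordinatized.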
From jordan at CS.Berkeley.EDU Mon Dec 14 22:35:43 1998 From: jordan at CS.Berkeley.EDU (Michael Jordan) Date: Mon, 14 Dec 1998 19:35:43 -0800 (PST) Subject: faculty position in Statistics at UC Berkeley Message-ID: <199812150335.TAA00633@orvieto.CS.Berkeley.EDU> The enclosed announcement of a faculty position in Statistics at the University of California at Berkeley may be of interest to those of you whose research is statistics-oriented, either theoretical or applied. Mike Jordan ---------------------------------------------------------------------------- Announcement of a faculty position in the Department of Statistics University of California at Berkeley Applications are invited for a tenured/tenure-track faculty position to begin 7/1/99. We will consider strong candidates in any area of theoretical and applied statistics, probability, and applied probability theory. The department is particularly interested in hearing from suitably qualified women or members of minorities currently under-represented in faculty positions. Send applications or inquiries (including resume and three names of references) by 1/19/99 to: Chair, University of California, Berkeley, Department of Statistics, 367 Evans Hall #3860, Berkeley, CA 94720-3860, Fax: (510) 642-7892; E-mail: recruit at stat.berkeley.edu. The University of California is an Affirmative Action/Equal Opportunity Employer. (see http://www.stat.berkeley.edu for additional information) From oreilly at grey.colorado.edu Tue Dec 15 16:58:22 1998 From: oreilly at grey.colorado.edu (Randall C. 
O'Reilly) Date: Tue, 15 Dec 1998 14:58:22 -0700 Subject: Graduate and Postdoctoral Training @ CU & DU Message-ID: <199812152158.OAA13863@grey.colorado.edu> Computational Cognitive Neuroscience University of Colorado Boulder and University of Denver Graduate and Postdoctoral Training Opportunities This is an invitation to apply for graduate and postdoctoral training in cognitive neuroscience and/or computational cognitive neuroscience at the University of Colorado Boulder (CU) Departments of Psychology and Computer Science (integrated with the Institute for Cognitive Science, ICS), and the University of Denver (DU) Department of Psychology and program in Developmental Cognitive Neuroscience (DCN). CU and DU have a strong common interest in computational cognitive neuroscience, and researchers in both universities are funded by the NSF, NIH, NIDCD, and the McDonnell Foundation to support both graduate students and postdoctoral researchers in cognitive neuroscience. Close collaborations exist across the two campuses. A major focus of interest here is in understanding working memory and the cognitive role of the prefrontal cortex (PFC). We apply converging cognitive neuroscience methodologies, including computational (neural network models), behavioral, developmental, and neuropsychological approaches. Other topics of interest include executive function, language, learning and memory and the roles of the hippocampus and the cortex, developmental dissociations and task-dependent behaviors, attention, and invariant object recognition. We have a strong nucleus of cognitive neuroscientists focused on developing mechanistic, computational frameworks for understanding how the brain performs cognitive functions, and exploring these ideas using a wide range of empirical methods. This environment provides a unique opportunity to interact closely with active scientists exploring problems at the forefront of cognitive neuroscience. 
The departments at CU and DU are highly regarded, and have a strong international reputation for high-quality scientific research and training. The DU psych department is ranked No. 2 in the world in publication impact, and the CU psych department is consistently in the top 20 of the US News & World Report rankings. In addition to providing an exciting research environment and hosting the annual Neural Information Processing Systems conference, the greater Denver/Boulder area offers an exceptional quality of life. Spectacularly situated at the eastern edge of the Rockies, this area provides a wide variety of extraordinary outdoor activities, an average of 330 sunny days per year, and also affords a broad range of cultural activities. Graduate students should apply to the most appropriate department for their specific interests. Deadlines are Jan 1 for CU and Jan 15 for DU. For more information, full lists of associated faculty, and instructions on applying to the graduate programs, see the following web sites:
CU Overview Web Page: http://www.cs.colorado.edu/~mozer/resgroup.html
CU Psychology: http://psych-www.colorado.edu/
CU Computer Science: http://www.cs.colorado.edu/
CU ICS: http://psych-www.colorado.edu/ics/home.html
DU Psychology: http://www.du.edu/psychology/
DU DCN: http://www.du.edu/psychology/DCNWHOLE.htm
Postdoctoral applications should include a CV, representative publications, and a statement of research interests, and should be sent to the most appropriate of the faculty members listed below. Postdoc funding is available now, and applications will be considered until the positions are filled.
Akira Miyake, CU Psych, miyake at psych.colorado.edu, http://psych-www.colorado.edu/faculty/miyake.html
Michael Mozer, CU CS, mozer at cs.colorado.edu, http://www.cs.colorado.edu/~mozer/Home.html
Yuko Munakata, DU Psych, munakata at kore.psy.du.edu, http://kore.psy.du.edu/munakata
Randall O'Reilly, CU Psych, oreilly at psych.colorado.edu, http://psych.colorado.edu/~oreilly
One or more of the above faculty should be contacted for any further information. From rrojas at ICSI.Berkeley.EDU Mon Dec 14 18:38:17 1998 From: rrojas at ICSI.Berkeley.EDU (Raul Rojas) Date: Mon, 14 Dec 1998 15:38:17 -0800 Subject: call for contributions on handwriting recognition Message-ID: <3675A169.BCE597F@icsi.berkeley.edu> [ Moderator's note: I don't usually accept calls for papers in some generic application area for a journal that is not primarily neural nets-oriented. But since neural networks are used so extensively in handwriting recognition research, this particular call for papers is an exception. -- Dave Touretzky, CONNECTIONISTS moderator ] ================================================================ Handwriting Recognition The journal "Kuenstliche Intelligenz" (Artificial Intelligence), organ of the German SIG on AI, will publish a special issue on handwriting recognition during 1999. We are looking for additional original contributions on the following topics:
- on-line and off-line handwriting recognition
- applications of handwriting recognition
- handwriting recognition for PDAs
- image processing for handwriting recognition
We are looking for papers at an expository level and papers describing ongoing projects. Language: English or German Length: 5-6 pages, including figures. Deadline: January 31, 1999. Short project summaries (a half-page with pointers to on-line materials) will be published together in a special section. There are no special format requirements, since we will reformat the papers using the source file. Files in LaTeX or MS-Word are acceptable.
Electronic submission is encouraged. Guest editor: Raul Rojas Intl. Computer Science Institute 1947 Center St. Berkeley, CA 94704-1198 rrojas at icsi.berkeley.edu From biehl at physik.uni-wuerzburg.de Wed Dec 16 08:53:41 1998 From: biehl at physik.uni-wuerzburg.de (Michael Biehl) Date: Wed, 16 Dec 1998 14:53:41 +0100 (MET) Subject: three preprints available Message-ID: <199812161353.OAA12986@wptx38.physik.uni-wuerzburg.de> FTP-host: ftp.physik.uni-wuerzburg.de FTP-filename: /pub/preprint/1998/WUE-ITP-98-049.ps.gz FTP-filename: /pub/preprint/1998/WUE-ITP-98-055.ps.gz FTP-filename: /pub/preprint/1998/WUE-ITP-98-057.ps.gz The following (three) manuscripts are now available via anonymous ftp, see below for the retrieval procedure. More conveniently, they can be obtained from the Wuerzburg Theoretical Physics preprint server in the WWW: http://theorie.physik.uni-wuerzburg.de/~publications.shtml ------------------------------------------------------------------ 1) Ref. WUE-ITP-98-049 Receiver Operating Characteristics of Perceptrons: Influence of Sample Size and Prevalence A. Freking, M. Biehl, C. Braun, W. Kinzel, and M. Meesmann ABSTRACT In many practical classification problems it is important to distinguish false positive from false negative results when evaluating the performance of the classifier. This is of particular importance for medical diagnostic tests. In this context, receiver operating characteristic (ROC) curves have become a standard tool. Here we apply this concept to characterize the performance of a simple neural network. Investigating the binary classification of a perceptron, we calculate analytically the shape of the corresponding ROC curves. The influence of the size of the training set and the prevalence of the quality considered are studied by means of a statistical-mechanics analysis. ------------------------------------------------------------------ 2) Ref. WUE-ITP-98-055 Optimisation of on-line principal component analysis E. Schlösser, D.
Saad, and M. Biehl ABSTRACT Various techniques, used to optimise on-line principal component analysis, are investigated by methods of statistical mechanics. These include local and global optimisation of node-dependent learning-rates which are shown to be very efficient in speeding up the learning process. They are investigated further for gaining insight into the learning rates' time-dependence, which is then employed for devising simple practical methods to improve training performance. Simulations demonstrate the benefit gained from using the new methods. ------------------------------------------------------------------- 3) Ref. WUE-ITP-98-057 Statistical physics and practical training of soft-committee machines M. Ahr, M. Biehl, and R. Urbanczik ABSTRACT Equilibrium states of large layered neural networks with differentiable activation function and a single, linear output unit are investigated using the replica formalism. The quenched free energy of a student network with a very large number of hidden units learning a rule of perfectly matching complexity is calculated analytically. The system undergoes a first order phase transition from unspecialized to specialized student configurations at a critical size of the training set. Computer simulations of learning by stochastic gradient descent from a fixed training set demonstrate that the equilibrium results describe quantitatively the plateau states which occur in practical training procedures at sufficiently small but finite learning rates. ------------------------------------------------------------------- ___________________________________________________________________ Retrieval procedure via anonymous ftp:
unix> ftp ftp.physik.uni-wuerzburg.de
Name: anonymous
Password: {your e-mail address}
ftp> cd pub/preprint/1998
ftp> binary
ftp> get WUE-ITP-98-XXX.ps.gz (*)
ftp> quit
unix> gunzip WUE-ITP-98-XXX.ps.gz
e.g. unix> lp -odouble WUE-ITP-98-XXX.ps
(*) can be replaced by "get WUE-ITP-98-XXX.ps".
The file will then be uncompressed before transmission (slow!). ___________________________________________________________________ Michael Biehl Institut fuer Theoretische Physik Julius-Maximilians-Universitaet Wuerzburg Am Hubland D-97074 Wuerzburg email: biehl at physik.uni-wuerzburg.de www: http://theorie.physik.uni-wuerzburg.de/~biehl Tel.: (+49) (0)931 888 5865 / 5131 Fax: (+49) (0)931 888 5141 From pelillo at dsi.unive.it Wed Dec 16 09:50:19 1998 From: pelillo at dsi.unive.it (Marcello Pelillo) Date: Wed, 16 Dec 1998 15:50:19 +0100 (MET) Subject: A Neural Computation paper on graph isomorphism Message-ID: The following paper, accepted for publication in Neural Computation, is accessible at the following www site: http://www.dsi.unive.it/~pelillo/papers/nc98.ps.gz A shorter version of it has just been presented at NIPS*98, and can be accessed at: http://www.dsi.unive.it/~pelillo/papers/nips.ps.gz (files are gzipped postscripts) Comments and suggestions are welcome! Best regards, Marcello Pelillo ======================== Replicator Equations, Maximal Cliques, and Graph Isomorphism Marcello Pelillo University of Venice, Italy ABSTRACT We present a new energy-minimization framework for the graph isomorphism problem which is based on an equivalent maximum clique formulation. The approach is centered around a fundamental result proved by Motzkin and Straus in the mid-1960s, and recently expanded in various ways, which allows us to formulate the maximum clique problem in terms of a standard quadratic program. The attractive feature of this formulation is that a clear one-to-one correspondence exists between the solutions of the quadratic program and those in the original, combinatorial problem. To solve the program we use the so-called "replicator" equations, a class of straightforward continuous- and discrete-time dynamical systems developed in various branches of theoretical biology.
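As a minimal illustration of the discrete-time replicator equations in this setting (a toy graph of my own choosing, not an example from the paper): iterating x_i <- x_i (Ax)_i / (x^T A x) on the simplex monotonically increases the quadratic form x^T A x, and by the Motzkin-Straus correspondence the support of the limit point indicates a maximal clique.

```python
import numpy as np

# Adjacency matrix of a 5-node graph; its unique maximum clique is {0, 1, 2}
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 1],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 0],
              [0, 1, 0, 0, 0]], dtype=float)

n = A.shape[0]
x = np.full(n, 1.0 / n)        # start at the barycenter of the simplex
for _ in range(500):
    Ax = A @ x
    x = x * Ax / (x @ Ax)      # discrete-time replicator equation

# Motzkin-Straus: the support of the limit point spans a maximal clique,
# and x^T A x approaches 1 - 1/omega for clique number omega
clique = np.flatnonzero(x > 1e-4)
```

Here the dynamics settle on the clique {0, 1, 2}, with x^T A x approaching 1 - 1/3; as the abstract notes, on harder instances the iteration can only promise a local solution.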
We show how, despite their inherent inability to escape from local solutions, they nevertheless provide experimental results which are competitive with those obtained using more elaborate mean-field annealing heuristics. ==================== ________________________________________________________________________ Marcello Pelillo Dipartimento di Informatica Universita' Ca' Foscari di Venezia Via Torino 155, 30172 Venezia Mestre, Italy Tel: (39) 41 2908.440 Fax: (39) 41 2908.419 E-mail: pelillo at dsi.unive.it URL: http://www.dsi.unive.it/~pelillo From danr at cs.uiuc.edu Wed Dec 16 15:21:54 1998 From: danr at cs.uiuc.edu (Dan Roth) Date: Wed, 16 Dec 1998 14:21:54 -0600 Subject: postdoctoral fellowship Message-ID: <36781662.C6648EC2@cs.uiuc.edu> Postdoctoral Fellows Program Beckman Institute for Advanced Science and Technology at the University of Illinois at Urbana-Champaign. This is an invitation to apply for postdoctoral fellowships at the Beckman Institute for Advanced Science and Technology at the University of Illinois at Urbana-Champaign. The Beckman Institute for Advanced Science and Technology at the University of Illinois at Urbana-Champaign is an inter- and multidisciplinary research institute devoted to basic research in the physical sciences and engineering, and in the life and behavioral sciences. Its primary mission is to foster interdisciplinary work of the highest quality in an environment that transcends many of the limitations inherent in traditional university organizations and structures. Research at the Institute focuses on three broadly defined themes: biological intelligence, human-computer intelligent interaction, and molecular and electronic nanostructures. Eighteen research groups, composed of faculty and students from sixteen UIUC departments, work within and across these three areas. 
Many research areas that are of interest to readers of this list are relevant to Beckman Institute research in general and to the postdoctoral fellows program in particular. All aspects of the learning sciences (Computational, Biological, etc.), Natural Language (Computational aspects, Psycholinguistics), and many other topics in Biological and Artificial Intelligence are of interest. Please consult the Beckman Institute web page at http://www.beckman.uiuc.edu/ and the Beckman fellows program at http://www.beckman.uiuc.edu/outreach/fellowshome.html If interested, don't hesitate to contact me directly for further information. Notice that the due date for applying is January 8. Dan ---------------------------------------------------------------------- Dan Roth Department of Computer Science, University of Illinois, Urbana/Champaign 1304 W. Springfield Ave. Urbana IL 61801 Phone: (217) 244-7068 (217) 244-6813 (Sec) Fax: +(217) 244-6500 e-mail: danr at cs.uiuc.edu http://L2R.cs.uiuc.edu/~danr ---------------------------------------------------------------------- From Michael.Haft at mchp.siemens.de Thu Dec 17 04:39:55 1998 From: Michael.Haft at mchp.siemens.de (Michael Haft) Date: Thu, 17 Dec 1998 10:39:55 +0100 Subject: Paper available: Robust, `Topological' Codes ... Message-ID: <3678D16B.78C35184@mchp.siemens.de> The following paper recently appeared in Phys. Rev. Letters (Vol.81, Nr.18, Nov. 1998, 4016-4019): Robust, `Topological' Codes by Keeping Control of Internal Redundancy M. Haft Processing information under noisy conditions demands finding a tradeoff between coding a variety of different information and robust, redundant coding of important information. We illustrate this information-theoretic demand by some simple considerations. Following this, we set up information-theoretically plausible learning rules for a self-organizing network.
Internal redundancy is thereby controlled via anti-Hebbian learning based on an internal topology with a given correlation function. For finite correlation length, and thus a finite amount of redundancy, a map-like representation of sensory information is shown to emerge. The original manuscript is available from ftp://flop.informatik.tu-muenchen.de/pub/hofmannr/topoCodes.ps.gz ------------------------------------------------------------------------ Dr. Michael Haft Siemens AG, ZT IK 4 Otto-Hahn-Ring 6, 81730 Muenchen Tel.: +49/89/636-47953 Fax.: +49/89/636-49767 email: Michael.Haft at mchp.siemens.de ------------------------------------------------------------------------ From psollich at mth.kcl.ac.uk Fri Dec 18 08:56:14 1998 From: psollich at mth.kcl.ac.uk (Peter Sollich) Date: Fri, 18 Dec 1998 13:56:14 +0000 (GMT) Subject: Papers on Gaussian processes and online learning Message-ID: Dear Connectionists, the following two papers, which I hope may be of interest, are now available from my web pages: -------------------------------------------------------------------------- Peter Sollich Learning curves for Gaussian processes http://www.mth.kcl.ac.uk/~psollich/papers/GaussianProcLearningCurveNIPSIX.ps.gz (or /~psollich/papers_uncompressed/GaussianProcLearningCurveNIPSIX.ps) I consider the problem of calculating learning curves (i.e., average generalization performance) of Gaussian processes used for regression. A simple expression for the generalization error in terms of the eigenvalue decomposition of the covariance function is derived, and used as the starting point for several approximation schemes. I identify where these become exact, and compare with existing bounds on learning curves; the new approximations, which can be used for any input space dimension, generally get substantially closer to the truth. (In M J Kearns, S A Solla, and D Cohn, editors, Advances in Neural Information Processing Systems 11, Cambridge, MA.
MIT Press. In press.) -------------------------------------------------------------------------- H C Rae, P Sollich, and A C C Coolen On-Line Learning with Restricted Training Sets: Exact Solution as Benchmark for General Theories http://www.mth.kcl.ac.uk/~psollich/papers/HebbOnlineNIPSIX.ps.gz (or /~psollich/papers_uncompressed/HebbOnlineNIPSIX.ps) We solve the dynamics of on-line Hebbian learning in perceptrons exactly, for the regime where the size of the training set scales linearly with the number of inputs. We consider both noiseless and noisy teachers. Our calculation cannot be extended to non-Hebbian rules, but the solution provides a nice benchmark to test more general and advanced theories for solving the dynamics of learning with restricted training sets. (In M J Kearns, S A Solla, and D Cohn, editors, Advances in Neural Information Processing Systems 11, Cambridge, MA. MIT Press. In press.) -------------------------------------------------------------------------- Any comments and suggestions are welcome. For papers on related topics, you could also have a look at http://www.mth.kcl.ac.uk/~psollich/publications for my full publications list. Merry Christmas! Peter Sollich -------------------------------------------------------------------------- Peter Sollich Department of Mathematics Phone: +44 - (0)171 - 873 2875 King's College Fax: +44 - (0)171 - 873 2017 University of London E-mail: peter.sollich at kcl.ac.uk Strand WWW: http://www.mth.kcl.ac.uk/~psollich London WC2R 2LS, U.K.
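The restricted-training-set regime analysed in the second paper above is easy to simulate naively; here is a sketch of on-line Hebbian learning with a teacher perceptron (toy sizes N, alpha, and learning rate are my own choices, not the paper's), measuring the generalization error from the student-teacher overlap:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 500                       # number of inputs
alpha = 4.0                   # training-set size scales linearly with N
P = int(alpha * N)

B = rng.standard_normal(N)    # noiseless teacher perceptron
B /= np.linalg.norm(B)
X = rng.standard_normal((P, N))
y = np.sign(X @ B)            # labels provided by the teacher

# On-line Hebbian learning from a *restricted* training set: at each step
# draw one example from the fixed set of P patterns and add it in
w = np.zeros(N)
eta = 1.0
for _ in range(20 * P):
    i = rng.integers(P)
    w += (eta / N) * y[i] * X[i]

# generalization error from the overlap between student and teacher
rho = (w @ B) / np.linalg.norm(w)
eps = np.arccos(rho) / np.pi
```

With these toy settings the error ends up well below the chance level of 0.5 but bounded away from zero, reflecting the finite training set; the exact dynamics of such curves are what the paper computes analytically.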
-------------------------------------------------------------------------- From NKasabov at infoscience.otago.ac.nz Sat Dec 19 20:25:53 1998 From: NKasabov at infoscience.otago.ac.nz (Nik Kasabov) Date: Sun, 20 Dec 1998 14:25:53 +1300 Subject: A new book on neuro-fuzzy computation Message-ID: "Neuro-Fuzzy Techniques for Intelligent Information Systems" Nikola Kasabov and Robert Kozma (eds) January 1999, Physica-Verlag (Springer Verlag), Berlin, Germany ISBN 3-7908-1187-4, DM 178-, fax: +49 (30) 8 2787301, email: orders at springer.de, http://www.springer.de/ P.O.Box 140201, D-14302 Berlin, Germany Abstract This edited volume comprises selected chapters that cover contemporary issues of the development and the application of neuro-fuzzy techniques. Developing and using neural networks, fuzzy logic systems, genetic algorithms and statistical methods as separate techniques, or in their combination, have been research topics in several areas such as Mathematics, Engineering, Computer Science, Physics, Economics and Finance. Here the latest results in this field are presented from both theoretical and practical points of view. The volume has four main parts. Part one presents generic techniques and theoretical issues, while parts two, three and four deal with practically oriented models, systems and implementations. Content: Part 1: Generic Neuro-Fuzzy and Hybrid Techniques Chapter 1. Analysis and Modelling of Complex Systems Using the Self-Organising Map (O.Simula, J.Vesanto, E.Alhoniemi, J.Hollmen) Chapter 2. Fuzzy Methods for Learning from Data (V.Cherkassky) Chapter 3. Uneven Allocation of Membership Functions for Fuzzy Modelling of Multi-Input Systems (K.Tachibana,T.Furuhashi) Chapter 4. Fuzzy Equivalence Relations and Fuzzy Partitions (B.Reusch) Chapter 5. Identifying Fuzzy Rule-Based Models utilising Neural Networks, Fuzzy Logic and Genetic Algorithms (A.Bastian) Chapter 6.
Neuro-Genetic Information Processing for Optimisation and Adaptation in Intelligent Systems (M.Watts,N.Kasabov) Chapter 7. Evolving Connectionist and Fuzzy Connectionist Systems: Theory and Applications for Adaptive, On-line Intelligent Systems (N.Kasabov) Part 2. Neuro-Fuzzy Systems for Pattern Recognition, Image-, Speech- and Language Processing Chapter 8. Connectionist Approaches for Feature Analysis (N.Pal) Chapter 9. Pattern Classification and Feature Selection by Ellipsoidal Fuzzy Rules (S.Abe) Chapter 10. Printed Chinese Optical Character Recognition by Neural Networks (Y.Wu,M.Zhao) Chapter 11. Image Processing by Chaotic Neural Network Fuzzy Membership Functions (H.Szu,C.Hsu) Chapter 12. Fuzzy Learning Machine with Application to the Detection of Landmarks for Orthodontic Treatment (E.Uchino,T.Yamakawa) Chapter 13. Speech Data Analysis and Recognition Using Fuzzy Neural Networks and Self-Organising Maps (N.Kasabov, R.Kozma, R.Kilgour et al) Chapter 14. Connectionist Methods for Stylometric Analysis: A Hybrid Approach (D.Kassabova,P.Sallis) Part 3. Neuro-Fuzzy Systems for Information Retrieval and Socio-Economic Applications Chapter 15. Soft Information Retrieval: Applications of Fuzzy Set Theory and Neural Networks (F.Crestani,G.Pasi) Chapter 16. Modeling Consensus in Group Decision Making: a Fuzzy Dynamical Approach (Mario Fedrizzi, Michele Fedrizzi, R.A. M Pereira) Chapter 17. Building Fuzzy Expert Systems (M.Negnevitsky) Chapter 18. A Neural Network for Fuzzy Dynamic Programming and Its Use in Socio-Economic Regional Development Planning (J.Kacprzyk,R.Francelin,F.Gomide) Chapter 19. Investment Maps for Emerging Markets (G.Deboeck) Chapter 20. Adaptive Fuzzy-Impedance Controller for Constrained Robot Motion (P.Petrovich,V.Milacic) Part 4. Specialised Hardware for Neuro-Fuzzy Intelligent Systems Chapter 21. Specialised Hardware for Computational Intelligence (G.Coghill) Chapter 22. Evolvable Hardware - The Coming Hardware Design Method?
(J.Torrensen) ---------------------------------------------------------------------------- - Have an enjoyable and prosperous New Year! Nik Kasabov ----------------------------------- Professor Dr Nikola Kasabov Department of Information Science University of Otago,P.O.Box 56 Dunedin, New Zealand phone:+64 3 479 8319, fax: +64 3 479 8311 email: nkasabov at otago.ac.nz http://kel.otago.ac.nz/nik/ ----------------------------------- From eppler at hpe.fzk.de Tue Dec 22 05:35:25 1998 From: eppler at hpe.fzk.de (Wolfgang Eppler) Date: Tue, 22 Dec 1998 11:35:25 +0100 Subject: Session at EUFIT '99 Message-ID: <367F75ED.5E839277@hpe.fzk.de> Prof Gemmeke asked me to announce the following call for papers: --------------------------------------------------------------------- Call for Papers Invited Session "Time-critical Applications with Neural Networks" at EUFIT '99, Aachen, Germany, September 13-16, 1999 The processing power of today's processors, such as the PENTIUM II or PowerPC, is sufficient for most applications of neural networks. Generally, training a neural network takes much more time than recall of trained knowledge, but for most (industrial) applications training is done only once. So there is no need to accelerate the learning process with additional hardware. On the other hand, some applications of neural networks run under time-critical conditions where the processing power of standard processors is not sufficient. In these cases, response times of a few microseconds are required. Typical applications are found in trigger experiments in high-energy physics or pattern recognition in medical applications. Therefore, special-purpose hardware such as fast DSPs, parallel processors or neural processor chips is used to fulfill the timing requirements. The invited session focuses on applications that use special-purpose hardware to accelerate neural operations.
Contributions on both PC-based applications using a PC acceleration card and stand-alone applications, e.g. using fast DSPs in combination with microcontrollers, are welcome. Deadline for abstract: January 31, 1999 Deadline for camera-ready paper: March 15, 1999 Address: Thomas Fischer Forschungszentrum Karlsruhe, FZK (Research Centre Karlsruhe) POB 3640 76021 Karlsruhe Germany Please contact: Thomas Fischer, Tel: ++49 7247 82 4042, email: fischer at hpe.fzk.de or Wolfgang Eppler, Tel: ++49 7247 82 5537, email: eppler at hpe.fzk.de --------------------------------------------------------------------- FORSCHUNGSZENTRUM KARLSRUHE Thomas Fischer Department HPE phone: +49 7247 82-4042 P.O. Box 3640, 76021 Karlsruhe, fax: +49 7247 82-3560 GERMANY email: fischer at hpe.fzk.de ---------------------------------------------------------------------- From lsaul at research.att.com Tue Dec 22 14:22:55 1998 From: lsaul at research.att.com (Lawrence K. Saul) Date: Tue, 22 Dec 1998 14:22:55 -0500 (EST) Subject: preprints available Message-ID: <199812221922.OAA14076@octavia.research.att.com> The following preprints are available at http://www.research.att.com/~lsaul. ============================================================================== ATTRACTOR DYNAMICS IN FEEDFORWARD NEURAL NETWORKS L. Saul and M. Jordan We study probabilistic generative models parameterized by feedforward neural networks. An attractor dynamics for probabilistic inference in these models is derived from a mean field approximation for large, layered sigmoidal networks. Fixed points of the dynamics correspond to solutions of the mean field equations, which relate the statistics of each unit to those of its Markov blanket. We establish global convergence of the dynamics by providing a Lyapunov function and show that the dynamics generate the signals required for unsupervised learning.
Our results for feedforward networks provide a counterpart to those of Cohen-Grossberg and Hopfield for symmetric networks. ============================================================================== MARKOV PROCESSES ON CURVES FOR AUTOMATIC SPEECH RECOGNITION L. Saul and M. Rahim We investigate a probabilistic framework for automatic speech recognition based on the intrinsic geometric properties of curves. In particular, we analyze the setting in which two variables---one continuous (X), one discrete (S)---evolve jointly in time. We suppose that the vector X traces out a smooth multidimensional curve and that the variable S evolves stochastically as a function of the arc length traversed along this curve. Since arc length does not depend on the rate at which a curve is traversed, this gives rise to a family of Markov processes whose predictions, Pr[S|X], are invariant to nonlinear warpings of time. We describe the use of such models, known as Markov processes on curves (MPCs), for automatic speech recognition, where X are acoustic feature trajectories and S are phonetic transcriptions. On two tasks---recognizing New Jersey town names and connected alpha-digits---we find that MPCs yield lower word error rates than comparably trained hidden Markov models. ============================================================================== From jensen at volen.brandeis.edu Wed Dec 23 09:40:39 1998 From: jensen at volen.brandeis.edu (Ole Jensen) Date: Wed, 23 Dec 1998 09:40:39 -0500 (EST) Subject: reprint available: OscillatorSTM model Message-ID: The following reprint is available (PDF) at http://lucifer.ccs.brandeis.edu/~ojensen or http://www.jneurosci.org/current.shtml ------------------------------------------------------------------------- AN OSCILLATORY SHORT-TERM MEMORY BUFFER MODEL CAN ACCOUNT FOR DATA ON THE STERNBERG TASK Ole Jensen and John E. 
Lisman Journal of Neuroscience, 18:10699-10699, 1998 A limited number (7+/-2) of items can be held in human short-term memory (STM). We have previously suggested that observed dual (theta and gamma) oscillations could underlie a multiplexing mechanism that enables a single network to actively store up to 7 memories. Here we have asked whether models of this kind can account for the data on the Sternberg task, the most quantitative measurements of memory search available. We have found several variants of the oscillatory search model that account for the quantitative dependence of the reaction time distribution on the number of items (S) held in STM. The models differ on the issues of 1) whether theta frequency varies with S and 2) whether the phase of ongoing oscillations is reset by the probe. Using these models the frequencies of dual oscillations can be derived from psychophysical data. The derived values (f_theta = 6-10 Hz, f_gamma = 45-60 Hz) are in reasonable agreement with experimental values. The exhaustive nature of the serial search that has been inferred from psychophysical measurements can be plausibly explained by these oscillatory models. One argument against exhaustive serial search has been the existence of serial position effects. We find that these effects can be explained by short-term repetition priming in the context of serial scanning models. Our results strengthen the case for serial processing and point to experiments that discriminate between variants of the serial scanning process. ------------------------------------------------------------------------------ Ole Jensen, Ph.D. Volen Center for Complex Systems Brandeis University Waltham MA02454-9110 USA Home phone: (617) 666 8274 Work phone: (781) 736 3146 Fax: (781) 736 2398 Phone in Denmark: (+45) 59517316 jensen at volen.brandeis.edu http://lucifer.ccs.brandeis.edu/~ojensen From sas at Glue.umd.edu Fri Dec 25 21:08:49 1998 From: sas at Glue.umd.edu (Shihab A. 
Shamma) Date: Fri, 25 Dec 1998 21:08:49 -0500 (EST) Subject: Neuromorphic Engineering Workshop Message-ID: "NEUROMORPHIC ENGINEERING WORKSHOP" JUNE 27 - JULY 17, 1999 TELLURIDE, COLORADO Deadline for application is February 1, 1999. Avis COHEN (University of Maryland) Rodney DOUGLAS (University of Zurich and ETH, Zurich/Switzerland) Christof KOCH (California Institute of Technology) Terrence SEJNOWSKI (Salk Institute and UCSD) Shihab SHAMMA (University of Maryland) We invite applications for a three-week summer workshop that will be held in Telluride, Colorado from Sunday, June 27 to Saturday, July 17, 1999. The 1998 summer workshop on "Neuromorphic Engineering", sponsored by the National Science Foundation, the Gatsby Foundation, NASA, the Office for Naval Research, and by the "Center for Neuromorphic Systems Engineering" at the California Institute of Technology, was an exciting event and a great success. A detailed report on the workshop is available at http://www.klab.caltech.edu/~timmer/telluride.html (or in Europe: http://www.ini.unizh.ch:80/telluride98/). We strongly encourage interested parties to browse through these reports and photo albums. GOALS: Carver Mead introduced the term "Neuromorphic Engineering" for a new field based on the design and fabrication of artificial neural systems, such as vision systems, head-eye systems, and roving robots, whose architecture and design principles are based on those of biological nervous systems. The goal of this workshop is to bring together young investigators and more established researchers from academia with their counterparts in industry and national laboratories, working on both neurobiological as well as engineering aspects of sensory systems and sensory-motor integration. The focus of the workshop will be on "active" participation, with demonstration systems and hands-on experience for all participants.
Neuromorphic engineering has a wide range of applications, from nonlinear adaptive control of complex systems to the design of smart sensors. Many of the fundamental principles in this field, such as the use of learning methods and the design of parallel hardware, are inspired by biological systems. However, existing applications are modest, and the challenge of scaling up from small artificial neural networks and designing completely autonomous systems at the levels achieved by biological systems lies ahead. The assumption underlying this three-week workshop is that the next generation of neuromorphic systems would benefit from closer attention to the principles found through experimental and theoretical studies of real biological nervous systems as whole systems. FORMAT: The three-week summer workshop will include background lectures on systems neuroscience (in particular, sensory processing at peripheral and central levels, motor control of locomotion and oculomotor function, attention and learning), practical tutorials on analog VLSI design and small mobile robots (Koalas), hands-on projects, and special interest groups. Participants are required to take part in, and if possible complete, at least one of the proposed projects (soon to be defined). They are furthermore encouraged to become involved in as many of the other activities as interest and time allow. There will be two lectures in the morning that cover issues important to the community in general. Because of the diverse range of backgrounds among the participants, the majority of these lectures will be tutorials rather than detailed reports of current research. These lectures will be given by invited speakers. Participants will be free to explore and play with whatever they choose in the afternoon. Projects and interest groups meet in the late afternoons and after dinner.
The analog VLSI practical tutorials will cover all aspects of analog VLSI design, simulation, layout, and testing over the course of the three weeks. The first week covers the basics of transistors, simple circuit design, and simulation. This material is intended for participants who have no experience with analog VLSI. The second week will focus on design frames for silicon retinas, from the silicon compilation and layout of on-chip video scanners to building the peripheral boards necessary for interfacing analog VLSI retinas to video output monitors. Retina chips will be provided. The third week will feature sessions on floating gates, including lectures on the physics of tunneling and injection, and on inter-chip communication systems. We will also feature a tutorial on the use of small, mobile robots, focusing on Koalas, as an ideal platform for vision, auditory, and sensory-motor circuits. Projects carried out during the workshop will be centered in a number of groups, including active vision, audition, olfaction, motor control, central pattern generators, robotics, multichip communication, analog VLSI, and learning. The "active perception" project group will emphasize vision and human sensory-motor coordination. Issues to be covered will include spatial localization and constancy, attention, motor planning, eye movements, and the use of visual motion information for motor control. Demonstrations will include a robot-head active vision system consisting of a three-degree-of-freedom binocular camera system that is fully programmable. The "central pattern generator" group will focus on small walking and undulating robots. It will look at characteristics and sources of parts for building robots, play with working examples of legged and segmented robots, and discuss CPGs and theories of nonlinear oscillators for locomotion. It will also explore the use of simple analog VLSI sensors for autonomous robots.
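The link between CPGs and nonlinear oscillators mentioned above can be previewed in a few lines of software. This is a generic illustration, not workshop material: a chain of phase oscillators with nearest-neighbor coupling and a fixed preferred lag, which phase-locks into the traveling wave underlying undulatory locomotion. All parameter values here are arbitrary choices for the sketch.

```python
import math

# A central pattern generator sketched as a chain of coupled phase
# oscillators. Each segment advances at frequency `omega` and is pulled
# toward holding a fixed phase lag relative to its neighbors; with
# sufficient coupling the chain locks into a traveling wave.

def cpg_step(phases, omega=2.0, k=1.5, lag=math.pi / 4, dt=0.01):
    """One Euler step for the chain; returns the updated phases."""
    n = len(phases)
    new = []
    for i, p in enumerate(phases):
        dp = omega
        if i > 0:                       # coupling to the previous segment
            dp += k * math.sin(phases[i - 1] - p - lag)
        if i < n - 1:                   # coupling to the next segment
            dp += k * math.sin(phases[i + 1] - p + lag)
        new.append(p + dt * dp)
    return new

phases = [0.1 * i for i in range(6)]    # arbitrary initial phases
for _ in range(20000):
    phases = cpg_step(phases)

# After settling, consecutive segments hold a constant phase lag of
# -lag, i.e. a wave travels down the chain.
lags = [phases[i + 1] - phases[i] for i in range(5)]
```

The same locking behavior is what the group's legged and segmented robots exploit: a ring or chain of simple oscillators, electronic or neural, generates coordinated rhythmic output without central sequencing.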
The "robotics" group will use rovers and working digital vision boards as well as other possible sensors to investigate issues of sensorimotor integration, navigation and learning. The "audition" group aims to develop biologically plausible algorithms and aVLSI implementations of specific auditory tasks such as source localization and tracking, and sound pattern recognition. Projects will be integrated with visual and motor tasks in the context of a robot platform. The "multichip communication" project group will use existing interchip communication interfaces to program small networks of artificial neurons to exhibit particular behaviors such as amplification, oscillation, and associative memory. Issues in multichip communication will be discussed. LOCATION AND ARRANGEMENTS: The workshop will take place at the Telluride Elementary School located in the small town of Telluride, 9000 feet high in Southwest Colorado, about 6 hours away from Denver (350 miles). Continental and United Airlines provide daily flights directly into Telluride. All facilities within the beautifully renovated public school building are fully accessible to participants with disabilities. Participants will be housed in ski condominiums, within walking distance of the school. Participants are expected to share condominiums. No cars are required. Bring hiking boots, warm clothes and a backpack, since Telluride is surrounded by beautiful mountains. The workshop is intended to be very informal and hands-on. Participants are not required to have had previous experience in analog VLSI circuit design, computational or machine vision, systems level neurophysiology or modeling the brain at the systems level. However, we strongly encourage active researchers with relevant backgrounds from academia, industry and national laboratories to apply, in particular if they are prepared to work on specific projects, talk about their own work or bring demonstrations to Telluride (e.g. robots, chips, software). 
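The behaviors named for the "multichip communication" group above (amplification, oscillation, associative memory) can be previewed in software before any chips are involved. A minimal sketch of one of them, associative memory, in a standard Hopfield-style network; this is a generic textbook construction, not the workshop's actual interchip interface:

```python
import numpy as np

# Minimal Hopfield-style associative memory: store binary (+1/-1)
# patterns with the outer-product (Hebbian) rule, then recall a stored
# pattern from a corrupted cue by iterating the network dynamics.

def train(patterns):
    """Hebbian weight matrix from a list of +1/-1 vectors."""
    n = patterns[0].size
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)              # no self-connections
    return W / len(patterns)

def recall(W, cue, steps=10):
    """Synchronous updates until the state stops changing."""
    s = cue.copy()
    for _ in range(steps):
        new = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(new, s):
            break
        s = new
    return s

stored = np.array([1, 1, 1, 1, -1, -1, -1, -1])
W = train([stored])
noisy = stored.copy()
noisy[0] = -noisy[0]                    # corrupt one bit of the cue
print(np.array_equal(recall(W, noisy), stored))  # → True
```

In the hardware setting, the weight matrix and update rule would be distributed across chips, with spikes routed between them over the communication interface rather than computed in a single matrix product.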
Internet access will be provided. Technical staff present throughout the workshop will assist with software and hardware issues. We will have a network of SUN workstations running UNIX, and Macs and PCs running Linux and Windows 95. Unless otherwise arranged with one of the organizers, we expect participants to stay for the duration of this three-week workshop. FINANCIAL ARRANGEMENTS: We have several funding requests pending to pay for most of the costs associated with this workshop. Unlike in previous years, after notifications of acceptance have been mailed out around March 15, 1999, participants are expected to pay a $275 workshop fee. In cases of real hardship, this can be waived. Shared condominiums will be provided for all academic participants at no cost to them. We expect participants from national laboratories and industry to pay for these modestly priced condominiums. We expect to have funds to reimburse a small number of participants for travel (up to $500 for domestic travel and up to $800 for overseas travel). Please specify on the application whether such financial help is needed. HOW TO APPLY: The deadline for receipt of applications is February 1, 1999. Applicants should be at the level of graduate students or above (i.e. post-doctoral fellows, faculty, research and engineering staff, and the equivalent positions in industry and national laboratories). We actively encourage qualified women and minority candidates to apply. Applications should include: 1. Name, address, telephone, e-mail, FAX, and minority status (optional). 2. Curriculum Vitae. 3. One-page summary of background and interests relevant to the workshop. 4. Description of special equipment needed for demonstrations that could be brought to the workshop. 5. Two letters of recommendation. Complete applications should be sent to: Prof.
Terrence Sejnowski The Salk Institute 10010 North Torrey Pines Road San Diego, CA 92037 email: terry at salk.edu FAX: (619) 587 0417 Applicants will be notified around March 15, 1999. From gyen at okway.okstate.edu Sat Dec 26 13:27:44 1998 From: gyen at okway.okstate.edu (Gary Yen) Date: Sat, 26 Dec 1998 12:27:44 -0600 Subject: IJCNN99- submission deadline extended Message-ID: <9812269146.AA914696654@okway.okstate.edu> Contributed by: Gary G. Yen gyen at master.ceat.okstate.edu CONFERENCE UPDATE: IJCNN'99 900 Submissions and Counting Deadline Extended for Email Submissions 1999 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS RENAISSANCE HOTEL WASHINGTON, D.C. JULY 10-16, 1999 We have 900 submissions in hand now; however, due to numerous requests, we are extending the submission deadline. Email submissions may still be made directly to the Conference General Chair, David Brown, at dgb at cdrh.fda.gov. Submissions received after January 22 will only be considered for poster presentation unless invited by session chairs. Details are on the web site www.cas.american.edu/~medsker/ijcnn99/ijcnn99.html (or via www.inns.org), as shown below: IJCNN'99 review and acceptance will be based on a one-page summary, which will be distributed to Conference participants in a summary book. Conference proceedings will be in CD form. Hard-copy proceedings may be available for an additional fee. Use of illustrations is encouraged in the summaries, within the single allowed page. Use of audio and video segments in the CD proceedings will be considered. Poster presentations will be encouraged, with "single-slide" poster presentations interspersed with regular oral sessions. Monetary awards will be presented for the best poster in several categories. Best student paper awards will also be given. INSTRUCTIONS FOR AUTHORS: Paper summary format information The summary must be accompanied by a cover sheet listing the following information: 1. Paper title 2.
Author information - full names and affiliations as they will appear in the program 3. Mailing address, telephone, fax, and email for each author 4. Request for oral or poster presentation 5. Topic(s), selected from the list below:
- Biological foundations
- Neural systems
- Mathematical foundations
- Architectures
- Learning algorithms
- Intelligent control
- Artificial systems
- Data analysis
- Pattern recognition
- Hybrid systems
- Intelligent computation (including fuzzy and GA)
- Applications
The one-page, camera-ready summary must conform to the following requirements: 1. All text and illustrations must appear within a 7x9 in. (178x229mm) area. For US standard-size paper (8.5x11 in.), set margins to 0.75 in. (18mm) left and right and 1 in. top and bottom. For A4-size paper, set margins to 17mm left and right, 25mm top, and 45mm bottom. 2. Use a 10-point Times Roman or equivalent typeface for the main text. Single-space all text, allowing extra space between paragraphs. 3. Title (16 pt. bold, centered): capitalize only the first word and proper names. 4. Authors' names and affiliations (12 pt., centered): omit titles or degrees. 5. Five section headings (11 pt. bold). The following five headings MUST be used: Purpose, Method, Results, New or breakthrough aspect of work, and Conclusions. Submit by email to dgb at cdrh.fda.gov. Acceptance will be determined by February 8, 1999 in most cases, and complete papers are due in digital form for CD publication by May 3, 1999. From S.Singh at exeter.ac.uk Tue Dec 22 12:44:56 1998 From: S.Singh at exeter.ac.uk (Sameer Singh) Date: Tue, 22 Dec 1998 17:44:56 +0000 (GMT Standard Time) Subject: FELLOWSHIP Message-ID: PANN SENIOR RESEARCH FELLOWSHIP Pattern Analysis and Neural Networks research, Exeter University, UK http://www.dcs.exeter.ac.uk/research/PANN The Pattern Analysis and Applications research group at Exeter University has research interests in image processing, pattern recognition, and neural networks.
The research group provides, from time to time, financial support for faculty members from other research institutions to visit and work on projects of mutual interest. The 1999 Visiting Fellowship will provide return airfares and full accommodation costs in Exeter for a period of up to one year. It is expected that senior academics who are taking study leave, or post-doctoral staff who wish to expand their research interests in our laboratory, will find this appointment suitable. Normally, visiting scientists will receive partial or full salaries from their home institutions. The 1999 fellowship must be taken up on or before 1 October 1999; the actual starting date is flexible and can be negotiated with the group Director. For the 1999 fellowship, we invite applications from interested candidates in the area of "Adaptive Image Analysis". An abstract of our current interests is appended at the end of this message. We are particularly interested in candidates who would pursue research in any of the areas mentioned in the abstract below. We would especially encourage those who complement the expertise available within our research group, for example those who have an interest in robotics and learning in image processing. The exact nature of the research undertaken can be discussed through email. One of the key aims of the fellowship is to encourage collaboration between the PANN research group and other scientists working in similar areas. This is particularly useful for stimulating further research activity through research grants within the UK and European communities. Since the proposed project is of significance to defence, transportation, and space agencies, this fellowship will provide excellent contacts with industry for further support in this research area. If you are interested in making an application, then please send a copy of your CV to Sameer Singh at s.singh at exeter.ac.uk at the earliest possible opportunity.
Please note that the fellowship is only open at the post-doctoral level. Adaptive Scene Analysis We are currently interested in researching intelligent scene analysis in a number of sub-areas in collaboration with the Defence Evaluation and Research Agency. These research areas include adaptive classifier development; modelling feedback mechanisms in image processing at various levels; optimal feature selection; dynamic control; development of contextual, spatial and temporal awareness models; self-assessment, validation and learning mechanisms; and building robotic systems that incorporate such a vision infrastructure for navigation and control. -------------------------------------------- Sameer Singh Director, PANN Research Department of Computer Science University of Exeter Exeter EX4 4PT UK tel: +44-1392-264053 fax: +44-1392-264067 email: s.singh at exeter.ac.uk web: http://www.dcs.exeter.ac.uk/academics/sameer --------------------------------------------