From orponen at math.jyu.fi Sun Dec 1 07:09:22 1996 From: orponen at math.jyu.fi (Pekka Orponen) Date: Sun, 1 Dec 1996 14:09:22 +0200 (EET) Subject: analog noise In-Reply-To: Message-ID: Dear Connectionists: Sorry to waste the bandwidth, but I think the facts are still not straight. On Sat, 30 Nov 1996, Mike Casey wrote: > > Regarding the RNN+noise implying only finite state machine power result, I > completely agree that their result is a nice extension of the corollary > that I proved The readers are welcome to look up the papers and compare our "nice extension" to Dr. Casey's proof -- or come see our poster at NIPS. (We actually obtained our result before seeing Dr. Casey's paper, but that is beside the point.) > they failed to mention that I had already > proved something very similar. This is true, and an unfortunate omission on our part. While we do discuss Dr. Casey's paper and point out the limitations of the noise model he considers, I only now realize that we failed to include an explicit statement to the effect that "Casey proved an analogous result in his model". I can assure you that this omission was not intentional, but was caused by our finding out about Dr. Casey's work only after completing the first version of our paper, and thus adding the technical comparisons to an already existing text. To us it was so clear _what_ Dr. Casey's result was that we forgot to mention it explicitly. I suppose this one sentence would have set the record straight and saved us all from this discussion. > Their "clipped" Gaussian noise is a special > case of bounded noise with arbitrary distribution (Bowen's pseudo-orbit > formalism), so there's no sense in which they "relaxed" the definition of > analog noise. Well, the "clipped" Gaussian noise is bounded because the state space is. It is not technically quite clear that large noise levels could not be used in some devious way in our setting, but this is a minor issue. Also, if I read Dr. Casey's paper correctly, he assumes that the noise level is nonzero everywhere within an \epsilon radius of the correct state, an assumption that we do not need. But these are little technical points that probably could be changed also in his proof. > wouldn't lead to anything interesting). Finally, in section 4 of their > paper where they concretely discuss RNNs performing computations, they > assume that the noise is bounded and that the computation is done with > perfect reliability (which were precisely the assumptions that I used > which they have spent so much time discrediting in other parts of the > paper). This is not quite fair, because in this section we do not _assume_ that the computation can be performed with 100% reliability, but _prove_ that it can be, in the case that one only wants to simulate finite state automata, and the noise is bounded at some sufficiently low level. (This is actually an embarrassingly simple result, which we just didn't find in the literature for the basic BSB network model we consider.) The computational upper bound result, which is more significant, shows that even if we don't assume 100% reliability, we still cannot do more than finite automata. I will try to stay offline from now on; see you at NIPS. 
-- Pekka Orponen From burkhard at zinfandel.cse.ogi.edu Sun Dec 1 16:03:58 1996 From: burkhard at zinfandel.cse.ogi.edu (Annette Burkhardt) Date: Sun, 1 Dec 96 13:03:58 -0800 Subject: Jobs in Financial Analysis at Nonlinear Prediction Systems Message-ID: <9612012103.AA13119@zinfandel.cse.ogi.edu> NONLINEAR PREDICTION SYSTEMS Portland, Oregon RESEARCH POSITIONS IN NONLINEAR STATISTICAL ANALYSIS OF FINANCIAL MARKETS Nonlinear Prediction Systems (NPS) is a small firm based in Portland, Oregon doing research in forecasting, trading, statistical arbitrage, and global risk management for the world's major financial markets. The current research group consists of John Moody, Steve Rehfuss, and Lizhong Wu. We are seeking highly talented and well-qualified candidates to fill Research Scientist and Research Programmer positions. In both cases, the work will involve the development and application of advanced modeling techniques from nonparametric statistics, time series analysis, neural networks, machine learning, data mining, genetic algorithms, and econometrics to challenging problems in financial market modeling. Successful applicants will have the following qualifications: * Research Scientist (or Senior Research Scientist): Ph.D. in mathematics, physics, statistics, econometrics, electrical engineering, computer science, or related field, a strong research track record, and excellent data analysis and computing skills. The level of appointment and compensation will depend on experience. * Research Programmer: M.S. in one of the above-mentioned fields (or equivalent experience), strong quantitative skills, plus exceptional programming or software engineering skills. Experience with databases, Unix systems programming and management, or Windows NT programming is desirable. For either position, preference will be given to candidates who have experience in modeling noisy, real-world data using state-of-the-art nonparametric or nonlinear techniques. Knowledge of statistics or numerical analysis and experience with C++, S-PLUS, or MATLAB is highly desirable. Of particular interest are candidates who have training in finance or experience in trading or modeling the financial markets. Applicants must be willing to work in close collaboration with other research group members, and must be eager to tackle extremely challenging and potentially rewarding data analysis problems in finance. NPS is located near downtown Portland with views of the city and mountains. The work atmosphere is relaxed and informal. Portland is noted for its many fine restaurants, brew pubs, excellent cultural attractions, and beautiful surroundings. The Oregon Coast, Mt. Hood, and the Columbia Gorge are within a 75-minute drive of Portland. Employment at NPS offers significant opportunities for growth in compensation and responsibility. Initial compensation will be competitive based on experience. Interested applicants should email resumes (ascii or postscript) with names and phone numbers of three to five references to burkhard at cse.ogi.edu, or send by US mail to: Attn: Recruiting Nonlinear Prediction Systems PO Box 681, University Station Portland, OR 97207 USA NPS is an equal opportunity employer. Candidates who are attending NIPS this week and who would like an interview should contact John Moody at the Marriott in Denver or at the Silvertree Hotel in Snowmass. 
From S.Renals at dcs.shef.ac.uk Mon Dec 2 06:41:05 1996 From: S.Renals at dcs.shef.ac.uk (Steve Renals) Date: Mon, 2 Dec 1996 11:41:05 GMT Subject: Postdoc: Speech Recognition at Sheffield University (UK) Message-ID: <199612021141.LAA01116@elvis.dcs.shef.ac.uk> University of Sheffield Department of Computer Science Research Associate in Continuous Speech Recognition (Ref: R1039) Applications are invited for the above post, which is tenable for three years from February 1997. The post is part of an EU Long Term Research project (THISL) that will develop a system for the indexing and retrieval of information from large speech recordings and TV/radio broadcasts. The main emphasis of the work will be developing hybrid connectionist/HMM algorithms and systems for very large vocabulary broadcast speech recognition. Candidates for the post will be expected to hold a PhD in a relevant discipline (e.g. Electrical Engineering, Computer Science, Applied Mathematics), or to have acquired equivalent experience. The successful candidate will probably have had research experience in the area of speech recognition, neural computing or language modelling. Salary will be in the range £14,317 to £19,948. Closing date for applications: 7 January 1997. For further information about the post contact Steve Renals or see the project web page: http://www.dcs.shef.ac.uk/research/groups/spandh/projects/thisl.html Application forms are available from the Director of Human Resources, University of Sheffield, Western Bank, Sheffield S10 2TN, UK, tel: +44-114-279-9800, email: jobs at sheffield.ac.uk, citing reference R1039. -------------------------------------------------------------------------- Steve Renals mailto:s.renals at dcs.shef.ac.uk Dept of Computer Science http://www.dcs.shef.ac.uk/~sjr/ Sheffield University phone: +44-114-222-1836 Regent Court fax: +44-114-278-0972 211 Portobello Street Sheffield S1 4DP UK From kruschke at croton.psych.indiana.edu Mon Dec 2 13:04:02 1996 From: kruschke at croton.psych.indiana.edu (John Kruschke) Date: Mon, 2 Dec 1996 13:04:02 -0500 (EST) Subject: TR Announcement: Rules and exemplars in category learning Message-ID: <9612021804.AA01457@croton.psych.indiana.edu> Rules and Exemplars in Category Learning Michael A. Erickson and John K. Kruschke Indiana University, Bloomington Psychological theories of categorization have generally focused on either rule- or exemplar-based explanations of categorization. We present two experiments that show evidence of both rule induction and exemplar encoding, and we present a connectionist model (ATRIUM) that specifies a mechanism for combining rule- and exemplar-based representation. In both experiments participants learned to classify items, most of which followed a simple rule although there were a few, frequently occurring exceptions. Experiment 1 examined how people extrapolate beyond the range of trained instances. Experiment 2 examined the effects of instance frequency on generalization to novel cases. We found that categorization behavior was well described by the model, in which exemplar representation is used for both rule and exception processing. A key element in correctly modeling categorization in tasks such as these was capturing the interaction between the rule- and exemplar-based representational structures using shifts of attention between rules and exemplars. 
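ATRIUM itself is fully specified in the report; purely as a rough sketch of the gating idea described in the abstract (the module forms, parameters, and toy task below are illustrative assumptions, not the model's actual equations), a rule module and an exemplar module can be combined by an attention gate that shifts toward the exemplar module near well-learned exceptions:

  import numpy as np

  # Toy 1-D task: the rule says "category 1 if x > 0", but the trained
  # item at x = +2 is an exception belonging to category 0.
  exemplars = np.array([-2.0, -1.0, 1.0, 2.0])      # stored training items
  exemplar_labels = np.array([0.0, 0.0, 1.0, 0.0])  # note the exception at x = +2

  def rule_module(x):
      # Smooth one-dimensional rule: P(category 1) rises with x.
      return 1.0 / (1.0 + np.exp(-4.0 * x))

  def exemplar_module(x, c=2.0):
      # Similarity-weighted vote of the stored exemplars.
      sim = np.exp(-c * np.abs(x - exemplars))
      return float(np.sum(sim * exemplar_labels) / np.sum(sim))

  def gate(x, c=2.0, bias=-1.0):
      # Attention shifts toward the exemplar module near familiar items.
      familiarity = float(np.max(np.exp(-c * np.abs(x - exemplars))))
      return 1.0 / (1.0 + np.exp(-(4.0 * familiarity + bias)))

  def categorize(x):
      g = gate(x)  # weight given to the exemplar module
      return g * exemplar_module(x) + (1.0 - g) * rule_module(x)

  for x in (-3.0, 0.5, 2.0, 4.0):
      print(f"x = {x:+.1f}   P(category 1) = {categorize(x):.2f}")

Near the trained exception the gate favors the exemplar vote, while far beyond the training range the rule extrapolates; this is the qualitative interaction between rules and exemplars probed in the experiments above.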
This report is also available for electronic retrieval (uncompressed PostScript, 589 Kbytes) from http://www.indiana.edu/~kruschke/ek96_abstract.html A very limited number of paper copies are also available; request Cognitive Science Technical Report #183, by Erickson & Kruschke, from iucogsci at indiana.edu -- John K. Kruschke office: (812) 855-3192 Dept. of Psychology fax: (812) 855-4691 Indiana University http://www.indiana.edu/~kruschke/ Bloomington, IN 47405-1301 kruschke at indiana.edu From nq6 at columbia.edu Tue Dec 3 21:11:20 1996 From: nq6 at columbia.edu (Ning Qian) Date: Tue, 3 Dec 1996 21:11:20 -0500 (EST) Subject: stereo paper available Message-ID: <199612040211.VAA18249@labdien.cc.columbia.edu> The following paper (and some related ones, see below) on stereo vision can be downloaded from the web site: http://brahms.cpmc.columbia.edu/ Physiological Computation of Binocular Disparity Ning Qian and Yudong Zhu (to appear in Vision Research) We previously proposed a physiologically realistic model for stereo vision based on the quantitative binocular receptive field profiles mapped by Freeman and coworkers. Here we present several new results about the model that shed light on the physiological processes involved in disparity computation. First, we show that our model can be extended to a much more general class of receptive field profiles than the commonly used Gabor functions. Second, we demonstrate that there is, however, an advantage of using the Gabor filters: Similar to our perception, the stereo algorithm with the Gabor filters has a small bias towards zero disparity. Third, we prove that the complex cells as described by Freeman et al. compute disparity by effectively summing up two related cross products between the band-pass filtered left and right retinal image patches. This operation is related to cross-correlation but it overcomes some major problems with the standard correlator. Fourth, we demonstrate that as few as two complex cells at each spatial location are sufficient for a reasonable estimation of binocular disparity. Fifth, we find that our model can be significantly improved by considering the fact that complex cell receptive fields are on average larger than those of simple cells. This fact is incorporated into the model by averaging over several quadrature pairs of simple cells with nearby and overlapping receptive fields to construct a model complex cell. The disparity tuning curve of the resulting complex cell is much more reliable than that constructed from a single quadrature pair of simple cells used previously, and the computed disparity maps for random dot stereograms with the new algorithm are very similar to human perception, with sharp transitions at disparity boundaries. Finally, we show that under most circumstances our algorithm works equally well with either of the two well-known receptive field models in the literature. Related papers on the same web site: "A Physiological Model for Motion-stereo Integration and a Unified Explanation of Pulfrich-like Phenomena", Ning Qian and Richard A. Andersen, Vision Research, (in press). "Binocular Receptive Field Profiles, Disparity Tuning and Characteristic Disparity", Yudong Zhu and Ning Qian, Neural Computation, 1996, 8:1647-1677. "Computing Stereo Disparity and Motion with Known Binocular Cell Properties", Ning Qian, Neural Computation, 1994, 6:390-404. 
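For readers who want to see the style of computation the abstract refers to, here is a minimal sketch of a binocular energy model (the 1-D stimuli, Gabor parameters, and position-shift disparity encoding below are illustrative assumptions, not the paper's exact formulation). A model complex cell sums the squared outputs of a quadrature pair of binocular simple cells; expanding the squares yields precisely the left-right cross products mentioned above:

  import numpy as np

  def gabor(x, sigma=8.0, freq=0.0625, phase=0.0):
      # 1-D Gabor receptive field profile
      return np.exp(-x**2 / (2 * sigma**2)) * np.cos(2 * np.pi * freq * x + phase)

  def complex_cell(left, right, shift, sigma=8.0, freq=0.0625):
      # Sum of squared outputs of a quadrature pair (phases 0 and 90 deg)
      # of binocular simple cells; the right-eye receptive fields are
      # displaced by the cell's preferred disparity ("shift").
      x = np.arange(len(left)) - len(left) / 2.0
      energy = 0.0
      for phase in (0.0, np.pi / 2):
          gl = gabor(x, sigma, freq, phase)
          gr = gabor(x - shift, sigma, freq, phase)
          simple = gl @ left + gr @ right  # binocular simple cell output
          energy += simple ** 2            # squaring produces the cross products
      return energy

  # Toy stimulus: a 1-D random-dot pattern whose right-eye image is the
  # left-eye image displaced by 4 pixels (i.e., a true disparity of 4).
  rng = np.random.default_rng(0)
  left_img = rng.standard_normal(128)
  right_img = np.roll(left_img, 4)

  disparities = list(range(-8, 9))
  tuning = [complex_cell(left_img, right_img, d) for d in disparities]
  print("preferred disparity:", disparities[int(np.argmax(tuning))])

The tuning curve peaks at the stimulus disparity; averaging such energies over several nearby quadrature pairs, as in the paper's fifth point, further smooths the curve.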
From Dave_Touretzky at DST.BOLTZ.CS.CMU.EDU Tue Dec 3 21:34:22 1996 From: Dave_Touretzky at DST.BOLTZ.CS.CMU.EDU (Dave_Touretzky@DST.BOLTZ.CS.CMU.EDU) Date: Tue, 03 Dec 96 21:34:22 EST Subject: CNBC graduate training program Message-ID: <22818.849666862@DST.BOLTZ.CS.CMU.EDU> Graduate Training with the Center for the Neural Basis of Cognition The Center for the Neural Basis of Cognition offers interdisciplinary Ph.D. and postdoctoral training programs operated jointly with affiliated programs at Carnegie Mellon University and the University of Pittsburgh. The affiliated departments are: Carnegie Mellon: Biological Sciences, Computer Science, Psychology, Robotics. University of Pittsburgh: Mathematics, Neurobiology, Neuroscience, Psychology. The Center is dedicated to the study of the neural basis of cognitive processes including learning and memory, language and thought, perception, attention, and planning; to the study of the development of the neural substrate of these processes; to the study of disorders of these processes and their underlying neuropathology; and to the promotion of applications of the results of these studies to artificial intelligence, robotics, and medicine. CNBC students have access to some of the finest facilities for cognitive neuroscience research in the world: Positron Emission Tomography (PET) and Magnetic Resonance Imaging (MRI) scanners for functional brain imaging, neurophysiology laboratories for recording from brain slices and from anesthetized or awake, behaving animals, electron and confocal microscopes for structural imaging, high performance computing facilities including an in-house supercomputer for neural modeling and image analysis, and patient populations for neuropsychological studies. Students are admitted jointly to a home department and the CNBC Training Program. Applications are encouraged from students with interests in biology, neuroscience, psychology, engineering, physics, mathematics, computer science, or robotics. For a brochure describing the program and application materials, contact us at the following address: Center for the Neural Basis of Cognition 115 Mellon Institute 4400 Fifth Avenue Pittsburgh, PA 15213 Tel. (412) 268-4000. Fax: (412) 268-5060 email: cnbc-admissions at cnbc.cmu.edu This material is also available on our web site at http://www.cnbc.cmu.edu The CNBC training faculty includes: German Barrionuevo (Pitt Neuroscience): LTP in hippocampal slice Marlene Behrmann (CMU Psychology): spatial representations in parietal cortex Pat Carpenter (CMU Psychology): mental imagery, language, and problem solving Jonathan Cohen (CMU Psychology): schizophrenia; dopamine and attention Carol Colby (Pitt Neuroscience): spatial reps. in primate parietal cortex Bard Ermentrout (Pitt Mathematics): oscillations in neural systems Julie Fiez (Pitt Psychology): fMRI studies of language John Horn (Pitt Neurobiology): synaptic learning in invertebrates Allen Humphrey (Pitt Neurobiology): motion processing in primary visual cortex Marcel Just (CMU Psychology): visual thinking, language comprehension Eric Klann (Pitt Neuroscience): hippocampal LTP and LTD Alan Koretsky (CMU Biological Sciences): new fMRI techniques for brain imaging Tai Sing Lee (CMU Comp. 
Sci.): primate visual cortex; computer vision David Lewis (Pitt Neuroscience): anatomy of frontal cortex James McClelland (CMU Psychology): connectionist models of cognition Carl Olson (CNBC): spatial representations in primate frontal cortex David Plaut (CMU Psychology): connectionist models of reading Michael Pogue-Geile (Pitt Psychology): development of schizophrenia John Pollock (CMU Biological Sci.): neurodevelopment of the fly visual system Walter Schneider (Pitt Psychology): fMRI studies of attention and vision Charles Scudder (Pitt Neurobiology): motor learning in cerebellum Susan Sesack (Pitt Neuroscience): anatomy of the dopaminergic system Dan Simons (Pitt Neurobiology): sensory physiology of the cerebral cortex William Skaggs (Pitt Neuroscience): representations in rodent hippocampus David Touretzky (CMU Comp. Sci.): hippocampus, rat navigation, animal learning From riegler at ifi.unizh.ch Wed Dec 4 05:45:15 1996 From: riegler at ifi.unizh.ch (Alex Riegler) Date: Wed, 4 Dec 1996 11:45:15 +0100 Subject: 2nd CFP New Trends in Cog Sci Message-ID: Please forward to colleagues etc. Apologies if you have received this already. /\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/ International Workshop N E W T R E N D S I N C O G N I T I V E S C I E N C E NTCS '97 /\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/ "Does Representation need Reality?" Perspectives from Cognitive Science, Neuroscience, Epistemology, and Artificial Life Vienna, Austria, May 13 - 16, 1997 with plenary talks by: Georg Dorffner, Ernst von Glasersfeld, Stevan Harnad, Wolf Singer, and Sverre Sjoelander organized by the Austrian Society of Cognitive Science (ASoCS) =========================================================================== Latest information can be retrieved from the conference WWW-page =========================================================================== P u r p o s e ___________________________________________________________________________ The goal of this single-track conference is to investigate and discuss new approaches and movements in cognitive science in a workshop-like atmosphere. Among the topics which have emerged in recent years are: embodiment of knowledge, system theoretic and computational neuroscience approaches to cognition, dynamics in recurrent neural architectures, evolutionary and artificial life approaches to cognition, and (epistemological) implications for perception and representation, constructivist concepts and the problem of knowledge representation, autopoiesis, implications for epistemology and philosophy (of science). Evidence for a failure of the traditional understanding of neural representation converges from several fields. Neuroscientific results in the last decade have shown that single-cell representations, with hierarchical processing towards representing units, seem not to be the way the cortex represents environmental entities. Instead, distributed cell ensemble coding has become a popular concept for representation, both in computational and in empirical neuroscience. However, new problems arise from the new concepts. The problem of binding the distributed parts into a uniform percept can be "solved" by introducing synchronization of the member neurons. A deeper (epistemological) problem, however, is created by recurrent architectures within ensembles generating an internal dynamics in the network. 
The cortical response to an environmental stimulus is no longer dominated by stimulus properties themselves, but to a considerable degree by the internal state of the network. Thus, a clear and stable reference between a representational state (e.g. in a neuron, a Hebbian ensemble, an activation state, etc.) and the environmental state becomes questionable. Already-learned experiences and expectancies might have an impact on the neural activity which is as strong as the stimulus itself. Since these internally stored experiences are constantly changing, the notion of (fixed) representations is challenged. At this point, system theory and constructivism, both investigating the interaction between environment and organism at an abstract level, come into play and turn out to provide helpful epistemological concepts. The goal of this conference is to discuss these phenomena and their implications for the understanding of representation, semantics, language, cognitive science, and artificial life. In contrast to many conferences in this field, the focus is on interdisciplinary cooperation and on conceptual and epistemological questions, rather than on technical details. We are trying to achieve this by giving more room to discussion and interaction between the participants (e.g., invited comments on papers, distribution of papers to the participants before the conference, etc.). In keeping with the interdisciplinary character of cognitive science, we welcome papers/talks from the fields of artificial life, empirical, cognitive, and computational neuroscience, philosophy (of science), epistemology, anthropology, computer science, psychology, and linguistics. T o p i c s ___________________________________________________________________________ The conference is centered on, but not restricted to, the following topics: 1. Representation - epistemological concepts and findings from (computational) neuroscience, cognitive science (recurrent neural architectures, top-down processing, etc.), and philosophy; 2. Alternatives to representation - applying constructivism to cognitive systems; 3. Modeling language, communication, and semantics as a dynamical, evolutionary and/or adaptive process; 4. Representation and cognition in artificial life; 5. What is the role of simulation in understanding cognition? I n v i t e d S p e a k e r s ___________________________________________________________________________ Besides submitted papers the conference will also feature plenary talks by invited speakers who are leaders in their fields. The following is a list of invited speakers in alphabetical order: o Georg Dorffner, Univ. of Vienna (A) o Ernst von Glasersfeld, Univ. of Amherst, MA (USA) o Stevan Harnad, Univ. of Southampton (GB) o Rolf Pfeifer, Univ. of Zurich (CH) o Wolf Singer, Max Planck Institut fuer Hirnforschung, Frankfurt (D) o Sverre Sjoelander, Linkoeping University (S) P a p e r S u b m i s s i o n s ___________________________________________________________________________ We invite submissions of scientific papers to any of the 5 topics listed above. The papers will be reviewed by the Scientific Committee and accepted according to their scientific content, originality, quality of presentation, and relatedness to the conference topic. Please keep to the following guidelines: Hardcopy submission only, 6-9 pages A4 or USLetter single sided in Times Roman 10-12pt (or equivalent). Please send 4 copies to the organizing committee, see address below. 
In a first step we are planning to publish the proceedings as a Technical Report of the Austrian Society for Cognitive Science. In a second step, after rewriting of the papers and a second round of review, a major publisher will be approached to publish the best papers in an edited volume. For the final versions of the accepted papers electronic submissions are preferred in one of the following formats: Word, FrameMaker, or ASCII. Detailed formatting information will be given upon notification of acceptance. Submission due January 7, 1997 Notification of acceptance February 28 R e g i s t r a t i o n ___________________________________________________________________________ To register please fill out the registration form at the bottom of this CFP and send it by... o Email to franz-markus.peschl at univie.ac.at, or by o Fax to +43-1-408-8838 (attn. M.Peschl), or by o Mail to Markus Peschl, Dept. for Philosophy of Science (address below) Registration Fee (includes admission to talks, presentations, and proceedings): before April 1st, 1997: Member * 1000 ATS (about 90 US$) Non-Member 1500 ATS (about 135 US$) Student Member ** 400 ATS (about 36 US$) Student Non-Member 1000 ATS (about 90 US$) after April 1st, 1997: Member * 1300 ATS (about 118 US$) Non-Member 1800 ATS (about 163 US$) Student Member ** 500 ATS (about 45 US$) Student Non-Member 1300 ATS (about 118 US$) *) Members of the Austrian Society of Cognitive Science **) Requires proof of valid student ID C o n f e r e n c e S i t e a n d A c c o m m o d a t i o n ___________________________________________________________________________ The conference takes place in a small beautiful baroque castle in the suburbs of Vienna; the address is: Schloss Neuwaldegg Waldegghofg. 5 A-1170 Wien Austria Tel: +43-1-485-3605 Fax: +43-1-485-3605-112 It is surrounded by a beautiful forest and a good (international and Viennese gastronomic) infrastructure. On the tram it takes only 20 minutes to the center of Vienna. (Limited) Accommodation is provided by the castle (about 41 US$ per night (single), 30 US$ per night, per person (double) including breakfast). Please contact the telephone number above. You can find more information about Vienna and accommodation at the Vienna Tourist Board or at the Intropa Travel agent, Tel: +43-1-5151-242. Further information will be available soon. D e s t i n a t i o n V i e n n a ? ___________________________________________________________________________ Vienna, Austria, can be reached internationally by plane or train. The Vienna Schwechat airport is located about 16 km from the city center. From the airport, the city air-terminal can be reached by bus (ATS 60.- per person) or taxi (about ATS 400). Rail passengers arrive at one of the main stations, which are located almost in the city center. From the air-terminal and the railway stations the congress site and hotels can be reached easily by underground (U-Bahn), tramway, or bus. A detailed description will be given to the participants. In May the climate is mild in Vienna. It is the time when spring is at its peak and everything is blooming. The weather is warm with occasional (rare) showers. The temperature is about 18 to 24 degrees Celsius. 
More information about Vienna and Austria on the web: Welcome to Vienna Scene Vienna City Wiener Festwochen - Vienna Festival Public Transport in Vienna (subway) Welcome to Austria General information about Austria Austria Annotated S c i e n t i f i c C o m m i t t e e ___________________________________________________________________________ R. Born Univ. of Linz (A) G. Dorffner Univ. of Vienna (A) E. v. Glasersfeld Univ. of Amherst, MA (USA) S. Harnad Univ. of Southampton (GB) M. Peschl Univ. of Vienna (A) A. Riegler Univ. of Zurich (CH) H. Risku Univ. of Skovde (S) S. Sjoelander Linkoeping University (S) A. v. Stein Neuroscience Institute, La Jolla (USA) O r g a n i z i n g C o m m i t t e e ___________________________________________________________________________ M. Peschl Univ. of Vienna (A) A. Riegler Univ. of Zurich (CH) T i m e t a b l e ___________________________________________________________________________ Submission due January 7, 1997 Notification of acceptance February 28 Early registration due April 1 Final papers due April 14 Conference date May 13-16, 1997 S p o n s o r i n g O r g a n i z a t i o n s ___________________________________________________________________________ o Christian Doppler Laboratory for Expert Systems (Vienna University of Technology) o Oesterreichische Forschungsgemeinschaft o Austrian Federal Ministry of Science, Transport and the Arts o City of Vienna A d d i t i o n a l I n f o r m a t i o n ___________________________________________________________________________ For further information on the conference contact: Markus Peschl Dept. for Philosophy of Science University of Vienna Sensengasse 8/10 A-1090 Wien Austria Tel: +43-1-402-7601/41 Fax: +43-1-408-8838 Email: franz-markus.peschl at univie.ac.at General information about the Austrian Society for Cognitive Science can be found on the Society webpage or by contacting Alexander Riegler AILab, Dept. of Computer Science University of Zurich Winterthurerstr. 190 CH-8057 Zurich Switzerland Email: riegler at ifi.unizh.ch R e g i s t r a t i o n f o r m ___________________________________________________________________________ I will participate in the Workshop "New Trends in Cognitive Science (NTCS'97)" Full Name ........................................................................ Full Postal Address: ........................................................................ ........................................................................ ........................................................................ Telephone Number (Voice): Fax: ..................................... .................................. Email address: ........................................................................ [ ] I intend to submit a paper Payment in ATS (= Austrian Schillings; 1 US$ is currently about 11 ATS). This fee includes admission to talks, presentations, and proceedings: Before April 1st, 1997: [ ] Member * 1000 ATS (about 90 US$) [ ] Non-Member 1500 ATS (about 135 US$) [ ] Student Member ** 400 ATS (about 36 US$) [ ] Student Non-Member 1000 ATS (about 90 US$) After April 1st, 1997: [ ] Member * 1300 ATS (about 118 US$) [ ] Non-Member 1800 ATS (about 163 US$) [ ] Student Member ** 500 ATS (about 45 US$) [ ] Student Non-Member 1300 ATS (about 118 US$) *) Members of the Austrian Society of Cognitive Science **) Requires proof of valid student ID Total: .................... ATS [ ] Visa [ ] Master-/Eurocard Name of Cardholder ........................................ 
Credit Card Number ........................................ Expiration Date ................. Date: ................ Signature: ........................................ Please send this form by... o Email to franz-markus.peschl at univie.ac.at, or by o Fax to +43-1-408-8838 (attn. M.Peschl), or by o Mail to Markus Peschl, Dept. for Philosophy of Science, Univ. of Vienna, Sensengasse 8/10, A-1090 Wien, Austria From gaudiano at cns.bu.edu Wed Dec 4 13:26:55 1996 From: gaudiano at cns.bu.edu (Paolo Gaudiano) Date: Wed, 4 Dec 1996 13:26:55 -0500 Subject: NSF Funding opportunity Message-ID: <199612041826.NAA25658@mattapan.bu.edu> The following was forwarded to me by Paul Werbos. It is being sent to various newsgroups and mailing lists, so please accept my apologies if you receive multiple copies. Paolo Gaudiano ---------------------------------------------------------------------- NSF has just announced a new initiative in Learning and Intelligent Systems, with first-year funding of about $20 million. The announcement is up on the web site: http://www.ehr.nsf.gov/LIS/index.htm The scope of this initiative is not 100% clear from the outside, but the core will involve support of collaborations across major disciplines (e.g. biology, engineering, computer science, psychology, education), hopefully to develop a more unified understanding of learning mechanisms/models/designs/issues applicable to both natural and artificial systems. Neural network models are mentioned, along with several other paradigms, as a tool for understanding and implementing learning in a more unified way across disciplines. Technology development and education will also be a major part of this. The relative emphasis between basic science and software development is still not entirely clear, but both will clearly have a major role. It is clear, however, that there is White House interest. From ndxdendo at rrzn-serv.de Thu Dec 5 10:39:27 1996 From: ndxdendo at rrzn-serv.de (ndxdendo@rrzn-serv.de) Date: Thu, 5 Dec 1996 16:39:27 +0100 (MET) Subject: ICOBIP '97 - Conference Announcement Message-ID: <199612051539.QAA10365@sun1.rrzn-user.uni-hannover.de> A non-text attachment was scrubbed... Name: not available Type: text Size: 12224 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/cf2a83b5/attachment.ksh From jmoody at cogsci.ucsd.edu Mon Dec 9 10:46:17 1996 From: jmoody at cogsci.ucsd.edu (Jay Moody) Date: Mon, 9 Dec 1996 10:46:17 -0500 Subject: Good introductory text for Neural Net Analysis: PCA, Cluster, etc. Message-ID: To those of you who teach analysis of neural networks to students of modest background in statistics: I recently came across this very readable explanation of principal components analysis, cluster analysis, and multidimensional scaling (and more): _Multivariate_Statistical_Methods:_A_Primer. Bryan F. J. Manly. 1994. Chapman & Hall: New York. (paperback) Chapters of particular relevance to (my experience with) neural nets: 1 The material of multivariate analysis -- 5 examples of data/problems where multivariate analysis is useful (mostly from evolutionary biology) 2 Matrix Algebra -- 9-page explanation of essentials (matrix operations, eigenvalues, covariance matrices, etc.) 
6 Principal Components Analysis -- with 2 detailed examples 9 Cluster analysis 11 Multidimensional Scaling From smyth at galway.ICS.UCI.EDU Mon Dec 9 15:13:01 1996 From: smyth at galway.ICS.UCI.EDU (Padhraic Smyth) Date: Mon, 09 Dec 1996 12:13:01 -0800 Subject: Sixth International Workshop on AI and Statistics: Final Reminder Message-ID: <9612091213.aa03271@paris.ics.uci.edu> FINAL REMINDER SIXTH INTERNATIONAL AI AND STATISTICS WORKSHOP JANUARY 4TH-7TH, FORT LAUDERDALE, FLORIDA A final reminder that the AI and Statistics Workshop will be held from January 5th to January 7th at the Bahia Mar hotel in Fort Lauderdale, Florida. The workshop program will be preceded on January 4th by what promises to be a day of very interesting tutorials on such topics as - Conditional independence in statistics and AI (A. P. Dawid), - Bayesian time series analysis and forecasting (Mike West), - Learning in information agents (Tom Mitchell), and - Graphical models, neural networks, and machine learning algorithms (Mike Jordan). If you are planning on attending please note that the workshop hotel will relinquish the block of rooms reserved for workshop attendees on December 20th: since the hotel is fully booked it is essential you make reservations before the 20th. Full details (including registration forms and workshop program) are available at: http://www.stat.washington.edu/aistats97/ Padhraic Smyth General Chair, AI-Stats '97 From jdcohen+ at andrew.cmu.edu Tue Dec 10 08:45:56 1996 From: jdcohen+ at andrew.cmu.edu (Jonathan D Cohen) Date: Tue, 10 Dec 1996 08:45:56 -0500 (EST) Subject: postdoc position Message-ID: <8mfKaI_00iWl0462I0@andrew.cmu.edu> Postdoctoral Position: Computational Modeling of Neuromodulation and/or Prefrontal Cortex Function ---------------- Center for the Neural Basis of Cognition Carnegie Mellon University and the University of Pittsburgh ---------------- A postdoctoral position is available starting any time between now and September 1, 1997 for someone interested in pursuing computational modeling approaches to the role of neuromodulation and/or prefrontal cortical function in cognition. The nature of the position is flexible, depending upon the individual's interest and expertise. Approaches can be focused at the neurobiological level (e.g., modeling detailed physiological characteristics of neuromodulatory systems, such as locus coeruleus and/or dopaminergic nuclei, or the circuitry of prefrontal cortex), or at the more cognitive level (e.g., the nature of representations and/or the mechanisms involved in active maintenance of information within prefrontal cortex, and their role in working memory). The primary requirement for the position is a Ph.D. in the cognitive, computational, or neurosciences, and extensive experience with computational modeling work, either at the PDP/connectionist or detailed biophysical level. The candidate will be working directly with Jonathan Cohen within the Department of Psychology at CMU, and collaborating closely with Randy O'Reilly at the University of Colorado, Boulder. Other potential collaborations include members of the Center for the Neural Basis of Cognition (CNBC), including James McClelland, David Lewis, German Barrionuevo, Susan Sesack, G. Bard Ermentrout, as well as collaborators at other institutions, such as Gary Aston-Jones (Hahnemann University), Joseph LeDoux (NYU), and Peter Dayan (MIT). 
Available resources include direct access to state-of-the-art computing facilities within the CNBC (IBM SP-2 and SGI PowerChallenge) and the Pittsburgh Supercomputing Center, neuroimaging facilities (PET and 3T fMRI at the University of Pittsburgh), and clinical populations (Western Psychiatric Institute and Clinic). Carnegie Mellon University and the University of Pittsburgh are both equal opportunity employers; minorities and women are encouraged to apply. Inquiries can be directed to Jonathan Cohen (jdcohen at cmu.edu) or Randy O'Reilly (oreilly at flies.mit.edu). Applicants should send a CV, a small number of relevant publications, and the names and addresses of at least two references, to: Jonathan D. Cohen Department of Psychology Carnegie Mellon University Pittsburgh, PA 15213 (412) 268-5692 (voice) (412) 268-2810 (fax) From bishopc at helios.aston.ac.uk Wed Dec 11 10:24:37 1996 From: bishopc at helios.aston.ac.uk (Prof. Chris Bishop) Date: Wed, 11 Dec 1996 15:24:37 +0000 Subject: ROSENBAUM FELLOWSHIP Message-ID: <2149.9612111524@sun.aston.ac.uk> ROSENBAUM FELLOWSHIP Isaac Newton Institute University of Cambridge, U.K. Applications are invited for a Rosenbaum Visiting Fellowship at the prestigious Isaac Newton Institute for Mathematical Sciences in Cambridge. The Fellowship will allow the holder to spend 6 months at the Institute from July to December 1997 to coincide with the major scientific programme on Neural Networks and Machine Learning. The stipend will be 17,500 US dollars for the six-month period, and travel expenses may also be provided. To be eligible, candidates must have a PhD and be U.S. citizens or permanent residents or have resided in the U.S. for a minimum of four years. Informal enquiries may be addressed to: Professor Christopher M. Bishop Organiser, Neural Networks and Machine Learning Neural Computing Research Group Aston University Birmingham B4 7ET C.M.Bishop at aston.ac.uk Tel. +44 (0)121 333 4631 Fax. +44 (0)121 333 4586 Applications, including a CV, list of publications and the names of two referees, should be sent by 31 December 1996 to: The Director Professor Keith Moffatt Isaac Newton Institute for Mathematical Sciences 20 Clarkson Road Cambridge CB3 0EH, U.K. From schwenk at IRO.UMontreal.CA Wed Dec 11 12:06:03 1996 From: schwenk at IRO.UMontreal.CA (Holger Schwenk) Date: Wed, 11 Dec 1996 12:06:03 -0500 (EST) Subject: paper on on-line character recognition using "constraint tangent distance" Message-ID: <199612111706.MAA05225@grosse.iro.umontreal.ca> The following paper on on-line character recognition is available on the WEB at http://www.iro.umontreal.ca/~schwenk/papers/icpr96.ps.gz Comments are welcome Holger ------------------------------------------------------------------------------- Holger Schwenk phone: (514) 343-6111 ext 1655 fax: (514) 343-5834 LISA, Dept. IRO University of Montreal email: schwenk at iro.umontreal.ca 2920 Chemin de la tour, CP 6128 http://www.iro.umontreal.ca/~schwenk Montreal, Quebec, H3C 3J7 CANADA ------------------------------------------------------------------------------- Constraint Tangent Distance for On-line Character Recognition H. Schwenk and M. Milgram published in International Conference on Pattern Recognition (ICPR), pp. D:515-519, August 1996 Abstract: --------- In on-line character recognition we can observe two kinds of intra-class variations: small geometric deformations and completely different writing styles. 
We propose a new approach to deal with these problems by defining an extension of tangent distance (Simard et al., 1993), well known in off-line character recognition. The system has been implemented with a k-nearest neighbor classifier and a so-called diabolo classifier, respectively (Schwenk and Milgram, 1995). Both classifiers are invariant under transformations like rotation, scale or slope and can deal with variations in stroke order and writing direction. Results are presented for our digit database with more than 200 writers. From phkywong at uxmail.ust.hk Wed Dec 11 03:21:53 1996 From: phkywong at uxmail.ust.hk (Dr. Michael Wong) Date: Wed, 11 Dec 1996 16:21:53 +0800 Subject: Paper available Message-ID: <96Dec11.162157+0800_hkt.102351-9566+1215@uxmail.ust.hk> The following paper, to appear in Europhysics Letters, is now available via anonymous FTP. (4 pages) ============================================================================ FTP-host: physics.ust.hk FTP-files: pub/kymwong/actdyn.ps.gz Improving Pattern Reconstruction in Neural Networks by Activity Dynamics K. Y. Michael Wong Department of Physics, The Hong Kong University of Science and Technology, Clear Water Bay, Kowloon, Hong Kong. E-mail address: phkywong at usthk.ust.hk ABSTRACT I study the averaged dynamical behaviour of neural networks over an extended monitoring period, and consider pattern reconstruction procedures by activity clipping, selectively freezing, or sequentially freezing the dynamic nodes. They enable the retrieval precision to be improved, the basin of attraction to be widened, or the storage capacity to be increased, even when the information is not efficiently embedded in the synaptic weights. ============================================================================ FTP instructions: unix> ftp physics.ust.hk Name: anonymous Password: your full email address ftp> cd pub/kymwong ftp> get actdyn.ps.gz ftp> quit unix> gunzip actdyn.ps.gz unix> lpr actdyn.ps (or ghostview actdyn.ps) From cas-cns at cns.bu.edu Thu Dec 12 10:51:17 1996 From: cas-cns at cns.bu.edu (CAS/CNS) Date: Thu, 12 Dec 1996 10:51:17 -0500 Subject: Vision, Recognition, Action Message-ID: <199612121551.KAA28070@cns.bu.edu> ***** CALL FOR PAPERS ***** International Conference on VISION, RECOGNITION, ACTION: NEURAL MODELS OF MIND AND MACHINE May 28-31, 1997 Sponsored by the Center for Adaptive Systems and the Department of Cognitive and Neural Systems Boston University with financial support from DARPA and ONR This conference will include a day of tutorials (May 28) followed by 3 days of 21 invited lectures and contributed lectures and posters by experts on the biology and technology of how the brain and other intelligent systems see, understand, and act upon a changing world. The meeting program and updates, along with hotel and restaurant information, can be found at http://cns-web.bu.edu/cns-meeting/. WEDNESDAY, MAY 28, 1997: TUTORIALS Stephen Grossberg, "Vision, Brain, and Technology" (3 hours in two 1-1/2 hour lectures). Gail Carpenter, "Self-Organizing Neural Networks for Learning, Recognition, and Prediction: ART Architectures and Applications" (2 hours). Eric Schwartz, "Algorithms and Hardware for the Application of Space-Variant Active Vision to High Performance Machine Vision" (2 hours). 
THURSDAY, MAY 29 - SATURDAY, MAY 31, 1997: CONFIRMED INVITED LECTURERS Andreas Andreou, Stuart Anstis, Terrance Boult, Rodney Brooks, Gail Carpenter, Patrick Cavanagh, Robert Desimone, Patricia Goldman-Rakic, Stephen Grossberg, Michael Jordan, John Kalaska, Takeo Kanade, Ennio Mingolla, Lance Optican, Alex Pentland, Tomaso Poggio, Eric Schwartz, Robert Shapley, George Sperling, Larry Squire, and Allen Waxman. CALL FOR ABSTRACTS: Contributed abstracts for talks or posters must be received, in English, by January 31, 1997. Notification of acceptance will be given by February 28, 1997. A meeting registration fee must accompany each Abstract. See Registration Information below for details. The fee will be returned if the Abstract is not accepted for presentation and publication in the meeting proceedings. Each Abstract should fit on one 8" x 11" white page with 1" margins on all sides, single-column format, single-spaced, Times Roman or similar font of 10 points or larger, printed on one side of the page only. Fax submissions will not be accepted. Abstract title, author name(s), affiliation(s), mailing, and email address(es) should begin each Abstract. An accompanying cover letter should include: Full title of Abstract, corresponding author and presenting author name, address, telephone, fax, and email address. Preference for oral or poster presentation should be noted. Abstracts which do not meet these requirements or which are submitted with insufficient funds will be returned. The original and 3 copies of each Abstract should be sent to: CNS Meeting, c/o Cynthia Bradford, Boston University, Department of Cognitive and Neural Systems, 677 Beacon Street, Boston, MA 02215. REGISTRATION INFORMATION: To register, please fill out the enclosed registration form. Student registrations must be accompanied by a letter of verification from a department chairperson or faculty/research advisor. If accompanied by an Abstract or if paying by check, mail to the CNS Meeting address. If paying by credit card, mail to the CNS Meeting address, or fax to (617) 353-7755. STUDENT FELLOWSHIPS: Some fellowships for PhD students and postdocs are available to defray travel and living costs. The deadline for applying for fellowship support is January 31, 1997. Applicants will be notified by February 28, 1997. Each application should include the applicant's CV, including name; mailing address; email address; current student status; faculty or PhD research advisor's name, address, and email address; relevant courses and other educational data; and a list of research articles. A letter from the listed faculty or PhD advisor on official institutional stationery should accompany the application and summarize how the candidate may benefit from the meeting. Students who also submit an Abstract need to include the registration fee with their Abstract. ******************** REGISTRATION FORM (Please Type or Print) Vision, Recognition, Action: Neural Models of Mind and Machine Boston University, Boston, Massachusetts Tutorials: May 28, 1997 Meeting: May 29-31, 1997 Mr/Ms/Dr/Prof: Name: Affiliation: Address: City, State, Postal Code: Phone and Fax: Email: The conference registration fee includes the meeting program, reception, six coffee breaks, and the meeting proceedings. Two coffee breaks and a book of tutorial viewgraph copies will be covered by the tutorial registration fee. 
CHECK ONE: [ ] $55 Conference plus Tutorial (Regular) [ ] $40 Conference plus Tutorial (Student) [ ] $35 Conference Only (Regular) [ ] $25 Conference Only (Student) [ ] $30 Tutorial Only (Regular) [ ] $25 Tutorial Only (Student) METHOD OF PAYMENT: [ ] Enclosed is a check made payable to "Boston University". Checks must be made payable in US dollars and issued by a US correspondent bank. Each registrant is responsible for any and all bank charges. [ ] I wish to pay my fees by credit card (MasterCard, Visa, or Discover Card only). Type of card: Name as it appears on the card: Account number: Expiration date: Signature and date: ******************** From ruppin at math.tau.ac.il Thu Dec 12 18:10:35 1996 From: ruppin at math.tau.ac.il (Eytan Ruppin) Date: Fri, 13 Dec 1996 01:10:35 +0200 (GMT+0200) Subject: Last-CFP:-Modeling-Brain-Disorders Message-ID: <199612122310.BAA27410@gemini.math.tau.ac.il> CALL FOR SUBMISSIONS Special Issue of the Journal "Artificial Intelligence in Medicine" (Published by Elsevier) Theme: COMPUTATIONAL MODELING OF BRAIN DISORDERS Guest-Editors: Eytan Ruppin (Tel-Aviv University) & James A. Reggia (University of Maryland) ------------------------------------------------ **** DEADLINE FOR SUBMISSION IS MARCH 15TH, 1997 **** ------------------------------------------------ BACKGROUND As computational methods for brain modeling have advanced during the last several years, there has been an increasing interest in adopting them to study brain disorders in neurology, neuropsychology, and psychiatry. Models of Alzheimer's disease, epilepsy, aphasia, dyslexia, Parkinson's disease, stroke and schizophrenia have recently been studied to obtain a better understanding of the underlying pathophysiological processes. While computer models have the disadvantage of simplifying the underlying neurobiology and the pathophysiology, they also have remarkable advantages: They permit precise and systematic control of the model variables, and an arbitrarily large number of "subjects". They are open to detailed inspection, in isolation, of the influence of various metabolic and neural variables on the disease progression, in the hope of gaining insight into why observed behaviors occur. Ultimately, one seeks a sufficiently powerful model that can be used to suggest new pharmacological interventions and rehabilitative actions. OBJECTIVE OF SPECIAL ISSUE The objective of this special issue on modeling brain disorders is to report on the recent studies in this field. The main goal is to increase the awareness of the AI medical community of this research, currently primarily performed by members of the neural networks and 'connectionist' community. By bringing together a series of such brain disorder modeling papers we strive to produce a contemporary overview of the kinds of problems and solutions that this growing research field has generated, and to point to future promising research directions. More specifically, papers are expected to cover one or more of the following topics: -- Specific neural models of brain disorders, expressing the link between their pathogenesis and clinical manifestations. -- Computational models of pathological alterations in basic neural, synaptic and metabolic processes, that may relate to the generation of brain disorders in a significant manner. -- Applications of neural networks that shed light on the pathogenic processes that underlie brain disorders, or explore their temporal evolution and clinical course. 
-- Methodological issues involved in constructing computational models of brain disorders: obtaining sufficient data, visualizing high-dimensional complex behavior, and testing and validating these models. -- Bridging the apparent gap between functional imaging investigations and current neural modeling studies, arising from their distinct spatio-temporal resolution. SCHEDULE All the submitted manuscripts will be subject to a rigorous review process. The special issue will include 5 papers of 15-20 pages each, plus an editorial. Manuscripts should be prepared in accordance with the journal "submission guidelines", which are available on request, and may also be retrieved from http://www.math.tau.ac.il/~ruppin. March 15, 1997 Receipt of full papers. Three copies of a manuscript should be sent to: Eytan Ruppin Department of Computer Science School of Mathematics Tel-Aviv University Tel-Aviv, Israel, 69978. August 1, 1997 Notification of acceptance October 1, 1997 Receipt of final versions of manuscripts June 1998 Publication of AIM special issue From giles at research.nj.nec.com Fri Dec 13 09:52:54 1996 From: giles at research.nj.nec.com (Lee Giles) Date: Fri, 13 Dec 96 09:52:54 EST Subject: TR available Message-ID: <9612131452.AA00658@alta> The following Technical Report is available via the University of Maryland Department of Computer Science and the NEC Research Institute archives: ____________________________________________________________________ HOW EMBEDDED MEMORY IN RECURRENT NEURAL NETWORK ARCHITECTURES HELPS LEARNING LONG-TERM DEPENDENCIES Technical Report CS-TR-3626 and UMIACS-TR-96-28, Institute for Advanced Computer Studies, University of Maryland, College Park, MD 20742 Tsungnan Lin{1,2}, Bill G. Horne{1}, C. Lee Giles{1,3} {1}NEC Research Institute, 4 Independence Way, Princeton, NJ 08540 {2}Department of Electrical Engineering, Princeton University, Princeton, NJ 08540 {3}UMIACS, University of Maryland, College Park, MD 20742 ABSTRACT Learning long-term temporal dependencies with recurrent neural networks can be a difficult problem. It has recently been shown that a class of recurrent neural networks called NARX networks perform much better than conventional recurrent neural networks for learning certain simple long-term dependency problems. The intuitive explanation for this behavior is that the output memories of a NARX network can be manifested as jump-ahead connections in the time-unfolded network. These jump-ahead connections can propagate gradient information more efficiently, thus reducing the sensitivity of the network to long-term dependencies. This work gives empirical justification to our hypothesis that similar improvements in learning long-term dependencies can be achieved with other classes of recurrent neural network architectures simply by increasing the order of the embedded memory. In particular we explore the impact of learning simple long-term dependency problems on three classes of recurrent neural network architectures: globally recurrent networks, locally recurrent networks, and NARX (output feedback) networks. Comparing the performance of these architectures with different orders of embedded memory on two simple long-term dependency problems shows that all of these classes of network architectures demonstrate significant improvement on learning long-term dependencies when the orders of embedded memory are increased. 
These results can be important to a user comfortable with a specific recurrent neural network architecture, because simply increasing the embedded memory order will make the architecture more robust to the problem of long-term dependency learning. ------------------------------------------------------------------- KEYWORDS: discrete-time, memory, long-term dependencies, recurrent neural networks, training, gradient-descent PAGES: 15 FIGURES: 7 TABLES: 2 ------------------------------------------------------------------- http://www.neci.nj.nec.com/homepages/giles.html http://www.cs.umd.edu/TRs/TR-no-abs.html or ftp://ftp.nj.nec.com/pub/giles/papers/UMD-CS-TR-3626.recurrent.arch.long.term.ps.Z ------------------------------------------------------------------------------------ -- C. Lee Giles / Computer Sciences / NEC Research Institute / 4 Independence Way / Princeton, NJ 08540, USA / 609-951-2642 / Fax 2482 www.neci.nj.nec.com/homepages/giles.html == From erikf at sans.kth.se Fri Dec 13 05:41:45 1996 From: erikf at sans.kth.se (erikf@sans.kth.se) Date: Fri, 13 Dec 1996 11:41:45 +0100 Subject: PhD Thesis Available Message-ID: <199612131041.LAA11528@sans03.nada.kth.se> My PhD thesis is available at my home page: http://www.nada.kth.se/~erikf/publications.html It is also available for anonymous ftp downloading: ftp://ftp.nada.kth.se/pub/documents/SANS/reports/ps/ef-thesis.tar.Z ftp://ftp.nada.kth.se/pub/documents/SANS/reports/ps/ef-thesis-summary.ps.Z The complete thesis is 2.4Mb and un-tars into 15.5Mb of postscript files. The summary is 530kb and prints on 68 pages. Biophysical Simulation of Cortical Associative Memory Erik Fransen Studies of Artificial Neural Systems Department of Numerical Analysis and Computing Science Royal Institute of Technology, S-100 44 Stockholm, Sweden erikf at sans.kth.se The associative memory function of the brain is an active area of experimental and theoretical research. This thesis describes the construction of a model of cortical auto-associative memory. Conceptually, it is based on Hebb's cell assembly hypothesis. The quantitative description comes from a class of artificial neural networks, ANN, with recurrent connectivity and attractor dynamics. More specifically, this work has concentrated on problems related to how this formal network description could be translated into a neurobiological model. In this work I have used a relatively detailed description of the neurons which includes changes over time for the potential and current distributions of the different parts of the cell, as well as calcium ion flux and some of its electrophysiological effects. The features of this associative memory model are interpreted in Gestalt psychological terms and discussed in relation to features of priming, as gained from memory psychological experiments. The model output is compared to single cell recordings in working memory experiments as well as to results from a slice preparation of the hippocampus region. A hypothesis for the functional role of the variable resting potentials and background activities that are seen in experiments has been put forward. This hypothesis is based on the bias values which are produced by the learning in an ANN and result in different a priori firing probabilities of the cells. It is also shown that it is possible to increase the degree of similarity to the cortical circuitry with the cortical column model. This model can function as a content-addressable memory, as expected. 
Initially, the network structure and the cell types have to be determined. The next part of the work is the identification of which cell properties should be modeled. The initial results include a demonstration that cells described at this level of detail can support the assembly operations (persistent after-activity, pattern completion and pattern rivalry) shown for ANNs. The importance of adequate cell properties for network function was confirmed. For example, with pyramidal type cells the network produced the desired assembly operations, but with motoneuron type cells it did not. There are also results which are not dependent on the assembly hypothesis. The network can stabilize in a relatively short time and at sub-maximal cell firing frequencies despite time delays and the recurrent connectivity which provides positive feedback. Further, the network activity may be controlled by modeling the effects of neuromodulators such as serotonin. Instances of spike synchronization and burst synchronization were found in networks that did not have any inhibitory cells. It is concluded that this type of attractor network model can be used as a valuable tool in the study of cortical associative memory, and that detailed cell models are very useful for testing the biological relevance of such models. Keywords: after-activity, attractor network, biologically realistic neural networks, computational neuroscience, computer simulation, cortical associative memory, Hebbian cell assemblies, neural modeling, recurrent artificial neural network, pattern completion, pattern rivalry Fransen E. Thesis, 1996 Biophysical Simulation of Cortical Associative Memory. Dept. of Numerical Analysis and Computing Science, Royal Institute of Technology, Stockholm, Sweden, ISBN 91-7170-689-5, TRITA-NA-P96/28 __---~~~--___ |----------------------------------| _____________ _-~ )----+ Erik Fransen +----| Studies of |\ ( )---+ Department of Numerical Analysis +----| Artificial | | ( ___-- )--+ and Computing Science +----| Neural | | (___-~ __) | Royal Institute of Technology | | Systems | | (____ _--~~ ) | S-100 44 Stockholm, Sweden | |____________|_| `~~\ ~--~~ | EMail: erikf at sans.kth.se | _____|___|_____ \--\ | http://www.nada.kth.se/~erikf | /_+46-8-7906904/ |----------------------------------| From ejua71 at tattoo.ed.ac.uk Tue Dec 17 14:32:27 1996 From: ejua71 at tattoo.ed.ac.uk (J A Bullinaria) Date: Tue, 17 Dec 96 19:32:27 GMT Subject: CFP: NCPW4 Message-ID: <9612171932.aa03068@uk.ac.ed.tattoo> 4th Neural Computation and Psychology Workshop Connectionist Representations : Theory and Practice University of London, England Wednesday 9th April - Friday 11th April 1997 AIMS AND OBJECTIVES This workshop is the fourth in a series, following on from the first at the University of Wales, Bangor (with theme "Neurodynamics and Psychology"), the second at the University of Edinburgh, Scotland ("Memory and Language") and the third at the University of Stirling, Scotland ("Perception"). The general aim is to bring together researchers from such diverse disciplines as artificial intelligence, applied mathematics, cognitive science, computer science, neurobiology, philosophy and psychology to discuss their work on the connectionist modelling of psychology. Next year's workshop is to be hosted jointly by members of the Psychology Departments of Birkbeck College London and University College London. As in previous years there will be a theme to the workshop.
We think that next year's theme is sufficiently wide-ranging and important that researchers in all areas of Neural Computation and Psychology will find it relevant and have something to say on the subject. The theme is to be: "Connectionist Representations : Theory and Practice". This covers many important issues ranging from the philosophical (such as the grounding problem) to the physiological (what can connectionist representations tell us about real neural systems?) to the technical (such as what is necessary to get specific models to work). The organisation of the final program will depend on the submissions received, but particular topics might, for example, include: * Understanding representations developed in trained networks. * Merits of local vs. distributed representations. * Semantic representation and hierarchies. * The problem of serial order. * The representation of time. * Neural networks and the neurophysiology/neuropsychology of representation. As in previous years we aim to keep the workshop fairly small, informal and single track. As always, participants bringing expertise from outside the UK are particularly welcome. PROVISIONAL INVITED SPEAKERS Roland Baddeley (Oxford) Dennis Norris (APU Cambridge) Gordon Brown (Warwick) Mike Page (APU Cambridge) Tony Browne (Mid Kent) Tim Shallice (UCL) Neil Burgess (UCL) Leslie Smith (Stirling) Nick Chater (Warwick) Chris Thornton (Sussex) Glyn Humphreys (Birmingham) Janet Vousden (Warwick) Bob Kentridge (Durham) CALL FOR ABSTRACTS In addition to our invited speakers, we invite other potential participants to submit abstracts of proposed talks and/or posters. As in previous years, after the workshop, selected presenters will be invited to produce a written version of their talk or poster for inclusion in a refereed proceedings. Abstracts (one page) should arrive by email at "ncpw4 at psychol.ucl.ac.uk" before 31st January 1997. Acceptance notices, registration details and a provisional program will be sent out in mid-February. REGISTRATION, FOOD AND ACCOMMODATION The workshop will be held at University College London, which is situated in the centre of London, near the British Museum and within easy walking distance of the West End and many of London's major attractions. The conference registration fee (which includes lunch and morning and afternoon tea/coffee each day) will be approximately 60 pounds. A special conference dinner is planned for the Thursday evening, costing approx. 20 pounds. Accommodation can be arranged in student residences or in local hotels, according to budget. The conference/accommodation area is easily accessible by the London Underground system ("The Tube"), with direct lines from London Heathrow Airport and all the major intercity train stations. Additional information will appear nearer the workshop date on the conference web page at: "http://prospero.psychol.ucl.ac.uk/ncpw4/". ORGANISING COMMITTEE John Bullinaria (Birkbeck College London) Dave Glasspool (University College London) George Houghton (University College London) CONTACT DETAILS Workshop email address for all correspondence: ncpw4 at psychol.ucl.ac.uk John Bullinaria, NCPW4, Centre for Speech and Language, Department of Psychology, Birkbeck College, Malet Street, London WC1E 7HX, UK. Phone: +44 171 631 6330, Fax: +44 171 631 6587 Email: j.bullinaria at psyc.bbk.ac.uk Dave Glasspool, NCPW4, Department of Psychology, University College London, Gower Street, London WC1E 6BT, UK. Phone: +44 171 380 7777 Xtn. 5418.
Fax: +44 171 436 4276 Email: d.glasspool at psychol.ucl.ac.uk George Houghton, NCPW4, Department of Psychology, University College London, Gower Street, London WC1E 6BT, UK. Phone: +44 171 380 7777 Xtn. 5394. Fax: +44 171 436 4276 Email: g.houghton at psychol.ucl.ac.uk From marco at McCulloch.ing.UniFI.IT Tue Dec 17 11:40:55 1996 From: marco at McCulloch.ing.UniFI.IT (Marco Gori) Date: Tue, 17 Dec 1996 17:40:55 +0100 Subject: postdoc fellowships at University of Siena (Italy) Message-ID: <9612171640.AA11169@McCulloch.ing.UniFI.IT> ============================================================================== POST-DOC FELLOWSHIPS AT UNIVERSITY OF SIENA Faculty of Engineering Dipartimento di Ingegneria dell'Informazione Via Roma, 56 - 53100 Siena (Italy) ============================================================================== Two post-doctoral fellowships are available at Dipartimento di Ingegneria dell'Informazione, University of Siena. Each position is for two years. Among other research areas, people with experience in the fields of recurrent networks, hybrid systems, and combinatorial optimization are highly encouraged to apply. Research at DII in the field of neural nets is carried out jointly with people at Dipartimento di Sistemi e Informatica, University of Florence. The candidates must send the application to Universita' di Siena Sezione Dottorato di Ricerca Via Banchi di Sotto, 46 53100 Siena (Italy) For further information, please send an e-mail to me (see my signature). Best regards, -- Marco Gori. ================================================================================================== Marco Gori Email: marco at mcculloch.ing.unifi.it WWW: http://www-dsi.ing.unifi.it/neural Universita' di Siena c/o Universita' di Firenze V. Roma, 56 - Siena (Italy) V. S. Marta, 3 - 50139 Firenze (Italy) Voice: +39 577 26-36-04; Fax: +39 577 26-36-02 Voice: +39 55 479-6265; Fax: +39 55 479-6363 ================================================================================================== From A.Sharkey at dcs.shef.ac.uk Fri Dec 20 08:32:00 1996 From: A.Sharkey at dcs.shef.ac.uk (Amanda Sharkey) Date: Fri, 20 Dec 96 13:32:00 GMT Subject: Special Issue of Connection Science Message-ID: <9612201332.AA18966@gw.dcs.shef.ac.uk> SPECIAL ISSUE of Connection Science, Volume 8, Numbers 3 & 4, December 1996. COMBINING ARTIFICIAL NEURAL NETS: ENSEMBLE APPROACHES. ----------------------------------------------------- Amanda J.C. Sharkey. On Combining Artificial Neural Nets. 299 Sherif Hashem. Effects of Collinearity on Combining Neural Networks. 315 David W. Opitz & Jude W. Shavlik. Actively Searching for an Effective Neural Network Ensemble. 337 Yuval Raviv & Nathan Intrator. Bootstrapping with Noise: An Effective Regularization Technique. 355 Bruce E. Rosen. Ensemble Learning Using Decorrelated Neural Networks. 373 Kagan Tumer & Joydeep Ghosh. Error Correlation and Error Reduction in Ensemble Classifiers. 385 Bambang Parmanto, Paul W. Munro & Howard R. Doyle. Reducing Variance of Committee Prediction with Resampling Techniques. 405 Peter A. Zhilkin & Ray L. Somorjai. Application of Several Methods of Classification Fusion to Magnetic Resonance Spectra.
427 For information about Connection Science journal, see http://www.carfax.co.uk/cos-con.htm From sontag at control.rutgers.edu Fri Dec 20 11:31:58 1996 From: sontag at control.rutgers.edu (Eduardo Sontag) Date: Fri, 20 Dec 1996 11:31:58 -0500 Subject: TR available - Learning problems for recurrent nets Message-ID: <199612201631.LAA01544@control.rutgers.edu> VAPNIK-CHERVONENKIS DIMENSION OF RECURRENT NEURAL NETWORKS Pascal Koiran, LIP-Lyon, France Eduardo D. Sontag, Rutgers, USA DIMACS Tech Report 96-56. (Summary to appear in Proceedings of Third European Conference on Computational Learning Theory, Jerusalem, March 17-19, 1997.) ABSTRACT This paper provides lower and upper bounds for the VC dimension of recurrent networks. Several types of activation functions are discussed, including threshold, polynomial, piecewise-polynomial and sigmoidal functions. The bounds depend on two independent parameters: the number w of weights in the network, and the length k of the input sequence. Ignoring multiplicative constants, the main results say roughly the following: 1. For architectures whose activation is any fixed nonlinear polynomial, the VC dimension is proportional to wk. 2. For architectures whose activation is any fixed piecewise polynomial, the VC dimension is between wk and w**2 k. 3. For architectures with threshold activations, the VC dimension is between wlog(k/w) and min{wklog(wk),w**2+wlog(wk)}. 4. For the standard sigmoid tanh(x), the VC dimension is between wk and w**4 k**2. ============================================================================ The paper can be retrieved from the DIMACS archive: http://dimacs.rutgers.edu/TechnicalReports/1996.html as well as from Sontag's HomePage: http://www.math.rutgers.edu/~sontag (follow link to "online papers"). Many other related papers can also be found at this latter site. If Web access is inconvenient, it is also possible to use anonymous FTP: ftp dimacs.rutgers.edu login: anonymous cd pub/dimacs/TechnicalReports/TechReports/1996/ bin get 96-56.ps.gz Once the file is retrieved, use gunzip to uncompress and then print as postscript. ============================================================================ Comments welcome. Happy connecting holidays!
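For quick reference, the four results above can be restated compactly in LaTeX (w = number of weights, k = input sequence length, multiplicative constants suppressed); this is only a transcription of the abstract's bounds, not a sharpening of them:

    \begin{align*}
    \text{fixed nonlinear polynomial:} &\quad \mathrm{VCdim} = \Theta(wk)\\
    \text{fixed piecewise polynomial:} &\quad wk \;\lesssim\; \mathrm{VCdim} \;\lesssim\; w^{2}k\\
    \text{threshold:} &\quad w\log(k/w) \;\lesssim\; \mathrm{VCdim} \;\lesssim\;
        \min\{\, wk\log(wk),\; w^{2}+w\log(wk) \,\}\\
    \text{sigmoid } \tanh(x): &\quad wk \;\lesssim\; \mathrm{VCdim} \;\lesssim\; w^{4}k^{2}
    \end{align*}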
From seckel at klab.caltech.edu Fri Dec 20 12:50:44 1996 From: seckel at klab.caltech.edu (Al Seckel) Date: Fri, 20 Dec 1996 09:50:44 -0800 (PST) Subject: Caltech site on brain and cognitive science, illusio Message-ID: <199612201750.JAA25870@thales.klab.caltech.edu> Greetings, Christof Koch of the California Institute of Technology and I have been actively researching and studying the neuronal correlates of visual and other sensory illusions. In this regard we have amassed the world's largest collection of illusions, most of which are unpublished. We have put together a massive multimedia project where the user can vary critical parameters on each illusion, thereby testing the underlying mechanism. This has never been possible before in the printed medium. We have also put up an enormous web site on illusions, perception, and brain and cognitive science for interested professionals and laypeople, complete with interactive demonstrations, illusionary artwork, puzzles, bibliographies, recommended school projects, merchandise, and the like. Much of the material on the site is unpublished. We would very much appreciate it if you could let your subscribers know about this site, as it contains up-to-date scientific explanations and demonstrations not available anywhere else and of extreme interest to our community. The present web address is www.lainet.com/illusions After Monday it will be www.illusionworks.com Thanks very much! al From hochreit at informatik.tu-muenchen.de Mon Dec 30 06:32:48 1996 From: hochreit at informatik.tu-muenchen.de (Josef Hochreiter) Date: Mon, 30 Dec 1996 12:32:48 +0100 Subject: LSTM paper announcement Message-ID: <96Dec30.123252+0100met_dst.49137+394@papa.informatik.tu-muenchen.de> LONG SHORT-TERM MEMORY Sepp Hochreiter, TUM Juergen Schmidhuber, IDSIA Substantially revised and extended Version 3.0 of TR FKI-207-95 (32 pages, 130 KB; formerly 8 pages, 50 KB), with numerous additional experiments and details. Abstract. Learning to store information over extended time intervals via recurrent backpropagation takes a very long time, mostly due to insufficient, decaying error back flow. We briefly review Hochreiter's 1991 analysis of this problem, then address it by introducing a novel, efficient method called "Long Short-Term Memory" (LSTM). LSTM can learn to bridge time lags in excess of 1000 steps by enforcing constant error flow through "constant error carrousels" (CECs) within special units. Multiplicative gate units learn to open and close access to the CEC. LSTM's update complexity per time step is O(W), where W is the number of weights. In comparisons with RTRL, BPTT, Recurrent Cascade-Correlation, Elman nets, and Neural Sequence Chunking, LSTM leads to many more successful runs, and learns much faster. LSTM also solves complex long time lag tasks that have never been solved by previous recurrent net algorithms. LSTM works with local, distributed, real-valued, and noisy pattern representations. Recent spin-off papers: LSTM can solve hard long time lag problems. To appear in NIPS 9, MIT Press, Cambridge MA, 1997. Bridging long time lags by weight guessing and "Long Short-Term Memory". In F. L. Silva, J. C. Principe, L. B. Almeida, eds., Frontiers in Artificial Intelligence and Applications, Volume 37, pages 65-72, IOS Press, Amsterdam, Netherlands, 1996. _______________________________________________________________________ WWW/FTP pointers: ftp://flop.informatik.tu-muenchen.de/pub/fki/fki-207-95rev.ps.gz ftp://ftp.idsia.ch/pub/juergen/lstm.ps.gz For additional recurrent net papers see our home pages. For instance, the original analysis of recurrent nets' error flow and long time lag problems is in Sepp's 1991 thesis (p. 19-21). http://www7.informatik.tu-muenchen.de/~hochreit/pub.html http://www.idsia.ch/~juergen/onlinepub.html Happy new year! Sepp & Juergen PS: Why don't you stop by at IDSIA and give a talk next time you are near Switzerland or Italy?
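To make the mechanism in the abstract concrete, here is a minimal forward-pass sketch of a single memory cell in Python/numpy: an internal state carried from step to step with self-connection weight 1.0 (the constant error carousel), guarded by multiplicative input and output gates. This is a schematic toy under simplifying assumptions (scalar cell state, tanh squashing, no training loop, illustrative names), not the authors' implementation.

    import numpy as np

    def sigmoid(a):
        return 1.0 / (1.0 + np.exp(-a))

    class MemoryCell:
        """Toy LSTM-style memory cell (input and output gates only)."""
        def __init__(self, n_in, seed=1):
            rng = np.random.default_rng(seed)
            self.w_in, self.w_ig, self.w_og = rng.normal(0.0, 0.1, (3, n_in))

        def run(self, xs):
            s, ys = 0.0, []                      # s: cell state on the CEC
            for x in xs:
                g_in = sigmoid(self.w_ig @ x)    # input gate: controls writing
                g_out = sigmoid(self.w_og @ x)   # output gate: controls reading
                s = s + g_in * np.tanh(self.w_in @ x)  # state carried with weight 1.0
                ys.append(g_out * np.tanh(s))
            return np.array(ys)

    cell = MemoryCell(n_in=3)
    print(cell.run(np.ones((4, 3))))

Because the state is carried forward with a fixed weight of 1.0, the error signal flowing back along s neither explodes nor decays, which is the intuition behind the 1000-step time lags mentioned above.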
From angelo at crc.ricoh.com Tue Dec 31 14:00:33 1996 From: angelo at crc.ricoh.com (Michael Angelo (496-5735)) Date: Tue, 31 Dec 1996 11:00:33 -0800 Subject: Job Opening (in Menlo Park, CA.) Message-ID: <199612311900.LAA02756@jaguar.crc.ricoh.com> The Ricoh California Research Center's Machine Learning and Perception Group invites exceptionally talented candidates to apply for a position as Research Scientist in Information Technology Position description: * We seek applicants to join a small team of scientists and engineers exploring the use of machine learning and pattern recognition techniques in the general area of office information systems. Past and ongoing projects include o computer lipreading and speech-based interfaces o theory and application of neural network pruning methods o providing paper and electronic documents with novel functionality o theory for VLSI implementations of learning algorithms o novel human-machine interfaces o applications of the world-wide web * Ricoh CRC is a small center near Stanford University and other Silicon Valley landmarks; the atmosphere is collegial and exciting, and provides opportunities to expand Ricoh's products and services, travel nationally and internationally to professional conferences and presentations, publish in journals, and otherwise participate in the broader technical and professional community. Candidate requirements: * Ph.D. degree in Electrical Engineering, Computer Science or related field. (In exceptional cases, an M.S. degree with relevant work experience will suffice.) * Exceptionally strong C programming and Unix skills (experimental, not necessarily production), with experience in programming mathematical algorithms. C++, Java, Mathematica, MatLab and some parallel language are desirable. * Knowledge of neural networks, statistical and syntactic pattern recognition, image processing, handwriting recognition, natural language processing, and related topics is highly desirable. * Strong communication and organizational skills and the ability to learn quickly and to work both independently with minimal instruction and as part of a small team. Application deadline: * January 30, 1997 (hardcopy required -- see below). ---------------------------------------------------------------------------- RICOH California Research Center (RCRC): RCRC is a small research center in Menlo Park, CA, near the Stanford University campus and other Silicon Valley landmarks. The roughly 20 researchers focus on image compression and processing, pattern recognition, image and document analysis, artificial intelligence, machine learning, electronic service, and novel hardware for implementing computationally expensive algorithms. RCRC is a part of RICOH Corporation, the wholly owned subsidiary of RICOH Company, Ltd. in Japan. RICOH is a pioneer in facsimile, copiers, optical equipment, office automation products and more. Ricoh Corporation is an Equal Employment Opportunity Employer. ---------------------------------------------------------------------------- Please send any questions by e-mail to the address below, and type "Programming job" as your header line. Full applications (which must include a resume and the names and addresses of at least two people familiar with your work) should be sent by surface mail (no e-mail, ftp or html applications will be accepted) to: Dr. David G. Stork Chief Scientist RICOH California Research Center 2882 Sand Hill Road, Suite 115 Menlo Park CA 94025 stork at crc.ricoh.com ---------------------------------------------------------------------------- Web Version: http://www.crc.ricoh.com/jobs/MLPjob.html
From S.Renals at dcs.shef.ac.uk Mon Dec 2 06:41:05 1996 From: S.Renals at dcs.shef.ac.uk (Steve Renals) Date: Mon, 2 Dec 1996 11:41:05 GMT Subject: Postdoc: Speech Recognition at Sheffield University (UK) Message-ID: <199612021141.LAA01116@elvis.dcs.shef.ac.uk> University of Sheffield Department of Computer Science Research Associate in Continuous Speech Recognition (Ref: R1039) Applications are invited for the above post, which is tenable for three years from February 1997. The post is part of an EU Long Term Research project (THISL) that will develop a system for the indexing and retrieval of information from large speech recordings and TV/radio broadcasts. The main emphasis of the work will be developing hybrid connectionist/HMM algorithms and systems for very large vocabulary broadcast speech recognition. Candidates for the post will be expected to hold a PhD in a relevant discipline (e.g. Electrical Engineering, Computer Science, Applied Mathematics), or to have acquired equivalent experience. The successful candidate will probably have had research experience in the area of speech recognition, neural computing or language modelling. Salary will be in the range £14,317 to £19,948. Closing date for applications: 7 January 1997. For further information about the post, contact Steve Renals or see the web page: http://www.dcs.shef.ac.uk/research/groups/spandh/projects/thisl.html Application forms are available from the Director of Human Resources, University of Sheffield, Western Bank, Sheffield S10 2TN, UK, tel: +44-114-279-9800, email: jobs at sheffield.ac.uk, citing reference R1039. -------------------------------------------------------------------------- Steve Renals mailto:s.renals at dcs.shef.ac.uk Dept of Computer Science http://www.dcs.shef.ac.uk/~sjr/ Sheffield University phone: +44-114-222-1836 Regent Court fax: +44-114-278-0972 211 Portobello Street Sheffield S1 4DP UK From kruschke at croton.psych.indiana.edu Mon Dec 2 13:04:02 1996 From: kruschke at croton.psych.indiana.edu (John Kruschke) Date: Mon, 2 Dec 1996 13:04:02 -0500 (EST) Subject: TR Announcement: Rules and exemplars in category learning Message-ID: <9612021804.AA01457@croton.psych.indiana.edu> Rules and Exemplars in Category Learning Michael A. Erickson and John K. Kruschke Indiana University, Bloomington Psychological theories of categorization have generally focused on either rule- or exemplar-based explanations of categorization. We present two experiments that show evidence of both rule induction and exemplar encoding, and we present a connectionist model (ATRIUM) that specifies a mechanism for combining rule- and exemplar-based representation. In both experiments participants learned to classify items, most of which followed a simple rule although there were a few, frequently occurring exceptions. Experiment 1 examined how people extrapolate beyond the range of trained instances. Experiment 2 examined the effects of instance frequency on generalization to novel cases. We found that categorization behavior was well described by the model, in which exemplar representation is used for both rule and exception processing. A key element in correctly modeling categorization in tasks such as these was capturing the interaction between the rule- and exemplar-based representational structures using shifts of attention between rules and exemplars.
This report is also available for electronic retrieval (uncompressed PostScript, 589 Kbytes) from http://www.indiana.edu/~kruschke/ek96_abstract.html A very limited number of paper copies are also available; request Cognitive Science Technical Report #183, by Erickson & Kruschke, from iucogsci at indiana.edu -- John K. Kruschke office: (812) 855-3192 Dept. of Psychology fax: (812) 855-4691 Indiana University http://www.indiana.edu/~kruschke/ Bloomington, IN 47405-1301 kruschke at indiana.edu From nq6 at columbia.edu Tue Dec 3 21:11:20 1996 From: nq6 at columbia.edu (Ning Qian) Date: Tue, 3 Dec 1996 21:11:20 -0500 (EST) Subject: stereo paper available Message-ID: <199612040211.VAA18249@labdien.cc.columbia.edu> The following paper (and some related ones, see below) on stereo vision can be downloaded from the web site: http://brahms.cpmc.columbia.edu/ Physiological Computation of Binocular Disparity Ning Qian and Yudong Zhu (to appear in Vision Research) We previously proposed a physiologically realistic model for stereo vision based on the quantitative binocular receptive field profiles mapped by Freeman and coworkers. Here we present several new results about the model that shed light on the physiological processes involved in disparity computation. First, we show that our model can be extended to a much more general class of receptive field profiles than the commonly used Gabor functions. Second, we demonstrate that there is, however, an advantage of using the Gabor filters: Similar to our perception, the stereo algorithm with the Gabor filters has a small bias towards zero disparity. Third, we prove that the complex cells as described by Freeman et al. compute disparity by effectively summing up two related cross products between the band-pass filtered left and right retinal image patches. This operation is related to cross-correlation but it overcomes some major problems with the standard correlator. Fourth, we demonstrate that as few as two complex cells at each spatial location are sufficient for a reasonable estimation of binocular disparity. Fifth, we find that our model can be significantly improved by considering the fact that complex cell receptive fields are on average larger than those of simple cells. This fact is incorporated into the model by averaging over several quadrature pairs of simple cells with nearby and overlapping receptive fields to construct a model complex cell. The disparity tuning curve of the resulting complex cell is much more reliable than that constructed from a single quadrature pair of simple cells used previously, and the computed disparity maps for random dot stereograms with the new algorithm are very similar to human perception, with sharp transitions at disparity boundaries. Finally, we show that under most circumstances our algorithm works equally well with either of the two well-known receptive field models in the literature. Related papers on the same web site: "A Physiological Model for Motion-stereo Integration and a Unified Explanation of Pulfrich-like Phenomena", Ning Qian and Richard A. Andersen, Vision Research, (in press). "Binocular Receptive Field Profiles, Disparity Tuning and Characteristic Disparity" Yudong Zhu and Ning Qian, Neural Computation, 1996, 8:1647-1677. "Computing Stereo Disparity and Motion with Known Binocular Cell Properties", Ning Qian, Neural Computation, 1994, 6:390-404.
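As a concrete illustration of the quadrature-pair construction the abstract refers to, here is a toy Python/numpy sketch of an energy-model binocular complex cell: the squared responses of a quadrature pair of Gabor-filtered left/right patches are summed, and an interocular phase shift plays the role of the cell's preferred disparity. All parameters and names are illustrative assumptions, not the paper's actual model.

    import numpy as np

    def gabor(x, sigma, freq, phase):
        """1-D Gabor receptive field profile."""
        return np.exp(-x**2 / (2.0 * sigma**2)) * np.cos(2.0 * np.pi * freq * x + phase)

    def complex_cell(left, right, x, sigma=2.0, freq=0.25, dphase=0.0):
        """Energy response: sum over a quadrature pair of binocular simple
        cells whose right-eye profiles are phase-shifted by `dphase`.
        Expanding the squares yields exactly two cross products between the
        filtered left and right patches, as mentioned in the abstract."""
        energy = 0.0
        for phase in (0.0, np.pi / 2.0):          # quadrature pair
            l = gabor(x, sigma, freq, phase) @ left
            r = gabor(x, sigma, freq, phase + dphase) @ right
            energy += (l + r) ** 2                # binocular simple cell, squared
        return energy

    x = np.arange(-8, 9, dtype=float)
    img = np.random.default_rng(2).normal(size=x.size)
    right = np.roll(img, 2)                       # stimulus with a 2-pixel disparity
    print([round(float(complex_cell(img, right, x, dphase=d)), 3)
           for d in (0.0, np.pi / 2.0, np.pi)])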
From Dave_Touretzky at DST.BOLTZ.CS.CMU.EDU Tue Dec 3 21:34:22 1996 From: Dave_Touretzky at DST.BOLTZ.CS.CMU.EDU (Dave_Touretzky@DST.BOLTZ.CS.CMU.EDU) Date: Tue, 03 Dec 96 21:34:22 EST Subject: CNBC graduate training program Message-ID: <22818.849666862@DST.BOLTZ.CS.CMU.EDU> Graduate Training with the Center for the Neural Basis of Cognition The Center for the Neural Basis of Cognition offers interdisciplinary Ph.D. and postdoctoral training programs operated jointly with affiliated programs at Carnegie Mellon University and the University of Pittsburgh: at Carnegie Mellon, Biological Sciences, Computer Science, Psychology, and Robotics; at the University of Pittsburgh, Mathematics, Neurobiology, Neuroscience, and Psychology. The Center is dedicated to the study of the neural basis of cognitive processes including learning and memory, language and thought, perception, attention, and planning; to the study of the development of the neural substrate of these processes; to the study of disorders of these processes and their underlying neuropathology; and to the promotion of applications of the results of these studies to artificial intelligence, robotics, and medicine. CNBC students have access to some of the finest facilities for cognitive neuroscience research in the world: Positron Emission Tomography (PET) and Magnetic Resonance Imaging (MRI) scanners for functional brain imaging, neurophysiology laboratories for recording from brain slices and from anesthetized or awake, behaving animals, electron and confocal microscopes for structural imaging, high performance computing facilities including an in-house supercomputer for neural modeling and image analysis, and patient populations for neuropsychological studies. Students are admitted jointly to a home department and the CNBC Training Program. Applications are encouraged from students with interests in biology, neuroscience, psychology, engineering, physics, mathematics, computer science, or robotics. For a brochure describing the program and application materials, contact us at the following address: Center for the Neural Basis of Cognition 115 Mellon Institute 4400 Fifth Avenue Pittsburgh, PA 15213 Tel. (412) 268-4000. Fax: (412) 268-5060 email: cnbc-admissions at cnbc.cmu.edu This material is also available on our web site at http://www.cnbc.cmu.edu The CNBC training faculty includes: German Barrionuevo (Pitt Neuroscience): LTP in hippocampal slice Marlene Behrmann (CMU Psychology): spatial representations in parietal cortex Pat Carpenter (CMU Psychology): mental imagery, language, and problem solving Jonathan Cohen (CMU Psychology): schizophrenia; dopamine and attention Carol Colby (Pitt Neuroscience): spatial reps. in primate parietal cortex Bard Ermentrout (Pitt Mathematics): oscillations in neural systems Julie Fiez (Pitt Psychology): fMRI studies of language John Horn (Pitt Neurobiology): synaptic learning in invertebrates Allen Humphrey (Pitt Neurobiology): motion processing in primary visual cortex Marcel Just (CMU Psychology): visual thinking, language comprehension Eric Klann (Pitt Neuroscience): hippocampal LTP and LTD Alan Koretsky (CMU Biological Sciences): new fMRI techniques for brain imaging Tai Sing Lee (CMU Comp.
Sci.): primate visual cortex; computer vision David Lewis (Pitt Neuroscience): anatomy of frontal cortex James McClelland (CMU Psychology): connectionist models of cognition Carl Olson (CNBC): spatial representations in primate frontal cortex David Plaut (CMU Psychology): connectionist models of reading Michael Pogue-Geile (Pitt Psychology): development of schizophrenia John Pollock (CMU Biological Sci.): neurodevelopment of the fly visual system Walter Schneider (Pitt Psychology): fMRI studies of attention and vision Charles Scudder (Pitt Neurobiology): motor learning in cerebellum Susan Sesack (Pitt Neuroscience): anatomy of the dopaminergic system Dan Simons (Pitt Neurobiology): sensory physiology of the cerebral cortex William Skaggs (Pitt Neuroscience): representations in rodent hippocampus David Touretzky (CMU Comp. Sci.): hippocampus, rat navigation, animal learning From riegler at ifi.unizh.ch Wed Dec 4 05:45:15 1996 From: riegler at ifi.unizh.ch (Alex Riegler) Date: Wed, 4 Dec 1996 11:45:15 +0100 Subject: 2nd CFP New Trends in Cog Sci Message-ID: Please forward to colleagues etc. Apologies if you have received this already. /\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/ International Workshop N E W T R E N D S I N C O G N I T I V E S C I E N C E NTCS '97 /\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/ "Does Representation need Reality?" Perspectives from Cognitive Science, Neuroscience, Epistemology, and Artificial Life Vienna, Austria, May 13 - 16, 1997 with plenary talks by: Georg Dorffner, Ernst von Glasersfeld, Stevan Harnad, Wolf Singer, and Sverre Sjoelander organized by the Austrian Society of Cognitive Science (ASoCS) =========================================================================== Latest information can be retrieved from the conference WWW-page =========================================================================== P u r p o s e ___________________________________________________________________________ The goal of this single-track conference is to investigate and discuss new approaches and movements in cognitive science in a workshop-like atmosphere. Among the topics which seem to have emerged in recent years are: embodiment of knowledge, system theoretic and computational neuroscience approaches to cognition, dynamics in recurrent neural architectures, evolutionary and artificial life approaches to cognition, and (epistemological) implications for perception and representation, constructivist concepts and the problem of knowledge representation, autopoiesis, implications for epistemology and philosophy (of science). Evidence for a failure of the traditional understanding of neural representation converges from several fields. Neuroscientific results in the last decade have shown that single-cell representations, with hierarchical processing towards representing units, do not seem to be the way the cortex represents environmental entities. Instead, distributed cell ensemble coding has become a popular concept for representation, both in computational and in empirical neuroscience. However, new problems arise from the new concepts. The problem of binding the distributed parts into a uniform percept can be "solved" by introducing synchronization of the member neurons. A deeper (epistemological) problem, however, is created by recurrent architectures within ensembles generating an internal dynamics in the network.
The cortical response to an environmental stimulus is no longer dominated by stimulus properties themselves, but to a considerable degree by the internal state of the network. Thus, a clear and stable reference between a representational state (e.g. in a neuron, a Hebbian ensemble, an activation state, etc.) and the environmental state becomes questionable. Already learned experiences and expectancies might have an impact on the neural activity which is as strong as the stimulus itself. Since these internally stored experiences are constantly changing, the notion of (fixed) representations is challenged. At this point, system theory and constructivism, both investigating the interaction between environment and organism at an abstract level, come into play and turn out to provide helpful epistemological concepts. The goal of this conference is to discuss these phenomena and their implications for the understanding of representation, semantics, language, cognitive science, and artificial life. In contrast to many conferences in this field, the focus is on interdisciplinary cooperation and on conceptual and epistemological questions, rather than on technical details. We are trying to achieve this by giving more room to discussion and interaction between the participants (e.g., invited comments on papers, distribution of papers to the participants before the conference, etc.). In keeping with the interdisciplinary character of cognitive science, we welcome papers/talks from the fields of artificial life, empirical, cognitive, and computational neuroscience, philosophy (of science), epistemology, anthropology, computer science, psychology, and linguistics. T o p i c s ___________________________________________________________________________ The conference is centered around but not restricted to the following topics: 1. Representation - epistemological concepts and findings from (computational) neuroscience, cognitive science (recurrent neural architectures, top-down processing, etc.), and philosophy; 2. Alternatives to representation - applying constructivism to cognitive systems; 3. Modeling language, communication, and semantics as a dynamical, evolutionary and/or adaptive process; 4. Representation and cognition in artificial life; 5. What is the role of simulation in understanding cognition? I n v i t e d S p e a k e r s ___________________________________________________________________________ Besides submitted papers, the conference will also feature plenary talks by invited speakers who are leaders in their fields. The following is a list of invited speakers in alphabetical order: o Georg Dorffner, Univ. of Vienna (A) o Ernst von Glasersfeld, Univ. of Amherst, MA (USA) o Stevan Harnad, Univ. of Southampton (GB) o Rolf Pfeifer, Univ. of Zurich (CH) o Wolf Singer, Max Planck Institut fuer Hirnforschung, Frankfurt (D) o Sverre Sjoelander, Linkoeping University (S) P a p e r S u b m i s s i o n s ___________________________________________________________________________ We invite submissions of scientific papers to any of the 5 topics listed above. The papers will be reviewed by the Scientific Committee and accepted according to their scientific content, originality, quality of presentation, and relatedness to the conference topic. Please keep to the following guidelines: Hardcopy submission only, 6-9 pages A4 or USLetter single sided in Times Roman 10-12pt (or equivalent). Please send 4 copies to the organizing committee, see address below.
In a first step, we are planning to publish the proceedings as a Technical Report of the Austrian Society for Cognitive Science. In a second step, after the papers have been rewritten and a second round of review has taken place, a major publisher will be approached to publish the best papers in an edited volume. For the final versions of the accepted papers electronic submissions are preferred in one of the following formats: Word, FrameMaker, or Ascii. Detailed formatting information will be given upon notification of acceptance. Submission due January 7, 1997 Notification of acceptance February 28 R e g i s t r a t i o n ___________________________________________________________________________ To register, please fill out the registration form at the bottom of this CFP and send it by... o Email to franz-markus.peschl at univie.ac.at, or by o Fax to +43-1-408-8838 (attn. M.Peschl), or by o Mail to Markus Peschl, Dept. for Philosophy of Science (address below) Registration Fee (includes admission to talks, presentations, and proceedings): before April 1st, 1997: Member * 1000 ATS (about 90 US$) Non-Member 1500 ATS (about 135 US$) Student Member ** 400 ATS (about 36 US$) Student Non-Member 1000 ATS (about 90 US$) after April 1st, 1997: Member * 1300 ATS (about 118 US$) Non-Member 1800 ATS (about 163 US$) Student Member ** 500 ATS (about 45 US$) Student Non-Member 1300 ATS (about 118 US$) *) Members of the Austrian Society of Cognitive Science **) Requires proof of valid student ID C o n f e r e n c e S i t e a n d A c c o m m o d a t i o n ___________________________________________________________________________ The conference takes place in a small beautiful baroque castle in the suburbs of Vienna; the address is: Schloss Neuwaldegg Waldegghofg. 5 A-1170 Wien Austria Tel: +43-1-485-3605 Fax: +43-1-485-3605-112 It is surrounded by a beautiful forest and a good (international and Viennese gastronomic) infrastructure. On the tram it takes only 20 minutes to the center of Vienna. (Limited) Accommodation is provided by the castle (about 41 US$ per night (single), 30 US$ per night, per person (double) including breakfast). Please contact the telephone number above. You can find more information about Vienna and accommodation at the Vienna Tourist Board or at the Intropa Travel agent Tel: +43-1-5151-242. Further information will be available soon. D e s t i n a t i o n V i e n n a ? ___________________________________________________________________________ Vienna, Austria, can be reached internationally by plane or train. The Vienna Schwechat airport is located about 16 km from the city center. From the airport, the city air-terminal can be reached by bus (ATS 60.- per person) or taxi (about ATS 400). Rail passengers arrive at one of the main stations which are located almost in the city center. From the air-terminal and the railway stations the congress site and hotels can be reached easily by underground (U-Bahn), tramway, or bus. A detailed description will be given to the participants. In May the climate is mild in Vienna. It is the time when spring is at its climax and everything is blooming. The weather is warm with occasional (rare) showers. The temperature is about 18 to 24 degrees Celsius.
More information about Vienna and Austria on the web: Welcome to Vienna Scene Vienna City Wiener Festwochen - Vienna Festival Public Transport in Vienna (subway) Welcome to Austria General information about Austria Austria Annotated S c i e n t i f i c C o m m i t t e e ___________________________________________________________________________ R. Born Univ. of Linz (A) G. Dorffner Univ. of Vienna (A) E. v. Glasersfeld Univ. of Amherst, MA (USA) S. Harnad Univ. of Southampton (GB) M. Peschl Univ. of Vienna (A) A. Riegler Univ. of Zurich (CH) H. Risku Univ. of Skovde (S) S. Sjoelander Linkoeping University (S) A. v. Stein Neuroscience Institute, La Jolla (USA) O r g a n i z i n g C o m m i t t e e ___________________________________________________________________________ M. Peschl Univ. of Vienna (A) A. Riegler Univ. of Zurich (CH) T i m e t a b l e ___________________________________________________________________________ Submission due January 7, 1997 Notification of acceptance February 28 Early registration due April 1 Final papers due April 14 Conference date May 13-16, 1997 S p o n s o r i n g O r g a n i z a t i o n s ___________________________________________________________________________ o Christian Doppler Laboratory for Expert Systems (Vienna University of Technology) o Oesterreichische Forschungsgemeinschaft o Austrian Federal Ministry of Science, Transport and the Arts o City of Vienna A d d i t i o n a l I n f o r m a t i o n ___________________________________________________________________________ For further information on the conference contact: Markus Peschl Dept. for Philosophy of Science University of Vienna Sensengasse 8/10 A-1090 Wien Austria Tel: +43-1-402-7601/41 Fax: +43-1-408-8838 Email: franz-markus.peschl at univie.ac.at General information about the Austrian Society for Cognitive Science can be found on the Society webpage or by contacting Alexander Riegler AILab, Dept. of Computer Science University of Zurich Winterthurerstr. 190 CH-8057 Zurich Switzerland Email: riegler at ifi.unizh.ch R e g i s t r a t i o n f o r m ___________________________________________________________________________ I will participate in the Workshop "New Trends in Cognitive Science (NTCS'97)" Full Name ........................................................................ Full Postal Address: ........................................................................ ........................................................................ ........................................................................ Telephone Number (Voice): Fax: ..................................... .................................. Email address: ........................................................................ [ ] I intend to submit a paper Payment in ATS (= Austrian Schillings; 1 US$ is currently about 11 ATS). This fee includes admission to talks, presentations, and proceedings: Before April 1st, 1997: [ ] Member * 1000 ATS (about 90 US$) [ ] Non-Member 1500 ATS (about 135 US$) [ ] Student Member ** 400 ATS (about 36 US$) [ ] Student Non-Member 1000 ATS (about 90 US$) After April 1st, 1997: [ ] Member * 1300 ATS (about 118 US$) [ ] Non-Member 1800 ATS (about 163 US$) [ ] Student Member ** 500 ATS (about 45 US$) [ ] Student Non-Member 1300 ATS (about 118 US$) *) Members of the Austrian Society of Cognitive Science **) Requires proof of valid student ID Total: .................... ATS [ ] Visa [ ] Master-/Eurocard Name of Cardholder ........................................
Credit Card Number ........................................ Expiration Date ................. Date: ................ Signature: ........................................ Please send this form by... o Email to franz-markus.peschl at univie.ac.at, or by o Fax to +43-1-408-8838 (attn. M.Peschl), or by o Mail to Markus Peschl, Dept. for Philosophy of Science, Univ. of Vienna, Sensengasse 8/10, A-1090 Wien, Austria From gaudiano at cns.bu.edu Wed Dec 4 13:26:55 1996 From: gaudiano at cns.bu.edu (Paolo Gaudiano) Date: Wed, 4 Dec 1996 13:26:55 -0500 Subject: NSF Funding opportunity Message-ID: <199612041826.NAA25658@mattapan.bu.edu> The following was forwarded to me by Paul Werbos. It is being sent to various newsgroups and mailing lists, so please accept my apologies if you receive multiple copies. Paolo Gaudiano ---------------------------------------------------------------------- NSF has just announced a new initiative in Learning and Intelligent Systems, with a first-year funding of about $20 million. The announcement is up on the web site: http://www.ehr.nsf.gov/LIS/index.htm The scope of this initiative is not 100% clear from the outside, but the core will involve support of collaborations across major disciplines (e.g. biology, engineering, computer science, psychology, education), hopefully to develop a more unified understanding of learning mechanisms/models/designs/issues applicable to both natural and artificial systems. Neural network models are mentioned, along with several other paradigms, as a tool for understanding and implementing learning in a more unified way across disciplines. Technology development and education will also be a major part of this. The relative emphasis between basic science and software development is still not entirely clear, but both will clearly have a major role. It is clear, however, that there is White House interest. From ndxdendo at rrzn-serv.de Thu Dec 5 10:39:27 1996 From: ndxdendo at rrzn-serv.de (ndxdendo@rrzn-serv.de) Date: Thu, 5 Dec 1996 16:39:27 +0100 (MET) Subject: ICOBIP '97 - Conference Announcement Message-ID: <199612051539.QAA10365@sun1.rrzn-user.uni-hannover.de> The announcement itself was distributed as an attachment; it is archived at https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/cf2a83b5/attachment-0001.ksh From jmoody at cogsci.ucsd.edu Mon Dec 9 10:46:17 1996 From: jmoody at cogsci.ucsd.edu (Jay Moody) Date: Mon, 9 Dec 1996 10:46:17 -0500 Subject: Good introductory text for Neural Net Analysis: PCA, Cluster, etc. Message-ID: To those of you who teach analysis of neural networks to students of modest background in statistics: I recently came across this very readable explanation of principal components analysis, cluster analysis, and multidimensional scaling (and more): _Multivariate_Statistical_Methods:_A_Primer. Bryan F. J. Manly. 1994. Chapman & Hall: New York. (paperback) Chapters of particular relevance to (my experience with) neural nets: 1 The material of multivariate analysis -- 5 examples of data/problems where multivariate analysis is useful (mostly from evolutionary biology) 2 Matrix Algebra -- 9 page explanation of essentials (matrix operations, eigenvalues, covariance matrices, etc.) 6 Principal Components Analysis -- with 2 detailed examples 9 Cluster analysis 11 Multidimensional Scaling
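As a minimal classroom illustration of the first of these techniques applied to networks, here is a sketch of principal components analysis of a hidden-unit activation matrix, done via the covariance eigendecomposition route that Manly's Chapter 6 explains; the Python/numpy code and the random data are purely hypothetical.

    import numpy as np

    def pca(activations, n_components=2):
        """PCA of an (n_stimuli, n_units) hidden-unit activation matrix."""
        centered = activations - activations.mean(axis=0)
        cov = np.cov(centered, rowvar=False)        # unit-by-unit covariance
        evals, evecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
        order = np.argsort(evals)[::-1][:n_components]
        scores = centered @ evecs[:, order]         # stimuli in PC coordinates
        explained = evals[order] / evals.sum()      # fraction of variance explained
        return scores, explained

    # Hypothetical example: 40 stimuli presented to a 10-unit hidden layer.
    acts = np.random.default_rng(3).normal(size=(40, 10))
    scores, frac = pca(acts)
    print(scores.shape, frac)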
From smyth at galway.ICS.UCI.EDU Mon Dec 9 15:13:01 1996 From: smyth at galway.ICS.UCI.EDU (Padhraic Smyth) Date: Mon, 09 Dec 1996 12:13:01 -0800 Subject: Sixth International Workshop on AI and Statistics: Final Reminder Message-ID: <9612091213.aa03271@paris.ics.uci.edu> FINAL REMINDER SIXTH INTERNATIONAL AI AND STATISTICS WORKSHOP JANUARY 4TH-7TH, FORT LAUDERDALE, FLORIDA A final reminder that the AI and Statistics Workshop will be held from January 5th to January 7th at the Bahia Mar hotel in Fort Lauderdale, Florida. The workshop program will be preceded on January 4th by what promises to be a day of very interesting tutorials on such topics as - Conditional independence in statistics and AI (A. P. Dawid), - Bayesian time series analysis and forecasting (Mike West), - Learning in information agents (Tom Mitchell), and - Graphical models, neural networks, and machine learning algorithms (Mike Jordan). If you are planning on attending, please note that the workshop hotel will relinquish the block of rooms reserved for workshop attendees on December 20th: since the hotel is fully booked, it is essential you make reservations before the 20th. Full details (including registration forms and workshop program) are available at: http://www.stat.washington.edu/aistats97/ Padhraic Smyth General Chair, AI-Stats '97 From jdcohen+ at andrew.cmu.edu Tue Dec 10 08:45:56 1996 From: jdcohen+ at andrew.cmu.edu (Jonathan D Cohen) Date: Tue, 10 Dec 1996 08:45:56 -0500 (EST) Subject: postdoc position Message-ID: <8mfKaI_00iWl0462I0@andrew.cmu.edu> Postdoctoral Position: Computational Modeling of Neuromodulation and/or Prefrontal Cortex Function ---------------- Center for the Neural Basis of Cognition Carnegie Mellon University and the University of Pittsburgh ---------------- A postdoctoral position is available starting any time between now and September 1, 1997 for someone interested in pursuing computational modeling approaches to the role of neuromodulation and/or prefrontal cortical function in cognition. The nature of the position is flexible, depending upon the individual's interest and expertise. Approaches can be focused at the neurobiological level (e.g., modeling detailed physiological characteristics of neuromodulatory systems, such as locus coeruleus and/or dopaminergic nuclei, or the circuitry of prefrontal cortex), or at the more cognitive level (e.g., the nature of representations and/or the mechanisms involved in active maintenance of information within prefrontal cortex, and their role in working memory). The primary requirement for the position is a Ph.D. in the cognitive, computational, or neurosciences, and extensive experience with computational modeling work, either at the PDP/connectionist or detailed biophysical level. The candidate will be working directly with Jonathan Cohen within the Department of Psychology at CMU, and collaborating closely with Randy O'Reilly at the University of Colorado, Boulder. Other potential collaborations include members of the Center for the Neural Basis of Cognition (CNBC), including James McClelland, David Lewis, German Barrionuevo, Susan Sesack, G. Bard Ermentrout, as well as collaborators at other institutions, such as Gary Aston-Jones (Hahnemann University), Joseph LeDoux (NYU), and Peter Dayan (MIT).
Available resources include direct access to state-of-the-art computing facilities within the CNBC (IBM SP-2 and SGI PowerChallenge) and the Pittsburgh Supercomputing Center, neuroimaging facilities (PET and 3T fMRI at the University of Pittsburgh), and clinical populations (Western Psychiatric Institute and Clinic). Carnegie Mellon University and the University of Pittsburgh are both equal opportunity employers; minorities and women are encouraged to apply.

Inquiries can be directed to Jonathan Cohen (jdcohen at cmu.edu) or Randy O'Reilly (oreilly at flies.mit.edu). Applicants should send a CV, a small number of relevant publications, and the names and addresses of at least two references, to:

Jonathan D. Cohen
Department of Psychology
Carnegie Mellon University
Pittsburgh, PA 15213
(412) 268-5692 (voice)
(412) 268-2810 (fax)

From bishopc at helios.aston.ac.uk Wed Dec 11 10:24:37 1996
From: bishopc at helios.aston.ac.uk (Prof. Chris Bishop)
Date: Wed, 11 Dec 1996 15:24:37 +0000
Subject: ROSENBAUM FELLOWSHIP
Message-ID: <2149.9612111524@sun.aston.ac.uk>

ROSENBAUM FELLOWSHIP
Isaac Newton Institute
University of Cambridge, U.K.

Applications are invited for a Rosenbaum Visiting Fellowship at the prestigious Isaac Newton Institute for Mathematical Sciences in Cambridge. The Fellowship will allow the holder to spend 6 months at the Institute from July to December 1997 to coincide with the major scientific programme on Neural Networks and Machine Learning. The stipend will be 17,500 US dollars for the six-month period, and travel expenses may also be provided. To be eligible, candidates must have a PhD and be U.S. citizens or permanent residents, or have resided in the U.S. for a minimum of four years.

Informal enquiries may be addressed to:
Professor Christopher M. Bishop
Organiser, Neural Networks and Machine Learning
Neural Computing Research Group
Aston University
Birmingham B4 7ET
C.M.Bishop at aston.ac.uk
Tel. +44 (0)121 333 4631
Fax. +44 (0)121 333 4586

Applications, including a CV, list of publications, and the names of two referees, should be sent by 31 December 1996 to:
The Director, Professor Keith Moffatt
Isaac Newton Institute for Mathematical Sciences
20 Clarkson Road
Cambridge CB3 0EH, U.K.

From schwenk at IRO.UMontreal.CA Wed Dec 11 12:06:03 1996
From: schwenk at IRO.UMontreal.CA (Holger Schwenk)
Date: Wed, 11 Dec 1996 12:06:03 -0500 (EST)
Subject: paper on on-line character recognition using "constraint tangent distance"
Message-ID: <199612111706.MAA05225@grosse.iro.umontreal.ca>

The following paper on on-line character recognition is available on the WEB at http://www.iro.umontreal.ca/~schwenk/papers/icpr96.ps.gz

Comments are welcome.

Holger

-------------------------------------------------------------------------------
Holger Schwenk
LISA, Dept. IRO, University of Montreal
2920 Chemin de la tour, CP 6128
Montreal, Quebec, H3C 3J7 CANADA
phone: (514) 343-6111 ext 1655
fax: (514) 343-5834
email: schwenk at iro.umontreal.ca
http://www.iro.umontreal.ca/~schwenk
-------------------------------------------------------------------------------

Constraint Tangent Distance for On-line Character Recognition
H. Schwenk and M. Milgram
published in International Conference on Pattern Recognition (ICPR), pp. D:515--519, August 1996

Abstract:
---------
In on-line character recognition we can observe two kinds of intra-class variations: small geometric deformations and completely different writing styles.
We propose a new approach to deal with these problems by defining an extension of tangent distance (Simard et al., 1993), well known in off-line character recognition. The system has been implemented with a k-nearest neighbor classifier and a so-called diabolo classifier, respectively (Schwenk and Milgram, 1995). Both classifiers are invariant under transformations like rotation, scale or slope and can deal with variations in stroke order and writing direction. Results are presented for our digit database with more than 200 writers.
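For readers unfamiliar with the underlying idea, here is a rough sketch of plain one-sided tangent distance -- not the paper's constrained extension -- in illustrative Python/NumPy. The tangent matrix T (one column per modelled transformation, e.g. rotation or slope) is assumed to be given:

    import numpy as np

    def tangent_distance(x, y, T):
        """x, y: flattened patterns; T: (dim, k) tangent vectors at x."""
        a, *_ = np.linalg.lstsq(T, y - x, rcond=None)  # best transform coefficients
        residual = (y - x) - T @ a     # the part no small transformation explains
        return np.linalg.norm(residual)

The distance is the norm of the residual after the difference between the two patterns has been projected onto the span of the tangent vectors, which is what makes the classifier insensitive to those transformations.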
From phkywong at uxmail.ust.hk Wed Dec 11 03:21:53 1996
From: phkywong at uxmail.ust.hk (Dr. Michael Wong)
Date: Wed, 11 Dec 1996 16:21:53 +0800
Subject: Paper available
Message-ID: <96Dec11.162157+0800_hkt.102351-9566+1215@uxmail.ust.hk>

The following paper, to appear in Europhysics Letters, is now available via anonymous FTP. (4 pages)

============================================================================
FTP-host: physics.ust.hk
FTP-files: pub/kymwong/actdyn.ps.gz

Improving Pattern Reconstruction in Neural Networks by Activity Dynamics

K. Y. Michael Wong
Department of Physics, The Hong Kong University of Science and Technology, Clear Water Bay, Kowloon, Hong Kong.
E-mail address: phkywong at usthk.ust.hk

ABSTRACT
I study the averaged dynamical behaviour of neural networks over an extended monitoring period, and consider pattern reconstruction procedures by activity clipping, selectively freezing, or sequentially freezing the dynamic nodes. They enable the retrieval precision to be improved, the basin of attraction to be widened, or the storage capacity to be increased, even when the information is not efficiently embedded in the synaptic weights.
============================================================================

FTP instructions:
unix> ftp physics.ust.hk
Name: anonymous
Password: your full email address
ftp> cd pub/kymwong
ftp> get actdyn.ps.gz
ftp> quit
unix> gunzip actdyn.ps.gz
unix> lpr actdyn.ps (or ghostview actdyn.ps)

From cas-cns at cns.bu.edu Thu Dec 12 10:51:17 1996
From: cas-cns at cns.bu.edu (CAS/CNS)
Date: Thu, 12 Dec 1996 10:51:17 -0500
Subject: Vision, Recognition, Action
Message-ID: <199612121551.KAA28070@cns.bu.edu>

***** CALL FOR PAPERS *****

International Conference on
VISION, RECOGNITION, ACTION: NEURAL MODELS OF MIND AND MACHINE
May 28--31, 1997

Sponsored by the Center for Adaptive Systems and the Department of Cognitive and Neural Systems, Boston University, with financial support from DARPA and ONR

This conference will include a day of tutorials (May 28) followed by 3 days of 21 invited lectures and contributed lectures and posters by experts on the biology and technology of how the brain and other intelligent systems see, understand, and act upon a changing world. The meeting program and updates can be found at http://cns-web.bu.edu/cns-meeting/. Hotel and restaurant information can be found there.

WEDNESDAY, MAY 28, 1997: TUTORIALS
Stephen Grossberg, "Vision, Brain, and Technology" (3 hours in two 1-1/2 hour lectures).
Gail Carpenter, "Self-Organizing Neural Networks for Learning, Recognition, and Prediction: ART Architectures and Applications" (2 hours).
Eric Schwartz, "Algorithms and Hardware for the Application of Space-Variant Active Vision to High Performance Machine Vision" (2 hours).

THURSDAY, MAY 29---SATURDAY, MAY 31, 1997: CONFIRMED INVITED LECTURERS
Andreas Andreou, Stuart Anstis, Terrance Boult, Rodney Brooks, Gail Carpenter, Patrick Cavanagh, Robert Desimone, Patricia Goldman-Rakic, Stephen Grossberg, Michael Jordan, John Kalaska, Takeo Kanade, Ennio Mingolla, Lance Optican, Alex Pentland, Tomaso Poggio, Eric Schwartz, Robert Shapley, George Sperling, Larry Squire, and Allen Waxman.

CALL FOR ABSTRACTS: Contributed abstracts for talks or posters must be received, in English, by January 31, 1997. Notification of acceptance will be given by February 28, 1997. A meeting registration fee must accompany each Abstract. See Registration Information below for details. The fee will be returned if the Abstract is not accepted for presentation and publication in the meeting proceedings. Each Abstract should fit on one 8" x 11" white page with 1" margins on all sides, single-column format, single-spaced, Times Roman or similar font of 10 points or larger, printed on one side of the page only. Fax submissions will not be accepted. Abstract title, author name(s), affiliation(s), mailing, and email address(es) should begin each Abstract. An accompanying cover letter should include: full title of Abstract, corresponding author and presenting author name, address, telephone, fax, and email address. Preference for oral or poster presentation should be noted. Abstracts which do not meet these requirements or which are submitted with insufficient funds will be returned. The original and 3 copies of each Abstract should be sent to: CNS Meeting, c/o Cynthia Bradford, Boston University, Department of Cognitive and Neural Systems, 677 Beacon Street, Boston, MA 02215.

REGISTRATION INFORMATION: To register, please fill out the enclosed registration form. Student registrations must be accompanied by a letter of verification from a department chairperson or faculty/research advisor. If accompanied by an Abstract or if paying by check, mail to the CNS Meeting address. If paying by credit card, mail to the CNS Meeting address, or fax to (617) 353-7755.

STUDENT FELLOWSHIPS: Some fellowships for PhD students and postdocs are available to defray travel and living costs. The deadline for applying for fellowship support is January 31, 1997. Applicants will be notified by February 28, 1997. Each application should include the applicant's CV, including name; mailing address; email address; current student status; faculty or PhD research advisor's name, address, and email address; relevant courses and other educational data; and a list of research articles. A letter from the listed faculty or PhD advisor on official institutional stationery should accompany the application and summarize how the candidate may benefit from the meeting. Students who also submit an Abstract need to include the registration fee with their Abstract.

********************

REGISTRATION FORM (Please Type or Print)

Vision, Recognition, Action: Neural Models of Mind and Machine
Boston University, Boston, Massachusetts
Tutorials: May 28, 1997
Meeting: May 29--31, 1997

Mr/Ms/Dr/Prof:
Name:
Affiliation:
Address:
City, State, Postal Code:
Phone and Fax:
Email:

The conference registration fee includes the meeting program, reception, six coffee breaks, and the meeting proceedings. Two coffee breaks and a book of tutorial viewgraph copies will be covered by the tutorial registration fee.
CHECK ONE:
[ ] $55 Conference plus Tutorial (Regular)
[ ] $40 Conference plus Tutorial (Student)
[ ] $35 Conference Only (Regular)
[ ] $25 Conference Only (Student)
[ ] $30 Tutorial Only (Regular)
[ ] $25 Tutorial Only (Student)

METHOD OF PAYMENT:
[ ] Enclosed is a check made payable to "Boston University". Checks must be made payable in US dollars and issued by a US correspondent bank. Each registrant is responsible for any and all bank charges.
[ ] I wish to pay my fees by credit card (MasterCard, Visa, or Discover Card only).

Type of card:
Name as it appears on the card:
Account number:
Expiration date:
Signature and date:

********************

From ruppin at math.tau.ac.il Thu Dec 12 18:10:35 1996
From: ruppin at math.tau.ac.il (Eytan Ruppin)
Date: Fri, 13 Dec 1996 01:10:35 +0200 (GMT+0200)
Subject: Last-CFP:-Modeling-Brain-Disorders
Message-ID: <199612122310.BAA27410@gemini.math.tau.ac.il>

CALL FOR SUBMISSIONS

Special Issue of the Journal "Artificial Intelligence in Medicine" (Published by Elsevier)
Theme: COMPUTATIONAL MODELING OF BRAIN DISORDERS
Guest-Editors: Eytan Ruppin (Tel-Aviv University) & James A. Reggia (University of Maryland)
------------------------------------------------
**** DEADLINE FOR SUBMISSION IS MARCH 15th, 1997 ****
------------------------------------------------

BACKGROUND

As computational methods for brain modeling have advanced during the last several years, there has been increasing interest in adopting them to study brain disorders in neurology, neuropsychology, and psychiatry. Models of Alzheimer's disease, epilepsy, aphasia, dyslexia, Parkinson's disease, stroke and schizophrenia have recently been studied to obtain a better understanding of the underlying pathophysiological processes. While computer models have the disadvantage of simplifying the underlying neurobiology and pathophysiology, they also have remarkable advantages: they permit precise and systematic control of the model variables, and an arbitrarily large number of "subjects". They are open to detailed inspection, in isolation, of the influence of various metabolic and neural variables on the disease progression, in the hope of gaining insight into why observed behaviors occur. Ultimately, one seeks a sufficiently powerful model that can be used to suggest new pharmacological interventions and rehabilitative actions.

OBJECTIVE OF SPECIAL ISSUE

The objective of this special issue on modeling brain disorders is to report on recent studies in this field. The main goal is to increase the awareness of the AI medical community of this research, currently performed primarily by members of the neural networks and 'connectionist' community. By bringing together a series of such brain-disorder modeling papers, we strive to produce a contemporary overview of the kinds of problems and solutions that this growing research field has generated, and to point to promising future research directions. More specifically, papers are expected to cover one or more of the following topics:

-- Specific neural models of brain disorders, expressing the link between their pathogenesis and clinical manifestations.
-- Computational models of pathological alterations in basic neural, synaptic and metabolic processes that may relate to the generation of brain disorders in a significant manner.
-- Applications of neural networks that shed light on the pathogenic processes that underlie brain disorders, or explore their temporal evolution and clinical course.
-- Methodological issues involved in constructing computational models of brain disorders: obtaining sufficient data, visualizing high-dimensional complex behavior, and testing and validating these models.
-- Bridging the apparent gap between functional imaging investigations and current neural modeling studies, arising from their distinct spatio-temporal resolution.

SCHEDULE

All submitted manuscripts will be subject to a rigorous review process. The special issue will include 5 papers of 15-20 pages each, plus an editorial. Manuscripts should be prepared in accordance with the journal "submission guidelines", which are available on request and may also be retrieved from http://www.math.tau.ac.il/~ruppin.

March 15, 1997: Receipt of full papers. Three copies of a manuscript should be sent to:
Eytan Ruppin
Department of Computer Science
School of Mathematics
Tel-Aviv University
Tel-Aviv, Israel, 69978.

August 1, 1997: Notification of acceptance
October 1, 1997: Receipt of final version of manuscripts
June 1998: Publication of AIM special issue

From giles at research.nj.nec.com Fri Dec 13 09:52:54 1996
From: giles at research.nj.nec.com (Lee Giles)
Date: Fri, 13 Dec 96 09:52:54 EST
Subject: TR available
Message-ID: <9612131452.AA00658@alta>

The following Technical Report is available via the University of Maryland Department of Computer Science and the NEC Research Institute archives:
____________________________________________________________________

HOW EMBEDDED MEMORY IN RECURRENT NEURAL NETWORK ARCHITECTURES HELPS LEARNING LONG-TERM DEPENDENCIES

Technical Report CS-TR-3626 and UMIACS-TR-96-28, Institute for Advanced Computer Studies, University of Maryland, College Park, MD 20742

Tsungnan Lin{1,2}, Bill G. Horne{1}, C. Lee Giles{1,3}
{1}NEC Research Institute, 4 Independence Way, Princeton, NJ 08540
{2}Department of Electrical Engineering, Princeton University, Princeton, NJ 08540
{3}UMIACS, University of Maryland, College Park, MD 20742

ABSTRACT

Learning long-term temporal dependencies with recurrent neural networks can be a difficult problem. It has recently been shown that a class of recurrent neural networks called NARX networks performs much better than conventional recurrent neural networks for learning certain simple long-term dependency problems. The intuitive explanation for this behavior is that the output memories of a NARX network can be manifested as jump-ahead connections in the time-unfolded network. These jump-ahead connections can propagate gradient information more efficiently, thus reducing the sensitivity of the network to long-term dependencies. This work gives empirical justification to our hypothesis that similar improvements in learning long-term dependencies can be achieved with other classes of recurrent neural network architectures simply by increasing the order of the embedded memory. In particular we explore the impact of learning simple long-term dependency problems on three classes of recurrent neural network architectures: globally recurrent networks, locally recurrent networks, and NARX (output feedback) networks. Comparing the performance of these architectures with different orders of embedded memory on two simple long-term dependency problems shows that all of these classes of network architectures demonstrate significant improvement on learning long-term dependencies when the orders of embedded memory are increased. These results can be important to a user comfortable with a specific recurrent neural network architecture, because simply increasing the embedded memory order will make the architecture more robust to the problem of long-term dependency learning.
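To see the jump-ahead intuition concretely, here is a toy calculation (illustrative Python, not the TR's experiments; the per-step decay factor of 0.5 is an arbitrary choice): with embedded memory of order D, an error signal travelling back over k time steps crosses roughly k/D multiplicative stages instead of k.

    def surviving_gradient(k, order, decay=0.5):
        """Rough magnitude of a gradient propagated back k steps when
        output-feedback (jump-ahead) connections span `order` steps."""
        hops = k // order              # multiplicative stages on the path
        return decay ** hops

    for order in (1, 2, 4, 8):
        print(order, surviving_gradient(k=32, order=order))
    # order 1: ~2.3e-10 ... order 8: 0.0625 -- a higher embedded-memory
    # order leaves exponentially more gradient intact.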
-------------------------------------------------------------------
KEYWORDS: discrete-time, memory, long-term dependencies, recurrent neural networks, training, gradient-descent
PAGES: 15  FIGURES: 7  TABLES: 2
-------------------------------------------------------------------

http://www.neci.nj.nec.com/homepages/giles.html
http://www.cs.umd.edu/TRs/TR-no-abs.html
or
ftp://ftp.nj.nec.com/pub/giles/papers/UMD-CS-TR-3626.recurrent.arch.long.term.ps.Z
------------------------------------------------------------------------------------

--
C. Lee Giles / Computer Sciences / NEC Research Institute / 4 Independence Way / Princeton, NJ 08540, USA / 609-951-2642 / Fax 2482
www.neci.nj.nec.com/homepages/giles.html
==

From erikf at sans.kth.se Fri Dec 13 05:41:45 1996
From: erikf at sans.kth.se (erikf@sans.kth.se)
Date: Fri, 13 Dec 1996 11:41:45 +0100
Subject: PhD Thesis Available
Message-ID: <199612131041.LAA11528@sans03.nada.kth.se>

My PhD thesis is available at my home page: http://www.nada.kth.se/~erikf/publications.html

It is also available for anonymous ftp downloading:
ftp://ftp.nada.kth.se/pub/documents/SANS/reports/ps/ef-thesis.tar.Z
ftp://ftp.nada.kth.se/pub/documents/SANS/reports/ps/ef-thesis-summary.ps.Z

The complete thesis is 2.4Mb and un-tars into 15.5Mb postscript files. The summary is 530kb and prints on 68 pages.

Biophysical Simulation of Cortical Associative Memory

Erik Fransen
Studies of Artificial Neural Systems
Department of Numerical Analysis and Computing Science
Royal Institute of Technology, S-100 44 Stockholm, Sweden
erikf at sans.kth.se

The associative memory function of the brain is an active area of experimental and theoretical research. This thesis describes the construction of a model of cortical auto-associative memory. Conceptually, it is based on Hebb's cell assembly hypothesis. The quantitative description comes from a class of artificial neural networks, ANN, with recurrent connectivity and attractor dynamics. More specifically, this work has concentrated on problems related to how this formal network description could be translated into a neurobiological model. In this work I have used a relatively detailed description of the neurons which includes changes over time for the potential and current distributions of the different parts of the cell, as well as calcium ion flux and some of its electrophysiological effects. The features of this associative memory model are interpreted in Gestalt psychological terms and discussed in relation to features of priming, as gained from memory psychological experiments. The model output is compared to single cell recordings in working memory experiments as well as to results from a slice preparation of the hippocampus region. A hypothesis for the functional role of the variable resting potentials and background activities that are seen in experiments has been put forward. This hypothesis is based on the bias values which are produced by the learning in an ANN and result in different a priori firing probabilities of the cells. It is also shown that it is possible to increase the degree of similarity to the cortical circuitry with the cortical column model. This model can function as a content-addressable memory, as expected.

Initially, the network structure and the cell types have to be determined. The next part of the work is the identification of what cell properties should be modeled. The initial results include a demonstration that cells described at this detail can support the assembly operations (persistent after-activity, pattern completion and pattern rivalry) shown for ANNs. The importance of adequate cell properties for network function was confirmed. For example, with pyramidal type cells the network produced the desired assembly operations, but with motoneuron type cells it did not. There are also results which are not dependent on the assembly hypothesis. The network can stabilize in a relatively short time and at sub-maximal cell firing frequencies despite time delays and the recurrent connectivity which provides positive feedback. Further, the network activity may be controlled by modeling the effects of neuromodulators such as serotonin. Instances of spike synchronization and burst synchronization were found in networks that did not have any inhibitory cells. It is concluded that this type of attractor network model can be used as a valuable tool in the study of cortical associative memory, and that detailed cell models are very useful for testing the biological relevance of such models.
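As a pointer for readers coming from the ANN side: the formal starting point referred to above is the recurrent attractor network in the Hopfield/Hebb tradition. A toy sketch of that abstraction (illustrative Python/NumPy, not the biophysical model of the thesis; sizes and seed are arbitrary):

    import numpy as np

    rng = np.random.default_rng(0)
    patterns = rng.choice([-1, 1], size=(3, 64))   # three stored +/-1 patterns
    W = (patterns.T @ patterns) / 64.0             # Hebbian outer-product rule
    np.fill_diagonal(W, 0.0)                       # no self-connections

    state = patterns[0].copy()
    state[:16] *= -1                               # corrupt a quarter of the cue
    for _ in range(10):                            # synchronous relaxation
        state = np.where(W @ state >= 0, 1, -1)
    print((state == patterns[0]).mean())           # ~1.0: pattern completion

The thesis asks, in effect, whether biophysically detailed cell models can support the same assembly operations (pattern completion, pattern rivalry, persistent after-activity) that this abstraction exhibits.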
Keywords: after-activity, attractor network, biologically realistic neural networks, computational neuroscience, computer simulation, cortical associative memory, Hebbian cell assemblies, neural modeling, recurrent artificial neural network, pattern completion, pattern rivalry

Fransen E. Thesis, 1996. Biophysical Simulation of Cortical Associative Memory. Dept. of Numerical Analysis and Computing Science, Royal Institute of Technology, Stockholm, Sweden, ISBN 91-7170-689-5, TRITA-NA-P96/28

Erik Fransen
Studies of Artificial Neural Systems
Department of Numerical Analysis and Computing Science
Royal Institute of Technology, S-100 44 Stockholm, Sweden
EMail: erikf at sans.kth.se
http://www.nada.kth.se/~erikf
Phone: +46-8-7906904

From ejua71 at tattoo.ed.ac.uk Tue Dec 17 14:32:27 1996
From: ejua71 at tattoo.ed.ac.uk (J A Bullinaria)
Date: Tue, 17 Dec 96 19:32:27 GMT
Subject: CFP: NCPW4
Message-ID: <9612171932.aa03068@uk.ac.ed.tattoo>

4th Neural Computation and Psychology Workshop
Connectionist Representations: Theory and Practice
University of London, England
Wednesday 9th April - Friday 11th April 1997

AIMS AND OBJECTIVES

This workshop is the fourth in a series, following on from the first at the University of Wales, Bangor (with theme "Neurodynamics and Psychology"), the second at the University of Edinburgh, Scotland ("Memory and Language") and the third at the University of Stirling, Scotland ("Perception"). The general aim is to bring together researchers from such diverse disciplines as artificial intelligence, applied mathematics, cognitive science, computer science, neurobiology, philosophy and psychology to discuss their work on the connectionist modelling of psychology. Next year's workshop is to be hosted jointly by members of the Psychology Departments of Birkbeck College London and University College London. As in previous years there will be a theme to the workshop.
We think that next year's theme is sufficiently wide-ranging and important that researchers in all areas of Neural Computation and Psychology will find it relevant and have something to say on the subject. The theme is to be: "Connectionist Representations: Theory and Practice". This covers many important issues ranging from the philosophical (such as the grounding problem) to the physiological (what connectionist representations can tell us about real neural systems) to the technical (such as what is necessary to get specific models to work). The organisation of the final program will depend on the submissions received, but particular topics might, for example, include:

* Understanding representations developed in trained networks.
* Merits of local v. distributed representations.
* Semantic representation and hierarchies.
* The problem of serial order.
* The representation of time.
* Neural networks and the neurophysiology/neuropsychology of representation.

As in previous years we aim to keep the workshop fairly small, informal and single track. As always, participants bringing expertise from outside the UK are particularly welcome.

PROVISIONAL INVITED SPEAKERS

Roland Baddeley (Oxford), Dennis Norris (APU Cambridge), Gordon Brown (Warwick), Mike Page (APU Cambridge), Tony Browne (Mid Kent), Tim Shallice (UCL), Neil Burgess (UCL), Leslie Smith (Stirling), Nick Chater (Warwick), Chris Thornton (Sussex), Glyn Humphreys (Birmingham), Janet Vousden (Warwick), Bob Kentridge (Durham)

CALL FOR ABSTRACTS

In addition to our invited speakers, we invite other potential participants to submit abstracts of proposed talks and/or posters. As in previous years, after the workshop, selected presenters will be invited to produce a written version of their talk or poster for inclusion in a refereed proceedings. Abstracts (one page) should arrive by email at "ncpw4 at psychol.ucl.ac.uk" before 31st January 1997. Acceptance notices, registration details and a provisional program will be sent out mid-February.

REGISTRATION, FOOD AND ACCOMMODATION

The workshop will be held in University College London, which is situated in the centre of London, near the British Museum and within easy walking distance of the West End and many of London's major attractions. The conference registration fee (which includes lunch and morning and afternoon tea/coffee each day) will be approximately 60 pounds. A special conference dinner is planned for the Thursday evening, costing approx. 20 pounds. Accommodation can be arranged in student residences or in local hotels, according to budget. The conference/accommodation area is easily accessible by the London Underground system ("The Tube"), with direct lines from London Heathrow Airport and all the major intercity train stations. Additional information will appear nearer the workshop date on the conference web page at: "http://prospero.psychol.ucl.ac.uk/ncpw4/".

ORGANISING COMMITTEE

John Bullinaria (Birkbeck College London)
Dave Glasspool (University College London)
George Houghton (University College London)

CONTACT DETAILS

Workshop email address for all correspondence: ncpw4 at psychol.ucl.ac.uk

John Bullinaria, NCPW4, Centre for Speech and Language, Department of Psychology, Birkbeck College, Malet Street, London WC1E 7HX, UK. Phone: +44 171 631 6330, Fax: +44 171 631 6587, Email: j.bullinaria at psyc.bbk.ac.uk

Dave Glasspool, NCPW4, Department of Psychology, University College London, Gower Street, London WC1E 6BT, UK. Phone: +44 171 380 7777 Xtn. 5418.
Fax: +44 171 436 4276, Email: d.glasspool at psychol.ucl.ac.uk

George Houghton, NCPW4, Department of Psychology, University College London, Gower Street, London WC1E 6BT, UK. Phone: +44 171 380 7777 Xtn. 5394, Fax: +44 171 436 4276, Email: g.houghton at psychol.ucl.ac.uk

From marco at McCulloch.ing.UniFI.IT Tue Dec 17 11:40:55 1996
From: marco at McCulloch.ing.UniFI.IT (Marco Gori)
Date: Tue, 17 Dec 1996 17:40:55 +0100
Subject: postdoc fellowships at University of Siena (Italy)
Message-ID: <9612171640.AA11169@McCulloch.ing.UniFI.IT>

==============================================================================
POST-DOC FELLOWSHIPS AT UNIVERSITY OF SIENA
Faculty of Engineering
Dipartimento di Ingegneria dell'Informazione
Via Roma, 56 - 53100 Siena (Italy)
==============================================================================

Two post-doctoral fellowships are available at Dipartimento di Ingegneria dell'Informazione, University of Siena. The positions are for two years. Among other research areas, people with experience in the field of recurrent networks, hybrid systems, and combinatorial optimization are highly encouraged to apply. Research at DII in the field of neural nets is carried out jointly with people at Dipartimento di Sistemi e Informatica, University of Florence.

The candidates must send the application to:
Universita' di Siena
Sezione Dottorato di Ricerca
Via Banchi di Sotto, 46
53100 Siena (Italy)

For further information, please send an e-mail to me (see my signature).

Best regards,
-- Marco Gori.

==================================================================================================
Marco Gori
Email: marco at mcculloch.ing.unifi.it
WWW: http://www-dsi.ing.unifi.it/neural
Universita' di Siena: V. Roma, 56 - Siena (Italy). Voice: +39 577 26-36-04; Fax: +39 577 26-36-02
c/o Universita' di Firenze: V. S. Marta, 3 - 50139 Firenze (Italy). Voice: +39 55 479-6265; Fax: +39 55 479-6363
==================================================================================================

From A.Sharkey at dcs.shef.ac.uk Fri Dec 20 08:32:00 1996
From: A.Sharkey at dcs.shef.ac.uk (Amanda Sharkey)
Date: Fri, 20 Dec 96 13:32:00 GMT
Subject: Special Issue of Connection Science
Message-ID: <9612201332.AA18966@gw.dcs.shef.ac.uk>

SPECIAL ISSUE of Connection Science, Volume 8, Numbers 3 & 4, December 1996.

COMBINING ARTIFICIAL NEURAL NETS: ENSEMBLE APPROACHES
-----------------------------------------------------

Amanda J.C. Sharkey. On Combining Artificial Neural Nets. 299
Sherif Hashem. Effects of Collinearity on Combining Neural Networks. 315
David W. Opitz & Jude W. Shavlik. Actively Searching for an Effective Neural Network Ensemble. 337
Yuval Raviv & Nathan Intrator. Bootstrapping with Noise: An Effective Regularization Technique. 355
Bruce E. Rosen. Ensemble Learning Using Decorrelated Neural Networks. 373
Kagan Tumer & Joydeep Ghosh. Error Correlation and Error Reduction in Ensemble Classifiers. 385
Bambang Parmanto, Paul W. Munro & Howard R. Doyle. Reducing Variance of Committee Prediction with Resampling Techniques. 405
Peter A. Zhilkin & Ray L. Somorjai. Application of Several Methods of Classification Fusion to Magnetic Resonance Spectra. 427
For information about Connection Science journal, see http://www.carfax.co.uk/cos-con.htm

From sontag at control.rutgers.edu Fri Dec 20 11:31:58 1996
From: sontag at control.rutgers.edu (Eduardo Sontag)
Date: Fri, 20 Dec 1996 11:31:58 -0500
Subject: TR available - Learning problems for recurrent nets
Message-ID: <199612201631.LAA01544@control.rutgers.edu>

VAPNIK-CHERVONENKIS DIMENSION OF RECURRENT NEURAL NETWORKS

Pascal Koiran, LIP-Lyon, France
Eduardo D. Sontag, Rutgers, USA

DIMACS Tech Report 96-56. (Summary to appear in Proceedings of Third European Conference on Computational Learning Theory, Jerusalem, March 17-19, 1997.)

ABSTRACT

This paper provides lower and upper bounds for the VC dimension of recurrent networks. Several types of activation functions are discussed, including threshold, polynomial, piecewise-polynomial and sigmoidal functions. The bounds depend on two independent parameters: the number w of weights in the network, and the length k of the input sequence. Ignoring multiplicative constants, the main results say roughly the following:
1. For architectures whose activation is any fixed nonlinear polynomial, the VC dimension is proportional to wk.
2. For architectures whose activation is any fixed piecewise polynomial, the VC dimension is between wk and w**2 k.
3. For architectures with threshold activations, the VC dimension is between w log(k/w) and min{wk log(wk), w**2 + w log(wk)}.
4. For the standard sigmoid tanh(x), the VC dimension is between wk and w**4 k**2.
============================================================================

The paper can be retrieved from the DIMACS archive: http://dimacs.rutgers.edu/TechnicalReports/1996.html as well as from Sontag's HomePage: http://www.math.rutgers.edu/~sontag (follow link to "online papers"). Many other related papers can also be found at this latter site.

If Web access is inconvenient, it is also possible to use anonymous FTP:
ftp dimacs.rutgers.edu
login: anonymous
cd pub/dimacs/TechnicalReports/TechReports/1996/
bin
get 96-56.ps.gz

Once the file is retrieved, use gunzip to uncompress and then print as postscript.
============================================================================

Comments welcome. Happy connecting holidays!

From seckel at klab.caltech.edu Fri Dec 20 12:50:44 1996
From: seckel at klab.caltech.edu (Al Seckel)
Date: Fri, 20 Dec 1996 09:50:44 -0800 (PST)
Subject: Caltech site on brain and cognitive science, illusions
Message-ID: <199612201750.JAA25870@thales.klab.caltech.edu>

Greetings,

Christof Koch of the California Institute of Technology and I have been actively researching and studying the neuronal correlates of visual and other sensory illusions. In this regard we have amassed the world's largest collection of illusions, most of which are unpublished. We have put together a massive multimedia project where the user can vary critical parameters on each illusion, thereby testing the underlying mechanism. This has never been possible before in the printed medium.

We have also put up an enormous web site on illusions, perception, and brain and cognitive science for interested professionals and laypeople, complete with interactive demonstrations, illusionary artwork, puzzles, bibliographies, recommended school projects, merchandise, and the like. Much of the material on the site is unpublished.
We would very much appreciate it if you could let your subscribers know about this site, as it contains up-to-date scientific explanations and demonstrations not available anywhere else and of extreme interest to our community.

The present web address is www.lainet.com/illusions
After Monday it will be www.illusionworks.com

Thanks very much!

al

From hochreit at informatik.tu-muenchen.de Mon Dec 30 06:32:48 1996
From: hochreit at informatik.tu-muenchen.de (Josef Hochreiter)
Date: Mon, 30 Dec 1996 12:32:48 +0100
Subject: LSTM paper announcement
Message-ID: <96Dec30.123252+0100met_dst.49137+394@papa.informatik.tu-muenchen.de>

LONG SHORT-TERM MEMORY

Sepp Hochreiter, TUM
Juergen Schmidhuber, IDSIA

Substantially revised and extended Version 3.0 of TR FKI-207-95 (32 pages, 130 KB; formerly 8 pages, 50 KB), with numerous additional experiments and details.

Abstract. Learning to store information over extended time intervals via recurrent backpropagation takes a very long time, mostly due to insufficient, decaying error back flow. We briefly review Hochreiter's 1991 analysis of this problem, then address it by introducing a novel, efficient method called "Long Short-Term Memory" (LSTM). LSTM can learn to bridge time lags in excess of 1000 steps by enforcing constant error flow through "constant error carrousels" (CECs) within special units. Multiplicative gate units learn to open and close access to CEC. LSTM's update complexity per time step is O(W), where W is the number of weights. In comparisons with RTRL, BPTT, Recurrent Cascade-Correlation, Elman nets, and Neural Sequence Chunking, LSTM leads to many more successful runs, and learns much faster. LSTM also solves complex long time lag tasks that have never been solved by previous recurrent net algorithms. LSTM works with local, distributed, real-valued, and noisy pattern representations.

Recent spin-off papers:

LSTM can solve hard long time lag problems. To appear in NIPS 9, MIT Press, Cambridge MA, 1997.

Bridging long time lags by weight guessing and "Long Short-Term Memory". In F. L. Silva, J. C. Principe, L. B. Almeida, eds., Frontiers in Artificial Intelligence and Applications, Volume 37, pages 65-72, IOS Press, Amsterdam, Netherlands, 1996.
_______________________________________________________________________

WWW/FTP pointers:
ftp://flop.informatik.tu-muenchen.de/pub/fki/fki-207-95rev.ps.gz
ftp://ftp.idsia.ch/pub/juergen/lstm.ps.gz

For additional recurrent net papers see our home pages. For instance, the original analysis of recurrent nets' error flow and long time lag problems is in Sepp's 1991 thesis (p. 19-21).
http://www7.informatik.tu-muenchen.de/~hochreit/pub.html
http://www.idsia.ch/~juergen/onlinepub.html

Happy new year!

Sepp & Juergen

PS: Why don't you stop by at IDSIA and give a talk next time you are near Switzerland or Italy?
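For readers who want the gist of the architecture before reading the TR, here is a bare-bones sketch of one memory cell as the abstract describes it (illustrative Python/NumPy, not the authors' code; the weight names and shapes are ours, and this early formulation has no forget gate, so none appears here):

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def lstm_cell_step(x, h, c, p):
        """One time step of a gated memory cell with state c (the CEC)."""
        z = np.concatenate([x, h])
        i = sigmoid(p["Wi"] @ z + p["bi"])   # input gate: opens/closes writing
        g = np.tanh(p["Wg"] @ z + p["bg"])   # candidate input
        c = c + i * g                        # CEC: purely additive state update,
                                             # so error can flow back undiminished
        o = sigmoid(p["Wo"] @ z + p["bo"])   # output gate: opens/closes reading
        h = o * np.tanh(c)
        return h, c

    n_in, n_hid = 4, 8
    rng = np.random.default_rng(1)
    p = {w: rng.normal(0.0, 0.1, (n_hid, n_in + n_hid)) for w in ("Wi", "Wg", "Wo")}
    p.update({b: np.zeros(n_hid) for b in ("bi", "bg", "bo")})
    h, c = np.zeros(n_hid), np.zeros(n_hid)
    for t in range(1000):                    # state can persist across long lags
        h, c = lstm_cell_step(rng.normal(size=n_in), h, c, p)

The additive update of c is what the abstract calls constant error flow: unlike a squashed recurrent state, nothing repeatedly multiplies (and thereby shrinks) the stored signal.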
From angelo at crc.ricoh.com Tue Dec 31 14:00:33 1996
From: angelo at crc.ricoh.com (Michael Angelo, 496-5735)
Date: Tue, 31 Dec 1996 11:00:33 -0800
Subject: Job Opening (in Menlo Park, CA.)
Message-ID: <199612311900.LAA02756@jaguar.crc.ricoh.com>

The Ricoh California Research Center's Machine Learning and Perception Group invites exceptionally talented candidates to apply for a position as

Research Scientist in Information Technology

Position description:
* We seek applicants to join a small team of scientists and engineers exploring the use of machine learning and pattern recognition techniques in the general area of office information systems. Past and ongoing projects include
  o computer lipreading and speech-based interfaces
  o theory and application of neural network pruning methods
  o providing paper and electronic documents with novel functionality
  o theory for VLSI implementations of learning algorithms
  o novel human-machine interfaces
  o applications of the world-wide web
* Ricoh CRC is a small center near Stanford University and other Silicon Valley landmarks; the atmosphere is collegial and exciting, and provides opportunities to expand Ricoh's products and services, travel nationally and internationally to professional conferences and presentations, publish in journals, and otherwise participate in the broader technical and professional community.

Candidate requirements:
* Ph.D. degree in Electrical Engineering, Computer Science or related field. (In exceptional cases, an M.S. degree with relevant work experience will suffice.)
* Exceptionally strong C programming and Unix skills (experimental, not necessarily production), with experience in programming mathematical algorithms. C++, Java, Mathematica, MatLab and some parallel language are desirable.
* Knowledge of neural networks, statistical and syntactic pattern recognition, image processing, handwriting recognition, natural language processing, and related topics is highly desirable.
* Strong communication and organizational skills and the ability to learn quickly and to work both independently with minimal instruction and as part of a small team.

Application deadline:
* January 30, 1997 (hardcopy required -- see below).

----------------------------------------------------------------------------

RICOH California Research Center (RCRC): RCRC is a small research center in Menlo Park, CA, near the Stanford University campus and other Silicon Valley landmarks. The roughly 20 researchers focus on image compression and processing, pattern recognition, image and document analysis, artificial intelligence, machine learning, electronic service, and novel hardware for implementing computationally expensive algorithms. RCRC is a part of RICOH Corporation, the wholly owned subsidiary of RICOH Company, Ltd. in Japan. RICOH is a pioneer in facsimile, copiers, optical equipment, office automation products and more. Ricoh Corporation is an Equal Employment Opportunity Employer.

----------------------------------------------------------------------------

Please send any questions by e-mail to the address below, and type "Programming job" as your header line. Full applications (which must include a resume and the names and addresses of at least two people familiar with your work) should be sent by surface mail (no e-mail, ftp or html applications will be accepted) to:

Dr. David G. Stork
Chief Scientist
RICOH California Research Center
2882 Sand Hill Road, Suite 115
Menlo Park CA 94025
stork at crc.ricoh.com

----------------------------------------------------------------------------

Web Version: http://www.crc.ricoh.com/jobs/MLPjob.html