From skrzypek at CS.UCLA.EDU Fri Dec 1 13:42:53 1989 From: skrzypek at CS.UCLA.EDU (Dr. Josef Skrzypek) Date: Fri, 1 Dec 89 10:42:53 PST Subject: TR available Message-ID: <8912011842.AA05532@retina.cs.ucla.edu> UCLA MPL TR 89-10 Visual Recognition of Script Characters; Neural Network Architectures. Josef Skrzypek Jeff Hoffman Machine Perception Laboratory Computer Science Department University of California Los Angeles, CA 90024 Visual recognition of script characters is introduced in the context of the neural network paradigm and the results of applying one specific neural architecture are analyzed. First, computer classification of script characters is partitioned into preprocessing, recognition and postprocessing techniques which are briefly reviewed in terms of suitability for implementation as neural net architectures. The second part of the paper introduces one example of neural net solution to script recognition. Handwriting is assumed to be defined as concatenation of ballistic hand movements where characters can be represented as functions of position and velocity. We adapt a hypothesized model of human script generation where characters can be composed from a limited number of basic strokes which are learned using visual and positional feedback. The neural representation of these characters is used for assembling motor program during writing and it can be used for their visual recognition during reading. A modified, three-layer "backpropagation" algorithm is used to learn features of each single character that are independent of writing style. Preliminary results suggest 80% recognition rate. From netlist at psych.Stanford.EDU Sat Dec 2 11:48:52 1989 From: netlist at psych.Stanford.EDU (Mark Gluck) Date: Sat, 2 Dec 89 08:48:52 PST Subject: REMINDER: TODAY (MONDAY) -- R. Sutton @ 3:45pm Message-ID: Stanford University Interdisciplinary Colloquium Series: Adaptive Networks and their Applications December 4th (Monday, 3:45pm): Room 380-380C ******************************************************************************** DYNA: AN INTEGRATED ARCHITECTURE FOR LEARNING, PLANNING, AND REACTING Richard S. Sutton GTE Laboratories Incorporated ******************************************************************************** Abstract Location: Room 380-380C, which can be reached through the lower level between the Psychology and Mathematical Sciences buildings. Level: Technically oriented for persons working in related areas. Mailing lists: To be added to the network mailing list, netmail to netlist at psych.stanford.edu with "addme" as your subject header. Additional information: Contact Mark Gluck (gluck at psych.stanford.edu). From LUBTODI%YALEVM.BITNET at VMA.CC.CMU.EDU Sun Dec 3 16:10:00 1989 From: LUBTODI%YALEVM.BITNET at VMA.CC.CMU.EDU (LUBTODI%YALEVM.BITNET@VMA.CC.CMU.EDU) Date: Sun, 03 Dec 89 16:10 EST Subject: requesting references to problem solving task models Message-ID: I am trying to compile a bibliography about connectionist/neural network research on high level cognitive problem solving. I would appreciate any suggestions on relevant arrticles, book chapters, or manuscripts. Please send complete citation information if possible. Some relevant topics include: analogical or inductive reasoning, logical (deductive) reasoning, multistage decision tasks, models of intelligence, schema implementation and use, or other applications of connectionist/neural network models to high-level cognitive tasks. 
Todd Lubart Yale University Department of Psychology Box 11A Yale Station New Haven CT 06520 BITNET ADDRESS: LUBTODI at YALEVM From harnad at clarity.Princeton.EDU Tue Dec 5 01:01:46 1989 From: harnad at clarity.Princeton.EDU (Stevan Harnad) Date: Tue, 5 Dec 89 01:01:46 EST Subject: Emperor's New Mind: BBS Call for Commentators Message-ID: <8912050601.AA01777@suspicion.Princeton.EDU> Below is the synopsis of a book that will be accorded a multiple book review (20 - 30 multidisciplinary reviews, followed by the author's response) in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal that provides Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Reviewers must be current BBS Associates or nominated by a current BBS Associate. To be considered as a reviewer for this book, to suggest other appropriate reviewers, or for information about how to become a BBS Associate, please send email to: harnad at confidence.princeton.edu or write to: BBS, 20 Nassau Street, #240, Princeton NJ 08542 [tel: 609-921-7771] ____________________________________________________________________ THE EMPEROR'S NEW MIND: CONCERNING COMPUTERS, MINDS AND THE LAWS OF PHYSICS Roger Penrose Rouse Ball Professor of Mathematics University of Oxford The Emperor's New Mind is an attempt to put forward a scientific alternative to the viewpoint of "Strong AI," according to which mental activity is merely the acting out of some algorithmic procedure. John Searle and other thinkers have likewise argued that mere calculation does not, of itself, evoke conscious mental attributes, such as understanding or intentionality, but they are still prepared to accept that the action of the brain, like that of any other physical object, could in principle be simulated by a computer. In my book I go further than this and suggest that the outward manifestations of conscious mental activity cannot even be properly simulated by calculation. To support this view I use various arguments to show that the results of mathematical insight, in particular, do not seem to be obtained algorithmically. The main thrust of this work, however, is to present an overview of the present state of physical understanding and to show that an important gap exists at the point where quantum and classical physics meet, and to speculate on how the conscious brain might be taking advantage of whatever new physics is needed to fill this gap, in order to achieve its non-algorithmic effects. From noel%CS.EXETER.AC.UK at VMA.CC.CMU.EDU Tue Dec 5 11:59:44 1989 From: noel%CS.EXETER.AC.UK at VMA.CC.CMU.EDU (Noel Sharkey) Date: Tue, 5 Dec 89 16:59:44 GMT Subject: reminder Message-ID: <6765.8912051659@kaos.cs.exeter.ac.uk> A reminder about the special issue of connection science. keep those submissions coming. Issues 2 and 3 are in press and may be out before christmas. ******* CALL FOR PAPERS ********* CONNECTION SCIENCE (Journal of Neural Computing, Artificial Intelligence and Cognitive Research) Special Issue CONNECTIONIST RESEARCH ON NATURAL LANGUAGE Editor: Noel E. Sharkey, University of Exeter Special Editorial Review Panel Robert Allen, Bell Communication Research Garrison W. Cottrell, University of California, San Diego Michael G. Dyer, University of California, Los Angeles Jeffrey L. Elman, University of California, San Diego George Lakoff, University of California, Berkeley Wendy W. 
Lehnert, University of Massachusetts at Amherst Jordan Pollack, Ohio State University Ronan Reilly, Beckman Institute, Illinois Bart Selman, University of Toronto Paul Smolensky, University of Colorado, Boulder This special issue will accept submissions of full length connectionist papers and brief reports from any area of natural language research including: Connectionist applications to AI problems in natural language (e.g. paraphrase, summarisation, question answering). New formalisms or algorithms for natural language processing. Simulations of psychological data. Memory modules or inference mechanisms to support natural language processing. Representational methods for natural language. Techniques for ambiguity resolution. Parsing. Speech recognition, production, and processing. Connectionist approaches to linguistics (phonology, morphology etc.). Submissions of short reports or recent updates will also be accepted for the Brief Reports section in the journal. No paper should be currently submitted elsewhere. DEADLINES Deadline for submissions: December 15th 1989 Decision/reviews by: February 1990 Papers may be accepted to appear in regular issues if there is insufficient space in the special issue. For further information about the journal please contact Lyn Shackleton (Assistant Editor) Centre for Connection Science JANET: lyn at uk.ac.exeter.cs Dept. Computer Science University of Exeter UUCP: !ukc!expya!lyn Exeter EX4 4PT Devon BITNET: lyn at cs.exeter.ac.uk@UKACRL U.K. FAX: (0392) 264067 From kannan at faulty.che.utexas.edu Tue Dec 5 14:29:05 1989 From: kannan at faulty.che.utexas.edu (kannan@faulty.che.utexas.edu) Date: Tue, 5 Dec 89 13:29:05 CST Subject: Question on BP.. Message-ID: <8912051929.AA10774@faulty.che.utexas.edu.che.utexas.edu> Hi, I would like to know if continuous-valued inputs can be used for the Back Propagation algorithm? If so, what are the problems faced in any of the simulations done? I would also like to know if there is any software available which would allow one to give inputs, say, between 0 and 1. Thanks, Kannan From Scott.Fahlman at B.GP.CS.CMU.EDU Tue Dec 5 16:43:45 1989 From: Scott.Fahlman at B.GP.CS.CMU.EDU (Scott.Fahlman@B.GP.CS.CMU.EDU) Date: Tue, 05 Dec 89 16:43:45 EST Subject: Question on BP.. In-Reply-To: Your message of Tue, 05 Dec 89 13:29:05 -0600. <8912051929.AA10774@faulty.che.utexas.edu.che.utexas.edu> Message-ID: I would like to know if continuous-valued inputs can be used for the Back Propagation algorithm? Yes, definitely. For two good examples, see the work of Lapedes and Farber at Los Alamos on predicting time series (I don't have the reference handy) and the work on the two-spirals benchmark. The latter is described in a paper by Lang & Witbrock in the Proceedings of the 1988 Connectionist Models Summer School and in follow-up work presented by me and Chris Lebiere at the 1989 NIPS conference (proceedings due out in April or so, tech report sooner). In addition, most of the problems I've seen on phoneme labeling and sonar use multiple continuous-valued inputs. -- Scott Fahlman From slehar at bucasb.BU.EDU Tue Dec 5 23:20:29 1989 From: slehar at bucasb.BU.EDU (slehar@bucasb.BU.EDU) Date: Tue, 5 Dec 89 23:20:29 EST Subject: Question on BP.. In-Reply-To: connectionists@c.cs.cmu.edu's message of 5 Dec 89 21:07:14 GMT Message-ID: <8912060420.AA22688@bucasd.bu.edu> Yeah- I did a hand-sketched character recognition program with backprop (see ICNN 1986 was it? the Boston conference).
I had a 100 x 100 sketch pad which I down-sampled to a 10 x 10 grid which became an input layer of 100 neurons. A run-time option allowed for either thresholding, which produced on or off binary inputs, or proportional sampling, which produced graded neurons between 0.0 and 1.0. The algorithm quite effortlessly handles both. To the human observer, the graded inputs looked much more recognizable because you could tell whether the sketch just nicked the corner of a grid square, or ran smack through the middle of it. There was, in fact, more information in the graded version. To my surprise, backprop actually preferred the binarized version. It learned faster, and discriminated better. From peterc at cs.brandeis.edu Wed Dec 6 13:48:09 1989 From: peterc at cs.brandeis.edu (Peter Cariani) Date: Wed, 6 Dec 89 13:48:09 est Subject: Alternative conceptions of symbol systems Message-ID: Comments on Harnad's definition of symbol system: We all owe to Steve Harnad the initiation of this important discussion. I believe that Harnad has taken the discourse of the symbol grounding problem in the right direction, toward the grounding of symbols in their interactions with the world at large. I think, however, that we could go further in this direction, and in the process continue to re-examine some of the fundamental assumptions that are still in force. The perspective presented here is elaborated much more fully and systematically in a doctoral dissertation that I completed in May of this year: Cariani, Peter (1989) On the Design of Devices With Emergent Semantic Functions. Ph.D. Dissertation, Department of Systems Science, SUNY-Binghamton, University Microfilms, Ann Arbor, Michigan. My work is primarily based on that of theoretical biologists Howard Pattee and Robert Rosen. Pattee has been elaborating on the evolutionary origins of symbols in biological systems while Rosen has concentrated on the modelling relations that biological organisms implement. The Hungarian theoretical biologist George Kampis has also recently published work along these lines. I would like to apologise for the length of this response, but I come out of a field which is very small and virtually unknown by those outside of it, so many of the basic concepts must be covered to avoid misunderstandings. Here are some suggestions for clarifying this murky discourse about symbol systems: 1) Define the tokens in terms of observable properties. The means of recognizing the tokens or states of the system must be given explicitly, such that all members of a community of observer-participants can reliably agree on what "state" the physical system is in. Without this specification the definition is gratuitous hand-waving. I stress this because there are a number of papers in the literature which discuss "computations" in the physical world (e.g. "the universe is a gigantic computation in progress") without the slightest indication of what the symbol tokens are that are being manipulated, what the relevant states of such systems might be, or how we would go about determining, in concrete terms, whether a given physical system is to be classified as a physical symbol system. One has to be careful when one says "practically everything can be interpreted as rule-governed."
Of course we can easily wave our hands and say, yes, those leaves fluttering in the breeze over there are rule-governed, without having any idea what the specific rules are (or, for that matter, what the states are), but to demonstrate that a phenomenon is rule-governed, we should show how we would come to see it as such: we should concretely show what measurements need to be made, we should make them, and then articulate the rules which describe/govern the behavior. If we say "a computer is a physical symbol system" we mean that if we look at the computer through the appropriate observational frame, measuring the appropriate voltages at the logic gates, then we can use this device to consistently and reliably implement a deterministic input-output function. For each initial distinguishable state of affairs, by operating the device we always arrive at one and only one end state within some humanly-relevant amount of time. This is a functionally-based, physically-implemented concept of a formal system, one which is related to Hilbert's idea of reliable physical operations on concrete symbols leading to consistent results. Note that this definition is distinct from logicist/platonist definitions which include nonconcrete objects (e.g. sets of sets) or physically unrealizable objects (e.g. potential and actual infinities, indefinitely extendible tapes). 2) Abandon the explicit-implicit rule distinction. First, I'm not sure if Wittgenstein's distinction between "explicit" and "implicit" rule-following is appropriate here, since we are taking the role of external observers rather than participants in a language-game. If the purpose of the definition is to give us criteria to decide whether we are participating in a formal system, then we must know the rules in order to follow them. If the purpose is to identify "physical symbol systems" in nature and in human artefacts, then this distinction is irrelevant. What does it mean for a computer to explicitly or implicitly carry out a logical operation? If it made a difference, then the device would cease to be wholly syntactic. If it doesn't make a difference, then we don't need it in our definition. Does the brain implement a physical symbol system, and if so, does it follow rules explicitly or implicitly? How would we decide? 3) Abandon semantic interpretability. I'm not sure if I understand fully the motivation behind this criterion of semantic interpretability. An external observer can assign whatever meanings s/he chooses to the tokens of the formal device. This criterion makes the definition very subjective, because it depends upon an arbitrary assignment of meaning. I don't even see how this is restrictive, since the observer can always come up with purely whimsical mappings, or simply let the tokens stand for themselves. Note that "semantic interpretability" does not confer upon the physical symbol system its own semantics. The relations of the symbols manipulated in computer programs to the world at large are completely parasitical on human interpreters, unless the computer is part of a robot (i.e. possesses its own sensors and effectors). Merely being semantically interpretable doesn't ground the semantics in a definite way; when we say "X in my program represents the number of speeding violations on the Mass Pike" we stabilize the relation of the symbol X in the computer relative to ourselves (assuming that we can be completely consistent in our interpretation of the program).
But each of us has a different tacit interpretation of what "the number of speeding violations on the Mass Pike" means. (Does a violator have to be caught for it to be a violation? Are speeding police cars violations? How is speed measured?) In order for this tacit interpretation to be made explicit we would need to calibrate our perceptions and their classifications, along with our use of language to communicate them, so that we as a community could reach agreement on our interpretations. The wrong turn that Carnap and many others made in the 1930's was to assume that these interpretations could be completely formalized, that a "logical semantics" was possible in which one could unambiguously determine the "meaning of an expression" within the context of other expressions. The only way to do this is to formalize the context completely, but in doing so you transform a semantic relation into a syntactic one. The semantic relation of the symbol to the nonsymbolic world at large gets reduced to a syntactic, rule-governed relation of the symbol to other symbols. (Contingent truths become reduced to necessary truths.) What Carnap tried to say was that as long as a proposition referred to an observation statement (which refers to an act of observation), then that proposition has semantic content. This has led us astray to the point that many people no longer believe that they need to materially connect the symbols to the world through perception and action, that merely referring to a potential connection is enough. This is perhaps the most serious failing of symbolic AI: the failure to ground the symbols used by its programs in materially implemented connections to the external world. 4) Abandon semantic theories based on reference. A much better alternative to logical semantics involves replacing these syntactic theories of reference with a pragmatist semiotic when we go to analyze the roles of symbols in various kinds of devices. Pragmatist semiotics (as developed within Charles Morris' framework) avoid the formal reductionism and realist assumptions of referential theories of meaning by replacing correspondences between symbols and "objective" referents with physically implemented semantic operations (e.g. measurement, perception, control, action). These ideas are developed more fully in my dissertation. What one must do to semantically ground the symbols is to connect them to the world via sensors and effectors. If they are to be useful to the device or organism, they must be materially linked to the world in a nonarbitrary way, rather than referentially connected in someone's mind or postulated as unspecified logical ("causal") connections (as in "possible world" semantics). 5) Abandon Newell and Pylyshyn's Symbol Level. Upon close examination of both of their rationales for a separate symbol level, one finds that it rests precariously upon a distinction between the Turing machine's internal states and the state of affairs on the tape (Pylyshyn, 1984, pp. 68-74). Now the essential nature of this distinction is maintained because one is potentially infinite and the other is finite (else one could simply make a big finite-state automaton and the distinction would be an arbitrary labelling of the global machine states), but physically realizable devices cannot be potentially infinite, so the essential, nonarbitrary character of the distinction vanishes (Cariani, 1989, Appendix 2). 6) Purge the definition of nonphysical, platonic entities (or at least recognize them as such and be aware of them).
For example, the definition of physical symbol systems is intimately tied up with Turing's definition of computation, but, as von Neumann noted, this is not a physical definition; it is a formal one. Now, physically realizable automata cannot have indefinitely extendible tapes, so the relevance of potentially-infinite computations to real world computations is dubious. Everything we can physically compute can be described in terms of finite-state automata (finite tape Turing machines). We run out of memory space and processing time long before we ever encounter computability limitations. Computational complexity matters, computability doesn't. I'd be especially interested in thoughtful counter-arguments to this point. 7) Alternative I: Adopt a physical theory of symbolic action. Howard Pattee has been developing a physical theory of symbolic function for 20 years--symbolic processes are those which can be described in terms of nonholonomic constraints (in terms of the equations of motion) and basins of attraction (in terms of trajectories) (see refs: Pattee; Cariani, Minch, Rosen). (Next summer there will be a workshop entitled "Symbols and Dynamics" at the ISSS meeting in Portland, Ore., July 8-13, 1990. Contact: Gail Fleischaker, 76 Porter St., Somerville, MA 02143 for more info.) The only disadvantage of these approaches lies in their classical/realist assumption of complete knowledge of the state space within which the symbolic activity occurs. These premises are deeply embedded in the very terms of the discourse, but nevertheless, this descriptive physical language is exceedingly useful as long as the limitations of these assumptions are constantly kept in mind. To translate from the semiotic to the physical, syntactic relations are those processes for which a nonholonomic, rate-independent equation of constraint can completely replace the rate-dependent laws of motion. For an electronic computer, we can replace all of the microscopic electromagnetic equations of motion describing the trajectories of electrons with macroscopic state-transition rules describing gate voltages in terms of binary states. These state-transition rules are not rate-dependent, since they depend upon successions of states rather than time; consequently time need not enter explicitly when describing the behavior of a computer in terms of binary states of gate voltages. Semantic relations are those processes which can be described in terms of rate-independent terms coupled to rate-dependent terms: one side of the constraint equation is symbolic and rate-independent, the other half is nonsymbolic and rate-dependent. Processes of measurement are semantic in character: a rate-dependent, nonsymbolic interaction gives rise to a rate-independent symbolic output. Here pragmatic relations are those processes which change the structure of the organism or device, which appear in the formalism as changes in the nonholonomic constraints over time. 8) Alternative II: Adopt a phenomenally grounded systems-theoretic definition. Part of my work has been to ground the definition of symbol in terms of the observed behavior of a system. This is the only way we will arrive at an unambiguous definition. We select a set of measuring devices which implement distinctions on the world which become our observable "states." We observe the behavior of the physical system through this observational framework. This strategy is similar to the way W. Ross Ashby grounded his theory of systems.
Either the state-transitions are deterministic--state A is always followed by state B which is always followed by state G--or they are nondeterministic--state D is sometimes followed by state F and sometimes followed by state J. Here the relation between states A, B, and G appears to be symbolic, because the behavior can be completely captured in terms of rules, whereas the relation between the states D, F, and J appears nonsymbolic, because the behavior depends upon aspects of the world which are not captured by this observational frame. Syntactic, rule-governed symbol manipulations appear to an external observer as deterministic state transitions (in Ashby's terms, "a state-determined system"). Semantic processes appear to the observer as nondeterministic, contingent state transitions leading to states which appear as symbolic. Pragmatic relations appear as changes in the structure of the observed state-transitions. (A small illustrative sketch of this classification appears after Alternative IV below.) 9) Alternative III: Adopt a physical, mechanism-based definition of symbol systems. Symbolic and nonsymbolic can also be viewed in terms of "digital" and "analog" in the sense of differentiated (discrete) and nondifferentiated (continuous). Sensors implement semantic A-to-D operations. Logical operations ("computations") implement syntactic, determinate D-to-D transformations. Controls implement semantic D-to-A operations. One has to be careful here, because there are many confusing uses of these words (e.g. "analog computation"), and what appears to be "analog" or "digital" is a function of how you look at the device. Given a particular observational framework and a common usage of terms, however, these distinctions can be made reliable. I would argue that von Neumann's major philosophical works (General & Logical Theory of Automata, Self-Reproducing Automata, The Computer and the Brain) all take this approach. 10) Alternative IV: Adopt a semiotic-functionalist definition of symbol systems. It can be argued that the basic functionalities needed in the modelling relation are the ability to convert nonsymbolic interactions with the world into symbols (measurement), the ability to manipulate symbols in a definite, rule-governed way (computations), and the ability to use a symbol to direct action on the nonsymbolic world (controls). I have argued that these functionalities are irreducible: one cannot achieve measurements by doing computations, simply because measurement involves a contingent state-transition, where two or more possible observed outcomes are reduced to one observed outcome, whereas computation involves a necessary state-transition, where each state has but one observed outcome. These assertions are similar to the epistemological positions adopted by Bohr, von Neumann, Aristotle and many others. In such a definition, a physical symbol system is defined in terms of its use to us as observer-participants. Are we trying to gain information about the external world by reducing the possible observed states of a sensor to one (by performing a measurement)? Are we trying to manipulate symbols in a consistent, reliable way so that we always arrive at the same outcome given the same input strings and rules? If so, we are performing computations. Are we trying to use symbols to change the nonsymbolic world by acting on it? If so, we are employing symbolically-directed control operations.
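To make the state-transition criterion of Alternative II a bit more concrete, here is a minimal sketch, in Python; the function classify_transitions and the toy observational record are hypothetical illustrations, not anything implemented or described in the discussion above. Under a chosen observational frame, a state whose observed successor is always the same looks rule-governed ("syntactic" in the sense used here), while a state with several observed successors looks contingent on aspects of the world the frame does not capture.

    from collections import defaultdict

    def classify_transitions(observed_sequence):
        # Group observed successors by predecessor state.
        successors = defaultdict(set)
        for current, following in zip(observed_sequence, observed_sequence[1:]):
            successors[current].add(following)
        # A single observed successor looks deterministic (a state-determined
        # system in Ashby's sense); several successors look nondeterministic.
        deterministic = {s for s, nxt in successors.items() if len(nxt) == 1}
        nondeterministic = {s for s, nxt in successors.items() if len(nxt) > 1}
        return deterministic, nondeterministic

    # Hypothetical observational record: A, B, and G each have a single
    # observed successor, while D is followed sometimes by F and sometimes by J.
    record = ["A", "B", "G", "D", "F", "D", "J", "D", "F", "A", "B", "G", "D"]
    det, nondet = classify_transitions(record)
    print("deterministic states:   ", sorted(det))
    print("nondeterministic states:", sorted(nondet))

Such a classification is, of course, only as good as the observational frame: changing the set of measuring devices, and hence the states one can distinguish, can move a transition from one category to the other, which is exactly the point of grounding the definition in observed behavior.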
In summary, there are many worthwhile alternatives to the basic assumptions that have been handed down to us through logical positivism, model-theoretic semantics, artificial intelligence and cognitive psychology. Steve Harnad has done us a great service in making many of these assumptions visible to us and clarifying them in the process. There are other conceptual frameworks which can be of great assistance to us as we engage in this process: theoretical biology, semiotics/pragmatist philosophy, cybernetics and systems theory. It is difficult to entertain ideas which challenge cherished modes of thought, but such critical questioning and debate are indispensable if we are to deepen our understanding of the world around us. References: ------------------------------------------------------------------------------ Cariani, Peter (1989) On the Design of Devices with Emergent Semantic Functions. PhD Dissertation, Department of Systems Science, State University of New York at Binghamton; University Microfilms, Ann Arbor, MI. (1989) Adaptivity, emergence, and machine-environment dependencies. Proc 33rd Ann Mtg Intl Soc System Sciences (ISSS), July, Edinburgh, III:31-37. Kampis, George (1988) Two approaches for defining "systems." Int. J. Gen. Systems (IJGS), vol 15, pp. 75-80. (1988) On the modelling relation. Systems Research, vol 5, (2), pp. 131-44. (1988) Some problems of system descriptions I: Function, II: Information. Int. J. Gen. Systems 13:143-171. Minch, Eric (1988) Representations of Hierarchical Structures in Evolving Networks. PhD Dissertation, Dept. of Systems Science, SUNY-Binghamton. Morris, Charles (1956) Foundations of the Theory of Signs. In: Foundations of the Unity of Science, Vol. I, Neurath, Carnap, & Morris, eds, UChicago. Pattee, Howard H. (1968) The physical basis of coding and reliability in biological evolution. In: Towards a Theoretical Biology (TTB) Vol. 1, C.H. Waddington, ed., Aldine, Chicago. (1969) How does a molecule become a message? Dev Biol Supp 3: 1-16. (1972) Laws and constraints, symbols and languages. In: TTB, Vol. 4. (1973) Physical problems in the origin of natural controls. In: Biogenesis, Evolution, Homeostasis. Alfred Locker, ed., Pergamon Press, New York. (1973) Discrete and continuous processes in computers and brains. In: The Physics & Mathematics of the Nervous System, Guttinger & Conrad, eds, S-V. (1977) Dynamic and linguistic modes of complex systems. IJGS 3:259-266. (1979) The complementarity principle and the origin of macromolecular information. Biosystems 11: 217-226. (1982) Cell psychology: an evolutionary view of the symbol-matter problem. Cognition & Brain Theory 5:325-341. (1985) Universal principles of measurement and language functions in evolving systems. In: Complexity, Language, and Life, Casti & Karlqvist, eds, S-V. (1988) Instabilities and information in biological self-organization. In: Self-Organizing Systems: The Emergence of Order. E. Yates, ed., Plenum Press. (1989) Simulations, realizations, and theories of life. In: Artificial Life, C. Langton, ed., Addison-Wesley. Rosen, Robert (1973) On the generation of metabolic novelties in evolution. In: Biogenesis, Evolution, Homeostasis. A. Locker, ed., Pergamon Press, New York. (1974) Biological systems as organizational paradigms. IJGS 1:165-174. (1978) Fundamentals of Measurement and Representation of Natural Systems. North Holland (N-H), New York. (1985) Anticipatory Systems. Pergamon Press, New York. (1986) Causal structures in brains and machines. IJGS 12: 107-126.
(1987) On the scope of syntactics in mathematics and science: the machine metaphor. In: Real Brains Artificial Minds. Casti & Karlqvist, eds N-H. From pollack at cis.ohio-state.edu Fri Dec 8 10:16:36 1989 From: pollack at cis.ohio-state.edu (Jordan B Pollack) Date: Fri, 8 Dec 89 10:16:36 EST Subject: New CogSci Conf. Deadline Message-ID: <8912081516.AA13708@toto.cis.ohio-state.edu> I recently communicated with one of the organizers of this years Cognitive Science Conference, who informed me that a new Call For Papers was going out soon with a revised deadline of March 15th. >> Date: Fri, 8 Dec 89 09:22:18 EST >> From: adelson%cs.tufts.edu at RELAY.CS.NET >> To: pollack at cis.ohio-state.edu >> Subject: Re: verification >> >> Jordan, >> We have moved the deadline to March 15 for this year's Cog Sci >> papers. >> >> Talk to you soon, >> B Holiday Cheers, Jordan From french at cogsci.indiana.edu Fri Dec 8 14:11:06 1989 From: french at cogsci.indiana.edu (Bob French) Date: Fri, 8 Dec 89 14:11:06 EST Subject: CogSci Deadline addendum Message-ID: There is a "no revisions" stipulation that goes along with the March 15 CogSci Conference paper submission deadline. Below a message I received from Massimo Piattelli-Palmarini (Nov. 21): > We have received a number of such protests. In the next few days all > members will receive a new deadline communication. The new deadline > is March 15, but then there will be no revisions for the accepted > papers. And these must be submitted by that date as photo-ready. > I am sorry, but the organizers of last year's meeting had a lot > of difficulties in meeting the publisher's deadline and have the > Proceedings ready at the meeting. They could do it, I am told, > only because by sheer good luck the printer happened to be located > at Ann Arbor. With the new deadlines, which were decided upon only > three days ago, there is enough time to go to print. > Sorry for all inconvenience this may cause. Yours truly. > Massimo Piattelli-Palmarini. Bob French From B344DSL%UTARLG.ARL.UTEXAS.EDU at ricevm1.rice.edu Sun Dec 10 14:56:00 1989 From: B344DSL%UTARLG.ARL.UTEXAS.EDU at ricevm1.rice.edu (B344DSL%UTARLG.ARL.UTEXAS.EDU@ricevm1.rice.edu) Date: Sun, 10 Dec 89 13:56 CST Subject: No subject Message-ID: This message is an announcement of a forthcoming graduate textbook neural networks by Daniel S. Levine. The title of the book is Introduction to Neural and Cognitive Modeling, and the publisher is Lawrence Erlbaum Associates, Inc. The book should be in production early in 1990, so should, with luck, be ready by the start of the Fall, 1990 semester at universities. Chapters 2 to 7 will contain homework exercises. Some of the homework problems will involve computer simulations of models already in the literature. Others will involve thought experiments about whether a particular network can model a particular cognitive process, or how that network might be modified to do so. The table of contents follows. Please contact the author or publisher for further information. Author: Daniel S. Levine Department of Mathematics University of Texas at Arlington Arlington, TX 76019-9408 817-273-3598 b344dsl at utarlg.bitnet Publisher: Lawrence Erlbaum Associates, Inc. 365 Broadway Hillsdale, NJ 07642 201-666-4110 Table of Contents: PREFACE CHAPTER 1: BRAIN AND MACHINE: THE SAME PRINCIPLES? What are Neural Networks? What are Some Principles of Neural Network Theory? 
Methodological Considerations i 36 CHAPTER 2: HISTORICAL OUTLINE 2.1 -- Digital Approaches The Mc Culloch-Pitts network Early Approaches to Modeling Learning: Hull and Hebb Rosenblatt's Perceptrons Some Experiments with Perceptrons The Divergence of Artificial Intelligence and Neural Modeling 2.2 -- Continuous and Random-net Approaches Rashevsky's Work Early Random Net Models Reconciling Randomness and Specificity 2.3 -- Definitions and Detailed Rules for Rosenblatt's Perceptrons CHAPTER 3: ASSOCIATIVE LEARNING AND SYNAPTIC PLASTICITY 3.1 -- Physiological Bases for Learning 3.2 -- Rules for Associative Learning Outstars and Other Early Models of Grossberg Anderson's Connection Matrices Kohonen's Work 3.3 -- Learning Rules Related to Changes in Node Activities Klopf's Hedonistic Neurons and the Sutton-Barto Learning Rule Error Correction and Back Propagation The Differential Hebbian Idea Gated Dipole Theory 3.4 -- Associative Learning of Patterns Kohonen's Recent Work: Autoassociation and Heteroassociation Kosko's Bidirectional Associative Memory 3.5 -- Equations and Some Physiological Details Neurophysiological Principles Equations for Grossberg's Outstar Kohonen's Early Equations (Example: Simulation of Face Recognition) Derivation of the Back Propagation Learning Law Due to Rumelhart, Hinton, and Williams Equations for Sutton and Barto's Learning Network Gated Dipole Equations Due to Grossberg Kosko's Bidirectional Associative Memory (BAM) Kohonen's Autoassociative Maps CHAPTER 4: COMPETITION, INHIBITION, SPATIAL CONTRAST ENHANCEMENT, AND SHORT-TERM MEMORY 4.1 -- Early Studies and General Themes (Contrast Enhancement, Competition, and Normalization) Nonrecurrent Versus Recurrent Lateral Inhibition 4.2 -- Lateral Inhibition and Excitation Between Sensory Representations Wilson and Cowan's Work Work of Grossberg and Colleagues Work of Amari and Colleagues Energy Functions in the Cohen-Grossberg and Hopfield- Tank Models The Implications of Approach to Equilibrium 4.3 -- Competition and Cooperation in Visual Pattern Recognition Models Visual Illusions Boundary Detection Versus Feature Detection Binocular and Stereoscopic Vision Comparison of Grossberg's and Marr's Approaches 4.4 -- Uses of Lateral Inhibition in Higher-level Processing 4.5 -- Equations for Various Competitive and Lateral Inhibition Models Equations of Sperling and Sondhi Equations of Wilson and Cowan Equations of Grossberg and his Co-workers: Analytical Results Equations of Hopfield and Tank Equations of Amari and Arbib CHAPTER 5: CONDITIONING, ATTENTION, REINFORCEMENT, AND COMPLEX ASSOCIATIVE LEARNING 5.1 -- Network Models of Classical Conditioning Early Work: Uttley and Brindley Rescorla and Wagner's Psychological Model Grossberg: Drive Representations and Synchronization Aversive Conditioning and Extinction Differential Hebbian Theory Versus Gated Dipole Theory 5.2 -- Attention and Short Term Memory in the Context of Conditioning Grossberg's Approach to Attention Sutton and Barto's Approach to Blocking Some Contrasts Between the Above Two Approaches Further Connections with Invertebrate Neurophysiology Gated Dipoles and Aversive Conditioning 5.3 -- Equations for Some Conditioning and Associative Learning Models Klopf's Drive-reinforcement Model Some Later Variations of the Sutton-Barto model: Temporal Difference The READ Circuit of Grossberg, Schmajuk, and Levine The Aplysia Model of Gingrich and Byrne CHAPTER 6: CODING AND CATEGORIZATION 6.1 -- Interactions Between Short and Long Term Memory in Code Development: Examples 
from Vision Malsburg's Model with Synaptic Conservation Grossberg's Model with Pattern Normalization Mathematical Results of Grossberg and Amari Feature Detection Models with Stochastic Elements From Feature Coding to Categorization 6.2 -- Supervised Classification Models The Back Propagation Network and its Variants Some Models from the Anderson-Cooper School 6.3 -- Unsupervised Classification Models The Rumelhart-Zipser Competitive Learning Algorithm Adaptive Resonance Theory Edelman and Neural Darwinism 6.4 -- Translation and Scale Invariance 6.5 -- Equations for Various Coding and Categorization Models Malsburg's and Grossberg's Development of Feature Detectors Some Implementation Issues for Back Propagation Equations Brain-state-in-a-box Equations Rumelhart and Zipser's Competitive Learning Equations Adaptive Resonance Equations CHAPTER 7: OPTIMIZATION, CONTROL, DECISION MAKING, AND KNOWLEDGE REPRESENTATION 7.1 -- Optimization and Control Hopfield, Tank, and the Traveling-Salesman Problem Simulated Annealing and Boltzmann Machines Motor Control: the Example of Eye Movements Motor Control: Arm Movements Speech Recognition and Synthesis Robotic Control 7.2 -- Decision Making and Knowledge Representation What, if Anything, do Biological Organisms Optimize? Affect, Habit, and Novelty in Neural Network Theories Neural Control Circuits, Neurochemical Modulation, and Mental Illness Some Comments on Models of Specific Brain Areas Knowledge Representation: Letters and Words Knowledge Representation: Concepts and Inference 7.3 -- Equations for a Few Neural Networks Performing Complex Tasks Hopfield and Tank's "Traveling Salesman" Network The Boltzmann Machine Grossberg and Kuperstein's Eye Movement Network VITE and Passive Update of Position (PUP) for Arm Movement Control Affective Balance and Decision Making Under Risk CHAPTER 8: A FEW RECENT ADVANCES IN NEUROCOMPUTING AND NEUROBIOLOGY 8.1 -- Some "Toy" and Real World Computing Applications 8.2 -- Some Biological Discoveries APPENDIX 1: BASIC FACTS OF NEUROBIOLOGY The Neuron Synapses, Transmitters, Messengers, and Modulators Invertebrate and Vertebrate Nervous Systems Functions of Subcortical Regions Functions of the Mammalian Cerebral Cortex APPENDIX 2: DIFFERENCE AND DIFFERENTIAL EQUATIONS IN NEURAL NETWORKS Example: the Sutton-Barto Difference Equations Differential Versus Difference Equations Outstar Equations: Network Interpretation and Numerical Implementation The Chain Rule and Back Propagation Dynamical Systems: Steady States, Limit Cycles, and Chaos From aboulang at WILMA.BBN.COM Mon Dec 11 15:40:49 1989 From: aboulang at WILMA.BBN.COM (aboulang@WILMA.BBN.COM) Date: Mon, 11 Dec 89 15:40:49 EST Subject: Chemical-wave reference mentioned in dynamics workshop at Keystone Message-ID: Several people wanted to know the reference to image processing via chemical-waves which seems to be akin to the kind of image processing capabilities of Pierre Baldi's oscillating networks (talk Thursday morning): "Image Processing Using Light-sensitive Chemical Waves" L. Kuhnert, K.I. Agladze, & V.I. Krinsky Nature Vol 337, 19 Jan. 1989, 244-247. Regards, Albert Boulanger BBN Systems & Technologies Corp. 
aboulanger at bbn.com From ST401843%BROWNVM.BITNET at VMA.CC.CMU.EDU Mon Dec 11 01:07:12 1989 From: ST401843%BROWNVM.BITNET at VMA.CC.CMU.EDU (thanasis kehagias) Date: Mon, 11 Dec 89 01:07:12 EST Subject: update of recurrent nets bibliography Message-ID: dear list manager: a while ago i passed you a connectionist bibliography of recuurent nets and you deposited it in your ftp directories. i am now sending you the updated version of the bibliography and a message to the list members. if this is fine with you, replace the bibliography and post the following message (adding the appropriate instructions for ftp-ing): (i hope this is not too much work for you - if it is, let me know and i will find other ways to distribute this .....) --------------------text of message to list members ------------- a while ago i had posted a preliminary version of a bibliography of recurrent nets. here is a more definitive version of it. it is expanded from its old form (now has somewhat over 100 items). it is not strictly recurrent nets. a better way to describe it is: how-to-capture-temporal-relationships-bibliography. some of the selections are static nets, but most are dynamic. i explain some more the distinction in a short blurb that comes with the bibliography. the blurb is in tex format (latex-able) and the bibliography is a bib-texable bibliographic database. so people who like this kind of thing can produce a pretty TeX document. if you do not want to mess with it, just ftp the files- they are usable as ascii files without any texxing. the files are in the connectionists/cmu directory and here is how to access them : ************************************************************************** ----------(here you should add the ftp instructions) ----------- ************************************************************************** as i say in the tex document, this is a very informal bibliographic search. some information is missing and the bibliography is certainly incomplete. at this point i am not inclined to work much more on it, except if someone feels he has work that should be in it, i would certainly put it in. at this point i have already sent two messages to the members of the list, so i suppose if somebody has something has already contacted me. so here it is, enjoy. thanasis -------------------end of message text ------------------------------------ -------------------beginning of tex document ------------------------------- \documentstyle{article} \title{{\bf A Short Bibliography of Connectionist Systems \\ for Temporal Behavior}} \author{Athanasios Kehagias \\ Division of Applied Mathematics \\ Brown University \\ Providence, RI 02912} \date{December 12, 1989} \begin{document} \normalsize \bibliography{recmes2} \bibliographystyle{alpha} \maketitle % This is a bibliography of work in Connectionism (and related fields) that tries to capture temporal relationships. It is not intended as a complete bibliography. I include work I came across after a reasonable but not exhaustive search and the connection to the problem of representing temporal relationships is seen from my personal perspective. So, if you think something is omitted, misrepresented etc., send me your suggestions. A large part of the current work in connectionism is concerned with the design of {\bf static} networks - meaning that the output of the network at time $t+1$ is more or less not influenced by the output at time $t$. Nevertheless, such static architectures have been used to capture temporal relationships. 
For example consider the work in predicting chaotic time series: \cite{kn:Moo88a} \cite{kn:Lap87a} \cite{kn:Far88a} . Another case of implicitly {\bf dynamical} networks are Hopfield type networks and Boltzmann machines. In both cases we have networks where the state of a neuron at time $t+1$ depends on the input from other neuron states at time $t$ . However, even though such networks do have dynamic behavior, the goal is that they settle down at a steady state useful for storing patterns. So in the long run we are more interested in static behavior. There is an extensive amount of work on Hopfield networks. Hopfield's seminal papers are: \cite{kn:Hop82a}, \cite{kn:Hop84a}, \cite{kn:Hop85a}. Similarly there is a lot of work on Boltzmann machines; here is a small sample: \cite{kn:Ack85a}, \cite{kn:Pet87a}, \cite{kn:Pet88a}, \cite{kn:Pet89a}, \cite{kn:Pet89b}, \cite{kn:Sus88a}. Moving on to certain application oriented work where temporal behavior is important, we see a mixed approach, sometimes using static network solutions, sometimes using dynamic network solutions. For instance in control/robotics, following a trajectory in state-space is an essentially dynamic problem. Certain connectionist approaches are: \cite{kn:Bar89a}, \cite{kn:Bar89b}, \cite{kn:Jor88a}, \cite{kn:Pea89a}, \cite{kn:Eck89a}. Similarly, in speech recognition time is of great importance. Some researchers have used a static approach (e.g. the time-delay neural network, \cite{kn:Wai89a} , or Boltzmann machines \cite{kn:Pra86a} ). But there is also a lot of work where dynamic (recurrent) neural networks are used: \cite{kn:Bou88a}, \cite{kn:Bou89a}, \cite{kn:Elm88a}, \cite{kn:Gra89a}, \cite{kn:Koh89a}, \cite{kn:Kuh90a}, \cite{kn:Rob88a}, \cite{kn:Rob89a}, \cite{kn:Wat87a}, \cite{kn:Wat87b}. Another area of practical problems being attacked by recurrent neural architectures is tasks related to formal or natural language, e.g. \cite{kn:Cot85a},\cite{kn:San89a},\cite{kn:Cha87a}, \cite{kn:Fan85a}, \cite{kn:Han87a}, \cite{kn:Mik89a}, \cite{kn:Mii88a}, \cite{kn:Cle90a}, \cite{kn:Ser88a}, \cite{kn:All89a}, \cite{kn:All89b}, \cite{kn:All88a}, \cite{kn:Rie88a}, \cite{kn:Sej86a}, \cite{kn:StJ85a}, \cite{kn:Whe89a}. Finally, the bulk of the references in this bibliography refer to more or less theoretical treatments of the problem of dynamical (recurrent) neural networks. I cite first a very small sample of Grossberg's work: \cite{kn:Gro86a}, \cite{kn:Gro86b}. This ART-type work by Grossberg , Carpenter and collaborators is very interesting and there is a very voluminous literature; references to more of this type work are contained in the books listed above. I now cite a large number of references which relate to neural networks mostly as dynamical systems in $R^n$ (as opposed to dynamical systems taking Boolean values) : \cite{kn:Alm87a}, \cite{kn:Alm88a}, \cite{kn:Alm89a}, \cite{kn:Alm89b}, \cite{kn:Alm90a}, \cite{kn:Bab87a}, \cite{kn:Bar85a}, \cite{kn:Bar81a}, \cite{kn:Bel88a}, \cite{kn:Deh87a}, \cite{kn:Elm88b}, \cite{kn:Elm89b}, \cite{kn:Gal88a}, \cite{kn:Gol86a}, \cite{kn:Gol88a}, \cite{kn:Gut88a}, \cite{kn:Guy82a}, \cite{kn:Kur86a}, \cite{kn:Mar89a}, \cite{kn:Moz89a}, \cite{kn:Now88a}, \cite{kn:Ott89a}, \cite{kn:Pin87a}, \cite{kn:Pin88a}, \cite{kn:Pol87a}, \cite{kn:Pol88a}, \cite{kn:Ren90a}, \cite{kn:Ren90b}, \cite{kn:Roh87a}, \cite{kn:Roh90a}, \cite{kn:Ried88a}, \cite{kn:Sch88a}, \cite{kn:Sch89a}, \cite{kn:Sim88a}, \cite{kn:Som88a}, \cite{kn:Sto}, \cite{kn:Sun}, \cite{kn:Sut88a}, \cite{kn:Wil88a}. 
Finally here are some refernces to work that considers dynamical neural networks as dynamical systems with a Boolean state vector (or in other words sequential machines): \cite{kn:Ale89a}, \cite{kn:Ama71a}, \cite{kn:Ama72a}, \cite{kn:Ama83a}, \cite{kn:Cai70a}, \cite{kn:Cai75a}, \cite{kn:Cai76a}, \cite{kn:Cai86a}, \cite{kn:Cai89a}, \cite{kn:Jor86a}, \cite{kn:Mar89b}, \cite{kn:Mar87a}, \cite{kn:Mcc43a}, \cite{kn:Par89a}, \cite{kn:Sch}, \cite{kn:Raj}, \cite{kn:Tsu89a}, \cite{kn:Cot89a}, \cite{kn:All89c}, \cite{kn:All89d}, \cite{kn:Sun89a}, \cite{kn:Sun89b}, \cite{kn:Roz69a}, \cite{kn:Par88a}. This is a rather fine and arbitrary classification , but work which is often characterized as research in cellular automata is relevant to neural netorks issues. Here are some samples: \cite{kn:Ale73a}, \cite{kn:Kau69a}, \cite{kn:Fog82a}, \cite{kn:Fog83a}, \cite{kn:Fog85a}, \cite{kn:Slo67a}. This is the material I was able to collect; there is a whole lot more, sometimes refrred to in the references of the above works. Any suggestions for improvement are welcome. \end{document} ---------------------end of tex document -------------------------- ---------------------beginning of bibliographic database----------- @BOOK{KN:GRO86A, author ="S. Grossberg", title ="The Adaptive brain:I Learning, reinforcement, motivation and rhythm", YEAR ="1986", PUBLISHER="?" } @book{kn:Gro86b, author ="S. Grossberg", title ="The Adaptive brain:II Vision, Speech, Language and Motor Control", YEAR ="1986", PUBLISHER="?" } @ARTICLE{kn:Hop85a, AUTHOR= "J.J.Hopfield and D.W.Tank", TITLE= "Neural Computation of Decisions in Optimization Problems", JOURNAL= "Biol. Cyb.", YEAR= "1985", VOLUME= "52" } @ARTICLE{kn:Hop82a, AUTHOR= "J.J. Hopfield", TITLE= "Neural Nets and Physical Systems with Emergent Collective Computational Properties", JOURNAL= "Proc. Nat'l Acad. Sci. USA", YEAR= "1982", VOLUME= "?" } @ARTICLE{kn:Hop84a, AUTHOR= "J.J. Hopfield", TITLE= "Neurons with Graded Response have Collective Computational Properties like those of Two-State Neurons", JOURNAL= "Proc. Nat'l Acad. Sci. USA", YEAR= "1984", VOLUME= "81" } @ARTICLE{kn:Ack85a, AUTHOR= "D.H. Ackley and others", TITLE= "A Learning Algorithm for Boltzmann Machines", JOURNAL= "Cognitive Science", YEAR= "1985", VOLUME= "9" } @ARTICLE{kn:Pet87a, AUTHOR= "C. Peterson and J.R. Anderson", TITLE= "A Mean Field Theory Learning Algorithm for Neural Nets", JOURNAL= "Complex Systems", YEAR= "1987", VOLUME= "1" } @ARTICLE{kn:Pet89a, AUTHOR= "C. Peterson and E. Hartman", TITLE= "Explorations of the Mean Field Theory Learning Algorithm", JOURNAL= "Neural Networks", YEAR= "1989", VOLUME= "?" } @ARTICLE{KN:Pet88a, AUTHOR= "C. Peterson and J.R. Anderson", TITLE= "Neural Networks and NP-complete Optimization Problems; A Performance Study on the Graph Bisection Problem", JOURNAL= "Complex Systems", YEAR= "1988", VOLUME= "2" } @ARTICLE{kn:Pet89b, AUTHOR= "C. Peterson and B. Soderberg", TITLE= "A New Method for Mapping Optimization Problems onto Neural Networks", JOURNAL= "Int. J. of Neural Systems", YEAR= "1989", VOLUME= "1" } @techreport{kn:Sus88a, author ="H.J. Sussman", title ="On the Convergence of Learning Algorithms for Boltzmann Machines", number ="sycon-88-03", institution ="Rutgers Center for Systems and Control", year ="1988" } @techreport{kn:Bar89a, AUTHOR= "A.G. Barto and R.S. Sutton", TITLE= "Learning and Sequential Decision Making", INSTITUTION= "COINS Dept., Amherst Un. ", YEAR= "1989", Number= "TR 89-95" } @ARTICLE{kn:Jor88a, AUTHOR= "M.I. 
Jordan", TITLE= "Supervised Learning and Systems with Excess Degrees of Freedom", INSTITUTION= "COINS Dept., Amherst Un. ", YEAR= "1988", number= "TR 88-27" } @ARTICLE{kn:Bar89b, AUTHOR= "A.G. Barto", TITLE= "Connectionist Learning for Control:An Overview", INSTITUTION= "COINS Dept., Amherst Un. ", YEAR= "1989", number= "TR 89-89" } @inproceedings{kn:Pea89a, AUTHOR= "B.A. Pearlmutter", TITLE= "Learning State Space Trajectories in Recurrent Neural Nets", booktitle= "IJCNN", YEAR= "1989", organization= "IEEE" } @article{kn:Eck89a, author ="R. Eckmiller", title ="Generation of Movement Trajectories in Primates and Robots", journal ="Neural Computing Architectures, I. Aleksander ed.", year ="MIT , 1989" } @inproceedings{kn:Moo88a, title ="Learning with Localized Receptor Fields", booktitle ="Connectionist Models Summer School ", author ="J. Moody and C. Darken", year ="1988", volume ="?", organization ="Carnegie Mellon University" } @techreport{kn:Lap87a, author ="A. Lapedes and R. Farber", title ="Nonlinear Signal Processing using Neural Networks: Prediction and System Modelling", number ="LA UR 87-?", institution ="Los Alamos National Lab", year ="1987" } @techreport{kn:Far88a, author ="J.D. Farmer and J.J. Sidorowich", title ="Exploiting Chaos to Predict the Future and Reduce Noise", number ="LA UR 88-901", institution ="Los Alamos National Lab", year ="1988" } @inproceedings{kn:And88a, author ="S. Anderson", title ="Dynamic System Categorization with Recurrent Networks", booktitle ="Connectionist Models Summer School", year ="1988" } @ARTICLE{kn:Bou88a, AUTHOR= "H. Bourlard and C.J. Wellekens", TITLE= "Links between Markov Models and Multilayer Perceptrons", organization="Phillips Research Lab", YEAR= "1988", techreport= "M 263" } @inproceedings{kn:Bou89a, AUTHOR= "H. Bourlard and C.J. Wellekens", TITLE= "Speech Dynamics and Recurrent Neural Nets", booktitle= "ICASSP", organization="IEEE", YEAR= "1989", VOLUME= "?" } @article{kn:Elm88a, author ="J. L. Elman and D. Zipser", title ="Learning the Hidden Structure of Speech", journal ="Journal of the Acoustical Society of America", year ="1988", volume ="83" } @inproceedings{kn:Gra89a, AUTHOR= "K.A. Grajski and others ", TITLE= "A Preliminary Note on Training Static and Recurrent Neural Nets for Word-level Speech Recognition", booktitle= "IJCNN", YEAR= "1989", VOLUME= "?" } @incollection{kn:Koh89a, author ="T. Kohonen", editor ="I. Aleksander", title ="Speech Recognition based on Topology Preserving Maps", booktitle ="Neural Computing Architectures", year ="1989", publisher ="MIT" } @article{kn:Kuh90a, title ="Connected Recognition with a Recurrent Network", author ="G. Kuhn", journal ="Speech Communication", year ="1990", volume ="9", number ="2", pages ="?", } @inproceedings{kn:Pin88a, author= "F. Pineda", title = "Generalization of Backpropagation to Recurrent and Higher Order Neural Networks", booktitle= "Neural Information Processing Systems", editor= "D. Anderson", year = "1988" } @article{kn:Pra86a, title ="Boltzmann Machines for Speech Recognition", author ="R. Prager and others", journal ="Computer , Speech and Language", year ="1986", volume ="1", number ="?", pages ="1-20", } @phdthesis{kn:Rob89a, author = "Anthony J. Robinson", title ="Dynamic Error Propagation Networks", school ="Cambridge University Engineering Department", address ="Cambridge, England", year ="1989" } @techreport{kn:Rob88a, author ="Anthony J. 
Robinson", title ="A Dynamic Connectionist Model for Phoneme Recognition", organization="Cambridge University Engineering Department", address ="Cambridge, England", year ="1988" } @inproceedings{kn:Wat87a, key = "Watrous" , author = "Watrous, R.L. and Shastri, L." , title = "Learning Phonetic Features Using Connectionist Networks: An Experiment in Speech Recognition" , booktitle= "Int. Conf. on Neural Networks" , organization= "IEEE", month = "June" , year = "1987" } @inproceedings{kn:Wat87b, author = "R.L. Watrous and others" , title = "Learned Phonetic Discrimination Using Connectionist Networks" , booktitle= "European Conference on Speech Technology" , month = "September" , address = "Edinburgh" , year = "1987" , pages = "377-380" } @phdthesis{kn:Wat88a, key = "Watrous" , author = "Watrous, R." , title = "Speech Recognition Using Connectionist Networks" , school = "University of Pennsylvania" , month = "October" , year = "1988" } @article{kn:Wai89a, title = "Phoneme Recognition Using Time-Delay Neural Networks" , author = "A. Waibel and others " , year = "1989" , journal = "IEEE, Trans. Acoustics, Speech and Signal Processing", month = "March" } @INCOLLECTION{KN:Ale89a, author ="I. Aleksander", editor ="I. Aleksander", title ="The Logic of Connectionist Systems", booktitle ="Neural Computing Architectures", year ="1989", publisher ="MIT" } @ARTICLE{KN:Ale73a, title ="Cycle Activity in Nature: Causes of Stability", author ="I. Aleksander and P. Atlas", journal ="Int. J. of Neuroscience", year ="1973", volume ="6", number ="?", pages ="45-50", } @INPROCEEDINGS{KN:ALL89C, AUTHOR ="Allen, R.B. and Kauffman, S.M.", year ="1989", title ="Developing agent models with a neural reinforcement technique", ORGANIZATION="IEE", booktitle= "Conf. on Artificial Neural Networks" } @INPROCEEDINGS{KN:ALL89D, AUTHOR ="Allen, R.B.", year ="1989", title ="Sequence Generation with Connectionist State Machines", booktitle= "IJCNN" } @article{kn:Ama71a, title ="Characteristics of Randomly Connected Threshold Elements and Network Systems", author ="S.I. Amari", journal ="Proc. of the IEEE", year ="1971", volume ="39", number ="?", pages ="33-47", } @article{kn:Ama72a, title ="Learning Patterns and Pattern Sequences by Self-Organizing Nets of Threshold Elements", author ="S.I. Amari", journal ="IEEE Trans. on Computers", year ="1972", volume ="21", number ="?", pages ="1197-1206" } @article{kn:Ama83a, title ="Field Theory of Self-Organizing Neural Nets", author ="S.I. Amari", journal ="IEEE Trans. on Systems , Man and Cybernetics", year ="1983", volume ="?", number ="?", pages ="741-748", } @incollection{kn:Cai89a, author ="E. R. Caianello", editor ="I. Aleksander", title ="A Theory of Neural Networks", booktitle ="Neural Computing Architectures", year ="1989", publisher ="MIT" } @article{kn:Cai75a, title ="Synthesis of Boolean Nets and Time-Behavior of a General Mathematical Neuron", author ="E. Caianello and E. Grimson", journal ="Biol. Cyb.", year ="1975", volume ="18", number ="?", pages ="111-117", } @article{kn:Cai86a, title ="Linearization and Synthesis of Cellular Automata. The Additive Case", author ="E. Caianello and M. Marinaro", journal ="Physica Scripta", year ="1986", volume ="34", number ="?", pages ="444", } @article{kn:Cai76a, title ="Methods of Analysis of Neural Nets", author ="E. Caianello and E. Grimson", journal ="Biol. Cyb.", year ="1976", volume ="21", number ="?", pages ="1-6", } @article{kn:Cai70a, title ="Reverberations and Control of Neural Networks", author ="E. 
Caianello", journal ="Kybernetik", year ="1970", volume ="7", number ="5", pages ="191", } @article{kn:Fog82a, title ="Specific Roles of the Different Boolean Mappings in Random Networks", author ="F. Fogelman-Soulie and others", journal ="Bull. of Math. Biol.", year ="1982", volume ="44", number ="5", pages ="715-730", } @article{kn:Fog85a, title ="Frustration and Stability in Random Boolean Networks", author ="F. Fogelman-Soulie", journal ="Discrete Applied Math.", year ="?", volume ="?", number ="?", pages ="?", } @article{kn:Fog83a, title ="Transient Length in Sequential Iterations of Threshold Functions ", author ="F. Fogelman-Soulie and Others", journal ="Discrete Applied Math.", year ="1983", volume ="6", number ="?", pages ="95-98", } @article{kn:Kau69a, title ="Metabolic Stability and Epigenesis in Randomly Constructed Genetic Nets", author ="S.A. Kauffman", journal ="J. Theoret. Biology", year ="1969", volume ="22", number ="?", pages ="437-467", } @incollection{kn:Mar89a, author ="D. Martland", editor ="I. Aleksander", title ="Dynamic Behavior of Boolean Networks", booktitle="Neural Computing Architectures", year ="1989", publisher="MIT" } @inproceedings{kn:Mar87a, title ="Behavior of Autonomous (Synchronous) Boolean Networks", booktitle ="1st Int. Conf. on Neural Networks ", author ="D. Martland", year ="1987", volume ="II", organization ="IEEE" } @article{kn:Mcc43a, title ="A Logical Calculus of the Ideas Immanent in Nervous Activity ", author ="W.S. McCulloch and W. Pitts ", journal ="Bull. Math. Biophysics", year ="1943", volume ="5", pages ="115-143", } @ARTICLE{kn:Par89a, AUTHOR= "I. Parberry", TITLE= "Relating Boltzmann Machines to Conventional Models of Computation", JOURNAL= "Neural Networks", YEAR= "1989", VOLUME= "2" } @ARTICLE{kn:Roz69a, AUTHOR= "L.I. Rozonoer", TITLE= "Random Logical Nets, I-III (in Russian)", JOURNAL= "Avtomatika i Telemekhanika", YEAR= "1969", VOLUME= "5" } @INPROCEEDINGS{KN:Sun89a, AUTHOR ="R. Sun", year ="1989", title ="A Discrete Neural Network Model for Conceptual Representation and Reasoning", booktitle= "11th Ann. Cog. Sci. Soc. Conf." } @TECHREPORT{KN:SUN89b, AUTHOR ="R. Sun", TITLE ="The Discrete Neuronal Model and the Probabilistic Discrete Neuronal Model", NUMBER ="?", INSTITUTION ="Computer Sc. Dept., Brandeis Un.", year ="1989" } @ARTICLE{kn:Sch, AUTHOR ="R.E. Schneider", TITLE ="The Neuron as a Sequential Mahine", JOURNAL ="?", YEAR ="?", VOLUME ="?" } @ARTICLE{kn:Raj, AUTHOR= "V. Rajlich", TITLE= "Dynamics of certain Discrete Systems and Self Reproduction of Patterns", JOURNAL= "?", YEAR= "?", VOLUME= "?" } @inproceedings{kn:Tsu89a, author ="Tsung, Fu-Sheng and Cottrell, G.", title ="A sequential adder using recurrent neural networks", year ="1989", booktitle ="IJCNN", } @inproceedings{kn:Cot89a, author ="Cottrell, G. and Tsung, Fu-Sheng", title ="Learning simple arithmetic procedures", booktitle ="11th Annual Conf. of Cog. Sci. Soc.", year ="1989" } @INCOLLECTION{KN:ALL89A , author ="Allen, R.B. and Riecken, M.E.", year ="1989", title ="Reference in connectionist language users", booktitle ="Connectionism in Perspective", editor ="R. 
Pfeifer and others", publisher ="Elsevier", pages ="301-308" } @phdthesis{kn:Rie88a, author ="Riecken, M.E.", year ="1988", title ="Neural networks in natural language processing and distributed artificial intelligence", school ="University of New Mexico" } @inproceedings{kn:All89b, author ="Allen, R.B.", year ="1989", title ="Developing agent models with a neural reinforcement technique", organization="IEEE", booktitle= "Systems Man and Cybernetics Conference", } @inproceedings{kn:All88a, author ="Allen, R.B.", year ="1988", title ="Sequential connectionist networks for answering simple questions about a microworld", booktitle ="10th Ann. Cog. Sci. Soc. Conf.", pages ="489-495" } @inproceedings{kn:Cot85a, title ="Connectionist Parsing", author ="G.W. Cottrell", BOOKTITLE ="7th Ann. Conf. of Cog. Sci. Soc. ", year ="1985" } @incollection{kn:San89a, author ="E. {Santos Jr.}", title ="A Massively Parallel Self-Tuning Context-free Parser", booktitle ="Adv. in Neural Information Processing Systems", year ="1989", publisher ="Morgan Kauffman" } @inproceedings{kn:Cha87a, title ="A Connectionist Context-free Parser which is not Context- Free but then it is not really Connectionist either", author ="E. Charniak and E. Santos", BOOKTITLE ="9th Annual Conference of Cog. Sci. Soc. ", year ="1987" } @techreport{kn:Fan85a, author ="M. Fanty", title ="Context-Free Parsing in Connectionist Networks", number ="TR 174", institution ="Un. of Rochester, Computer Sc. Dept.", year ="1985" } @inproceedings{kn:Han87a, title ="PARSNIP:A connectionist Network that learns Natural Language from Exposure to Natural Language Sentences", author ="S. Hanson and J. Kegl", BOOKTITLE ="9th Annual Conference of Cog. Sci. Soc.", year ="1987" } @techreport{kn:Mik89a, author ="R. Miikkulainen and M.G. Dyer", title ="A Modular Neural Network Architecture for Sequential Paraphrasing of Script-Based Stories", NUMBER ="TR UCLA-AI-89-02", INSTITUTION ="Un. of California at Los Angeles" , year ="1989" } @inproceedings{kn:Mii88a, author ="R. Miikkulainen and M.G. Dyer", title ="Encoding Input/Output Representations in Connectionist Cognitive Systems", booktitle ="Connectionist Models Summer School", year ="1988" } @article{kn:Cle90a, author ="A. Cleeremans and others", year ="1990", title ="Finite state automata and simple recurrent networks", journal ="Neural Computation", volume ="1" } @techreport{kn:Ser88a, AUTHOR ="D. Servan-Schreiber and others", year ="1988", title ="Encoding sequential structure in simple recurrent networks", number ="CMU-CS-88-183", institution ="School of Computer Science, Carnegie Mellon Un.", } @techreport{kn:Sej86a, author ="T.J. Sejnowski and C. Rosenberg", title ="NETtalk:A Parallel Network that Learns to Read Aloud", number ="JHU EECS 86-01", institution ="John Hopkins University", year ="1986" } @techreport{kn:StJ85a, author ="M. StJohn and J.L. McLelland", title ="Learning and Applying Contextual Constraints in Sentence Comprehension", number ="?", institution ="Dept. of Psychology, Carnegie Mellon Un.", year ="1985" } @techreport{kn:Whe89a, author = "D.W. Wheeler and D.S. Touretzky", title = "A Connectionist Implementation of Cognitive Phonology", year = "1989" , institution = "CS Dept., Carnegie Mellon University", number = "CMU-CS-89-144" } @inproceedings{kn:Alm87a, title ="A Learning Rule for Asynchronous Perceptrons with Feedback in a Combinatorial Environment", author ="L.B. Almeida", booktitle ="1st Int. Conf. on Neural Networks, S. 
Diego ", organization ="IEEE", year ="1987", } @incollection{kn:Alm89a, title ="Backpropagation in Nonfeedforward Networks", author ="L.B. Almeida", booktitle ="Neural Computing Architectures", editor ="I. Aleksander", publisher ="MIT Press", year ="1989", } @incollection{kn:Alm88a, title ="Backpropagation in Perceptrons with Feedback", author ="L.B. Almeida", booktitle ="Neural Computers", editor ="R. Eckmiller and C.v.d. Malsburg", publisher ="Springer", year ="1988", } @incollection{kn:Alm89b, title ="Recurrent Backpropagation and Hopfield Networks", author ="L.B. Almeida and J.P. Neto", booktitle ="Neuro Computing, Algorithms Architectures and Applications", editor ="F. Fogelman-Soulie ", PUBLISHER ="Springer", chapter ="?", year ="1989", } @incollection{kn:Alm90a, title ="Backpropagation in Feedforward and Recurrent Networks", author ="L.B. Almeida", booktitle ="?", editor ="B. Shriver", chapter ="submitted to", publisher ="IEEE Press", year ="1990", } @article{kn:Bab87a, title ="Dynamics of Simple Electronic Neural Networks", author ="K.L. Babcock and R.M. Westervelt", journal ="Physica", year ="1987", volume ="28D", number ="?", pages ="305", } @article{kn:Bar85a, author ="A.G. Barto and P. Anandan", title ="Pattern Recognizing Stochastic Learning Automata", journal ="IEEE Trans. on Systems , Man and Cybernetics", volume ="SMC 15", Number ="3", year ="June 1985" } @article{kn:Bar81a, author ="A.G. Barto and others", title ="Associative Search Network:Reinforcement Learning Associative Memory", journal ="Biol. Cyb. 40", year ="1981" } @inproceedings{kn:Bel88a, author ="T. Bell", title ="Sequential Processing using Attractor Transitions", booktitle ="Connectionist Models Summer School", year ="1988" } @article{kn:Deh87a, title ="Neural Networks that learn Temporal Sequences by Selection ", pages ="2727", author ="S. Dehaene and others", year ="1987 ", volume ="84", journal ="Proc. of Nat'l Acad. Sci. USA" } @inproceedings{kn:Elm89b, author ="J. L. Elman", title ="Representation and structure in connectionist models", BOOKTITLE ="11th Ann. Conf. Cog. Sci. Soc.", year ="1989" } @TECHREPORT{KN:ELM88B, author ="J. L. Elman", title ="Finding structure in time", institution="Center for Research in Language, University of California at San Diego", number ="CRL TR 8801", year ="1988" } @inproceedings{kn:Gal88a, author =" Gallant, S. I. and King, D. J", title ="Experiments with Sequential Associative Memories", booktitle ="10th Ann. Conf. of Cog. Sci. Soc.", year ="1988", page ="40-47", } @ARTICLE{kn:Gol86a, AUTHOR= "R.M. Golden", TITLE= "The Brain-State-in-a-Box Neural Model is a Gradient Descent Algorithm", JOURNAL= "Journal of Mathematical Psychology", YEAR= "1986", VOLUME= "30" , NUMBER= "1" } @ARTICLE{kn:Gol88a, AUTHOR= "R.M. Golden", TITLE= "A Unified Framework for Connectionist Systems", JOURNAL= "Biol. Cyb.", YEAR= "1988", VOLUME= "59" } @article{kn:Gut88a, title ="Processing of Temporal Sequences in Neural Networks", author ="H. Gutfreund and M. Mezard", journal ="Phys. Rev. Let.", year ="1988", volume ="61", number ="?", pages ="235", } @article{kn:Guy82a, title ="Storage and Retrieval of Complex sequences in Neural Networks", author ="I. Guyon and others", journal ="Phys. Rev. A", year ="1982", volume ="38", number ="?", pages ="6365", } @inproceedings{kn:Jor86a, title ="Attractor Dynamics and Parallelism in a Connectionist Sequential Machine", author ="M.I. Jordan", BOOKTITLE ="8th Ann. Conf. of Cog. Sci. Soc." 
, year ="1986" } @article{kn:Kur86a, title ="Chaos in Neural Systems", author ="K.E. Kurten and J.W. Clark", journal ="Phys. Lett.", year ="1986", volume ="114A", number ="?", pages ="413", } @incollection{kn:Mar89b, title ="Dynamics of Analog Neural Networks with Time Delay", booktitle ="Advances in Neural Information Processing Systems I", author ="C.M. Marcus and R.M. Westervelt", editor ="D. Touretzky", publisher ="Morgan Kauffman", year ="1989", } @ARTICLE{kn:Moz89a, AUTHOR ="M. C. Mozer", TITLE ="A Focused Back Propagation Algorithm for Temporal Pattern Recognition", JOURNAL ="Complex Systems", YEAR ="1989 or 1990 (accepted for publication)", VOLUME ="??" } @ARTICLE{KN:Now88a, AUTHOR ="S.J. Nowlan", TITLE ="Gain Variation in Recurrent Error Propagation Networks", JOURNAL ="Complex Systems", YEAR ="1988", VOLUME= "2" } @incollection{kn:Ott89a, title ="Fixed Point Analysis for Recurrent Neural Networks", booktitle ="Advances in Neural Information Processing Systems I", author ="M.B. Ottaway", editor ="D. Touretzky", publisher ="Morgan Kauffman", year ="1989", } @inproceedings{kn:Par88a, author = "K. Park", title = "Sequential Learning: Observations on the Internal Code Generation Problem", booktitle= "Connectionist Models Summer School", year = "1988" } @ARTICLE{kn:Pin87a, AUTHOR= "F.J. Pineda", TITLE= "Generalization of Back Propagation to Recurrent Neural Nets", JOURNAL= "Physical Review Letters" , YEAR= "1987", VOLUME= "59" } @ARTICLE{kn:Pin88b, AUTHOR= "F.J. Pineda", TITLE= "Dynamics and Architecture for Neural Computation", JOURNAL= "Journal of Complexity", YEAR= "1988", VOLUME= "4" , pages= "216" } @inproceedings{kn:Pol87a, title ="Cascaded Back Propagation on Dynamic Connectionist Networks", author ="J. B. Pollack", booktitle ="9th Ann. Conf. of the Cog. Sci. Soc. ", year ="1987" , page ="391-404", } @inproceedings{kn:Pol88a, title ="Recursive Auto-Associative Memory: Devising Compositional Distributed Representations", author ="J. B. Pollack", booktitle ="10th Ann. Conf. of the Cog. Sci. Soc. ", year ="1988", page ="33-39" } @inproceedings{kn:Ren90a, title ="Chaos in Neural Networks ", booktitle ="EURASIP ", author ="S. Renals", year ="1990", volume ="?", organization ="?" } @inproceedings{kn:Roh90a, title ="The Moving Targets Training Algorithm", booktitle ="EURASIP ", author ="R. Rohwer", year ="1990", volume ="?", organization ="?" } @inproceedings{kn:Roh87a, title ="Training Time-Dependence in Neural Networks", booktitle ="1st Int. Conf. on Neural Networks", author ="R. Rohwer", year ="1987", volume ="?", organization ="IEEE" } @article{kn:Ren90b, title ="A Study of Network Dynamics", author ="S. Renals and R. Rohwer", journal ="J. Stat. Phys.", year ="1990", volume ="In Press", number ="?", pages ="?", } @ARTICLE{KN:RIED88A, title ="Temporal Sequences and Chaos in Neural Networks", author ="Riedel and others", journal ="Phys. Rev. A", year ="1988", volume ="36", number ="?", pages ="1428", } @inproceedings{kn:Sch89a, title ="Networks Adjusting Networks", author ="J. Schmidhuber", booktitle ="Distributed Adaptive Neural Infornmation Processing", year ="1989", } @inproceedings{kn:Sch88a, title ="The Neural Bucket Brigade", author ="J. Schmidhuber", booktitle ="International Conference on Connectionism in Perspective", year ="1988", } @article{kn:Sim88a, author ="P.Y. Simard and others", title ="Analysis of Recurrent Back-Propagation", journal ="Proc. of the Connectionist Models Summer Schools", year ="1988" } @techreport{kn:Slo67a, author ="N.J. 
Sloane", title ="Lengths of Cycle Times in Random Neural Networks", number ="10", institution ="Cornell Un., Cognitive Systems Research Program", year ="1967" } @ARTICLE{KN:SOM88A, title ="Chaos in Random Neural Networks", author ="Sompolinsky and others", journal ="Phys. Rev. Lett.", year ="1988", volume ="61", number ="?", pages ="259", } @ARTICLE{KN:STO, AUTHOR= "W.S. Stornetta", TITLE= "A Dynamical Approach to Temporal Pattern Processing", JOURNAL= "?", YEAR= "?", VOLUME= "?" } @ARTICLE{kn:Sun, AUTHOR= "G.Z. Sun", TITLE= "A Recurrent Network that learns Context Free Grammars", JOURNAL= "?", YEAR= "?", VOLUME= "?" } @article{kn:Sut88a, title ="Learning to Predict by the Methods of Temporal Difference", author ="R.S. Sutton", journal ="Machine Learning", year ="1988", volume ="3", number ="?", pages ="9-44", } @techreport{kn:Wil88a, author ="R.J. Williams and D. Zipser", title ="A Learning Algorithm for Continually Running Fully Connected Recurrent Neural Networks", number ="ICS-8805", institution ="Un. of California at San Diego", year ="1988" } ------------end of bibliographic database and of message ------- From Connectionists-Request at CS.CMU.EDU Tue Dec 12 17:10:48 1989 From: Connectionists-Request at CS.CMU.EDU (Connectionists-Request@CS.CMU.EDU) Date: Tue, 12 Dec 89 17:10:48 EST Subject: update of recurrent nets bibliography Message-ID: <23529.629503848@B.GP.CS.CMU.EDU> Sorry about sending out Thanasis' whole bibliography. The updated version of the how-to-capture-temporal-relationships-bibliography is still called recurrent.bib. How to FTP Files from the CONNECTIONISTS Archive ------------------------------------------------ 1. Open an FTP connection to host B.GP.CS.CMU.EDU (Internet address 128.2.242.8). 2. Login as user anonymous with password your username. 3. 'cd' directly to one of the following directories: /usr/connect/connectionists/archives /usr/connect/connectionists/bibliographies 4. The archives and bibliographies directories are the ONLY ones you can access. You can't even find out whether any other directories exist. If you are using the 'cd' command you must cd DIRECTLY into one of these two directories. Access will be denied to any others, including their parent directory. 5. The archives subdirectory contains back issues of the mailing list. Some bibliographies are in the bibliographies subdirectory. Problems? - contact me at "Connectionists-Request at cs.cmu.edu". Happy Browsing Scott Crowder Connectionists-Request at cs.cmu.edu From pollack at cis.ohio-state.edu Wed Dec 13 12:11:43 1989 From: pollack at cis.ohio-state.edu (Jordan B Pollack) Date: Wed, 13 Dec 89 12:11:43 EST Subject: CogSci Meeting Message-ID: <8912131711.AA01095@wizard.cis.ohio-state.edu> I just received the updated call for papers: New Deadline: March 15th, 1990 Submit 4 photo-ready copies of a full paper (8 pages max. including figures, tables, and references and a 250 word abstract). There will be no revision of accepted papers. Cognitive Science Society Meeting c/o MIT Center for Cognitive Science Room 20B-225 77 Mass. Ave Cambridge, MA 02139 Paper presentations will either be 20 minutes or 10 minutes, assigned by the referees and organizng committee. You must first indicate your preference for POSTER or PAPER. 
If you prefer to present, you must further answer two questions: Will you ACCEPT a 10 minute [y/n] Do you PREFER a 10 minute slot [y/n] This is followed by a threat: If a 10-minute slot is unacceptable, but a 20 minute slot is not available, the committee will be unable to accept the paper(!!!) Jordan Pretty labyrinthine stuff going on with the organization of cognitive science this year; they have too many plenary speakers and special-interest panels, so there can't be many normal talks, and the committee has therefore taken it upon itself to reject (or absurdly shorten the presentation time of) papers (or areas?) they don't like. From norman%cogsci at ucsd.edu Wed Dec 13 18:02:06 1989 From: norman%cogsci at ucsd.edu (Donald A Norman-UCSD Cog Sci Dept) Date: Wed, 13 Dec 89 15:02:06 PST Subject: CogSci Meeting In-Reply-To: Jordan B Pollack's message of Wed, 13 Dec 89 12:11:43 EST <8912131711.AA01095@wizard.cis.ohio-state.edu> Message-ID: <8912132302.AA16889@cogsci.UCSD.EDU> In fairness to the Cognitive Science society. And why we need a strong connectionist showing at the society meetings. Each year, the conference is held in a different location, run by volunteers who must spend a considerable amount of time and energy to organize things. As partial payment for the effort, the Society gives the organizers a good deal of latitude on the structure of the conference. We welcomed the overture from MIT to hold a conference, especially since in the west pole/east pole split in the science, MIT folks (prototype east polers) tended to ignore the conference and society: having MIT sponsor the conference was seen as a positive step toward including all views of cognitive science in the society. (Note that connectionists are viewed with alarm and suspicion by east polers -- if they are viewed at all (the preference would be that you--we-- would all go away). And since the Cog Sci conference has become a major place for substantive connectionist reports on science (as opposed to techniques, methodology, and engineering applications), again, having MIT host the conference is a wonderful opportunity.) HOWEVER: there were severe problems and conflicts in getting the conference going. It almost got scrubbed, except that by the time the Society was informed of the problems, it was too late to find another host. Dave Rumelhart played a major role in getting things smoothed over and getting the conference on track again. It now does look like we have a conference. The scheduling problems and the balance of programs and the other apparent mishaps seem minor incidents in the attempt to make the Cognitive Science Society's conference a major scientific forum for all views in the substantive study of cognition. Please do attend: connectionism promises to revolutionize our views of cognition (I know, many of you think it already has), but both you folks and the others need to interact so that we can better explore the experimental phenomena and the theoretical alternatives. don norman (Disclaimer: I am a member of the Governing Board of the Society, but I have played very little role in this conference. The current chair of the society is Dave Rumelhart: give him the credits, and save the complaints for others. Remember: would YOU want to host a large, complex conference? (And if the answer is yes, then by all means volunteer, after making sure that you have sufficient meeting rooms, hotel and dorm rooms, and local support.)
Don Norman INTERNET: dnorman at ucsd.edu Department of Cognitive Science D-015 BITNET: dnorman at ucsd University of California, San Diego AppleLink: d.norman La Jolla, California 92093 USA FAX: (619) 534-1128 From karit at hutmc Thu Dec 14 03:09:14 1989 From: karit at hutmc (karit@hutmc) Date: Thu, 14 Dec 89 10:09:14 +0200 Subject: ICANN-91 Message-ID: <8912140809.AA13599@santra.hut.fi> ICANN-91: International Conference on Artificial Neural Networks, Helsinki University of Technology, Finland, June 24-28, 1991 FIRST ANNOUNCEMENT Theories, implementations, and applications of Artificial Neural Networks are progressing at a growing speed in Europe and elsewhere. The first commercial hardware for neural circuits and systems is emerging. This conference will be a major international contact forum for experts from academia and industry worldwide. Around 1000 participants are expected. TOPICS: networks and algorithms, neural software, neural hardware, applications, brain and neural theories. ACTIVITIES: tutorials, oral and poster sessions, prototype demonstrations, video presentations, industrial exhibition. CONFERENCE CHAIRMAN: Prof. Teuvo Kohonen. PROGRAM CHAIRMAN: Prof. Igor Aleksander. INTERNATIONAL CONFERENCE COMMITTEE: B. Angeniol, E. Caianiello, R. Eckmiller, J. Hertz, L. Steels, J.G. Taylor. For more information, please contact: Prof. Olli Simula, chairman, organizing committee, ICANN-91, Helsinki University of Technology, SF-02150 Espoo, Finland. Fax: +358-04513277 Telex: 1251 61 htkk sf Email: ollis at hutmc.hut.fi From Dave.Touretzky at B.GP.CS.CMU.EDU Thu Dec 14 05:59:41 1989 From: Dave.Touretzky at B.GP.CS.CMU.EDU (Dave.Touretzky@B.GP.CS.CMU.EDU) Date: Thu, 14 Dec 89 05:59:41 EST Subject: tech report available Message-ID: <7549.629636381@DST.BOLTZ.CS.CMU.EDU> Controlling Search Dynamics by Manipulating Energy Landscapes David S. Touretzky CMU-CS-89-113 December, 1989 School of Computer Science Carnegie Mellon University Pittsburgh, PA 15213-3890 Touretzky and Hinton's DCPS (Distributed Connectionist Production System) is a neural network with complex dynamical properties. Visualization of the energy landscapes of some of its component modules leads to a better intuitive understanding of the model. Three visualization techniques are used in this paper. Analysis of the way energy landscapes change as modules interact during an annealing search suggests ways in which the search dynamics can be controlled, thereby improving the model's performance on difficult match cases. ================ This report is available free by writing the School of Computer Science at the address above, or by sending electronic mail to Ms. Catherine Copetas. Her email address is copetas+ at cs.cmu.edu. Be sure to ask for technical report number CMU-CS-89-113. From jose at neuron.siemens.com Thu Dec 14 06:53:47 1989 From: jose at neuron.siemens.com (Steve Hanson) Date: Thu, 14 Dec 89 06:53:47 EST Subject: CogSci Meeting Message-ID: <8912141153.AA09977@neuron.siemens.com.siemens.com> Re: don norman's note: hear, hear! I would also like to encourage those who regularly attend the NIPS conference and have some cognitive or behavioral science interests, or even cognitive neuroscience interests, to submit relevant work to the Cognitive Science Conference.
Also note NIPS90 will have a Cognitive Science and AI track this next year in order to encourage the crosstalk between the high quality scientific work in Neural Networks and Cognitive Science. Watch this space for the call for papers. Steve Hanson (NIPS organizing committee) From hendler at cs.UMD.EDU Thu Dec 14 10:08:09 1989 From: hendler at cs.UMD.EDU (Jim Hendler) Date: Thu, 14 Dec 89 10:08:09 -0500 Subject: CogSci Meeting Message-ID: <8912141508.AA02196@dormouse.cs.UMD.EDU> I can't really let Don Norman's message go by without feeling compelled to add my own two cents -- I agree with Don that it is important that connectionists attend the Cog Sci meeting -- it is an important meeting for new research in cognitive science. However, as far as I'm concerned, those going simply to 'spread the gospel' of connectionism, rather than to also find out what is happening elsewhere, are (well, let's be polite) perhaps not availing themselves of the potential to learn important information. I think some very BAD cognitive results have come from people in this camp (and I do consider myself a connectionist to some degree, although somewhat reluctantly) because they have simply ignored a large body of research which discusses important cognitive phenomena which our models MUST account for someday (you cannot ignore experimental results without making a compelling argument as to why they are wrong). There have also been some very important results which have derived from connectionist modeling (Rumelhart's work, Norman's own recent work, etc.) and the Cognitive Science community has been forced to pay attention -- because the people doing this work did NOT ignore the data. So, just to summarize, I think people should plan on attending, but not simply to convince us east-polers that connectionism is the word of God, but rather to learn for yourselves where the greatest challenges to connectionists lie. cheers Jim H. From mike at bucasb.BU.EDU Thu Dec 14 14:05:11 1989 From: mike at bucasb.BU.EDU (Michael Cohen) Date: Thu, 14 Dec 89 14:05:11 EST Subject: WANG INSTITUTE CONFERENCE Message-ID: <8912141905.AA18969@bucasb.bu.edu> BOSTON UNIVERSITY, A WORLD LEADER IN NEURAL NETWORK RESEARCH AND TECHNOLOGY, PRESENTS TWO MAJOR SCIENTIFIC EVENTS: MAY 6--11, 1990 NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS A self-contained systematic course by leading neural architects. MAY 11--13, 1990 NEURAL NETWORKS FOR AUTOMATIC TARGET RECOGNITION An international research conference presenting INVITED and CONTRIBUTED papers, herewith solicited, on one of the most active research topics in science and technology today. SPONSORED BY THE CENTER FOR ADAPTIVE SYSTEMS, THE GRADUATE PROGRAM IN COGNITIVE AND NEURAL SYSTEMS, AND THE WANG INSTITUTE OF BOSTON UNIVERSITY WITH PARTIAL SUPPORT FROM THE AIR FORCE OFFICE OF SCIENTIFIC RESEARCH ----------------------------------------------------------------------------- CALL FOR PAPERS --------------- NEURAL NETWORKS FOR AUTOMATIC TARGET RECOGNITION MAY 11--13, 1990 This research conference at the cutting edge of neural network science and technology will bring together leading experts in academe, government, and industry to present their latest results on automatic target recognition in invited lectures and contributed posters. Automatic target recognition is a key process in systems designed for vision and image processing, speech and time series prediction, adaptive pattern recognition, and adaptive sensory-motor control and robotics.
It is one of the areas emphasized by the DARPA Neural Networks Program, and has attracted intense research activity around the world. Invited lecturers include: JOE BROWN, Martin Marietta, "Multi-Sensor ATR using Neural Nets" GAIL CARPENTER, Boston University, "Target Recognition by Adaptive Resonance: ART for ATR" NABIL FARHAT, University of Pennsylvania, "Bifurcating Networks for Target Recognition" STEPHEN GROSSBERG, Boston University, "Recent Results on Self-Organizing ATR Networks" ROBERT HECHT-NIELSEN, HNC, "Spatiotemporal Attention Focusing by Expectation Feedback" KEN JOHNSON, Hughes Aircraft, "The Application of Neural Networks to the Acquisition and Tracking of Maneuvering Tactical Targets in High Clutter IR Imagery" PAUL KOLODZY, MIT Lincoln Laboratory, "A Multi-Dimensional ATR System" MICHAEL KUPERSTEIN, Neurogen, "Adaptive Sensory-Motor Coordination using the INFANT Controller" YANN LECUN, AT&T Bell Labs, "Structured Back Propagation Networks for Handwriting Recognition" CHRISTOPHER SCOFIELD, Nestor, "Neural Network Automatic Target Recognition by Active and Passive Sonar Signals" STEVEN SIMMES, Science Applications International Co., "Massively Parallel Approaches to Automatic Target Recognition" ALEX WAIBEL, Carnegie Mellon University, "Patterns, Sequences and Variability: Advances in Connectionist Speech Recognition" ALLEN WAXMAN, MIT Lincoln Laboratory, "Invariant Learning and Recognition of 3D Objects from Temporal View Sequences" FRED WEINGARD, Booz-Allen and Hamilton, "Current Status and Results of Two Major Government Programs in Neural Network-Based ATR" BARBARA YOON, DARPA, "DARPA Artificial Neural Networks Technology Program: Automatic Target Recognition" ------------------------------------------------------ CALL FOR PAPERS---ATR POSTER SESSION: A featured poster session on ATR neural network research will be held on May 12, 1990. Attendees who wish to present a poster should submit 3 copies of an extended abstract (1 single-spaced page), postmarked by March 1, 1990, for refereeing. Include with the abstract the name, address, and telephone number of the corresponding author. Mail to: ATR Poster Session, Neural Networks Conference, Wang Institute of Boston University, 72 Tyng Road, Tyngsboro, MA 01879. Authors will be informed of abstract acceptance by March 31, 1990. SITE: The Wang Institute possesses excellent conference facilities on a beautiful 220-acre rustic setting. It is easily reached from Boston's Logan Airport and Route 128. REGISTRATION FEE: Regular attendee--$90; full-time student--$70. Registration fee includes admission to all lectures and poster session, one reception, two continental breakfasts, one lunch, one dinner, daily morning and afternoon coffee service. STUDENTS: Read below about FELLOWSHIP support. REGISTRATION: To register by telephone with VISA or MasterCard call (508) 649-9731 between 9:00AM--5:00PM (EST). To register by FAX, fill out the registration form and FAX back to (508) 649-6926. To register by mail, complete the registration form and mail with your full form of payment as directed. Make check payable in U.S. dollars to "Boston University". See below for Registration Form. To register by electronic mail, use the address "rosenber at bu-tyng.bu.edu". On-site registration on a space-available basis will take place from 1:00--5:00PM on Friday, May 11. A RECEPTION will be held from 3:00--5:00PM on Friday, May 11. LECTURES begin at 5:00PM on Friday, May 11 and conclude at 1:00PM on Sunday, May 13. 
------------------------------------------------------------------------------ NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS MAY 6--11, 1990 This in-depth, systematic, 5-day course is based upon the world's leading graduate curriculum in the technology, computation, mathematics, and biology of neural networks. Developed at the Center for Adaptive Systems (CAS) and the Graduate Program in Cognitive and Neural Systems (CNS) of Boston University, twenty-eight hours of the course will be taught by six CAS/CNS faculty. Three distinguished guest lecturers will present eight hours of the course. COURSE OUTLINE -------------- MAY 7, 1990 ----------- MORNING SESSION (PROFESSOR GROSSBERG) HISTORICAL OVERVIEW: Introduction to the binary, linear, and continuous-nonlinear streams of neural network research: McCulloch-Pitts, Rosenblatt, von Neumann; Anderson, Kohonen, Widrow; Hodgkin-Huxley, Hartline-Ratliff, Grossberg. CONTENT ADDRESSABLE MEMORY: Classification and analysis of neural network models for absolutely stable CAM. Models include: Cohen-Grossberg, additive, shunting, Brain-State-In-A-Box, Hopfield, Boltzmann Machine, McCulloch-Pitts, masking field, bidirectional associative memory. COMPETITIVE DECISION MAKING: Analysis of asynchronous variable-load parallel processing by shunting competitive networks; solution of noise-saturation dilemma; classification of feedforward networks: automatic gain control, ratio processing, Weber law, total activity normalization, noise suppression, pattern matching, edge detection, brightness constancy and contrast, automatic compensation for variable illumination or other background energy distortions; classification of feedback networks: influence of nonlinear feedback signals, notably sigmoid signals, on pattern transformation and memory storage, winner-take-all choices, partial memory compression, tunable filtering, quantization and normalization of total activity, emergent boundary segmentation; method of jumps for classifying globally consistent and inconsistent competitive decision schemes. ASSOCIATIVE LEARNING: Derivation of associative equations for short-term memory and long-term memory. Overview and analysis of associative outstars, instars, computational maps, avalanches, counterpropagation nets, adaptive bidirectional associative memories. Analysis of unbiased associative pattern learning by asynchronous parallel sampling channels; classification of associative learning laws. AFTERNOON SESSION (PROFESSORS JORDAN AND MINGOLLA) COMBINATORIAL OPTIMIZATION PERCEPTRONS: Adaline, Madaline, delta rule, gradient descent, adaptive statistical predictor, nonlinear separability. INTRODUCTION TO BACK PROPAGATION: Supervised learning of multidimensional nonlinear maps, NETtalk, image compression, robotic control. RECENT DEVELOPMENTS OF BACK PROPAGATION: This two-hour guest tutorial lecture will provide a systematic review of recent developments of the back propagation learning network, especially focussing on recurrent back propagation variations and applications to outstanding technological problems. EVENING SESSION: DISCUSSIONS WITH TUTORS MAY 8, 1990 ----------- MORNING SESSION (PROFESSORS CARPENTER AND GROSSBERG) ADAPTIVE PATTERN RECOGNITION: Adaptive filtering; contrast enhancement; competitive learning of recognition categories; adaptive vector quantization; self-organizing computational maps; statistical properties of adaptive weights; learning stability and causes of instability.
INTRODUCTION TO ADAPTIVE RESONANCE THEORY: Absolutely stable recognition learning, role of learned top-down expectations; attentional priming; matching by 2/3 Rule; adaptive search; self-controlled hypothesis testing; direct access to globally optimal recognition code; control of categorical coarseness by attentional vigilance; comparison with relevant behavioral and brain data to emphasize biological basis of ART computations. ANALYSIS OF ART 1: Computational analysis of ART 1 architecture for self-organized real-time hypothesis testing, learning, and recognition of arbitrary sequences of binary input patterns. AFTERNOON SESSION (PROFESSOR CARPENTER) ANALYSIS OF ART 2: Computational analysis of ART 2 architecture for self-organized real-time hypothesis testing, learning, and recognition for arbitrary sequences of analog or binary input patterns. ANALYSIS OF ART 3: Computational analysis of ART 3 architecture for self-organized real-time hypothesis testing, learning, and recognition within distributed network hierarchies; role of chemical transmitter dynamics in forming a memory representation distinct from short-term memory and long-term memory; relationships to brain data concerning neuromodulators and synergetic ionic and transmitter interactions. SELF-ORGANIZATION OF INVARIANT PATTERN RECOGNITION CODES: Computational analysis of self-organizing ART architectures for recognizing noisy imagery undergoing changes in position, rotation, and size. NEOCOGNITION: Recognition and completion of images by hierarchical bottom-up filtering and top-down attentive feedback. EVENING SESSION: DISCUSSIONS WITH TUTORS MAY 9, 1990 ----------- MORNING SESSION (PROFESSORS GROSSBERG & MINGOLLA) VISION AND IMAGE PROCESSING: Introduction to Boundary Contour System for emergent segmentation and Feature Contour System for filling-in after compensation for variable illumination; image compression, orthogonalization, and reconstruction; multidimensional filtering, multiplexing, and fusion; coherent boundary detection, regularization, self-scaling, and completion; compensation for variable illumination sources, including artificial sensors (infrared sensors, laser radars); filling-in of surface color and form; 3-D form from shading, texture, stereo, and motion; parallel processing of static form and moving form; motion capture and induced motion; synthesis of static form and motion form representations. AFTERNOON SESSION (PROFESSORS BULLOCK, COHEN, & GROSSBERG) ADAPTIVE SENSORY-MOTOR CONTROL AND ROBOTICS: Overview of recent progress in adaptive sensory-motor control and related robotics research. Reaching to, grasping, and transporting objects of variable mass and form under visual guidance in a cluttered environment will be used as a target behavioral competence to clarify subproblems of real-time adaptive sensory-motor control. The balance of the tutorial will be spent detailing neural network modules that solve various subproblems. 
Topics include: Self-organizing networks for real-time control of eye movements, arm movements, and eye-arm coordination; learning of invariant body-centered target position maps; learning of intermodal associative maps; real-time trajectory formation; adaptive vector encoders; circular reactions between action and sensory feedback; adaptive control of variable speed movements; varieties of error signals; supportive behavioral and neural data; inverse kinematics; automatic compensation for unexpected perturbations; independent adaptive control of force and position; adaptive gain control by cerebellar learning; position-dependent sampling from spatial maps; predictive motor planning and execution. SPEECH PERCEPTION AND PRODUCTION: Hidden Markov models; self-organization of speech perception and production codes; eighth nerve Average Localized Synchrony Response; phoneme recognition by back propagation, time delay networks, and vector quantization. MAY 10, 1990 ------------ MORNING SESSION (PROFESSORS COHEN, GROSSBERG, & MERRILL) SPEECH PERCEPTION AND PRODUCTION: Disambiguation of coarticulated vowels and consonants; dynamics of working memory; multiple-scale adaptive coding by masking fields; categorical perception; phonemic restoration; contextual disambiguation of speech tokens; resonant completion and grouping of noisy variable-rate speech streams. REINFORCEMENT LEARNING AND PREDICTION: Recognition learning, reinforcement learning, and recall learning are the 3 R's of neural network learning. Reinforcement learning clarifies how external events interact with internal organismic requirements to trigger learning processes capable of focussing attention upon and generating appropriate actions towards motivationally desired goals. A neural network model will be derived to show how reinforcement learning and recall learning can self-organize in response to asynchronous series of significant and irrelevant events. These mechanisms also control selective forgetting of memories that are no longer predictive, adaptive timing of behavioral responses, and self-organization of goal directed problem solvers. AFTERNOON SESSION (PROFESSORS GROSSBERG & MERRILL AND DR. HECHT-NIELSEN) REINFORCEMENT LEARNING AND PREDICTION: Analysis of drive representations, adaptive critics, conditioned reinforcers, role of motivational feedback in focusing attention on predictive data; attentional blocking and unblocking; adaptively timed problem solving; synthesis of perception, recognition, reinforcement, recall, and robotics mechanisms into a total neural architecture; relationship to data about hypothalamus, hippocampus, neocortex, and related brain regions. RECENT DEVELOPMENTS IN THE NEUROCOMPUTER INDUSTRY: This two-hour guest tutorial will provide an overview of the growth and prospects of the burgeoning neurocomputer industry by one of its most important leaders. EVENING SESSION: DISCUSSIONS WITH TUTORS MAY 11, 1990 ------------ MORNING SESSION (DR. FAGGIN) VLSI IMPLEMENTATION OF NEURAL NETWORKS: This is a four-hour self-contained tutorial on the application and development of VLSI techniques for creating compact real-time chips embodying neural network designs for applications in technology. 
Review of neural networks from a hardware implementation perspective; hardware requirements and alternatives; dedicated digital implementation of neural networks; neuromorphic design methodology using VLSI CMOS technology; applications and performance of neuromorphic implementations; comparison of neuromorphic and digital hardware; future prospectus. ---------------------------------------------------------------------------- COURSE FACULTY FROM BOSTON UNIVERSITY ------------------------------------- STEPHEN GROSSBERG, Wang Professor of CNS, as well as Professor of Mathematics, Psychology, and Biomedical Engineering, is one of the world's leading neural network pioneers and most versatile neural architects; Founder and 1988 President of the International Neural Network Society (INNS); Founder and Co-Editor-in-Chief of the INNS journal "Neural Networks"; an editor of the journals "Neural Computation", "Cognitive Science", and "IEEE Expert"; Founder and Director of the Center for Adaptive Systems; General Chairman of the 1987 IEEE First International Conference on Neural Networks (ICNN); Chief Scientist of Hecht-Nielsen Neurocomputer Company (HNC); and one of the four technical consultants to the national DARPA Neural Network Study. He is author of 200 articles and books about neural networks, including "Neural Networks and Natural Intelligence" (MIT Press, 1988), "Neural Dynamics of Adaptive Sensory-Motor Control" (with Michael Kuperstein, Pergamon Press, 1989), "The Adaptive Brain, Volumes I and II" (Elsevier/North-Holland, 1987), "Studies of Mind and Brain" (Reidel Press, 1982), and the forthcoming "Pattern Recognition by Self-Organizing Neural Networks" (with Gail Carpenter). GAIL CARPENTER is Professor of Mathematics and CNS; Co-Director of the CNS Graduate Program; 1989 Vice President of the International Neural Network Society (INNS); Organization Chairman of the 1988 INNS annual meeting; Session Chairman at the 1989 and 1990 IEEE/INNS International Joint Conference on Neural Networks (IJCNN); one of four technical consultants to the national DARPA Neural Network Study; editor of the journals "Neural Networks", "Neural Computation", and "Neural Network Review"; and a member of the scientific advisory board of HNC. A leading neural architect, Carpenter is especially well-known for her seminal work on developing the adaptive resonance theory architectures (ART 1, ART 2, ART 3) for adaptive pattern recognition. MICHAEL COHEN, Associate Professor of Computer Science and CNS, is a leading architect of neural networks for content addressable memory (Cohen-Grossberg model), vision (Feature Contour System), and speech (Masking Fields); editor of "Neural Networks"; Session Chairman at the 1987 ICNN, and the 1989 IJCNN; and member of the DARPA Neural Network Study panel on Simulation/Emulation Tools and Techniques. ENNIO MINGOLLA, Assistant Professor of Psychology and CNS, is holder of one of the first patented neural network architectures for vision and image processing (Boundary Contour System); Co-Organizer of the 3rd Workshop on Human and Machine Vision in 1985; editor of the journals "Neural Networks" and "Ecological Psychology"; member of the DARPA Neural Network Study panel of Adaptive Knowledge Processing; consultant to E.I. duPont de Nemours, Inc.; Session Chairman for vision and image processing at the 1987 ICNN, and the 1988 INNS meetings. 
DANIEL BULLOCK, Assistant Professor of Psychology and CNS, is developer of neural network models for real-time adaptive sensory-motor control of arm movements and eye-arm coordination, notably the VITE and FLETE models for adaptive control of multi-joint trajectories; editor of "Neural Networks"; Session Chairman for adaptive sensory-motor control and robotics at the 1987 ICNN and the 1988 INNS meetings; invited speaker at the 1990 IJCNN. JOHN MERRILL, Assistant Professor of Mathematics and CNS, is developing neural network models for adaptive pattern recognition, speech recognition, reinforcement learning, and adaptive timing in problem solving behavior, after having received his Ph.D. in mathematics from the University of Wisconsin at Madison, and completing postdoctoral research in computer science and linguistics at Indiana University. GUEST LECTURERS --------------- FEDERICO FAGGIN is co-founder and president of Synaptics, Inc. Dr. Faggin developed the Silicon Gate Technology at Fairchild Semiconductor. He also designed the first commercial circuit using Silicon Gate Technology: the 3708, an 8-bit analog multiplexer. At Intel Corporation he was responsible for designing what was to become the first microprocessor---the 4000 family, also called MCS-4. He and Hal Feeney designed the 8008, the first 8-bit microprocessor introduced in 1972, and later Faggin conceived the 8080 and with M. Shima designed it. The 8080 was the first high-performance 8-bit microprocessor. At Zilog Inc., Faggin conceived the Z80 microprocessor family and directed the design of the Z80 CPU. Faggin also started Cygnet Technologies, which developed a voice and data communication peripheral for the personal computer. In 1986 Faggin co-founded Synaptics Inc., a company dedicated to the creation of a new type of VLSI hardware for artificial neural networks and other machine intelligence applications. Faggin is the recipient of the 1988 Marconi Fellowship Award for his contributions to the birth of the microprocessor. ROBERT HECHT-NIELSEN is co-founder and chairman of the Board of Directors of Hecht-Nielsen Neurocomputer Corporation (HNC), a pioneer in neurocomputer technology and the application of neural networks, and a recognized leader in the field. Prior to the formation of HNC, he founded and managed the neurocomputer development and neural network applications at TRW (1983--1986) and Motorola (1979--1983). He has been active in neural network technology and neurocomputers since 1961 and earned his Ph.D. in mathematics in 1974. He is currently a visiting lecturer in the Electrical Engineering Department at the University of California at San Diego, and is the author of influential technical reports and papers on neurocomputers, neural networks, pattern recognition, signal processing algorithms, and artificial intelligence. MICHAEL JORDAN is an Assistant Professor of Brain and Cognitive Sciences at MIT. One of the key developers of the recurrent back propagation algorithms, Professor Jordan's research is concerned with learning in recurrent networks and with the use of networks as forward models in planning and control. His interest in interdisciplinary research on neural networks is founded in his training for a Bachelors degree in Psychology, a Masters degree in Mathematics, and a Ph.D. in Cognitive Science from the University of California at San Diego. He was a postdoctoral researcher in Computer Science at the University of Massachusetts at Amherst before assuming his present position at MIT. 
---------------------------------------------------------- REGISTRATION FEE: Regular attendee--$950; full-time student--$250. Registration fee includes five days of tutorials, course notebooks, one reception, five continental breakfasts, five lunches, four dinners, daily morning and afternoon coffee service, evening discussion sessions with leading neural architects. REGISTRATION: To register by telephone with VISA or MasterCard call (508) 649-9731 between 9:00AM--5:00PM (EST). To register by FAX, fill out the registration form and FAX back to (508) 649-6926. To register by mail, complete the registration form and mail with you full form of payment as directed. Make check payable in U.S. dollars to "Boston University". See below for Registration Form. To register by electronic mail, use the address "rosenber at bu-tyng.bu.edu". On-site registration on a space-available basis will take place from 2:00--7:00PM on Sunday, May 6 and from 7:00--8:00AM on Monday, May 7, 1990. A RECEPTION will be held from 4:00--7:00PM on Sunday, May 6. LECTURES begin at 8:00AM on Monday, May 7 and conclude at 12:30PM on Friday, May 11. STUDENT FELLOWSHIPS supporting travel, registration, and lodging for the Course and the Research Conference are available to full-time graduate students in a PhD program. Applications must be postmarked by March 1, 1990. Send curriculum vitae, a one-page essay describing your interest in neural networks, and a letter from a faculty advisor to: Student Fellowships, Neural Networks Course, Wang Institute of Boston University, 72 Tyng Road, Tyngsboro, MA 01879. CNS FELLOWSHIP FUND: Net revenues from the course will endow fellowships for Ph.D. candidates in the CNS Graduate Program. Corporate and individual gifts to endow CNS Fellowships are also welcome. Please write: Cognitive and Neural Systems Fellowship Fund, Center for Adaptive Systems, Boston University, 111 Cummington Street, Boston, MA 02215. ------------------------------------------------------------------------------ REGISTRATION FOR COURSE AND RESEARCH CONFERENCE Course: Neural Network Foundations and Applications, May 6--11, 1990 Research Conference: Neural Networks for Automatic Target Recognition, May 11--13, 1990 NAME: _________________________________________________________________ ORGANIZATION (for badge): _____________________________________________ MAILING ADDRESS: ______________________________________________________ ______________________________________________________ CITY/STATE/COUNTRY: ___________________________________________________ POSTAL/ZIP CODE: ______________________________________________________ TELEPHONE(S): _________________________________________________________ COURSE RESEARCH CONFERENCE ------ ------------------- [ ] regular attendee $950 [ ] regular attendee $90 [ ] full-time student $250 [ ] full-time student $70 (limited number of spaces) (limited number of spaces) [ ] Gift to CNS Fellowship Fund TOTAL PAYMENT: $________ FORM OF PAYMENT: [ ] check or money order (payable in U.S. dollars to Boston University) [ ] VISA [ ] MasterCard Card Number: ______________________________________________ Expiration Date: ______________________________________________ Signature: ______________________________________________ Please complete and mail to: Neural Networks Wang Institute of Boston University 72 Tyng Road Tyngsboro, MA 01879 USA To register by telephone, call: (508) 649-9731. HOTEL RESERVATIONS: Room blocks have been reserved at 3 hotels near the Wang Institute. 
Hotel names, rates, and telephone numbers are listed below. A shuttle bus will take attendees to and from the hotels for the Course and Research Conference. Attendees should make their own reservations by calling the hotel. The special conference rate applies only if you mention the name and dates of the meeting when making the reservations. Sheraton Tara, Nashua, NH, (603) 888-9970, $70/night+tax; Red Roof Inn, Nashua, NH, (603) 888-1893, $39.95/night+tax; Stonehedge Inn, Tyngsboro, MA, (508) 649-4342, $89/night+tax. The hotels in Nashua are located approximately 5 miles from the Wang Institute. A shuttle bus will be provided. ------------------------------------------------------------------------------- From noel%CS.EXETER.AC.UK at VMA.CC.CMU.EDU Fri Dec 15 07:52:19 1989 From: noel%CS.EXETER.AC.UK at VMA.CC.CMU.EDU (Noel Sharkey) Date: Fri, 15 Dec 89 12:52:19 GMT Subject: CogSci Meeting In-Reply-To: Jim Hendler's message of Thu, 14 Dec 89 10:08:09 -0500 <8912141508.AA02196@dormouse.cs.UMD.EDU> Message-ID: <5100.8912151252@entropy.cs.exeter.ac.uk> i support jim h. fully on his points, but knowing the breadth of don norman's work, i am sure he would agree. noel From R09614%BBRBFU01.BITNET at vma.CC.CMU.EDU Mon Dec 18 08:24:49 1989 From: R09614%BBRBFU01.BITNET at vma.CC.CMU.EDU (R09614%BBRBFU01.BITNET@vma.CC.CMU.EDU) Date: Mon, 18 Dec 89 14:24:49 +0100 Subject: NATO Conference Announcement Message-ID: ANNOUNCEMENT: _______________________________________ NATO Advanced Research Workshop on Self-organization, Emerging Properties and Learning. Center for Studies in Statistical Mechanics and Complex Systems The University of Texas Austin, Texas, USA March 12-14, 1990 _______________________________________ Topics ------ - Self-Organization and Dynamics in Networks of Interacting Elements - Dynamical Aspects of Neural Activity: Experiments and Modelling - From Statistical Physics to Neural Networks - Role of Dynamical Attractors in Cognition and Memory - Dynamics of Learning in Biological and Social Systems The goal of the workshop is to review recent progress on self-organization and the generation of spatio-temporal patterns in multi-unit networks of interacting elements, with special emphasis on the role of coupling and connectivity on the observed behavior. The importance of these findings will be assessed from the standpoint of information and cognitive sciences, and their possible usefulness in the field of artificial intelligence will be discussed. We will compare the collective behavior of model networks with the dynamics inferred from the analysis of cortical activity. This confrontation should lead to the design of more realistic networks, sharing some of the basic properties of real-world neurons. Sponsors -------- - NATO International Scientific Exchange Programmes - International Solvay Institutes for Physics and Chemistry, Brussels, Belgium - Center for Statistical Mechanics and Complex Systems, The University of Texas at Austin - IC2 Institute of The University of Texas at Austin International Organizing Committee -------------------------------- Ilya Prigogine, The University of Texas at Austin and Free University of Brussels Gregoire Nicolis, Free University of Brussels Agnes Babloyantz, Free University of Brussels J.
Demongeot, University of Grenoble, France Linda Reichl, The University of Texas at Austin Local Organizing Committee ------------------------- Ilya Prigogine, George Kozmetsky, Ping Chen, Linda Reichl, William Schieve, Robert Herman, Harry Swinney, Fred Phillips For Further Information Contact: ----------------------------- Professor Linda Reichl Center for Statistical Mechanics The University of Texas Austin, TX 78712, USA Phone: (512) 471-7253; Fax: (512) 471-9637; Bitnet: CTAA450 at UTA3081 or PAPE at UTAPHY From Ajay.Jain at ANJ.BOLTZ.CS.CMU.EDU Mon Dec 18 14:17:38 1989 From: Ajay.Jain at ANJ.BOLTZ.CS.CMU.EDU (Ajay.Jain@ANJ.BOLTZ.CS.CMU.EDU) Date: Mon, 18 Dec 89 14:17:38 EST Subject: tech report available Message-ID: A CONNECTIONIST ARCHITECTURE FOR SEQUENTIAL SYMBOLIC DOMAINS Ajay N. Jain School of Computer Science Carnegie Mellon University Pittsburgh, PA 15213-3890 Technical Report CMU-CS-89-187 December, 1989 Abstract: This report describes a connectionist architecture specifically intended for use in sequential domains requiring symbol manipulation. The architecture is based on a network formalism which differs from other connectionist networks developed for use in temporal/sequential domains. Units in this formalism are updated synchronously and retain partial activation between updates. They produce two output values: the standard sigmoidal function of the activity and its velocity. Activation flowing along connections can be gated by units. Well-behaved symbol buffers which learn complex assignment behavior can be constructed using gates. Full recurrence is supported. The network architecture, its underlying formalism, and its performance on an incremental parsing task requiring non-trivial dynamic behavior are presented. This report discusses a connectionist parser built for a smaller task than was discussed at NIPS. ---------------------------------------------------------------------- TO ORDER COPIES of this tech report: send electronic mail to copetas at cs.cmu.edu, or write the School of Computer Science at the address above. Those of you who requested copies of the report at NIPS a couple of weeks ago need not make a request (your copies are in the mail). ****** Do not use your mailer's "reply" command. ****** From Ajay.Jain at ANJ.BOLTZ.CS.CMU.EDU Tue Dec 19 11:43:31 1989 From: Ajay.Jain at ANJ.BOLTZ.CS.CMU.EDU (Ajay.Jain@ANJ.BOLTZ.CS.CMU.EDU) Date: Tue, 19 Dec 89 11:43:31 EST Subject: TR CMU-CS-89-187 Message-ID: The report won't be mailed until after the holidays. It isn't back from the printers yet. Your requests will be processed as soon as possible. Ajay From jose at neuron.siemens.com Tue Dec 19 17:01:37 1989 From: jose at neuron.siemens.com (Steve Hanson) Date: Tue, 19 Dec 89 17:01:37 EST Subject: COGNITIVE NEUROSCIENCE RFP Message-ID: <8912192201.AA03476@neuron.siemens.com.siemens.com> MCDONNELL-PEW PROGRAM IN COGNITIVE NEUROSCIENCE December 1989 Individual Grants-in-Aid for Research and Training Supported jointly by the James S. McDonnell Foundation and The Pew Charitable Trusts INTRODUCTION The McDonnell-Pew Program in Cognitive Neuroscience has been created jointly by the James S. McDonnell Foundation and The Pew Charitable Trusts to promote the development of cognitive neuroscience. The foundations have allocated $12 million over an initial three-year period for this program. Cognitive neuroscience attempts to understand human mental events by specifying how neural tissue carries out computations. 
Work in cognitive neuroscience is interdisciplinary in character, drawing on developments in clinical and basic neuroscience, computer science, psychology, linguistics, and philosophy. Cognitive neuroscience excludes descriptions of psychological function that do not address the underlying brain mechanisms and neuroscientific descriptions that do not speak to psychological function. The program has three components. (1) Institutional grants have been awarded for the purpose of creating centers where cognitive scientists and neuroscientists can work together. (2) To encourage Ph.D. and M.D. investigators in cognitive neuroscience, small grants-in-aid will be awarded for individual research projects. (3) To encourage Ph.D. and M.D. investigators to acquire skills for interdisciplinary research, small training grants will be awarded. During the program's initial three-year period, approximately $4 million will be available for the latter two components -- individual grants-in-aid for research and training -- which this announcement describes. RESEARCH GRANTS The McDonnell-Pew Program in Cognitive Neuroscience will issue a limited number of awards to support collaborative work by cognitive neuroscientists. Applications are sought for projects of exceptional merit that are not currently fundable through other channels, and from investigators who are not already supported by institutional grants under this Program. Preference will be given to support projects requiring collaboration or interaction between at least two subfields of cognitive neuroscience. The goal is to encourage broad, national participation in the development of the field and to facilitate the participation of investigators outside the major centers of cognitive neuroscience. Submissions will be reviewed by the program's advisory board. Grant support under this component is limited to $30,000 per year for two years, with indirect costs limited to 10 percent of direct costs. These grants are not renewable. The program is looking for innovative proposals that would, for example: -- combine experimental data from cognitive psychology and neuroscience; -- explore the implications of neurobiological methods for the study of the higher cognitive processes; -- bring formal modeling techniques to bear on cognition; -- use sensing or imaging techniques to observe the brain during conscious activity; -- make imaginative use of patient populations to analyze cognition; -- develop new theories of the human mind/brain system. This list of examples is necessarily incomplete but should suggest the general kind of proposals desired. Ideally, a small grant-in-aid for research should facilitate the initial exploration of a novel or risky idea, with success leading to more extensive funding from other sources. TRAINING GRANTS A limited number of grants will also be awarded to support training investigators in cognitive neuroscience. Here again, the objective is to support proposals of exceptional merit that are underfunded or unlikely to be funded from other sources. Some postdoctoral awards for exceptional young scientists will be available; postdoctoral stipends will be funded at prevailing rates at the host institution, and will be renewable annually for periods up to three years. Highest priority will be given to candidates seeking postdoctoral training outside the field of their previous training. Innovative programs for training young scientists, or broadening the experience of senior scientists, are also encouraged. 
Some examples of appropriate proposals follow. -- Collaboration between a junior scientist in a relevant discipline and a senior scientist in a different discipline has been suggested as an effective method for developing the field. -- Two senior scientists might wish to learn each other's discipline through a collaborative project. -- An applicant might wish to visit several laboratories in order to acquire new research techniques. -- Senior researchers might wish to investigate new methods or technologies in their own fields that are unavailable at their home institutions. Here again, examples can only suggest the kind of training experience that might be considered appropriate. APPLICATIONS Applicants should submit five copies of a proposal no longer than 10 pages (5,000 words). Proposals for research grants should include: -- a description of the work to be done and where it might lead; -- an account of the investigator's professional qualifications to do the work. Proposals for training grants should include: -- a description of the training sought and its relationship to the applicant's work and previous training; -- a statement from the mentor as well as the applicant concerning the acceptability of the training plan. Proposals for both research grants and training grants should include: -- an account of any plans to collaborate with other cognitive neuroscientists; -- a brief description of the available research facilities; -- no appendices. The proposal should be accompanied by the following separate information: -- a brief, itemized budget and budget justification for the proposed work, including direct costs, with indirect costs not to exceed 10 percent of direct costs; -- curricula vitae of the participating investigators; -- evidence that the sponsoring organization is a nonprofit, tax-exempt, public institution; -- an authorized form indicating clearance for the use of human and animal subjects; -- an endorsement letter from the officer of the sponsoring institution who will be responsible for administering the grant. Applications received on or before March 1 will be acted on by the following September 1; applications received on or before September 1 will be acted on by the following March 1. INFORMATION For more information contact: McDonnell-Pew Program in Cognitive Neuroscience Green Hall 1-N-6 Princeton University Princeton, New Jersey 08544-1010 Telephone: 609-258-5014 Facsimile: 609-258-3031 Email: cns at confidence.princeton.edu ADVISORY BOARD Emilio Bizzi, M.D. Eugene McDermott Professor in the Brain Sciences and Human Behavior Chairman, Department of Brain and Cognitive Sciences Whitaker College Massachusetts Institute of Technology, E25-526 Cambridge, Massachusetts 02139 Sheila Blumstein, Ph.D. Professor of Cognitive and Linguistic Sciences Dean of the College Brown University University Hall, Room 218 Providence, Rhode Island 02912 Stephen J. Hanson, Ph.D. Group Leader Learning and Knowledge Acquisition Research Group Siemens Research Center 755 College Road East Princeton, New Jersey 08540 Jon Kaas, Ph.D. Centennial Professor Department of Psychology Vanderbilt University Nashville, Tennessee 37240 George A. Miller, Ph.D. James S. McDonnell Distinguished University Professor of Psychology Department of Psychology Princeton University Princeton, New Jersey 08544 Mortimer Mishkin, Ph.D. Laboratory of Neuropsychology National Institute of Mental Health 9000 Rockville Pike Building 9, Room 1N107 Bethesda, Maryland 20892 Marcus Raichle, M.D. 
Professor of Neurology and Radiology Department of Radiology Washington University School of Medicine Barnes Hospital 510 S. Kingshighway, Campus Box 8131 St. Louis, Missouri 63110 Endel Tulving, Ph.D. Department of Psychology University of Toronto Toronto, Ontario M5S 1A1 Canada From gasser at iuvax.cs.indiana.edu Tue Dec 19 21:32:02 1989 From: gasser at iuvax.cs.indiana.edu (Michael Gasser) Date: Tue, 19 Dec 89 21:32:02 -0500 Subject: tech report available Message-ID: NETWORKS THAT LEARN PHONOLOGY Michael Gasser Chan-Do Lee Computer Science Department Indiana University Bloomington, IN 47405 Technical Report 300 December 1989 Abstract: Natural language phonology presents a challenge to connectionists because it is an example of apparently symbolic, rule-governed behavior. This paper describes two experiments investigating the power of simple recurrent networks (SRNs) to acquire aspects of phonological regularity. The first experiment demonstrates the ability of an SRN to learn harmony constraints, restrictions on the cooccurrence of particular types of segments within a word. The second experiment shows that an SRN is capable of learning the kinds of phonological alternations that appear at morpheme boundaries, in this case those occurring in the regular plural forms of English nouns. This behavior is usually characterized in terms of a derivation from a more to a less abstract level, and in previous connectionist treatments (Rumelhart & McClelland, 1986; Plunkett & Marchman, 1989) it has been dealt with as a process of yielding the combined form (plural) from the simpler form (stem). Here the behavior takes the form of the more psychologically plausible process of the production of a sequence of segments given a meaning or of a meaning given a sequence of segments. This is accomplished by having both segmental and semantic inputs and outputs in the network. The network is trained to auto-associate the current segment and the meaning and to predict the next phoneme. ---------------------------------------------------------------------- To order copies of this tech report, send mail to Nancy Garrett at nlg at cs.indiana.edu / Computer Science Department, Indiana University, Bloomington, IN 47405. From elman at amos.ucsd.edu Wed Dec 20 14:20:50 1989 From: elman at amos.ucsd.edu (Jeff Elman) Date: Wed, 20 Dec 89 11:20:50 PST Subject: Announcement: 1990 Connectionists Models Summer School Message-ID: <8912201920.AA06467@amos.ucsd.edu> December 20, 1989 ANNOUNCEMENT & SOLICITATION FOR APPLICATIONS CONNECTIONIST MODELS SUMMER SCHOOL / SUMMER 1990 UCSD La Jolla, California The next Connectionist Models Summer School will be held at the University of California, San Diego from June 19 to 29, 1990. This will be the third session in the series; the first two were held at Carnegie Mellon in the summers of 1986 and 1988. Previous summer schools have been extremely successful, and we look forward to the 1990 session with anticipation of another exciting summer school. The summer school will offer courses in a variety of areas of connectionist modelling, with emphasis on computational neuroscience, cognitive models, and hardware implementation. A variety of leaders in the field will serve as Visiting Faculty (the list of invited faculty appears below). In addition to daily lectures, there will be a series of shorter tutorials and public colloquia.
Proceedings of the summer school will be published the following fall by Morgan-Kaufmann (previous proceedings appeared as 'Proceedings of the 1988 Connectionist Models Summer School', Ed., David Touretzky, Morgan-Kaufmann). As in the past, participation will be limited to graduate students enrolled in Ph.D. programs (full- or part-time). Admission will be on a competitive basis. Tuition is subsidized for all students and scholarships are available to cover housing costs ($250). Applications should include the following: (1) A statement of purpose, explaining major areas of interest and prior background in connectionist modeling (if any). (2) A description of a problem area you are interested in modeling. (3) A list of relevant coursework, with instructors' names and grades. (4) Names of the three individuals whom you will be asking for letters of recommendation (see below). (5) If you are requesting support for housing, please include a statement explaining the basis for need. Please also arrange to have letters of recommendation sent directly from three individuals who know your current work. Applications should be sent to Marilee Bateman, Institute for Neural Computation, B-047, University of California, San Diego, La Jolla, CA 92093, (619) 534-7880. All application material must be received by March 15, 1990. Decisions about acceptance and scholarship awards will be announced April 1. If you have further questions, contact Marilee Bateman (address above), or one of the members of the Organizing Committee:

    Jeff Elman, UCSD, elman at amos.ucsd.edu
    Terry Sejnowski, UCSD/Salk Institute, terry at sdbio2.ucsd.edu
    Geoff Hinton, Toronto, hinton at ai.toronto.edu
    Dave Touretzky, CMU, touretzky at cs.cmu.edu

-------------------------------------------------- INVITED FACULTY: Yaser Abu-Mostafa (CalTech) Richard Lippmann (MIT Lincoln Labs) Dana Ballard (Rochester) James L. McClelland (Carnegie Mellon) Andy Barto (UMass/Amherst) Carver Mead (CalTech) Gail Carpenter (BU) David Rumelhart (Stanford) Patricia Churchland (UCSD) Terry Sejnowski (UCSD/Salk) Jack Cowan (Chicago) Al Selverston (UCSD) Jeff Elman (UCSD) Paul Smolensky (Colorado) Jerry Feldman (ICSI/UCB) David Tank (Bell Labs) Geoffrey Hinton (Toronto) David Touretzky (Carnegie Mellon) Michael Jordan (MIT) Halbert White (UCSD) Teuvo Kohonen (Helsinki) Ron Williams (Northeastern) George Lakoff (UCB) David Zipser (UCSD) From D4PBPHB2%EB0UB011.BITNET at VMA.CC.CMU.EDU Wed Dec 20 19:26:54 1989 From: D4PBPHB2%EB0UB011.BITNET at VMA.CC.CMU.EDU (Perfecto Herrera-Boyer) Date: Wed, 20 Dec 89 19:26:54 HOE Subject: Hardware for NN Message-ID: Dear connectionists: I am trying to make a survey of hardware suited to work with NN on IBM/PS computers (80286) in order to acquire some equipment for our laboratory. I am thinking of coprocessors, cards, and so on... Could anybody send me information about them? (It would be interesting to receive not only "objective" data but also "subjective" impressions from people who are working with those devices). I promise you a summary if you want it. Thanks in advance: Perfecto Herrera-Boyer Dpt. Psicologia Basica Univ. Barcelona From ang at hertz.njit.edu Thu Dec 21 11:35:32 1989 From: ang at hertz.njit.edu (nirwan ansari fac ee) Date: Thu, 21 Dec 89 11:35:32 EST Subject: Call for papers for GLOBECOM '90 Message-ID: <8912211635.AA18916@hertz.njit.edu> The 1990 IEEE Global Telecommunications Conference (GLOBECOM 90) will be held in San Diego, California, December 2-5, 1990.
I was asked by the technical committee to organize a session "Neural Networks in Communication Systems." You are cordially invited to submit an original technical paper related to this topic for consideration for GLOBECOM 90. The SCHEDULE is as follows:

    Complete Manuscript Due              3/15/1990
    Notification of Acceptance Mailed    5/30/1990
    Camera-ready Manuscript Due          8/20/1990

INSTRUCTIONS: The title page must include the author's name, complete return address, telephone, telex and fax numbers, and an abstract (100 words). For papers with multiple authors, please designate the author to whom all correspondence should be sent by listing that author first. All other pages should have the title and first author of the paper. The manuscript should not exceed 3,000 words in English. Page charges will be assessed for camera-ready copies exceeding five pages. Please send six double-spaced copies of the manuscript in English to: Dr. Arne Mortensen, GLOBECOM '90 Technical Program Secretary, M/A-COM Government Systems, 3033 Science Park Road, San Diego, CA 92121, Phone: (619) 457-2340, Telex: 910-337-1277, FAX: (619) 457-0579, and a copy to me: Dr. Nirwan Ansari, GLOBECOM '90 Neural Network Session Organizer, Electrical and Computer Engineering Department, New Jersey Institute of Technology, University Heights, Newark, NJ 07102, Phone: (201) 596-5739. Please also indicate in your cover letter to Dr. Mortensen that you have communicated with and sent me a copy of your manuscript for consideration for the "Neural Networks in Communication Systems" Session. For further questions, please feel free to contact me using the above address or the e-mail address, ang at hertz.njit.edu (node address: 128.235.1.26). From skrzypek at CS.UCLA.EDU Thu Dec 21 18:14:38 1989 From: skrzypek at CS.UCLA.EDU (Dr. Josef Skrzypek) Date: Thu, 21 Dec 89 15:14:38 PST Subject: neural nets and light adaptation (TR) Message-ID: <8912212314.AA20901@retina.cs.ucla.edu> NEURAL NETWORK CONTRIBUTION TO LIGHT ADAPTATION: FEEDBACK FROM HORIZONTAL CELLS TO CONES JOSEF SKRZYPEK Machine Perception Laboratory, Computer Science Department and CRUMP Institute of Medical Engineering, UCLA SUMMARY Vertebrate cones respond to a stepwise increase in localized light intensity with a graded potential change of corresponding amplitude. This S-shaped intensity-response (I-R) relation is limited to 3 log units of the stimulating light, and yet cone vision remains functional between twilight and the brightest time of day. This is due in part to a light adaptation mechanism localized in the outer segment of a cone. The phenomenon of light adaptation can be described as a resetting of the system's response-generation mechanism to a new intensity domain that reflects the ambient illumination. In this project we examined the spatial effects of annular illumination on the resetting of the I-R relation by measuring intracellular photoresponses in cones. Our results suggest that peripheral illumination contributes to the cellular mechanism of adaptation. This is done by a neural network involving a feedback synapse from horizontal cells to cones. The effect is to unsaturate the membrane potential of a fully hyperpolarized cone by "instantaneously" shifting the cone's I-R curves along the intensity axis to be in register with the ambient light level of the periphery. An equivalent electrical circuit with three different transmembrane channels (leakage, photocurrent and feedback) was used to model the static behavior of a cone.
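As a rough illustration only (the summary gives no equations, and the functional form and numbers below are assumptions, not taken from the TR), the kind of curve resetting described can be pictured with a hyperbolic, Naka-Rushton-type intensity-response relation whose half-saturation intensity is assumed to track the surround illumination:

    import numpy as np

    def cone_response(I, sigma, n=1.0, v_max=1.0):
        # Hyperbolic (Naka-Rushton-type) intensity-response relation:
        # S-shaped on a log-intensity axis, saturating within a few log
        # units around the half-saturation intensity sigma.
        return v_max * I**n / (I**n + sigma**n)

    test_flashes = np.logspace(-2, 6, 9)      # test intensities (arbitrary units)
    for surround in (1e0, 1e2, 1e4):          # annular/ambient illumination levels
        sigma = surround                      # assumption: feedback re-centers the
                                              # curve on the surround intensity
        print("surround =", surround,
              np.round(cone_response(test_flashes, sigma), 2))

Each surround level simply slides the same S-shaped curve along the log-intensity axis, which is the sort of shift the summary describes.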
SPICE simulation showed that interactions between the feedback synapse and the light-sensitive conductance in the outer segment can shift the I-R curves along the intensity domain, provided that the phototransduction mechanism is not saturated during a maximally hyperpolarized light response. Key words: adaptation, feedback, cones, retina, lateral interactions Josef Skrzypek Computer Science Department 3532D Boelter Hall UCLA Los Angeles, California 90024 INTERNET: SKRZYPEK at CS.UCLA.EDU From NHATAOKA%vax1.tcd.ie at cunyvm.cuny.edu Thu Dec 21 13:38:00 1989 From: NHATAOKA%vax1.tcd.ie at cunyvm.cuny.edu (NHATAOKA%vax1.tcd.ie@cunyvm.cuny.edu) Date: Thu, 21 Dec 89 18:38 GMT Subject: Technical report is available Message-ID: <8912271217.AA01539@uunet.uu.net> The following technical report is available. Unfortunately, I want to post this to this connectionists mailing list only, so "Please don't forward to other newsgroups or mailing lists." Speaker-Independent Phoneme Recognition on TIMIT Database Using Integrated Time-Delay Neural Networks (TDNNs) Nobuo Hataoka(*) and Alex H. Waibel November 27, 1989 CMU-CS-89-190 (also, CMU-CMT-89-115) School of Computer Science Carnegie Mellon University Pittsburgh, PA 15213 Abstract: This paper describes a new structure of Neural Networks (NNs) for speaker-independent and context-independent phoneme recognition. This structure is based on the integration of Time-Delay Neural Networks (TDNN, Waibel et al.): it has several TDNNs separated according to the duration of phonemes. As a result, the proposed structure has the advantage that it deals with phonemes of varying duration more effectively. In the experimental evaluation of the proposed new structure, recognition of 16 English vowels was performed using 5268 vowel tokens picked from 480 sentences spoken by 140 speakers (98 males and 42 females) on the TIMIT (TI-MIT) database. The numbers of training tokens and testing tokens were 4326 from 100 speakers (69 males and 31 females) and 942 from 40 speakers (29 males and 11 females), respectively. The result was a 60.5% recognition rate (around 70% for a collapsed 13-vowel case), improved from 56% with the single TDNN structure, showing the effectiveness of the proposed new structure in using temporal information. (*) The author was a visiting researcher from Central Research Laboratory, Hitachi, Ltd., Japan. This work has been done on a collaborative research project between the Center for Machine Translation of CMU and Hitachi, Ltd. Currently, the author is working for Hitachi Dublin Laboratory in Trinity College, Ireland. --------------------------------------------------- If you want to have a copy of this report, please send an e-mail or a letter to the following address. nhataoka%vax1.tcd.ie at cunyvm.cuny.edu (the "1" in "vax1" is the digit one) or Alison Dunne Hitachi Dublin Laboratory O'Reilly Institute Trinity College Dublin 2, Ireland P.S. Do not use your mailer's "reply" command. From poggio at ai.mit.edu Wed Dec 27 11:07:40 1989 From: poggio at ai.mit.edu (Tomaso Poggio) Date: Wed, 27 Dec 89 11:07:40 EST Subject: MIT AI Lab memo 1164 Message-ID: <8912271607.AA23473@rice-chex> The following technical report is available from the MIT AI Lab Publication Office (send e-mail to liz at ai.mit.edu): Networks and the Best Approximation Property by Federico Girosi and Tomaso Poggio ABSTRACT Networks can be considered as approximation schemes.
Multilayer networks of the backpropagation type can approximate continuous functions arbitrarily well (Cybenko, 1989; Funahashi, 1989; Stinchcombe and White, 1989). We prove that networks derived from regularization theory and including Radial Basis Functions (Poggio and Girosi, 1989, AI memo 1140) have a similar property. From the point of view of approximation theory, however, the property of approximating continuous functions arbitrarily well is not sufficient for characterizing good approximation schemes. More critical is the property of {\it best approximation}. The main result of this paper is that multilayer networks, of the type used in backpropagation, do not have the best approximation property. For regularization networks (in particular Radial Basis Function networks) we prove existence and uniqueness of best approximation. From pollack at cis.ohio-state.edu Wed Dec 27 22:08:06 1989 From: pollack at cis.ohio-state.edu (Jordan B Pollack) Date: Wed, 27 Dec 89 22:08:06 EST Subject: FTP Service is Down; should it come up? Message-ID: <8912280308.AA00375@toto.cis.ohio-state.edu> **Do not forward to other newsgroups** The directory of connectionist electronic tech-reports, pub/neuroprose on cheops.cis.ohio-state.edu, seems to have been deleted. It is impossible to tell whether it was done by a local diskspace scrounger or one of us, perhaps a double agent really working for symbolic AI!!! A request for backup has been made to the local authorities, and I will post another message when it is restored to its former glory. Will take this opportunity for a straw poll:

    1) Have you ever put a report in neuroprose?
    2) Approx how many reports have you retrieved this way?
    3) Do you find ftp easy or difficult to use?
    4) Do you find ftp's binary mode, and the compress/uncompress protocol, easy or difficult to use?
    5) Have you had any problems printing the PostScript or TeX posted by others?
    6) Any other comments on the viability of continuing the distribution of preprints in this fashion?

**Do not forward to other newsgroups** Jordan Pollack Laboratory for AI Research CIS Dept/OSU 2036 Neil Ave email: pollack at cis.ohio-state.edu Columbus, OH 43210 Fax/Phone: (614) 292-4890 From smk at flash.bellcore.com Thu Dec 28 11:16:57 1989 From: smk at flash.bellcore.com (Selma M Kaufman) Date: Thu, 28 Dec 89 11:16:57 EST Subject: No subject Message-ID: <8912281616.AA24130@flash.bellcore.com> Subject: Reprint Available: Learning of Stable States in Stochastic Asymmetric Networks Robert B. Allen and Joshua Alspector Bellcore TR-AR-89-351 December, 1989 Boltzmann-based models with asymmetric connections are investigated. Although they are initially unstable, we find that these networks spontaneously self-stabilize as a result of learning. Moreover, we find that pairs of weights symmetrize during learning; however, the symmetry is not enough to account for the observed stability. To characterize the system we consider how its entropy is affected by learning and by the entropy of the information stream. Finally, the stability of an asymmetric network was confirmed with an electronic model. For paper copies, contact: Selma Kaufman, Bellcore, 2M-356, 445 South St., Morristown, NJ 07960-1910. smk at flash.bellcore.com From hinton at ai.toronto.edu Thu Dec 28 13:15:13 1989 From: hinton at ai.toronto.edu (Geoffrey Hinton) Date: Thu, 28 Dec 89 13:15:13 EST Subject: Technical Report available Message-ID: <89Dec28.131528est.11309@ephemeral.ai.toronto.edu> Please do not reply to this message.
To order a copy of the TR described below, please send email to carol at ai.toronto.edu _________________________________________________________________________ DETERMINISTIC BOLTZMANN LEARNING IN NETWORKS WITH ASYMMETRIC CONNECTIVITY Conrad C. Galland and Geoffrey E. Hinton Department of Computer Science University of Toronto 10 Kings College Road Toronto M5S 1A4, Canada Technical Report CRG-TR-89-6 The simplicity and locality of the "contrastive Hebb synapse" (CHS) used in Boltzmann machine learning make it an attractive model for real biological synapses. The slow learning exhibited by the stochastic Boltzmann machine can be greatly improved by using a mean field approximation, and it has been shown (Hinton, 1989) that the CHS also performs steepest descent in these deterministic mean field networks. A major weakness of the learning procedure, from a biological perspective, is that the derivation assumes detailed symmetry of the connectivity. Using networks with purely asymmetric connectivity, we show that the CHS still works in practice provided the connectivity is grossly symmetrical, so that if unit i sends a connection to unit j, there are numerous indirect feedback paths from j to i. So long as the network settles to a stable state, we show that the CHS approximates steepest descent and that the proportional error in the approximation can be expected to scale as 1/sqrt(N), where N is the number of connections. ________________________________________________________________________ PS: The research described in this TR uses a different kind of network and a different analysis than the research described in the TR by Allen and Alspector that was recently advertised on the connectionists mailing list. However, the general conclusion of both TRs is the same. From gasser at iuvax.cs.indiana.edu Thu Dec 28 15:51:33 1989 From: gasser at iuvax.cs.indiana.edu (Michael Gasser) Date: Thu, 28 Dec 89 15:51:33 -0500 Subject: tech report Message-ID: **********DO NOT FORWARD TO OTHER BBOARDS************** **********DO NOT FORWARD TO OTHER BBOARDS************** **********DO NOT FORWARD TO OTHER BBOARDS************** The TR "Networks that Learn Phonology" (Gasser & Lee, Indiana University Computer Science Dept. TR #300), advertised here last week, has been added to the (recently restored) neuroprose database at Ohio State. If you already asked for a hardcopy, please try the ftp option. If this is not convenient, you can request a copy from Nancy Garrett, nlg at iuvax.cs.indiana.edu, Computer Science Department, Indiana University, Bloomington, IN 47405. Here's how to obtain a copy using ftp:

    unix> ftp cheops.cis.ohio-state.edu    # (or ftp 128.146.8.62)
    Name (cheops.cis.ohio-state.edu:): anonymous
    Password (cheops.cis.ohio-state.edu:anonymous): neuron
    ftp> cd pub/neuroprose
    ftp> type binary
    ftp> get
    (remote-file) gasser.phonology.ps.Z
    (local-file) foo.ps.Z
    ...
    ftp> quit
    unix> uncompress foo.ps
    unix> lpr -P(your_local_postscript_printer) foo.ps
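For those who prefer to script the retrieval, roughly the same session can be driven from Python's standard ftplib module; this is only a sketch, and the host, login, directory and file name are simply the ones from the manual session above, with decompression and printing left to the usual uncompress and lpr commands:

    from ftplib import FTP

    host = "cheops.cis.ohio-state.edu"
    filename = "gasser.phonology.ps.Z"

    ftp = FTP(host)
    ftp.login("anonymous", "neuron")       # anonymous login, as in the session above
    ftp.cwd("pub/neuroprose")
    ftp.retrbinary("RETR " + filename,     # binary-mode transfer of the compressed
                   open(filename, "wb").write)  # PostScript into the current directory
    ftp.quit()
    # Then, outside Python:  uncompress gasser.phonology.ps.Z
    #                        lpr -P<your_printer> gasser.phonology.ps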
If you placed a report in there since then, I apologize for the inconvenience and ask anyone whose work was lost to re-post. Current contents:

    barto.control.ps
    barto.control.ps.Z
    barto.sequential_decisions.ps
    barto.sequential_decisions.ps.Z
    gasser.phonology.ps.Z
    kehagias.hmm0289.tex
    maclennan.contin_comp.tex
    maclennan.tex
    miikkulainen.hierarchical.ps.Z
    pollack.newraam.ps
    pollack.newraam.ps.Z
    pollack.nips88.ps
    pollack.perceptrons.ps
    tenorio.cluster.plain
    tenorio.speech_dev.ps

(This is obviously just a fraction of the tech reports announced on the newsgroup, but the poll (below) shows that a few people appreciate rapid retrieval.) At the suggestion of Barak, I have changed the protocol somewhat to avoid the problem of malicious vandalism in the future. Unfortunately it puts me in the loop. The pub/neuroprose directory is now publicly readable. It contains a subdirectory called Inbox, which is publicly writable. To post a report, PUT it in the pub/neuroprose/Inbox directory and send me email, and I will move your file to the neuroprose directory and acknowledge. Similar intervention is required for deleting a report. This seems less horrible than discovering the directory missing every couple of months. Jordan -------------------------------- Here are initial results of the straw poll.

    2 respondents have posted reports
    12 have retrieved reports (average of 2)
    12 find FTP easy to use, 1 hard
    8 find binary compression easy, 4 hard
    11 votes for continuation.

Major Problems:

    No service in Europe (4)
    Difficulty with TeX standard (3)
    Still cutting & pasting figures, can't use it (2)
    Non-unix Mac is pretty incompatible (1)
    Plagiarism worry (1)
    Suspicion of technology (1)

*****do not forward to other newsgroups ***** From pauls at neuron.Colorado.EDU Fri Dec 29 19:06:55 1989 From: pauls at neuron.Colorado.EDU (Paul Smolensky) Date: Fri, 29 Dec 89 17:06:55 MST Subject: Behavioral Neuroscience Faculty Position at Boulder Message-ID: <8912300006.AA00736@neuron.colorado.edu> Below is the job description for a faculty position in Behavioral Neuroscience at the University of Colorado at Boulder. Our campus has an active, collaborative, multi-disciplinary connectionist community. Anyone interested in more information is welcome to contact us; if you apply for the job, let us know so we can follow up. -- Paul Smolensky & Mike Mozer BEHAVIORAL NEUROSCIENCE POSITION University of Colorado, Boulder The Department of Psychology at the University of Colorado at Boulder invites applications for a faculty position in Behavioral Neuroscience, starting September 1990. Outstanding applicants at any rank are encouraged to apply. This position carries with it attractive research space and significant start-up funds. Applicants should send a vita, 3 letters of recommendation, and a statement of teaching and research interest to: Jerry Rudy, Chairperson, Behavioral Neuroscience Search Committee, Department of Psychology, Box 345, University of Colorado, Boulder, CO 80309. Application deadline is January 15, 1990. From mv10801 at uc.msc.umn.edu Fri Dec 29 10:42:34 1989 From: mv10801 at uc.msc.umn.edu (mv10801@uc.msc.umn.edu) Date: Fri, 29 Dec 89 09:42:34 CST Subject: Symmetrizing weights Message-ID: <8912291542.AA20346@uc.msc.umn.edu> On a related note, I described in a 1988 tech report how asymmetric lateral connections in a self-organizing neural network can symmetrize by using an inhibitory learning rule. The paper is called "Self-Organizing Neural Networks for Perception of Visual Motion," by J. A. Marshall.
A more concise version will appear in the next issue of Neural Networks (January 1990). To obtain the TR, you can write to the Dept. Secretary, Boston Univ. Computer Science Dept., 111 Cummington St., Boston, MA 02215, U.S.A., [pam at cs.bu.edu], and ask for CS-TR-88-010; the price is $7.00. --Jonathan A. Marshall mv10801 at uc.msc.umn.edu Center for Research in Learning, Perception, and Cognition 205 Elliott Hall, University of Minnesota Minneapolis, MN 55455, U.S.A.
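A toy calculation (not taken from any of the TRs mentioned above) may help make the symmetrization point concrete: an update that is symmetric in the indices i and j, such as a contrastive-Hebb-style rule, gives w_ij and w_ji identical increments, so a small weight decay is enough to shrink whatever asymmetry the initial weights had. The "activities" below are random stand-ins for the clamped- and free-phase states, purely for illustration:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10
    W = rng.normal(size=(n, n))            # initially asymmetric weight matrix
    eta, decay = 0.05, 0.01
    for step in range(501):
        s_plus = rng.uniform(-1, 1, n)     # stand-in for clamped-phase activities
        s_minus = rng.uniform(-1, 1, n)    # stand-in for free-phase activities
        dW = np.outer(s_plus, s_plus) - np.outer(s_minus, s_minus)  # symmetric in i, j
        W += eta * dW - decay * W
        if step % 100 == 0:
            asym = np.linalg.norm(W - W.T) / np.linalg.norm(W + W.T)
            print(step, round(asym, 4))    # relative asymmetry shrinks over time

Because dW is symmetric, the antisymmetric part of W receives no learning signal and simply decays; this is only a caricature of the mechanisms analyzed in the TRs, not a substitute for them.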
From kannan at faulty.che.utexas.edu Tue Dec 5 14:29:05 1989 From: kannan at faulty.che.utexas.edu (kannan@faulty.che.utexas.edu) Date: Tue, 5 Dec 89 13:29:05 CST Subject: Question on BP.. Message-ID: <8912051929.AA10774@faulty.che.utexas.edu.che.utexas.edu> Hi, I would like to know if continuous-valued inputs can be used for the Back Propagation algorithm? If so, what problems were faced in any of the simulations done? I would also like to know if there is any software available which would allow one to give inputs, say, between 0 and 1. Thanks, Kannan From Scott.Fahlman at B.GP.CS.CMU.EDU Tue Dec 5 16:43:45 1989 From: Scott.Fahlman at B.GP.CS.CMU.EDU (Scott.Fahlman@B.GP.CS.CMU.EDU) Date: Tue, 05 Dec 89 16:43:45 EST Subject: Question on BP.. In-Reply-To: Your message of Tue, 05 Dec 89 13:29:05 -0600. <8912051929.AA10774@faulty.che.utexas.edu.che.utexas.edu> Message-ID:

    I would like to know if continuous-valued inputs can be used for the Back Propagation algorithm?

Yes, definitely. For two good examples, see the work of Lapedes and Farber at Los Alamos on predicting time series (I don't have the reference handy) and the work on the two-spirals benchmark. The latter is described in a paper by Lang & Witbrock in the Proceedings of the 1988 Connectionist Models Summer School and in follow-up work presented by me and Chris Lebiere in the 1989 NIPS conference (proceedings due out in April or so, tech report sooner). In addition, most of the problems I've seen on phoneme labeling and sonar use multiple continuous-valued inputs.
-- Scott Fahlman From slehar at bucasb.BU.EDU Tue Dec 5 23:20:29 1989 From: slehar at bucasb.BU.EDU (slehar@bucasb.BU.EDU) Date: Tue, 5 Dec 89 23:20:29 EST Subject: Question on BP.. In-Reply-To: connectionists@c.cs.cmu.edu's message of 5 Dec 89 21:07:14 GMT Message-ID: <8912060420.AA22688@bucasd.bu.edu> Yeah - I did a hand-sketched character recognition program with backprop (see ICNN 1986, was it? the Boston conference). I had a 100 x 100 sketch pad which I down-sampled to a 10 x 10 grid, which became an input layer of 100 neurons. A run-time option allowed for either thresholding, which produced on or off binary inputs, or proportional sampling, which produced graded neurons between 0.0 and 1.0. The algorithm quite effortlessly handles both. To the human observer, the graded inputs looked much more recognizable because you could tell whether the sketch just nicked the corner of a grid square, or ran smack through the middle of it. There was, in fact, more information in the graded version. To my surprise, backprop actually preferred the binarized version. It learned faster, and discriminated better. From peterc at cs.brandeis.edu Wed Dec 6 13:48:09 1989 From: peterc at cs.brandeis.edu (Peter Cariani) Date: Wed, 6 Dec 89 13:48:09 est Subject: Alternative conceptions of symbol systems Message-ID: Comments on Harnad's definition of symbol system: We all owe to Steve Harnad the initiation of this important discussion. I believe that Harnad has taken the discourse of the symbol grounding problem in the right direction, toward the grounding of symbols in their interactions with the world at large. I think, however, that we could go further in this direction, and in the process continue to re-examine some of the fundamental assumptions that are still in force. The perspective presented here is elaborated much more fully and systematically in a doctoral dissertation that I completed in May of this year: Cariani, Peter (1989) On the Design of Devices With Emergent Semantic Functions Ph.D. Dissertation, Department of Systems Science, SUNY-Binghamton, University Microfilms, Ann Arbor, Michigan. My work is primarily based on that of theoretical biologists Howard Pattee and Robert Rosen. Pattee has been elaborating on the evolutionary origins of symbols in biological systems while Rosen has concentrated on the modelling relations that biological organisms implement. The Hungarian theoretical biologist George Kampis has also recently published work along these lines. I would like to apologise for the length of this response, but I come out of a field which is very small and virtually unknown by those outside of it, so many of the basic concepts must be covered to avoid misunderstandings. Here are some suggestions for clarifying this murky discourse about symbol systems: 1) Define the tokens in terms of observable properties. The means of recognizing the tokens or states of the system must be given explicitly, such that all members of a community of observer-participants can reliably agree on what "state" the physical system is in. Without this specification the definition is gratuitous hand-waving. I stress this because there are a number of papers in the literature which discuss "computations" in the physical world (e.g.
"the universe is a gigantic computation in progress") without the slightest indication of what the symbol tokens are that being manipulated, what the relevant states of such systems might be, or how we would go about determining, in concrete terms, whether a given physical system is to be classified as a physical symbol system. One has to be careful when one says "practically everything can be interpreted as rule-governed." Of course we can easily wave our hands and say, yes, those leaves fluttering in the breeze over there are rule-governed, without having any idea what the specific rules are or for that matter, what the states are), but to demonstrate that a phenomenon is rule-governed, we should show how we would come to see it as such: we should concretely show what measurements need to be made, we should make them, and then articulate the rules which describe/govern the behavior. If we say "a computer is a physical symbol system" we mean that if we look at the computer through the appropriate observational frame, measuring the appropriate voltages at the logic gates, then we can use this device to consistently and reliably implement a deterministic input-output function. For each initial distinguishable state of affairs, by operating the device we always arrive at one and only one end state within some humanly-relevant amount of time. This is a functionally-based physically-implemented concept of a formal system, one which is related to Hilbert's idea of reliable physical operations on concrete symbols leading to consistent results. Note that this definition is distinct from logicist/platonist definitions which include nonconcrete objects (e.g. sets of sets) or physically unrealizable objects (e.g. potential and actual infinities, indefinitely extendible tapes). 2) Abandon the explicit-implicit rule distinction. First, I'm not sure if Wittgenstein's distinction between "explicit" and "implicit" rule-following is appropriate here, since we are taking the role of external observers rather than participants in a language-game. If the purpose of the definition is to give us criteria to decide whether we are participating in a formal system, then we must know the rules to follow them. If the purpose is to identify "physical symbol systems" in nature and in human artefacts, then this distinction is irrelevant. What does it mean for a computer to explicitly or implicitly carry out a logical operation? If it made a difference, then the device would cease to be wholly syntactic. If it doesn't make a difference then we don't need it in our definition. Does the brain implement a physical symbol system, and if so, does it follow rules explicitly or implicitly? How would we decide? 3) Abandon semantic interpretability. I'm not sure if I understand fully the motivation behind this criterion of semantic interpretability. An external observer can assign whatever meanings s/he chooses to the tokens of the formal device. This criterion makes the definition very subjective, because it depends upon an arbitrary assignment of meaning. I don't even see how this is restrictive, since the observer can always come up with purely whimsical mappings, or to simply let the tokens stand for themselves. Note that "semantic interpretability" does not confer upon the physical symbol system its own semantics. The relations of the symbols manipulated in computer programs to the world at large are completely parasitical on human interpreters, unless the computer is part of a robot (i.e. 
possesses its own sensors and effectors). Merely being semantically interpretable doesn't ground the semantics in a definite way; when we say "X in my program represents the number of speeding violations on the Mass Pike" we stabilize the relation of the symbol X in the computer relative to ourselves (assuming that we can be completely consistent in our interpretation of the program). But each of us has a different tacit interpretation of what "the number of speeding violations on the Mass Pike" means. (Does a violator have to be caught for it to be a violation? Are speeding police cars violations? How is speed measured?) In order for this tacit interpretation to be made explicit we would need to calibrate our perceptions and their classifications along with our use of language to communicate them so that we as a community could reach agreement on our interpretations. The wrong turn that Carnap and many others made in the 1930's was to assume that these interpretations could be completely formalized, that a "logical semantics" was possible in which one could unambiguously determine the "meaning of an expression" within the context of other expressions. The only way to do this is to formalize completely the context, but in doing so you transform a semantic relation into a syntactic one. The semantic relation of the symbol to the nonsymbolic world at large gets reduced to a syntactic rule-governed relation of the symbol to other symbols. (Contingent truths become reduced to necessary truths.) What Carnap tried to say was that as long as a proposition referred to an observation statement (which refers to an act of observation), then that proposition has a semantic content. This has led us astray to the point that many people no longer believe that they need to materially connect the symbols to the world through perception and action, that merely referring to a potential connection is enough. This is perhaps the most serious failing of symbolic AI, the failure to ground the symbols used by their programs in materially implemented connections to the external world. 4) Abandon semantic theories based on reference A much better alternative to logical semantics involves replacing these syntactic theories of reference with a pragmatist semiotic when we go to analyze the roles of symbols in various kinds of devices. Pragmatist semiotics (as developed within Charles Morris' framework) avoid the formal reductionism and realist assumptions of referential theories of meaning by replacing correspondences between "objective" referents with physically implemented semantic operations (e.g. measurement, perception, control, action). These ideas are developed more fully in my dissertation. What one must do to semantically ground the symbols is to connect them to the world via sensors and effectors. If they are to be useful to the device or organism, they must be materially linked to the world in a nonarbitrary way, rather than referentially connected in someone's mind or postulated as unspecified logical ("causal") connections (as in "possible world" semantics). 4) Abandon Newell and Pylyshyn's Symbol Level. Upon close examination of both of their rationales for a separate symbol level, one finds that it rests precariously upon a distinction, between the Turing machine's internal states and the state of affairs on the tape (Pylyshyn, 1984, pp.68-74). 
Now the essential nature of this distinction is maintained because one is potentially infinite and the other is finite (else one could simply make a big finite-state-automaton and the distinction would be an abitrary labelling of the global machine states), but physically realizable devices cannot be potentially infinite, so the essential, nonarbitrary character of the distinction vanishes (Cariani, 1989, Appendix 2). 5) Purge the definition of nonphysical, platonic entities (or at least recognize them as such and be aware of them). For example, the definition of physical symbol systems is intimately tied up with Turing's definition of computation, but, as von Neumann noted, this is not a physical definition; it is a formal one. Now, physically realizable automata cannot have indefinitely extendible tapes, so the relevance of potentially-infinite computations to real world computations is dubious. Everything we can physically compute can be described in terms of finite-state automata (finite tape Turing machines). We run out of memory space and processing time long before we ever encounter computability limitations. Computational complexity matters, computability doesn't. I'd be especially interested in thoughtful counter-arguments to this point. 6) Alternative I: Adopt a physical theory of symbolic action. Howard Pattee has been developing a physical theory of symbolic function for 20 years--symbolic processes are those which can be described in terms of nonholonomic constraints (in terms of the equations of motion and basins of attraction (in terms of trajectories)(see refs: Pattee; Cariani, Minch, Rosen). (Next summer there will be a workshop entitled "Symbols and Dynamics" at the ISSS meeting in Portland, Ore., July 8-13, 1990. Contact: Gail Fleischaker, 76 Porter St., Somerville, MA 02143 for more info.) The only disadvantage of these approaches lie in their classical/ realist assumption of complete knowledge of the state space within which the symbolic activity occurs. These premises are deeply embedded in the very terms of the disourse, but nevertheless, this descriptive physical language is exceedingly useful as long as the limitations of these assumptions are constantly kept in mind. To translate from the semiotic to the physical, syntactic relations are those processes for which a nonholonomic rate-independent equation of constraint can completely replace the rate-dependent laws of motion. For an electronic computer, we can replace all of the microscopic electromagnetic equations of motion describing the trajectories of electrons with macroscopic state-transition rules describing gate voltages in terms of binary states. These state transition rules are not rate-dependent, since they depend upon successions of states rather than time; consequently time need not enter explicitly when describing the behavior of a computer in terms of binary states of gate voltages. Semantic relations are those processes which can be described in terms of rate-independent terms coupled to rate-dependent terms: one side of the constraint equation is symbolic and rate-independent, the other half is nonsymbolic and rate-dependent. Processes of measurement are semantic in character: a rate-dependent, nonsymbolic interaction gives rise to a rate-independent symbolic output. Here pragmatic relations are those processes which change the structure of the organism or device, which appear in the formalism as changes in the nonholonomic constraints over time. 
7) Alternative II: Adopt a phenomenally grounded systems-theoretic definition. Part of my work has been to ground the definition of symbol in terms of the observed behavior of a system. This is the only way we will arrive at an unambiguous definition. We select a set of measuring devices which implement distinctions on the world which become our observable "states." We observe the behavior of the physical system through this observational framework. This strategy is similar to the way W. Ross Ashby grounded his theory of systems. Either the state-transitions are deterministic--state A is always followed by state B which is always followed by state G--or they are nondeterministic-- state D is sometimes followed by state F and sometimes followed by state J. Here the relation between states A,B, and G appears to be symbolic, because the behavior can be completely captured in terms of rules, where the relation between the states D, F, and J appears nonsymbolic, because the behavior depends upon aspects of the world which are not captured by this observational frame. Syntactic, rule-governed, symbol manipulations appear to an external observer as deterministic state transitions (in Ashby's terms, "a state-determined system"). Semantic processes appear to the observer as nondeterministic, contingent state transitions leading to states which appear as symbolic. Pragmatic relations appear as changes in the structure of the observed state-transitions. 8) Alternative III: Adopt a physical, mechanism-based definition of symbol systems. Symbolic and nonsymbolic can also be viewed in terms of "digital" and "analog" in the sense of differentiated (discrete) and nondifferentiated (continuous). Sensors implement semantic A-to-D operations. Logical operations ("computations") implement syntactic, determinate D-to-D transformations. Controls implement semantic D-to-A operations. One has to be careful here, because there are many confusing uses of these words (e.g. "analog computation"), and what appears to be "analog" or "digital" is a function of how you look at the device. Given a particular observational framework and a common usage of terms, however, these distinctions can be made reliable. I would argue that von Neumann's major philosophical works (General & Logical Theory of Automata, Self-Reproducing Automata, The Computer and the Brain) all take this approach. 9) Alternative IV: Adopt a semiotic-functionalist definition of symbol systems. It can be argued that the basic functionalities needed in the modelling relation are the ability to convert nonsymbolic interactions with the world into symbols (measurement), the ability to manipulate symbols in a definite, rule-governed way (computations), and the ability to use a symbol to direct action on the nonsymbolic world (controls). I have argued that these functionalities are irreducible; One cannot achieve measurements by doing computations simply because measurement involves a contingent state-transition where two or more possible observed outcomes are reduced to one observed outcome, whereas computation involves a necessary state-transition, where each state has but one observed outcome. These assertions are similar to the epistemological positions adopted by Bohr, von Neumann, Aristotle and many others. In such a definition, a physical symbol system is defined in terms of its use to us as observer-participants. 
Are we trying to gain information about the external world by reducing the possible observed states of a sensor to one (by performing a measurement)? Are we trying to manipulate symbols in a consistent, reliable way so that we always arrive at the same outcome given the same input strings and rules? If so, we are performing computations. Are we trying to use symbols to change the nonsymbolic world by acting on it? If so, we are employing symbolically-directed control operations. In summary, there are many worthwhile alternatives to the basic assumptions that have been handed down to us through logical positivism, model-theoretic semantics, artificial intelligence and cognitive psychology. Steve Harnad has done us a great service in making many of these assumptions visible to us and clarifying them in the process. There are other conceptual frameworks which can be of great assistance to us as we engage in this process: theoretical biology, semiotics/pragmatist philosophy, cybernetics and systems theory. It is difficult to entertain ideas which challenge cherished modes of thought, but such critical questioning and debate are indispensable if we are to deepen our understanding of the world around us. References: ------------------------------------------------------------------------------ Cariani, Peter (1989) On the Design of Devices with Emergent Semantic Functions. PhD Dissertation, Department of Systems Science, State University of New York at Binghamton; University Microfilms, Ann Arbor, MI. (1989) Adaptivity, emergence, and machine-environment dependencies. Proc 33rd Ann Mtg Intl Soc System Sciences (ISSS), July, Edinburgh, III:31-37. Kampis, George (1988) Two approaches for defining "systems." Int. J. Gen. Systems (IJGS), vol 15, pp. 75-80. (1988) On the modelling relation. Systems Research, vol 5 (2), pp. 131-44. (1988) Some problems of system descriptions I: Function, II: Information. Int. J. Gen. Systems 13:143-171. Minch, Eric (1988) Representations of Hierarchical Structures in Evolving Networks. PhD Dissertation, Dept. of Systems Science, SUNY-Binghamton. Morris, Charles (1956) Foundations of the Theory of Signs. In: Foundations in the Unity of Science, Vol. I, Neurath, Carnap, & Morris, eds, UChicago. Pattee, Howard H. (1968) The physical basis of coding and reliability in biological evolution. In: Towards a Theoretical Biology (TTB), Vol. 1, C.H. Waddington, ed., Aldine, Chicago. (1969) How does a molecule become a message? Dev Biol Supp 3: 1-16. (1972) Laws and constraints, symbols and languages. In: TTB, Vol. 4. (1973) Physical problems in the origin of natural controls. In: Biogenesis, Evolution, Homeostasis. Alfred Locker, ed., Pergamon Press, New York. (1973) Discrete and continuous processes in computers and brains. In: The Physics & Mathematics of the Nervous System, Guttinger & Conrad, eds, S-V. (1977) Dynamic and linguistic modes of complex systems. IJGS 3:259-266. (1979) The complementarity principle and the origin of macromolecular information. Biosystems 11: 217-226. (1982) Cell psychology: an evolutionary view of the symbol-matter problem. Cognition & Brain Theory 5:325-341. (1985) Universal principles of measurement and language functions in evolving systems. In: Complexity, Language, and Life, Casti & Karlqvist, eds, S-V. (1988) Instabilities and information in biological self-organization. In: Self-Organizing Systems: The Emergence of Order. E. Yates, ed., Plenum Press. (1989) Simulations, realizations, and theories of life. In: Artificial Life, C. Langton, ed., Addison-Wesley.
Rosen, Robert (1973) On the generation of metabolic novelties in evolution. In: Biogenesis, Evolution, Homeostasis. A. Locker, ed., Pergamon Press, New York. (1974) Biological systems as organizational paradigms. IJGS 1:165-174. (1978) Fundamentals of Measurement and Representation of Natural Systems. North Holland (N-H), New York. (1985) Anticipatory Systems. Pergamon Press, New York. (1986) Causal structures in brains and machines. IJGS 12: 107-126. (1987) On the scope of syntactics in mathematics and science: the machine metaphor. In: Real Brains, Artificial Minds. Casti & Karlqvist, eds, N-H. From pollack at cis.ohio-state.edu Fri Dec 8 10:16:36 1989 From: pollack at cis.ohio-state.edu (Jordan B Pollack) Date: Fri, 8 Dec 89 10:16:36 EST Subject: New CogSci Conf. Deadline Message-ID: <8912081516.AA13708@toto.cis.ohio-state.edu> I recently communicated with one of the organizers of this year's Cognitive Science Conference, who informed me that a new Call For Papers was going out soon with a revised deadline of March 15th. >> Date: Fri, 8 Dec 89 09:22:18 EST >> From: adelson%cs.tufts.edu at RELAY.CS.NET >> To: pollack at cis.ohio-state.edu >> Subject: Re: verification >> >> Jordan, >> We have moved the deadline to March 15 for this year's Cog Sci >> papers. >> >> Talk to you soon, >> B Holiday Cheers, Jordan From french at cogsci.indiana.edu Fri Dec 8 14:11:06 1989 From: french at cogsci.indiana.edu (Bob French) Date: Fri, 8 Dec 89 14:11:06 EST Subject: CogSci Deadline addendum Message-ID: There is a "no revisions" stipulation that goes along with the March 15 CogSci Conference paper submission deadline. Below is a message I received from Massimo Piattelli-Palmarini (Nov. 21): > We have received a number of such protests. In the next few days all > members will receive a new deadline communication. The new deadline > is March 15, but then there will be no revisions for the accepted > papers. And these must be submitted by that date as photo-ready. > I am sorry, but the organizers of last year's meeting had a lot > of difficulties in meeting the publisher's deadline and have the > Proceedings ready at the meeting. They could do it, I am told, > only because by sheer good luck the printer happened to be located > at Ann Arbor. With the new deadlines, which were decided upon only > three days ago, there is enough time to go to print. > Sorry for all inconvenience this may cause. Yours truly. > Massimo Piattelli-Palmarini. Bob French From B344DSL%UTARLG.ARL.UTEXAS.EDU at ricevm1.rice.edu Sun Dec 10 14:56:00 1989 From: B344DSL%UTARLG.ARL.UTEXAS.EDU at ricevm1.rice.edu (B344DSL%UTARLG.ARL.UTEXAS.EDU@ricevm1.rice.edu) Date: Sun, 10 Dec 89 13:56 CST Subject: No subject Message-ID: This message is an announcement of a forthcoming graduate textbook on neural networks by Daniel S. Levine. The title of the book is Introduction to Neural and Cognitive Modeling, and the publisher is Lawrence Erlbaum Associates, Inc. The book should be in production early in 1990, so it should, with luck, be ready by the start of the Fall, 1990 semester at universities. Chapters 2 to 7 will contain homework exercises. Some of the homework problems will involve computer simulations of models already in the literature. Others will involve thought experiments about whether a particular network can model a particular cognitive process, or how that network might be modified to do so. The table of contents follows. Please contact the author or publisher for further information. Author: Daniel S.
Levine Department of Mathematics University of Texas at Arlington Arlington, TX 76019-9408 817-273-3598 b344dsl at utarlg.bitnet Publisher: Lawrence Erlbaum Associates, Inc. 365 Broadway Hillsdale, NJ 07642 201-666-4110 Table of Contents: PREFACE CHAPTER 1: BRAIN AND MACHINE: THE SAME PRINCIPLES? What are Neural Networks? What are Some Principles of Neural Network Theory? Methodological Considerations CHAPTER 2: HISTORICAL OUTLINE 2.1 -- Digital Approaches The McCulloch-Pitts network Early Approaches to Modeling Learning: Hull and Hebb Rosenblatt's Perceptrons Some Experiments with Perceptrons The Divergence of Artificial Intelligence and Neural Modeling 2.2 -- Continuous and Random-net Approaches Rashevsky's Work Early Random Net Models Reconciling Randomness and Specificity 2.3 -- Definitions and Detailed Rules for Rosenblatt's Perceptrons CHAPTER 3: ASSOCIATIVE LEARNING AND SYNAPTIC PLASTICITY 3.1 -- Physiological Bases for Learning 3.2 -- Rules for Associative Learning Outstars and Other Early Models of Grossberg Anderson's Connection Matrices Kohonen's Work 3.3 -- Learning Rules Related to Changes in Node Activities Klopf's Hedonistic Neurons and the Sutton-Barto Learning Rule Error Correction and Back Propagation The Differential Hebbian Idea Gated Dipole Theory 3.4 -- Associative Learning of Patterns Kohonen's Recent Work: Autoassociation and Heteroassociation Kosko's Bidirectional Associative Memory 3.5 -- Equations and Some Physiological Details Neurophysiological Principles Equations for Grossberg's Outstar Kohonen's Early Equations (Example: Simulation of Face Recognition) Derivation of the Back Propagation Learning Law Due to Rumelhart, Hinton, and Williams Equations for Sutton and Barto's Learning Network Gated Dipole Equations Due to Grossberg Kosko's Bidirectional Associative Memory (BAM) Kohonen's Autoassociative Maps CHAPTER 4: COMPETITION, INHIBITION, SPATIAL CONTRAST ENHANCEMENT, AND SHORT-TERM MEMORY 4.1 -- Early Studies and General Themes (Contrast Enhancement, Competition, and Normalization) Nonrecurrent Versus Recurrent Lateral Inhibition 4.2 -- Lateral Inhibition and Excitation Between Sensory Representations Wilson and Cowan's Work Work of Grossberg and Colleagues Work of Amari and Colleagues Energy Functions in the Cohen-Grossberg and Hopfield-Tank Models The Implications of Approach to Equilibrium 4.3 -- Competition and Cooperation in Visual Pattern Recognition Models Visual Illusions Boundary Detection Versus Feature Detection Binocular and Stereoscopic Vision Comparison of Grossberg's and Marr's Approaches 4.4 -- Uses of Lateral Inhibition in Higher-level Processing 4.5 -- Equations for Various Competitive and Lateral Inhibition Models Equations of Sperling and Sondhi Equations of Wilson and Cowan Equations of Grossberg and his Co-workers: Analytical Results Equations of Hopfield and Tank Equations of Amari and Arbib CHAPTER 5: CONDITIONING, ATTENTION, REINFORCEMENT, AND COMPLEX ASSOCIATIVE LEARNING 5.1 -- Network Models of Classical Conditioning Early Work: Uttley and Brindley Rescorla and Wagner's Psychological Model Grossberg: Drive Representations and Synchronization Aversive Conditioning and Extinction Differential Hebbian Theory Versus Gated Dipole Theory 5.2 -- Attention and Short Term Memory in the Context of Conditioning Grossberg's Approach to Attention Sutton and Barto's Approach to Blocking Some Contrasts Between the Above Two Approaches Further Connections with Invertebrate Neurophysiology Gated Dipoles and Aversive Conditioning 5.3 --
Equations for Some Conditioning and Associative Learning Models Klopf's Drive-reinforcement Model Some Later Variations of the Sutton-Barto model: Temporal Difference The READ Circuit of Grossberg, Schmajuk, and Levine The Aplysia Model of Gingrich and Byrne CHAPTER 6: CODING AND CATEGORIZATION 6.1 -- Interactions Between Short and Long Term Memory in Code Development: Examples from Vision Malsburg's Model with Synaptic Conservation Grossberg's Model with Pattern Normalization Mathematical Results of Grossberg and Amari Feature Detection Models with Stochastic Elements From Feature Coding to Categorization 6.2 -- Supervised Classification Models The Back Propagation Network and its Variants Some Models from the Anderson-Cooper School 6.3 -- Unsupervised Classification Models The Rumelhart-Zipser Competitive Learning Algorithm Adaptive Resonance Theory Edelman and Neural Darwinism 6.4 -- Translation and Scale Invariance 6.5 -- Equations for Various Coding and Categorization Models Malsburg's and Grossberg's Development of Feature Detectors Some Implementation Issues for Back Propagation Equations Brain-state-in-a-box Equations Rumelhart and Zipser's Competitive Learning Equations Adaptive Resonance Equations CHAPTER 7: OPTIMIZATION, CONTROL, DECISION MAKING, AND KNOWLEDGE REPRESENTATION 7.1 -- Optimization and Control Hopfield, Tank, and the Traveling-Salesman Problem Simulated Annealing and Boltzmann Machines Motor Control: the Example of Eye Movements Motor Control: Arm Movements Speech Recognition and Synthesis Robotic Control 7.2 -- Decision Making and Knowledge Representation What, if Anything, do Biological Organisms Optimize? Affect, Habit, and Novelty in Neural Network Theories Neural Control Circuits, Neurochemical Modulation, and Mental Illness Some Comments on Models of Specific Brain Areas Knowledge Representation: Letters and Words Knowledge Representation: Concepts and Inference 7.3 -- Equations for a Few Neural Networks Performing Complex Tasks Hopfield and Tank's "Traveling Salesman" Network The Boltzmann Machine Grossberg and Kuperstein's Eye Movement Network VITE and Passive Update of Position (PUP) for Arm Movement Control Affective Balance and Decision Making Under Risk CHAPTER 8: A FEW RECENT ADVANCES IN NEUROCOMPUTING AND NEUROBIOLOGY 8.1 -- Some "Toy" and Real World Computing Applications 8.2 -- Some Biological Discoveries APPENDIX 1: BASIC FACTS OF NEUROBIOLOGY The Neuron Synapses, Transmitters, Messengers, and Modulators Invertebrate and Vertebrate Nervous Systems Functions of Subcortical Regions Functions of the Mammalian Cerebral Cortex APPENDIX 2: DIFFERENCE AND DIFFERENTIAL EQUATIONS IN NEURAL NETWORKS Example: the Sutton-Barto Difference Equations Differential Versus Difference Equations Outstar Equations: Network Interpretation and Numerical Implementation The Chain Rule and Back Propagation Dynamical Systems: Steady States, Limit Cycles, and Chaos From aboulang at WILMA.BBN.COM Mon Dec 11 15:40:49 1989 From: aboulang at WILMA.BBN.COM (aboulang@WILMA.BBN.COM) Date: Mon, 11 Dec 89 15:40:49 EST Subject: Chemical-wave reference mentioned in dynamics workshop at Keystone Message-ID: Several people wanted to know the reference to image processing via chemical-waves which seems to be akin to the kind of image processing capabilities of Pierre Baldi's oscillating networks (talk Thursday morning): "Image Processing Using Light-sensitive Chemical Waves" L. Kuhnert, K.I. Agladze, & V.I. Krinsky Nature Vol 337, 19 Jan. 1989, 244-247. 
Regards, Albert Boulanger BBN Systems & Technologies Corp. aboulanger at bbn.com From ST401843%BROWNVM.BITNET at VMA.CC.CMU.EDU Mon Dec 11 01:07:12 1989 From: ST401843%BROWNVM.BITNET at VMA.CC.CMU.EDU (thanasis kehagias) Date: Mon, 11 Dec 89 01:07:12 EST Subject: update of recurrent nets bibliography Message-ID: dear list manager: a while ago i passed you a connectionist bibliography of recurrent nets and you deposited it in your ftp directories. i am now sending you the updated version of the bibliography and a message to the list members. if this is fine with you, replace the bibliography and post the following message (adding the appropriate instructions for ftp-ing): (i hope this is not too much work for you - if it is, let me know and i will find other ways to distribute this .....) --------------------text of message to list members ------------- a while ago i had posted a preliminary version of a bibliography of recurrent nets. here is a more definitive version of it. it is expanded from its old form (now has somewhat over 100 items). it is not strictly recurrent nets. a better way to describe it is: how-to-capture-temporal-relationships-bibliography. some of the selections are static nets, but most are dynamic. i explain the distinction some more in a short blurb that comes with the bibliography. the blurb is in tex format (latex-able) and the bibliography is a bib-texable bibliographic database. so people who like this kind of thing can produce a pretty TeX document. if you do not want to mess with it, just ftp the files - they are usable as ascii files without any texing. the files are in the connectionists/cmu directory and here is how to access them: ************************************************************************** ----------(here you should add the ftp instructions) ----------- ************************************************************************** as i say in the tex document, this is a very informal bibliographic search. some information is missing and the bibliography is certainly incomplete. at this point i am not inclined to work much more on it, except if someone feels he has work that should be in it, i would certainly put it in. at this point i have already sent two messages to the members of the list, so i suppose anyone who has something has already contacted me. so here it is, enjoy. thanasis -------------------end of message text ------------------------------------ -------------------beginning of tex document ------------------------------- \documentstyle{article} \title{{\bf A Short Bibliography of Connectionist Systems \\ for Temporal Behavior}} \author{Athanasios Kehagias \\ Division of Applied Mathematics \\ Brown University \\ Providence, RI 02912} \date{December 12, 1989} \begin{document} \normalsize \bibliography{recmes2} \bibliographystyle{alpha} \maketitle % This is a bibliography of work in Connectionism (and related fields) that tries to capture temporal relationships. It is not intended as a complete bibliography. I include work I came across after a reasonable but not exhaustive search and the connection to the problem of representing temporal relationships is seen from my personal perspective. So, if you think something is omitted, misrepresented etc., send me your suggestions. A large part of the current work in connectionism is concerned with the design of {\bf static} networks - meaning that the output of the network at time $t+1$ is more or less not influenced by the output at time $t$.
Nevertheless, such static architectures have been used to capture temporal relationships. For example consider the work in predicting chaotic time series: \cite{kn:Moo88a} \cite{kn:Lap87a} \cite{kn:Far88a} . Another case of implicitly {\bf dynamical} networks are Hopfield type networks and Boltzmann machines. In both cases we have networks where the state of a neuron at time $t+1$ depends on the input from other neuron states at time $t$ . However, even though such networks do have dynamic behavior, the goal is that they settle down at a steady state useful for storing patterns. So in the long run we are more interested in static behavior. There is an extensive amount of work on Hopfield networks. Hopfield's seminal papers are: \cite{kn:Hop82a}, \cite{kn:Hop84a}, \cite{kn:Hop85a}. Similarly there is a lot of work on Boltzmann machines; here is a small sample: \cite{kn:Ack85a}, \cite{kn:Pet87a}, \cite{kn:Pet88a}, \cite{kn:Pet89a}, \cite{kn:Pet89b}, \cite{kn:Sus88a}. Moving on to certain application oriented work where temporal behavior is important, we see a mixed approach, sometimes using static network solutions, sometimes using dynamic network solutions. For instance in control/robotics, following a trajectory in state-space is an essentially dynamic problem. Certain connectionist approaches are: \cite{kn:Bar89a}, \cite{kn:Bar89b}, \cite{kn:Jor88a}, \cite{kn:Pea89a}, \cite{kn:Eck89a}. Similarly, in speech recognition time is of great importance. Some researchers have used a static approach (e.g. the time-delay neural network, \cite{kn:Wai89a} , or Boltzmann machines \cite{kn:Pra86a} ). But there is also a lot of work where dynamic (recurrent) neural networks are used: \cite{kn:Bou88a}, \cite{kn:Bou89a}, \cite{kn:Elm88a}, \cite{kn:Gra89a}, \cite{kn:Koh89a}, \cite{kn:Kuh90a}, \cite{kn:Rob88a}, \cite{kn:Rob89a}, \cite{kn:Wat87a}, \cite{kn:Wat87b}. Another area of practical problems being attacked by recurrent neural architectures is tasks related to formal or natural language, e.g. \cite{kn:Cot85a},\cite{kn:San89a},\cite{kn:Cha87a}, \cite{kn:Fan85a}, \cite{kn:Han87a}, \cite{kn:Mik89a}, \cite{kn:Mii88a}, \cite{kn:Cle90a}, \cite{kn:Ser88a}, \cite{kn:All89a}, \cite{kn:All89b}, \cite{kn:All88a}, \cite{kn:Rie88a}, \cite{kn:Sej86a}, \cite{kn:StJ85a}, \cite{kn:Whe89a}. Finally, the bulk of the references in this bibliography refer to more or less theoretical treatments of the problem of dynamical (recurrent) neural networks. I cite first a very small sample of Grossberg's work: \cite{kn:Gro86a}, \cite{kn:Gro86b}. This ART-type work by Grossberg , Carpenter and collaborators is very interesting and there is a very voluminous literature; references to more of this type work are contained in the books listed above. 
I now cite a large number of references which relate to neural networks mostly as dynamical systems in $R^n$ (as opposed to dynamical systems taking Boolean values) : \cite{kn:Alm87a}, \cite{kn:Alm88a}, \cite{kn:Alm89a}, \cite{kn:Alm89b}, \cite{kn:Alm90a}, \cite{kn:Bab87a}, \cite{kn:Bar85a}, \cite{kn:Bar81a}, \cite{kn:Bel88a}, \cite{kn:Deh87a}, \cite{kn:Elm88b}, \cite{kn:Elm89b}, \cite{kn:Gal88a}, \cite{kn:Gol86a}, \cite{kn:Gol88a}, \cite{kn:Gut88a}, \cite{kn:Guy82a}, \cite{kn:Kur86a}, \cite{kn:Mar89a}, \cite{kn:Moz89a}, \cite{kn:Now88a}, \cite{kn:Ott89a}, \cite{kn:Pin87a}, \cite{kn:Pin88a}, \cite{kn:Pol87a}, \cite{kn:Pol88a}, \cite{kn:Ren90a}, \cite{kn:Ren90b}, \cite{kn:Roh87a}, \cite{kn:Roh90a}, \cite{kn:Ried88a}, \cite{kn:Sch88a}, \cite{kn:Sch89a}, \cite{kn:Sim88a}, \cite{kn:Som88a}, \cite{kn:Sto}, \cite{kn:Sun}, \cite{kn:Sut88a}, \cite{kn:Wil88a}. Finally here are some refernces to work that considers dynamical neural networks as dynamical systems with a Boolean state vector (or in other words sequential machines): \cite{kn:Ale89a}, \cite{kn:Ama71a}, \cite{kn:Ama72a}, \cite{kn:Ama83a}, \cite{kn:Cai70a}, \cite{kn:Cai75a}, \cite{kn:Cai76a}, \cite{kn:Cai86a}, \cite{kn:Cai89a}, \cite{kn:Jor86a}, \cite{kn:Mar89b}, \cite{kn:Mar87a}, \cite{kn:Mcc43a}, \cite{kn:Par89a}, \cite{kn:Sch}, \cite{kn:Raj}, \cite{kn:Tsu89a}, \cite{kn:Cot89a}, \cite{kn:All89c}, \cite{kn:All89d}, \cite{kn:Sun89a}, \cite{kn:Sun89b}, \cite{kn:Roz69a}, \cite{kn:Par88a}. This is a rather fine and arbitrary classification , but work which is often characterized as research in cellular automata is relevant to neural netorks issues. Here are some samples: \cite{kn:Ale73a}, \cite{kn:Kau69a}, \cite{kn:Fog82a}, \cite{kn:Fog83a}, \cite{kn:Fog85a}, \cite{kn:Slo67a}. This is the material I was able to collect; there is a whole lot more, sometimes refrred to in the references of the above works. Any suggestions for improvement are welcome. \end{document} ---------------------end of tex document -------------------------- ---------------------beginning of bibliographic database----------- @BOOK{KN:GRO86A, author ="S. Grossberg", title ="The Adaptive brain:I Learning, reinforcement, motivation and rhythm", YEAR ="1986", PUBLISHER="?" } @book{kn:Gro86b, author ="S. Grossberg", title ="The Adaptive brain:II Vision, Speech, Language and Motor Control", YEAR ="1986", PUBLISHER="?" } @ARTICLE{kn:Hop85a, AUTHOR= "J.J.Hopfield and D.W.Tank", TITLE= "Neural Computation of Decisions in Optimization Problems", JOURNAL= "Biol. Cyb.", YEAR= "1985", VOLUME= "52" } @ARTICLE{kn:Hop82a, AUTHOR= "J.J. Hopfield", TITLE= "Neural Nets and Physical Systems with Emergent Collective Computational Properties", JOURNAL= "Proc. Nat'l Acad. Sci. USA", YEAR= "1982", VOLUME= "?" } @ARTICLE{kn:Hop84a, AUTHOR= "J.J. Hopfield", TITLE= "Neurons with Graded Response have Collective Computational Properties like those of Two-State Neurons", JOURNAL= "Proc. Nat'l Acad. Sci. USA", YEAR= "1984", VOLUME= "81" } @ARTICLE{kn:Ack85a, AUTHOR= "D.H. Ackley and others", TITLE= "A Learning Algorithm for Boltzmann Machines", JOURNAL= "Cognitive Science", YEAR= "1985", VOLUME= "9" } @ARTICLE{kn:Pet87a, AUTHOR= "C. Peterson and J.R. Anderson", TITLE= "A Mean Field Theory Learning Algorithm for Neural Nets", JOURNAL= "Complex Systems", YEAR= "1987", VOLUME= "1" } @ARTICLE{kn:Pet89a, AUTHOR= "C. Peterson and E. Hartman", TITLE= "Explorations of the Mean Field Theory Learning Algorithm", JOURNAL= "Neural Networks", YEAR= "1989", VOLUME= "?" } @ARTICLE{KN:Pet88a, AUTHOR= "C. 
Peterson and J.R. Anderson", TITLE= "Neural Networks and NP-complete Optimization Problems; A Performance Study on the Graph Bisection Problem", JOURNAL= "Complex Systems", YEAR= "1988", VOLUME= "2" } @ARTICLE{kn:Pet89b, AUTHOR= "C. Peterson and B. Soderberg", TITLE= "A New Method for Mapping Optimization Problems onto Neural Networks", JOURNAL= "Int. J. of Neural Systems", YEAR= "1989", VOLUME= "1" } @techreport{kn:Sus88a, author ="H.J. Sussman", title ="On the Convergence of Learning Algorithms for Boltzmann Machines", number ="sycon-88-03", institution ="Rutgers Center for Systems and Control", year ="1988" } @techreport{kn:Bar89a, AUTHOR= "A.G. Barto and R.S. Sutton", TITLE= "Learning and Sequential Decision Making", INSTITUTION= "COINS Dept., Amherst Un. ", YEAR= "1989", Number= "TR 89-95" } @ARTICLE{kn:Jor88a, AUTHOR= "M.I. Jordan", TITLE= "Supervised Learning and Systems with Excess Degrees of Freedom", INSTITUTION= "COINS Dept., Amherst Un. ", YEAR= "1988", number= "TR 88-27" } @ARTICLE{kn:Bar89b, AUTHOR= "A.G. Barto", TITLE= "Connectionist Learning for Control:An Overview", INSTITUTION= "COINS Dept., Amherst Un. ", YEAR= "1989", number= "TR 89-89" } @inproceedings{kn:Pea89a, AUTHOR= "B.A. Pearlmutter", TITLE= "Learning State Space Trajectories in Recurrent Neural Nets", booktitle= "IJCNN", YEAR= "1989", organization= "IEEE" } @article{kn:Eck89a, author ="R. Eckmiller", title ="Generation of Movement Trajectories in Primates and Robots", journal ="Neural Computing Architectures, I. Aleksander ed.", year ="MIT , 1989" } @inproceedings{kn:Moo88a, title ="Learning with Localized Receptor Fields", booktitle ="Connectionist Models Summer School ", author ="J. Moody and C. Darken", year ="1988", volume ="?", organization ="Carnegie Mellon University" } @techreport{kn:Lap87a, author ="A. Lapedes and R. Farber", title ="Nonlinear Signal Processing using Neural Networks: Prediction and System Modelling", number ="LA UR 87-?", institution ="Los Alamos National Lab", year ="1987" } @techreport{kn:Far88a, author ="J.D. Farmer and J.J. Sidorowich", title ="Exploiting Chaos to Predict the Future and Reduce Noise", number ="LA UR 88-901", institution ="Los Alamos National Lab", year ="1988" } @inproceedings{kn:And88a, author ="S. Anderson", title ="Dynamic System Categorization with Recurrent Networks", booktitle ="Connectionist Models Summer School", year ="1988" } @ARTICLE{kn:Bou88a, AUTHOR= "H. Bourlard and C.J. Wellekens", TITLE= "Links between Markov Models and Multilayer Perceptrons", organization="Phillips Research Lab", YEAR= "1988", techreport= "M 263" } @inproceedings{kn:Bou89a, AUTHOR= "H. Bourlard and C.J. Wellekens", TITLE= "Speech Dynamics and Recurrent Neural Nets", booktitle= "ICASSP", organization="IEEE", YEAR= "1989", VOLUME= "?" } @article{kn:Elm88a, author ="J. L. Elman and D. Zipser", title ="Learning the Hidden Structure of Speech", journal ="Journal of the Acoustical Society of America", year ="1988", volume ="83" } @inproceedings{kn:Gra89a, AUTHOR= "K.A. Grajski and others ", TITLE= "A Preliminary Note on Training Static and Recurrent Neural Nets for Word-level Speech Recognition", booktitle= "IJCNN", YEAR= "1989", VOLUME= "?" } @incollection{kn:Koh89a, author ="T. Kohonen", editor ="I. Aleksander", title ="Speech Recognition based on Topology Preserving Maps", booktitle ="Neural Computing Architectures", year ="1989", publisher ="MIT" } @article{kn:Kuh90a, title ="Connected Recognition with a Recurrent Network", author ="G. 
Kuhn", journal ="Speech Communication", year ="1990", volume ="9", number ="2", pages ="?", } @inproceedings{kn:Pin88a, author= "F. Pineda", title = "Generalization of Backpropagation to Recurrent and Higher Order Neural Networks", booktitle= "Neural Information Processing Systems", editor= "D. Anderson", year = "1988" } @article{kn:Pra86a, title ="Boltzmann Machines for Speech Recognition", author ="R. Prager and others", journal ="Computer , Speech and Language", year ="1986", volume ="1", number ="?", pages ="1-20", } @phdthesis{kn:Rob89a, author = "Anthony J. Robinson", title ="Dynamic Error Propagation Networks", school ="Cambridge University Engineering Department", address ="Cambridge, England", year ="1989" } @techreport{kn:Rob88a, author ="Anthony J. Robinson", title ="A Dynamic Connectionist Model for Phoneme Recognition", organization="Cambridge University Engineering Department", address ="Cambridge, England", year ="1988" } @inproceedings{kn:Wat87a, key = "Watrous" , author = "Watrous, R.L. and Shastri, L." , title = "Learning Phonetic Features Using Connectionist Networks: An Experiment in Speech Recognition" , booktitle= "Int. Conf. on Neural Networks" , organization= "IEEE", month = "June" , year = "1987" } @inproceedings{kn:Wat87b, author = "R.L. Watrous and others" , title = "Learned Phonetic Discrimination Using Connectionist Networks" , booktitle= "European Conference on Speech Technology" , month = "September" , address = "Edinburgh" , year = "1987" , pages = "377-380" } @phdthesis{kn:Wat88a, key = "Watrous" , author = "Watrous, R." , title = "Speech Recognition Using Connectionist Networks" , school = "University of Pennsylvania" , month = "October" , year = "1988" } @article{kn:Wai89a, title = "Phoneme Recognition Using Time-Delay Neural Networks" , author = "A. Waibel and others " , year = "1989" , journal = "IEEE, Trans. Acoustics, Speech and Signal Processing", month = "March" } @INCOLLECTION{KN:Ale89a, author ="I. Aleksander", editor ="I. Aleksander", title ="The Logic of Connectionist Systems", booktitle ="Neural Computing Architectures", year ="1989", publisher ="MIT" } @ARTICLE{KN:Ale73a, title ="Cycle Activity in Nature: Causes of Stability", author ="I. Aleksander and P. Atlas", journal ="Int. J. of Neuroscience", year ="1973", volume ="6", number ="?", pages ="45-50", } @INPROCEEDINGS{KN:ALL89C, AUTHOR ="Allen, R.B. and Kauffman, S.M.", year ="1989", title ="Developing agent models with a neural reinforcement technique", ORGANIZATION="IEE", booktitle= "Conf. on Artificial Neural Networks" } @INPROCEEDINGS{KN:ALL89D, AUTHOR ="Allen, R.B.", year ="1989", title ="Sequence Generation with Connectionist State Machines", booktitle= "IJCNN" } @article{kn:Ama71a, title ="Characteristics of Randomly Connected Threshold Elements and Network Systems", author ="S.I. Amari", journal ="Proc. of the IEEE", year ="1971", volume ="39", number ="?", pages ="33-47", } @article{kn:Ama72a, title ="Learning Patterns and Pattern Sequences by Self-Organizing Nets of Threshold Elements", author ="S.I. Amari", journal ="IEEE Trans. on Computers", year ="1972", volume ="21", number ="?", pages ="1197-1206" } @article{kn:Ama83a, title ="Field Theory of Self-Organizing Neural Nets", author ="S.I. Amari", journal ="IEEE Trans. on Systems , Man and Cybernetics", year ="1983", volume ="?", number ="?", pages ="741-748", } @incollection{kn:Cai89a, author ="E. R. Caianello", editor ="I. 
Aleksander", title ="A Theory of Neural Networks", booktitle ="Neural Computing Architectures", year ="1989", publisher ="MIT" } @article{kn:Cai75a, title ="Synthesis of Boolean Nets and Time-Behavior of a General Mathematical Neuron", author ="E. Caianello and E. Grimson", journal ="Biol. Cyb.", year ="1975", volume ="18", number ="?", pages ="111-117", } @article{kn:Cai86a, title ="Linearization and Synthesis of Cellular Automata. The Additive Case", author ="E. Caianello and M. Marinaro", journal ="Physica Scripta", year ="1986", volume ="34", number ="?", pages ="444", } @article{kn:Cai76a, title ="Methods of Analysis of Neural Nets", author ="E. Caianello and E. Grimson", journal ="Biol. Cyb.", year ="1976", volume ="21", number ="?", pages ="1-6", } @article{kn:Cai70a, title ="Reverberations and Control of Neural Networks", author ="E. Caianello", journal ="Kybernetik", year ="1970", volume ="7", number ="5", pages ="191", } @article{kn:Fog82a, title ="Specific Roles of the Different Boolean Mappings in Random Networks", author ="F. Fogelman-Soulie and others", journal ="Bull. of Math. Biol.", year ="1982", volume ="44", number ="5", pages ="715-730", } @article{kn:Fog85a, title ="Frustration and Stability in Random Boolean Networks", author ="F. Fogelman-Soulie", journal ="Discrete Applied Math.", year ="?", volume ="?", number ="?", pages ="?", } @article{kn:Fog83a, title ="Transient Length in Sequential Iterations of Threshold Functions ", author ="F. Fogelman-Soulie and Others", journal ="Discrete Applied Math.", year ="1983", volume ="6", number ="?", pages ="95-98", } @article{kn:Kau69a, title ="Metabolic Stability and Epigenesis in Randomly Constructed Genetic Nets", author ="S.A. Kauffman", journal ="J. Theoret. Biology", year ="1969", volume ="22", number ="?", pages ="437-467", } @incollection{kn:Mar89a, author ="D. Martland", editor ="I. Aleksander", title ="Dynamic Behavior of Boolean Networks", booktitle="Neural Computing Architectures", year ="1989", publisher="MIT" } @inproceedings{kn:Mar87a, title ="Behavior of Autonomous (Synchronous) Boolean Networks", booktitle ="1st Int. Conf. on Neural Networks ", author ="D. Martland", year ="1987", volume ="II", organization ="IEEE" } @article{kn:Mcc43a, title ="A Logical Calculus of the Ideas Immanent in Nervous Activity ", author ="W.S. McCulloch and W. Pitts ", journal ="Bull. Math. Biophysics", year ="1943", volume ="5", pages ="115-143", } @ARTICLE{kn:Par89a, AUTHOR= "I. Parberry", TITLE= "Relating Boltzmann Machines to Conventional Models of Computation", JOURNAL= "Neural Networks", YEAR= "1989", VOLUME= "2" } @ARTICLE{kn:Roz69a, AUTHOR= "L.I. Rozonoer", TITLE= "Random Logical Nets, I-III (in Russian)", JOURNAL= "Avtomatika i Telemekhanika", YEAR= "1969", VOLUME= "5" } @INPROCEEDINGS{KN:Sun89a, AUTHOR ="R. Sun", year ="1989", title ="A Discrete Neural Network Model for Conceptual Representation and Reasoning", booktitle= "11th Ann. Cog. Sci. Soc. Conf." } @TECHREPORT{KN:SUN89b, AUTHOR ="R. Sun", TITLE ="The Discrete Neuronal Model and the Probabilistic Discrete Neuronal Model", NUMBER ="?", INSTITUTION ="Computer Sc. Dept., Brandeis Un.", year ="1989" } @ARTICLE{kn:Sch, AUTHOR ="R.E. Schneider", TITLE ="The Neuron as a Sequential Mahine", JOURNAL ="?", YEAR ="?", VOLUME ="?" } @ARTICLE{kn:Raj, AUTHOR= "V. Rajlich", TITLE= "Dynamics of certain Discrete Systems and Self Reproduction of Patterns", JOURNAL= "?", YEAR= "?", VOLUME= "?" 
} @inproceedings{kn:Tsu89a, author ="Tsung, Fu-Sheng and Cottrell, G.", title ="A sequential adder using recurrent neural networks", year ="1989", booktitle ="IJCNN", } @inproceedings{kn:Cot89a, author ="Cottrell, G. and Tsung, Fu-Sheng", title ="Learning simple arithmetic procedures", booktitle ="11th Annual Conf. of Cog. Sci. Soc.", year ="1989" } @INCOLLECTION{KN:ALL89A , author ="Allen, R.B. and Riecken, M.E.", year ="1989", title ="Reference in connectionist language users", booktitle ="Connectionism in Perspective", editor ="R. Pfeifer and others", publisher ="Elsevier", pages ="301-308" } @phdthesis{kn:Rie88a, author ="Riecken, M.E.", year ="1988", title ="Neural networks in natural language processing and distributed artificial intelligence", school ="University of New Mexico" } @inproceedings{kn:All89b, author ="Allen, R.B.", year ="1989", title ="Developing agent models with a neural reinforcement technique", organization="IEEE", booktitle= "Systems Man and Cybernetics Conference", } @inproceedings{kn:All88a, author ="Allen, R.B.", year ="1988", title ="Sequential connectionist networks for answering simple questions about a microworld", booktitle ="10th Ann. Cog. Sci. Soc. Conf.", pages ="489-495" } @inproceedings{kn:Cot85a, title ="Connectionist Parsing", author ="G.W. Cottrell", BOOKTITLE ="7th Ann. Conf. of Cog. Sci. Soc. ", year ="1985" } @incollection{kn:San89a, author ="E. {Santos Jr.}", title ="A Massively Parallel Self-Tuning Context-free Parser", booktitle ="Adv. in Neural Information Processing Systems", year ="1989", publisher ="Morgan Kauffman" } @inproceedings{kn:Cha87a, title ="A Connectionist Context-free Parser which is not Context- Free but then it is not really Connectionist either", author ="E. Charniak and E. Santos", BOOKTITLE ="9th Annual Conference of Cog. Sci. Soc. ", year ="1987" } @techreport{kn:Fan85a, author ="M. Fanty", title ="Context-Free Parsing in Connectionist Networks", number ="TR 174", institution ="Un. of Rochester, Computer Sc. Dept.", year ="1985" } @inproceedings{kn:Han87a, title ="PARSNIP:A connectionist Network that learns Natural Language from Exposure to Natural Language Sentences", author ="S. Hanson and J. Kegl", BOOKTITLE ="9th Annual Conference of Cog. Sci. Soc.", year ="1987" } @techreport{kn:Mik89a, author ="R. Miikkulainen and M.G. Dyer", title ="A Modular Neural Network Architecture for Sequential Paraphrasing of Script-Based Stories", NUMBER ="TR UCLA-AI-89-02", INSTITUTION ="Un. of California at Los Angeles" , year ="1989" } @inproceedings{kn:Mii88a, author ="R. Miikkulainen and M.G. Dyer", title ="Encoding Input/Output Representations in Connectionist Cognitive Systems", booktitle ="Connectionist Models Summer School", year ="1988" } @article{kn:Cle90a, author ="A. Cleeremans and others", year ="1990", title ="Finite state automata and simple recurrent networks", journal ="Neural Computation", volume ="1" } @techreport{kn:Ser88a, AUTHOR ="D. Servan-Schreiber and others", year ="1988", title ="Encoding sequential structure in simple recurrent networks", number ="CMU-CS-88-183", institution ="School of Computer Science, Carnegie Mellon Un.", } @techreport{kn:Sej86a, author ="T.J. Sejnowski and C. Rosenberg", title ="NETtalk:A Parallel Network that Learns to Read Aloud", number ="JHU EECS 86-01", institution ="John Hopkins University", year ="1986" } @techreport{kn:StJ85a, author ="M. StJohn and J.L. McLelland", title ="Learning and Applying Contextual Constraints in Sentence Comprehension", number ="?", institution ="Dept. 
of Psychology, Carnegie Mellon Un.", year ="1985" } @techreport{kn:Whe89a, author = "D.W. Wheeler and D.S. Touretzky", title = "A Connectionist Implementation of Cognitive Phonology", year = "1989" , institution = "CS Dept., Carnegie Mellon University", number = "CMU-CS-89-144" } @inproceedings{kn:Alm87a, title ="A Learning Rule for Asynchronous Perceptrons with Feedback in a Combinatorial Environment", author ="L.B. Almeida", booktitle ="1st Int. Conf. on Neural Networks, S. Diego ", organization ="IEEE", year ="1987", } @incollection{kn:Alm89a, title ="Backpropagation in Nonfeedforward Networks", author ="L.B. Almeida", booktitle ="Neural Computing Architectures", editor ="I. Aleksander", publisher ="MIT Press", year ="1989", } @incollection{kn:Alm88a, title ="Backpropagation in Perceptrons with Feedback", author ="L.B. Almeida", booktitle ="Neural Computers", editor ="R. Eckmiller and C.v.d. Malsburg", publisher ="Springer", year ="1988", } @incollection{kn:Alm89b, title ="Recurrent Backpropagation and Hopfield Networks", author ="L.B. Almeida and J.P. Neto", booktitle ="Neuro Computing, Algorithms Architectures and Applications", editor ="F. Fogelman-Soulie ", PUBLISHER ="Springer", chapter ="?", year ="1989", } @incollection{kn:Alm90a, title ="Backpropagation in Feedforward and Recurrent Networks", author ="L.B. Almeida", booktitle ="?", editor ="B. Shriver", chapter ="submitted to", publisher ="IEEE Press", year ="1990", } @article{kn:Bab87a, title ="Dynamics of Simple Electronic Neural Networks", author ="K.L. Babcock and R.M. Westervelt", journal ="Physica", year ="1987", volume ="28D", number ="?", pages ="305", } @article{kn:Bar85a, author ="A.G. Barto and P. Anandan", title ="Pattern Recognizing Stochastic Learning Automata", journal ="IEEE Trans. on Systems , Man and Cybernetics", volume ="SMC 15", Number ="3", year ="June 1985" } @article{kn:Bar81a, author ="A.G. Barto and others", title ="Associative Search Network:Reinforcement Learning Associative Memory", journal ="Biol. Cyb. 40", year ="1981" } @inproceedings{kn:Bel88a, author ="T. Bell", title ="Sequential Processing using Attractor Transitions", booktitle ="Connectionist Models Summer School", year ="1988" } @article{kn:Deh87a, title ="Neural Networks that learn Temporal Sequences by Selection ", pages ="2727", author ="S. Dehaene and others", year ="1987 ", volume ="84", journal ="Proc. of Nat'l Acad. Sci. USA" } @inproceedings{kn:Elm89b, author ="J. L. Elman", title ="Representation and structure in connectionist models", BOOKTITLE ="11th Ann. Conf. Cog. Sci. Soc.", year ="1989" } @TECHREPORT{KN:ELM88B, author ="J. L. Elman", title ="Finding structure in time", institution="Center for Research in Language, University of California at San Diego", number ="CRL TR 8801", year ="1988" } @inproceedings{kn:Gal88a, author =" Gallant, S. I. and King, D. J", title ="Experiments with Sequential Associative Memories", booktitle ="10th Ann. Conf. of Cog. Sci. Soc.", year ="1988", page ="40-47", } @ARTICLE{kn:Gol86a, AUTHOR= "R.M. Golden", TITLE= "The Brain-State-in-a-Box Neural Model is a Gradient Descent Algorithm", JOURNAL= "Journal of Mathematical Psychology", YEAR= "1986", VOLUME= "30" , NUMBER= "1" } @ARTICLE{kn:Gol88a, AUTHOR= "R.M. Golden", TITLE= "A Unified Framework for Connectionist Systems", JOURNAL= "Biol. Cyb.", YEAR= "1988", VOLUME= "59" } @article{kn:Gut88a, title ="Processing of Temporal Sequences in Neural Networks", author ="H. Gutfreund and M. Mezard", journal ="Phys. Rev. 
Let.", year ="1988", volume ="61", number ="?", pages ="235", } @article{kn:Guy82a, title ="Storage and Retrieval of Complex sequences in Neural Networks", author ="I. Guyon and others", journal ="Phys. Rev. A", year ="1982", volume ="38", number ="?", pages ="6365", } @inproceedings{kn:Jor86a, title ="Attractor Dynamics and Parallelism in a Connectionist Sequential Machine", author ="M.I. Jordan", BOOKTITLE ="8th Ann. Conf. of Cog. Sci. Soc." , year ="1986" } @article{kn:Kur86a, title ="Chaos in Neural Systems", author ="K.E. Kurten and J.W. Clark", journal ="Phys. Lett.", year ="1986", volume ="114A", number ="?", pages ="413", } @incollection{kn:Mar89b, title ="Dynamics of Analog Neural Networks with Time Delay", booktitle ="Advances in Neural Information Processing Systems I", author ="C.M. Marcus and R.M. Westervelt", editor ="D. Touretzky", publisher ="Morgan Kauffman", year ="1989", } @ARTICLE{kn:Moz89a, AUTHOR ="M. C. Mozer", TITLE ="A Focused Back Propagation Algorithm for Temporal Pattern Recognition", JOURNAL ="Complex Systems", YEAR ="1989 or 1990 (accepted for publication)", VOLUME ="??" } @ARTICLE{KN:Now88a, AUTHOR ="S.J. Nowlan", TITLE ="Gain Variation in Recurrent Error Propagation Networks", JOURNAL ="Complex Systems", YEAR ="1988", VOLUME= "2" } @incollection{kn:Ott89a, title ="Fixed Point Analysis for Recurrent Neural Networks", booktitle ="Advances in Neural Information Processing Systems I", author ="M.B. Ottaway", editor ="D. Touretzky", publisher ="Morgan Kauffman", year ="1989", } @inproceedings{kn:Par88a, author = "K. Park", title = "Sequential Learning: Observations on the Internal Code Generation Problem", booktitle= "Connectionist Models Summer School", year = "1988" } @ARTICLE{kn:Pin87a, AUTHOR= "F.J. Pineda", TITLE= "Generalization of Back Propagation to Recurrent Neural Nets", JOURNAL= "Physical Review Letters" , YEAR= "1987", VOLUME= "59" } @ARTICLE{kn:Pin88b, AUTHOR= "F.J. Pineda", TITLE= "Dynamics and Architecture for Neural Computation", JOURNAL= "Journal of Complexity", YEAR= "1988", VOLUME= "4" , pages= "216" } @inproceedings{kn:Pol87a, title ="Cascaded Back Propagation on Dynamic Connectionist Networks", author ="J. B. Pollack", booktitle ="9th Ann. Conf. of the Cog. Sci. Soc. ", year ="1987" , page ="391-404", } @inproceedings{kn:Pol88a, title ="Recursive Auto-Associative Memory: Devising Compositional Distributed Representations", author ="J. B. Pollack", booktitle ="10th Ann. Conf. of the Cog. Sci. Soc. ", year ="1988", page ="33-39" } @inproceedings{kn:Ren90a, title ="Chaos in Neural Networks ", booktitle ="EURASIP ", author ="S. Renals", year ="1990", volume ="?", organization ="?" } @inproceedings{kn:Roh90a, title ="The Moving Targets Training Algorithm", booktitle ="EURASIP ", author ="R. Rohwer", year ="1990", volume ="?", organization ="?" } @inproceedings{kn:Roh87a, title ="Training Time-Dependence in Neural Networks", booktitle ="1st Int. Conf. on Neural Networks", author ="R. Rohwer", year ="1987", volume ="?", organization ="IEEE" } @article{kn:Ren90b, title ="A Study of Network Dynamics", author ="S. Renals and R. Rohwer", journal ="J. Stat. Phys.", year ="1990", volume ="In Press", number ="?", pages ="?", } @ARTICLE{KN:RIED88A, title ="Temporal Sequences and Chaos in Neural Networks", author ="Riedel and others", journal ="Phys. Rev. A", year ="1988", volume ="36", number ="?", pages ="1428", } @inproceedings{kn:Sch89a, title ="Networks Adjusting Networks", author ="J. 
Schmidhuber", booktitle ="Distributed Adaptive Neural Infornmation Processing", year ="1989", } @inproceedings{kn:Sch88a, title ="The Neural Bucket Brigade", author ="J. Schmidhuber", booktitle ="International Conference on Connectionism in Perspective", year ="1988", } @article{kn:Sim88a, author ="P.Y. Simard and others", title ="Analysis of Recurrent Back-Propagation", journal ="Proc. of the Connectionist Models Summer Schools", year ="1988" } @techreport{kn:Slo67a, author ="N.J. Sloane", title ="Lengths of Cycle Times in Random Neural Networks", number ="10", institution ="Cornell Un., Cognitive Systems Research Program", year ="1967" } @ARTICLE{KN:SOM88A, title ="Chaos in Random Neural Networks", author ="Sompolinsky and others", journal ="Phys. Rev. Lett.", year ="1988", volume ="61", number ="?", pages ="259", } @ARTICLE{KN:STO, AUTHOR= "W.S. Stornetta", TITLE= "A Dynamical Approach to Temporal Pattern Processing", JOURNAL= "?", YEAR= "?", VOLUME= "?" } @ARTICLE{kn:Sun, AUTHOR= "G.Z. Sun", TITLE= "A Recurrent Network that learns Context Free Grammars", JOURNAL= "?", YEAR= "?", VOLUME= "?" } @article{kn:Sut88a, title ="Learning to Predict by the Methods of Temporal Difference", author ="R.S. Sutton", journal ="Machine Learning", year ="1988", volume ="3", number ="?", pages ="9-44", } @techreport{kn:Wil88a, author ="R.J. Williams and D. Zipser", title ="A Learning Algorithm for Continually Running Fully Connected Recurrent Neural Networks", number ="ICS-8805", institution ="Un. of California at San Diego", year ="1988" } ------------end of bibliographic database and of message ------- From Connectionists-Request at CS.CMU.EDU Tue Dec 12 17:10:48 1989 From: Connectionists-Request at CS.CMU.EDU (Connectionists-Request@CS.CMU.EDU) Date: Tue, 12 Dec 89 17:10:48 EST Subject: update of recurrent nets bibliography Message-ID: <23529.629503848@B.GP.CS.CMU.EDU> Sorry about sending out Thanasis' whole bibliography. The updated version of the how-to-capture-temporal-relationships-bibliography is still called recurrent.bib. How to FTP Files from the CONNECTIONISTS Archive ------------------------------------------------ 1. Open an FTP connection to host B.GP.CS.CMU.EDU (Internet address 128.2.242.8). 2. Login as user anonymous with password your username. 3. 'cd' directly to one of the following directories: /usr/connect/connectionists/archives /usr/connect/connectionists/bibliographies 4. The archives and bibliographies directories are the ONLY ones you can access. You can't even find out whether any other directories exist. If you are using the 'cd' command you must cd DIRECTLY into one of these two directories. Access will be denied to any others, including their parent directory. 5. The archives subdirectory contains back issues of the mailing list. Some bibliographies are in the bibliographies subdirectory. Problems? - contact me at "Connectionists-Request at cs.cmu.edu". Happy Browsing Scott Crowder Connectionists-Request at cs.cmu.edu From pollack at cis.ohio-state.edu Wed Dec 13 12:11:43 1989 From: pollack at cis.ohio-state.edu (Jordan B Pollack) Date: Wed, 13 Dec 89 12:11:43 EST Subject: CogSci Meeting Message-ID: <8912131711.AA01095@wizard.cis.ohio-state.edu> I just received the updated call for papers: New Deadline: March 15th, 1990 Submit 4 photo-ready copies of a full paper (8 pages max. including figures, tables, and references and a 250 word abstract). There will be no revision of accepted papers. 
Cognitive Science Society Meeting c/o MIT Center for Cognitive Science Room 20B-225 77 Mass. Ave Cambridge, MA 02139 Paper presentations will either be 20 minutes or 10 minutes, assigned by the referees and organizing committee. You must first indicate your preference for POSTER or PAPER. If you prefer to present, you must further answer two questions: Will you ACCEPT a 10 minute slot? [y/n] Do you PREFER a 10 minute slot? [y/n] This is followed by a threat: If a 10-minute slot is unacceptable, but a 20 minute slot is not available, the committee will be unable to accept the paper (!!!) Jordan Pretty labyrinthine stuff going on with the organization of cognitive science this year; they have too many plenary speakers and special-interest panels, so there can't be many normal talks, and the committee has therefore taken it upon itself to reject (or absurdly shorten the presentation time of) papers (or areas?) they don't like. From norman%cogsci at ucsd.edu Wed Dec 13 18:02:06 1989 From: norman%cogsci at ucsd.edu (Donald A Norman-UCSD Cog Sci Dept) Date: Wed, 13 Dec 89 15:02:06 PST Subject: CogSci Meeting In-Reply-To: Jordan B Pollack's message of Wed, 13 Dec 89 12:11:43 EST <8912131711.AA01095@wizard.cis.ohio-state.edu> Message-ID: <8912132302.AA16889@cogsci.UCSD.EDU> In fairness to the Cognitive Science Society. And why we need a strong connectionist showing at the society meetings. Each year, the conference is held in a different location, run by volunteers who must spend a considerable amount of time and energy to organize things. As partial payment for the effort, the Society gives the organizers a good deal of latitude on the structure of the conference. We welcomed the overture from MIT to hold a conference, especially since in the west pole/east pole split in the science, MIT folks (prototype east polers) tended to ignore the conference and society: having MIT sponsor the conference was seen as a positive step toward including all views of cognitive science in the society. (Note that connectionists are viewed with alarm and suspicion by east polers -- if they are viewed at all (the preference would be that you--we--would all go away). And since the Cog Sci conference has become a major place for substantive connectionist reports on science (as opposed to techniques, methodology, and engineering applications), again, having MIT host the conference is a wonderful opportunity.) HOWEVER: there were severe problems and conflicts in getting the conference going. It almost got scrubbed, except that by the time the Society was informed of the problems, it was too late to find another host. Dave Rumelhart played a major role in getting things smoothed over and getting the conference on track again. It now does look like we have a conference. The scheduling problems and the balance of programs and the other apparent mishaps seem minor incidents in the attempt to make the Cognitive Science Society's conference a major scientific forum for all views in the substantive study of cognition. Please do attend: connectionism promises to revolutionize our views of cognition (I know, many of you think it already has), but both you folks and the others need to interact so that we can better explore the experimental phenomena and the theoretical alternatives. don norman (Disclaimer: I am a member of the Governing Board of the Society, but I have played very little role in this conference. The current chair of the society is Dave Rumelhart: give him the credits, and save the complaints for others.
Remember: would YOU want to host a large, complex conference? (And if the answer is yes, then by all means volunteer, after making sure that you have sufficient meeting rooms, hotel and dorm rooms, and local support.) Don Norman INTERNET: dnorman at ucsd.edu Department of Cognitive Science D-015 BITNET: dnorman at ucsd University of California, San Diego AppleLink: d.norman La Jolla, California 92093 USA FAX: (619) 534-1128 From karit at hutmc Thu Dec 14 03:09:14 1989 From: karit at hutmc (karit@hutmc) Date: Thu, 14 Dec 89 10:09:14 +0200 Subject: ICANN-91 Message-ID: <8912140809.AA13599@santra.hut.fi> +---------------------------------------------------------------+ | | | I C A N N - 91 | | | | International Conference on Artificial Neural Networks, | | Helsinki University of Technology, Finland, June 24-28, 1991 | | | +---------------------------------------------------------------+ FIRST ANNOUNCEMENT Theories, implementations, and applications of Artificial Neural Networks are progressing at a growing speed in Europe and elsewhere. The first commercial hardware for neural circuits and systems is emerging. This conference will be a major international contact forum for experts from academia and industry worldwide. Around 1000 participants are expected. TOPICS: networks and algorithms, neural software, neural hardware, applications, brain and neural theories. ACTIVITIES: tutorials, oral and poster sessions, prototype demonstrations, videopresentations, industrial exhibition. CONFERENCE CHAIRMAN: Prof. Teuvo Kohonen. PROGRAM CHAIRMAN: Prof. Igor Aleksander. INTERNATIONAL CONFERENCE COMMITTEE: B.Angeniol, E.Caianiello, R.Eckmiller, J.Hertz, L.Steels, J.G.Taylor. For more information, please contact: Prof. Olli Simula, chairman, organization committee ICANN-91, Helsinki University of Technology SF-02150 Espoo, Finland Fax: +358-04513277 Telex: 1251 61 htkk sf Email: ollis at hutmc.hut.fi From Dave.Touretzky at B.GP.CS.CMU.EDU Thu Dec 14 05:59:41 1989 From: Dave.Touretzky at B.GP.CS.CMU.EDU (Dave.Touretzky@B.GP.CS.CMU.EDU) Date: Thu, 14 Dec 89 05:59:41 EST Subject: tech report available Message-ID: <7549.629636381@DST.BOLTZ.CS.CMU.EDU> Controlling Search Dynamics by Manipulating Energy Landscapes David S. Touretzky CMU-CS-89-113 December, 1989 School of Computer Science Carnegie Mellon University Pittsburgh, PA 15213-3890 Touretzky and Hinton's DCPS (Distributed Connectionist Production System) is a neural network with complex dynamical properties. Visualization of the energy landscapes of some of its component modules leads to a better intuitive understanding of the model. Three visualization techniques are used in this paper. Analysis of the way energy landscapes change as modules interact during an annealing search suggests ways in which the search dynamics can be controlled, thereby improving the model's performance on difficult match cases. ================ This report is available free by writing the School of Computer Science at the address above, or by sending electronic mail to Ms. Catherine Copetas. Her email address is copetas+ at cs.cmu.edu. Be sure to ask for technical report number CMU-CS-89-113. From jose at neuron.siemens.com Thu Dec 14 06:53:47 1989 From: jose at neuron.siemens.com (Steve Hanson) Date: Thu, 14 Dec 89 06:53:47 EST Subject: CogSci Meeting Message-ID: <8912141153.AA09977@neuron.siemens.com.siemens.com> Re: don norman's note hear, hear!
I would also like to encourage those who regularly attend the NIPS conference and have interests in cognitive science, behavioral science, or even cognitive neuroscience to submit relevant work to the Cognitive Science Conference. Also note that NIPS90 will have a Cognitive Science and AI track next year in order to encourage the crosstalk between the high quality scientific work in Neural Networks and Cognitive Science. Watch this space for the call for papers. Steve Hanson (NIPS organizing committee) From hendler at cs.UMD.EDU Thu Dec 14 10:08:09 1989 From: hendler at cs.UMD.EDU (Jim Hendler) Date: Thu, 14 Dec 89 10:08:09 -0500 Subject: CogSci Meeting Message-ID: <8912141508.AA02196@dormouse.cs.UMD.EDU> I can't really let Don Norman's message go by without feeling compelled to add my own two cents -- I agree with Don that it is important that connectionists attend the Cog Sci meeting -- it is an important meeting for new research in cognitive science. However, as far as I'm concerned, those going simply to 'spread the gospel' of connectionism, rather than to also find out what is happening elsewhere are (well, let's be polite) perhaps not availing themselves of the potential to learn important information. I think some very BAD cognitive results have come from people in this camp (and I do consider myself a connectionist to some degree, although somewhat reluctantly) because they have simply ignored a large body of research which discusses important cognitive phenomena which our models MUST account for someday (you cannot ignore experimental results without making a compelling argument as to why they are wrong). There have also been some very important results which have derived from connectionist modeling (Rumelhart's work, Norman's own recent work, etc.) and the Cognitive Science community has been forced to pay attention -- because the people doing this work did NOT ignore the data. So, just to summarize, I think people should plan on attending, but not simply to convince us east-polers that connectionism is the word of God, but rather to learn for yourselves where the greatest challenges to connectionists lie. cheers Jim H. From mike at bucasb.BU.EDU Thu Dec 14 14:05:11 1989 From: mike at bucasb.BU.EDU (Michael Cohen) Date: Thu, 14 Dec 89 14:05:11 EST Subject: WANG INSTITUTE CONFERENCE Message-ID: <8912141905.AA18969@bucasb.bu.edu> BOSTON UNIVERSITY, A WORLD LEADER IN NEURAL NETWORK RESEARCH AND TECHNOLOGY, PRESENTS TWO MAJOR SCIENTIFIC EVENTS: MAY 6--11, 1990 NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS A self-contained systematic course by leading neural architects. MAY 11--13, 1990 NEURAL NETWORKS FOR AUTOMATIC TARGET RECOGNITION An international research conference presenting INVITED and CONTRIBUTED papers, herewith solicited, on one of the most active research topics in science and technology today. SPONSORED BY THE CENTER FOR ADAPTIVE SYSTEMS, THE GRADUATE PROGRAM IN COGNITIVE AND NEURAL SYSTEMS, AND THE WANG INSTITUTE OF BOSTON UNIVERSITY WITH PARTIAL SUPPORT FROM THE AIR FORCE OFFICE OF SCIENTIFIC RESEARCH ----------------------------------------------------------------------------- CALL FOR PAPERS --------------- NEURAL NETWORKS FOR AUTOMATIC TARGET RECOGNITION MAY 11--13, 1990 This research conference at the cutting edge of neural network science and technology will bring together leading experts in academe, government, and industry to present their latest results on automatic target recognition in invited lectures and contributed posters.
Automatic target recognition is a key process in systems designed for vision and image processing, speech and time series prediction, adaptive pattern recognition, and adaptive sensory-motor control and robotics. It is one of the areas emphasized by the DARPA Neural Networks Program, and has attracted intense research activity around the world. Invited lecturers include: JOE BROWN, Martin Marietta, "Multi-Sensor ATR using Neural Nets" GAIL CARPENTER, Boston University, "Target Recognition by Adaptive Resonance: ART for ATR" NABIL FARHAT, University of Pennsylvania, "Bifurcating Networks for Target Recognition" STEPHEN GROSSBERG, Boston University, "Recent Results on Self-Organizing ATR Networks" ROBERT HECHT-NIELSEN, HNC, "Spatiotemporal Attention Focusing by Expectation Feedback" KEN JOHNSON, Hughes Aircraft, "The Application of Neural Networks to the Acquisition and Tracking of Maneuvering Tactical Targets in High Clutter IR Imagery" PAUL KOLODZY, MIT Lincoln Laboratory, "A Multi-Dimensional ATR System" MICHAEL KUPERSTEIN, Neurogen, "Adaptive Sensory-Motor Coordination using the INFANT Controller" YANN LECUN, AT&T Bell Labs, "Structured Back Propagation Networks for Handwriting Recognition" CHRISTOPHER SCOFIELD, Nestor, "Neural Network Automatic Target Recognition by Active and Passive Sonar Signals" STEVEN SIMMES, Science Applications International Co., "Massively Parallel Approaches to Automatic Target Recognition" ALEX WAIBEL, Carnegie Mellon University, "Patterns, Sequences and Variability: Advances in Connectionist Speech Recognition" ALLEN WAXMAN, MIT Lincoln Laboratory, "Invariant Learning and Recognition of 3D Objects from Temporal View Sequences" FRED WEINGARD, Booz-Allen and Hamilton, "Current Status and Results of Two Major Government Programs in Neural Network-Based ATR" BARBARA YOON, DARPA, "DARPA Artificial Neural Networks Technology Program: Automatic Target Recognition" ------------------------------------------------------ CALL FOR PAPERS---ATR POSTER SESSION: A featured poster session on ATR neural network research will be held on May 12, 1990. Attendees who wish to present a poster should submit 3 copies of an extended abstract (1 single-spaced page), postmarked by March 1, 1990, for refereeing. Include with the abstract the name, address, and telephone number of the corresponding author. Mail to: ATR Poster Session, Neural Networks Conference, Wang Institute of Boston University, 72 Tyng Road, Tyngsboro, MA 01879. Authors will be informed of abstract acceptance by March 31, 1990. SITE: The Wang Institute possesses excellent conference facilities on a beautiful 220-acre rustic setting. It is easily reached from Boston's Logan Airport and Route 128. REGISTRATION FEE: Regular attendee--$90; full-time student--$70. Registration fee includes admission to all lectures and poster session, one reception, two continental breakfasts, one lunch, one dinner, daily morning and afternoon coffee service. STUDENTS: Read below about FELLOWSHIP support. REGISTRATION: To register by telephone with VISA or MasterCard call (508) 649-9731 between 9:00AM--5:00PM (EST). To register by FAX, fill out the registration form and FAX back to (508) 649-6926. To register by mail, complete the registration form and mail with your full form of payment as directed. Make check payable in U.S. dollars to "Boston University". See below for Registration Form. To register by electronic mail, use the address "rosenber at bu-tyng.bu.edu". 
On-site registration on a space-available basis will take place from 1:00--5:00PM on Friday, May 11. A RECEPTION will be held from 3:00--5:00PM on Friday, May 11. LECTURES begin at 5:00PM on Friday, May 11 and conclude at 1:00PM on Sunday, May 13. ------------------------------------------------------------------------------ NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS MAY 6--11, 1990 This in-depth, systematic, 5-day course is based upon the world's leading graduate curriculum in the technology, computation, mathematics, and biology of neural networks. Developed at the Center for Adaptive Systems (CAS) and the Graduate Program in Cognitive and Neural Systems (CNS) of Boston University, twenty-eight hours of the course will be taught by six CAS/CNS faculty. Three distinguished guest lecturers will present eight hours of the course. COURSE OUTLINE -------------- MAY 7, 1990 ----------- MORNING SESSION (PROFESSOR GROSSBERG) HISTORICAL OVERVIEW: Introduction to the binary, linear, and continuous-nonlinear streams of neural network research: McCulloch-Pitts, Rosenblatt, von Neumann; Anderson, Kohonen, Widrow; Hodgkin-Huxley, Hartline-Ratliff, Grossberg. CONTENT ADDRESSABLE MEMORY: Classification and analysis of neural network models for absolutely stable CAM. Models include: Cohen-Grossberg, additive, shunting, Brain-State-In-A-Box, Hopfield, Boltzmann Machine, McCulloch-Pitts, masking field, bidirectional associative memory. COMPETITIVE DECISION MAKING: Analysis of asynchronous variable-load parallel processing by shunting competitive networks; solution of noise-saturation dilemma; classification of feedforward networks: automatic gain control, ratio processing, Weber law, total activity normalization, noise suppression, pattern matching, edge detection, brightness constancy and contrast, automatic compensation for variable illumination or other background energy distortions; classification of feedback networks: influence of nonlinear feedback signals, notably sigmoid signals, on pattern transformation and memory storage, winner-take-all choices, partial memory compression, tunable filtering, quantization and normalization of total activity, emergent boundary segmentation; method of jumps for classifying globally consistent and inconsistent competitive decision schemes. ASSOCIATIVE LEARNING: Derivation of associative equations for short-term memory and long-term memory. Overview and analysis of associative outstars, instars, computational maps, avalanches, counterpropagation nets, adaptive bidirectional associative memories. Analysis of unbiased associative pattern learning by asynchronous parallel sampling channels; classification of associative learning laws. AFTERNOON SESSION (PROFESSORS JORDAN AND MINGOLLA) COMBINATORIAL OPTIMIZATION. PERCEPTRONS: Adaline, Madaline, delta rule, gradient descent, adaptive statistical predictor, nonlinear separability. INTRODUCTION TO BACK PROPAGATION: Supervised learning of multidimensional nonlinear maps, NETtalk, image compression, robotic control. RECENT DEVELOPMENTS OF BACK PROPAGATION: This two-hour guest tutorial lecture will provide a systematic review of recent developments of the back propagation learning network, especially focussing on recurrent back propagation variations and applications to outstanding technological problems.
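For readers new to the perceptron-style learning rules named in the afternoon session above, the following is a minimal sketch of the delta rule trained by gradient descent. It is only an illustration: the toy data, learning rate, and variable names are assumptions of this sketch and are not taken from the course materials.

    import numpy as np

    # Minimal delta-rule sketch: one linear unit trained on a toy regression
    # problem.  All data and names here are hypothetical.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 3))          # 20 toy input patterns, 3 features
    w_true = np.array([0.5, -1.0, 2.0])
    y = X @ w_true                        # targets from a hypothetical linear map

    w = np.zeros(3)                       # weights to be learned
    lr = 0.05                             # learning rate
    for epoch in range(200):
        for x, t in zip(X, y):
            out = w @ x                   # unit's output
            w += lr * (t - out) * x       # delta rule: error times input
    print(w)                              # should approach w_true

For a single linear unit this update is exactly gradient descent on the squared error, which is why the learned weights converge to the generating weights in this noise-free toy case.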
EVENING SESSION: DISCUSSIONS WITH TUTORS MAY 8, 1990 ----------- MORNING SESSION (PROFESSORS CARPENTER AND GROSSBERG) ADAPTIVE PATTERN RECOGNITION: Adaptive filtering; contrast enhancement; competitive learning of recognition categories; adaptive vector quantization; self-organizing computational maps; statistical properties of adaptive weights; learning stability and causes of instability. INTRODUCTION TO ADAPTIVE RESONANCE THEORY: Absolutely stable recognition learning, role of learned top-down expectations; attentional priming; matching by 2/3 Rule; adaptive search; self-controlled hypothesis testing; direct access to globally optimal recognition code; control of categorical coarseness by attentional vigilance; comparison with relevant behavioral and brain data to emphasize biological basis of ART computations. ANALYSIS OF ART 1: Computational analysis of ART 1 architecture for self-organized real-time hypothesis testing, learning, and recognition of arbitrary sequences of binary input patterns. AFTERNOON SESSION (PROFESSOR CARPENTER) ANALYSIS OF ART 2: Computational analysis of ART 2 architecture for self-organized real-time hypothesis testing, learning, and recognition for arbitrary sequences of analog or binary input patterns. ANALYSIS OF ART 3: Computational analysis of ART 3 architecture for self-organized real-time hypothesis testing, learning, and recognition within distributed network hierarchies; role of chemical transmitter dynamics in forming a memory representation distinct from short-term memory and long-term memory; relationships to brain data concerning neuromodulators and synergetic ionic and transmitter interactions. SELF-ORGANIZATION OF INVARIANT PATTERN RECOGNITION CODES: Computational analysis of self-organizing ART architectures for recognizing noisy imagery undergoing changes in position, rotation, and size. NEOCOGNITRON: Recognition and completion of images by hierarchical bottom-up filtering and top-down attentive feedback. EVENING SESSION: DISCUSSIONS WITH TUTORS MAY 9, 1990 ----------- MORNING SESSION (PROFESSORS GROSSBERG & MINGOLLA) VISION AND IMAGE PROCESSING: Introduction to Boundary Contour System for emergent segmentation and Feature Contour System for filling-in after compensation for variable illumination; image compression, orthogonalization, and reconstruction; multidimensional filtering, multiplexing, and fusion; coherent boundary detection, regularization, self-scaling, and completion; compensation for variable illumination sources, including artificial sensors (infrared sensors, laser radars); filling-in of surface color and form; 3-D form from shading, texture, stereo, and motion; parallel processing of static form and moving form; motion capture and induced motion; synthesis of static form and motion form representations. AFTERNOON SESSION (PROFESSORS BULLOCK, COHEN, & GROSSBERG) ADAPTIVE SENSORY-MOTOR CONTROL AND ROBOTICS: Overview of recent progress in adaptive sensory-motor control and related robotics research. Reaching to, grasping, and transporting objects of variable mass and form under visual guidance in a cluttered environment will be used as a target behavioral competence to clarify subproblems of real-time adaptive sensory-motor control. The balance of the tutorial will be spent detailing neural network modules that solve various subproblems.
Topics include: Self-organizing networks for real-time control of eye movements, arm movements, and eye-arm coordination; learning of invariant body-centered target position maps; learning of intermodal associative maps; real-time trajectory formation; adaptive vector encoders; circular reactions between action and sensory feedback; adaptive control of variable speed movements; varieties of error signals; supportive behavioral and neural data; inverse kinematics; automatic compensation for unexpected perturbations; independent adaptive control of force and position; adaptive gain control by cerebellar learning; position-dependent sampling from spatial maps; predictive motor planning and execution. SPEECH PERCEPTION AND PRODUCTION: Hidden Markov models; self-organization of speech perception and production codes; eighth nerve Average Localized Synchrony Response; phoneme recognition by back propagation, time delay networks, and vector quantization. MAY 10, 1990 ------------ MORNING SESSION (PROFESSORS COHEN, GROSSBERG, & MERRILL) SPEECH PERCEPTION AND PRODUCTION: Disambiguation of coarticulated vowels and consonants; dynamics of working memory; multiple-scale adaptive coding by masking fields; categorical perception; phonemic restoration; contextual disambiguation of speech tokens; resonant completion and grouping of noisy variable-rate speech streams. REINFORCEMENT LEARNING AND PREDICTION: Recognition learning, reinforcement learning, and recall learning are the 3 R's of neural network learning. Reinforcement learning clarifies how external events interact with internal organismic requirements to trigger learning processes capable of focussing attention upon and generating appropriate actions towards motivationally desired goals. A neural network model will be derived to show how reinforcement learning and recall learning can self-organize in response to asynchronous series of significant and irrelevant events. These mechanisms also control selective forgetting of memories that are no longer predictive, adaptive timing of behavioral responses, and self-organization of goal directed problem solvers. AFTERNOON SESSION (PROFESSORS GROSSBERG & MERRILL AND DR. HECHT-NIELSEN) REINFORCEMENT LEARNING AND PREDICTION: Analysis of drive representations, adaptive critics, conditioned reinforcers, role of motivational feedback in focusing attention on predictive data; attentional blocking and unblocking; adaptively timed problem solving; synthesis of perception, recognition, reinforcement, recall, and robotics mechanisms into a total neural architecture; relationship to data about hypothalamus, hippocampus, neocortex, and related brain regions. RECENT DEVELOPMENTS IN THE NEUROCOMPUTER INDUSTRY: This two-hour guest tutorial will provide an overview of the growth and prospects of the burgeoning neurocomputer industry by one of its most important leaders. EVENING SESSION: DISCUSSIONS WITH TUTORS MAY 11, 1990 ------------ MORNING SESSION (DR. FAGGIN) VLSI IMPLEMENTATION OF NEURAL NETWORKS: This is a four-hour self-contained tutorial on the application and development of VLSI techniques for creating compact real-time chips embodying neural network designs for applications in technology. 
Review of neural networks from a hardware implementation perspective; hardware requirements and alternatives; dedicated digital implementation of neural networks; neuromorphic design methodology using VLSI CMOS technology; applications and performance of neuromorphic implementations; comparison of neuromorphic and digital hardware; future prospectus. ---------------------------------------------------------------------------- COURSE FACULTY FROM BOSTON UNIVERSITY ------------------------------------- STEPHEN GROSSBERG, Wang Professor of CNS, as well as Professor of Mathematics, Psychology, and Biomedical Engineering, is one of the world's leading neural network pioneers and most versatile neural architects; Founder and 1988 President of the International Neural Network Society (INNS); Founder and Co-Editor-in-Chief of the INNS journal "Neural Networks"; an editor of the journals "Neural Computation", "Cognitive Science", and "IEEE Expert"; Founder and Director of the Center for Adaptive Systems; General Chairman of the 1987 IEEE First International Conference on Neural Networks (ICNN); Chief Scientist of Hecht-Nielsen Neurocomputer Company (HNC); and one of the four technical consultants to the national DARPA Neural Network Study. He is author of 200 articles and books about neural networks, including "Neural Networks and Natural Intelligence" (MIT Press, 1988), "Neural Dynamics of Adaptive Sensory-Motor Control" (with Michael Kuperstein, Pergamon Press, 1989), "The Adaptive Brain, Volumes I and II" (Elsevier/North-Holland, 1987), "Studies of Mind and Brain" (Reidel Press, 1982), and the forthcoming "Pattern Recognition by Self-Organizing Neural Networks" (with Gail Carpenter). GAIL CARPENTER is Professor of Mathematics and CNS; Co-Director of the CNS Graduate Program; 1989 Vice President of the International Neural Network Society (INNS); Organization Chairman of the 1988 INNS annual meeting; Session Chairman at the 1989 and 1990 IEEE/INNS International Joint Conference on Neural Networks (IJCNN); one of four technical consultants to the national DARPA Neural Network Study; editor of the journals "Neural Networks", "Neural Computation", and "Neural Network Review"; and a member of the scientific advisory board of HNC. A leading neural architect, Carpenter is especially well-known for her seminal work on developing the adaptive resonance theory architectures (ART 1, ART 2, ART 3) for adaptive pattern recognition. MICHAEL COHEN, Associate Professor of Computer Science and CNS, is a leading architect of neural networks for content addressable memory (Cohen-Grossberg model), vision (Feature Contour System), and speech (Masking Fields); editor of "Neural Networks"; Session Chairman at the 1987 ICNN, and the 1989 IJCNN; and member of the DARPA Neural Network Study panel on Simulation/Emulation Tools and Techniques. ENNIO MINGOLLA, Assistant Professor of Psychology and CNS, is holder of one of the first patented neural network architectures for vision and image processing (Boundary Contour System); Co-Organizer of the 3rd Workshop on Human and Machine Vision in 1985; editor of the journals "Neural Networks" and "Ecological Psychology"; member of the DARPA Neural Network Study panel of Adaptive Knowledge Processing; consultant to E.I. duPont de Nemours, Inc.; Session Chairman for vision and image processing at the 1987 ICNN, and the 1988 INNS meetings. 
DANIEL BULLOCK, Assistant Professor of Psychology and CNS, is developer of neural network models for real-time adaptive sensory-motor control of arm movements and eye-arm coordination, notably the VITE and FLETE models for adaptive control of multi-joint trajectories; editor of "Neural Networks"; Session Chairman for adaptive sensory-motor control and robotics at the 1987 ICNN and the 1988 INNS meetings; invited speaker at the 1990 IJCNN. JOHN MERRILL, Assistant Professor of Mathematics and CNS, is developing neural network models for adaptive pattern recognition, speech recognition, reinforcement learning, and adaptive timing in problem solving behavior, after having received his Ph.D. in mathematics from the University of Wisconsin at Madison, and completing postdoctoral research in computer science and linguistics at Indiana University. GUEST LECTURERS --------------- FEDERICO FAGGIN is co-founder and president of Synaptics, Inc. Dr. Faggin developed the Silicon Gate Technology at Fairchild Semiconductor. He also designed the first commercial circuit using Silicon Gate Technology: the 3708, an 8-bit analog multiplexer. At Intel Corporation he was responsible for designing what was to become the first microprocessor---the 4000 family, also called MCS-4. He and Hal Feeney designed the 8008, the first 8-bit microprocessor introduced in 1972, and later Faggin conceived the 8080 and with M. Shima designed it. The 8080 was the first high-performance 8-bit microprocessor. At Zilog Inc., Faggin conceived the Z80 microprocessor family and directed the design of the Z80 CPU. Faggin also started Cygnet Technologies, which developed a voice and data communication peripheral for the personal computer. In 1986 Faggin co-founded Synaptics Inc., a company dedicated to the creation of a new type of VLSI hardware for artificial neural networks and other machine intelligence applications. Faggin is the recipient of the 1988 Marconi Fellowship Award for his contributions to the birth of the microprocessor. ROBERT HECHT-NIELSEN is co-founder and chairman of the Board of Directors of Hecht-Nielsen Neurocomputer Corporation (HNC), a pioneer in neurocomputer technology and the application of neural networks, and a recognized leader in the field. Prior to the formation of HNC, he founded and managed the neurocomputer development and neural network applications at TRW (1983--1986) and Motorola (1979--1983). He has been active in neural network technology and neurocomputers since 1961 and earned his Ph.D. in mathematics in 1974. He is currently a visiting lecturer in the Electrical Engineering Department at the University of California at San Diego, and is the author of influential technical reports and papers on neurocomputers, neural networks, pattern recognition, signal processing algorithms, and artificial intelligence. MICHAEL JORDAN is an Assistant Professor of Brain and Cognitive Sciences at MIT. One of the key developers of the recurrent back propagation algorithms, Professor Jordan's research is concerned with learning in recurrent networks and with the use of networks as forward models in planning and control. His interest in interdisciplinary research on neural networks is founded in his training for a Bachelors degree in Psychology, a Masters degree in Mathematics, and a Ph.D. in Cognitive Science from the University of California at San Diego. He was a postdoctoral researcher in Computer Science at the University of Massachusetts at Amherst before assuming his present position at MIT. 
---------------------------------------------------------- REGISTRATION FEE: Regular attendee--$950; full-time student--$250. Registration fee includes five days of tutorials, course notebooks, one reception, five continental breakfasts, five lunches, four dinners, daily morning and afternoon coffee service, evening discussion sessions with leading neural architects. REGISTRATION: To register by telephone with VISA or MasterCard call (508) 649-9731 between 9:00AM--5:00PM (EST). To register by FAX, fill out the registration form and FAX back to (508) 649-6926. To register by mail, complete the registration form and mail with your full form of payment as directed. Make check payable in U.S. dollars to "Boston University". See below for Registration Form. To register by electronic mail, use the address "rosenber at bu-tyng.bu.edu". On-site registration on a space-available basis will take place from 2:00--7:00PM on Sunday, May 6 and from 7:00--8:00AM on Monday, May 7, 1990. A RECEPTION will be held from 4:00--7:00PM on Sunday, May 6. LECTURES begin at 8:00AM on Monday, May 7 and conclude at 12:30PM on Friday, May 11. STUDENT FELLOWSHIPS supporting travel, registration, and lodging for the Course and the Research Conference are available to full-time graduate students in a PhD program. Applications must be postmarked by March 1, 1990. Send curriculum vitae, a one-page essay describing your interest in neural networks, and a letter from a faculty advisor to: Student Fellowships, Neural Networks Course, Wang Institute of Boston University, 72 Tyng Road, Tyngsboro, MA 01879. CNS FELLOWSHIP FUND: Net revenues from the course will endow fellowships for Ph.D. candidates in the CNS Graduate Program. Corporate and individual gifts to endow CNS Fellowships are also welcome. Please write: Cognitive and Neural Systems Fellowship Fund, Center for Adaptive Systems, Boston University, 111 Cummington Street, Boston, MA 02215. ------------------------------------------------------------------------------ REGISTRATION FOR COURSE AND RESEARCH CONFERENCE Course: Neural Network Foundations and Applications, May 6--11, 1990 Research Conference: Neural Networks for Automatic Target Recognition, May 11--13, 1990 NAME: _________________________________________________________________ ORGANIZATION (for badge): _____________________________________________ MAILING ADDRESS: ______________________________________________________ ______________________________________________________ CITY/STATE/COUNTRY: ___________________________________________________ POSTAL/ZIP CODE: ______________________________________________________ TELEPHONE(S): _________________________________________________________ COURSE: [ ] regular attendee $950 [ ] full-time student $250 (limited number of spaces). RESEARCH CONFERENCE: [ ] regular attendee $90 [ ] full-time student $70 (limited number of spaces). [ ] Gift to CNS Fellowship Fund TOTAL PAYMENT: $________ FORM OF PAYMENT: [ ] check or money order (payable in U.S. dollars to Boston University) [ ] VISA [ ] MasterCard Card Number: ______________________________________________ Expiration Date: ______________________________________________ Signature: ______________________________________________ Please complete and mail to: Neural Networks Wang Institute of Boston University 72 Tyng Road Tyngsboro, MA 01879 USA To register by telephone, call: (508) 649-9731. HOTEL RESERVATIONS: Room blocks have been reserved at 3 hotels near the Wang Institute.
Hotel names, rates, and telephone numbers are listed below. A shuttle bus will take attendees to and from the hotels for the Course and Research Conference. Attendees should make their own reservations by calling the hotel. The special conference rate applies only if you mention the name and dates of the meeting when making the reservations. Sheraton Tara, Nashua, NH, (603) 888-9970, $70/night+tax; Red Roof Inn, Nashua, NH, (603) 888-1893, $39.95/night+tax; Stonehedge Inn, Tyngsboro, MA, (508) 649-4342, $89/night+tax. The hotels in Nashua are located approximately 5 miles from the Wang Institute. A shuttle bus will be provided. ------------------------------------------------------------------------------- From noel%CS.EXETER.AC.UK at VMA.CC.CMU.EDU Fri Dec 15 07:52:19 1989 From: noel%CS.EXETER.AC.UK at VMA.CC.CMU.EDU (Noel Sharkey) Date: Fri, 15 Dec 89 12:52:19 GMT Subject: CogSci Meeting In-Reply-To: Jim Hendler's message of Thu, 14 Dec 89 10:08:09 -0500 <8912141508.AA02196@dormouse.cs.UMD.EDU Message-ID: <5100.8912151252@entropy.cs.exeter.ac.uk> i support jim h. fully on his points, but knowing the breadth of don norman's work, i am sure he would agree. noel From R09614%BBRBFU01.BITNET at vma.CC.CMU.EDU Mon Dec 18 08:24:49 1989 From: R09614%BBRBFU01.BITNET at vma.CC.CMU.EDU (R09614%BBRBFU01.BITNET@vma.CC.CMU.EDU) Date: Mon, 18 Dec 89 14:24:49 +0100 Subject: NATO Conference Announcement Message-ID: ANNOUNCEMENT: _______________________________________ NATO Advanced Research Workshop on Self-organization, Emerging Properties and Learning. Center for Studies in Statistical Mechanics and Complex Systems The University of Texas Austin, Texas, USA March 12-14, 1990 _______________________________________ Topics ------ - Self-Organization and Dynamics in Networks of Interacting Elements - Dynamical Aspects of Neural Activity: Experiments and Modelling - From Statistical Physics to Neural Networks - Role of Dynamical Attractors in Cognition and Memory - Dynamics of Learning in Biological and Social Systems The goal of the workshop is to review recent progress on self-organization and the generation of spatio-temporal patterns in multi-unit networks of interacting elements, with special emphasis on the role of coupling and connectivity on the observed behavior. The importance of these findings will be assessed from the standpoint of information and cognitive sciences, and their possible usefulness in the field of artificial intelligence will be discussed. We will compare the collective behavior of model networks with the dynamics inferred from the analysis of cortical activity. This confrontation should lead to the design of more realistic networks, sharing some of the basic properties of real-world neurons. Sponsors -------- - NATO International Scientific Exchange Programmes - International Solvay Institutes for Physics and Chemistry, Brussels, Belgium - Center for Statistical Mechanics and Complex Systems, The University of Texas at Austin - IC2 Institute of The University of Texas at Austin International Organizing Committee -------------------------------- Ilya Prigogine, The University of Texas at Austin and Free University of Brussels Gregoire Nicolis, Free University of Brussels Agnes Babloyantz, Free University of Brussels J.
Demongeot, University of Grenoble, France Linda Reichl, The University of Texas at Austin Local Organizing Committee ------------------------- Ilya Prigogine, George Kozmetsky, Ping Chen, Linda Reichl, William Schieve, Robert Herman, Harry Swinney, Fred Phillips For Further Information Contact: ----------------------------- Professor Linda Reichl Center for Statistical Mechanics The University of Texas Austin, TX 78712, USA Phone: (512) 471-7253; Fax: (512) 471-9637; Bitnet: CTAA450 at UTA3081 or PAPE at UTAPHY From Ajay.Jain at ANJ.BOLTZ.CS.CMU.EDU Mon Dec 18 14:17:38 1989 From: Ajay.Jain at ANJ.BOLTZ.CS.CMU.EDU (Ajay.Jain@ANJ.BOLTZ.CS.CMU.EDU) Date: Mon, 18 Dec 89 14:17:38 EST Subject: tech report available Message-ID: A CONNECTIONIST ARCHITECTURE FOR SEQUENTIAL SYMBOLIC DOMAINS Ajay N. Jain School of Computer Science Carnegie Mellon University Pittsburgh, PA 15213-3890 Technical Report CMU-CS-89-187 December, 1989 Abstract: This report describes a connectionist architecture specifically intended for use in sequential domains requiring symbol manipulation. The architecture is based on a network formalism which differs from other connectionist networks developed for use in temporal/sequential domains. Units in this formalism are updated synchronously and retain partial activation between updates. They produce two output values: the standard sigmoidal function of the activity and its velocity. Activation flowing along connections can be gated by units. Well-behaved symbol buffers which learn complex assignment behavior can be constructed using gates. Full recurrence is supported. The network architecture, its underlying formalism, and its performance on an incremental parsing task requiring non-trivial dynamic behavior are presented. This report discusses a connectionist parser built for a smaller task than was discussed at NIPS. ---------------------------------------------------------------------- TO ORDER COPIES of this tech report: send electronic mail to copetas at cs.cmu.edu, or write the School of Computer Science at the address above. Those of you who requested copies of the report at NIPS a couple of weeks ago need not make a request (your copies are in the mail). ****** Do not use your mailer's "reply" command. ****** From Ajay.Jain at ANJ.BOLTZ.CS.CMU.EDU Tue Dec 19 11:43:31 1989 From: Ajay.Jain at ANJ.BOLTZ.CS.CMU.EDU (Ajay.Jain@ANJ.BOLTZ.CS.CMU.EDU) Date: Tue, 19 Dec 89 11:43:31 EST Subject: TR CMU-CS-89-187 Message-ID: The report won't be mailed until after the holidays. It isn't back from the printers yet. Your requests will be processed as soon as possible. Ajay From jose at neuron.siemens.com Tue Dec 19 17:01:37 1989 From: jose at neuron.siemens.com (Steve Hanson) Date: Tue, 19 Dec 89 17:01:37 EST Subject: COGNITIVE NEUROSCIENCE RFP Message-ID: <8912192201.AA03476@neuron.siemens.com.siemens.com> MCDONNELL-PEW PROGRAM IN COGNITIVE NEUROSCIENCE December 1989 Individual Grants-in-Aid for Research and Training Supported jointly by the James S. McDonnell Foundation and The Pew Charitable Trusts INTRODUCTION The McDonnell-Pew Program in Cognitive Neuroscience has been created jointly by the James S. McDonnell Foundation and The Pew Charitable Trusts to promote the development of cognitive neuroscience. The foundations have allocated $12 million over an initial three-year period for this program. Cognitive neuroscience attempts to understand human mental events by specifying how neural tissue carries out computations. 
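As a rough illustration of the unit formalism described in the Jain technical report above (synchronous updates, retained partial activation, a sigmoid output plus its velocity, and gateable connections), here is a minimal sketch. The decay constant, the multiplicative gating scheme, and all names below are assumptions of this sketch, not details taken from the report.

    import numpy as np

    def sigmoid(a):
        return 1.0 / (1.0 + np.exp(-a))

    # Hypothetical sketch of a layer of gated, velocity-emitting units.
    # The decay factor and gating below are this sketch's own choices.
    class GatedUnits:
        def __init__(self, n, decay=0.8):
            self.act = np.zeros(n)            # activation retained between updates
            self.prev_out = np.zeros(n)
            self.decay = decay

        def step(self, weights, inputs, gates):
            # gates in [0, 1] multiplicatively gate activation on each connection
            net = (weights * gates) @ inputs
            self.act = self.decay * self.act + (1 - self.decay) * net
            out = sigmoid(self.act)
            velocity = out - self.prev_out     # change in output since last update
            self.prev_out = out
            return out, velocity

    layer = GatedUnits(n=4)
    W = np.random.default_rng(1).normal(size=(4, 6))
    x = np.ones(6)
    open_gates = np.ones((4, 6))               # all connections fully open
    print(layer.step(W, x, open_gates))

With gates driven by other units rather than fixed at one, a construction along these lines can hold or pass activation selectively, which is the kind of behavior a symbol buffer needs.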
Work in cognitive neuroscience is interdisciplinary in character, drawing on developments in clinical and basic neuroscience, computer science, psychology, linguistics, and philosophy. Cognitive neuroscience excludes descriptions of psychological function that do not address the underlying brain mechanisms and neuroscientific descriptions that do not speak to psychological function. The program has three components. (1) Institutional grants have been awarded for the purpose of creating centers where cognitive scientists and neuroscientists can work together. (2) To encourage Ph.D. and M.D. investigators in cognitive neuroscience, small grants-in-aid will be awarded for individual research projects. (3) To encourage Ph.D. and M.D. investigators to acquire skills for interdisciplinary research, small training grants will be awarded. During the program's initial three-year period, approximately $4 million will be available for the latter two components -- individual grants-in-aid for research and training -- which this announcement describes. RESEARCH GRANTS The McDonnell-Pew Program in Cognitive Neuroscience will issue a limited number of awards to support collaborative work by cognitive neuroscientists. Applications are sought for projects of exceptional merit that are not currently fundable through other channels, and from investigators who are not already supported by institutional grants under this Program. Preference will be given to support projects requiring collaboration or interaction between at least two subfields of cognitive neuroscience. The goal is to encourage broad, national participation in the development of the field and to facilitate the participation of investigators outside the major centers of cognitive neuroscience. Submissions will be reviewed by the program's advisory board. Grant support under this component is limited to $30,000 per year for two years, with indirect costs limited to 10 percent of direct costs. These grants are not renewable. The program is looking for innovative proposals that would, for example: -- combine experimental data from cognitive psychology and neuroscience; -- explore the implications of neurobiological methods for the study of the higher cognitive processes; -- bring formal modeling techniques to bear on cognition; -- use sensing or imaging techniques to observe the brain during conscious activity; -- make imaginative use of patient populations to analyze cognition; -- develop new theories of the human mind/brain system. This list of examples is necessarily incomplete but should suggest the general kind of proposals desired. Ideally, a small grant-in-aid for research should facilitate the initial exploration of a novel or risky idea, with success leading to more extensive funding from other sources. TRAINING GRANTS A limited number of grants will also be awarded to support training investigators in cognitive neuroscience. Here again, the objective is to support proposals of exceptional merit that are underfunded or unlikely to be funded from other sources. Some postdoctoral awards for exceptional young scientists will be available; postdoctoral stipends will be funded at prevailing rates at the host institution, and will be renewable annually for periods up to three years. Highest priority will be given to candidates seeking postdoctoral training outside the field of their previous training. Innovative programs for training young scientists, or broadening the experience of senior scientists, are also encouraged. 
Some examples of appropriate proposals follow. -- Collaboration between a junior scientist in a relevant discipline and a senior scientist in a different discipline has been suggested as an effective method for developing the field. -- Two senior scientists might wish to learn each other's discipline through a collaborative project. -- An applicant might wish to visit several laboratories in order to acquire new research techniques. -- Senior researchers might wish to investigate new methods or technologies in their own fields that are unavailable at their home institutions. Here again, examples can only suggest the kind of training experience that might be considered appropriate. APPLICATIONS Applicants should submit five copies of a proposal no longer than 10 pages (5,000 words). Proposals for research grants should include: -- a description of the work to be done and where it might lead; -- an account of the investigator's professional qualifications to do the work. Proposals for training grants should include: -- a description of the training sought and its relationship to the applicant's work and previous training; -- a statement from the mentor as well as the applicant concerning the acceptability of the training plan. Proposals for both research grants and training grants should include: -- an account of any plans to collaborate with other cognitive neuroscientists; -- a brief description of the available research facilities; -- no appendices. The proposal should be accompanied by the following separate information: -- a brief, itemized budget and budget justification for the proposed work, including direct costs, with indirect costs not to exceed 10 percent of direct costs; -- curricula vitae of the participating investigators; -- evidence that the sponsoring organization is a nonprofit, tax-exempt, public institution; -- an authorized form indicating clearance for the use of human and animal subjects; -- an endorsement letter from the officer of the sponsoring institution who will be responsible for administering the grant. Applications received on or before March 1 will be acted on by the following September 1; applications received on or before September 1 will be acted on by the following March 1. INFORMATION For more information contact: McDonnell-Pew Program in Cognitive Neuroscience Green Hall 1-N-6 Princeton University Princeton, New Jersey 08544-1010 Telephone: 609-258-5014 Facsimile: 609-258-3031 Email: cns at confidence.princeton.edu ADVISORY BOARD Emilio Bizzi, M.D. Eugene McDermott Professor in the Brain Sciences and Human Behavior Chairman, Department of Brain and Cognitive Sciences Whitaker College Massachusetts Institute of Technology, E25-526 Cambridge, Massachusetts 02139 Sheila Blumstein, Ph.D. Professor of Cognitive and Linguistic Sciences Dean of the College Brown University University Hall, Room 218 Providence, Rhode Island 02912 Stephen J. Hanson, Ph.D. Group Leader Learning and Knowledge Acquisition Research Group Siemens Research Center 755 College Road East Princeton, New Jersey 08540 Jon Kaas, Ph.D. Centennial Professor Department of Psychology Vanderbilt University Nashville, Tennessee 37240 George A. Miller, Ph.D. James S. McDonnell Distinguished University Professor of Psychology Department of Psychology Princeton University Princeton, New Jersey 08544 Mortimer Mishkin, Ph.D. Laboratory of Neuropsychology National Institute of Mental Health 9000 Rockville Pike Building 9, Room 1N107 Bethesda, Maryland 20892 Marcus Raichle, M.D. 
Professor of Neurology and Radiology Department of Radiology Washington University School of Medicine Barnes Hospital 510 S. Kingshighway, Campus Box 8131 St. Louis, Missouri 63110 Endel Tulving, Ph.D. Department of Psychology University of Toronto Toronto, Ontario M5S 1A1 Canada From gasser at iuvax.cs.indiana.edu Tue Dec 19 21:32:02 1989 From: gasser at iuvax.cs.indiana.edu (Michael Gasser) Date: Tue, 19 Dec 89 21:32:02 -0500 Subject: tech report available Message-ID: NETWORKS THAT LEARN PHONOLOGY Michael Gasser Chan-Do Lee Computer Science Department Indiana University Bloomington, IN 47405 Technical Report 300 December 1989 Abstract: Natural language phonology presents a challenge to connectionists because it is an example of apparently symbolic, rule-governed behavior. This paper describes two experiments investigating the power of simple recurrent networks (SRNs) to acquire aspects of phonological regularity. The first experiment demonstrates the ability of an SRN to learn harmony constraints, restrictions on the cooccurrence of particular types of segments within a word. The second experiment shows that an SRN is capable of learning the kinds of phonological alternations that appear at morpheme boundaries, in this case those occurring in the regular plural forms of English nouns. This behavior is usually characterized in terms of a derivation from a more to a less abstract level, and in previous connectionist treatments (Rumelhart & McClelland, 1986; Plunkett & Marchman, 1989) it has been dealt with as a process of yielding the combined form (plural) from the simpler form (stem). Here the behavior takes the form of the more psychologically plausible process of the production of a sequence of segments given a meaning or of a meaning given a sequence of segments. This is accomplished by having both segmental and semantic inputs and outputs in the network. The network is trained to auto-associate the current segment and the meaning and to predict the next phoneme. ---------------------------------------------------------------------- To order copies of this tech report, send mail to Nancy Garrett at nlg at cs.indiana.edu / Computer Science Department, Indiana University, Bloomington, IN 47405. From elman at amos.ucsd.edu Wed Dec 20 14:20:50 1989 From: elman at amos.ucsd.edu (Jeff Elman) Date: Wed, 20 Dec 89 11:20:50 PST Subject: Announcement: 1990 Connectionists Models Summer School Message-ID: <8912201920.AA06467@amos.ucsd.edu> December 20, 1989 ANNOUNCEMENT & SOLICITATION FOR APPLICATIONS CONNECTIONIST MODELS SUMMER SCHOOL / SUMMER 1990 UCSD La Jolla, California The next Connectionist Models Summer School will be held at the University of California, San Diego from June 19 to 29, 1990. This will be the third session in the series which was held at Carnegie Mellon in the summers of 1986 and 1988. Previous summer schools have been extremely successful, and we look forward to the 1990 session with anticipation of another exciting summer school. The summer school will offer courses in a variety of areas of connectionist modelling, with emphasis on computational neuroscience, cognitive models, and hardware implementation. A variety of leaders in the field will serve as Visiting Faculty (the list of invited faculty appears below). In addition to daily lectures, there will be a series of shorter tutorials and public colloquia.
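For readers unfamiliar with the simple recurrent network (SRN) architecture used in the Gasser and Lee report above, a minimal sketch of one forward pass through a toy sequence follows. The layer sizes, the single prediction output, and all names are assumptions of this sketch rather than details of that report.

    import numpy as np

    def sigmoid(a):
        return 1.0 / (1.0 + np.exp(-a))

    # Minimal Elman-style SRN forward pass: the hidden layer receives the
    # current input plus a copy of its own previous state (the context).
    # Sizes, weights, and the toy one-hot "segments" are hypothetical.
    rng = np.random.default_rng(2)
    n_in, n_hid, n_out = 10, 20, 10
    W_in = rng.normal(scale=0.1, size=(n_hid, n_in))
    W_ctx = rng.normal(scale=0.1, size=(n_hid, n_hid))
    W_out = rng.normal(scale=0.1, size=(n_out, n_hid))

    context = np.zeros(n_hid)
    for segment in np.eye(n_in):             # toy sequence of one-hot segments
        hidden = sigmoid(W_in @ segment + W_ctx @ context)
        prediction = sigmoid(W_out @ hidden) # e.g. a guess at the next segment
        context = hidden                     # context is a copy of the hidden state
    print(prediction.shape)

Because the context layer carries the previous hidden state forward, the output at each step can depend on the whole sequence seen so far, which is what lets such a network pick up co-occurrence restrictions like harmony constraints.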
Proceedings of the summer school will be published the following fall by Morgan-Kaufmann (previous proceedings appeared as 'Proceedings of the 1988 Connectionist Models Summer School', Ed., David Touretzky, Morgan-Kaufmann). As in the past, participation will be limited to graduate students enrolled in Ph.D. programs (full- or part-time). Admission will be on a competitive basis. Tuition is subsidized for all students and scholarships are available to cover housing costs ($250). Applications should include the following: (1) A statement of purpose, explaining major areas of interest and prior background in connectionist modeling (if any). (2) A description of a problem area you are interested in modeling. (3) A list of relevant coursework, with instructors' names and grades. (4) Names of the three individuals whom you will be asking for letters of recommendation (see below). (5) If you are requesting support for housing, please include a statement explaining the basis for need. Please also arrange to have letters of recommendation sent directly from three individuals who know your current work. Applications should be sent to Marilee Bateman Institute for Neural Computation, B-047 University of California, San Diego La Jolla, CA 92093 (619) 534-7880 All application material must be received by March 15, 1990. Decisions about acceptance and scholarship awards will be announced April 1. If you have further questions, contact Marilee Bateman (address above), or one of the members of the Organizing Committee: Jeff Elman, UCSD, elman at amos.ucsd.edu; Terry Sejnowski, UCSD/Salk Institute, terry at sdbio2.ucsd.edu; Geoff Hinton, Toronto, hinton at ai.toronto.edu; Dave Touretzky, CMU, touretzky at cs.cmu.edu. -------------------------------------------------- INVITED FACULTY: Yaser Abu-Mostafa (CalTech) Richard Lippmann (MIT Lincoln Labs) Dana Ballard (Rochester) James L. McClelland (Carnegie Mellon) Andy Barto (UMass/Amherst) Carver Mead (CalTech) Gail Carpenter (BU) David Rumelhart (Stanford) Patricia Churchland (UCSD) Terry Sejnowski (UCSD/Salk) Jack Cowan (Chicago) Al Selverston (UCSD) Jeff Elman (UCSD) Paul Smolensky (Colorado) Jerry Feldman (ICSI/UCB) David Tank (Bell Labs) Geoffrey Hinton (Toronto) David Touretzky (Carnegie Mellon) Michael Jordan (MIT) Halbert White (UCSD) Teuvo Kohonen (Helsinki) Ron Williams (Northeastern) George Lakoff (UCB) David Zipser (UCSD) From D4PBPHB2%EB0UB011.BITNET at VMA.CC.CMU.EDU Wed Dec 20 19:26:54 1989 From: D4PBPHB2%EB0UB011.BITNET at VMA.CC.CMU.EDU (Perfecto Herrera-Boyer) Date: Wed, 20 Dec 89 19:26:54 HOE Subject: Hardware for NN Message-ID: Dear connectionists: I am trying to make a survey of hardware suited to work with NN on IBM/PS computers (80286) in order to acquire some equipment for our laboratory. I am thinking of coprocessors, cards, and so on... Could anybody send me information about them? (It would be interesting to receive not only "objective" data but also "subjective" impressions from people who are working with those devices). I promise you a summary if you want it. Thanks in advance: Perfecto Herrera-Boyer Dpt. Psicologia Basica Univ. Barcelona From ang at hertz.njit.edu Thu Dec 21 11:35:32 1989 From: ang at hertz.njit.edu (nirwan ansari fac ee) Date: Thu, 21 Dec 89 11:35:32 EST Subject: Call for papers for GLOBECOM '90 Message-ID: <8912211635.AA18916@hertz.njit.edu> The 1990 IEEE Global Telecommunications Conference (GLOBECOM 90) will be held in San Diego, California, December 2-5, 1990.
I was asked by the technical committee to organize a session "Neural Networks in Communication Systems." You are cordially invited to submit an original technical paper related to this topic for consideration for GLOBECOM 90. The SCHEDULE is as follows: Complete Manuscript Due 3/15/1990 Notification of Acceptance Mailed 5/30/1990 Camera-ready Manuscript Due 8/20/1990 INSTRUCTIONS: The title page must include the author's name, complete return address, telephone, telex and fax number and abstract (100 words). For papers with multiple authors, please designate the author to whom all correspondence should be sent by listing that author first. All other pages should have the title and first author of the paper. The manuscript should not exceed 3,000 words in English. Page charges will be assessed for camera-ready copies exceeding five pages. Please send six double-spaced copies of the manuscript in English to: Dr. Arne Mortensen GLOBECOM '90 Technical Program Secretary M/A-COM Government Systems 3033 Science Park Road San Diego, CA 92121 Phone:(619) 457-2340 Telex:910-337-1277 FAX:(619) 457-0579, and a copy to me: Dr. Nirwan Ansari GLOBECOM '90 Neural Network Session Organizer Electrical and Computer Engineering Department New Jersey Institute of Technology University Heights Newark, NJ 07102 Phone:(201) 596-5739. Please also indicate in your cover letter to Dr. Mortensen that you have communicated with and sent me a copy of your manuscript for consideration for the "Neural Networks in Communication Systems" Session. For further questions, please feel free to contact me using the above address or the e-mail address, ang at hertz.njit.edu (node address: 128.235.1.26). From skrzypek at CS.UCLA.EDU Thu Dec 21 18:14:38 1989 From: skrzypek at CS.UCLA.EDU (Dr. Josef Skrzypek) Date: Thu, 21 Dec 89 15:14:38 PST Subject: neural nets and light adaptation (TR) Message-ID: <8912212314.AA20901@retina.cs.ucla.edu> NEURAL NETWORK CONTRIBUTION TO LIGHT ADAPTATION: FEEDBACK FROM HORIZONTAL CELLS TO CONES JOSEF SKRZYPEK Machine Perception Laboratory, Computer Science Department and CRUMP Institute of Medical Engineering, UCLA SUMMARY Vertebrate cones respond to a stepwise increase in localized light intensity with a graded potential change of corresponding amplitude. This S-shaped intensity-response (I-R) relation is limited to 3 log units of the stimulating light, and yet cone vision remains functional between twilight and the brightest time of day. This is in part due to a light adaptation mechanism localized in the outer segment of a cone. The phenomenon of light adaptation can be described as a resetting of the system's response-generation mechanism to a new intensity domain that reflects the ambient illumination. In this project we examined spatial effects of annular illumination on resetting of the I-R relation by measuring intracellular photoresponses in cones. Our results suggest that peripheral illumination contributes to the cellular mechanism of adaptation. This is done by a neural network involving a feedback synapse from horizontal cells to cones. The effect is to unsaturate the membrane potential of a fully hyperpolarized cone, by "instantaneously" shifting the cone's I-R curves along the intensity axis to be in register with the ambient light level of the periphery. An equivalent electrical circuit with three different transmembrane channels (leakage, photocurrent, and feedback) was used to model static behavior of a cone.
SPICE simulation showed that interactions between feedback synapse and the light sensitive conductance in the outer segment can shift the I-R curves along the intensity domain, provided that phototransduction mechanism is not saturated during maximally hyperpolarized light response. Key words: adaptation, feedback, cones, retina, lateral interactions Josef Skrzypek Computer Science Department 3532D Boelter Hall UCLA Los Angeles, California 90024 INTERNET: SKRZYPEK at CS.UCLA.EDU From NHATAOKA%vax1.tcd.ie at cunyvm.cuny.edu Thu Dec 21 13:38:00 1989 From: NHATAOKA%vax1.tcd.ie at cunyvm.cuny.edu (NHATAOKA%vax1.tcd.ie@cunyvm.cuny.edu) Date: Thu, 21 Dec 89 18:38 GMT Subject: Technical report is available Message-ID: <8912271217.AA01539@uunet.uu.net> The following technical report is available. Unfortunately, I want to post this on this connectionists_mailing list only, so "Please don't forward to other newsgroups or mailing lists." Speaker-Independent Phoneme Recognition on TIMIT Database Using Integrated Time-Delay Neural Networks (TDNNs) Nobuo Hataoka(*) and Alex H. Waibel November 27, 1989 CMU-CS -89-190 (also, CMU-CMT-89-115) School of Computer Science Carnegie Mellon University Pittsburgh, PA 15213 Abstract: This paper describes a new structure of Neural Networks (NNs) for speaker- independent and context-independent phoneme recognition. This structure is based on the integration of Time-Delay Neural Networks(TDNN, Waibel et al.) which have several TDNNs separated according to the duration of phonemes. As a result, the proposed structure has the advantage that it deals with phonemes of varying duration more effectively. In the experimental evaluation of the proposed new structure, 16-English vowel recognition was performed using 5268 vowel tokens picked from 480 sentences spoken by 140 speakers (98 males and 42 females) on the TIMIT (TI-MIT) database. The number of training tokens and testing tokens was 4326 from 100 speakers (69 males and 31 females) and 942 from 40 speakers (29 males and 11 females), respectively. The result was a 60.5% recognition rate (around 70% for a collapsed 13-vowel case), which was improved from 56% in the single TDNN structure, showing the effectiveness of the proposed new structure to use temporal information. (*) The author was a visiting researcher from Central Research Laboratory, Hitachi, Ltd., Japan. This work has been done on a collaborative research project between the Center for Machine Translation of CMU and Hitachi, Ltd. Currently, the author is working for Hitachi Dublin Laboratory in Trinity College, Ireland. --------------------------------------------------- If you want to have a copy of this report, please send an e-mail or a letter to the following address. nhataoka%vax1.tcd.ie at cunyvm.cuny.edu or ^--(one) Alison Dunne Hitachi Dublin Laboratory O'Reilly Institute Trinity College Dublin 2, Ireland P.S. Do not use your mailer's "reply" command. From poggio at ai.mit.edu Wed Dec 27 11:07:40 1989 From: poggio at ai.mit.edu (Tomaso Poggio) Date: Wed, 27 Dec 89 11:07:40 EST Subject: MIT AI Lab memo 1164 Message-ID: <8912271607.AA23473@rice-chex> the following technical report is available from the MIT AI Lab Publication Office (send e-mail to liz at ai.mit.edu) Networks and the Best Approximation Property by Federico Girosi and Tomaso Poggio ABSTRACT Networks can be considered as approximation schemes. 
Multilayer networks of the backpropagation type can approximate arbitrarily well continuous functions (Cybenko, 1989; Funahashi, 1989; Stinchcombe and White, 1989). We prove that networks derived from regularization theory and including Radial Basis Functions (Poggio and Girosi, 1989, AI memo 1140) have a similar property. From the point of view of approximation theory, however, the property of approximating continuous functions arbitrarily well is not sufficient for characterizing good approximation schemes. More critical is the property of best approximation. The main result of this paper is that multilayer networks, of the type used in backpropagation, do not have the best approximation property. For regularization networks (in particular Radial Basis Function networks) we prove existence and uniqueness of best approximation. From pollack at cis.ohio-state.edu Wed Dec 27 22:08:06 1989 From: pollack at cis.ohio-state.edu (Jordan B Pollack) Date: Wed, 27 Dec 89 22:08:06 EST Subject: FTP Service is Down; should it come up? Message-ID: <8912280308.AA00375@toto.cis.ohio-state.edu> **Do not forward to other newsgroups** The directory of connectionist electronic tech-reports, pub/neuroprose on cheops.cis.ohio-state.edu, seems to have been deleted. It is impossible to tell whether it was done by a local diskspace scrounger or one of us, perhaps a double agent really working for symbolic AI!!! A request for backup has been made to the local authorities, and I will post another message when it is restored to its former glory. Will take this opportunity for a straw poll: 1) have you ever put a report in neuroprose? 2) Approx how many reports have you retrieved this way? 3) Do you find ftp easy or difficult to use? 4) do you find ftp's binary mode, and the compress/uncompress protocol easy or difficult to use? 5) have you had any problems printing the postscript or tex posted by others? 6) Any other comments on the viability of continuing the distribution of preprints in this fashion? **Do not forward to other newsgroups** Jordan Pollack Laboratory for AI Research CIS Dept/OSU 2036 Neil Ave email: pollack at cis.ohio-state.edu Columbus, OH 43210 Fax/Phone: (614) 292-4890 From smk at flash.bellcore.com Thu Dec 28 11:16:57 1989 From: smk at flash.bellcore.com (Selma M Kaufman) Date: Thu, 28 Dec 89 11:16:57 EST Subject: No subject Message-ID: <8912281616.AA24130@flash.bellcore.com> Subject: Reprint Available: Learning of Stable States in Stochastic Asymmetric Networks Robert B. Allen and Joshua Alspector Bellcore TR-AR-89-351 December, 1989 Boltzmann-based models with asymmetric connections are investigated. Although they are initially unstable, we find that these networks spontaneously self-stabilize as a result of learning. Moreover, we find that pairs of weights symmetrize during learning; however, the symmetry is not enough to account for the observed stability. To characterize the system we consider how its entropy is affected by learning and the entropy of the information stream. Finally, the stability of an asymmetric network was confirmed with an electronic model. For paper copies, contact: Selma Kaufman, Bellcore, 2M-356, 445 South St., Morristown, NJ 07960-1910. smk at flash.bellcore.com From hinton at ai.toronto.edu Thu Dec 28 13:15:13 1989 From: hinton at ai.toronto.edu (Geoffrey Hinton) Date: Thu, 28 Dec 89 13:15:13 EST Subject: Technical Report available Message-ID: <89Dec28.131528est.11309@ephemeral.ai.toronto.edu> Please do not reply to this message.
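As a concrete reference point for the radial basis function networks discussed in the Girosi and Poggio memo above, here is a minimal sketch of Gaussian RBF interpolation. The toy data, the Gaussian width, and the choice of one centre per data point are assumptions of this sketch, not details from the memo.

    import numpy as np

    # Minimal RBF sketch: fit f(x) = sum_i c_i * exp(-(x - x_i)^2 / (2 s^2))
    # by solving the linear system G c = y.  Data, width s, and names are toy.
    rng = np.random.default_rng(3)
    x = np.linspace(0.0, 1.0, 15)          # 1-D sample points, also the centres
    y = np.sin(2 * np.pi * x)              # toy target values
    s = 0.15                               # Gaussian width (an assumption)

    G = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * s ** 2))
    c = np.linalg.solve(G, y)              # coefficients of the RBF expansion

    x_test = 0.37
    f_test = np.sum(c * np.exp(-(x_test - x) ** 2 / (2 * s ** 2)))
    print(f_test, np.sin(2 * np.pi * x_test))

The output weights enter linearly, so fitting them is a linear-algebra problem; this is one reason approximation-theoretic properties such as best approximation are easier to establish for networks of this form.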
To order a copy of the TR described below, please send email to carol at ai.toronto.edu _________________________________________________________________________ DETERMINISTIC BOLTZMANN LEARNING IN NETWORKS WITH ASYMMETRIC CONNECTIVITY Conrad C. Galland and Geoffrey E. Hinton Department of Computer Science University of Toronto 10 Kings College Road Toronto M5S 1A4, Canada Technical Report CRG-TR-89-6 The simplicity and locality of the "contrastive Hebb synapse" (CHS) used in Boltzmann machine learning makes it an attractive model for real biological synapses. The slow learning exhibited by the stochastic Boltzmann machine can be greatly improved by using a mean field approximation and it has been shown (Hinton, 1989) that the CHS also performs steepest descent in these deterministic mean field networks. A major weakness of the learning procedure, from a biological perspective, is that the derivation assumes detailed symmetry of the connectivity. Using networks with purely asymmetric connectivity, we show that the CHS still works in practice provided the connectivity is grossly symmetrical so that if unit i sends a connection to unit j, there are numerous indirect feedback paths from j to i. So long as the network settles to a stable state, we show that the CHS approximates steepest descent and that the proportional error in the approximation can be expected to scale as 1/sqrt(N), where N is the number of connections. ________________________________________________________________________ PS: The research described in this TR uses a different kind of network and a different analysis than the research described in the TR by Allen and Alspector that was recently advertised on the connectionists mailing list. However, the general conclusion of both TR's is the same. From gasser at iuvax.cs.indiana.edu Thu Dec 28 15:51:33 1989 From: gasser at iuvax.cs.indiana.edu (Michael Gasser) Date: Thu, 28 Dec 89 15:51:33 -0500 Subject: tech report Message-ID: **********DO NOT FORWARD TO OTHER BBOARDS************** **********DO NOT FORWARD TO OTHER BBOARDS************** **********DO NOT FORWARD TO OTHER BBOARDS************** The TR "Networks that Learn Phonology" (Gasser & Lee, Indiana University Computer Science Dept. TR #300), advertised here last week, has been added to the (recently restored) neuroprose database at Ohio State. If you already asked for a hardcopy, please try the ftp option. If this is not convenient, you can request a copy from Nancy Garrett, nlg at iuvax.cs.indiana.edu, Computer Science Department, Indiana University, Bloomington, IN 47405. Here's how to obtain a copy using ftp: unix> ftp cheops.cis.ohio-state.edu # (or ftp 128.146.8.62) Name (cheops.cis.ohio-state.edu:): anonymous Password (cheops.cis.ohio-state.edu:anonymous): neuron ftp> cd pub/neuroprose ftp> type binary ftp> get (remote-file) gasser.phonology.ps.Z (local-file) foo.ps.Z ... ftp> quit unix> uncompress foo.ps unix> lpr -P(your_local_postscript_printer) foo.ps From pollack at cis.ohio-state.edu Fri Dec 29 16:12:36 1989 From: pollack at cis.ohio-state.edu (Jordan B Pollack) Date: Fri, 29 Dec 89 16:12:36 -0500 Subject: neuroprose Message-ID: <8912292112.AA16700@giza.cis.ohio-state.edu> *****do not forward to other newsgroups ***** It seems that the pub/neuroprose directory on cheops.cis.ohio-state.edu was set up in such a fashion that ANY anonymous user could have deleted it. It has been restored as of November 11, 1989. 
If you placed a report in there since then, I apologize for the inconvenience and ask anyone whose work was lost to re-post it. Current contents:
barto.control.ps
barto.control.ps.Z
barto.sequential_decisions.ps
barto.sequential_decisions.ps.Z
gasser.phonology.ps.Z
kehagias.hmm0289.tex
maclennan.contin_comp.tex
maclennan.tex
miikkulainen.hierarchical.ps.Z
pollack.newraam.ps
pollack.newraam.ps.Z
pollack.nips88.ps
pollack.perceptrons.ps
tenorio.cluster.plain
tenorio.speech_dev.ps
(This is obviously just a fraction of the tech reports announced on the newsgroup, but the poll (below) shows that a few people appreciate rapid retrieval.) At the suggestion of Barak, I have changed the protocol somewhat to avoid the problem of malicious vandalism in the future. Unfortunately it puts me in the loop. The pub/neuroprose directory is now publicly readable. It contains a subdirectory called Inbox, which is publicly writable. To post a report, PUT it in the pub/neuroprose/Inbox directory and send me email, and I will move your file to the neuroprose directory and acknowledge. Similar intervention is required for deleting a report. This seems less horrible than discovering the directory missing every couple of months. Jordan
--------------------------------
Here are the initial results of the straw poll.
2 respondents have posted reports
12 have retrieved reports (average of 2)
12 find FTP easy to use, 1 hard
8 find binary compression easy, 4 hard
11 votes for continuation.
Major Problems:
No service in Europe (4)
Difficulty with the TeX standard (3)
Still cutting & pasting figures, can't use it (2)
Non-Unix Mac is pretty incompatible (1)
Plagiarism worry (1)
Suspicion of technology (1)
*****do not forward to other newsgroups *****

From pauls at neuron.Colorado.EDU Fri Dec 29 19:06:55 1989 From: pauls at neuron.Colorado.EDU (Paul Smolensky) Date: Fri, 29 Dec 89 17:06:55 MST Subject: Behavioral Neuroscience Faculty Position at Boulder Message-ID: <8912300006.AA00736@neuron.colorado.edu> Below is the job description for a faculty position in Behavioral Neuroscience at the University of Colorado at Boulder. Our campus has an active, collaborative, multi-disciplinary connectionist community. Anyone interested in more information is welcome to contact us; if you apply for the job, let us know so we can follow up. -- Paul Smolensky & Mike Mozer
BEHAVIORAL NEUROSCIENCE POSITION
University of Colorado, Boulder
The Department of Psychology at the University of Colorado at Boulder invites applications for a faculty position in Behavioral Neuroscience, starting September 1990. Outstanding applicants at any rank are encouraged to apply. This position carries with it attractive research space and significant start-up funds. Applicants should send a vita, 3 letters of recommendation, and a statement of teaching and research interests to: Jerry Rudy, Chairperson, Behavioral Neuroscience Search Committee, Department of Psychology, Box 345, University of Colorado, Boulder, CO 80309. Application deadline is January 15, 1990.

From mv10801 at uc.msc.umn.edu Fri Dec 29 10:42:34 1989 From: mv10801 at uc.msc.umn.edu (mv10801@uc.msc.umn.edu) Date: Fri, 29 Dec 89 09:42:34 CST Subject: Symmetrizing weights Message-ID: <8912291542.AA20346@uc.msc.umn.edu> On a related note, I described in a 1988 tech report how asymmetric lateral connections in a self-organizing neural network can symmetrize by using an inhibitory learning rule. The paper is called "Self-Organizing Neural Networks for Perception of Visual Motion," by J. A. Marshall.
A more concise version will appear in the next issue of Neural Networks (January 1990). To obtain the TR, you can write to the Dept. Secretary, Boston Univ. Computer Science Dept., 111 Cummington St., Boston, MA 02215, U.S.A., [pam at cs.bu.edu], and ask for CS-TR-88-010; the price is $7.00.
--Jonathan A. Marshall (mv10801 at uc.msc.umn.edu)
Center for Research in Learning, Perception, and Cognition
205 Elliott Hall, University of Minnesota
Minneapolis, MN 55455, U.S.A.
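
The Allen & Alspector, Galland & Hinton, and Marshall messages above all touch on the same observation: learning in a network with asymmetric connections tends to symmetrize the weights. As a toy illustration of one way this can happen (this sketch is not the analysis used in any of those TRs; the update rule and the constants $\eta$ and $\lambda$ are assumptions chosen only for simplicity), consider a learning step whose weight increment is symmetric in the two units it connects, combined with weight decay:
\[
W(t+1) = (1-\lambda)\,W(t) + \eta\, s(t)\,s(t)^{\mathsf T}, \qquad 0 < \lambda < 1,
\]
where $s(t)$ is the vector of unit activities at step $t$. Splitting $W$ into a symmetric part $S = \tfrac{1}{2}(W + W^{\mathsf T})$ and an antisymmetric part $A = \tfrac{1}{2}(W - W^{\mathsf T})$, the Hebbian increment $\eta\, s s^{\mathsf T}$ is symmetric, so only the decay term acts on $A$:
\[
A(t+1) = (1-\lambda)\,A(t) \quad\Longrightarrow\quad A(t) = (1-\lambda)^{t}\,A(0) \;\to\; 0 .
\]
Any initial asymmetry therefore decays geometrically while the learned, symmetric structure accumulates in $S$. Contrastive (Boltzmann-style) increments are likewise symmetric in the two units they connect, which is the ingredient at work here; the specific rules and networks analyzed in the TRs are not reproduced by this sketch.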
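
To make the new neuroprose posting protocol described by Jordan Pollack above concrete, here is a sketch of the kind of ftp session it implies, modeled on the retrieval session in the Gasser message; the file name yourname.report.ps is only a placeholder, and the anonymous-ftp password is assumed to follow the retrieval example:
unix> compress yourname.report.ps
unix> ftp cheops.cis.ohio-state.edu   # (or ftp 128.146.8.62)
Name (cheops.cis.ohio-state.edu:): anonymous
Password (cheops.cis.ohio-state.edu:anonymous): neuron
ftp> cd pub/neuroprose/Inbox
ftp> type binary
ftp> put yourname.report.ps.Z
ftp> quit
After the transfer, send email to pollack at cis.ohio-state.edu so that the file can be moved from Inbox into pub/neuroprose and acknowledged.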