From tenorio at ee.ecn.purdue.edu Mon Jul 2 12:06:05 1990 From: tenorio at ee.ecn.purdue.edu (Manoel Fernando Tenorio) Date: Mon, 02 Jul 90 11:06:05 EST Subject: Report on Non Linear Optimization Message-ID: <9007021606.AA14753@ee.ecn.purdue.edu>

This report is now available from Purdue University. There is a fee for overseas hardcopies. An electronic copy will soon be available for ftp at the Ohio database. Please send your request to Jerry Dixon (jld at ee.ecn.purdue.edu). Do not reply to this message.

COMPUTATIONAL PROPERTIES OF GENERALIZED HOPFIELD NETWORKS APPLIED TO NONLINEAR OPTIMIZATION

Athanasios G. Tsirukis and Gintaras V. Reklaitis, School of Chemical Engineering
Manoel F. Tenorio, School of Electrical Engineering

Technical Report TREE 89-69
Parallel Distributed Structures Laboratory
School of Electrical Engineering
Purdue University

ABSTRACT

A nonlinear neural framework, called the Generalized Hopfield Network (GHN), is proposed, which can solve systems of nonlinear equations in a parallel, distributed manner. The method is applied to the general optimization problem. We demonstrate GHNs implementing the three most important optimization algorithms, namely the Augmented Lagrangian, Generalized Reduced Gradient, and Successive Quadratic Programming methods. The study results in a dynamic view of the optimization problem and offers a straightforward model for parallelizing the optimization computations, significantly extending the practical limits of problems that can be formulated as optimization problems and that can gain from the introduction of nonlinearities in their structure (e.g., pattern recognition, supervised learning, design of content-addressable memories).
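[Editor's note: the penalty/gradient-flow idea behind Hopfield-style optimization networks can be illustrated with a minimal sketch. This is a generic quadratic-penalty gradient flow, not the report's actual GHN formulation; the objective, constraint, penalty weight, and step size below are all invented for illustration.]

```python
# Minimal sketch of Hopfield-style gradient dynamics for constrained
# optimization: integrate dx/dt = -grad E(x), where the "energy"
# E = f(x) + (c/2) * g(x)^2 combines the objective with a quadratic
# constraint penalty. NOT the report's GHN; toy problem only.

def grad_E(x1, x2, c=100.0):
    # f(x) = (x1-1)^2 + (x2-2)^2, constraint g(x) = x1 + x2 - 2 = 0
    g = x1 + x2 - 2.0
    return (2.0 * (x1 - 1.0) + c * g,
            2.0 * (x2 - 2.0) + c * g)

def settle(steps=20000, eta=0.005):
    # forward-Euler integration of the gradient flow from the origin
    x1, x2 = 0.0, 0.0
    for _ in range(steps):
        d1, d2 = grad_E(x1, x2)
        x1, x2 = x1 - eta * d1, x2 - eta * d2
    return x1, x2
```

The flow settles near the constrained minimum (0.5, 1.5); a fixed penalty weight c enforces the constraint only approximately, which is one motivation for the augmented-Lagrangian variant the abstract mentions.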
From POSTMAST at idui1.uidaho.edu Mon Jul 2 11:22:05 1990 From: POSTMAST at idui1.uidaho.edu (Marty Zimmerman) Date: Mon, 02 Jul 90 11:22:05 PAC Subject: No subject Message-ID: ========================================================================= From Marty Mon Jul 2 11:12:23 1990 From: Marty (208) Date: 2 July 1990, 11:12:23 PAC Subject: No subject Message-ID:

Greetings: We have recently removed one of our BITNET nodes here at the University of Idaho: IDCSVAX. Currently, your electronic digest is being sent to userid KULAS at IDCSVAX. Please change his address to kulas at ted.cs.uidaho.edu as soon as possible. Thank you. Marty Zimmerman, U of Idaho Postmaster

From kruschke at ucs.indiana.edu Mon Jul 2 16:26:00 1990 From: kruschke at ucs.indiana.edu (KRUSCHKE,JOHN,PSY) Date: 2 Jul 90 15:26:00 EST Subject: roommate wanted for cog sci conference Message-ID:

I'm looking for a roommate for the Cognitive Science Society Conference, July 25-28 (inclusive). I'll be staying at the Hyatt Regency. Your half of the cost would be $57.50 x 4 nights = $230, plus your share of any taxes. (I've checked rates at other places near MIT, and they're all much higher.) I don't smoke and I don't snore; that's all I ask of a roommate.

-------------------------------------------------------------------------
John K. Kruschke, Asst. Prof. of Psych. & Cog. Sci.
Dept. of Psychology          kruschke at ucs.indiana.edu
Indiana University           kruschke at iubacs.bitnet
Bloomington, IN 47405-4201 USA    (812) 855-3192
-------------------------------------------------------------------------

From het at seiden.psych.mcgill.ca Tue Jul 3 15:41:55 1990 From: het at seiden.psych.mcgill.ca (Phil Hetherington) Date: Tue, 3 Jul 90 15:41:55 EDT Subject: cog sci accommodation info Message-ID: <9007031941.AA15061@seiden.psych.mcgill.ca.>

I would like any information on cheap accommodation in Boston for the Cognitive Science Conference. I'm looking for something on a bus or subway line so that I can commute.
Does anyone know of a cheap hotel/motel/bed-and-breakfast place where I can crash? There are four of us looking. Please send to: hynie at seiden.psych.mcgill.ca Thanks.

From PU6MI6Q5%CINE88.CINECA.IT at vma.CC.CMU.EDU Tue Jul 3 16:33:00 1990 From: PU6MI6Q5%CINE88.CINECA.IT at vma.CC.CMU.EDU (PU6MI6Q5%CINE88.CINECA.IT@vma.CC.CMU.EDU) Date: Tue, 3 JUL 90 16:33 N Subject: e-address change Message-ID: <2569@CINE88.CINECA.IT>

Please delete this account from the mailing list. In addition, please add "pu6mi6q2 at icineca2.bitnet" as a new recipient. All attempts to send this to an administrator have resulted in returned e-mail. -Joe Giampapa pu6mi6q5 at icineca2.bitnet

From POPX at vax.oxford.ac.uk Thu Jul 5 06:07:04 1990 From: POPX at vax.oxford.ac.uk (POPX@vax.oxford.ac.uk) Date: Thu, 5 JUL 90 10:07:04 GMT Subject: No subject Message-ID:

FROM: Jocelyn Paine, Department of Experimental Psychology, South Parks Road, Oxford OX1 3UD. JANET: POPX @ UK.AC.OX.VAX Phone: (0865) 271444 - messages. (0865) 271339 - direct.

*******************************************
*                                         *
*   POPLOG USERS' GROUP CONFERENCE 1990   *
*                                         *
*            JULY 17TH - 18TH             *
*                                         *
*                 OXFORD                  *
*                                         *
*******************************************

Why am I posting news about a Poplog conference to the neural net digests? After all, Poplog is an implementation of Pop-11, Prolog, Lisp, and ML - all very conventional and symbolic AI languages. Well, following from work done by David Young at Sussex, you can now buy from Integral Solutions Limited (Poplog's commercial distributors) the "Poplog Neural" package. This allows you to design neural nets of various kinds; display them graphically using Poplog's windowing system; build fast production versions; and integrate what you've designed with existing code written in Pop-11, Prolog, Lisp, or ML. So if you need to build a mixed net/symbolic program, Poplog is well worth considering.
And you get the convenience of a rather nice development environment for your nets; plus the four languages I've mentioned, a built-in editor, a window manager, and the object-oriented "Flavours" package. If you want to find out more about Poplog Neural, and Poplog in general, this year's User Group conference, PLUG90, is the place to do it.

We still have places left at PLUG90, and can accept bookings if made quickly. The conference will be held in Oxford on the 17th and 18th of July; accommodation is provided in Keble College, and the talks themselves will be in Experimental Psychology. Registration will open at 11 am on the 17th, with the conference proper beginning at 2; it will close at about 4 on the 18th. There will be a rather good conference dinner on the night of the 17th (main course: duck in lime and ginger sauce). The price is £75 to members of PLUG and £95 to non-members (£15 non-residential without dinner; £37 non-residential with dinner).

Integral Solutions Limited, which distributes Poplog commercially, has generously paid for three free places. These will be offered to academic members of PLUG who have not attended a PLUG conference before, and who have difficulty raising funds. All three are still available.

~~~~~

This is the provisional list of talks:

"Poplog Neural", Colin Shearer, Integral Solutions Ltd. (30 mins)
A demonstration of ISL's new neural-networking system. It's implemented in Poplog, does its number crunching in Fortran, and allows you to build and test nets by drawing on a window. Fully interfaceable with Pop-11, Prolog, Lisp and ML.

"Pop9X - The Standard", Steve Knight, Hewlett-Packard Labs. (1 hour)
Gives a review of the BSI standardisation process and the progress of the Pop standard - the language YOU will be writing soon (ish).

"Assembly code translation in Prolog", Ian O'Neill, Program Validation. (30 mins)
State-of-the-art assembly language translation in Prolog.
"THESEUS: a production-system simulation of the spinning behaviour of an orb-web spider", Nick Gotts, Oxford University Zoology Department. (30 mins)
As well as giving a demo, Nick will talk about his experiences of using AlphaPop: Theseus runs on a Macintosh.

"MODEL: From Package to Language", James Anderson, Reading University. (1 hour)
A six-year-old software package for model-based vision goes up in flames under the heat of self-criticism - to be replaced by a language. TALK INCLUDES VIDEO PRESENTATION

"GRIT - General Real-time Interactive Ikbs Toolset", Mark Swabey, Avonicom. (30 mins)
Mark will talk about GRIT, but also about his experiences (nice and otherwise) of Poplog as a development environment.

"IVE - Interactive Visual Environment", Anthony Worrall, Reading University. (30 mins)
The software environment that beat the pants off MODEL (based on MODEL and PWM and quite a lot of other things besides). TALK INCLUDES VIDEO PRESENTATION

"Embedded Systems", Rob Zancanato, Cambridge Consultants. (30 mins)
Poplog for embedded systems - especially MUSE for designing real-time controllers. We hope to have a demo of one such self-contained system.

"Doing representation theory in Prolog", John Fitzgerald, Oxford University Maths Department. (30 mins)
Representation theory is part of the study of (mathematical) groups. Prolog copes surprisingly well with such a geometric topic.

"Building User Interfaces with Flavours", Chris Price, Department of Computer Science, University College of Wales. (30 mins)
Object-oriented user-interface design, using Poplog's OO flavours package and window manager.

"TPM - a graphical Prolog debugger", Dick Broughton, Expert Systems Limited. (1 hour)
Dick will show how a debugger should be designed: with TPM, you can display Prolog proof trees as trees, rewind and fast-forward execution, zoom in and out, watch the "cut" prune branches, and generally do everything you can't do with 'spy'.
"Processing of Road Accident Data", Jiashu Wu, UCL Transport Studies. (30 mins)
UCL use Poplog for an EMYCIN-based expert system which advises on accident blackspots, taking 'raw' accident data from incident reports. They like Poplog because it's an "open system": its jobs include fuzzy matching, stats, and handling very big databases.

"Faust - an online fault-diagnosis system", David Cockburn, Electricity Research and Development Centre. (30 mins) (to be confirmed)

"Design for testability", Lawrence Smith, SD Scicon. (30 mins) (to be confirmed)
A system for advising the users of CAD packages on loopholes in testability.

Something on the future of Poplog, Integral Solutions. (1 hour) (details awaited)

~~~~~

And this is the provisional timetable. Accommodation is provided in Keble College, Parks Road, Oxford. Luggage can be left there from mid-day on the 17th. The conference itself will be in the Department of Experimental Psychology, South Parks Road.

July 17th
---------
Registration and coffee: 11:00 - 12:30
Lunch: 12:30 - 2:00
Talks: 2:00 - 3:30
Tea: 3:30 - 4:00
Talks: 4:00 - 6:00
(Depart for Keble). Keble bar opens from 6 to 11 pm. Dinner starts at 7.

July 18th
---------

From mjolsness-eric at CS.YALE.EDU Thu Jul 5 14:39:11 1990 From: mjolsness-eric at CS.YALE.EDU (Eric Mjolsness) Date: Thu, 5 Jul 90 14:39:11 EDT Subject: new TR: development Message-ID: <9007051839.AA06131@NEBULA.SYSTEMSZ.CS.YALE.EDU>

***** Please do not distribute to other lists *****

The following technical report is now available: "A Connectionist Model of Development" by Eric Mjolsness (1), David H. Sharp (2), and John Reinitz (3). (1) Department of Computer Science, Yale University; (2) Theoretical Division, Los Alamos National Laboratory; (3) Department of Biological Sciences, Columbia University.

Abstract: We present a phenomenological modeling framework for development.
Our purpose is to provide a systematic method for discovering and expressing correlations in experimental data on gene expression and other developmental processes. The modeling framework is based on a connectionist or ``neural net'' dynamics for biochemical regulators, coupled to ``grammatical rules'' which describe certain features of the birth, growth, and death of cells, synapses and other biological entities. We outline how spatial geometry can be included, although this part of the model is not complete. As an example of the application of our results to a specific biological system, we show in detail how to derive a rigorously testable model of the network of segmentation genes operating in the blastoderm of Drosophila. To further illustrate our methods, we sketch how they could be applied to two other important developmental processes: cell cycle control and cell-cell induction. We also present a simple biochemical model leading to our assumed connectionist dynamics which shows that the dynamics used is at least compatible with known chemical mechanisms.

For copies of YALEU/DCS/RR-796, please send email to shanahan-mary at cs.yale.edu or physical mail to Eric Mjolsness, Yale Computer Science Department, P.O. Box 2158 Yale Station, New Haven CT 06520-2158.

-------

From LAUTRUP at nbivax.nbi.DK Mon Jul 9 04:33:00 1990 From: LAUTRUP at nbivax.nbi.DK (Benny Lautrup) Date: Mon, 9 Jul 90 10:33 +0200 (NBI, Copenhagen) Subject: Protein structure and neural nets Message-ID: <6970A295C1BFE0C4B0@nbivax.nbi.dk>

Re: New paper on neural nets and protein structure

Title: ``Analysis of the Secondary Structure of the Human Immunodeficiency Virus (HIV) Proteins p17, gp120, and gp41 by Computer Modeling Based on Neural Network Methods'', H. Andreassen, H. Bohr, J. Bohr, S. Brunak, T. Bugge, R.M.J. Cotterill, C. Jacobsen, P. Kusk, B. Lautrup, S.B. Petersen, T. Saermark, and K. Ulrich, in Journal of Acquired Immune Deficiency Syndromes (AIDS), vol. 3, 615-622, 1990.
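[Editor's note: papers in this line typically predict each residue's secondary-structure class from a fixed window of sequence around it, one-hot encoded as network input. The exact architecture of the paper above is not given here; the 13-residue window and the encoding below are illustrative assumptions in the style of the window-based approach.]

```python
# Hedged sketch of the sliding-window input encoding commonly used for
# secondary-structure prediction by neural nets. Window size (13) is an
# assumed, illustrative choice, not taken from the paper above.

AMINO = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard amino acids

def encode_window(seq, center, half=6):
    """One-hot encode the 2*half+1 residues centred on `center`.
    Positions that fall off either end of the sequence encode as
    all-zeros, so every window has the same fixed length."""
    vec = []
    for i in range(center - half, center + half + 1):
        one_hot = [0] * len(AMINO)
        if 0 <= i < len(seq):
            one_hot[AMINO.index(seq[i])] = 1
        vec.extend(one_hot)
    return vec
```

Each window then becomes one fixed-length input vector (13 x 20 = 260 units here), with the network trained to output the structure class (helix, sheet, coil) of the central residue.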
From sun at umiacs.UMD.EDU Mon Jul 9 12:07:46 1990 From: sun at umiacs.UMD.EDU (Guo-Zheng Sun) Date: Mon, 9 Jul 90 12:07:46 -0400 Subject: Protein structure and neural nets Message-ID: <9007091607.AA15551@neudec.umiacs.UMD.EDU>

Dr. Lautrup: Could you please send me a copy of your paper about protein structure analysis using neural networks? My address is Guo-Zheng Sun, UMIACS, University of Maryland, College Park, MD 20742. Tel. (301) 454-1984. Thank you in advance.

From Atul.Chhabra at UC.EDU Mon Jul 9 19:59:08 1990 From: Atul.Chhabra at UC.EDU (atul k chhabra) Date: Mon, 9 Jul 90 19:59:08 EDT Subject: Character Recognition Bibliography? Message-ID: <9007092359.AA11073@uceng.UC.EDU>

I am looking for a character recognition bibliography. I am interested in all aspects of character recognition, i.e., preprocessing and segmentation, OCR, typewritten character recognition, handwritten character recognition, neural network based recognition, statistical and syntactic recognition, hardware implementations, and commercial character recognition systems. If someone out there has such a bibliography, or something that fits a part of the above description, I would appreciate receiving a copy. Even if you know of only a few references, please email them. Please email the references or bibliography to me. I will summarize on the vision-list. Thanks, Atul Chhabra, Department of Electrical & Computer Engineering, University of Cincinnati, ML 030, Cincinnati, OH 45221-0030

From jmerelo at ugr.es Sat Jul 7 06:21:00 1990 From: jmerelo at ugr.es (JJ Merelo) Date: 7 Jul 90 12:21 +0200 Subject: Parameters in Kohonen's Feature Map Message-ID: <69*jmerelo@ugr.es>

I have already implemented, in C for Sun machines, a Kohonen Feature Map, but I have found in the literature no way to change the two parameters, k1 and k2, that appear in the alpha formula. Does anybody know how to compute them? Or is it just an empirical computation?
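[Editor's note: for what it's worth, constants like k1 and k2 in SOM learning-rate schedules are normally set empirically rather than computed. One common hyperbolic choice, alpha(t) = k1 / (k2 + t), can be re-parameterized by the initial rate and the run length. The constants below are illustrative assumptions, not Kohonen's prescription.]

```python
# Hedged sketch: a hyperbolic SOM learning-rate schedule
# alpha(t) = k1 / (k2 + t), with k1, k2 derived from the desired
# initial rate alpha0 and the planned number of training steps.
# These defaults are illustrative, not canonical values.

def make_alpha(alpha0=0.9, n_steps=10000):
    k2 = n_steps / 10.0   # controls how quickly the decay sets in
    k1 = alpha0 * k2      # chosen so that alpha(0) == alpha0
    return lambda t: k1 / (k2 + t)
```

With these choices alpha(0) equals alpha0 and the rate has dropped by roughly an order of magnitude by the end of training. In common practice the neighborhood radius is scheduled alongside alpha and scaled with the map's side length, which also bears on the map-size question below.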
JJ
==================

From jmerelo at ugr.es Fri Jul 6 06:49:00 1990 From: jmerelo at ugr.es (JJ Merelo) Date: 6 Jul 90 12:49 +0200 Subject: Kohonen's network Message-ID: <65*jmerelo@ugr.es>

I am working on Kohonen's network. I was previously using a 15-input, 8x8 output layer network, with the parameters stated in Kohonen's 1984 paper (also in Aleksander's book). I switched to a 16-input, 9x9 output layer, with the same parameters, and everything got screwed up. Should I use the same parameters? How should they be changed? What do they depend on? Please, somebody answer, or I'll cast my Sun workstation through the window.

JJ
==================

From cpd at aic.hrl.hac.com Tue Jul 10 13:52:42 1990 From: cpd at aic.hrl.hac.com (cpd@aic.hrl.hac.com) Date: Tue, 10 Jul 90 10:52:42 -0700 Subject: Call for Participation in Connectionist Natural Language Processing Message-ID: <9007101752.AA04818@aic.hrl.hac.com>

AAAI Spring Symposium
Connectionist Natural Language Processing

Recent results have led some researchers to propose that connectionism is an alternative to AI/linguistic approaches to natural language processing, both as a cognitive model and for practical applications.
This symposium will bring together both critics and proponents of connectionist NLP to discuss its strengths and weaknesses. It will cover a number of areas, spanning from new phonology models to connectionist treatments of anaphora and discourse issues. Participants should address what is new that connectionism brings to the study of language. The purpose of the symposium is to examine this issue from a range of perspectives including: Spoken language understanding/generation Parsing Semantics Pragmatics Language acquisition Linguistic and representational capacity issues Applications Some of the questions expected to be addressed include: What mechanisms/representations from AI/Linguistics are necessary for connectionist NLP? Why? Can connectionism help integrate signal processing with knowledge of language? What does connectionism add to other theories of semantics? Do connectionist theories have implications for psycholinguistics? Prospective participants are encouraged to contact a member of the program committee to obtain a more detailed description of the symposium's goals and issues. Those interested in participating in this symposium are asked to submit a 1-2 page position paper abstract and a list of relevant publications. Abstracts of work in progress are encouraged, and potential participants may also include 3 copies of a full-length paper describing previous work. Submitted papers or abstracts will be included in the symposium working notes, and participants will be asked to participate in panel discussions. Three (3) copies of each submission should be sent to arrive by November 16, 1990 to: Charles Dolan, Hughes Research Laboratories, RL96, 3011 Malibu Canyon Road, Malibu CA, 90265 All submissions will be promptly acknowledged. E-Mail inquiries may be sent to: cpd at aic.hrl.hac.com Program Committee: Robert Allen, Charles Dolan (chair), James McClelland, Peter Norvig, and Jordan Pollack.
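An editorial aside on the Kohonen gain-parameter question in JJ Merelo's posting above (the k1 and k2 in the alpha formula): these constants are generally set empirically. The sketch below shows one common choice, a hyperbolic gain schedule alpha(t) = k1 / (1 + t/k2) together with a linearly shrinking Gaussian neighbourhood. The function name, the schedule itself, and every parameter value here are illustrative assumptions, not Kohonen's own prescription.

```python
import numpy as np

def train_som(data, grid=(8, 8), epochs=2000, k1=0.9, k2=200.0, seed=0):
    """Minimal Kohonen feature map (illustrative sketch).

    alpha(t) = k1 / (1 + t / k2) is one common hyperbolic gain
    schedule; k1 (initial gain) and k2 (decay constant) are empirical
    and usually need retuning when the map size or epoch count changes.
    """
    rng = np.random.default_rng(seed)
    rows, cols = grid
    dim = data.shape[1]
    # Random initial codebook vectors spanning the data range.
    w = rng.uniform(data.min(), data.max(), size=(rows, cols, dim))
    # Grid coordinates of the units, for neighbourhood distances.
    yy, xx = np.mgrid[0:rows, 0:cols]
    coords = np.stack([yy, xx], axis=-1).astype(float)
    sigma0 = max(rows, cols) / 2.0  # start radius ~ half the grid
    for t in range(epochs):
        x = data[rng.integers(len(data))]
        alpha = k1 / (1.0 + t / k2)                     # decaying gain
        sigma = max(sigma0 * (1.0 - t / epochs), 0.5)   # shrinking radius
        # Best-matching unit: closest codebook vector to the sample.
        d2 = ((w - x) ** 2).sum(axis=-1)
        bmu = np.unravel_index(np.argmin(d2), d2.shape)
        # Gaussian neighbourhood centred on the BMU.
        g2 = ((coords - coords[bmu]) ** 2).sum(axis=-1)
        h = np.exp(-g2 / (2.0 * sigma ** 2))
        # Move the BMU and its neighbours toward the sample.
        w += alpha * h[..., None] * (x - w)
    return w
```

One practical observation relevant to the 8x8 versus 9x9 question: the neighbourhood radius is usually started at roughly half the grid diameter and the number of training steps scaled with the number of units, so gain parameters tuned for one map size cannot in general be reused unchanged on another.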
From Atul.Chhabra at UC.EDU Tue Jul 10 15:44:32 1990 From: Atul.Chhabra at UC.EDU (atul k chhabra) Date: Tue, 10 Jul 90 15:44:32 EDT Subject: Character Recognition Bibliography? Message-ID: <9007101944.AA07725@uceng.UC.EDU> > Please email the references or bibliography to me. I will summarize on the > vision-list. Thanks, ^^^^^^^^^^^ I posted the request for a character recognition bibliography to connectionists at cs.cmu.edu, vision-list at ads.com, and to the usenet newsgroup comp.ai.neural-nets. I will post a summary of the responses on all of these mailing lists/newsgroups. Although I prefer the BibTeX format, any format will do. Atul Chhabra Department of Electrical & Computer Engineering University of Cincinnati, ML 030 Cincinnati, OH 45221-0030 From cpd at aic.hrl.hac.com Tue Jul 10 19:52:56 1990 From: cpd at aic.hrl.hac.com (cpd@aic.hrl.hac.com) Date: Tue, 10 Jul 90 16:52:56 -0700 Subject: Apologies on Call for Participation Message-ID: <9007102352.AA05988@aic.hrl.hac.com> My previous message left out some information. AAAI Spring Symposium March 26-28 Stanford University Connectionist Natural Language Processing ... from previous message... All submissions will be promptly acknowledged and registration materials will be sent to authors who submit abstracts. The Spring Symposium Series is an annual event of AAAI. Members of AAAI will receive a mailing from them. Attendance is by submission and acceptance of materials only, except that AAAI will fill available spaces if the program committee does not select enough people ("enough" being 30-40). No proceedings will be published, but working papers, with submissions from attendees, will be distributed at the Symposium.
Sorry for the momentary confusion -Charlie Dolan From tsejnowski at UCSD.EDU Fri Jul 13 21:06:21 1990 From: tsejnowski at UCSD.EDU (Terry Sejnowski) Date: Fri, 13 Jul 90 18:06:21 PDT Subject: NEURAL COMPUTATION 2:2 Message-ID: <9007140106.AA07155@sdbio2.UCSD.EDU> NEURAL COMPUTATION Table of Contents -- Volume 2:2 Visual Perception of Three-Dimensional Motion David J. Heeger and Allan Jepson Distributed Symbolic Representation of Visual Shape Eric Saund Modeling Orientation Discrimination at Multiple Reference Orientations with a Neural Network M. Devos and G. A. Orban Temporal Differentiation and Violation of Time-Reversal Invariance in Neurocomputation of Visual Information D. S. Tang and V. Menon Analysis of Linsker's Simulations of Hebbian Rules David J. C. MacKay and Kenneth D. Miller Applying Temporal Difference Methods to Fully Recurrent Reinforcement Learning Networks Jurgen Schmidhuber Generalizing Smoothness Constraints from Discrete Samples Chuanyi Ji, Robert R. Snapp, and Demetri Psaltis The Upstart Algorithm: A Method for Constructing and Training Feedforward Neural Networks Marcus Frean Layered Neural Networks with Gaussian Hidden Units As Universal Approximations Eric Hartman, James D. Keeler, and Jacek M. Kowalski A Neural Network for Nonlinear Bayesian Estimation in Drug Therapy Reza Shadmehr and David D'Argenio Analysis of Neural Networks with Redundancy Yoshio Izui and Alex Pentland Stability of the Random Neural Network Model Erol Gelenbe The Perceptron Algorithm Is Fast for Nonmalicious Distributions Eric B. Baum SUBSCRIPTIONS: Volume 2 ______ $35 Student ______ $50 Individual ______ $100 Institution Add $12 for surface mail postage outside the USA and Canada. Add $18 for air mail. (Back issues of volume 1 are available for $25 each.) MIT Press Journals, 55 Hayward Street, Cambridge, MA 02142. (617) 253-2889.
----- From jmerelo at ugr.es Sun Jul 15 13:33:00 1990 From: jmerelo at ugr.es (JJ Merelo) Date: 15 Jul 90 19:33 +0200 Subject: Visit to Poland Message-ID: <111*jmerelo@ugr.es> I am a professor and PhD student working on the subject of neural networks for speech recognition. I am going to make a tourist visit to Poland on August 12, and I would like to meet anybody working on the same subject. If anybody is interested, please e-mail to my address so that we can arrange a meeting. Thanks JJ MERELO JMERELO at UGR.ES ================== From jmerelo at ugr.es Sun Jul 15 13:33:00 1990 From: jmerelo at ugr.es (JJ Merelo) Date: 15 Jul 90 19:33 +0200 Subject: Anybody in Poland working in NN Message-ID: <95*jmerelo@ugr.es> I am going to Poland very soon on a tourist visit, and I would like to contact anybody working on neural networks. If there is anybody in Cracow or Warsaw working on any aspect of Neural Networks, please e-mail to JJ Merelo JMERELO at UGR.ES giving me his/her name, address and e-mail address, and we will try to meet when I go to his/her city. The visit will be in the 3rd week of August. From fozzard at boulder.Colorado.EDU Tue Jul 17 14:19:01 1990 From: fozzard at boulder.Colorado.EDU (Richard Fozzard) Date: Tue, 17 Jul 90 12:19:01 -0600 Subject: Networks for pattern recognition problems? Message-ID: <9007171819.AA06599@alumni.colorado.edu> Do you know of any references to work done using connectionist (neural) networks for pattern recognition problems? I am particularly interested in problems where the network was shown to outperform traditional algorithms. I am working on a presentation to NOAA (National Oceanic and Atmospheric Admin.) management that partially involves pattern recognition and am trying to argue against the statement: "...results thus far [w/ networks] have not been notably more impressive than with more traditional pattern recognition techniques".
I have always felt that pattern recognition is one of the strengths of connectionist network approaches over other techniques and would like some references to back this up. thanks much, rich ======================================================================== Richard Fozzard "Serendipity empowers" Univ of Colorado/CIRES/NOAA R/E/FS 325 Broadway, Boulder, CO 80303 fozzard at boulder.colorado.edu (303)497-6011 or 444-3168 From ahmad at ICSI.Berkeley.EDU Tue Jul 17 16:30:06 1990 From: ahmad at ICSI.Berkeley.EDU (Subutai Ahmad) Date: Tue, 17 Jul 90 13:30:06 PDT Subject: New tech report Message-ID: <9007172030.AA02134@icsib18.Berkeley.EDU> The following technical report is available: A Network for Extracting the Locations of Point Clusters Using Selective Attention by Subutai Ahmad & Stephen Omohundro International Computer Science Institute ICSI Technical Report #90-011 Abstract This report explores the problem of dynamically computing visual relations in connectionist systems. It concentrates on the task of learning whether three clumps of points in a 256x256 image form an equilateral triangle. We argue that feed-forward networks for solving this task would not scale well to images of this size. One reason for this is that local information does not contribute to the solution: it is necessary to compute relational information such as the distances between points. Our solution implements a mechanism for dynamically extracting the locations of the point clusters. It consists of an efficient focus of attention mechanism and a cluster detection scheme. The focus of attention mechanism allows the system to select any circular portion of the image in constant time. The cluster detector directs the focus of attention to clusters in the image. These two mechanisms are used to sequentially extract the relevant coordinates. With this new representation (locations of the points) very few training examples are required to learn the correct function. 
The resulting network is also very compact: the number of required weights is proportional to the number of input pixels. Copies can be obtained in one of two ways: 1) ftp a postscript copy from cheops.cis.ohio-state.edu. The file is ahmad.tr90-11.ps.Z in the pub/neuroprose directory. You can either use the Getps script or follow these steps: unix:2> ftp cheops.cis.ohio-state.edu Connected to cheops.cis.ohio-state.edu. Name (cheops.cis.ohio-state.edu:): anonymous 331 Guest login ok, send ident as password. Password: neuron 230 Guest login ok, access restrictions apply. ftp> cd pub/neuroprose ftp> binary ftp> get ahmad.tr90-11.ps.Z ftp> quit unix:4> uncompress ahmad.tr90-11.ps.Z unix:5> lpr ahmad.tr90-11.ps 2) Order a hard copy from ICSI: The cost is $1.75 per copy for postage and handling. Please enclose your check with the order. Charges will be waived for ICSI sponsors and for institutions that have exchange agreements with the Institute. Make checks payable (in U.S. Dollars only) to "ICSI" and send to: International Computer Science Institute 1947 Center Street, Suite 600 Berkeley, CA 94704 Be sure to mention TR 90-011 and include your physical address. For more information, send e-mail to: info at icsi.berkeley.edu --Subutai Ahmad ahmad at icsi.berkeley.edu From MJ_CARTE%UNHH.BITNET at vma.CC.CMU.EDU Tue Jul 17 17:53:00 1990 From: MJ_CARTE%UNHH.BITNET at vma.CC.CMU.EDU (MJ_CARTE%UNHH.BITNET@vma.CC.CMU.EDU) Date: Tue, 17 Jul 90 16:53 EST Subject: Tech Report Available Message-ID: ***** Please Do Not Distribute to Other Bulletin Boards ***** ***** Please Do Not Distribute to Other Bulletin Boards ***** The following technical report may be requested in hardcopy by sending e-mail to e_mcglauflin at unhh.bitnet and specifying UNH Intelligent Structures Group Report ECE.IS.90.03, or by sending physical mail to Michael J. Carter Dept. 
of Electrical and Computer Engineering University of New Hampshire Durham, NH 03824-3591 Abstract follows: "Slow Learning in CMAC Networks and Implications for Fault Tolerance" by M.J. Carter, A.J. Nucci, E. An, W.T. Miller, and F.J. Rudolph Intelligent Structures Group Dept. of Electrical and Computer Engineering University of New Hampshire Durham, NH 03824 The overlapping structure of receptive fields in the CMAC network can produce situations in which learning is unusually slow. It is shown that sinusoidal functions with spatial frequency near certain critical frequencies are particularly hard to learn with conventional CMAC networks. Moreover, the resulting learned weights have significantly greater RMS value near these critical frequencies, and this poses some concern for network fault tolerance. It is then demonstrated that CMAC networks using tapered receptive fields often exhibit faster learning near the critical frequencies, and the resulting learned weights can have smaller RMS value than those obtained using the conventional CMAC. **** Please Do Not Distribute to Other Bulletin Boards **** Mike Carter From elman at amos.ucsd.edu Fri Jul 20 19:09:46 1990 From: elman at amos.ucsd.edu (Jeff Elman) Date: Fri, 20 Jul 90 16:09:46 PDT Subject: TR announcement: 'Learning & evolution in neural networks' Message-ID: <9007202309.AA05366@amos.ucsd.edu> Learning and Evolution in Neural Networks by Stefano Nolfi Jeffrey L. Elman Domenico Parisi CRL-TR-9019, July 1990 Center for Research in Language University of California, San Diego La Jolla, CA 92093-0126 ABSTRACT In this report we present the results of a series of simulations in which neural networks undergo change as a result of two forces: learning during the "lifetime" of a network, and evolutionary change over the course of several "generations" of networks. The results demonstrate how complex and apparently purposeful behavior can arise from random variation in networks.
We believe that these results provide a good starting basis for modeling the more complex phenomena observed in biological systems. A more specific problem for which our results may be relevant is determining the role of behavior in evolution (Plotkin, 1988); that is, how learning at the individual level can have an influence on evolution at the population level within a strictly Darwinian--not Lamarckian--framework. -------------------------------------- Copies of this technical report may be requested by email, by sending a letter to crl at amos.ucsd.edu and requesting TR-9019. From Atul.Chhabra at UC.EDU Sat Jul 21 14:06:31 1990 From: Atul.Chhabra at UC.EDU (atul k chhabra) Date: Sat, 21 Jul 90 14:06:31 EDT Subject: (Summary) Re: Character Recognition Bibliography? Message-ID: <9007211806.AA21705@uceng.UC.EDU> Here is a summary of what I received in response to my request for references on character recognition. I had asked for references in all aspects of character recognition -- preprocessing and segmentation, OCR, typewritten character recognition, handwritten character recognition, neural network based recognition, statistical and syntactic recognition, hardware implementations, and commercial character recognition systems. THANKS TO ALL WHO RESPONDED. IF ANYONE OUT THERE HAS MORE REFERENCES, PLEASE EMAIL ME. I WILL SUMMARIZE NEW RESPONSES AFTER ANOTHER TWO WEEKS. THANKS. Atul Chhabra Department of Electrical & Computer Engineering University of Cincinnati, ML 030 Cincinnati, OH 45221-0030 Phone: (513)556-6297 Email: achhabra at uceng.uc.edu ---------------------------------------------------------- From: Sol Sol Delgado Instituto de Automatica Industrial La Poveda Arganda del Rey 28500 MADRID SPAIN sol at iai.es [ 1]_ Off-Line cursive script word recognition Radmilo M. Bozinovic, Sargur N. Srihari IEEE transactions on pattern analysis and machine intelligence. Vol. 11, January 1989. [ 2]_ Visual recognition of script characters.
Neural network architectures. Josef Skrzypek, Jeff Hoffman. MPL (Machine Perception Lab). Nov 1989. [ 3]_ On recognition of printed characters of any font and size. Simon Kahan, Theo Pavlidis, Henry S. Baird. IEEE transactions on pattern analysis and machine intelligence Vol PAMI_9, No 2, March 1987. [ 4]_ Research on machine recognition of handprinted characters. Shunji Mori, Kazuhiko Yamamoto, Michio Yasuda. IEEE transactions on pattern analysis and machine intelligence. Vol PAMI_6, No 4. July 1984. [ 5]_ A pattern description and generation method of structural characters Hiroshi Nagahashi, Mikio Nakatsuyama. IEEE transactions on pattern analysis and machine intelligence Vol PAMI_8, No 1, January 1986. [ 6]_ An on-line procedure for recognition of handprinted alphanumeric characters. W. W. Loy, I. D. Landau. IEEE transactions on pattern analysis and machine intelligence. Vol PAMI_4, No 4, July 1982. [ 7]_ A string correction algorithm for cursive script recognition. Radmilo Bozinovic, Sargur N. Srihari. IEEE transactions on pattern analysis and machine intelligence. Vol PAMI_4, No 6, November 1982. [ 8]_ Analysis and design of a decision tree based on entropy reduction and its application to large character set recognition Qing Ren Wang, Ching Y. Suen. IEEE transactions on pattern analysis and machine intelligence. Vol PAMI_6, No 4, July 1984. [ 9]_ A method for selecting constrained hand-printed character shapes for machine recognition Rajjan Shinghal, Ching Y. Suen IEEE transactions on pattern analysis and machine intelligence. Vol PAMI_4, No 1, January 1982 [10]_ Pixel classification based on gray level and local "busyness" Philip A. Dondes, Azriel Rosenfeld. IEEE transactions on pattern analysis and machine intelligence. Vol PAMI_4, No 1, January 1982. [11]_ Experiments in the contextual recognition of cursive script Roger W. Ehrich, Kenneth J. Koehler IEEE transactions on computers, vol c-24, No. 2, February 1975.
[12]_ Character recognition by computer and applications. Ching Y. Suen. Handbook of pattern recognition and image processing. ACADEMIC PRESS, INC. August 1988. [13]_ A robust algorithm for text string separation from mixed text/graphics images Lloyd Alan Fletcher, Rangachar Kasturi IEEE transactions on pattern analysis and machine intelligence. Vol 10, No 6, November 1988. [14]_ Segmentation of document images. Torfinn Taxt, Patrick J. Flynn, Anil K. Jain IEEE transactions on pattern analysis and machine intelligence. Vol 11, No 12, December 1989. [15]_ Experiments in text recognition with Binary n_Gram and Viterbi algorithms. Jonathan J. Hull, Sargur N. Srihari IEEE transactions on pattern analysis and machine intelligence Vol PAMI-4, No 5, September 1982. [16]_ Designing a handwriting reader. D. J. Burr IEEE transactions on pattern analysis and machine intelligence Vol PAMI-5, No 5, September 1983. [17]_ Experiments on neural net recognition of spoken and written text David J. Burr IEEE transactions on acoustics, speech and signal processing vol 36, No 7, July 1988 [18]_ Experiments with a connectionist text reader D. J. Burr Bell communications research Morristown, N.J. 07960 [19]_ An Algorithm for finding a common structure shared by a family of strings Anne M. Landraud, Jean-Francois Avril, Philippe Chretienne. IEEE transactions on pattern analysis and machine intelligence Vol 11, No 8, August 1989 [20]_ Word-level recognition of cursive script Raouf F. H. Farag IEEE transactions on computers Vol C-28, No 2, February 1979 [21]_ Pattern Classification by neural network: an experimental system for icon recognition Eric Gullichsen, Ernest Chang March 1987 [22]_ Recognition of handwritten Chinese characters by modified Hough transform techniques. Fang-Hsuan Cheng, Wen-Hsing Hsu, Mei-Ying Chen IEEE transactions on pattern analysis and machine intelligence Vol 11, No 4, April 1989 [23]_ Inherent bias and noise in the Hough transform Christopher M.
Brown IEEE transactions on pattern analysis and machine intelligence Vol PAMI-5, No 5, September 1983. [24]_ From pixels to features J. C. Simon North-Holland _ Feature selection and Language syntax in text recognition. J.J. Hull _ Feature extraction for locating address blocks on mail pieces. S.N. Srihari. [25]_ A model for variability effects in hand-printing, with implications for the design of on-line character recognition systems. J.R. Ward and T. Kuklinski. IEEE transactions on systems, man and cybernetics. Vol 18, No 3, May/June 1988. [26]_ Selection of a neural network system for visual inspection. Paul J. Stomski, Jr., and Adel S. Elmaghraby Engineering Mathematics And Computer Science University of Louisville, Kentucky 40292 [27]_ Self-organizing model for pattern learning and its application to robot eyesight. Hisashi Suzuki, Suguru Arimoto. Proceedings of the fourth conference on A.I. San Diego, March 1988. The computer society of the IEEE. ---------------------------------------------------------- From: J. Whiteley I can offer only five references; all are from the Proceedings of the 1989 International Joint Conference on Neural Networks held in Washington D.C. Yamada, K. Kami, H. Tsukumo, J. Temma, T. Handwritten Numeral Recognition by Multi-layered Neural Network with Improved Learning Algorithm Volume II, pp. 259-266 Morasso, P. Neural Models of Cursive Script Handwriting Volume II, pp.539-542 Guyon, I. Poujaud, I. Personnaz, L. Dreyfus, G. Comparing Different Neural Network Architectures for Classifying Handwritten Digits Volume II, pp.127-132 Weideman, W.E. A Comparison of a Nearest Neighbor Classifier and a Neural Network for Numeric Handprint Character Recognition Volume I, pp.117-120 Barnard, E. Casasent, D. Image Processing for Image Understanding with Neural Nets Volume I, pp.111-115 Hopefully you are being deluged with references. --Rob Whiteley Dept.
of Chemical Engineering Ohio State University email: whiteley-j at osu-20.ircc.ohio-state.edu ------- ---------------------------------------------------------- From: avi at dgp.toronto.edu (Avi Naiman) %L Baird 86 %A H. S. Baird %T Feature Identification for Hybrid Structural/Statistical Pattern Classification %R Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition %D June 1986 %P 150-155 %L Casey and Jih 83 %A R. G. Casey %A C. R. Jih %T A Processor-Based OCR System %J IBM Journal of Research and Development %V 27 %N 4 %D July 1983 %P 386-399 %L Cash and Hatamian 87 %A G. L. Cash %A M. Hatamian %T Optical Character Recognition by the Method of Moments %J Computer Vision, Graphics and Image Processing %V 39 %N 3 %D September 1987 %P 291-310 %L Chanda et al. 84 %A B. Chanda %A B. B. Chaudhuri %A D. Dutta Majumder %T Some Algorithms for Image Enhancement Incorporating Human Visual Response %J Pattern Recognition %V 17 %D 1984 %P 423-428 %L Cox et al. 74 %A C. Cox %A B. Blesser %A M. Eden %T The Application of Type Font Analysis to Automatic Character Recognition %J Proceedings of the Second International Joint Conference on Pattern Recognition %D 1974 %P 226-232 %L Frutiger 67 %A Adrian Frutiger %T OCR-B: A Standardized Character for Optical Recognition %J Journal of Typographic Research %V 1 %N 2 %D April 1967 %P 137-146 %L Goclawska 88 %A Goclawska %T Method of Description of the Alphanumeric Printed Characters by Signatures for Automatic Text Readers %J AMSE Review %V 7 %N 2 %D 1988 %P 31-34 %L Gonzalez 87 %A Gonzalez %T Designing Balance into an OCR System %J Photonics Spectra %V 21 %N 9 %D September 1987 %P 113-116 %L GSA 84 %A General Services Administration %T Technology Assessment Report: Speech and Pattern Recognition; Optical Character Recognition; Digital Raster Scanning %I National Archives and Records Service %C Washington, District of Columbia %D October 1984 %L Hull et al. 84 %A J. J. Hull %A G. Krishnan %A P. 
W. Palumbo %A S. N. Srihari %T Optical Character Recognition Techniques in Mail Sorting: A Review of Algorithms %R 214 %I SUNY Buffalo Computer Science %D June 1984 %L IBM 86 %A IBM %T Character Recognition Apparatus %J IBM Technical Disclosure Bulletin %V 28 %N 9 %D February 1986 %P 3990-3993 %L Kahan et al. 87 %A A. Kahan %A Theo Pavlidis %A H. S. Baird %T On the Recognition of Printed Characters of any Font and Size %J IEEE Transactions on Pattern Analysis and Machine Intelligence %V PAMI-9 %N 2 %D March 1987 %P 274-288 %L Lam and Baird 86 %A S. W. Lam %A H. S. Baird %T Performance Testing of Mixed-Font, Variable-Size Character Recognizers %R AT&T Bell Laboratories Computing Science Technical Report No. 126 %C Murray Hill, New Jersey %D November 1986 %L Lashas et al. 85 %A A. Lashas %A R. Shurna %A A. Verikas %A A. Dosimas %T Optical Character Recognition Based on Analog Preprocessing and Automatic Feature Extraction %J Computer Vision, Graphics and Image Processing %V 32 %N 2 %D November 1985 %P 191-207 %L Mantas 86 %A J. Mantas %T An Overview of Character Recognition Methodologies %J Pattern Recognition %V 19 %N 6 %D 1986 %P 425-430 %L Murphy 74 %A Janet Murphy %T OCR: Optical Character Recognition %C Hatfield %I Hertis %D 1974 %L Nagy 82 %A G. Nagy %T Optical Character Recognition \(em Theory and Practice %B Handbook of Statistics %E P. R. Krishnaiah and L. N. Kanal %V 2 %I North-Holland %C Amsterdam %D 1982 %P 621-649 %L Pavlidis 86 %A Theo Pavlidis %T A Vectorizer and Feature Extractor for Document Recognition %J Computer Vision, Graphics and Image Processing %V 35 %N 1 %D July 1986 %P 111-127 %L Piegorsch et al. 84 %A W. Piegorsch %A H. Stark %A M. Farahani %T Application of Image Correlation for Optical Character Recognition in Printed Circuit Board Inspection %R Proceedings of SPIE \(em The International Society for Optical Engineering: Applications of Digital Image Processing VII %V 504 %D 1984 %P 367-378 %L Rutovitz 68 %A D. 
Rutovitz %T Data Structures for Operations on Digital Images %B Pictorial Pattern Recognition %E G. C. Cheng et al. %I Thompson Book Co. %C Washington, D. C. %D 1968 %P 105-133 %L Smith and Merali 85 %A J. W. T. Smith %A Z. Merali %T Optical Character Recognition: The Technology and its Application in Information Units and Libraries %R Library and Information Research Report 33 %I The British Library %D 1985 %L Suen 86 %A C. Y. Suen %T Character Recognition by Computer and Applications %B Handbook of Pattern Recognition and Image Processing %D 1986 %P 569-586 %L Wang 85 %A P. S. P. Wang %T A New Character Recognition Scheme with Lower Ambiguity and Higher Recognizability %J Pattern Recognition Letters %V 3 %D 1985 %P 431-436 %L White and Rohrer 83 %A J.M. White %A G.D. Rohrer %T Image Thresholding for Optical Character Recognition and Other Applications Requiring Character Image Extraction %J IBM Journal of Research and Development %V 27 %N 4 %D July 1983 %P 400-411 %L Winzer 75 %A Gerhard Winzer %T Character Recognition With a Coherent Optical Multichannel Correlator %J IEEE Transactions on Computers %V C-24 %N 4 %D April 1975 %P 419-423 ---------------------------------------------------------- From: nad at computer-lab.cambridge.ac.uk Hi, I've only got two references for you - but they have 42 and 69 references, respectively (some of the refs will be the same, but you get at least 69 references!). They are: "An overview of character recognition methodologies" J. Mantas Pattern Recognition, Volume 19, Number 6, 1986 pages 425-430 "Methodologies in pattern recognition and image analysis - a brief survey" J. Mantas Pattern Recognition, Volume 20, Number 1, 1987 pages 1-6 Neil Dodgson ============ ---------------------------------------------------------- From: YAEGER.L at AppleLink.Apple.COM I presume you know of "The 1989 Neuro-Computing Bibliography" edited by Casimir C. Klimasauskas, a Bradford Book, from MIT Press. 
It lists 11 references for character recognition in its index. - larryy at apple.com ---------------------------------------------------------- From: Tetsu Fujisaki 1. Suen, C. Y., Berthod, M., and Mori, S., "Automatic Recognition of Handprinted Characters - The State of the Art", Proc. IEEE, 68, 4 (April 1980) 469-487 2. Tappert, C. C., Suen, C. Y., and Wakahara, T., "The State-of-the-Art in on-line handwriting recognition", IEEE Proc. 9th Int'l Conf. on Pattern Recognition, Rome, Italy, Nov. 1988. Also in IBM RC 14045. ---------------------------------------------------------- From: burrow at grad1.cis.upenn.edu (Tom Burrow) Apparently, the state of the art in connectionism, as a lot of people will tell you, I'm sure, is Y. Le Cun et al.'s work which can be found in NIPS 90. Other significant connectionist approaches are Fukushima's neocognitron and Denker et al.'s work which I *believe* is in NIPS 88. I am interested in handprinted character recognition. Typeset character recognition is basically solved, and I believe you shouldn't have any trouble locating texts on this (although I've only looked at the text edited by Kovalevsky (sp?), which I believe is just entitled "Reading Machines"). Bayesian classifiers, which you can read about in any statistical pattern recognition text (e.g., Duda and Hart, Gonzalez, etc.), are capable of performing recognition, since one can choose reliable features present in machine printed text (e.g., moments, projections, etc.), and the segmentation problem is fairly trivial. Perhaps the greatest problem in handprinted recognition is the segmentation problem. Unfortunately, most connectionist approaches fail miserably in this respect, relying on traditional methods for segmentation which become a bottleneck. I am inspecting connectionist methods which perform segmentation and recognition concurrently, and I recommend you do not inspect the problems independently.
I am by no means an expert in any area which I've commented on, but I hope this helps. Also, again, please send me your compiled responses. Thank you and good luck. Tom Burrow ---------------------------------------------------------- From skrzypek at CS.UCLA.EDU Sun Jul 22 18:22:43 1990 From: skrzypek at CS.UCLA.EDU (Dr. Josef Skrzypek) Date: Sun, 22 Jul 90 15:22:43 PDT Subject: IJPRAI CALL FOR PAPERS Message-ID: <9007222222.AA05465@retina.cs.ucla.edu> IJPRAI CALL FOR PAPERS We are organizing a special issue of IJPRAI (Intl. Journal of Pattern Recognition and Artificial Intelligence) dedicated to the subject of neural networks in vision and pattern recognition. Papers will be refereed. The plan calls for the issue to be published in the fall of 1991. I would like to invite your participation. DEADLINE FOR SUBMISSION: 10th of December, 1990 VOLUME TITLE: Neural Networks in Vision and Pattern Recognition VOLUME GUEST EDITORS: Prof. Josef Skrzypek and Prof. Walter Karplus Department of Computer Science, 3532 BH UCLA Los Angeles CA 90024-1596 Email: skrzypek at cs.ucla.edu or karplus at cs.ucla.edu Tel: (213) 825 2381 Fax: (213) UCLA CSD DESCRIPTION The capabilities of neural architectures (supervised and unsupervised learning, feature detection and analysis through approximate pattern matching, categorization and self-organization, adaptation, soft constraints, and signal-based processing) suggest new approaches to solving problems in vision, image processing and pattern recognition as applied to visual stimuli. The purpose of this special issue is to encourage further work and discussion in this area. The volume will include both invited and submitted peer-reviewed articles. We are seeking submissions from researchers in relevant fields, including natural and artificial vision, scientific computing, artificial intelligence, psychology, image processing and pattern recognition.
We encourage submission of: 1) detailed presentations of models or supporting mechanisms, 2) formal theoretical analyses, 3) empirical and methodological studies, and 4) critical reviews of the applicability of neural networks to various subfields of vision, image processing and pattern recognition. Submitted papers may be enthusiastic or critical about the applicability of neural networks to processing of visual information. The IJPRAI journal would like to encourage submissions both from researchers engaged in the analysis of biological systems (such as modeling psychological/neurophysiological data using neural networks) and from members of the engineering community who are synthesizing neural network models. The number of papers that can be included in this special issue will be limited. Therefore, some qualified papers may be encouraged for submission to the regular issues of IJPRAI. SUBMISSION PROCEDURE Submissions should be sent to Josef Skrzypek by 12-10-1990. The suggested length is 20-22 double-spaced pages including figures, references, abstract and so on. Format details, etc. will be supplied on request. Authors are strongly encouraged to discuss ideas for possible submissions with the editors. The Journal is published by World Scientific and was established in 1986. Thank you for your consideration. From wilson at Think.COM Mon Jul 23 16:09:00 1990 From: wilson at Think.COM (Stewart Wilson) Date: Mon, 23 Jul 90 16:09:00 EDT Subject: SAB90 Announcement Message-ID: <9007232009.AA07579@nugodot.think.com> Dear Connectionist list Manager, Would you kindly post this Announcement for the SAB90 Conference on your bulletin board? Thank you.
Stewart Wilson ======================================================================= ANNOUNCEMENT Simulation of Adaptive Behavior: From Animals to Animats An International Conference To be held in Paris, September 24-28, 1990 Sponsored by Ecole Normale Superieure, US Air Force Office of Scientific Research, Electricite de France, IBM France, Computers, Communications and Visions (C2V), Offilib, and a Corporate Donor 1. Conference dates and site The conference will take place Monday through Friday, September 24-28, 1990 at the Ministere de la Recherche et de la Technologie, 1 rue Descartes, Paris, France. 2. Conference Committee Conference chairs: Dr. Jean-Arcady Meyer, Ecole Normale Superieure, France, and Dr. Stewart W. Wilson, The Rowland Institute for Science, USA. Organizing Committee: Groupe de BioInformatique, Ecole Normale Superieure, France. Program Committee: Lashon Booker, U.S. Naval Research Lab, USA; Rodney Brooks, MIT Artificial Intelligence Lab, USA; Patrick Colgan, Queen's University at Kingston, Canada; Patrick Greussay, Universite Paris VIII, France; David McFarland, Oxford Balliol College, UK; Luc Steels, VUB AI Lab, Belgium; Richard Sutton, GTE Laboratories, USA; Frederick Toates, The Open University, UK; David Waltz, Thinking Machines Corp. and Brandeis University, USA. 3. Official language: English 4. Conference Objective The conference objective is to bring together researchers in ethology, ecology, cybernetics, artificial intelligence, robotics, and related fields so as to further our understanding of the behaviors and underlying mechanisms that allow animals and, potentially, robots to adapt and survive in uncertain environments. Said somewhat differently, the objective is to investigate how the robot can aid in comprehending the animal and, inversely, to seek inspiration from the animal in the construction of autonomous robots.
The conference will provide opportunities for dialogue between specialists with different scientific perspectives--ethology and artificial intelligence notably--a dialogue that will be enhanced by the common technical language imposed by simulation models. As the first of its kind in the world, the conference will make it possible to establish not only the state of the art of "adaptive autonomous systems, natural and artificial", but a list of the most promising future research topics. The conference is expected to promote: 1. Identification of the organizational principles, functional laws, and minimal properties that make it possible for a real or artificial system to persist in an uncertain environment. 2. Better understanding of how and under what conditions such systems can themselves discover these principles through conditioning, learning, induction, or processes of self-organization. 3. Specification of the applicability of the theoretical knowledge thus acquired to the building of autonomous robots. 4. Improved theoretical and practical knowledge concerning adaptive systems in general, both natural and artificial. Finally, special emphasis will be given to the following topics, as viewed from the perspective of adaptive behavior: individual and collective behaviors; autonomous robots; action selection and behavioral sequences; hierarchical and parallel organizations; self-organization of behavioral modules; conditioning, learning and induction; neural correlates of behavior; problem solving and planning; perception and motor control; goal-directed behavior; motivation and emotion; neural networks and classifier systems; behavioral ontogeny and evolution; cognitive maps and internal world models; emergent structures and behaviors. 5. Conference Proceedings The proceedings will be published about two months after the end of the conference by The MIT Press/Bradford Books. 6.
Conference Organization Among the papers received by the organizers and reviewed by the Program Committee members, approximately 50 have been accepted for publication in the proceedings. They will be presented as talks or posters. (To receive by e-mail a preliminary program please contact one of the conference chairmen). Since the conference intersects animal and "animat" research, lively interaction can be expected, including controversy. At least one panel discussion will be organized around the theme of what each viewpoint can contribute to the other. Because the conference is emphasizing simulation models, it is anticipated that many participants will have computer programs demonstrating their work. To make such demonstrations possible, the Organizers will provide workstations and video equipment. An evening session during the week will be devoted to demonstrations. Morning and afternoon coffee breaks will be provided. To further promote interaction among a diverse group of participants, the conference will provide lunch each day. 7. Additional Information Additional information can be obtained from the chairmen: Dr. Jean-Arcady Meyer Groupe de Bioinformatique URA686.Ecole Normale Superieure 46 rue d'Ulm 75230 Paris Cedex 05 France e-mail: meyer at frulm63.bitnet meyer at hermes.ens.fr Tel: (1) 43.29.12.25 FAX: (1) 43.29.81.72 Dr. Stewart W. Wilson The Rowland Institute for Science 100 Cambridge Parkway Cambridge, MA 02142 USA e-mail: wilson at think.com Tel: (617) 497-4650 FAX: (617) 497-4627 8. Travel and Lodging Participants will be responsible for their own travel and lodging arrangements. However, you may contact any of three hotel reservations services which have agreed to offer advantageous locations and rates to participants in SAB90. We advise making early reservations and mentioning "SAB90" in your request. 
These services are: - Hotel Pullman Saint-Jacques(****): rooms at 800-900 FF, fax (33 1 45 88 43 93) - Tradotel(*** and **): rooms at 440-520 FF, fax (33 1 47 27 05 87) - AJF: student rooms at 80-90 FF, fax (33 1 40 27 08 71) 9. Registration fees Attendance at SAB90 will be open to any person paying the registration fee which is set at $ 220 (or 1200 FF) for non-students and $ 110 (or 600 FF) for students. The registration fee covers five lunches, coffee-breaks, and a copy of the Proceedings. ****************************************************************************** *WARNING: The audience size is strictly limited to 150 persons. Registrations* *will be closed beyond this number. * ****************************************************************************** REGISTRATION FORM ------------------------------------------------------------------------------ Last name: First name: Profession/Title: Organization: Address: State/Zip Code/Country: Telephone: Fax: E-mail: ------------------------------------------------------------------------------ This form should be sent to: Dr. Jean-Arcady MEYER Groupe de BioInformatique URA686. Ecole Normale Superieure 46 rue d'Ulm 75230 PARIS Cedex 05 FRANCE with a check for the registration fee to the order of: J.A. MEYER 'SAB90' The check can be in US Dollars or French Francs. To receive the student rate, please attach evidence of student status from your University or Scientific Advisor. 
============================================================================== From booker at AIC.NRL.Navy.Mil Tue Jul 24 13:10:16 1990 From: booker at AIC.NRL.Navy.Mil (booker@AIC.NRL.Navy.Mil) Date: Tue, 24 Jul 90 13:10:16 EDT Subject: Call for Papers - ICGA-91 Message-ID: <9007241710.AA00778@sun7.aic.nrl.navy.mil> Call for Papers ICGA-91 The Fourth International Conference on Genetic Algorithms The Fourth International Conference on Genetic Algorithms (ICGA-91), will be held on July 13-16, 1991 at the University of California - San Diego in La Jolla, CA. This meeting brings together an international community from academia, government, and industry interested in algorithms suggested by the evolutionary process of natural selection. Topics of particular interest include: genetic algorithms and classifier systems, machine learning and optimization using these systems, and their relations to other learning paradigms (e.g., connectionist networks). Papers discussing how genetic algorithms and classifier systems are related to biological modeling issues (e.g., evolution of nervous systems, computational ethology, artificial life) are encouraged. Papers describing significant, unpublished research in this area are solicited. Authors must submit four (4) complete copies of their paper, postmarked by February 1, 1991, to the Program Co-Chair: Dr. Richard K. Belew Computer Science & Engr. Dept. (C-014) Univ. California - San Diego La Jolla, CA 92093 Electronic submissions (LaTeX source only) can be mailed to rik at cs.ucsd.edu. Papers should be no longer than 10 pages, single spaced, and printed using 12 pt. type. All papers will be subject to peer review. Evaluation criteria include the significance of results, originality, and the clarity and quality of the presentation. 
Important Dates: February 1, 1991: submissions must be postmarked; March 22, 1991: notification to authors mailed; May 6, 1991: revised, final camera-ready paper due; July 13-16, 1991: conference dates. ICGA-91 Conference Committee: Conference Co-Chairs: Kenneth A. De Jong, George Mason University, and J. David Schaffer, Philips Labs; Vice Chair and Publicity: David E. Goldberg, Univ. of Illinois at Urbana-Champaign; Program Co-Chairs: Richard K. Belew, Univ. of California at San Diego, and Lashon B. Booker, MITRE; Financial Chair: Gil Syswerda, BBN; Local Arrangements: Richard K. Belew, Univ. of California at San Diego. From PVR%AUTOCTRL.RUG.AC.BE at VMA.CC.CMU.EDU Wed Jul 25 18:19:00 1990 From: PVR%AUTOCTRL.RUG.AC.BE at VMA.CC.CMU.EDU (PVR%AUTOCTRL.RUG.AC.BE@VMA.CC.CMU.EDU) Date: Wed, 25 Jul 90 18:19 N Subject: Neocognitron information wanted References: > The Transputer Lab, Grotesteenweg Noord 2, +32 91 22 57 55 Message-ID: Dear neural netters, I have a research student who is implementing Prof. Fukushima's neocognitron. The network will be used for object recognition and will finally be implemented on a multiprocessor network. The student is facing a large number of problems, for which we are not always able to find a solution. We would therefore like to get in touch with other researchers who have tried to implement the neocognitron or who have thoroughly studied this particular type of network. Could you please send a message to: pvr at autoctrl.rug.ac.be or to pvr at bgerug51.bitnet If you have technical reports that could make our work easier, we would certainly appreciate receiving a copy. In return, we will send you some articles about the implementation and application.
Many thanks in advance, Patrick ***************************************************************************** * Patrick Van Renterghem, BITNET: pvr at bgerug51.bitnet * * R&D Assistant, EDU: pvr%bgerug51.bitnet at cunyvm.cuny.edu * * State University of Ghent UUCP: mcsun!bgerug51.bitnet!pvr * * Belgium JANET: PVR%earn.bgerug51 at earn-relay * * * * Automatic Control Lab, | Tel: +32 91 22 57 55 ext. 313 * * State University of Ghent, | Fax: +32 91 22 85 91 * * Grotesteenweg Noord 2, | * * B-9710 Ghent-Zwijnaarde, Belgium | * *******You***Don't***Need***A***PhD***To***Write***Parallel***Programs******* From jm2z+ at ANDREW.CMU.EDU Wed Jul 25 15:16:16 1990 From: jm2z+ at ANDREW.CMU.EDU (Javier Movellan) Date: Wed, 25 Jul 90 15:16:16 -0400 (EDT) Subject: Integrating Information Message-ID: I am interested in the problem of integrating information from different processing levels. For instance, if we have a good system to map sounds into phonemes, we may want to use the statistical structure at the phoneme level to improve recognition or to clean up the inputs. Some of the domains in which this issue arises are: speech recognition, character recognition, spell-checkers... If you have references or comments regarding this issue, please let me know. Classical statistical approaches, neural network approaches, theoretical perspectives, psychological models, and dirty tricks used in artificial systems are welcome. I will compile the references/comments and make them public to the connectionist list. Javier From fozzard at boulder.Colorado.EDU Thu Jul 26 12:34:46 1990 From: fozzard at boulder.Colorado.EDU (Richard Fozzard) Date: Thu, 26 Jul 90 10:34:46 -0600 Subject: Summary (long): pattern recognition comparisons Message-ID: <9007261634.AA04726@alumni.colorado.edu> Here are the responses I got for my question regarding comparisons of connectionist methods with traditional pattern recognition techniques.
I believe Mike Mozer (among others) puts it best: "Neural net algorithms just let you do a lot of the same things that traditional statistical algorithms allow you to do, but they are more accessible to many people (and perhaps easier to use)." Read on for the detailed responses. (Note: this does not include anything posted to comp.ai.neural-nets, to save on bandwidth) rich ======================================================================== Richard Fozzard "Serendipity empowers" Univ of Colorado/CIRES/NOAA R/E/FS 325 Broadway, Boulder, CO 80303 fozzard at boulder.colorado.edu (303)497-6011 or 444-3168 From honavar at cs.wisc.edu Tue Jul 17 14:53:55 1990 From: honavar at cs.wisc.edu (Vasant Honavar) Date: Tue, 17 Jul 90 13:53:55 -0500 Subject: pattern recognition with nn Message-ID: <9007171853.AA13318@goat.cs.wisc.edu> Honavar, V. & Uhr, L. (1989). Generation, Local Receptive Fields, and Global Convergence Improve Perceptual Learning in Connectionist Networks, In: Proceedings of the 1989 International Joint Conference on Artificial Intelligence, San Mateo, CA: Morgan Kaufmann. Honavar, V. & Uhr, L. (1989). Brain-Structured Connectionist Networks that Perceive and Learn, Connection Science: Journal of Neural Computing, Artificial Intelligence and Cognitive Research, 1 139-159. Le Cun, Y. et al. (1990). Handwritten Digit Recognition With a Backpropagation Network, In: Neural Information Processing Systems 2, D. S. Touretzky (ed.), San Mateo, CA: 1990. Rogers, D. (1990). Predicting Weather Using a Genetic Memory: A Combination of Kanerva's Sparse Distributed Memory With Holland's Genetic Algorithms, In: Neural Information Processing Systems 2, D. S. Touretzky (ed.), San Mateo, CA: 1990. 
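Mozer's point above -- that neural net algorithms largely let you do what traditional statistical algorithms already do -- can be made concrete: a single sigmoid unit trained by gradient descent on a cross-entropy loss is logistic regression under another name. A minimal illustrative sketch (the toy data, learning rate, and epoch count are inventions for this example, not taken from any of the posts):

```python
import math
import random

random.seed(0)

# Toy two-class data: two Gaussian clusters in the plane.
data = ([((random.gauss(-1, 1), random.gauss(-1, 1)), 0) for _ in range(50)]
        + [((random.gauss(+1, 1), random.gauss(+1, 1)), 1) for _ in range(50)])

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# One sigmoid "neuron" trained by batch gradient descent on
# cross-entropy loss: this is exactly logistic regression.
w, b, lr = [0.0, 0.0], 0.0, 0.1
for _ in range(500):
    gw, gb = [0.0, 0.0], 0.0
    for (x1, x2), y in data:
        err = sigmoid(w[0] * x1 + w[1] * x2 + b) - y  # dLoss/dLogit
        gw[0] += err * x1
        gw[1] += err * x2
        gb += err
    w[0] -= lr * gw[0] / len(data)
    w[1] -= lr * gw[1] / len(data)
    b -= lr * gb / len(data)

# Training-set accuracy of the learned linear decision boundary.
correct = sum(int((sigmoid(w[0] * x1 + w[1] * x2 + b) > 0.5) == (y == 1))
              for (x1, x2), y in data)
accuracy = correct / len(data)
```

The same code read as "statistics" fits a logistic model by maximum likelihood; read as "connectionism" it trains a one-unit network by error backpropagation. The difference is vocabulary, not computation.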
From perry at seismo.CSS.GOV Tue Jul 17 16:39:12 1990 From: perry at seismo.CSS.GOV (John Perry) Date: Tue, 17 Jul 90 16:39:12 EDT Subject: Re Message-ID: <9007172039.AA11042@beno.CSS.GOV> Richard, It depends on which neural network you are using, and on the underlying complexity of separating the pattern classes. We at ENSCO have developed a neural network architecture that shows far superior performance to traditional algorithms. Mail me if you are interested. John L. Perry ENSCO, Inc. 5400 Port Royal Road Springfield, Virginia 22151 703-321-9000 email: perry at dewey.css.gov, perry at beno.css.gov From shaw_d at clipr.colorado.edu Tue Jul 17 16:36:00 1990 From: shaw_d at clipr.colorado.edu (Dave Shaw) Date: 17 Jul 90 14:36:00 MDT Subject: Networks for pattern recognition problems? Message-ID: <9007172044.AA24992@boulder.Colorado.EDU> Rich- our experience with the solar data is still inconclusive, but would seem to indicate that neural nets exhibit no distinct advantage over more traditional techniques, in terms of 'best' performance figures. The reason appears to be that although the task is understood to be non-linear (which should presumably lead to better performance from non-linear systems such as networks), there is not enough data at the critical points to define the boundaries of the decision surface. This would seem to be a difficulty that all recognition problems must deal with. Dave From kortge at galadriel.Stanford.EDU Tue Jul 17 17:02:44 1990 From: kortge at galadriel.Stanford.EDU (Chris Kortge) Date: Tue, 17 Jul 90 14:02:44 PDT Subject: pattern recognition Message-ID: <9007172102.AA26014@boulder.Colorado.EDU> You may know of this already, but Gorman & Sejnowski have a paper on sonar return classification in Neural Networks Vol. 1, #1, pg 75, where a net did better than nearest neighbor, and comparable to a person.
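The nearest-neighbor classifier Kortge mentions is the "traditional" baseline that recurs throughout this thread. For reference, 1-NN fits in a few lines; this is a generic sketch on made-up toy data, not the Gorman & Sejnowski sonar experiment:

```python
import math

def nearest_neighbor(train, x):
    """1-NN: return the label of the training point closest to x
    under Euclidean distance."""
    best_label, best_dist = None, math.inf
    for point, label in train:
        d = math.dist(point, x)  # Euclidean distance (Python 3.8+)
        if d < best_dist:
            best_dist, best_label = d, label
    return best_label

# Hypothetical labeled training set: two small clusters.
train = [((0.0, 0.0), "A"), ((0.2, 0.1), "A"),
         ((1.0, 1.0), "B"), ((0.9, 1.2), "B")]

print(nearest_neighbor(train, (0.1, 0.0)))  # -> A
print(nearest_neighbor(train, (0.8, 0.9)))  # -> B
```

There is no training phase at all: the data set itself is the model, which is why nearest neighbor is such a common yardstick when asking whether a trained network buys anything extra.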
I would be very interested in obtaining your list of "better-than-conventional-methods" papers, if possible (maybe the whole connectionists list would, for that matter). Thanks-- Chris Kortge kortge at psych.stanford.edu From galem at mcc.com Tue Jul 17 18:06:14 1990 From: galem at mcc.com (Gale Martin) Date: Tue, 17 Jul 90 17:06:14 CDT Subject: Networks for pattern recognition problems? Message-ID: <9007172206.AA01492@sunkist.aca.mcc.com> I do handwriting recognition with backprop nets and have anecdotal evidence that the nets do better than the systems developed by some of the research groups we work with. The problem with such comparisons is that the success of the recognition systems depends on the expertise of the developers. There will never be a definitive study. However, I've come to believe that such accuracy comparisons miss the point. Traditional recognition technologies usually involve a lot of hand-crafting (e.g., selecting features) that you can avoid by using backprop nets. For example, I can feed a net "close to" raw inputs and the net learns to segment them into characters, extract features, and classify the characters. You may be able to do this with traditional techniques, but it will take a lot longer. Extending the work to different character sets becomes prohibitive, whereas it is a simple task with a net. Gale Martin MCC Austin, TX From ted at aps1.spa.umn.edu Tue Jul 17 18:12:33 1990 From: ted at aps1.spa.umn.edu (Ted Stockwell) Date: Tue, 17 Jul 90 17:12:33 CDT Subject: Networks for pattern recognition problems? In-Reply-To: ; from "Richard Fozzard" at Jul 17, 90 6:16 pm Message-ID: <9007172212.AA05795@aps1.spa.umn.edu> > > Do you know of any references to work done using connectionist (neural) > networks for pattern recognition problems? I particularly am interested > in problems where the network was shown to outperform traditional algorithms. > > I am working on a presentation to NOAA (National Oceanic and Atmospheric > Admin.)
management that partially involves pattern recognition > and am trying to argue against the statement: > "...results thus far [w/ networks] have not been notably more > impressive than with more traditional pattern recognition techniques". > This may not be quite what you're looking for, but here are a few suggestions: 1) Pose the question to salespeople who sell neural network software. They have probably faced the question before. 2) One advantage is that the network characterizes the classes for you. Instead of spending days/weeks/months developing statistical models, you can get a reasonable classifier by just handing the training data to the network and letting it run overnight. It does the work for you, so development costs should be much lower. 3) Networks seem to be more often compared to humans than to other software techniques. I don't have the references with me, but I recall that someone (Sejnowski?) developed a classifier for sonar signals that performed slightly better than human experts (which *is* the "traditional pattern recognition technique"). -- Ted Stockwell U of MN, Dept. of Astronomy ted at aps1.spa.umn.edu Automated Plate Scanner Project From mariah!yak at tucson.sie.arizona.edu Tue Jul 17 17:58:40 1990 From: mariah!yak at tucson.sie.arizona.edu (mariah!yak@tucson.sie.arizona.edu) Date: Tue, 17 Jul 90 14:58:40 -0700 Subject: No subject Message-ID: <9007172158.AA29760@tucson.sie.arizona.edu> Dear Dr. Fozzard, I read your email message on a call for pattern recognition problems for which NN's are known to outperform traditional methods. I've worked in statistics and pattern recognition for some while and have a fair number of publications. I've been reading the neural net literature, and I'd be quite surprised if you get convincing replies in the affirmative to your quest.
My opinion is that stuff even from the '60's and '70's, such as the books by Duda and Hart, and Gonzales and Fu, implemented on standard computers, is still much more effective than the methodology I've come across using NN algorithms, which are mathematically much more restrictive. In brief, if you hear of good solid instances favorable to NN's, please let me know. Sincerely, Sid Yakowitz Professor From John.Hampshire at SPEECH2.CS.CMU.EDU Tue Jul 17 21:18:33 1990 From: John.Hampshire at SPEECH2.CS.CMU.EDU (John.Hampshire@SPEECH2.CS.CMU.EDU) Date: Tue, 17 Jul 90 21:18:33 EDT Subject: Networks for pattern recognition problems? Message-ID: <9007180127.AA09023@boulder.Colorado.EDU> Rich, Go talk with Smolensky out there in Boulder. He should be able to give you a bunch of refs. See also works by Lippmann over the past two years. Barak Pearlmutter and I are working on a paper that will appear in the proceedings of the 1990 Connectionist Models Summer School which shows that certain classes of MLP classifiers yield (optimal) Bayesian classification performance on stochastic patterns. This beats traditional linear classifiers... There are a bunch of results in many fields showing that non-linear classifiers outperform more traditional ones. The guys at NOAA aren't up on the literature. One last reference --- check the last few years of NIPS and (to a lesser extent) IJCNN proceedings. NIPS = Advances in Neural Information Processing Systems, Dave Touretzky, ed., Morgan Kaufmann publishers; IJCNN = Proceedings of the International Joint Conference on Neural Networks, IEEE Press. John From mozer at neuron Tue Jul 17 22:51:28 1990 From: mozer at neuron (Michael C. Mozer) Date: Tue, 17 Jul 90 20:51:28 MDT Subject: Help for a NOAA connectionist "primer" Message-ID: <9007180251.AA04754@neuron.colorado.edu> Your boss is basically correct.
Neural net algorithms just let you do a lot of the same things that traditional statistical algorithms allow you to do, but they are more accessible to many people (and perhaps easier to use). There is a growing set of examples where neural nets beat out conventional algorithms, but nothing terribly impressive. And it's difficult to tell in these examples whether the conventional methods were applied appropriately (or the NN algorithm, in cases where NNs lose to conventional methods, for that matter). Mike From burrow at grad1.cis.upenn.edu Wed Jul 18 00:40:21 1990 From: burrow at grad1.cis.upenn.edu (Tom Burrow) Date: Wed, 18 Jul 90 00:40:21 EDT Subject: procedural vs connectionist p.r. Message-ID: <9007180440.AA16913@grad2.cis.upenn.edu> Sorry, this isn't much of a contribution -- mostly a request for your replies. If you are not going to repost them via the connectionist mailing list, could you mail them to me? Now, for my mini-contribution: Yann LeCun et al.'s work on segmented character recognition, as seen in NIPS 90, is fairly impressive, and they claim that their results are state of the art. Tom Burrow From jsaxon at cs.tamu.edu Wed Jul 18 10:32:08 1990 From: jsaxon at cs.tamu.edu (James B Saxon) Date: Wed, 18 Jul 90 09:32:08 CDT Subject: Networks for pattern recognition problems? In-Reply-To: <23586@boulder.Colorado.EDU> Message-ID: <9007181432.AA08709@cs.tamu.edu> In article <23586 at boulder.Colorado.EDU> you write: >Do you know of any references to work done using connectionist (neural) >networks for pattern recognition problems? I particularly am interested >in problems where the network was shown to outperform traditional algorithms. > >I am working on a presentation to NOAA (National Oceanic and Atmospheric >Admin.) management that partially involves pattern recognition >and am trying to argue against the statement: >"...results thus far [w/ networks] have not been notably more >impressive than with more traditional pattern recognition techniques".
> >I have always felt that pattern recognition is one of the strengths of >connectionist network approaches over other techniques and would like >some references to back this up. > >thanks much, rich >======================================================================== >Richard Fozzard "Serendipity empowers" >Univ of Colorado/CIRES/NOAA R/E/FS 325 Broadway, Boulder, CO 80303 >fozzard at boulder.colorado.edu (303)497-6011 or 444-3168 Well, aside from the question of ultimate generality, to which the answer is "OF COURSE there are references to neural network pattern recognition systems. The world is completely full of them!" Anyway, maybe you'd better do some more research. Here are a couple off the top of my head: Kohonen is really hot in the area; he's been doing it for at least ten years. Everybody refers to some aspect of his work. I also suggest picking up a copy of the IJCNN '90 San Diego proceedings, all 18 lbs of it. (International Joint Conference on Neural Networks) But for a preview: I happened to sit in on just the sort of presentation you would have liked to hear. The title was "Meteorological Classification of Satellite Imagery Using Neural Network Data Fusion" Oh Boy!!! Big title! Oh, it's by Ira G. Smotroff, Timothy P. Howells, and Steven Lehar. From the MITRE Corporation (MITRE-Bedford Neural Network Research Group) Bedford, MA 01730. Well, the presentation wasn't too hot; he sort of hand-waved over the "classification" of his meteorological data, though he didn't describe what we were looking at. The idea was that the system was supposed to take heterogeneous sensor data (I hope you know these: GOES--IR and visual, PROFS database--wind profilers, barometers, solarometers, thermometers, etc) and combine them. Cool, huh? If they had actually done this, I imagine the results would have been pretty good. It seems, though, that they merely used an IR image and a visual image and combined only these two.
Their pattern recognition involved typical modeling of the retina, which sort of acts as a band-pass filter with orientations, so it detects edges. Anyway, their claim was the following: "The experiments described showed reasonably good classification performance. There was no attempt to determine optimal performance by adding hidden units [Hey, if it did it without hidden units, it's doing rather well.], altering learning parameters, etc., because we are currently implementing self-scaling learning algorithms which will determine many of those issues automatically. [Which reminds me, Lehar works with Grossberg at Boston University. He's big on pattern recognition too, both analog and digital. Check out Adaptive Resonance Theory, or ART, ART2, ART3.]..." Anyway, it looks like a first shot and they went minimal. There's lots they could add to make it work rather well. In terms of performance, I'd just like to make one of those comments... From what I saw at the conference, neural networks will outperform traditional techniques in this sort of area. The conference was brimming over with successful implementations. Anyway... Enough rambling, from a guy who should be writing his thesis right now... Good luck on your presentation! Oh, I think it automatically puts my signature on..... Did it? -- ---- \ / ---- /--------------------------------------------\ James Bennett Saxon | O| | O| | "I ought to join the club and beat you | Visualization Laboratory | | | | | over the head with it."
-- Groucho Marx | Texas A&M University ---- ---- <---------------------------------------------/ jsaxon at cssun.tamu.edu From arseno at phy.ulaval.ca Wed Jul 18 10:38:16 1990 From: arseno at phy.ulaval.ca (Henri Arsenault) Date: Wed, 18 Jul 90 10:38:16 EDT Subject: papers on pattern recognition Message-ID: <9007181438.AA19593@einstein.phy.ulaval.ca> In response to your request about papers on neural nets in pattern recognition, there is a good review in IEEE transactions on neural networks, vol. 1, p. 28. "Survey of neural network technology for automatic target recognition", by M. W. Roth. The paper has many references. arseno at phy.ulaval.ca From d38987%proteus.pnl.gov at pnlg.pnl.gov Wed Jul 18 12:07:50 1990 From: d38987%proteus.pnl.gov at pnlg.pnl.gov (d38987%proteus.pnl.gov@pnlg.pnl.gov) Date: Wed, 18 Jul 90 09:07:50 PDT Subject: NN and Pattern Recognition Message-ID: <9007181607.AA05172@proteus.pnl.gov> Richard, We have done some work in this area, as have many other people. I suggest you call Roger Barga at (509)375-2802 and talk to him, or send him mail at: d3c409%calypso at pnlg.pnl.gov Good luck, Ron Melton Pacific Northwest Laboratory Richland, WA 99352 From mozer at neuron Wed Jul 18 13:16:22 1990 From: mozer at neuron (Michael C. Mozer) Date: Wed, 18 Jul 90 11:16:22 MDT Subject: Help for a NOAA connectionist "primer" Message-ID: <9007181716.AA06051@neuron.colorado.edu> I think NNs are more accessible because the mathematics is so straightforward, and the methods work pretty well even if you don't know what you're doing (as opposed to many statistical techniques that require some expertise to use correctly). For me, the win of NNs is as a paradigm for modeling human cognition. Whether the NN learning algorithms existed previously in other fields is irrelevant. What is truly novel is that we're bringing these numerical and statistical techniques to the study of human cognition. 
Also, connectionists (at least the cog sci oriented ones) are far more concerned with representation -- a critical factor, one that has been much studied by psychologists but not by statisticians. Mike From cole at cse.ogi.edu Wed Jul 18 13:33:14 1990 From: cole at cse.ogi.edu (Ron Cole) Date: Wed, 18 Jul 90 10:33:14 -0700 Subject: Networks for pattern recognition problems? Message-ID: <9007181733.AA26297@cse.ogi.edu> Call Les Atlas at U Washington. He has an article coming out in the IEEE Proceedings in August comparing NNs and CART on 3 real-world problems. Ron Les Atlas: 206 685 1315 From bimal at jupiter.risc.com Wed Jul 18 13:59:24 1990 From: bimal at jupiter.risc.com (Bimal Mathur) Date: Wed, 18 Jul 90 10:59:24 PDT Subject: pattern recognition Message-ID: <9007181759.AA22486@jupiter.risc.com> The net result of experiments we have done in pattern classification of two-dimensional data (i.e., image to features, then classifying the features using a NN) is that there is no significant improvement in the performance of the overall system. -bimal mathur - Rockwell Int From PH706008 at brownvm.brown.edu Wed Jul 18 14:10:28 1990 From: PH706008 at brownvm.brown.edu (Chip Bachmann) Date: Wed, 18 Jul 90 14:10:28 EDT Subject: Networks for pattern recognition problems? In-Reply-To: Your message of Tue, 17 Jul 90 12:19:01 -0600 Message-ID: <9007181907.AA15014@boulder.Colorado.EDU> An example of research directly comparing neural networks with traditional statistical methods can be found in: R. A. Cole, Y. K. Muthusamy, and L. Atlas, "Speaker-Independent Vowel Recognition: Comparison of Backpropagation and Trained Classification Trees", in Proceedings of the Twenty-Third Annual Hawaii International Conference on System Sciences, Kailua-Kona, Hawaii, January 2-5, 1990, Vol. 1, pp. 132-141. The neural network achieves better results than the CART algorithm, in this case for a twelve-class vowel recognition task.
The data was extracted from the TIMIT database, and a variety of different encoding schemes was employed. Tangentially, I thought that I would enquire if you know of any postdoctoral or other research positions available at NOAA, CIRES, or U. of Colorado. I completed my Ph.D. in physics at Brown University under Leon Cooper (Nobel laureate, 1972) this past May; my undergraduate degree was from Princeton University and was also in physics. My dissertation research was carried out as part of an interdisciplinary team in the Center for Neural Science here at Brown. The primary focus of my dissertation was the development of an alternative backward propagation algorithm which incorporates a gain modification procedure. I also investigated the feature extraction and generalization of backward propagation for a speech database of stop-consonants developed here in our laboratory at Brown. In addition, I discussed hybrid network architectures and, in particular, in a high-dimensional, multi-class vowel recognition problem (namely with the data which Cole et al. used in the paper which I mentioned above), demonstrated an approach using smaller sub-networks to partition the data. Such approaches offer a means of dealing with the "curse of dimensionality." If there are any openings that I might apply for, I would be happy to forward my resume and any supporting materials that you might require. Charles M. Bachmann Box 1843 Physics Department & Center for Neural Science Brown University Providence, R.I.
02912 e-mail: ph706008 at brownvm From wilson at magi.ncsl.nist.gov Wed Jul 18 16:05:39 1990 From: wilson at magi.ncsl.nist.gov (Charles Wilson x2080) Date: Wed, 18 Jul 90 16:05:39 EDT Subject: character recognition Message-ID: <9007182005.AA13132@magi.ncsl.nist.gov> We have shown on a character recognition problem that neural networks are as accurate as traditional methods but much faster (on a parallel computer), much easier to program (a few hundred lines of parallel Fortran), and less brittle. See C. L. Wilson, R. A. Wilkinson, and M. D. Garris, "Self-Organizing Neural Network Character Recognition on a Massively Parallel Computer", Proc. of the IJCNN, vol. 2, pp. 325-329, June 1990. From shaw_d at clipr.colorado.edu Wed Jul 18 18:22:00 1990 From: shaw_d at clipr.colorado.edu (Dave Shaw) Date: 18 Jul 90 16:22:00 MDT Subject: Networks for pattern recognition problems? Message-ID: To date we have compared the expert system originally built for the task with many configurations of neural nets (based on your work), multiple linear regression equations, discriminant analysis, many types of nearest neighbor systems, and some work on automatic decision tree generation algorithms. Performance is measured both by the ROC P_a (which turns out to be only a moderate indicator of performance, due to the unequal n's in the two distributions) and by maximum percent correct, given the optimal bias setting. All systems have been trained and tested on the same sets of training and test data. As I indicated before, the story isn't completely in yet, but it is very hard to show significant differences between any of these systems on the solar flare task. Dave From uvm-gen!idx!gsk at uunet.UU.NET Wed Jul 18 17:45:41 1990 From: uvm-gen!idx!gsk at uunet.UU.NET (George Kaczowka) Date: Wed, 18 Jul 90 16:45:41 EST Subject: Networks for pattern recognition problems?
Message-ID: <9007182148.AA22762@uvm-gen.uvm.edu> > Do you know of any references to work done using connectionist (neural) > networks for pattern recognition problems? I particularly am interested > in problems where the network was shown to outperform traditional algorithms. > > I am working on a presentation to NOAA (National Oceanic and Atmospheric > Admin.) management that partially involves pattern recognition > and am trying to argue against the statement: > "...results thus far [w/ networks] have not been notably more > impressive than with more traditional pattern recognition techniques". > > I have always felt that pattern recognition is one of the strengths of > connectionist network approaches over other techniques and would like > some references to back this up. > > thanks much, rich > ======================================================================== Rich -- I don't know if this helps, but a company in Providence RI called NESTOR has put together a couple of products, some of which have been customized systems for customers solving pattern recognition problems. One I remember was regarding bond trading in the financial world. I seem to remember that the model outperformed the "experts" by at least 10-15%, and that this was used (and still is, as far as I know) by some on Wall Street. I know that they have been in the insurance field for claim analysis as well as physical pattern recognition. They were founded by a PhD out of Brown University, and I am sure that you could obtain reference works from them. I understand that they are involved in a few military pattern recognition systems for fighters as well. Good luck -- I was interested in their work some time ago, but have been off on other topics for over a year.
-- George -- ------------------------------------------------------------ - George Kaczowka IDX Corp Marlboro, MA - gsk at idx.UUCP - ------------------------------------------------------------ From marwan at ee.su.oz.AU Fri Jul 20 09:48:46 1990 From: marwan at ee.su.oz.AU (Marwan Jabri) Date: Fri, 20 Jul 90 08:48:46 EST Subject: pattern recognition Message-ID: <9007192248.AA07141@ee.su.oz.AU> We have been working on the application of neural nets to the pattern recognition of ECG signals (medical). I will be happy to mail you some of our very good results, which are better than what has been achieved using conventional techniques. Is this the sort of thing you are looking for? What medium would you like? Marwan Jabri ------------------------------------------------------------------- Marwan Jabri, PhD Email: marwan at ee.su.oz.au Systems Engineering and Design Automation Tel: (+61-2) 692-2240 Laboratory (SEDAL) Fax: (+61-2) 692-3847 Sydney University Electrical Engineering NSW 2006 Australia From PP219113 at tecmtyvm.mty.itesm.mx Fri Jul 20 12:10:26 1990 From: PP219113 at tecmtyvm.mty.itesm.mx (PP219113@tecmtyvm.mty.itesm.mx) Date: Fri, 20 Jul 90 10:10:26 CST Subject: Networks for pattern recognition problems? Message-ID: <900720.101026.CST.PP219113@tecmtyvm.mty.itesm.mx> Hi, David J. Burr, in "Experiments on Neural Net Recognition of Spoken and Written Text" (IEEE Trans. on ASSP, vol. 36, no. 7, pp. 1162-68, July 88), suggests that NN and nearest-neighbor classification perform at nearly the same level of accuracy. My own experience with character recognition using neural nets actually suggests that NN have better performance than nearest neighbor and hierarchical clustering. (I suggest talking to Prof. Kelvin Wagner, ECE, UC-Boulder.) See also M. W. Roth, "Survey of Neural Network Technology for Automatic Target Recognition", IEEE Trans. on Neural Networks, March 90, p. 28.
jose luis contreras-vidal From shaw_d at clipr.colorado.edu Fri Jul 20 17:32:00 1990 From: shaw_d at clipr.colorado.edu (Dave Shaw) Date: 20 Jul 90 15:32:00 MDT Subject: Networks for pattern recognition problems? Message-ID: Rich- the network configurations we have used all have a single hidden layer of varying size (except for one network with no hidden layer). Hidden layer size has been varied from 1 to 30 units. Input layer = 17 units, output layer = 1 unit. All activation functions are sigmoidal. As I indicated before, there was essentially no difference between any of the networks. We are moving towards a paper (one at least), and this work will likely be included as part of my dissertation as well. Dave From kanal at cs.UMD.EDU Thu Jul 26 22:51:18 1990 From: kanal at cs.UMD.EDU (Laveen N. KANAL) Date: Thu, 26 Jul 90 22:51:18 -0400 Subject: Summary (long): pattern recognition comparisons Message-ID: <9007270251.AA29622@mimsy.UMD.EDU> Hello folks, I have been into pattern recognition methodologies a long time and can tell you that comparison of techniques is a tricky business involving questions of dimensionality, sample size, and error estimation procedures. For some information on these matters, see, e.g., L. Kanal, "Patterns in Pattern Recognition: 1968-1974", IEEE Trans. on Information Theory, vol. IT-20, no. 6, Nov. 1974; articles in Krishnaiah and Kanal (eds.), Handbook of Statistics 2: Classification, Pattern Recognition and Reduction of Dimensionality, North-Holland, 1982; and various papers by Fukunaga in IEEE Trans. on PAMI. The papers by Halbert White in AI Expert, Dec. 89, and Neural Computation, vol. 1, indicate that, as in the perceptron days, many of the NN algorithms can be shown to be intimately related to stochastic approximation methods a la Robbins-Monro, Dvoretzky, etc. (see Sklansky & Wassel, Pattern Classifiers and Trainable Machines, Springer, 1981).
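[Editor's note: the stochastic-approximation connection Kanal mentions can be made concrete with a small sketch. The Python below is my own illustration -- the function names and toy data are invented for this example, not taken from any of the cited papers. The Robbins-Monro iteration with gain a_n = a/n, applied to finding the root of f(theta) = E[x] - theta, is just the running sample mean; the LMS/delta rule is the same template applied to the squared-error gradient of a linear unit.]

```python
import random

def robbins_monro_mean(samples, a=1.0):
    """Robbins-Monro iteration theta += a_n * (x_n - theta) with gains
    a_n = a / n (so sum a_n diverges while sum a_n^2 converges).
    For a = 1 this reduces exactly to the running sample mean."""
    theta = 0.0
    for n, x in enumerate(samples, start=1):
        theta += (a / n) * (x - theta)
    return theta

def delta_rule(examples, lr=0.05, epochs=200):
    """The LMS/delta rule w_i += lr * (t - y) * x_i is the same kind of
    stochastic-approximation step, here for a linear unit y = w . x."""
    w = [0.0, 0.0]
    for _ in range(epochs):
        for (x0, x1), t in examples:
            err = t - (w[0] * x0 + w[1] * x1)
            w[0] += lr * err * x0
            w[1] += lr * err * x1
    return w

random.seed(0)
stream = [5.0 + random.gauss(0.0, 1.0) for _ in range(5000)]
mean_est = robbins_monro_mean(stream)        # converges toward the true mean 5.0

examples = [((x0, x1), 2.0 * x0 - 1.0 * x1)  # noiseless target t = 2*x0 - x1
            for x0, x1 in [(random.uniform(-1, 1), random.uniform(-1, 1))
                           for _ in range(50)]]
weights = delta_rule(examples)               # converges toward [2.0, -1.0]
```

The decaying-gain conditions that guarantee Robbins-Monro convergence are essentially the conditions usually quoted for perceptron-style learning rates, which is the sense in which the old and new algorithms are "intimately related".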
But the results showing that "multilayer feedforward networks are universal approximators" (Hornik et al., Neural Networks) and subsequent publications along that line suggest that multilayer networks offer access to more interesting possibilities than previous "one-shot" statistical classifiers. The "one-shot" is in contrast to hierarchical statistical classifiers or statistical decision trees (see Dattatreya & Kanal, "Decision Trees in Pattern Recognition", pp. 189-239, in Progress in Pattern Recognition 2, Kanal & Rosenfeld (eds.), North-Holland, 1985). Using an interactive pattern analysis and classification system to design statistical decision trees on some industrial inspection data from Global Holonetics (now unfortunately defunct), I found that a 5-minute session with the interactive system ISPAHAN/IPACS led to a two-level classifier implementing Fisher discriminants at each level, which got an error rate generally better than the one Global Holonetics had obtained using a NN board and error backpropagation running several thousand iterations on the training set. The point is that the same technology that is making neural net simulators so user friendly is also available now to make IPACS user friendly. However, as in the early days (see L. Kanal, "Interactive Pattern Analysis and Classification Systems--A Survey and Commentary", Proc. IEEE, vol. 60, pp. 1200-1215, Oct. 72), the NN algorithms are a lot easier for many engineers and computer scientists to get into than the highly mathematical statistical pattern recognition procedures. But pattern recognition continues to be a "bag of problems and a bag of tools", and it behoves us to understand the various tools available to us rather than expecting any one given methodology to do it all or do it consistently better than all others. So we have statistical, linguistic, structural (grammars, AND/OR graphs, SOME/OR graphs), ANN, genetic algorithms, etc.
methods available, and often a hybrid approach will be what satisfies a problem domain. As has been said, "he who only has a hammer thinks the whole world is a nail". I think the recent developments in ANNs are quite exciting and there are many new challenging problems to understand and resolve. But there is no free lunch, and we should not expect ANNs to free us from having to think hard about the true nature of the pattern recognition problems we wish to solve. By the way, our quick comparison of ISPAHAN/IPACS with BP was presented in a paper: Kanal, Gelsema, et al., "Comparing Hierarchical Statistical Classifiers with Error Backpropagation Neural Nets", Vision 89, Society of Manufacturing Engineers, Detroit, 1989. Please excuse the long note. I would have made it shorter if I had more time. L.K. From geoffg at cogs.sussex.ac.uk Fri Jul 27 05:35:58 1990 From: geoffg at cogs.sussex.ac.uk (Geoffrey Goodhill) Date: Fri, 27 Jul 90 10:35:58 +0100 Subject: pattern recognition comparisons Message-ID: <20618.9007270935@ticsuna.cogs.susx.ac.uk> Re: Mike Mozer's comments about NNs vs. conventional techniques. > Neural net algorithms just let you > do a lot of the same things that traditional statistical algorithms allow > you to do, but they are more accessible to many people (and perhaps > easier to use). I entirely agree with this point of view. I would argue that the neat thing about neural networks is that they provide insight into what sort of mathematics may be useful for understanding the brain, and hence how to solve perceptual-type problems: statistics, gradient descent, mean field theory, etc. However, I see no reason why NN implementations of these sorts of mathematics, which are constrained by attempting to be "biologically plausible", should outperform more general methods which do not have this constraint on practical real-world problems.
Geoff From thsspxw at iitmax.iit.edu Fri Jul 27 14:45:08 1990 From: thsspxw at iitmax.iit.edu (Peter Wohl) Date: Fri, 27 Jul 90 12:45:08 -0600 Subject: abstract Message-ID: <9007271845.AA18103@iitmax.iit.edu> This is the abstract of a paper to appear in the Proceedings of the International Conference on Parallel Processing, 08/13-17, 1990: SIMD Neural Net Mapping on MIMD Architectures ============================================= Peter Wohl and Thomas W. Christopher Dept. of Computer Science Illinois Institute of Technology IIT Center, Chicago, IL 60616 Abstract -------- The massively parallel, communication intensive SIMD algorithm of multilayer back-propagation neural networks was encoded for coarse grained MIMD architectures using a relatively low level message-driven programming paradigm. The computation / communication ratio was software adjusted by defining a "temporal window" grain over the set of training instances. Extensive experiments showed an improvement in multiprocessor utilization over similar reported results and the simulator scaled up well with more complex networks on larger machines. The code can be easily modified to accommodate back-prop variations, like quickprop or cascade-correlation learning, as well as other neural network architectures and learning algorithms. [] Copies of the paper can be obtained by writing to the authors at the above address, or email-ing to: thsspxw at iitmax.iit.edu (P. Wohl). -Peter Wohl
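[Editor's note: one way to picture the "temporal window" grain described in Wohl's abstract -- this is my own sketch, not the authors' code -- is to accumulate the weight-update contributions of W consecutive training instances locally and exchange/apply them in a single step, so that on a coarse-grained multiprocessor the number of communication events drops by a factor of W while the arithmetic stays the same.]

```python
def train_with_window(data, window, lr=0.1):
    """Delta-rule training of a single weight (y = w * x), accumulating
    gradient contributions over `window` instances before applying them
    in one step -- one 'communication event' per window on a multiprocessor."""
    w, acc, comms = 0.0, 0.0, 0
    for i, (x, t) in enumerate(data, start=1):
        acc += (t - w * x) * x            # local gradient contribution
        if i % window == 0:
            w += lr * acc / window        # single exchange for the window
            acc, comms = 0.0, comms + 1
    return w, comms

data = [(x / 10.0, 3.0 * (x / 10.0)) for x in range(1, 11)] * 50  # t = 3x
w1, comms1 = train_with_window(data, window=1)
w5, comms5 = train_with_window(data, window=5)
# Both runs approach w = 3.0, but the 5-instance window needs a fifth
# of the communication events (100 instead of 500).
```

Choosing the window size is then a software-level trade between communication cost and how "fresh" each update is, which is presumably what made it a convenient tuning knob on MIMD hardware.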
From het at seiden.psych.mcgill.ca Tue Jul 3 15:41:55 1990 From: het at seiden.psych.mcgill.ca (Phil Hetherington) Date: Tue, 3 Jul 90 15:41:55 EDT Subject: cog sci accomodation info Message-ID: <9007031941.AA15061@seiden.psych.mcgill.ca.> I would like any information on cheap accommodation in Boston for the Cognitive Science Conference. I'm looking for something on a bus or subway line so that I can commute. Does anyone know of a cheap hotel/motel/bed-breakfast place where I can crash? There are four of us looking. Please send to: hynie at seiden.psych.mcgill.ca Thanks. From PU6MI6Q5%CINE88.CINECA.IT at vma.CC.CMU.EDU Tue Jul 3 16:33:00 1990 From: PU6MI6Q5%CINE88.CINECA.IT at vma.CC.CMU.EDU (PU6MI6Q5%CINE88.CINECA.IT@vma.CC.CMU.EDU) Date: Tue, 3 JUL 90 16:33 N Subject: e-address change Message-ID: <2569@CINE88.CINECA.IT> Please delete this account from the mailing list. In addition, please add "pu6mi6q2 at icineca2.bitnet" as a new recipient. All attempts to send this to an administrator have resulted in returned e-mail.
-Joe Giampapa pu6mi6q5 at icineca2.bitnet From POPX at vax.oxford.ac.uk Thu Jul 5 06:07:04 1990 From: POPX at vax.oxford.ac.uk (POPX@vax.oxford.ac.uk) Date: Thu, 5 JUL 90 10:07:04 GMT Subject: No subject Message-ID: FROM: Jocelyn Paine, Department of Experimental Psychology, South Parks Road, Oxford OX1 3UD. JANET: POPX @ UK.AC.OX.VAX Phone: (0865) 271444 - messages. (0865) 271339 - direct. ******************************************* * * * POPLOG USERS' GROUP CONFERENCE 1990 * * * * JULY 17TH - 18TH * * * * OXFORD * * * ******************************************* Why am I posting news about a Poplog conference to the neural net digests? After all, Poplog is an implementation of Pop-11, Prolog, Lisp, and ML - all very conventional and symbolic AI languages. Well, following from work done by David Young at Sussex, you can now buy from Integral Solutions Limited (Poplog's commercial distributors) the "Poplog Neural" package. This allows you to design neural nets of various kinds; display them graphically using Poplog's windowing system; build fast production versions; and integrate what you've designed with existing code written in Pop-11, Prolog, Lisp, or ML. So if you need to build a mixed net/symbolic program, Poplog is well worth considering. And you get the convenience of a rather nice development environment for your nets; plus the four languages I've mentioned, a built-in editor, a window manager, and the object-oriented "Flavours" package. If you want to find out more about Poplog Neural, and Poplog in general, this year's User Group conference, PLUG90, is the place to do it. We still have places left at PLUG90, and can accept bookings if made quickly. The conference will be held in Oxford on the 17th and 18th of July; accommodation is provided in Keble College, and the talks themselves will be in Experimental Psychology. Registration will open at 11 am on the 17th, with the conference proper beginning at 2; it will close at about 4 on the 18th.
There will be a rather good conference dinner on the night of the 17th (main course: duck in lime and ginger sauce). The price is £75 to members of PLUG and £95 to non-members (£15 non-residential without dinner; £37 non-residential with dinner). Integral Solutions Limited, who distribute Poplog commercially, have generously paid for three free places. These will be offered to academic members of PLUG who have not attended a PLUG conference before, and who have difficulty raising funds. All three are still available. ~~~~~ This is the provisional list of talks: "Poplog Neural", Colin Shearer, Integral Solutions Ltd. (30 mins) A demonstration of ISL's new neural-networking system. It's implemented in Poplog, does its number crunching in Fortran, and allows you to build and test nets by drawing on a window. Fully interfaceable with Pop-11, Prolog, Lisp and ML. "Pop9X - The Standard", Steve Knight, Hewlett-Packard Labs. (1 hour) Gives a review of the BSI standardisation process and the progress of the Pop standard - the language YOU will be writing soon (ish). "Assembly code translation in Prolog", Ian O'Neill, Program Validation. (30 mins) State of the art assembly language translation in Prolog. "THESEUS: a production-system simulation of the spinning behaviour of an orb-web spider", Nick Gotts, Oxford University Zoology Department. (30 mins) As well as giving a demo, Nick will talk about his experiences of using AlphaPop: Theseus runs on a Macintosh. "MODEL: From Package to Language", James Anderson, Reading University. (1 hour) A six year old software package for model based vision goes up in flames under the heat of self-criticism - to be replaced by a language. TALK INCLUDES VIDEO PRESENTATION "GRIT - General Real-time Interactive Ikbs Toolset", Mark Swabey, Avonicom. (30 mins) Mark will talk about GRIT, but also about his experiences (nice and otherwise) of Poplog as a development environment.
"IVE - Interactive Visual Environment", Anthony Worrall, Reading University. (30 mins) The software environment that beat the pants off MODEL (based on MODEL and PWM and quite a lot of other things besides.) TALK INCLUDES VIDEO PRESENTATION "Embedded Systems", Rob Zancanato, Cambridge Consultants. (30 mins) Poplog for embedded systems - especially MUSE for designing real-time controllers. We hope to have a demo of one such self-contained system. "Doing representation theory in Prolog", John Fitzgerald, Oxford University Maths Department. (30 mins) Representation theory is part of the study of (mathematical) groups. Prolog copes surprisingly well with such a geometric topic. "Building User Interfaces with Flavours", Chris Price, Department of Computer Science, University College of Wales. (30 mins) Object-oriented user-interface design, using Poplog's OO flavours package and window manager. "TPM - a graphical Prolog debugger", Dick Broughton, Expert Systems Limited. (1 hour) Dick will show how a debugger should be designed: with TPM, you can display Prolog proof trees as trees, rewind and fast forward execution, zoom in and out, watch the "cut" prune branches, and generally do everything you can't do with 'spy'. "Processing of Road Accident Data", Jiashu Wu, UCL Transport Studies. (30 mins) UCL use Poplog for an EMYCIN-based expert system which advises on accident blackspots, taking 'raw' accident data from incident reports. They like Poplog because it's an "open system": its jobs include fuzzy matching, stats, and handling very big databases. "Faust - an online fault-diagnosis system", David Cockburn, Electricity Research and Development Centre. (30 mins) (to be confirmed) "Design for testability", Lawrence Smith, SD Scicon. (30 mins) (to be confirmed) A system for advising the users of CAD packages on loopholes in testability. "Something on the future of Poplog", Integral Solutions.
(1 hour) (details awaited) ~~~~~ And this is the provisional timetable: Accommodation is provided in Keble College, Parks Road, Oxford. Luggage can be left there from mid-day on the 17th. The conference itself will be in the Department of Experimental Psychology, South Parks Road. July 17th --------- Registration and coffee: 11:00 - 12:30 Lunch: 12:30 - 2:00 Talks: 2:00 - 3:30 Tea: 3:30 - 4:00 Talks: 4:00 - 6:00 (Depart for Keble). Keble bar opens from 6 to 11 pm. Dinner starts at 7. July 18th --------- From mjolsness-eric at CS.YALE.EDU Thu Jul 5 14:39:11 1990 From: mjolsness-eric at CS.YALE.EDU (Eric Mjolsness) Date: Thu, 5 Jul 90 14:39:11 EDT Subject: new TR: development Message-ID: <9007051839.AA06131@NEBULA.SYSTEMSZ.CS.YALE.EDU> ***** Please do not distribute to other lists ***** The following technical report is now available: "A Connectionist Model of Development" by Eric Mjolsness (1), David H. Sharp (2), and John Reinitz (3) (1) Department of Computer Science, Yale University (2) Theoretical Division, Los Alamos National Laboratory (3) Department of Biological Sciences, Columbia University Abstract: We present a phenomenological modeling framework for development. Our purpose is to provide a systematic method for discovering and expressing correlations in experimental data on gene expression and other developmental processes. The modeling framework is based on a connectionist or ``neural net'' dynamics for biochemical regulators, coupled to ``grammatical rules'' which describe certain features of the birth, growth, and death of cells, synapses and other biological entities. We outline how spatial geometry can be included, although this part of the model is not complete. As an example of the application of our results to a specific biological system, we show in detail how to derive a rigorously testable model of the network of segmentation genes operating in the blastoderm of Drosophila.
To further illustrate our methods, we sketch how they could be applied to two other important developmental processes: cell cycle control and cell-cell induction. We also present a simple biochemical model leading to our assumed connectionist dynamics which shows that the dynamics used is at least compatible with known chemical mechanisms. For copies of YALEU/DCS/RR-796, please send email to shanahan-mary at cs.yale.edu or physical mail to Eric Mjolsness Yale Computer Science Department P.O. Box 2158 Yale Station New Haven CT 06520-2158 ------- From LAUTRUP at nbivax.nbi.DK Mon Jul 9 04:33:00 1990 From: LAUTRUP at nbivax.nbi.DK (Benny Lautrup) Date: Mon, 9 Jul 90 10:33 +0200 (NBI, Copenhagen) Subject: Protein structure and neural nets Message-ID: <6970A295C1BFE0C4B0@nbivax.nbi.dk> Re: New paper on neural nets and protein structure Title: ``Analysis of the Secondary Structure of the Human Immunodeficiency Virus (HIV) Proteins p17, gp120, and gp41 by Computer Modeling Based on Neural Network Methods'' H. Andreassen, H. Bohr, J. Bohr, S. Brunak, T. Bugge, R.M.J. Cotterill, C. Jacobsen, P. Kusk, B. Lautrup, S.B. Petersen, T. Saermark, and K. Ulrich, in Journal of Acquired Immune Deficiency Syndromes (AIDS), vol. 3, 615-622, 1990. From sun at umiacs.UMD.EDU Mon Jul 9 12:07:46 1990 From: sun at umiacs.UMD.EDU (Guo-Zheng Sun) Date: Mon, 9 Jul 90 12:07:46 -0400 Subject: Protein structure and neural nets Message-ID: <9007091607.AA15551@neudec.umiacs.UMD.EDU> Dr. Lautrup: Could you please send me a copy of your paper about protein structure analysis using neural networks? My address is Guo-Zheng Sun UMIACS University of Maryland College Park, MD 20742 Tel. (301) 454-1984 Thank you in advance. From Atul.Chhabra at UC.EDU Mon Jul 9 19:59:08 1990 From: Atul.Chhabra at UC.EDU (atul k chhabra) Date: Mon, 9 Jul 90 19:59:08 EDT Subject: Character Recognition Bibliography? Message-ID: <9007092359.AA11073@uceng.UC.EDU> I am looking for a character recognition bibliography.
I am interested in all aspects of character recognition, i.e., preprocessing and segmentation, OCR, typewritten character recognition, handwritten character recognition, neural network based recognition, statistical and syntactic recognition, hardware implementations, and commercial character recognition systems. If someone out there has such a bibliography, or something that fits a part of the above description, I would appreciate receiving a copy. Even if you know of only a few references, please email them to me. I will summarize on the vision-list. Thanks, Atul Chhabra Department of Electrical & Computer Engineering University of Cincinnati, ML 030 Cincinnati, OH 45221-0030 From jmerelo at ugr.es Sat Jul 7 06:21:00 1990 From: jmerelo at ugr.es (JJ Merelo) Date: 7 Jul 90 12:21 +0200 Subject: Parameters in Kohonen's Feature Map Message-ID: <69*jmerelo@ugr.es> I have already implemented, in C for Sun machines, a Kohonen Feature Map, but I have found in the literature no way to choose the two parameters, k1 and k2, that appear in the alpha formula. Does anybody know how to compute them? Or is it just an empirical computation? JJ ================== From jmerelo at ugr.es Fri Jul 6 06:49:00 1990 From: jmerelo at ugr.es (JJ Merelo) Date: 6 Jul 90 12:49 +0200 Subject: Kohonen's network Message-ID: <65*jmerelo@ugr.es> I am working on Kohonen's network. I was previously using a 15-input, 8x8 output layer network, with the parameters stated in Kohonen's 1984 paper (also in Aleksander's book). I switched to a 16-input, 9x9 output layer, with the same parameters, and everything got screwed up. Should I use the same parameters? How should they be changed? What do they depend on?
Please, somebody answer, or I'll cast my Sun workstation through the window. JJ ==================
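[Editor's note on JJ Merelo's k1/k2 question above: as far as I know, the constants in Kohonen's gain schedule -- often written alpha(t) = k1 / (k2 + t), or replaced by a simple linearly decreasing alpha -- are chosen empirically, and they generally do need retuning when the map size or input dimension changes. A toy 1-D sketch in Python follows; the particular k1, k2, and radius schedule are just values that behave reasonably on this example, not Kohonen's.]

```python
import random

def train_som_1d(data, n_units=10, k1=50.0, k2=100.0, radius0=5):
    """1-D Kohonen feature map: winner c = argmin_i |x - w_i|; every unit
    within the (shrinking) neighborhood of c moves toward x by the gain
    alpha(t) = k1 / (k2 + t)."""
    w = [0.5 + 0.001 * i for i in range(n_units)]    # units start bunched up
    T = len(data)
    for t, x in enumerate(data):
        c = min(range(n_units), key=lambda i: abs(x - w[i]))
        alpha = k1 / (k2 + t)
        radius = max(1, int(radius0 * (1 - t / T)))  # neighborhood shrinks
        for i in range(max(0, c - radius), min(n_units, c + radius + 1)):
            w[i] += alpha * (x - w[i])
    return w

def quantization_error(data, w):
    return sum(min(abs(x - wi) for wi in w) for x in data) / len(data)

random.seed(1)
data = [random.random() for _ in range(4000)]
w = train_som_1d(data)
# After training, the units spread out over [0, 1]; the quantization error
# ends well below the ~0.25 obtained with all ten units sitting near 0.5.
```

On the second question: the parameters are not independent of the network -- a larger map typically wants a longer training run and a neighborhood radius scaled with the map -- which would be consistent with the same k1, k2 breaking after the switch from 8x8 to 9x9.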
From POSTMAST at idui1.uidaho.edu Mon Jul 2 11:22:05 1990 From: POSTMAST at idui1.uidaho.edu (Marty Zimmerman) Date: Mon, 02 Jul 90 11:22:05 PAC Subject: No subject Message-ID: ========================================================================= From Marty Mon Jul 2 11:12:23 1990 From: Marty (208) Date: 2 July 1990, 11:12:23 PAC Subject: No subject Message-ID: Greetings: We have recently removed one of our BITNET nodes here at the University of Idaho: IDCSVAX. Currently, your electronic digest is being sent to userid KULAS at IDCSVAX. Please change his address to kulas at ted.cs.uidaho.edu as soon as possible. Thank you. Marty Zimmerman, U of Idaho Postmaster From kruschke at ucs.indiana.edu Mon Jul 2 16:26:00 1990 From: kruschke at ucs.indiana.edu (KRUSCHKE,JOHN,PSY) Date: 2 Jul 90 15:26:00 EST Subject: roommate wanted for cog sci conference Message-ID: I'm looking for a roommate for the Cognitive Science Society Conference, July 25-28 (inclusive). I'll be staying at the Hyatt Regency. Your half of the cost would be $57.50 x 4 nights = $230, plus your share of any taxes. (I've checked rates at other places near MIT, and they're all much higher.) I don't smoke and I don't snore; that's all I ask of a roommate. ------------------------------------------------------------------------- John K. Kruschke Asst. Prof. of Psych. & Cog. Sci. Dept. of Psychology kruschke at ucs.indiana.edu Indiana University kruschke at iubacs.bitnet Bloomington, IN 47405-4201 USA (812) 855-3192 ------------------------------------------------------------------------- From het at seiden.psych.mcgill.ca Tue Jul 3 15:41:55 1990 From: het at seiden.psych.mcgill.ca (Phil Hetherington) Date: Tue, 3 Jul 90 15:41:55 EDT Subject: cog sci accomodation info Message-ID: <9007031941.AA15061@seiden.psych.mcgill.ca.> I would like any information on cheap accomodation in Boston for the Cognitive Science Conference. I'm looking for something on a bus or subway line so that I can comute. 
Does anyone know of a cheap hotel/motel/bed-breakfast place where I can crash? There are four of us looking. Please send to: hynie at seiden.psych.mcgill.ca Thanks. From PU6MI6Q5%CINE88.CINECA.IT at vma.CC.CMU.EDU Tue Jul 3 16:33:00 1990 From: PU6MI6Q5%CINE88.CINECA.IT at vma.CC.CMU.EDU (PU6MI6Q5%CINE88.CINECA.IT@vma.CC.CMU.EDU) Date: Tue, 3 JUL 90 16:33 N Subject: e-address change Message-ID: <2569@CINE88.CINECA.IT> Please delete this account from the mailing list. In addition, please add "pu6mi6q2 at icineca2.bitnet" as a new recipient. All attempts to send this to an administrator have resulted in returned e-mail. -Joe Giampapa pu6mi6q5 at icineca2.bitnet From POPX at vax.oxford.ac.uk Thu Jul 5 06:07:04 1990 From: POPX at vax.oxford.ac.uk (POPX@vax.oxford.ac.uk) Date: Thu, 5 JUL 90 10:07:04 GMT Subject: No subject Message-ID: FROM: Jocelyn Paine, Department of Experimental Psychology, South Parks Road, Oxford OX1 3UD. JANET: POPX @ UK.AC.OX.VAX Phone: (0865) 271444 - messages. (0865) 271339 - direct. ******************************************* * * * POPLOG USERS' GROUP CONFERENCE 1990 * * * * JULY 17TH - 18TH * * * * OXFORD * * * ******************************************* Why am I posting news about a Poplog conference to the neural net digests? After all, Poplog is an implementation of Pop-11, Prolog, Lisp, and ML - all very conventional and symbolic AI languages. Well, following from work done by David Young at Sussex, you can now buy from Integral Solutions Limited (Poplog's commercial distributors) the "Poplog Neural" package. This allows you to design neural nets of various kinds; display them graphically using Poplog's windowing system; build fast production versions; and integrate what you've designed with existing code written in Pop-11, Prolog, Lisp, or ML. So if you need to build a mixed net/symbolic program, Poplog is well worth considering. 
And you get the convenience of a rather nice development environment for your nets; plus the four languages I've mentioned, a built-in editor, a window manager, and the object-oriented "Flavours" package. If you want to find out more about Poplog Neural, and Poplog in general, this year's User Group conference, PLUG90, is the place to do it. We still have places left at PLUG90, and can accept bookings if made quickly. The conference will be held in Oxford on the 17th and 18th of July; accommodation is provided in Keble College, and the talks themselves will be in Experimental Psychology. Registration will open at 11 am on the 17th, with the conference proper beginning at 2; it will close at about 4 on the 18th. There will be a rather good conference dinner on the night of the 17th (main course: duck in lime and ginger sauce). The price is #75 to members of PLUG and #95 to non-members (#15 non-residential without dinner; #37 non-residential with dinner). Integral Solutions Limited, who distribute Poplog commercially, have generously paid for three free places. These will be offered to academic members of PLUG who have not attended a PLUG conference before, and who have difficulty raising funds. All three are still available.

~~~~~

This is the provisional list of talks:

"Poplog Neural", Colin Shearer, Integral Solutions Ltd. (30 mins)
A demonstration of ISL's new neural-networking system. It's implemented in Poplog, does its number crunching in Fortran, and allows you to build and test nets by drawing on a window. Fully interfaceable with Pop-11, Prolog, Lisp and ML.

"Pop9X - The Standard", Steve Knight, Hewlett-Packard Labs. (1 hour)
A review of the BSI standardisation process and the progress of the Pop standard - the language YOU will be writing soon (ish).

"Assembly code translation in Prolog", Ian O'Neill, Program Validation. (30 mins)
State-of-the-art assembly language translation in Prolog.
"THESEUS: a production-system simulation of the spinning behaviour of an orb-web spider", Nick Gotts, Oxford University Zoology Department. (30 mins)
As well as giving a demo, Nick will talk about his experiences of using AlphaPop: Theseus runs on a Macintosh.

"MODEL: From Package to Language", James Anderson, Reading University. (1 hour)
A six-year-old software package for model-based vision goes up in flames under the heat of self-criticism - to be replaced by a language. TALK INCLUDES VIDEO PRESENTATION

"GRIT - General Real-time Interactive Ikbs Toolset", Mark Swabey, Avonicom. (30 mins)
Mark will talk about GRIT, but also about his experiences (nice and otherwise) of Poplog as a development environment.

"IVE - Interactive Visual Environment", Anthony Worrall, Reading University. (30 mins)
The software environment that beat the pants off MODEL (based on MODEL and PWM and quite a lot of other things besides). TALK INCLUDES VIDEO PRESENTATION

"Embedded Systems", Rob Zancanato, Cambridge Consultants. (30 mins)
Poplog for embedded systems - especially MUSE for designing real-time controllers. We hope to have a demo of one such self-contained system.

"Doing representation theory in Prolog", John Fitzgerald, Oxford University Maths Department. (30 mins)
Representation theory is part of the study of (mathematical) groups. Prolog copes surprisingly well with such a geometric topic.

"Building User Interfaces with Flavours", Chris Price, Department of Computer Science, University College of Wales. (30 mins)
Object-oriented user-interface design, using Poplog's OO flavours package and window manager.

"TPM - a graphical Prolog debugger", Dick Broughton, Expert Systems Limited. (1 hour)
Dick will show how a debugger should be designed: with TPM, you can display Prolog proof trees as trees, rewind and fast-forward execution, zoom in and out, watch the "cut" prune branches, and generally do everything you can't do with 'spy'.
"Processing of Road Accident Data", Jiashu Wu, UCL Transport Studies. (30 mins)
UCL use Poplog for an EMYCIN-based expert system which advises on accident blackspots, taking 'raw' accident data from incident reports. They like Poplog because it's an "open system": its jobs include fuzzy matching, stats, and handling very big databases.

"Faust - an online fault-diagnosis system", David Cockburn, Electricity Research and Development Centre. (30 mins) (to be confirmed)

"Design for testability", Lawrence Smith, SD Scicon. (30 mins) (to be confirmed)
A system for advising the users of CAD packages on loopholes in testability.

Something on the future of Poplog, Integral Solutions. (1 hour) (details awaited)

~~~~~

And this is the provisional timetable. Accommodation is provided in Keble College, Parks Road, Oxford. Luggage can be left there from mid-day on the 17th. The conference itself will be in the Department of Experimental Psychology, South Parks Road.

July 17th
---------
Registration and coffee: 11:00 - 12:30
Lunch: 12:30 - 2:00
Talks: 2:00 - 3:30
Tea: 3:30 - 4:00
Talks: 4:00 - 6:00
(Depart for Keble). Keble bar opens from 6 to 11 pm. Dinner starts at 7.

July 18th
---------

From mjolsness-eric at CS.YALE.EDU Thu Jul 5 14:39:11 1990 From: mjolsness-eric at CS.YALE.EDU (Eric Mjolsness) Date: Thu, 5 Jul 90 14:39:11 EDT Subject: new TR: development Message-ID: <9007051839.AA06131@NEBULA.SYSTEMSZ.CS.YALE.EDU> ***** Please do not distribute to other lists ***** The following technical report is now available: "A Connectionist Model of Development" by Eric Mjolsness (1), David H. Sharp (2), and John Reinitz (3) (1) Department of Computer Science, Yale University (2) Theoretical Division, Los Alamos National Laboratory (3) Department of Biological Sciences, Columbia University Abstract: We present a phenomenological modeling framework for development.
Our purpose is to provide a systematic method for discovering and expressing correlations in experimental data on gene expression and other developmental processes. The modeling framework is based on a connectionist or ``neural net'' dynamics for biochemical regulators, coupled to ``grammatical rules'' which describe certain features of the birth, growth, and death of cells, synapses and other biological entities. We outline how spatial geometry can be included, although this part of the model is not complete. As an example of the application of our results to a specific biological system, we show in detail how to derive a rigorously testable model of the network of segmentation genes operating in the blastoderm of Drosophila. To further illustrate our methods, we sketch how they could be applied to two other important developmental processes: cell cycle control and cell-cell induction. We also present a simple biochemical model leading to our assumed connectionist dynamics which shows that the dynamics used is at least compatible with known chemical mechanisms. For copies of YALEU/DCS/RR-796, please send email to shanahan-mary at cs.yale.edu or physical mail to Eric Mjolsness Yale Computer Science Department P.O. Box 2158 Yale Station New Haven CT 06520-2158 ------- From LAUTRUP at nbivax.nbi.DK Mon Jul 9 04:33:00 1990 From: LAUTRUP at nbivax.nbi.DK (Benny Lautrup) Date: Mon, 9 Jul 90 10:33 +0200 (NBI, Copenhagen) Subject: Protein structure and neural nets Message-ID: <6970A295C1BFE0C4B0@nbivax.nbi.dk> Re: New paper on neural nets and protein structure Title: ``Analysis of the Secondary Structure of the Human Immunodeficiency Virus (HIV) Proteins p17, gp120, and gp41 by Computer Modeling Based on Neural Network Methods'' H. Andreassen, H. Bohr, J. Bohr, S. Brunak, T. Bugge, R.M.J. Cotterill, C. Jacobsen, P. Kusk, B. Lautrup, S.B. Petersen, T. Saermark, and K. Ulrich, in Journal of Acquired Immune Deficiency Syndromes (AIDS), vol. 3, 615-622, 1990.
From sun at umiacs.UMD.EDU Mon Jul 9 12:07:46 1990 From: sun at umiacs.UMD.EDU (Guo-Zheng Sun) Date: Mon, 9 Jul 90 12:07:46 -0400 Subject: Protein structure and neural nets Message-ID: <9007091607.AA15551@neudec.umiacs.UMD.EDU> Dr. Lautrup: Could you please send me a copy of your paper about protein structure analysis using neural networks? My address is Guo-Zheng Sun UMIACS University of Maryland College Park, MD 20742 Tel. (301)454-1984 Thank you in advance. From Atul.Chhabra at UC.EDU Mon Jul 9 19:59:08 1990 From: Atul.Chhabra at UC.EDU (atul k chhabra) Date: Mon, 9 Jul 90 19:59:08 EDT Subject: Character Recognition Bibliography? Message-ID: <9007092359.AA11073@uceng.UC.EDU> I am looking for a character recognition bibliography. I am interested in all aspects of character recognition, i.e., preprocessing and segmentation, OCR, typewritten character recognition, handwritten character recognition, neural network based recognition, statistical and syntactic recognition, hardware implementations, and commercial character recognition systems. If someone out there has such a bibliography, or something that fits a part of the above description, I would appreciate receiving a copy. Even if you know of only a few references, please email me the references. Please email the references or bibliography to me. I will summarize on the vision-list. Thanks, Atul Chhabra Department of Electrical & Computer Engineering University of Cincinnati, ML 030 Cincinnati, OH 45221-0030 From jmerelo at ugr.es Sat Jul 7 06:21:00 1990 From: jmerelo at ugr.es (JJ Merelo) Date: 7 Jul 90 12:21 +0200 Subject: Parameters in Kohonen's Feature Map Message-ID: <69*jmerelo@ugr.es> I have already implemented, in C for Sun machines, a Kohonen Feature Map, but I have found no way in the literature to change the two parameters, k1 and k2, that appear in the alpha formula. Does anybody know how to compute them? Or is it just an empirical computation?
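For what it's worth, there seems to be no closed-form way to compute them: k1 and k2 are usually set empirically, and they typically depend on the map size and on how many training steps you run. One common recipe (an assumption here, not anything prescribed by Kohonen's papers) is a learning rate alpha(t) = k1 * exp(-t/k2), with the neighborhood radius shrinking on the same schedule. A minimal sketch in Python, where train_som and all parameter defaults are hypothetical:

```python
import numpy as np

def train_som(data, rows=8, cols=8, steps=5000, k1=0.5, k2=None, seed=0):
    """Kohonen feature map with an exponentially decaying learning rate.

    k1 is the initial learning rate and k2 the decay time constant
    (hypothetical defaults -- both are normally tuned empirically)."""
    if k2 is None:
        k2 = steps  # a common choice: decay over the length of the run
    rng = np.random.default_rng(seed)
    w = rng.random((rows, cols, data.shape[1]))       # weight vectors
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                indexing="ij"), axis=-1)
    sigma0 = max(rows, cols) / 2.0                    # initial neighborhood radius
    for t in range(steps):
        x = data[rng.integers(len(data))]
        # winner: the unit whose weight vector is closest to the input
        dist = np.linalg.norm(w - x, axis=-1)
        win = np.array(np.unravel_index(np.argmin(dist), dist.shape))
        alpha = k1 * np.exp(-t / k2)                  # decaying learning rate
        sigma = sigma0 * np.exp(-t / k2)              # shrinking neighborhood
        h = np.exp(-np.sum((grid - win) ** 2, axis=-1) / (2.0 * sigma ** 2))
        w += alpha * h[..., None] * (x - w)           # pull neighborhood toward x
    return w
```

Note that with a larger map (say 9x9 instead of 8x8) the initial radius and the decay constant generally need to grow with it, which is one reason a fixed parameter set can stop working when the network size changes.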
JJ ================== From jmerelo at ugr.es Fri Jul 6 06:49:00 1990 From: jmerelo at ugr.es (JJ Merelo) Date: 6 Jul 90 12:49 +0200 Subject: Kohonen's network Message-ID: <65*jmerelo@ugr.es> I am working on Kohonen's network. I was previously using a 15-input, 8x8-output-layer network, with the parameters stated in Kohonen's 1984 paper (also in Aleksander's book). I switched to a 16-input, 9x9-output layer, with the same parameters, and everything got screwed up. Should I use the same parameters? How should they be changed? What do they depend on? Please, somebody answer, or I'll cast my Sun ws thru the window JJ ================== From cpd at aic.hrl.hac.com Tue Jul 10 13:52:42 1990 From: cpd at aic.hrl.hac.com (cpd@aic.hrl.hac.com) Date: Tue, 10 Jul 90 10:52:42 -0700 Subject: Call for Participation in Connectionist Natural Language Processing Message-ID: <9007101752.AA04818@aic.hrl.hac.com> AAAI Spring Symposium Connectionist Natural Language Processing Recent results have led some researchers to propose that connectionism is an alternative to AI/Linguistic approaches to natural language processing, both as a cognitive model and for practical applications. This symposium will bring together both critics and proponents of connectionist NLP to discuss its strengths and weaknesses. This symposium will cover a number of areas, ranging from new phonology models to connectionist treatments of anaphora and discourse issues. Participants should address what is new that connectionism brings to the study of language. The purpose of the symposium is to examine this issue from a range of perspectives including: Spoken language understanding/generation Parsing Semantics Pragmatics Language acquisition Linguistic and representational capacity issues Applications Some of the questions expected to be addressed include: What mechanisms/representations from AI/Linguistics are necessary for connectionist NLP? Why?
Can connectionism help integrate signal processing with knowledge of language? What does connectionism add to other theories of semantics? Do connectionist theories have implications for psycholinguistics? Prospective participants are encouraged to contact a member of the program committee to obtain a more detailed description of the symposium's goals and issues. Those interested in participating in this symposium are asked to submit a 1-2 page position paper abstract and a list of relevant publications. Abstracts of work in progress are encouraged, and potential participants may also include 3 copies of a full length paper describing previous work. Submitted papers or abstracts will be included in the symposium working notes, and participants will be asked to participate in panel discussions. Three (3) copies of each submission should be sent to arrive by November 16, 1990 to: Charles Dolan, Hughes Research Laboratories, RL96, 3011 Malibu Canyon Road, Malibu CA, 90265 All submissions will be promptly acknowledged. E-Mail inquiries may be sent to: cpd at aic.hrl.hac.com Program Committee: Robert Allen, Charles Dolan (chair), James McClelland, Peter Norvig, and Jordan Pollack. From Atul.Chhabra at UC.EDU Tue Jul 10 15:44:32 1990 From: Atul.Chhabra at UC.EDU (atul k chhabra) Date: Tue, 10 Jul 90 15:44:32 EDT Subject: Character Recognition Bibliography? Message-ID: <9007101944.AA07725@uceng.UC.EDU> > Please email the references or bibliography to me. I will summarize on the > vision-list. Thanks, ^^^^^^^^^^^ I posted the request for a character recognition bibliography to connectionists at cs.cmu.edu, vision-list at ads.com, and to the usenet newsgroup comp.ai.neural-nets. I will post a summary of the responses on all of these mailing lists/newsgroups. Although I prefer the bibtex format, any format will do. 
Atul Chhabra Department of Electrical & Computer Engineering University of Cincinnati, ML 030 Cincinnati, OH 45221-0030 From cpd at aic.hrl.hac.com Tue Jul 10 19:52:56 1990 From: cpd at aic.hrl.hac.com (cpd@aic.hrl.hac.com) Date: Tue, 10 Jul 90 16:52:56 -0700 Subject: Apologies on Call for Participation Message-ID: <9007102352.AA05988@aic.hrl.hac.com> My previous message left out some information. AAAI Spring Symposium March 26-28 Stanford University Connectionist Natural Language Processing ... from previous message... All submissions will be promptly acknowledged and registration materials will be sent to authors who submit abstracts. The Spring Symposium Series is an annual event of AAAI. Members of AAAI will receive a mailing from them. Attendance is by submission and acceptance of materials only, except that AAAI will fill available spaces if the program committee does not select enough people. ("Enough people" means 30-40.) No proceedings will be published, but working papers, with submissions from attendees, will be distributed at the Symposium. Sorry for the momentary confusion -Charlie Dolan From tsejnowski at UCSD.EDU Fri Jul 13 21:06:21 1990 From: tsejnowski at UCSD.EDU (Terry Sejnowski) Date: Fri, 13 Jul 90 18:06:21 PDT Subject: NEURAL COMPUTATION 2:2 Message-ID: <9007140106.AA07155@sdbio2.UCSD.EDU> NEURAL COMPUTATION Table of Contents -- Volume 2:2

Visual Perception of Three-Dimensional Motion
    David J. Heeger and Allan Jepson

Distributed Symbolic Representation of Visual Shape
    Eric Saund

Modeling Orientation Discrimination at Multiple Reference Orientations with a Neural Network
    M. Devos and G. A. Orban

Temporal Differentiation and Violation of Time-Reversal Invariance in Neurocomputation of Visual Information
    D. S. Tang and V. Menon

Analysis of Linsker's Simulations of Hebbian Rules
    David J. C. MacKay and Kenneth D. Miller

Applying Temporal Difference Methods to Fully Recurrent Reinforcement Learning Networks
    Jurgen Schmidhuber

Generalizing Smoothness Constraints from Discrete Samples
    Chuanyi Ji, Robert R. Snapp, and Demetri Psaltis

The Upstart Algorithm: A Method for Constructing and Training Feedforward Neural Networks
    Marcus Frean

Layered Neural Networks with Gaussian Hidden Units As Universal Approximations
    Eric Hartman, James D. Keeler, and Jacek M. Kowalski

A Neural Network for Nonlinear Bayesian Estimation in Drug Therapy
    Reza Shadmehr and David D'Argenio

Analysis of Neural Networks with Redundancy
    Yoshio Izui and Alex Pentland

Stability of the Random Neural Network Model
    Erol Gelenbe

The Perceptron Algorithm Is Fast for Nonmalicious Distributions
    Eric B. Baum

SUBSCRIPTIONS: Volume 2 ______ $35 Student ______ $50 Individual ______ $100 Institution Add $12 for postage outside USA and Canada surface mail. Add $18 for air mail. (Back issues of volume 1 are available for $25 each.) MIT Press Journals, 55 Hayward Street, Cambridge, MA 02142. (617) 253-2889. ----- From jmerelo at ugr.es Sun Jul 15 13:33:00 1990 From: jmerelo at ugr.es (JJ Merelo) Date: 15 Jul 90 19:33 +0200 Subject: Visit to Poland Message-ID: <111*jmerelo@ugr.es> I am a professor and PhD student working on the subject of neural networks for speech recognition. I am going to make a tourist visit to Poland on August 12, and I would like to meet anybody working on the same subject. If anybody is interested, please e-mail to my address so that we can arrange a meeting. Thanks JJ MERELO JMERELO at UGR.ES ================== From jmerelo at ugr.es Sun Jul 15 13:33:00 1990 From: jmerelo at ugr.es (JJ Merelo) Date: 15 Jul 90 19:33 +0200 Subject: Anybody in Poland working in NN Message-ID: <95*jmerelo@ugr.es> I am going to Poland very soon on a tourist visit, and I would like to contact anybody working on the same subject.
If there is anybody in Cracow or Warsaw working on any aspect of Neural Networks, please e-mail to JJ Merelo JMERELO at UGR.ES giving me his/her name, address and e-mail address, and we will try to meet when I go to his/her city. The visit will be in the 3rd week of August. From fozzard at boulder.Colorado.EDU Tue Jul 17 14:19:01 1990 From: fozzard at boulder.Colorado.EDU (Richard Fozzard) Date: Tue, 17 Jul 90 12:19:01 -0600 Subject: Networks for pattern recognition problems? Message-ID: <9007171819.AA06599@alumni.colorado.edu> Do you know of any references to work done using connectionist (neural) networks for pattern recognition problems? I am particularly interested in problems where the network was shown to outperform traditional algorithms. I am working on a presentation to NOAA (National Oceanic and Atmospheric Admin.) management that partially involves pattern recognition and am trying to argue against the statement: "...results thus far [w/ networks] have not been notably more impressive than with more traditional pattern recognition techniques". I have always felt that pattern recognition is one of the strengths of connectionist network approaches over other techniques and would like some references to back this up.
thanks much, rich ======================================================================== Richard Fozzard "Serendipity empowers" Univ of Colorado/CIRES/NOAA R/E/FS 325 Broadway, Boulder, CO 80303 fozzard at boulder.colorado.edu (303)497-6011 or 444-3168 From ahmad at ICSI.Berkeley.EDU Tue Jul 17 16:30:06 1990 From: ahmad at ICSI.Berkeley.EDU (Subutai Ahmad) Date: Tue, 17 Jul 90 13:30:06 PDT Subject: New tech report Message-ID: <9007172030.AA02134@icsib18.Berkeley.EDU> The following technical report is available: A Network for Extracting the Locations of Point Clusters Using Selective Attention by Subutai Ahmad & Stephen Omohundro International Computer Science Institute ICSI Technical Report #90-011 Abstract This report explores the problem of dynamically computing visual relations in connectionist systems. It concentrates on the task of learning whether three clumps of points in a 256x256 image form an equilateral triangle. We argue that feed-forward networks for solving this task would not scale well to images of this size. One reason for this is that local information does not contribute to the solution: it is necessary to compute relational information such as the distances between points. Our solution implements a mechanism for dynamically extracting the locations of the point clusters. It consists of an efficient focus of attention mechanism and a cluster detection scheme. The focus of attention mechanism allows the system to select any circular portion of the image in constant time. The cluster detector directs the focus of attention to clusters in the image. These two mechanisms are used to sequentially extract the relevant coordinates. With this new representation (locations of the points) very few training examples are required to learn the correct function. The resulting network is also very compact: the number of required weights is proportional to the number of input pixels. 
Copies can be obtained in one of two ways:

1) ftp a postscript copy from cheops.cis.ohio-state.edu. The file is ahmad.tr90-11.ps.Z in the pub/neuroprose directory. You can either use the Getps script or follow these steps:

unix:2> ftp cheops.cis.ohio-state.edu
Connected to cheops.cis.ohio-state.edu.
Name (cheops.cis.ohio-state.edu:): anonymous
331 Guest login ok, send ident as password.
Password: neuron
230 Guest login ok, access restrictions apply.
ftp> cd pub/neuroprose
ftp> binary
ftp> get ahmad.tr90-11.ps.Z
ftp> quit
unix:4> uncompress ahmad.tr90-11.ps.Z
unix:5> lpr ahmad.tr90-11.ps

2) Order a hard copy from ICSI: The cost is $1.75 per copy for postage and handling. Please enclose your check with the order. Charges will be waived for ICSI sponsors and for institutions that have exchange agreements with the Institute. Make checks payable (in U.S. Dollars only) to "ICSI" and send to: International Computer Science Institute 1947 Center Street, Suite 600 Berkeley, CA 94704 Be sure to mention TR 90-011 and include your physical address. For more information, send e-mail to: info at icsi.berkeley.edu --Subutai Ahmad ahmad at icsi.berkeley.edu From MJ_CARTE%UNHH.BITNET at vma.CC.CMU.EDU Tue Jul 17 17:53:00 1990 From: MJ_CARTE%UNHH.BITNET at vma.CC.CMU.EDU (MJ_CARTE%UNHH.BITNET@vma.CC.CMU.EDU) Date: Tue, 17 Jul 90 16:53 EST Subject: Tech Report Available Message-ID: ***** Please Do Not Distribute to Other Bulletin Boards ***** The following technical report may be requested in hardcopy by sending e-mail to e_mcglauflin at unhh.bitnet and specifying UNH Intelligent Structures Group Report ECE.IS.90.03, or by sending physical mail to Michael J. Carter Dept. of Electrical and Computer Engineering University of New Hampshire Durham, NH 03824-3591 Abstract follows: "Slow Learning in CMAC Networks and Implications for Fault Tolerance" by M.J. Carter, A.J. Nucci, E. An, W.T. Miller, and F.J.
Rudolph Intelligent Structures Group Dept. of Electrical and Computer Engineering University of New Hampshire Durham, NH 03824 The overlapping structure of receptive fields in the CMAC network can produce situations in which learning is unusually slow. It is shown that sinusoidal functions with spatial frequency near certain critical frequencies are particularly hard to learn with conventional CMAC networks. Moreover, the resulting learned weights have significantly greater RMS value near these critical frequencies, and this poses some concern for network fault tolerance. It is then demonstrated that CMAC networks using tapered receptive fields often exhibit faster learning near the critical frequencies, and the resulting learned weights can have smaller RMS value than those obtained using the conventional CMAC. **** Please Do Not Distribute to Other Bulletin Boards **** Mike Carter From elman at amos.ucsd.edu Fri Jul 20 19:09:46 1990 From: elman at amos.ucsd.edu (Jeff Elman) Date: Fri, 20 Jul 90 16:09:46 PDT Subject: TR announcement: 'Learning & evolution in neural networks' Message-ID: <9007202309.AA05366@amos.ucsd.edu> Learning and Evolution in Neural Networks by Stefano Nolfi Jeffrey L. Elman Domenico Parisi CRL-TR-9019, July 1990 Center for Research in Language University of California, San Diego La Jolla, CA 92093-0126 ABSTRACT In this report we present the results of a series of simulations in which neural networks undergo change as a result of two forces: learning during the "lifetime" of a network, and evolutionary change over the course of several "generations" of networks. The results demonstrate how complex and apparently purposeful behavior can arise from random variation in networks. We believe that these results provide a good starting basis for modeling the more complex phenomena observed in biological systems.
A more specific problem for which our results may be relevant is determining the role of behavior in evolution (Plotkin, 1988); that is, how learning at the individual level can have an influence on evolution at the population level within a strictly Darwinian--not Lamarckian--framework. -------------------------------------- Copies of this technical report may be requested by email, by sending a letter to crl at amos.ucsd.edu and requesting TR-9019. From Atul.Chhabra at UC.EDU Sat Jul 21 14:06:31 1990 From: Atul.Chhabra at UC.EDU (atul k chhabra) Date: Sat, 21 Jul 90 14:06:31 EDT Subject: (Summary) Re: Character Recognition Bibliography? Message-ID: <9007211806.AA21705@uceng.UC.EDU> Here is a summary of what I received in response to my request for references on character recognition. I had asked for references on all aspects of character recognition -- preprocessing and segmentation, OCR, typewritten character recognition, handwritten character recognition, neural network based recognition, statistical and syntactic recognition, hardware implementations, and commercial character recognition systems. THANKS TO ALL WHO RESPONDED. IF ANYONE OUT THERE HAS MORE REFERENCES, PLEASE EMAIL ME. I WILL SUMMARIZE NEW RESPONSES AFTER ANOTHER TWO WEEKS. THANKS. Atul Chhabra Department of Electrical & Computer Engineering University of Cincinnati, ML 030 Cincinnati, OH 45221-0030 Phone: (513)556-6297 Email: achhabra at uceng.uc.edu ---------------------------------------------------------- From: Sol Delgado Instituto de Automatica Industrial La Poveda Arganda del Rey 28500 MADRID SPAIN sol at iai.es

[ 1]_ Off-Line cursive script word recognition. Radmilo M. Bozinovic, Sargur N. Srihari. IEEE transactions on pattern analysis and machine intelligence. Vol. 11, January 1989.

[ 2]_ Visual recognition of script characters. Neural network architectures. Josef Skrzypek, Jeff Hoffman. MPL (Machine Perception Lab). Nov 1989.
[ 3]_ On recognition of printed characters of any font and size. Simon Kahan, Theo Pavlidis, Henry S. Baird. IEEE transactions on pattern analysis and machine intelligence. Vol PAMI-9, No 2, March 1987.

[ 4]_ Research on machine recognition of handprinted characters. Shunji Mori, Kazuhiko Yamamoto, Michio Yasuda. IEEE transactions on pattern analysis and machine intelligence. Vol PAMI-6, No 4, July 1984.

[ 5]_ A pattern description and generation method of structural characters. Hiroshi Nagahashi, Mikio Nakatsuyama. IEEE transactions on pattern analysis and machine intelligence. Vol PAMI-8, No 1, January 1986.

[ 6]_ An on-line procedure for recognition of handprinted alphanumeric characters. W. W. Loy, I. D. Landau. IEEE transactions on pattern analysis and machine intelligence. Vol PAMI-4, No 4, July 1982.

[ 7]_ A string correction algorithm for cursive script recognition. Radmilo Bozinovic, Sargur N. Srihari. IEEE transactions on pattern analysis and machine intelligence. Vol PAMI-4, No 6, November 1982.

[ 8]_ Analysis and design of a decision tree based on entropy reduction and its application to large character set recognition. Qing Ren Wang, Ching Y. Suen. IEEE transactions on pattern analysis and machine intelligence. Vol PAMI-6, No 4, July 1984.

[ 9]_ A method for selecting constrained hand-printed character shapes for machine recognition. Rajjan Shinghal, Ching Y. Suen. IEEE transactions on pattern analysis and machine intelligence. Vol PAMI-4, No 1, January 1982.

[10]_ Pixel classification based on gray level and local "busyness". Philip A. Dondes, Azriel Rosenfeld. IEEE transactions on pattern analysis and machine intelligence. Vol PAMI-4, No 1, January 1982.

[11]_ Experiments in the contextual recognition of cursive script. Roger W. Ehrich, Kenneth J. Koehler. IEEE transactions on computers, Vol C-24, No 2, February 1975.

[12]_ Character recognition by computer and applications. Ching Y. Suen. Handbook of pattern recognition and image processing.
ACADEMIC PRESS, INC. August 1988. [13]_ A robust algorithm for text string separation from mixed text/graphics images Lloyd Alan Fletcher, Rangachar Kasturi IEEE transactions on pattern analysis and machine intelligence. Vol 10, No 6, November 1988. [14]_ Segmentation of document images. Torfinn Taxt, Patrick J. Flynn, Anil K. Jain IEEE transactions on pattern analysis and machine intelligence. Vol 11, No 12, december 1989. [15]_ Experiments in text recognition with Binary n_Gram and Viterbi algorithms. Jonathan J. Hull, Sargur N. Srihari IEEE transactions on pattern analysis and machine intelligence Vol PAMI-4, No 5, september 1982. [16]- Designing a handwriting reader. D. J. Burr IEEE transactions on pattern analysis and machine intelligence Vol PAMI-5, No 5, september 1983. [17]_ Experiments on neural net recognition of spoken and written text David J. Burr IEEE transactions on acoustics, speech and signals processing vol 36, No 7, july 1988 [18]_ Experimets with a connectionist text reader D. J. Burr Bell communications research Morristow, N. J. 07960 [19]_ An Algorithm for finding a common structure shared by a family of strings Anne M. Landraud, Jean-Francois Avril, Philippe Chretienne. IEEE transactions on pattern analysis and machine intelligence Vol 11, No 8, august 1989 [20]_ Word_level recognition of cursive script Raouf F. H. Farag IEEE transactions on computers Vol C-28, No 2, february 1979 [21]_ Pattern Classification by neural network: an experimental system for icon recognition Eric Gullichsen, Ernest Chang Marzo, 1987 [22]_ Recognition of handwritten chinese characters by modified hough transform techniques. Fang-Hsuan Cheng, Wen-Hsing Hsu, Mei-Ying Chen IEEE transactions on pattern analysis and machine intelligence Vol 11, No 4, April 1989 [23]_ Inheret bias and noise in the Hough transform Christopher M. Brown IEEE transactions on pattern analysis and machine intelligence Vol PAMI-5, No 5, september 1983. [24]_ From pixels to features J. C. 
Simon (ed.), North-Holland. Includes: Feature selection and language syntax in text recognition (J. J. Hull); Feature extraction for locating address blocks on mail pieces (S. N. Srihari).
[25] A model for variability effects in hand-printing, with implications for the design of on-line character recognition systems. J. R. Ward, T. Kuklinski. IEEE Transactions on Systems, Man, and Cybernetics, Vol. 18, No. 3, May/June 1988.
[26] Selection of a neural network system for visual inspection. Paul J. Stomski, Jr. and Adel S. Elmaghraby. Engineering Mathematics and Computer Science, University of Louisville, Kentucky 40292.
[27] Self-organizing model for pattern learning and its application to robot eyesight. Hisashi Suzuki, Suguru Arimoto. Proceedings of the Fourth Conference on A.I., San Diego, March 1988. The Computer Society of the IEEE.
----------------------------------------------------------
From: J. Whiteley

I only have five references I can offer; all are from the Proceedings of the 1989 International Joint Conference on Neural Networks held in Washington, D.C.

Yamada, K., Kami, H., Tsukumo, J., Temma, T. Handwritten Numeral Recognition by Multi-layered Neural Network with Improved Learning Algorithm. Volume II, pp. 259-266.

Morasso, P. Neural Models of Cursive Script Handwriting. Volume II, pp. 539-542.

Guyon, I., Poujaud, I., Personnaz, L., Dreyfus, G. Comparing Different Neural Network Architectures for Classifying Handwritten Digits. Volume II, pp. 127-132.

Weideman, W. E. A Comparison of a Nearest Neighbor Classifier and a Neural Network for Numeric Handprint Character Recognition. Volume I, pp. 117-120.

Barnard, E., Casasent, D. Image Processing for Image Understanding with Neural Nets. Volume I, pp. 111-115.

Hopefully you are being deluged with references.

--Rob Whiteley
Dept. of Chemical Engineering
Ohio State University
email: whiteley-j at osu-20.ircc.ohio-state.edu
-------
----------------------------------------------------------
From: avi at dgp.toronto.edu (Avi Naiman)

%L Baird 86 %A H. S.
Baird %T Feature Identification for Hybrid Structural/Statistical Pattern Classification %R Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition %D June 1986 %P 150-155
%L Casey and Jih 83 %A R. G. Casey %A C. R. Jih %T A Processor-Based OCR System %J IBM Journal of Research and Development %V 27 %N 4 %D July 1983 %P 386-399
%L Cash and Hatamian 87 %A G. L. Cash %A M. Hatamian %T Optical Character Recognition by the Method of Moments %J Computer Vision, Graphics and Image Processing %V 39 %N 3 %D September 1987 %P 291-310
%L Chanda et al. 84 %A B. Chanda %A B. B. Chaudhuri %A D. Dutta Majumder %T Some Algorithms for Image Enhancement Incorporating Human Visual Response %J Pattern Recognition %V 17 %D 1984 %P 423-428
%L Cox et al. 74 %A C. Cox %A B. Blesser %A M. Eden %T The Application of Type Font Analysis to Automatic Character Recognition %J Proceedings of the Second International Joint Conference on Pattern Recognition %D 1974 %P 226-232
%L Frutiger 67 %A Adrian Frutiger %T OCR-B: A Standardized Character for Optical Recognition %J Journal of Typographic Research %V 1 %N 2 %D April 1967 %P 137-146
%L Goclawska 88 %A Goclawska %T Method of Description of the Alphanumeric Printed Characters by Signatures for Automatic Text Readers %J AMSE Review %V 7 %N 2 %D 1988 %P 31-34
%L Gonzalez 87 %A Gonzalez %T Designing Balance into an OCR System %J Photonics Spectra %V 21 %N 9 %D September 1987 %P 113-116
%L GSA 84 %A General Services Administration %T Technology Assessment Report: Speech and Pattern Recognition; Optical Character Recognition; Digital Raster Scanning %I National Archives and Records Service %C Washington, District of Columbia %D October 1984
%L Hull et al. 84 %A J. J. Hull %A G. Krishnan %A P. W. Palumbo %A S. N.
Srihari %T Optical Character Recognition Techniques in Mail Sorting: A Review of Algorithms %R 214 %I SUNY Buffalo Computer Science %D June 1984
%L IBM 86 %A IBM %T Character Recognition Apparatus %J IBM Technical Disclosure Bulletin %V 28 %N 9 %D February 1986 %P 3990-3993
%L Kahan et al. 87 %A S. Kahan %A Theo Pavlidis %A H. S. Baird %T On the Recognition of Printed Characters of any Font and Size %J IEEE Transactions on Pattern Analysis and Machine Intelligence %V PAMI-9 %N 2 %D March 1987 %P 274-288
%L Lam and Baird 86 %A S. W. Lam %A H. S. Baird %T Performance Testing of Mixed-Font, Variable-Size Character Recognizers %R AT&T Bell Laboratories Computing Science Technical Report No. 126 %C Murray Hill, New Jersey %D November 1986
%L Lashas et al. 85 %A A. Lashas %A R. Shurna %A A. Verikas %A A. Dosimas %T Optical Character Recognition Based on Analog Preprocessing and Automatic Feature Extraction %J Computer Vision, Graphics and Image Processing %V 32 %N 2 %D November 1985 %P 191-207
%L Mantas 86 %A J. Mantas %T An Overview of Character Recognition Methodologies %J Pattern Recognition %V 19 %N 6 %D 1986 %P 425-430
%L Murphy 74 %A Janet Murphy %T OCR: Optical Character Recognition %C Hatfield %I Hertis %D 1974
%L Nagy 82 %A G. Nagy %T Optical Character Recognition \(em Theory and Practice %B Handbook of Statistics %E P. R. Krishnaiah and L. N. Kanal %V 2 %I North-Holland %C Amsterdam %D 1982 %P 621-649
%L Pavlidis 86 %A Theo Pavlidis %T A Vectorizer and Feature Extractor for Document Recognition %J Computer Vision, Graphics and Image Processing %V 35 %N 1 %D July 1986 %P 111-127
%L Piegorsch et al. 84 %A W. Piegorsch %A H. Stark %A M. Farahani %T Application of Image Correlation for Optical Character Recognition in Printed Circuit Board Inspection %R Proceedings of SPIE \(em The International Society for Optical Engineering: Applications of Digital Image Processing VII %V 504 %D 1984 %P 367-378
%L Rutovitz 68 %A D.
Rutovitz %T Data Structures for Operations on Digital Images %B Pictorial Pattern Recognition %E G. C. Cheng et al. %I Thompson Book Co. %C Washington, D. C. %D 1968 %P 105-133
%L Smith and Merali 85 %A J. W. T. Smith %A Z. Merali %T Optical Character Recognition: The Technology and its Application in Information Units and Libraries %R Library and Information Research Report 33 %I The British Library %D 1985
%L Suen 86 %A C. Y. Suen %T Character Recognition by Computer and Applications %B Handbook of Pattern Recognition and Image Processing %D 1986 %P 569-586
%L Wang 85 %A P. S. P. Wang %T A New Character Recognition Scheme with Lower Ambiguity and Higher Recognizability %J Pattern Recognition Letters %V 3 %D 1985 %P 431-436
%L White and Rohrer 83 %A J. M. White %A G. D. Rohrer %T Image Thresholding for Optical Character Recognition and Other Applications Requiring Character Image Extraction %J IBM Journal of Research and Development %V 27 %N 4 %D July 1983 %P 400-411
%L Winzer 75 %A Gerhard Winzer %T Character Recognition With a Coherent Optical Multichannel Correlator %J IEEE Transactions on Computers %V C-24 %N 4 %D April 1975 %P 419-423
----------------------------------------------------------
From: nad at computer-lab.cambridge.ac.uk

Hi,

I've only got two references for you - but they have 42 and 69 references, respectively (some of the refs will be the same, but you get at least 69 references!). They are:

"An overview of character recognition methodologies"
J. Mantas
Pattern Recognition, Volume 19, Number 6, 1986, pages 425-430

"Methodologies in pattern recognition and image analysis - a brief survey"
J. Mantas
Pattern Recognition, Volume 20, Number 1, 1987, pages 1-6

Neil Dodgson
============
----------------------------------------------------------
From: YAEGER.L at AppleLink.Apple.COM

I presume you know of "The 1989 Neuro-Computing Bibliography" edited by Casimir C. Klimasauskas, a Bradford Book, from MIT Press.
It lists 11 references for character recognition in its index.

- larryy at apple.com
----------------------------------------------------------
From: Tetsu Fujisaki

1. Suen, C. Y., Berthod, M., and Mori, S., "Automatic Recognition of Handprinted Characters - The State of the Art", Proc. IEEE, 68, 4 (April 1980), 469-487.

2. Tappert, C. C., Suen, C. Y., and Wakahara, T., "The State-of-the-Art in On-line Handwriting Recognition", IEEE Proc. 9th Int'l Conf. on Pattern Recognition, Rome, Italy, Nov. 1988. Also in IBM RC 14045.
----------------------------------------------------------
From: burrow at grad1.cis.upenn.edu (Tom Burrow)

Apparently, the state of the art in connectionism, as a lot of people will tell you, I'm sure, is Y. Le Cun et al.'s work, which can be found in NIPS 90. Other significant connectionist approaches are Fukushima's neocognitron and Denker et al.'s work, which I *believe* is in NIPS 88.

I am interested in handprinted character recognition. Typeset character recognition is basically solved, and I believe you shouldn't have any trouble locating texts on this (although I've only looked at the text edited by Kovalevsky (sp?), which I believe is just entitled "Reading Machines"). Bayesian classifiers, which you can read about in any statistical pattern recognition text (e.g., Duda and Hart, Gonzalez, etc.), are capable of performing recognition, since one can choose reliable features present in machine-printed text (e.g., moments, projections, etc.), and the segmentation problem is fairly trivial.

Perhaps the greatest problem in handprinted recognition is the segmentation problem. Unfortunately, most connectionist approaches fail miserably in this respect, relying on traditional methods for segmentation which become a bottleneck. I am inspecting connectionist methods which perform segmentation and recognition concurrently, and I recommend you do not inspect the problems independently.
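The classical recipe described above (compute simple features such as moments or projections of a binarized character, then apply a standard Bayesian classifier) can be sketched in a few lines. Everything below is an illustrative assumption: the particular features, the diagonal-Gaussian class model, and any data are invented for the sketch and are not taken from any of the systems cited.

```python
import math
from collections import defaultdict

def projection_features(img):
    """Toy features of the kind mentioned above for a binary character
    image (a list of rows of 0/1): ink density plus the normalized row
    and column centroids (first-order moments)."""
    h, w = len(img), len(img[0])
    ink = sum(sum(row) for row in img)
    if ink == 0:
        return [0.0, 0.5, 0.5]  # blank image: centered by convention
    on = [(r, c) for r, row in enumerate(img) for c, v in enumerate(row) if v]
    cy = sum(r for r, _ in on) / ink
    cx = sum(c for _, c in on) / ink
    return [ink / (h * w), cy / h, cx / w]

class GaussianBayes:
    """Classical Bayes classifier: one diagonal Gaussian per class,
    maximum a posteriori decision."""
    def fit(self, X, y):
        groups = defaultdict(list)
        for x, label in zip(X, y):
            groups[label].append(x)
        n = len(y)
        self.stats = {}
        for label, rows in groups.items():
            d = len(rows[0])
            mu = [sum(r[i] for r in rows) / len(rows) for i in range(d)]
            # small floor on the variance keeps the log-likelihood finite
            var = [sum((r[i] - mu[i]) ** 2 for r in rows) / len(rows) + 1e-6
                   for i in range(d)]
            self.stats[label] = (mu, var, math.log(len(rows) / n))
        return self

    def predict(self, x):
        def log_posterior(label):
            mu, var, log_prior = self.stats[label]
            return log_prior - 0.5 * sum(
                math.log(2 * math.pi * v) + (xi - m) ** 2 / v
                for xi, m, v in zip(x, mu, var))
        return max(self.stats, key=log_posterior)
```

On real character data one would of course use many more features and a richer class model; the point is only the shape of the computation, and why it works well once reliable features and trivial segmentation can be assumed.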
I am by no means an expert in any area which I've commented on, but I hope this helps. Also, again, please send me your compiled responses. Thank you and good luck.

Tom Burrow
----------------------------------------------------------

From skrzypek at CS.UCLA.EDU Sun Jul 22 18:22:43 1990
From: skrzypek at CS.UCLA.EDU (Dr. Josef Skrzypek)
Date: Sun, 22 Jul 90 15:22:43 PDT
Subject: IJPRAI CALL FOR PAPERS
Message-ID: <9007222222.AA05465@retina.cs.ucla.edu>

IJPRAI CALL FOR PAPERS

We are organizing a special issue of IJPRAI (Intl. Journal of Pattern Recognition and Artificial Intelligence) dedicated to the subject of neural networks in vision and pattern recognition. Papers will be refereed. The plan calls for the issue to be published in the fall of 1991. I would like to invite your participation.

DEADLINE FOR SUBMISSION: 10th of December, 1990

VOLUME TITLE: Neural Networks in Vision and Pattern Recognition

VOLUME GUEST EDITORS: Prof. Josef Skrzypek and Prof. Walter Karplus
Department of Computer Science, 3532 BH
UCLA
Los Angeles, CA 90024-1596
Email: skrzypek at cs.ucla.edu or karplus at cs.ucla.edu
Tel: (213) 825 2381
Fax: (213) UCLA CSD

DESCRIPTION

The capabilities of neural architectures (supervised and unsupervised learning, feature detection and analysis through approximate pattern matching, categorization and self-organization, adaptation, soft constraints, and signal-based processing) suggest new approaches to solving problems in vision, image processing, and pattern recognition as applied to visual stimuli. The purpose of this special issue is to encourage further work and discussion in this area. The volume will include both invited and submitted peer-reviewed articles. We are seeking submissions from researchers in relevant fields, including natural and artificial vision, scientific computing, artificial intelligence, psychology, image processing, and pattern recognition.
We encourage submission of: 1) detailed presentations of models or supporting mechanisms, 2) formal theoretical analyses, 3) empirical and methodological studies, and 4) critical reviews of the applicability of neural networks to various subfields of vision, image processing, and pattern recognition.

Submitted papers may be enthusiastic or critical of the applicability of neural networks to the processing of visual information. The IJPRAI journal would like to encourage submissions both from researchers engaged in analysis of biological systems, such as modeling psychological/neurophysiological data using neural networks, and from members of the engineering community who are synthesizing neural network models. The number of papers that can be included in this special issue will be limited. Therefore, some qualified papers may be encouraged for submission to the regular issues of IJPRAI.

SUBMISSION PROCEDURE

Submissions should be sent to Josef Skrzypek by 12-10-1990. The suggested length is 20-22 double-spaced pages including figures, references, abstract and so on. Format details, etc. will be supplied on request. Authors are strongly encouraged to discuss ideas for possible submissions with the editors. The journal is published by World Scientific and was established in 1986.

Thank you for your consideration.

From wilson at Think.COM Mon Jul 23 16:09:00 1990
From: wilson at Think.COM (Stewart Wilson)
Date: Mon, 23 Jul 90 16:09:00 EDT
Subject: SAB90 Announcement
Message-ID: <9007232009.AA07579@nugodot.think.com>

Dear Connectionist list Manager,

Would you kindly post this Announcement for the SAB90 Conference on your bulletin board? Thank you.
Stewart Wilson

=======================================================================

ANNOUNCEMENT

Simulation of Adaptive Behavior: From Animals to Animats
An International Conference
To be held in Paris, September 24-28, 1990

Sponsored by:
Ecole Normale Superieure
US Air Force Office of Scientific Research
Electricite de France
IBM France
Computers, Communications and Visions (C2V)
Offilib
and a Corporate Donor

1. Conference dates and site

The conference will take place Monday through Friday, September 24-28, 1990 at the Ministere de la Recherche et de la Technologie, 1 rue Descartes, Paris, France.

2. Conference Committee

Conference chairs:
Dr. Jean-Arcady Meyer, Ecole Normale Superieure, France
Dr. Stewart W. Wilson, The Rowland Institute for Science, USA

Organizing Committee:
Groupe de BioInformatique, Ecole Normale Superieure, France

Program Committee:
Lashon Booker, U.S. Naval Research Lab, USA
Rodney Brooks, MIT Artificial Intelligence Lab, USA
Patrick Colgan, Queen's University at Kingston, Canada
Patrick Greussay, Universite Paris VIII, France
David McFarland, Oxford Balliol College, UK
Luc Steels, VUB AI Lab, Belgium
Richard Sutton, GTE Laboratories, USA
Frederick Toates, The Open University, UK
David Waltz, Thinking Machines Corp. and Brandeis University, USA

3. Official language: English

4. Conference Objective

The conference objective is to bring together researchers in ethology, ecology, cybernetics, artificial intelligence, robotics, and related fields so as to further our understanding of the behaviors and underlying mechanisms that allow animals and, potentially, robots to adapt and survive in uncertain environments. Said somewhat differently, the objective is to investigate how the robot can aid in comprehending the animal and, inversely, to seek inspiration from the animal in the construction of autonomous robots.
The conference will provide opportunities for dialogue between specialists with different scientific perspectives--ethology and artificial intelligence notably--a dialogue that will be enhanced by the common technical language imposed by simulation models. As the first of its kind in the world, the conference will make it possible to establish not only the state of the art of "adaptive autonomous systems, natural and artificial", but a list of the most promising future research topics.

The conference is expected to promote:

1. Identification of the organizational principles, functional laws, and minimal properties that make it possible for a real or artificial system to persist in an uncertain environment.
2. Better understanding of how and under what conditions such systems can themselves discover these principles through conditioning, learning, induction, or processes of self-organization.
3. Specification of the applicability of the theoretical knowledge thus acquired to the building of autonomous robots.
4. Improved theoretical and practical knowledge concerning adaptive systems in general, both natural and artificial.

Finally, special emphasis will be given to the following topics, as viewed from the perspective of adaptive behavior:

Individual and collective behaviors
Autonomous robots
Action selection and behavioral sequences
Hierarchical and parallel organizations
Self-organization of behavioral modules
Conditioning, learning and induction
Neural correlates of behavior
Problem solving and planning
Perception and motor control
Goal directed behavior
Motivation and emotion
Neural networks and classifier systems
Behavioral ontogeny and evolution
Cognitive maps and internal world models
Emergent structures and behaviors

5. Conference Proceedings

The proceedings will be published about two months after the end of the conference by The MIT Press/Bradford Books.

6.
Conference Organization

Among the papers received by the organizers and reviewed by the Program Committee members, approximately 50 have been accepted for publication in the proceedings. They will be presented as talks or posters. (To receive a preliminary program by e-mail, please contact one of the conference chairmen.)

Since the conference intersects animal and "animat" research, lively interaction can be expected, including controversy. At least one panel discussion will be organized around the theme of what each viewpoint can contribute to the other.

Because the conference is emphasizing simulation models, it is anticipated that many participants will have computer programs demonstrating their work. To make such demonstrations possible, the Organizers will provide workstations and video equipment. An evening session during the week will be devoted to demonstrations.

Morning and afternoon coffee breaks will be provided. To further promote interaction among a diverse group of participants, the conference will provide lunch each day.

7. Additional Information

Additional information can be obtained from the chairmen:

Dr. Jean-Arcady Meyer
Groupe de Bioinformatique
URA686, Ecole Normale Superieure
46 rue d'Ulm
75230 Paris Cedex 05, France
e-mail: meyer at frulm63.bitnet or meyer at hermes.ens.fr
Tel: (1) 43.29.12.25
FAX: (1) 43.29.81.72

Dr. Stewart W. Wilson
The Rowland Institute for Science
100 Cambridge Parkway
Cambridge, MA 02142, USA
e-mail: wilson at think.com
Tel: (617) 497-4650
FAX: (617) 497-4627

8. Travel and Lodging

Participants will be responsible for their own travel and lodging arrangements. However, you may contact any of three hotel reservation services which have agreed to offer advantageous locations and rates to participants in SAB90. We advise making early reservations and mentioning "SAB90" in your request.
These services are:

- Hotel Pullman Saint-Jacques (****): rooms at 800-900 FF, fax (33 1 45 88 43 93)
- Tradotel (*** and **): rooms at 440-520 FF, fax (33 1 47 27 05 87)
- AJF: student rooms at 80-90 FF, fax (33 1 40 27 08 71)

9. Registration fees

Attendance at SAB90 will be open to any person paying the registration fee, which is set at $220 (or 1200 FF) for non-students and $110 (or 600 FF) for students. The registration fee covers five lunches, coffee breaks, and a copy of the Proceedings.

******************************************************************************
*WARNING: The audience size is strictly limited to 150 persons. Registrations*
*will be closed beyond this number.                                          *
******************************************************************************

REGISTRATION FORM
------------------------------------------------------------------------------
Last name:
First name:
Profession/Title:
Organization:
Address:
State/Zip Code/Country:
Telephone:
Fax:
E-mail:
------------------------------------------------------------------------------

This form should be sent to:

Dr. Jean-Arcady MEYER
Groupe de BioInformatique
URA686, Ecole Normale Superieure
46 rue d'Ulm
75230 PARIS Cedex 05
FRANCE

with a check for the registration fee to the order of: J.A. MEYER 'SAB90'. The check can be in US Dollars or French Francs. To receive the student rate, please attach evidence of student status from your University or Scientific Advisor.
==============================================================================

From booker at AIC.NRL.Navy.Mil Tue Jul 24 13:10:16 1990
From: booker at AIC.NRL.Navy.Mil (booker@AIC.NRL.Navy.Mil)
Date: Tue, 24 Jul 90 13:10:16 EDT
Subject: Call for Papers - ICGA-91
Message-ID: <9007241710.AA00778@sun7.aic.nrl.navy.mil>

Call for Papers

ICGA-91
The Fourth International Conference on Genetic Algorithms

The Fourth International Conference on Genetic Algorithms (ICGA-91) will be held on July 13-16, 1991 at the University of California - San Diego in La Jolla, CA. This meeting brings together an international community from academia, government, and industry interested in algorithms suggested by the evolutionary process of natural selection. Topics of particular interest include: genetic algorithms and classifier systems, machine learning and optimization using these systems, and their relations to other learning paradigms (e.g., connectionist networks). Papers discussing how genetic algorithms and classifier systems are related to biological modeling issues (e.g., evolution of nervous systems, computational ethology, artificial life) are encouraged.

Papers describing significant, unpublished research in this area are solicited. Authors must submit four (4) complete copies of their paper, postmarked by February 1, 1991, to the Program Co-Chair:

Dr. Richard K. Belew
Computer Science & Engr. Dept. (C-014)
Univ. California - San Diego
La Jolla, CA 92093

Electronic submissions (LaTeX source only) can be mailed to rik at cs.ucsd.edu. Papers should be no longer than 10 pages, single-spaced, and printed using 12 pt. type. All papers will be subject to peer review. Evaluation criteria include the significance of results, originality, and the clarity and quality of the presentation.
Important Dates:

February 1, 1991: Submissions must be postmarked
March 22, 1991: Notification to authors mailed
May 6, 1991: Revised, final camera-ready paper due
July 13-16, 1991: Conference dates

ICGA-91 Conference Committee:

Conference Co-Chairs: Kenneth A. De Jong, George Mason University; J. David Schaffer, Philips Labs
Vice Chair and Publicity: David E. Goldberg, Univ. of Illinois at Urbana-Champaign
Program Co-Chairs: Richard K. Belew, Univ. of California at San Diego; Lashon B. Booker, MITRE
Financial Chair: Gil Syswerda, BBN
Local Arrangements: Richard K. Belew, Univ. of California at San Diego

From PVR%AUTOCTRL.RUG.AC.BE at VMA.CC.CMU.EDU Wed Jul 25 18:19:00 1990
From: PVR%AUTOCTRL.RUG.AC.BE at VMA.CC.CMU.EDU (PVR%AUTOCTRL.RUG.AC.BE@VMA.CC.CMU.EDU)
Date: Wed, 25 Jul 90 18:19 N
Subject: Neocognitron information wanted
References: > The Transputer Lab, Grotesteenweg Noord 2, +32 91 22 57 55
Message-ID:

Dear neural netters,

I have a research student who is implementing Prof. Fukushima's neocognitron. The network will be used for object recognition and will finally be implemented on a multiprocessor network. The student is facing a large number of problems, for which we are not always able to find a solution. We would therefore like to get in touch with other researchers who have tried to implement the neocognitron or who have thoroughly studied this particular type of network.

Could you please send a message to: pvr at autoctrl.rug.ac.be or to pvr at bgerug51.bitnet

If you have technical reports that could make our work easier, we would certainly appreciate a copy. In return, we will send you some articles about the implementation and application.
Many thanks in advance,
Patrick

*****************************************************************************
* Patrick Van Renterghem,           BITNET: pvr at bgerug51.bitnet
* R&D Assistant,                    EDU: pvr%bgerug51.bitnet at cunyvm.cuny.edu
* State University of Ghent,        UUCP: mcsun!bgerug51.bitnet!pvr
* Belgium                           JANET: PVR%earn.bgerug51 at earn-relay
*
* Automatic Control Lab,            | Tel: +32 91 22 57 55 ext. 313
* State University of Ghent,        | Fax: +32 91 22 85 91
* Grotesteenweg Noord 2,            |
* B-9710 Ghent-Zwijnaarde, Belgium  |
*******You***Don't***Need***A***PhD***To***Write***Parallel***Programs*******

From jm2z+ at ANDREW.CMU.EDU Wed Jul 25 15:16:16 1990
From: jm2z+ at ANDREW.CMU.EDU (Javier Movellan)
Date: Wed, 25 Jul 90 15:16:16 -0400 (EDT)
Subject: Integrating Information
Message-ID:

I am interested in the problem of integrating information from different processing levels. For instance, if we have a good system to map sounds into phonemes, we may want to use the statistical structure at the phoneme level to improve recognition or to clean up the inputs. Some of the domains in which this issue arises are: speech recognition, character recognition, spell-checkers... If you have references or comments regarding this issue, please let me know. Classical statistical approaches, neural network approaches, theoretical perspectives, psychological models, and dirty tricks used in artificial systems are welcome. I will compile the references/comments and make them public to the connectionist list.

Javier

From fozzard at boulder.Colorado.EDU Thu Jul 26 12:34:46 1990
From: fozzard at boulder.Colorado.EDU (Richard Fozzard)
Date: Thu, 26 Jul 90 10:34:46 -0600
Subject: Summary (long): pattern recognition comparisons
Message-ID: <9007261634.AA04726@alumni.colorado.edu>

Here are the responses I got for my question regarding comparisons of connectionist methods with traditional pattern recognition techniques.
I believe Mike Mozer (among others) puts it best: "Neural net algorithms just let you do a lot of the same things that traditional statistical algorithms allow you to do, but they are more accessible to many people (and perhaps easier to use)."

Read on for the detailed responses. (Note: this does not include anything posted to comp.ai.neural-nets, to save on bandwidth.)

rich
========================================================================
Richard Fozzard                          "Serendipity empowers"
Univ of Colorado/CIRES/NOAA R/E/FS
325 Broadway, Boulder, CO 80303
fozzard at boulder.colorado.edu  (303)497-6011 or 444-3168

From honavar at cs.wisc.edu Tue Jul 17 14:53:55 1990
From: honavar at cs.wisc.edu (Vasant Honavar)
Date: Tue, 17 Jul 90 13:53:55 -0500
Subject: pattern recognition with nn
Message-ID: <9007171853.AA13318@goat.cs.wisc.edu>

Honavar, V. & Uhr, L. (1989). Generation, Local Receptive Fields, and Global Convergence Improve Perceptual Learning in Connectionist Networks. In: Proceedings of the 1989 International Joint Conference on Artificial Intelligence, San Mateo, CA: Morgan Kaufmann.

Honavar, V. & Uhr, L. (1989). Brain-Structured Connectionist Networks that Perceive and Learn. Connection Science: Journal of Neural Computing, Artificial Intelligence and Cognitive Research, 1, 139-159.

Le Cun, Y. et al. (1990). Handwritten Digit Recognition With a Backpropagation Network. In: Neural Information Processing Systems 2, D. S. Touretzky (ed.), San Mateo, CA: Morgan Kaufmann, 1990.

Rogers, D. (1990). Predicting Weather Using a Genetic Memory: A Combination of Kanerva's Sparse Distributed Memory With Holland's Genetic Algorithms. In: Neural Information Processing Systems 2, D. S. Touretzky (ed.), San Mateo, CA: Morgan Kaufmann, 1990.
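Several of the responses collected here concern nets trained with backpropagation, the Le Cun et al. digit recognizer among them. As a minimal sketch of what training such a net involves, here is a one-hidden-layer network trained by plain online gradient descent on squared error. The layer sizes, learning rate, and data below are illustrative assumptions only; real systems such as the cited digit recognizer are far more elaborate.

```python
import math
import random

def train_mlp(data, n_hidden=4, epochs=2000, lr=1.0, seed=0):
    """Train a one-hidden-layer sigmoid network on (input_vector, target)
    pairs with target in {0, 1}.  Returns a predict(x) function giving
    the output unit's activation."""
    rng = random.Random(seed)
    n_in = len(data[0][0])
    # small random initial weights; the last entry of each row is a bias
    w1 = [[rng.uniform(-0.5, 0.5) for _ in range(n_in + 1)]
          for _ in range(n_hidden)]
    w2 = [rng.uniform(-0.5, 0.5) for _ in range(n_hidden + 1)]

    def sig(a):
        return 1.0 / (1.0 + math.exp(-a))

    def forward(x):
        h = [sig(sum(w * xi for w, xi in zip(row, x)) + row[-1]) for row in w1]
        o = sig(sum(w * hi for w, hi in zip(w2, h)) + w2[-1])
        return h, o

    for _ in range(epochs):
        for x, t in data:
            h, o = forward(x)
            # error signals: derivative of 0.5*(o-t)^2 through the sigmoids
            d_out = (o - t) * o * (1.0 - o)
            d_hid = [d_out * w2[j] * h[j] * (1.0 - h[j])
                     for j in range(n_hidden)]
            # gradient-descent weight updates
            for j in range(n_hidden):
                w2[j] -= lr * d_out * h[j]
                for i in range(n_in):
                    w1[j][i] -= lr * d_hid[j] * x[i]
                w1[j][-1] -= lr * d_hid[j]   # hidden bias
            w2[-1] -= lr * d_out             # output bias

    return lambda x: forward(x)[1]
```

The whole "hand-crafting vs. learned features" discussion in the responses below turns on the fact that the same few lines, given enough units and data, learn their own internal features rather than requiring them to be designed.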
From perry at seismo.CSS.GOV Tue Jul 17 16:39:12 1990
From: perry at seismo.CSS.GOV (John Perry)
Date: Tue, 17 Jul 90 16:39:12 EDT
Subject: Re
Message-ID: <9007172039.AA11042@beno.CSS.GOV>

Richard,

It depends on which neural network you are using, and the underlying complexity in separating pattern classes. We at ENSCO have developed a neural network architecture that shows far superior performance over traditional algorithms. Mail me if you are interested.

John L. Perry
ENSCO, Inc.
5400 Port Royal Road
Springfield, Virginia 22151
703-321-9000
email: perry at dewey.css.gov, perry at beno.css.gov

From shaw_d at clipr.colorado.edu Tue Jul 17 16:36:00 1990
From: shaw_d at clipr.colorado.edu (Dave Shaw)
Date: 17 Jul 90 14:36:00 MDT
Subject: Networks for pattern recognition problems?
Message-ID: <9007172044.AA24992@boulder.Colorado.EDU>

Rich- our experience with the solar data is still inconclusive, but would seem to indicate that neural nets exhibit no distinct advantage over more traditional techniques, in terms of 'best' performance figures. The reason appears to be that although the task is understood to be non-linear (which should presumably lead to better performance by non-linear systems such as networks), there is not enough data at the critical points to define the boundaries of the decision surface. This would seem to be a difficulty that all recognition problems must deal with.

Dave

From kortge at galadriel.Stanford.EDU Tue Jul 17 17:02:44 1990
From: kortge at galadriel.Stanford.EDU (Chris Kortge)
Date: Tue, 17 Jul 90 14:02:44 PDT
Subject: pattern recognition
Message-ID: <9007172102.AA26014@boulder.Colorado.EDU>

You may know of this already, but Gorman & Sejnowski have a paper on sonar return classification in Neural Networks, Vol. 1, No. 1, pg. 75, where a net did better than nearest neighbor, and comparable to a person.
I would be very interested in obtaining your list of "better-than-conventional-methods" papers, if possible (maybe the whole connectionists list would, for that matter). Thanks--

Chris Kortge
kortge at psych.stanford.edu

From galem at mcc.com Tue Jul 17 18:06:14 1990
From: galem at mcc.com (Gale Martin)
Date: Tue, 17 Jul 90 17:06:14 CDT
Subject: Networks for pattern recognition problems?
Message-ID: <9007172206.AA01492@sunkist.aca.mcc.com>

I do handwriting recognition with backprop nets and have anecdotal evidence that the nets do better than the systems developed by some of the research groups we work with. The problem with such comparisons is that the success of the recognition systems depends on the expertise of the developers. There will never be a definitive study. However, I've come to believe that such accuracy comparisons miss the point. Traditional recognition technologies usually involve a lot of hand-crafting (e.g., selecting features) that you can avoid by using backprop nets. For example, I can feed a net with "close to" raw inputs and the net learns to segment it into characters, extract features, and classify the characters. You may be able to do this with traditional techniques, but it will take a lot longer. Extending the work to different character sets becomes prohibitive, whereas it is a simple task with a net.

Gale Martin
MCC
Austin, TX

From ted at aps1.spa.umn.edu Tue Jul 17 18:12:33 1990
From: ted at aps1.spa.umn.edu (Ted Stockwell)
Date: Tue, 17 Jul 90 17:12:33 CDT
Subject: Networks for pattern recognition problems?
In-Reply-To: ; from "Richard Fozzard" at Jul 17, 90 6:16 pm
Message-ID: <9007172212.AA05795@aps1.spa.umn.edu>

> Do you know of any references to work done using connectionist (neural)
> networks for pattern recognition problems? I particularly am interested
> in problems where the network was shown to outperform traditional algorithms.
>
> I am working on a presentation to NOAA (National Oceanic and Atmospheric
> Admin.)
> management that partially involves pattern recognition
> and am trying to argue against the statement:
> "...results thus far [w/ networks] have not been notably more
> impressive than with more traditional pattern recognition techniques".

This may not be quite what you're looking for, but here are a few suggestions:

1) Pose the question to salespeople who sell neural network software. They have probably faced the question before.

2) One advantage is that the network characterizes the classes for you. Instead of spending days/weeks/months developing statistical models, you can get a reasonable classifier by just handing the training data to the network and letting it run overnight. It does the work for you, so development costs should be much lower.

3) Networks seem to be more often compared to humans than to other software techniques. I don't have the references with me, but I recall that someone (Sejnowski?) developed a classifier for sonar signals that performed slightly better than human experts (which *is* the "traditional pattern recognition technique").

--
Ted Stockwell
U of MN, Dept. of Astronomy
ted at aps1.spa.umn.edu
Automated Plate Scanner Project

From mariah!yak at tucson.sie.arizona.edu Tue Jul 17 17:58:40 1990
From: mariah!yak at tucson.sie.arizona.edu (mariah!yak@tucson.sie.arizona.edu)
Date: Tue, 17 Jul 90 14:58:40 -0700
Subject: No subject
Message-ID: <9007172158.AA29760@tucson.sie.arizona.edu>

Dear Dr. Fozzard,

I read your email message on a call for pattern recognition problems for which NNs are known to outperform traditional methods. I've worked in statistics and pattern recognition for some while and have a fair number of publications. I've been reading the neural net literature, and I'd be quite surprised if you get convincing replies in the affirmative to your quest.
My opinion is that stuff even from the '60's and '70's, such as the books by Duda and Hart, Gonzalez and Fu, implemented on standard computers, is still much more effective than methodology I've come across using NN algorithms, which are mathematically much more restrictive. In brief, if you hear of good solid instances favorable to NN's, please let me know. Sincerely, Sid Yakowitz Professor From John.Hampshire at SPEECH2.CS.CMU.EDU Tue Jul 17 21:18:33 1990 From: John.Hampshire at SPEECH2.CS.CMU.EDU (John.Hampshire@SPEECH2.CS.CMU.EDU) Date: Tue, 17 Jul 90 21:18:33 EDT Subject: Networks for pattern recognition problems? Message-ID: <9007180127.AA09023@boulder.Colorado.EDU> Rich, Go talk with Smolensky out there in Boulder. He should be able to give you a bunch of refs. See also works by Lippmann over the past two years. Barak Pearlmutter and I are working on a paper that will appear in the proceedings of the 1990 Connectionist Models Summer School which shows that certain classes of MLP classifiers yield (optimal) Bayesian classification performance on stochastic patterns. This beats traditional linear classifiers... There are a bunch of results in many fields showing that non-linear classifiers outperform more traditional ones. The guys at NOAA aren't up on the literature. One last reference --- check the last few years of NIPS and (to a lesser extent) IJCNN proceedings. NIPS = Advances in Neural Information Processing Systems, Dave Touretzky ed., Morgan Kaufmann publishers; IJCNN = Proceedings of the International Joint Conference on Neural Networks, IEEE Press John From mozer at neuron Tue Jul 17 22:51:28 1990 From: mozer at neuron (Michael C. Mozer) Date: Tue, 17 Jul 90 20:51:28 MDT Subject: Help for a NOAA connectionist "primer" Message-ID: <9007180251.AA04754@neuron.colorado.edu> Your boss is basically correct. 
Neural net algorithms just let you do a lot of the same things that traditional statistical algorithms allow you to do, but they are more accessible to many people (and perhaps easier to use). There is a growing set of examples where neural nets beat out conventional algorithms, but nothing terribly impressive. And it's difficult to tell in these examples whether the conventional methods were applied appropriately (or the NN algorithm, in cases where NNs lose to conventional methods, for that matter). Mike From burrow at grad1.cis.upenn.edu Wed Jul 18 00:40:21 1990 From: burrow at grad1.cis.upenn.edu (Tom Burrow) Date: Wed, 18 Jul 90 00:40:21 EDT Subject: procedural vs connectionist p.r. Message-ID: <9007180440.AA16913@grad2.cis.upenn.edu> Sorry, this isn't much of a contribution -- mostly a request for your replies. If you are not going to repost them via the connectionist mailing list, could you mail them to me? Now, for my mini-contribution: Yann LeCun et al.'s work, as seen in NIPS 90, on segmented character recognition is fairly impressive, and they claim that their results are state of the art. Tom Burrow From jsaxon at cs.tamu.edu Wed Jul 18 10:32:08 1990 From: jsaxon at cs.tamu.edu (James B Saxon) Date: Wed, 18 Jul 90 09:32:08 CDT Subject: Networks for pattern recognition problems? In-Reply-To: <23586@boulder.Colorado.EDU> Message-ID: <9007181432.AA08709@cs.tamu.edu> In article <23586 at boulder.Colorado.EDU> you write: >Do you know of any references to work done using connectionist (neural) >networks for pattern recognition problems? I particularly am interested >in problems where the network was shown to outperform traditional algorithms. > >I am working on a presentation to NOAA (National Oceanic and Atmospheric >Admin.) management that partially involves pattern recognition >and am trying to argue against the statement: >"...results thus far [w/ networks] have not been notably more >impressive than with more traditional pattern recognition techniques". 
> >I have always felt that pattern recognition is one of the strengths of >connectionist network approaches over other techniques and would like >some references to back this up. > >thanks much, rich >======================================================================== >Richard Fozzard "Serendipity empowers" >Univ of Colorado/CIRES/NOAA R/E/FS 325 Broadway, Boulder, CO 80303 >fozzard at boulder.colorado.edu (303)497-6011 or 444-3168 Well, aside from the question of ultimate generality, to which the answer is "OF COURSE there are references to neural network pattern recognition systems. The world is completely full of them!" Anyway, maybe you'd better do some more research. Here are a couple off the top of my head: Kohonen is really hot in the area; he's been doing it for at least ten years. Everybody refers to some aspect of his work. I also suggest picking up a copy of the IJCNN '90 San Diego proceedings, all 18 lbs of it. (International Joint Conference on Neural Networks) But for a preview: I happened to sit in on just the sort of presentation you would have liked to hear. The title was "Meteorological Classification of Satellite Imagery Using Neural Network Data Fusion" Oh Boy!!! Big title! Oh, it's by Ira G. Smotroff, Timothy P. Howells, and Steven Lehar. From the MITRE Corporation (MITRE-Bedford Neural Network Research Group) Bedford, MA 01730. Well, the presentation wasn't too hot; he sort of hand-waved over the "classification" of his meteorological data, though he didn't describe what we were looking at. The idea was that the system was supposed to take heterogeneous sensor data (I hope you know these: GOES--IR and visual, PROFS database--wind profilers, barometers, solarometers, thermometers, etc) and combine them. Cool, huh? If they had actually done this, I imagine the results would have been pretty good. It seems, though, that they merely used an IR image and a visual image and combined only these two. 
Their pattern recognition involved typical modeling of the retina, which sort of acts as a band-pass filter with orientations; thus it detects edges. Anyway, their claim was the following: "The experiments described showed reasonably good classification performance. There was no attempt to determine optimal performance by adding hidden units [Hey, if it did it without hidden units, it's doing rather well.], altering learning parameters, etc., because we are currently implementing self-scaling learning algorithms which will determine many of those issues automatically. [Which reminds me, Lehar works with Grossberg at Boston University. He's big on pattern recognition too, both analog and digital. Check out Adaptive Resonance Theory, or ART, ART2, ART3.]..." Anyway, it looks like a first shot, and they went minimal. There's lots they could add to make it work rather well. In terms of performance, I'd just like to make one of those comments... From what I saw at the conference, neural networks will outperform traditional techniques in this sort of area. The conference was brimming over with successful implementations. Anyway... Enough rambling, from a guy who should be writing his thesis right now... Good luck on your presentation! Oh, I think it automatically puts my signature on..... Did it? -- ---- \ / ---- /--------------------------------------------\ James Bennett Saxon | O| | O| | "I ought to join the club and beat you | Visualization Laboratory | | | | | over the head with it." 
-- Groucho Marx | Texas A&M University ---- ---- <---------------------------------------------/ jsaxon at cssun.tamu.edu From arseno at phy.ulaval.ca Wed Jul 18 10:38:16 1990 From: arseno at phy.ulaval.ca (Henri Arsenault) Date: Wed, 18 Jul 90 10:38:16 EDT Subject: papers on pattern recognition Message-ID: <9007181438.AA19593@einstein.phy.ulaval.ca> In response to your request about papers on neural nets in pattern recognition, there is a good review in IEEE Transactions on Neural Networks, vol. 1, p. 28: "Survey of Neural Network Technology for Automatic Target Recognition", by M. W. Roth. The paper has many references. arseno at phy.ulaval.ca From d38987%proteus.pnl.gov at pnlg.pnl.gov Wed Jul 18 12:07:50 1990 From: d38987%proteus.pnl.gov at pnlg.pnl.gov (d38987%proteus.pnl.gov@pnlg.pnl.gov) Date: Wed, 18 Jul 90 09:07:50 PDT Subject: NN and Pattern Recognition Message-ID: <9007181607.AA05172@proteus.pnl.gov> Richard, We have done some work in this area, as have many other people. I suggest you call Roger Barga at (509)375-2802 and talk to him, or send him mail at: d3c409%calypso at pnlg.pnl.gov Good luck, Ron Melton Pacific Northwest Laboratory Richland, WA 99352 From mozer at neuron Wed Jul 18 13:16:22 1990 From: mozer at neuron (Michael C. Mozer) Date: Wed, 18 Jul 90 11:16:22 MDT Subject: Help for a NOAA connectionist "primer" Message-ID: <9007181716.AA06051@neuron.colorado.edu> I think NNs are more accessible because the mathematics is so straightforward, and the methods work pretty well even if you don't know what you're doing (as opposed to many statistical techniques that require some expertise to use correctly). For me, the win of NNs is as a paradigm for modeling human cognition. Whether the NN learning algorithms existed previously in other fields is irrelevant. What is truly novel is that we're bringing these numerical and statistical techniques to the study of human cognition. 
Also, connectionists (at least the cog sci oriented ones) are far more concerned with representation -- a critical factor, one that has been much studied by psychologists but not by statisticians. Mike From cole at cse.ogi.edu Wed Jul 18 13:33:14 1990 From: cole at cse.ogi.edu (Ron Cole) Date: Wed, 18 Jul 90 10:33:14 -0700 Subject: Networks for pattern recognition problems? Message-ID: <9007181733.AA26297@cse.ogi.edu> Call Les Atlas at U Washington. He has an article coming out in the August IEEE Proceedings comparing NNs and CART on 3 real-world problems. Ron Les Atlas: 206 685 1315 From bimal at jupiter.risc.com Wed Jul 18 13:59:24 1990 From: bimal at jupiter.risc.com (Bimal Mathur) Date: Wed, 18 Jul 90 10:59:24 PDT Subject: pattern recognition Message-ID: <9007181759.AA22486@jupiter.risc.com> The net result of experiments done by us in pattern classification of two-dimensional data (i.e., image to features, then classifying the features using a NN) is that there is no significant improvement in the performance of the overall system. -Bimal Mathur, Rockwell Int From PH706008 at brownvm.brown.edu Wed Jul 18 14:10:28 1990 From: PH706008 at brownvm.brown.edu (Chip Bachmann) Date: Wed, 18 Jul 90 14:10:28 EDT Subject: Networks for pattern recognition problems? In-Reply-To: Your message of Tue, 17 Jul 90 12:19:01 -0600 Message-ID: <9007181907.AA15014@boulder.Colorado.EDU> An example of research directly comparing neural networks with traditional statistical methods can be found in: R. A. Cole, Y. K. Muthusamy, and L. Atlas, "Speaker-Independent Vowel Recognition: Comparison of Backpropagation and Trained Classification Trees", in Proceedings of the Twenty-Third Annual Hawaii International Conference on System Sciences, Kailua-Kona, Hawaii, January 2-5, 1990, Vol. 1, pp. 132-141. The neural network achieves better results than the CART algorithm, in this case for a twelve-class vowel recognition task. 
The data was extracted from the TIMIT database, and a variety of different encoding schemes was employed. Tangentially, I thought that I would enquire if you know of any postdoctoral or other research positions available at NOAA, CIRES, or U. of Colorado. I completed my Ph.D. in physics at Brown University under Leon Cooper (Nobel laureate, 1972) this past May; my undergraduate degree was from Princeton University and was also in physics. My dissertation research was carried out as part of an interdisciplinary team in the Center for Neural Science here at Brown. The primary focus of my dissertation was the development of an alternative backward propagation algorithm which incorporates a gain modification procedure. I also investigated the feature extraction and generalization of backward propagation for a speech database of stop-consonants developed here in our laboratory at Brown. In addition, I discussed hybrid network architectures and, in particular, in a high-dimensional, multi-class vowel recognition problem (namely with the data which Cole et al. used in the paper which I mentioned above), demonstrated an approach using smaller sub-networks to partition the data. Such approaches offer a means of dealing with the "curse of dimensionality." If there are any openings that I might apply for, I would be happy to forward my resume and any supporting materials that you might require. Charles M. Bachmann Box 1843 Physics Department & Center for Neural Science Brown University Providence, R.I. 
02912 e-mail: ph706008 at brownvm From wilson at magi.ncsl.nist.gov Wed Jul 18 16:05:39 1990 From: wilson at magi.ncsl.nist.gov (Charles Wilson x2080) Date: Wed, 18 Jul 90 16:05:39 EDT Subject: character recognition Message-ID: <9007182005.AA13132@magi.ncsl.nist.gov> We have shown on a character recognition problem that neural networks are as good in accuracy as traditional methods but much faster (on a parallel computer), much easier to program (a few hundred lines of parallel Fortran), and less brittle. See C. L. Wilson, R. A. Wilkinson, and M. D. Garris, "Self-Organizing Neural Network Character Recognition on a Massively Parallel Computer", Proc. of the IJCNN, vol. 2, pp. 325-329, June 1990. From shaw_d at clipr.colorado.edu Wed Jul 18 18:22:00 1990 From: shaw_d at clipr.colorado.edu (Dave Shaw) Date: 18 Jul 90 16:22:00 MDT Subject: Networks for pattern recognition problems? Message-ID: To date we have compared the expert system originally built for the task with many configurations of neural nets (based on your work), multiple linear regression equations, discriminant analysis, many types of nearest neighbor systems, and some work on automatic decision tree generation algorithms. Performance is measured both by the ROC P_a (which turns out to be only a moderate indicator of performance, due to the unequal n's in the two distributions) and by maximum percent correct, given the optimal bias setting. All systems have been trained and tested on the same sets of training and test data. As I indicated before, the story isn't completely in yet, but it is very hard to show significant differences between any of these systems on the solar flare task. Dave From uvm-gen!idx!gsk at uunet.UU.NET Wed Jul 18 17:45:41 1990 From: uvm-gen!idx!gsk at uunet.UU.NET (George Kaczowka) Date: Wed, 18 Jul 90 16:45:41 EST Subject: Networks for pattern recognition problems? 
Message-ID: <9007182148.AA22762@uvm-gen.uvm.edu> > Do you know of any references to work done using connectionist (neural) > networks for pattern recognition problems? I particularly am interested > in problems where the network was shown to outperform traditional algorithms. > > I am working on a presentation to NOAA (National Oceanic and Atmospheric > Admin.) management that partially involves pattern recognition > and am trying to argue against the statement: > "...results thus far [w/ networks] have not been notably more > impressive than with more traditional pattern recognition techniques". > > I have always felt that pattern recognition is one of the strengths of > connectionist network approaches over other techniques and would like > some references to back this up. > > thanks much, rich > ======================================================================== Rich -- I don't know if this helps, but a company in Providence RI called NESTOR has put together a couple of products.. some of which have been customized systems for customers solving pattern recognition problems.. One I remember was regarding bond trading in the financial world.. I seem to remember that the model outperformed the "experts" by at least 10-15%, and that this was used (and is as far as I know) by some on Wall Street. I know that they have been in the insurance field for claim analysis as well as physical pattern recognition.. They were founded by a PhD out of Brown University, and I am sure that you could obtain reference works from them.. I understand that they are involved in a few military pattern recognition systems for fighters as well.. Good luck.. I was interested in their work some time ago, but have been off on other topics for over a year.. 
-- George -- ------------------------------------------------------------ - George Kaczowka IDX Corp Marlboro, MA - gsk at idx.UUCP - ------------------------------------------------------------ From marwan at ee.su.oz.AU Fri Jul 20 09:48:46 1990 From: marwan at ee.su.oz.AU (Marwan Jabri) Date: Fri, 20 Jul 90 08:48:46 EST Subject: pattern recognition Message-ID: <9007192248.AA07141@ee.su.oz.AU> We have been working on the application of neural nets to the pattern recognition of ECG signals (medical). I will be happy to mail you some of our very good results, which are better than what has been achieved using conventional techniques. Is this the sort of thing you are looking for? What medium would you like? Marwan Jabri ------------------------------------------------------------------- Marwan Jabri, PhD Email: marwan at ee.su.oz.au Systems Engineering and Design Automation Tel: (+61-2) 692-2240 Laboratory (SEDAL) Fax: (+61-2) 692-3847 Sydney University Electrical Engineering NSW 2006 Australia From PP219113 at tecmtyvm.mty.itesm.mx Fri Jul 20 12:10:26 1990 From: PP219113 at tecmtyvm.mty.itesm.mx (PP219113@tecmtyvm.mty.itesm.mx) Date: Fri, 20 Jul 90 10:10:26 CST Subject: Networks for pattern recognition problems? Message-ID: <900720.101026.CST.PP219113@tecmtyvm.mty.itesm.mx> hi, David J. Burr (in "Experiments on Neural Net Recognition of Spoken and Written Text", IEEE Trans. on ASSP, vol. 36, no. 7, pp. 1162-68, July 88) suggests that NN and nearest-neighbor classification perform at nearly the same level of accuracy. My own experience with character recognition using neural nets actually suggests that NNs have better performance than nearest neighbor and hierarchical clustering. (I suggest talking to Prof. Kelvin Wagner, ECE, UC-Boulder.) See also "Survey of Neural Net Tech for Automatic Target Recognition" by M. W. Roth, IEEE Trans. on Neural Networks, March 90, p. 28. 
jose luis contreras-vidal From shaw_d at clipr.colorado.edu Fri Jul 20 17:32:00 1990 From: shaw_d at clipr.colorado.edu (Dave Shaw) Date: 20 Jul 90 15:32:00 MDT Subject: Networks for pattern recognition problems? Message-ID: Rich- the network configurations we have used are all single hidden layer of varying size (except for 1 network with no hidden layer). Hidden layer size has been varied from 1 to 30 units. Input layer=17 units, output layer=1 unit. All activation functions sigmoidal. As I indicated before, there was essentially no difference between any of the networks. We are moving towards a paper (one at least) and this work will likely be included as part of my dissertation as well. Dave From kanal at cs.UMD.EDU Thu Jul 26 22:51:18 1990 From: kanal at cs.UMD.EDU (Laveen N. KANAL) Date: Thu, 26 Jul 90 22:51:18 -0400 Subject: Summary (long): pattern recognition comparisons Message-ID: <9007270251.AA29622@mimsy.UMD.EDU> Hello folks, I have been into pattern recognition methodologies for a long time and can tell you that comparison of techniques is a tricky business involving questions of dimensionality, sample size, and error estimation procedures. For some information on these matters, see e.g., L. Kanal, "Patterns in Pattern Recognition: 1968-1974", IEEE Trans. on Information Theory, vol. IT-20, no. 6, Nov. 1974, and articles in Krishnaiah and Kanal (eds), Handbook of Statistics 2: Classification, Pattern Recognition and Reduction of Dimensionality, North-Holland, 1982, and various papers by Fukunaga in IEEE Trans. on PAMI. The papers by Halbert White in AI Expert, Dec. 89, and Neural Computation, vol. 1, indicate that, as in the perceptron days, many of the NN algorithms can be shown to be intimately related to stochastic approximation methods a la Robbins-Monro, Dvoretsky, etc. (see Sklansky & Wassel, Pattern Classifiers and Trainable Machines, Springer, 1981). 
But the results showing "Multilayer Feedforward Networks are Universal Approximators", by Hornik et al. in Neural Networks, and subsequent publications along that line suggest the multilayer networks offer access to more interesting possibilities than previous "one-shot" statistical classifiers. The one-shot is in contrast to hierarchical statistical classifiers or statistical decision trees (see Dattatreya & Kanal, "Decision Trees in Pattern Recognition", pp. 189-239, in Progress in Pattern Recognition 2, Kanal & Rosenfeld (eds), North-Holland, 1985). Using an interactive pattern analysis and classification system to design statistical decision trees on some industrial inspection data from Global Holonetics (now unfortunately defunct), I found that a 5-minute session with the interactive system ISPAHAN/IPACS led to a two-level classifier implementing Fisher discriminants at each level, which got an error rate generally better than that which Global Holonetics had obtained using a NN board and error backpropagation running several thousand iterations on the training set. The point is that the same technology that is making neural net simulators so user-friendly is also available now to make IPACS user-friendly. However, as in the early days (see L. Kanal, "Interactive Pattern Analysis and Classification Systems--A Survey and Commentary", Proc. IEEE, vol. 60, pp. 1200-1215, Oct. 72), the NN algorithms are a lot easier for many engineers and computer scientists to get into than the highly mathematical statistical pattern recognition procedures. But pattern recognition continues to be a "bag of problems and a bag of tools", and it behoves us to understand the various tools available to us rather than expecting any one given methodology to do it all or do it consistently better than all others. So we have statistical, linguistic, structural (grammars, AND/OR graphs, SOME/OR graphs), ANN, genetic algorithms, etc. 
methods available, and often a hybrid approach will be what satisfies a problem domain. As has been said, "he who only has a hammer thinks the whole world is a nail". I think the recent developments in ANN's are quite exciting and there are many new challenging problems to understand and resolve. But there is no free lunch, and we should not expect ANN's to free us from having to think hard about the true nature of the pattern recognition problems we wish to solve. By the way, our quick comparison of ISPAHAN/IPACS with BP was presented in a paper: Kanal, Gelsema, et al., "Comparing Hierarchical Statistical Classifiers with Error Backpropagation Neural Nets", Vision 89, Society for Manufacturing Engineers, Detroit, 1989. Please excuse the long note. I would have made it shorter if I had more time. L.K. From geoffg at cogs.sussex.ac.uk Fri Jul 27 05:35:58 1990 From: geoffg at cogs.sussex.ac.uk (Geoffrey Goodhill) Date: Fri, 27 Jul 90 10:35:58 +0100 Subject: pattern recognition comparisons Message-ID: <20618.9007270935@ticsuna.cogs.susx.ac.uk> re. Mike Mozer's comments about NN's vs. conventional techniques. > Neural net algorithms just let you > do a lot of the same things that traditional statistical algorithms allow > you to do, but they are more accessible to many people (and perhaps > easier to use). I entirely agree with this point of view. I argue that the neat thing about Neural Networks is that they provide insight into what sort of mathematics may be useful for understanding the brain, and hence how to solve perceptual-type problems: statistics, gradient descent, mean field theory etc. However, I see no reason why NN implementations of these sorts of mathematics, which are constrained by attempting to be "biologically plausible", should outperform more general methods which do not have this constraint, for practical real-world problems. 
Geoff From thsspxw at iitmax.iit.edu Fri Jul 27 14:45:08 1990 From: thsspxw at iitmax.iit.edu (Peter Wohl) Date: Fri, 27 Jul 90 12:45:08 -0600 Subject: abstract Message-ID: <9007271845.AA18103@iitmax.iit.edu> This is the abstract of a paper to appear in the Proceedings of the International Conference on Parallel Processing, 08/13-17, 1990: SIMD Neural Net Mapping on MIMD Architectures ============================================= Peter Wohl and Thomas W. Christopher Dept. of Computer Science Illinois Institute of Technology IIT Center, Chicago, IL 60616 Abstract -------- The massively parallel, communication-intensive SIMD algorithm of multilayer back-propagation neural networks was encoded for coarse-grained MIMD architectures using a relatively low-level message-driven programming paradigm. The computation/communication ratio was software-adjusted by defining a "temporal window" grain over the set of training instances. Extensive experiments showed an improvement in multiprocessor utilization over similar reported results, and the simulator scaled up well with more complex networks on larger machines. The code can be easily modified to accommodate back-prop variations, like quickprop or cascade-correlation learning, as well as other neural network architectures and learning algorithms. [] Copies of the paper can be obtained by writing to the authors at the above address, or emailing to: thsspxw at iitmax.iit.edu (P. Wohl). -Peter Wohl
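[Digest postscript] The "temporal window" grain in the abstract above amounts to what is elsewhere called batched training: accumulate the gradient over a window of training instances and apply one weight update per window, trading gradient freshness for fewer, larger communication steps. A rough serial sketch in Python/numpy of that idea, for a single-hidden-layer sigmoid network of the kind discussed in this thread (the toy XOR task, layer sizes, and learning rate are illustrative assumptions, not taken from the Wohl & Christopher paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, n_hidden=8, window=16, epochs=300, lr=0.5):
    # One weight update per "temporal window" of training instances;
    # on a MIMD machine this is where communication would be amortized.
    n_in = X.shape[1]
    W1 = rng.normal(0.0, 0.5, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, 0.5, (n_hidden, 1));    b2 = np.zeros(1)
    losses = []
    for _ in range(epochs):
        for s in range(0, len(X), window):
            xb, yb = X[s:s + window], y[s:s + window]
            h = sigmoid(xb @ W1 + b1)              # forward: hidden layer
            out = sigmoid(h @ W2 + b2)             # forward: output layer
            d_out = (out - yb) * out * (1 - out)   # backprop squared error
            d_h = (d_out @ W2.T) * h * (1 - h)
            W2 -= lr * h.T @ d_out / len(xb); b2 -= lr * d_out.mean(0)
            W1 -= lr * xb.T @ d_h / len(xb);  b1 -= lr * d_h.mean(0)
        out_all = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
        losses.append(float(np.mean((out_all - y) ** 2)))
    return (W1, b1, W2, b2), losses

# Toy two-class problem: XOR of two binary inputs.
X = rng.integers(0, 2, (128, 2)).astype(float)
y = (X[:, 0] != X[:, 1]).astype(float).reshape(-1, 1)
params, losses = train(X, y)
print("MSE per epoch, first vs last: %.4f -> %.4f" % (losses[0], losses[-1]))
```

Enlarging `window` gives fewer, coarser updates (cheaper communication per instance); shrinking it toward 1 recovers per-instance updates. This sketch is serial; the paper's contribution is the MIMD mapping itself.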