From MARTIJN.BOET at wkap.nl Mon Jun 2 08:04:41 1997
From: MARTIJN.BOET at wkap.nl (Martijn Boet)
Date: Mon, 2 Jun 1997 14:04:41 +0200
Subject: Neural Processing Letters - Tables of Contents Volume 5, Issues 1 & 2
Message-ID: <7332041302061997/A57767/WKVAX5/11B613293200*@MHS>

Dear Sirs,

Please feel free to visit the following URL for information on the journal Neural Processing Letters, the Journal for Rapid Publication of Neural Network Research:

http://kapis.www.wkap.nl/kapis/CGI-BIN/WORLD/journalhome.htm?1370-4621

Neural Processing Letters

Table of Contents, Volume 5, Issue 1, February 1997

Implementation of the Exclusive-Or Function in a Hopfield Style Recurrent Network
Roelof Brouwer, pp. 1-7

Learning Temporally Encoded Patterns in Networks of Spiking Neurons
Berthold Ruf, Michael Schmitt, pp. 9-18

Exploitation of Contributions to Information Gain, from Learning Analysis to Architecture Synthesis
Bertrand Augereau, Thierry Simon, Jacky Bernard, pp. 19-24

Prediction of Neural Net Tolerance to Noise
Khalid A. Al-Mashouq, pp. 25-34

The LBG-U Method for Vector Quantization - an Improvement over LBG Inspired from Neural Networks
Bernd Fritzke, pp. 35-45

Efficient Minimisation of the KL Distance for the Approximation of Posterior Conditional Probabilities
M. Battisti, P. Burrascano, D. Pirollo, pp. 47-55

Table of Contents, Volume 5, Issue 2, April 1997

The Metric Structure of Weight Space
S.M. Ruger and A. Ossen, pp. 63-71

A Constrained Generative Model Applied to Face Detection
R. Feraud, O. Bernier and D. Collobert, pp. 73-81

Virtual Sample Generation Using a Population of Networks
S. Cho, M. Jang and S. Chang, pp. 83-89

Training a Neuron in Sequential Learning
H. Poulard and N. Hernandez, pp. 91-96

Recognising Simple Behaviours Using Time-Delay RBF Networks
A.J. Howell and H. Buxton, pp. 97-104

Modified Oja's Algorithms for Principal Subspace and Minor Subspace Extraction
T. Chen, pp. 105-110

Vision Experiments with Neural Deformable Template Matching
D.S. Banarse and A.W.G. Duller, pp. 111-119

Increasing Attraction of Pseudo Inverse Autoassociative Networks
D.O. Gorodnichy and A.M. Reznik, pp. 121-125

Feedforward Neural Networks Based Input-Output Models for Railway Carriage System Identification
T.W.S. Chow and O. Shuai, pp. 127-137

Computationally Efficient Approximation of a Probabilistic Model for Document Representation in the WEBSOM Full-Text Analysis Method
S. Kaski, pp. 139-151

Interweaving Kohonen Maps of Different Dimensions to Handle Measure Zero Constraints on Topological Mappings
L. Manevitz, pp. 153-159

Instruction for Authors, pp. 161-165

If you have any questions about this journal, please feel free to contact me at: martijn.boet at wkap.nl

With kind regards,

Kluwer Academic Publishers
Martijn Boet
Product Manager
martijn.boet at wkap.nl

From bishopc at helios.aston.ac.uk Tue Jun 3 08:46:40 1997
From: bishopc at helios.aston.ac.uk (Prof. Chris Bishop)
Date: Tue, 03 Jun 1997 13:46:40 +0100
Subject: Newton Institute
Message-ID: <3514.199706031246@sun.aston.ac.uk>

Neural Networks and Machine Learning
------------------------------------

A six month programme at the Isaac Newton Institute, Cambridge, U.K.
July to December, 1997

http://www.newton.cam.ac.uk/programs/nnm.html

Junior Members Scheme
=====================

The Newton Institute recognises that junior researchers have much to contribute to and much to gain from Institute programmes and events, and has therefore introduced the Junior Members Scheme. To be eligible for Junior Membership of the Institute you must be a Research Student or within 5 years of having received a PhD (with appropriate allowance for career breaks), and you must work or study in a UK university or a related research institution.

Junior Members will receive:

* A bulletin giving regular advance information about programmes, workshops, conferences and other Institute events
* Information about suitable sources of funding or support for visits to the Institute, when available.
Newton Institute Junior Member Grants
-------------------------------------

The Institute will make available some of its general funds specifically to support junior researchers' involvement in Institute activities. The types of involvement to be supported include (but are not limited to) attendance at workshops, conferences etc., and visits of up to 2 weeks to work or study with longer-term participants in the Institute's programmes.

Further Information and Application Form
----------------------------------------

Full details of the Junior Members Scheme, as well as an application form, are available from http://www.newton.cam.ac.uk/junior.html

Mailing List
------------

To subscribe to the Junior Members' mailing list, send an e-mail to majordomo at newton.cam.ac.uk (leaving the subject field blank) in which the body of the message contains the line

    subscribe newton-jmember [your e-mail address]

You may subscribe to this list even if you are not eligible to become a Junior Member.

From malti at mpce.mq.edu.au Wed Jun 4 01:00:13 1997
From: malti at mpce.mq.edu.au (Malti Patel)
Date: Wed, 4 Jun 1997 15:00:13 +1000
Subject: Research Assistant Position available
Message-ID: <199706040500.PAA12166@krakatoa.mpce.mq.edu.au>

RESEARCH ASSISTANT position available at the
Department of Computing and Department of Behavioural Sciences
Macquarie University, Sydney, Australia

A research assistant (RA) is required to work on a large ARC grant concerned with investigating semantic representations and their implementation in the Dual-Route Cascaded Reading Model (DRC). This work will be part of ongoing research into this area at Macquarie University and will build on previous results. The RA will be involved in performing experiments to collect semantic data, in addition to running simulations using this data in the DRC.
It is therefore desirable that the candidate have at least a II.1 Honours degree or equivalent postgraduate research experience, and also have experience in the following areas:

* neural networks
* cognitive modelling
* good programming skills in C/C++

The position is available for a year, starting immediately, with a possibility of extension. The salary will be approximately $34,000, depending on experience. Applications in the form of a covering letter, plus a CV including the names and addresses of two referees, should be sent or emailed to the address below by June 30th, 1997.

Dr Malti Patel
Department of Computing
School of MPCE
Macquarie University
Sydney NSW 2109
Australia
malti at mpce.mq.edu.au

From omori at cc.tuat.ac.jp Wed Jun 4 04:53:03 1997
From: omori at cc.tuat.ac.jp (Takashi Omori)
Date: Wed, 4 Jun 1997 17:53:03 +0900 (JST)
Subject: Call for Papers : ICONIP'98-Kitakyushu
Message-ID: <199706040853.RAA27168@cc.tuat.ac.jp>

*****************************************
*            Call for Papers            *
*****************************************

ICONIP'98-Kitakyushu
The Fifth International Conference on Neural Information Processing

Organized by the Japanese Neural Network Society (JNNS)
Sponsored by the Asia Pacific Neural Network Assembly (APNNA)

October 21-23, 1998
Kitakyushu International Conference Center
3-9-30 Asano, Kokura-ku, Kitakyushu 802, Japan

The annual conference of the Asia Pacific Neural Network Assembly, ICONIP*98, will be held jointly with the ninth annual conference of the Japanese Neural Network Society, from 21 to 23 October 1998 in Kitakyushu, Japan. The goal of ICONIP*98 is to provide a forum for researchers and engineers from academia and industry to meet and to exchange ideas on advanced techniques and recent developments in neural information processing. The conference further serves to stimulate local and regional interest in neural information processing and its potential applications to industries indigenous to this region.
Topics of Interest

Track 1: Neurobiological Basis of Brain Functions
Track 2: Mathematical Theory of Brain Functions
Track 3: Cognitive and Behavioral Aspects of Brain Functions
Track 4: Theoretical and Technical Aspects of Neural Networks
Track 5: Distributed Processing Systems
Track 6: Applications of Neural Networks
Track 7: Implementations of Neural Networks

Topics covered (keywords):

Neuroscience, Neurobiology and Biophysics, Learning and Plasticity, Sensory and Motor Systems, Cognition and Perception

Algorithms and Architectures, Learning and Generalization, Memory, Neurodynamics and Chaos, Probabilistic and Statistical Methods, Neural Coding

Emotion, Consciousness and Attention, Visual and Auditory Computation, Speech and Languages, Neural Control and Robotics, Pattern Recognition and Signal Processing, Time Series Forecasting, Blind Separation, Knowledge Acquisition, Data Mining, Rule Extraction

Emergent Computation, Distributed AI Systems, Agent-Based Systems, Soft Computing, Real World Systems, Neuro-Fuzzy Systems

Neuro Device and Hardware, Neuro and Brain Computers, Software Tools, System Integration

Conference Committee

Conference Chair: Kunihiko Fukushima, Osaka University
Conference Vice-chair: Minoru Tsukada, Tamagawa University
Organizing Chair: Shuji Yoshizawa, Tokyo University
Program Chair: Shiro Usui, Toyohashi University of Technology

International Advisory Committee (tentative)

Chair: Shun-ichi Amari, Institute of Physical and Chemical Research
Members: S. Bang (Korea), J. Bezdek (USA), J. Dayhoff (USA), R. Eckmiller (Germany), W. Freeman (USA), N. Kasabov (New Zealand), H. Mallot (Germany), G. Matsumoto (Japan), N. Sugie (Japan), R. Suzuki (Japan), K. Toyama (Japan), Y. Wu (China), L. Xei (Hong Kong), J. Zurada (USA)

Call for Papers

The Program Committee is looking for original papers on the above mentioned topics.
Authors should pay special attention to explaining the theoretical and technical choices involved, point out possible limitations, and describe the current state of their work. All received papers will be reviewed by the Program Committee. The authors will be informed of the decision of the review process by June 22, 1998. All accepted papers will be published. As the conference is a multi-disciplinary meeting, papers are required to be comprehensible to a wide audience rather than only to a very specialized one.

Instructions to Authors

Papers must be received by March 31, 1998. The papers must be submitted in camera-ready format. Electronic or fax submission is not acceptable. Papers will be presented at the conference in either an oral or a poster session. Please submit the completed original pages and five copies of the paper, written in English, with backing material in a large mailing envelope. Do not fold or bend your paper in any way.

Papers must be prepared on A4-format white paper with one inch margins on all four sides, in two column format, on not more than 4 pages, single-spaced, in Times or a similar font of 10 points, and printed on one side of the page only. Centered at the top of the first page should be the complete title, author(s), mailing and e-mail addresses, followed by a 100-150 word abstract and the text. An extra 2 pages are permitted at a cost of 5000 yen/page. Use black ink. Do not use any other color, either in the text or illustrations. The proceedings will be printed with black ink on white paper. In the covering letter, the track and the topic of the paper according to the list above should be indicated. No changes will be possible after submission of your manuscript.

Authors may also retrieve the ICONIP style "iconip98.tex", "iconip98.sty" and "sample.eps" files (they are compressed as form.tar.gz) for the conference via WWW at URL http://jnns-www.okabe.rcast.u-tokyo.ac.jp/jnns/ICONIP98.html.
Language

The use of English is required for papers and presentations. No simultaneous interpretation will be provided.

Registration

The deadline for registration for speakers, and for early registration for non-speakers, with remittance, is July 31, 1998. Both the registration sheet and the fee must be received by that day. The registration fee (in Japanese yen) will be as follows:

                                Deadline              General Participant   Student
  Speaker                       July 31, 1998         \40,000               \10,000
  Non-speaker (early regist.)   July 31, 1998         \40,000               \10,000
  Non-speaker                   After July 31, 1998   \50,000               \15,000

The registration fee for a General Participant includes attendance at the conference, proceedings, banquet and reception. The registration fee for a Student includes attendance at the conference and proceedings.

Conference Venue

Kitakyushu is a northern city on Kyushu Island, southwest of Japan's main islands. The area is one of Japan's major industrial regions, and also has a two-thousand-year history in ancient Japanese and Chinese records. There are direct flights from major Asian and American airports. You will be able to enjoy technical tours and excursions in the area.

Passport and Visa

All foreign attendants entering Japan must possess a valid passport. Those requiring visas should apply to the Japanese consular or diplomatic mission in their own country prior to departure. For details, participants are advised to consult their travel agents, airline reservation office or the nearest Japanese mission.

Accommodation

Hotel information will be available in the Second Circular.

Events

An exhibition, poster sessions, workshops and a forum will be held at the conference. Two satellite workshops will be held just before or after the conference.

Social Events

A banquet, reception and excursion will be held at the conference. The details will be announced in the Second Circular.

Workshops

Two satellite workshops will be held. One is the "Satellite Workshop for Young Researchers on Information Processing", which will be held after the conference.
The details are announced in the attached paper. Another workshop, "Dynamical Brain", is being planned. This will take place at the Brain Science Research Center, Tamagawa University Research Institute. The details will be announced in the Second Circular. Please see the Second Circular for more information on these workshops, and possibly other new ones.

Important Dates for ICONIP*98

Papers Due: March 31, 1998
Notification of Paper Acceptance: June 22, 1998
Second Circular (with Registration Form): June 22, 1998
Registration of at least one author of a paper: July 31, 1998
Early Registration: July 31, 1998
Conference: October 21-23, 1998
Workshop: October 24-26, 1998

Further Information & Paper Submissions

ICONIP*98 Secretariat
Mr. Masahito Matsue
Japan Technical Information Service
Sogo Kojimachi No.3 Bldg.
1-6 Kojimachi, Chiyoda-ku, Tokyo 102, Japan
Tel: +81-3-3239-4565  Fax: +81-3-3239-4714
E-mail: KYD00356 at niftyserve.or.jp

Could you suggest ICONIP*98-Kitakyushu to friends and acquaintances who might be interested? Thank you.

-------------------------------------------------------------
ICONIP*98-Kitakyushu, 21-23 October, 1998
Tentative Registration (PLEASE PRINT)

Name (Professor / Dr. / Ms. / Mr.):
    Last Name:            First Name:            Middle Name:
Affiliation:
Address:
Country:
Telephone:
Fax:
E-mail:

[ ] I intend to submit a paper. The tentative title of my paper is:
    -------------------------------------------------------
[ ] I intend to attend the conference.
[ ] I want to receive the Second Circular.

Please mail a copy of this completed form to:

ICONIP*98 Secretariat
Mr. Masahito Matsue
Japan Technical Information Service
Sogo Kojimachi No.3 Bldg.
1-6 Kojimachi, Chiyoda-ku, Tokyo 102, Japan
Tel: +81-3-3239-4565  Fax: +81-3-3239-4714
E-mail: KYD00356 at niftyserve.or.jp

----------------------------------------------------------------
Satellite Workshop of ICONIP*98

Workshop for Young Researchers on Information Processing in the Central Nervous System
--- Experimental and Theoretical Studies ---
October 24-26, 1998

Site

Hiko-san Seminar House (tentative)

Aim

This workshop is a satellite meeting of ICONIP'98-Kitakyushu, to be held in Kitakyushu, Japan, on October 21-23, 1998. In this workshop, invited talks by young researchers in the field of neuroscience will be given, together with talks by young authors selected from among those who submitted papers to ICONIP'98. The talks will cover a broad range of topics in the field of neuroscience, to promote interdisciplinary interactions and to familiarize the participants with a wide range of experimental and theoretical methodologies. About one hour and 30 minutes is assigned to each talk to allow in-depth discussion, which benefits both speakers and audience: better understanding of the talks for the audience, and possible improvement of the paper for the speaker. Approximately 40 researchers will be able to participate. The organizer will provide partial support for room and board. The workshop will run from morning to evening every day, and informal discussion will follow at night. The organizers expect attendees to exchange information and build new friendships through the workshop.
Topics

Information processing in single cells / Sensory information processing / Learning and memory / Motor learning and adaptive control / Computational perspective / Methodology for data analysis / Hardware implementation

Application

A young author who is interested in participating in the workshop should mark "WS" in red on all six copies of the manuscript of his or her paper submitted to ICONIP'98, or contact the workshop organizer:

Shunsuke Sato
Fax: +81-6-853-1809
e-mail: sato at bpe.es.osaka-u.ac.jp

Information

A bus will leave from the gate of the Kitakyushu International Conference Center for the workshop site at 18:00, Oct. 23, 1998.

--iconip98.sty------------------------------------------------
% proc.sty 24-Sep-85
%
% This style file is originally for aw93.
%
% Modified (sc) 6-July-87
% DARPA style modified for aw93 by JBH2 and RMS
% Reformatted (rms) 27-Jan-92
% Title and abstract sizes moved to .sty file (Magerman) 31-1-92
% Reformatted (rms) 15-Jan-93
% Modified by Matsuo and Kuroyanagi (Nagoya I.T.) for ICONIP97 April 97.
%
% modified by kuroyanagi
\def\large{\@setsize\large{12pt}\xpt\@xpt}
% end modified
\typeout{Document Style Option 'aw93' (1/93 version)}

\let\yes=y
\let\no=y
\let\secnums=\yes
\def\sitereport {\global\let\secnums=n}
\def\thesection {}
\def\thesubsection {}
\def\thesubsubsection {}

%% Redefine these for more compact headings:
%\def\section{\@startsection {section}{1}{\z@}
%{-2.0ex plus -0.5ex minus -.2ex}{3pt plus 2pt minus 1pt}{\large\bf}}
%\def\subsection{\@startsection{subsection}{2}{\z@}
%{-2.0ex plus -0.5ex minus -.2ex}{3pt plus 2pt minus 1pt}{\large\bf}}
%\def\subsubsection{\@startsection{subsubsection}{3}{\z@}
%{-6pt plus 2pt minus 1pt}{-1em}{\normalsize\bf}}

%% Redefine margins etc.
\voffset .19in \oddsidemargin -.25in \evensidemargin -.25in
% \topmargin -.17in \headheight 0pt \headsep 0pt \footheight 12pt
% modified by matsuo
\topmargin -.50in \headheight 0pt \headsep 0pt \footheight 12pt
% end
\footskip 30pt \marginparwidth 0pt \marginparsep 0pt
% modified by matsuo
%\textheight 9.00truein \textwidth 7.0truein \columnsep .25truein
\textheight 9.50truein \textwidth 7.0truein \columnsep .25truein
% end modified
% \columnseprule 0pt

%% make table and figure numbers (bold)
% \def\thetable{\@arabic\c@table}
% \def\fnum@table{Table \thetable}
% \def\thefigure{\@arabic\c@figure}
% \def\fnum@figure{Figure \thefigure}

%% redefine the description list environment
\def\descriptionlabel#1{\raggedright\bf #1}
\def\description{\list{}{
 \leftmargin=5em \labelwidth=4em \labelsep=1em
 \let\makelabel\descriptionlabel}}
\let\enddescription\endlist

%% this kills page number generation
\def\@oddhead{}\def\@evenhead{}
\def\@oddfoot{}
\def\@evenfoot{\@oddfoot}

%% Set up title format
\def\institution#1{\gdef\@institution{#1}}
% add by matsuo
\def\email#1{\gdef\@email{#1}}
% end add
\def\maketitle{\par
 \begingroup
 \def\thefootnote{\fnsymbol{footnote}}
 \def\@makefnmark{\hbox to 0pt{$^{\@thefnmark}$\hss}}
 \twocolumn[\@maketitle]
 \@thanks
 \endgroup
 \setcounter{footnote}{0}
 \let\maketitle\relax
 \let\@maketitle\relax
 \gdef\@thanks{}\gdef\@author{}\gdef\@title{}\let\thanks\relax}

% title's vertical height is no more/no less than 2.4 inches
\def\@maketitle{\vbox to 1.8in{\hsize\textwidth \linewidth\hsize
 \centering
 {\LARGE\bf \@title \par}
% \vfil {\large \bf \begin{tabular}[t]{c}\@author \end{tabular}\par}
% \vfil {\begin{tabular}[t]{c}\@institution \end{tabular}\par}
 \vspace*{3mm}
 {\large\it \begin{tabular}[t]{c}\@author \end{tabular}\par}
% add by matsuo
 \vspace*{1mm}
 {\large\it \begin{tabular}[t]{c}Email:\@email \end{tabular}\par}
% end add
 \vspace*{3mm}
 {\large\begin{tabular}[t]{c}\@institution \end{tabular}\par}
 \vfil}}
\def\copyrightspace#1{\unmarkedfootnote{\begin{minipage}[t]{4.23in}
 \parindent 1em \input{#1}\vrule height 3pt width 0pt \end{minipage}}}

\def\abstract{\let\secnums=n \section{ABSTRACT} \small}
\def\endabstract{\par\@endparenv}
%add by matsuo
\def\keyword{\let\secnums=n \vspace*{-3mm}\subsubsection{KEYWORDS:} \small\bf}
\def\endkeyword{\par\@endparenv}
%add by matsuo

% add by matsuo for keyword
%\newbox\abstractbox
%\setbox\abstractbox\hbox{}
%\def\abstract{\global\setbox\abstractbox=\hbox\bgroup%
%\begin{minipage}[t]{137.5mm}%11Q 50zw
%{\small\baselineskip=\bf\egtgt\hskip1zw $B$"$i$^$7 (B\hskip1zw}\small\cntrm\ignorespaces}
%\def\endabstract{\end{minipage}\egroup}
%\newbox\keywordbox
%\setbox\keywordbox\hbox{}
%\def\keyword{\global\setbox\keywordbox=\hbox\bgroup%
%\begin{minipage}[t]{100mm}%11Q 50zw
%{\small\bf Keyword:}\small\ignorespaces}
%\def\endkeyword{\end{minipage}\egroup}
% end add

%\def\endabstract{\par\@endparenv\vskip 0.2in\vskip -3.5ex}
%\def\abstract{\section*{ABSTRACT}}
%\def\endabstract{\par}

% taken from twocolumn.sty 27 Jan 85
\twocolumn
\sloppy
\flushbottom
\parindent 0em
\leftmargini 2em
\leftmarginv .5em
\leftmarginvi .5em

% format for references
\def\references{\let\secnums=n \section{References}
 \topsep=0pt plus 2pt
 \partopsep=0pt plus 2pt
 \def\baselinestretch{0.89}\small
 \vskip -9pt
 \list{\hfill [\arabic{enumi}]}
 {\settowidth\labelwidth{[99]}\leftmargin\labelwidth
 \advance\leftmargin\labelsep
 \usecounter{enumi}}
 \def\newblock{\hskip .11em plus .33em minus -.07em}
 \sloppy
 \sfcode`\.=1000\relax}
\let\endreferences=\endlist
\let\bibliography\references
\let\endbibliography\endreferences

%section format instructions
%\def\ {}
%\def\thesubsection {}
%\def\thesubsubsection {}
%\def\l@section#1#2{\relax}
%\let\l@subsection\l@section
%\let\l@subsubsection\l@section

\def\section#1{\@startsection{section}{1}{0pt}
 {1ex plus 0.3ex minus 0.3ex}
 {0.8ex plus 0.2ex minus 0.2ex}
 {\centering\large\bf{
% {\large\bf{
 \ifx\secnums\yes
 \arabic{section}. \else \addtocounter{section}{-1} \fi
 #1}}\relax
 \vskip -\parskip}

\def\subsection#1{\@startsection{subsection}{2}{0pt}
 {0.25ex plus .2ex minus .1ex}
 {1.5ex plus 0.1ex minus 0.1ex}
 {\large\bf
 \ifx\secnums\yes \arabic{section}.\arabic{subsection}. \else \addtocounter{subsection}{-1} \fi
 #1}\relax
 \vskip -\parskip}

\def\subsubsection#1{\@startsection{subsubsection}{3}{\parindent}{0pt}
 {-0.8em}{\bf}*{\bf{\vphantom{Zy}#1}}}
\let\paragraph\subsubsection

\renewcommand{\topfraction}{.9}
\renewcommand{\dbltopfraction}{.9}
\renewcommand{\bottomfraction}{.5}
\renewcommand{\textfraction}{.15}
\renewcommand{\floatpagefraction}{.85}
\renewcommand{\dblfloatpagefraction}{.85}
\setcounter{totalnumber}{3}
\setcounter{topnumber}{1}
\setcounter{dbltopnumber}{1}
\setcounter{bottomnumber}{1}
\setlength{\parskip}{2ex}
\def\ninept{\def\baselinestretch{.95}\let\normalsize\small\normalsize}

--iconip98.tex------------------------------------------------
%\documentstyle[aw93,times,psfig]{article}
\documentstyle[iconip98,epsbox]{article}

\begin{document}
\ninept

\title{ICONIP '98---TITLE OF THE PAPER}
\author{Aikon Nippu \dag and Kyuju Hachi \dag\dag}
\email{Nippu at dept.univ.ac.jp}
\institution{\dag Dept. of Computer Eng., Univ University\\
Town1, City1, Prefecture 777, Japan\\
\dag\dag Dept. of Information, Univ University\\
Town2, City2, Prefecture 777, Japan}
\maketitle

\begin{abstract}
This is an example of an ICONIP '98 proceedings paper\footnote{This is a sample of a footnote.}, as you should submit it. All manuscripts must be in English. Electronic or fax submission is not acceptable. Use a two column format, with the title and author information centered across both columns on the first page, staying within the designated image area. Illustrations and figures may also span both columns. Dimensions for the paper are specified in this text. Your paper should begin with an abstract of about 100-150 words.
\end{abstract}

\begin{keyword}
keyword1, keyword2, keyword3
\end{keyword}

\section{INSTRUCTIONS FOR THE PROCEEDINGS PAPER}

You must use the double column style specified. Please adhere to the style guidelines described in this sample paper. This allows us to maintain uniformity in the printed copy of the proceedings.

\subsection{General Instructions}

The manuscript that you prepare will be printed as it is received. Neatness is of paramount importance. Prepare up to 4 pages of text and figures for your paper. An extra 2 pages are permitted at a cost of 5000 yen/page. Use black ink. Do not use any other color, either in the text or illustrations. The proceedings will be printed with black ink on white paper.

Enclose the completed original pages and 5 copies with a cover sheet, and backing material in a large mailing envelope; do not fold or bend your paper in any way. Papers must be received before March 31, 1998 by the ICONIP '98 Secretariat (the address is listed at the end). No electronic submission is accepted. No changes will be possible after submission of your manuscript.

\subsection{Style Guidelines}

All manuscripts must be single-spaced, in the specified two column format. A full double column page with no illustrations should contain approximately 1000 words. All text should be in Times Roman or an equivalent font, and the main text should have a font size of 10 points. Do not overcrowd and create an unreadable paper by making the lettering or the line spacing too small. The line spacing should be approximately 2.75 lines/cm (7 lines/inch).
Authors may also retrieve the ICONIP style ``iconip98.tex'', ``iconip98.sty'' and ``sample.eps'' files (they are compressed as form.tar.gz) for the conference via WWW at the following URL: http://jnns-www.okabe.rcast.u-tokyo.ac.jp/jnns/ICONIP98.html

\subsubsection{Cover Sheet}

The cover sheet should include:
\begin{quote}
(1) Paper title, (2) Track number, (3) Key words, (4) Oral / Poster, (5) Contact author's name, (6) Full mailing address, (7) Telephone number, (8) Fax number and (9) Electronic mail address
\end{quote}

\subsubsection{Paper Title}

The paper title should be in 18-point boldface capital letters, centered across the top of the two columns on the first page as indicated above.

\subsubsection{Author's Name(s)}

The author's name(s) and affiliation(s) appear below the title in 12-point capital and lower case letters. You should include a mailing address here if space permits. It is also useful to include your email address.

\subsubsection{Abstract}

Each paper should contain an abstract of about 100-150 words that appears at the beginning of the first column of the first page.

\subsubsection{Major Headings}

Major headings appear in 12-point boldface capital letters, centered in the column. Examples of the various levels of headings are included in this sample paper.

\subsubsection{Sub-headings}

Sub-headings appear in 12-point boldface capital and lower case. They start at the left margin on a separate line.

\subsubsection{Sub-sub headings}

Sub-sub headings appear in 10-point boldface capital and lower case in text. Justify and fill the text.

\subsubsection{References}

List and number all references at the end of the paper. Number the references in the order they first appear in the text. When referring to them in the text, type the corresponding reference number in square brackets as shown at the end of this paper \cite{AUTHOR1}, \cite{AUTHOR2}. %[1], [2].
\subsubsection{Illustrations}

Illustrations must appear within the designated margins, and must be part of the submitted paper.
% They may span two columns. If possible,
%position illustrations at the tops of columns, rather than the middle
%or at the bottom.
Caption and number every illustration. All halftone illustrations must be clear black and white prints. Line drawings may be made in black ink on white paper. Do not use any color illustrations.

\subsubsection{Samples of Equations}

\begin{eqnarray}
O(t) & = & \frac{1}{1-e^{I(t)}} \\
I(t) & = & \sum ^{n}_{i=1} W_{i} \cdot X(i,t) \label{eq1} \\
y & = & ax_1 + bx_2 + cx_3 + dx_4 + ex_5 \nonumber \\
  & & { } + fx_6 + gx_7 + hx_8 + jx_9 + kx_{10}
\end{eqnarray}

%\newpage
\subsubsection{Sample of Table}

This is a sample of a table.

\begin{table}[h]
\caption{A sample of a table}
\begin{center}
\begin{tabular}{|c||c|c|c|c|c|}
\hline
  & A   & B   & C   & D   & E   \\ \hline\hline
A & 1.0 & 0.4 & 0.5 & 0.9 & 0.3 \\ \hline
B & 0.1 & 1.0 & 0.5 & 0.9 & 0.3 \\ \hline
C & 0.1 & 0.8 & 1.0 & 0.3 & 0.5 \\ \hline
D & 0.2 & 0.4 & 0.5 & 1.0 & 0.3 \\ \hline
E & 0.1 & 0.4 & 0.5 & 0.4 & 1.0 \\ \hline
\end{tabular}
\end{center}
\end{table}

\subsubsection{Sample of Figure}

This is a sample of a figure.

\begin{figure}[h]
\begin{center}
\psbox[width=2.5in]{sample}
\caption{A sample of a figure}
\end{center}
\end{figure}

\subsection{Layout Specification}

You should use letter size or A4 size paper that will be printed exactly as received. Keep the top and left margins as specified. Detailed size specifications for the layout are given below. It is important that your printer produces clean, high quality output, since there will be no sharpness-enhancing reduction of what you provide. Do not use a dot matrix or other low quality printer. If a suitable printer is not available, you can type or print your paper at a larger size and reduce it on a photocopying machine to the specified dimensions.
All materials (text and illustrations) must appear within a bounding box that is 180 mm wide and 215 mm high (7.1" x 8.4"). Nothing outside this area will be printed. The bounding box should be placed in the center of the paper, with equal-sized left and right margins, and equal-sized top and bottom margins. Indicate the order of your pages by numbering them on the back using a light pencil.

Text should appear in two columns, each about 86 mm wide with a 6 mm space between the columns. On the first page, the top 47 mm (1.8") of both columns is reserved for the title, author(s) and affiliation(s). These items should be centered across both columns, starting at approximately 18 mm (0.7") from the top. On the second page, the text or illustrations start at approximately 18 mm (0.7") from the top.

\subsection{Writing Style and Other Hints}

It is sometimes difficult to describe your research work in detail and stay within 4 pages. However, with proper organization of the text you will be surprised how much information you can present. Emphasize the presentation of your new contributions. Background information and previous work can be summarized in a few lines, accompanied by the proper references.

Obvious errors or typos can easily be avoided and yet are often overlooked. Some useful hints are given below. Try to have others who are familiar with the topic, but not necessarily a co-author, proofread the text. Also, the use of spelling checkers can be quite useful. If English is not your native language, try to have a native English speaker or a professional English editor proofread your text.

Please direct any questions to the ICONIP '98 Secretariat,\\
\\
{\bf ICONIP '98 Secretariat\\
Mr.
Masahito Matsue\\
Japan Technical Information Service\\
Sogo Kojimachi No.3 Bldg., 1-6 Kojimachi,\\
Chiyoda-ku, Tokyo 102, Japan\\
Tel: +81-3-3239-4565, Fax: +81-3-3239-4714\\
E-mail: KYD00356 at niftyserve.or.jp}

\begin{references}
\bibitem{AUTHOR1} Author, O., ``This is the first Reference (Journal),'' {\it IAGNU Trans.}, {\bf Vol.E99-D}, 12, pp.123--456 (1999).
\bibitem{AUTHOR2} Author, T., {\it The Second Reference -- Book --}, Authors Press, 1900.
\bibitem{FELLOW} Fellow, A.G., ``Research about the Third Reference (Tech. Rep.),'' IEIE Tech. Rep., USO-88-123, pp.35--43 (1968).
\bibitem{FELLOW2} Fellow, A.B., ``A Model of the Fourth Reference (Proceedings),'' Proceedings of the Fourth Sample Conference on Learning, pp.1234--6543 (1957).
\bibitem{FELLOW3} Fellow, A.B., ``A Model of the Fourth Reference (Proceedings),'' Proceedings of the Fourth Sample Conference on Learning, pp.1234--6543 (1957).
\bibitem{FELLOW4} Fellow, A.B., ``A Model of the Fourth Reference (Proceedings),'' Proceedings of the Fourth Sample Conference on Learning, pp.1234--6543 (1957).
\bibitem{FELLOW5} Fellow, A.B., ``A Model of the Fourth Reference (Proceedings),'' Proceedings of the Fourth Sample Conference on Learning, pp.1234--6543 (1957).
\bibitem{FELLOW6} Fellow, A.B., ``A Model of the Fourth Reference (Proceedings),'' Proceedings of the Fourth Sample Conference on Learning, pp.1234--6543 (1957).
\bibitem{FELLOW7} Fellow, A.B., ``A Model of the Fourth Reference (Proceedings),'' Proceedings of the Fourth Sample Conference on Learning, pp.1234--6543 (1957).
\bibitem{FELLOW8} Fellow, A.B., ``A Model of the Fourth Reference (Proceedings),'' Proceedings of the Fourth Sample Conference on Learning, pp.1234--6543 (1957).
\bibitem{FELLOW9} Fellow, A.B., ``A Model of the Fourth Reference (Proceedings),'' Proceedings of the Fourth Sample Conference on Learning, pp.1234--6543 (1957).
\end{references} \end{document} ------------------------------------------------------------ end of ICONIP'98 call for paper mail From hali at theophys.kth.se Thu Jun 5 06:12:35 1997 From: hali at theophys.kth.se (Hans Liljenstrom) Date: Thu, 5 Jun 1997 06:12:35 -0400 Subject: Workshop Announcement Message-ID: <9706051012.AA15740@> Workshop on Biology as a Source of Inspiration for Technology June 19 in Sigtuna, Sweden An informal satellite meeting to the International Conference on Engineering Applications of Neural Networks, EANN'97, is organised by the Agora for Biosystems on June 19 in Sigtuna, Sweden. The theme of the workshop is "Biology as a Source of Inspiration for Technology", and it will be discussed by EANN conference participants, invited speakers and others interested in the topic. The idea is to focus attention on modern technical applications that have been inspired by biology, and to promote dialogue between theorists and practitioners in the field. The workshop takes place at the Agora for Biosystems, located in the unique building of the Sigtunastiftelsen Guesthouse in the small town of Sigtuna, some 45 km northwest of Stockholm, near Arlanda international airport. The workshop will start at 9 am on Thursday, June 19, 1997, and end around 5 pm the same day. Lunch and dinner can be served at cost at Sigtunastiftelsen or in any nearby restaurant. Accommodation is also possible at the Guesthouse, or in nearby hotels or hostels. Participants who want to stay at the Guesthouse should contact the organisers or make room reservations directly at the Guesthouse, phone +46-(0)8-592 589 00. Please also inform the organisers or the Guesthouse if you plan to have lunch and/or dinner at the Guesthouse. Informal presentations to initiate the discussions will be given by, among others, Prof. Jarl-Thure Ericsson, Dept. of Electrical Engineering and Rector of Tampere University of Technology, Prof. Erik Mosekilde, Dept. 
of Physics, the Technical University of Denmark, Prof. Wlodzislaw Duch, Dept. of Computer Methods, Nicholas Copernicus University, Torun (Poland), and Prof. Hans Wiksell, Comair AB, Stockholm. Anyone willing to give a short oral presentation is welcome to contact the organisers as soon as possible, and no later than June 12. For further information, please see the home page, http://bach.theophys.kth.se/~hali/eann97/eannwork.html or contact the organisers. All interested are welcome to contact the organisers for registration before June 12. Hans Liljenstrom Agora for Biosystems Box 57 S-193 22 Sigtuna Sweden Phone: +46-(0)8-592 509 01 (Agora) +46-(0)8-790 7172 (KTH) Fax: +46-(0)8-104879, Email: hali at theophys.kth.se. From marc at cogsci.ed.ac.uk Thu Jun 5 08:36:30 1997 From: marc at cogsci.ed.ac.uk (Marc Moens) Date: Thu, 05 Jun 1997 13:36:30 +0100 Subject: Job Advert -- Chairs in Informatics, Edinburgh Message-ID: <199706051236.NAA16051@stevenson.cogsci.ed.ac.uk> [Apologies if you receive this more than once.] Three Chairs in Informatics at the University of Edinburgh Chair of Artificial Intelligence Chair of Cognitive Science Chair of Computer Science The University intends to appoint to three Chairs, in Artificial Intelligence, Cognitive Science, and Computer Science. These posts are being filled to sustain and develop Edinburgh's identity as a centre of the highest international quality for research and teaching in Informatics. Applications are invited from candidates with an established international reputation in research. Salary will be on the Professorial scale. 
For further details see http://www.dcs.ed.ac.uk/ipu/chairs Closing date 27th June 1997 From zwerg at libra.chaos.gwdg.de Thu Jun 5 02:57:30 1997 From: zwerg at libra.chaos.gwdg.de (Dirk Brockmann) Date: Thu, 5 Jun 1997 08:57:30 +0200 (MET DST) Subject: New paper on orientation and ocular dominance available Message-ID: "Conditions for the joint emergence of orientation and ocular dominance in a high-dimensional self-organizing map," by D. Brockmann, M. Riesenhuber, H. U. Bauer, and T. Geisel (1997), at http://www.chaos.gwdg.de/~zwerg/ Abstract: We investigate a model for the joint development of orientation and ocular dominance columns which is based on the high-dimensional version of Kohonen's Self-Organizing Map algorithm. The model is driven by stimuli which have correlated left eye/right eye and on-center cell/off-center cell components. We first analyze the model using a recently proposed mathematical technique which allows us to identify regions in parameter space for which either ocular dominance or orientation columns alone develop, or both together. Guided by these results we then simulate the model and indeed find jointly developed columnar systems. The preferred angle of intersection between iso-orientation lines and the boundaries of iso-ocularity domains turns out to be close to perpendicular, consistent with experimental observations. 
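For readers unfamiliar with the model class in the abstract above, the core Kohonen self-organizing map update can be sketched in a few lines. This is a generic illustration, not the authors' code: the grid size, stimulus dimension, neighbourhood width and learning rate are invented for the example, and the correlated eye/centre-surround stimulus structure of the paper is replaced by plain random inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: a 10x10 cortical sheet, 16-dimensional stimuli
# (standing in for left/right-eye x on/off-centre components).
GRID, DIM, SIGMA, ETA = 10, 16, 1.5, 0.1

weights = rng.random((GRID, GRID, DIM))

# Grid coordinates of every unit, used by the neighbourhood function.
ys, xs = np.mgrid[0:GRID, 0:GRID]

def som_step(stimulus):
    """One Kohonen update: find the best-matching unit, then pull the
    weights of all units toward the stimulus, scaled by a Gaussian
    neighbourhood centred on the winner."""
    dists = np.linalg.norm(weights - stimulus, axis=2)
    wy, wx = np.unravel_index(np.argmin(dists), dists.shape)
    h = np.exp(-((ys - wy) ** 2 + (xs - wx) ** 2) / (2 * SIGMA ** 2))
    weights[...] += ETA * h[..., None] * (stimulus - weights)
    return wy, wx

for _ in range(1000):
    som_step(rng.random(DIM))
```

In the high-dimensional variant studied in the paper, the stimulus vector carries the full receptive-field structure, and the analysis concerns which column systems emerge as the stimulus correlations and map parameters vary.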
Dirk ********************************************************************* Dirk Brockmann e-mail: zwerg at chaos.gwdg.de Max-Planck-Institut fuer Stroemungsforschung Tel: (49) 551 5176 411 Goettingen FAX: (49) 551 5176 402 ********************************************************************* From devin at psy.uq.edu.au Fri Jun 6 05:03:45 1997 From: devin at psy.uq.edu.au (Devin McAuley) Date: Fri, 6 Jun 1997 19:03:45 +1000 (EST) Subject: Connectionist Models of Cognition: A Workshop Message-ID: *** Registration deadline extended to June 21st *** CONNECTIONIST MODELS OF COGNITION: A WORKSHOP Monday July 7th - Friday July 11th, 1997 University of Queensland Brisbane, Queensland 4072 Australia sponsored by the School of Psychology and the Cognitive Science Program Workshop Home Page: http://psy.uq.edu.au/~brainwav/Workshop/ BrainWave Home Page: http://psy.uq.edu.au/~brainwav/ This workshop provides an opportunity for faculty and students with teaching and research interests in connectionist modeling to gain hands-on modeling experience with the BrainWave neural network simulator during an intensive 5-day workshop on connectionist models of cognition at the University of Queensland. The workshop has three primary objectives: * to provide training in specific connectionist models of cognition. * to introduce instructors to the BrainWave simulator and course materials. * to support the development of new models by the participants. The first two days of the workshop will provide training in specific connectionist models of cognition and introduce participants to the BrainWave simulator. Day 3 will focus on how to develop a new model using BrainWave. On days 4 and 5, the instructors will be available to assist faculty and students in the development of new models and to discuss the development of teaching materials for undergraduate and postgraduate courses on connectionist modeling. 
Instructors: * Simon Dennis, School of Psychology * Devin McAuley, School of Psychology * Janet Wiles, Schools of Information Technology and Psychology Registration for the 5-day workshop includes: * Course materials: o The BrainWave Simulator for the Mac, Windows95, and Unix Platforms o An Introduction to Neural Networks and the BrainWave Simulator o Three chapters of a workbook on connectionist models of cognition * Morning and afternoon tea (Monday - Friday) * Lunch (Monday and Tuesday) The deadline for registration is June 21st. ---------------------------------------------------------------------------- Workshop Program Monday, July 7th Morning Session 8:00-9:00 Registration 9:00-10:30 Lecture: Introduction to connectionist models of cognition 11:00-1:00 Laboratory: Introduction to the BrainWave simulator Afternoon Session 2:00-3:00 Lecture: The Interactive Activation and Competition model of letter perception 3:30-5:00 Laboratory: The word superiority effect Tuesday, July 8th Morning Session 9:00-10:00 Lecture: The BackPropagation model of word and color naming 10:30-12:30 Laboratory: The Stroop effect Afternoon Session 1:30-3:00 Lecture: The Matrix model of episodic memory 3:30-5:00 Laboratory: Familiarity versus recognition Wednesday, July 9th Morning Session (using the Scheme language interface to modify BrainWave) 9:00-10:30 Lecture: Introduction to the Scheme programming language 11:00-12:30 Laboratory: How to extend models in BrainWave Afternoon Session (a case study of model development in BrainWave) 1:30-3:00 Lecture: The Deep Dyslexia network 3:30-5:00 Organisation of group projects Thursday, July 10th Morning Session 9:30-11:00 Group Work: Model design 11:00-12:30 Group Work: Model construction I Afternoon Session: Stream 1 (continued model construction) 1:30-3:00 Group Work: Model construction II 3:30-5:00 Group Work: Model construction III Afternoon Session: Stream 2 (using the Java language to modify BrainWave) 1:30-3:00 Lecture: Introduction to 
the Java programming language 3:30-5:00 Laboratory: How to add new architectures to BrainWave Friday, July 11th Morning Session 9:30-12:30 Group Work: Model construction IV (preparing your demonstration) Afternoon Session 1:30-3:30 Workshop presentations and discussion ---------------------------------------------------------------------------- Send general inquiries and registration form to: email: brainwav at psy.uq.edu.au fax: +61-7-3365-4466 (attn: Dr. Devin McAuley) postal mail: BrainWave Workshop (c/o Dr. Devin McAuley) School of Psychology University of Queensland Brisbane, Queensland 4072 Australia ---------------------------------------------------------------------------- REGISTRATION FORM Name: ________________________________________ Email: ________________________________________ Address: ________________________________________ ________________________________________ ________________________________________ ________________________________________ Student (AUS$60) ____ Academic (AUS$95) ____ Industry (AUS$295) ____ ACCOMMODATION I would like accommodation at King's College ($45 per night - private room with ensuite shared between two, bed and breakfast) Yes ____ No ____ Arrival Date: ____ Departure Date: ____ Accommodation total: AUS $ ______ I would like to be billeted: Yes ____ No ____ Arrival Date: ____ Departure Date: ____ Total payment including registration: AUS $______ FORM OF PAYMENT Cheque or Money Order ____ Visa ____ Mastercard ____ Card # ____________________________ Expiration Date _____ Please debit my credit card for AUS$_______________ Signature _________________________________________ Cheques and money orders should be made out to University of Queensland, School of Psychology. From bishopc at helios.aston.ac.uk Mon Jun 9 10:29:11 1997 From: bishopc at helios.aston.ac.uk (Prof. 
Chris Bishop) Date: Mon, 09 Jun 1997 15:29:11 +0100 Subject: Postdoctoral Research Fellowship Message-ID: <17994.199706091429@sun.aston.ac.uk> ---------------------------------------------------------------------- POSTDOCTORAL RESEARCH FELLOWSHIP -------------------------------- Neural Computing Research Group Dept of Computer Science & Applied Mathematics Aston University, Birmingham, UK "Automatic Analysis of Fluorescence In-Situ Hybridization Images" ----------------------------------------------------------------- *** Full details at http://www.ncrg.aston.ac.uk/ *** A mathematically-oriented researcher is required to work on the development of robust pattern recognition algorithms for the automatic interpretation of fluorescence in situ hybridization (FISH) images arising from the analysis of human chromosomes. Candidates should have an established research background in statistical pattern recognition including the use of techniques such as neural networks. Ideally, candidates should also have experience of developing solutions to problems in biomedical image interpretation. The successful candidate will join the thriving Neural Computing Research Group at Aston University, and will work under the supervision of Professor Chris Bishop. Conditions of Service --------------------- The salary will be on the RA 1A scale, within the range 15,159 to 22,785 UK pounds. This salary scale is subject to annual increments. How to Apply ------------ If you wish to be considered for this Fellowship, please send a full CV and publications list (including full details and grades of academic qualifications), a statement of how you believe you can contribute to this project, and the names and contact information of 3 referees, to: Hanni Sondermann Neural Computing Research Group Dept. of Computer Science and Applied Mathematics Aston University Birmingham B4 7ET, U.K. 
Tel: 0121 333 4631 Fax: 0121 333 4586 e-mail: H.E.Sondermann at aston.ac.uk E-mail submission of PostScript files is welcome. Closing date: 27 June 1997. From rvicente at onsager.if.usp.br Mon Jun 9 18:34:21 1997 From: rvicente at onsager.if.usp.br (Renato Vicente) Date: Mon, 9 Jun 1997 19:34:21 -0300 (EST) Subject: preprint: Functional Optimisation in Neural Networks Message-ID: <199706092234.TAA10602@gibbs.if.usp.br> Title: Functional Optimisation of Online Algorithms in Neural Networks (submitted to J. Phys. A: Math. Gen.) Authors: Renato Vicente and Nestor Caticha. Available at: http://www.fma.if.usp.br/~rvicente/nnsp.html http://xxx.lanl.gov (cond-mat/9706015) Abstract: We study the online dynamics of learning in fully connected soft committee machines in the student-teacher scenario. The locally optimal modulation function, which determines the learning algorithm, is obtained from a variational argument in such a manner as to maximise the average generalisation error decay per example. Simulation results for the resulting algorithm are presented for a few cases. The symmetric phase plateaux are found to be vastly reduced in comparison to those found when online backpropagation algorithms are used. A discussion of the implementation of these ideas as practical algorithms is given. Comments are welcome. Renato Vicente Instituto de Fisica, Universidade de Sao Paulo, Brazil From ted at SPENCER.CTAN.YALE.EDU Tue Jun 10 13:51:04 1997 From: ted at SPENCER.CTAN.YALE.EDU (ted@SPENCER.CTAN.YALE.EDU) Date: Tue, 10 Jun 1997 17:51:04 GMT Subject: Preprint available Message-ID: <199706101751.RAA02479@CHARCOT.CTAN.YALE.EDU> A preprint of The NEURON Simulation Environment M.L. Hines and N.T. Carnevale Neural Computation, in press is now available in PostScript for UNIX systems as ftp.neuron.yale.edu/neuron/papers/nsimenv.ps.Z and for MSDOS systems as ftp.neuron.yale.edu/neuron/papers/nsimenps.zip The only difference between these two files is the method of compression. 
Length about 250K, expands to about 600K when uncompressed. An HTML formatted version has been posted at http://www.nnc.yale.edu/papers/nrnrefs.html This is graced by convenient internal and external links. The paper contains much helpful material that has not appeared elsewhere, and should be of interest both to current users and to those who are deciding what simulation tools to choose. This list of section headings gives an idea of what is covered. 1. INTRODUCTION 1.1 The problem domain 1.2 Experimental advances and quantitative modeling 2. OVERVIEW OF NEURON 3. MATHEMATICAL BASIS 3.1 The cable equation 3.2 Spatial discretization in a biological context: sections and segments 3.3 Integration methods 3.3.1 Efficiency 4. THE NEURON SIMULATION ENVIRONMENT 4.1 The hoc interpreter 4.2 A specific example 4.2.1 First step: establish model topology 4.2.2 Second step: assign anatomical and biophysical properties 4.2.3 Third step: attach stimulating electrodes 4.2.4 Fourth step: control simulation time course 4.3 Section variables 4.4 Range variables 4.5 Specifying geometry: stylized vs. 3-D 4.6 Density mechanisms and point processes 4.7 Graphical interface 4.8 Object-oriented syntax 4.8.1 Neurons 4.8.2 Networks 5. SUMMARY ACKNOWLEDGMENTS REFERENCES --Ted From ted at SPENCER.CTAN.YALE.EDU Tue Jun 10 14:17:45 1997 From: ted at SPENCER.CTAN.YALE.EDU (ted@SPENCER.CTAN.YALE.EDU) Date: Tue, 10 Jun 1997 18:17:45 GMT Subject: NEURON course at SFN 1997 Message-ID: <199706101817.SAA02517@CHARCOT.CTAN.YALE.EDU> In response to yesterday's announcement of record # of abstracts at the upcoming Society for Neuroscience meeting in New Orleans, everybody seems to be making travel arrangements early-- so we'd like to bring something to your attention, just in case it affects your travel plans. John Moore, Michael Hines & I will be presenting a 1 day course on Saturday, Oct. 25, entitled "Using the NEURON Simulation Environment." 
This is a regular Satellite Symposium, and eventually you will read the announcement in SFN's preliminary program. It will run from 9 AM to 5 PM. Coffee breaks and lunch will be provided. There will be a registration fee to cover these and related costs, e.g. AV equipment rental and handout materials. This will be in the ballpark of other 1-day courses; the exact figure depends on factors we haven't yet learned from SFN. More information about this course is posted at http://www.neuron.yale.edu/sfn.html and soon also at http://neuron.duke.edu/sfn.html --Ted From at at cogsci.soton.ac.uk Tue Jun 10 12:13:39 1997 From: at at cogsci.soton.ac.uk (Adriaan Tijsseling) Date: Tue, 10 Jun 1997 16:13:39 +0000 Subject: ANN: BackProp Simulator for Macintosh Message-ID: Release: BackBrain version 1.0 Introduction BackBrain is a Macintosh shareware program that simulates the backpropagation artificial neural network algorithm. This application lets you create, train, and analyze backprop nets using an intuitive interface. BackBrain also makes it possible to create 3D models of network dynamics. A step-by-step tutorial is available as an Apple Guide document (under the Balloon Help menu). You can also turn on balloon help to quickly assess the function of particular commands. System Requirements BackBrain only works on Power Macintoshes with System 7 or later and with the Thread Manager installed. If you have System 7.5 or later then the Thread Manager is already incorporated, but for earlier system versions you need the Thread Manager extension. You will also need QuickDraw 3D 1.5.1 or later and QuickTime. Fees BackBrain is shareware with a value of $25. Site licences will be $10 for each copy. More information can be found in the Readme file. 
Download the program The BackBrain program can be found on the following site: http://www.soton.ac.uk/~agt/mac/ ______________________________________________________________ Cognitive Scientist & Mac Guru Work: Cognitive Sciences Centre / Department of Psychology, University of Southampton, Highfield, SO17 1BJ, Southampton, UK Web-Sites: Cognitive Sciences Centre: http://www.soton.ac.uk/~coglab/coglab/ Ph.D.-Research: http://www.soton.ac.uk/~coglab/coglab/formalme.html Personal WebSite: http://www.soton.ac.uk/~agt/ E-mail: mailto:at at neuro.psy.soton.ac.uk From cas-cns at cns.bu.edu Wed Jun 11 12:44:43 1997 From: cas-cns at cns.bu.edu (CAS/CNS) Date: Wed, 11 Jun 1997 11:44:43 -0500 Subject: Neural Networks: CALL FOR PAPERS Message-ID: <199706111544.LAA29783@cns.bu.edu> *****CALL FOR PAPERS***** 1998 Special Issue of Neural Networks NEURAL CONTROL AND ROBOTICS: BIOLOGY AND TECHNOLOGY Planning and executing movements is of great importance in both biological and mechanical systems. This Special Issue will bring together a broad range of invited and contributed articles that describe progress in understanding the biology and technology of movement control. Movement control covers a wide range of topics, from integration of different types of sensory information, to flexible planning of movements, to generation of motor commands, to compensation for internal and external perturbations. Of particular importance are the coordinate transformations, memory systems, and attentional and volitional mechanisms needed to implement movement control. Neural control is the study of how biological systems have solved these problems with joints, muscles, and brains. Robotics is the attempt to build mechanical systems that can solve these problems under constraints of size, weight, robustness, and cost. This Special Issue welcomes high quality articles from both fields and seeks to explore the possible synergies between them. 
CO-EDITORS: Professor Rodney Brooks, Massachusetts Institute of Technology Professor Stephen Grossberg, Boston University Dr. Lance Optican, National Institutes of Health SUBMISSION: Deadline for submission: October 31, 1997 Notification of acceptance: January 31, 1998 Format: no longer than 10,000 words; APA format ADDRESS FOR SUBMISSION: Professor Stephen Grossberg Boston University Department of Cognitive and Neural Systems 677 Beacon Street Boston, Massachusetts 02215 From suem at soc.plym.ac.uk Wed Jun 11 06:03:35 1997 From: suem at soc.plym.ac.uk (Sue McCabe) Date: Wed, 11 Jun 1997 11:03:35 +0100 Subject: Postdoctoral Research Fellowship Message-ID: <1.5.4.32.19970611100335.006b70f0@soc.plym.ac.uk> Postdoctoral Research Fellowship in Neural Network Modelling of Brain Processes Centre for Neural and Adaptive Systems (http://www.tech.plym.ac.uk/soc/research/neural/ index.html) School of Computing University of Plymouth, UK The Centre for Neural and Adaptive Systems investigates computational neural models of major brain processes underlying intelligent behaviour and uses these models to provide inspiration for the development of novel neural computing architectures and systems for intelligent sensory information processing, condition monitoring, autonomous control and robotics. The Centre carries out collaborative research with leading international experts in the UK, USA and Europe, in both the fields of computational neuroscience and neural computation. A Research Fellow is sought by the Centre to participate in its on-going research programme in the area of neural network modelling of major brain processes, in collaboration with existing researchers in the Centre. Current work in the Centre is concerned with the investigation and development of neural network models of a number of brain regions and their interactions, including the hippocampus, septum, basal ganglia, prefrontal cortex, thalamus and sensory cortex. 
Specific investigations include learning and memory in spatial navigation tasks, streaming and grouping in auditory perception, and attentional set shifting in visual discrimination tasks. Over the next few years we wish to develop our links more strongly with leading experimental neuroscientists in the field in the UK, USA and Europe, and the Research Fellow will be expected to contribute mainly to this objective. He/she will be eligible, and encouraged, to apply for UK Research Council and other grants to support his/her research, and to take on research student supervision. Candidates for the Fellowship will be expected to hold, or be about to receive, a PhD degree either in the neural network modelling of brain processes, or in experimental neuroscience. If the latter, then he/she must have a strong interest in neural modelling. Candidates should also have, or expect to achieve within two years, significant journal or conference publications in their area. A good knowledge of fundamental neuroscience principles is necessary, and candidates must either have good computational modelling skills (we currently use Matlab and C/C++), or be willing to acquire these skills. In addition to carrying out his/her research programme, the Research Fellow will be expected to undertake some teaching duties on the MSc Computational Intelligence programme to which the Centre makes a major contribution. The salary for the post will be in the range of £18,724 to £22,471 per annum. Further details about the Fellowship can be obtained by telephoning Professor Mike Denham on (+44) (0) 1752 232547, or by e-mail to mdenham at plym.ac.uk. 
Dr Sue McCabe School of Computing University of Plymouth Drake Circus Plymouth PL4 8AA email: suem at soc.plym.ac.uk tel: 44 1752 232610 From HECKATHO at a.psc.edu Wed Jun 11 12:22:51 1997 From: HECKATHO at a.psc.edu (HECKATHO@a.psc.edu) Date: Wed, 11 Jun 1997 12:22:51 -0400 Subject: PSC Computational Neuroscience Workshop on the Mbone Message-ID: <970611122251.20222e9d@CPWSCA.PSC.EDU> The Pittsburgh Supercomputing Center will broadcast the "Simulations in Computational Neuroscience" workshop, June 12-14, 1997. We are inviting interested scientists to watch this lecture over the Mbone. Information regarding the reception of the workshop at your remote site can be found at: http://www.psc.edu/training/general/remote.html Agenda, lecture notes and further information at: http://www.psc.edu/biomed/workshops/wk-97/neuro/neural.html From oby at cs.tu-berlin.de Wed Jun 11 15:36:30 1997 From: oby at cs.tu-berlin.de (Klaus Obermayer) Date: Wed, 11 Jun 1997 21:36:30 +0200 (MET DST) Subject: No subject Message-ID: <199706111936.VAA11586@pollux.cs.tu-berlin.de> Dear Connectionists, below please find the announcement of a one-day workshop on computational neuroscience. Participation is free and everybody interested is welcome. Klaus Obermayer ----------------------------------------------------------------------- Berlin Workshop on Computational Neuroscience Time: Wednesday July 9, 1997; 9.15 am - 5.00 pm Location: Lise-Meitner Hoersaal, Institute for Biochemistry, Free University of Berlin, Thielallee 63-67, 14195 Berlin, Germany Sponsor: German Science Foundation, training grant ``Signalling Chains in Living Systems'' (Prof. R. Menzel, PI) 9.15 - 9.30 Opening Remarks 9.30 - 10.20 Dr. Peter Stern, Max Planck Institute for Brain Research, Frankfurt, Clones, channels and kinetics - towards a complete model of the NMDA receptor 10.30 - 11.20 Dr. 
Klaus Pawelzik, Max Planck Institute for Fluid Dynamics, Goettingen, A model for the development and function of long range connections in cortex 11.30 - 12.20 Prof. Dr. Henning Scheich, Institute for Neurobiology, Magdeburg, Functional organisation, plasticity, and language processing in auditory cortex 12.30 - 14.00 Lunch 14.00 - 14.50 Prof. Dr. Werner von Seelen, Institute for Neuroinformatics, University of Bochum, Neural fields for the navigation of robots and for the interpretation of cortical representations 15.00 - 15.50 Prof. Dr. Martin Egelhaaf, Department of Biology, University of Bielefeld, A look into the cockpit of a fly: Neural circuits and visual orientation behavior 16.00 - 16.50 Prof. Dr. Holk Cruse, Department of Biology, University of Bielefeld, Towards an understanding of the control of complex movements: Experiments and simulations 17.00 Adjourn ----------------------------------------------------------------------- For further information please contact: Prof. Klaus Obermayer phone: 49-30-314-73442 FR2-1, KON, Informatik 49-30-314-73120 Technische Universitaet Berlin fax: 49-30-314-73121 Franklinstrasse 28/29 e-mail: oby at cs.tu-berlin.de 10587 Berlin, Germany http://kon.cs.tu-berlin.de/ From yann at research.att.com Wed Jun 11 16:08:42 1997 From: yann at research.att.com (Yann LeCun) Date: Wed, 11 Jun 1997 16:08:42 -0400 Subject: AT&T is moving Message-ID: <199706112007.QAA18443@surfcity.research.att.com> Dear Colleagues: As you may have heard, AT&T split into three companies last year: AT&T, Lucent, and NCR. AT&T Bell Laboratories, home of quite a few NN and machine learning researchers, was split between AT&T and Lucent. The part that went with Lucent kept the name Bell Labs, while the part that stayed with AT&T took the name AT&T Labs-Research. 
Most former AT&T Bell Labs machine learning and neural net researchers are now with AT&T Labs-Research, and are moving from the Holmdel and Murray Hill locations (now owned by Lucent) to new AT&T Labs locations. The following people are moving to the location below: Name phone email Yoshua Bengio (732)345-3335 yoshua at research.att.com Leon Bottou (732)345-3336 leonb at research.att.com Eric Cosatto (732)345-3337 eric at research.att.com Hans-Peter Graf (732)345-3338 hpg at research.att.com Patrick Haffner (732)345-3339 haffner at research.att.com Yann LeCun (732)345-3333 yann at research.att.com Patrice Simard (732)345-3341 patrice at research.att.com Vladimir Vapnik (732)345-3342 vlad at research.att.com Shirley Bradford (732)345-3334 shirley at research.att.com (secretary) Image Processing Services Research Lab AT&T Labs - Research 100 Schulz Drive Red Bank, NJ 07701-7033 USA fax: (732)345-3031 web: http://www.research.att.com/ Other AT&T Researchers who have connections with the machine learning and connectionist communities, including John Denker, Harris Drucker, and Larry Jackel, are also moving to the above address. AT&T Labs NN/ML Researchers who were previously in Murray Hill, NJ, including Corinna Cortes, Al Gorin, Michael Kearns, Esther Levin, Fernando Pereira, Daryl Pregibon, Mazin Rahim, Rob Schapire, and Yoram Singer, have moved to the following location: AT&T Labs - Research 180 Park Avenue P.O. 
Box 971 Florham Park, NJ 07932-0971 (973)360-8000 ================================================================ Yann LeCun Image Processing Services Research AT&T Labs - Research 100 Schulz Drive Red Bank, NJ 07701-7033 tel: (732)345-3333 fax: (732)345-3031 yann at research.att.com http://www.research.att.com/info/yann From scheler at ICSI.Berkeley.EDU Wed Jun 11 20:33:29 1997 From: scheler at ICSI.Berkeley.EDU (Gabriele Scheler) Date: Wed, 11 Jun 1997 17:33:29 -0700 (PDT) Subject: Preprints available Message-ID: <199706120033.RAA05823@icsib77.ICSI.Berkeley.EDU> Dear connectionists, a number of preprints on neuronal and statistical models of linguistic functions are available from my homepage now. The URL is: http://www.informatik.tu-muenchen.de/~scheler/publications.html Gabriele Scheler Institut fuer Informatik TU Muenchen D 80290 Muenchen present address: ICSI 1947 Center Street Berkeley, Ca. 94704 I include titles, abstracts and bibliographic references for two recent papers: ------------------------------------------------------------------- Scheler, Gabriele and Kerstin Fischer: The many functions of discourse particles: A computational model. to appear in Proceedings of Cognitive Science 1997. ------------------------------------------------------------------- We present a connectionist model for the interpretation of discourse particles in real dialogues that is based on neuronal principles of categorization (categorical perception, prototype formation, contextual interpretation). It can be shown that discourse particles operate just like other morphological and lexical items with respect to interpretation processes. The description proposed locates discourse particles in an elaborate model of communication which incorporates many different aspects of the communicative situation. We therefore also attempt to explore the content of the category discourse particle. 
We present a detailed analysis of the meaning assignment problem and show that 80% - 90% correctness for unseen discourse particles can be reached with the feature analysis provided. Furthermore, we show that `analogical transfer' from one discourse particle to another is facilitated if prototypes are computed and used as the basis for generalization. We conclude that the interpretation processes which are a part of the human cognitive system are very similar with respect to different linguistic items. However, the analysis of discourse particles shows clearly that any explanatory theory of language needs to incorporate a theory of communication processes. ------------------------------------------------------------------- Scheler,G. Feature-based Perception of Semantic Concepts. to appear: Freksa(ed.) Computation and Cognition, Springer 1997. ------------------------------------------------------------------- In this paper we point to some principles of neural computation as they have been derived from experimental and theoretical studies primarily on vision. We argue that these principles are well suited to explain some characteristics of the linguistic function of semantic concept recognition. Computational models built on these principles have been applied to morphological-grammatical categories (aspect), function words (determiners) and discourse particles in spoken language. We suggest a few ways in which these studies may be extended to include more detail on neural functions into the computational model. From ESPAA at soc.plym.ac.uk Thu Jun 12 10:40:07 1997 From: ESPAA at soc.plym.ac.uk (espaa) Date: Thu, 12 Jun 1997 14:40:07 GMT Subject: Pattern Analysis and Applications Message-ID: Springer Verlag Ltd is launching a new journal - Pattern Analysis and Applications (PAA) - in Spring 1998. Aims and Scope of PAA: The journal publishes high quality articles in areas of fundamental research in pattern analysis and applications. 
It aims to provide a forum for original research which describes novel pattern analysis techniques and industrial applications of the current technology. The main aim of the journal is to publish high quality research in intelligent pattern analysis in computer science and engineering. In addition, the journal will also publish articles on pattern analysis applications in medical imaging. The journal solicits articles that detail new technology and methods for pattern recognition and analysis in applied domains including, but not limited to, computer vision and image processing, speech analysis, robotics, multimedia, document analysis, character recognition, knowledge engineering for pattern recognition, fractal analysis, and intelligent control. The journal publishes articles on the use of advanced pattern recognition and analysis methods including statistical techniques, neural networks, genetic algorithms, fuzzy pattern recognition, machine learning, and hardware implementations which are either relevant to the development of pattern analysis as a research area or detail novel pattern analysis applications. Papers proposing new classifier systems or their development, pattern analysis systems for real-time applications, fuzzy and temporal pattern recognition and uncertainty management in applied pattern recognition are particularly solicited. The journal encourages the submission of original case-studies on applied pattern recognition which describe important research in the area. The journal also solicits reviews on novel pattern analysis benchmarks, evaluation of pattern analysis tools, and important research activities at international centres of excellence working in pattern analysis. Audience: Researchers in computer science and engineering. Research and Development Personnel in industry. Researchers/ applications where pattern analysis is used, researchers in the area of novel pattern recognition and analysis techniques and their specific applications. 
Full information about the journal and detailed instructions for Call for Papers can be found at the PAA web site: http://www.soc.plym.ac.uk/soc/sameer/paa.htm Best regards Barbara Davies School of Computing University of Plymouth From kmb at pac.soton.ac.uk Fri Jun 13 04:05:11 1997 From: kmb at pac.soton.ac.uk (Kevin Bossley) Date: Fri, 13 Jun 1997 09:05:11 +0100 Subject: Neurofuzzy Modelling Thesis Available. Message-ID: <3.0.32.19970613090510.00990530@margaux.pac.soton.ac.uk> The following PhD. thesis is now available! =================================================================== "Neurofuzzy Modelling Approaches in System Identification" by Kevin Bossley =================================================================== This can be obtained from: http://www.isis.ecs.soton.ac.uk/pub/theses/kmb:97/thesis.html and for immediate information I have included the abstract below. Abstract System identification is the task of constructing representative models of processes and has become an invaluable tool in many different areas of science and engineering. Due to the inherent complexity of many real world systems the application of traditional techniques is limited. In such instances more sophisticated (so called intelligent) modelling approaches are required. Neurofuzzy modelling is one such technique, which by integrating the attributes of fuzzy systems and neural networks is ideally suited to system identification. This attractive paradigm combines the well established learning techniques of a particular form of neural network i.e.\ generalised linear models with the transparent knowledge representation of fuzzy systems, thus producing models which possess the ability to learn from real world observations and whose behaviour can be described naturally as a series of linguistic humanly understandable rules. Unfortunately, the application of these systems is limited to low dimensional problems for which good quality expert knowledge and data are available. 
The work described in this thesis addresses this fundamental problem with neurofuzzy modelling; as a result, algorithms which are less sensitive to the quality of the {\em a priori} knowledge and empirical data are developed. The true modelling capabilities of any strategy are heavily reliant on the model's structure, and hence an important (arguably the most important) task is structure identification. Also, due to the {\em curse of dimensionality}, in high dimensional problems the size of conventional neurofuzzy models grows prohibitively large. These issues are tackled by the development of automatic neurofuzzy model identification algorithms, which exploit the available expert knowledge and empirical data. To alleviate problems associated with the curse of dimensionality, aid model generalisation and enhance model transparency, parsimonious models are identified. This is achieved by the application of additive and multiplicative neurofuzzy models which exploit structural redundancies found in conventional systems. The developed construction algorithms successfully identify parsimonious models, but as a result of noisy and poorly distributed empirical data, these models can still generalise inadequately. This problem is addressed by the application of Bayesian inferencing techniques, a form of regularisation. Smooth model outputs are assumed and superfluous model parameters are controlled, sufficiently aiding model generalisation and transparency, and data interpolation and extrapolation. By exploiting the structural decomposition of the identified neurofuzzy models, an efficient local method of regularisation is developed. All the methods introduced in this thesis are illustrated on many different examples, including simulated time series, complex functional equations, and multi-dimensional dynamical systems.
For many of these problems conventional neurofuzzy modelling is unsuitable, and the developed techniques have extended the range of problems to which neurofuzzy modelling can successfully be applied. Best Regards Kevin Bossley --- Real Time Systems Group Parallel Applications Centre 2 Venture Road Chilworth Southampton SO16 7NP http://www.pac.soton.ac.uk/ Tel : +44 (0) 1703 760834 Fax : +44 (0) 1703 760833 From terry at salk.edu Fri Jun 13 12:30:32 1997 From: terry at salk.edu (Terry Sejnowski) Date: Fri, 13 Jun 1997 09:30:32 -0700 (PDT) Subject: NEURAL COMPUTATION 9:5 Message-ID: <199706131630.JAA17514@helmholtz.salk.edu> Neural Computation - Contents - Volume 9, Number 5 - July 1, 1997 Article Hybrid Learning of Mapping and its Jacobian in Multilayer Neural Networks Jeong-Woo Lee and Jun-Ho Oh Notes The Joint Development of Orientation and Ocular Dominance: Role of Constraints Christian Piepenbrock, Helge Ritter and Klaus Obermayer Physiological Gain Leads to High ISI Variability in a Simple Model of a Cortical Regular Spiking Cell Todd W. Troyer and Kenneth D. Miller Letters Role of Temporal Integration and Fluctuation Detection in the Highly Irregular Firing of a Leaky Integrator Neuron Model with Partial Reset Guido Bugmann, Chris Christodoulou and John G. Taylor Shunting Inhibition Does Not Have a Divisive Effect on Firing Rates Gary R. Holt and Christof Koch Reduction of the Hodgkin-Huxley Equations to a Single-Variable Threshold Model Werner M. Kistler, Wulfram Gerstner and J. Leo van Hemmen Noise Adaptation in Integrate-and-Fire Neurons Michael E. Rudd and Lawrence G. Brown Paradigmatic Working Memory (Attractor) Cell in IT Cortex Daniel J. Amit, Stefano Fusi, and Volodya Yakovlev Noise Injection: Theoretical Prospects Yves Grandvalet, Stephane Canu and Stephane Boucheron The Faulty Behavior of Feedforward Neural Networks with Hard-Limiting Activation Function Zhiyu Tian, Ting-Ting Y.
Lin, Shiyuan Yang and Shibai Tong Analysis of Dynamical Recognizers Alan D. Blair and Jordan B. Pollack A Bound on the Error of Cross Validation Using the Approximation and Estimation Rates, with Consequences for the Training-Test Split Michael Kearns Averaging Regularized Estimators Michiaki Taniguchi and Volker Tresp ----- ABSTRACTS - http://mitpress.mit.edu/journal-home.tcl?issn=08997667 SUBSCRIPTIONS - 1997 - VOLUME 9 - 8 ISSUES ______ $50 Student and Retired ______ $78 Individual ______ $250 Institution Add $28 for postage and handling outside USA (+7% GST for Canada). (Back issues from Volumes 1-8 are regularly available for $28 each to institutions and $14 each for individuals. Add $5 for postage per issue outside USA, +7% GST for Canada.) mitpress-orders at mit.edu MIT Press Journals, 55 Hayward Street, Cambridge, MA 02142. Tel: (617) 253-2889 FAX: (617) 258-6779 ----- From bishopc at helios.aston.ac.uk Sat Jun 14 09:03:06 1997 From: bishopc at helios.aston.ac.uk (Prof. Chris Bishop) Date: Sat, 14 Jun 1997 14:03:06 +0100 Subject: Paper on mixtures of probabilistic PCA Message-ID: <186.199706141303@sun.aston.ac.uk> Mixtures of Probabilistic Principal Component Analysers ======================================================= Michael E. Tipping and Christopher M. Bishop Neural Computing Research Group Dept. of Computer Science & Applied Mathematics Aston University, Birmingham B4 7ET Technical Report NCRG/97/003 Submitted to Neural Computation Abstract -------- Principal component analysis (PCA) is one of the most popular techniques for processing, compressing and visualising data, although its effectiveness is limited by its global linearity. While nonlinear variants of PCA have been proposed, an alternative paradigm is to capture data complexity by a combination of local linear PCA projections. However, conventional PCA does not correspond to a probability density, and so there is no unique way to combine PCA models.
Previous attempts to formulate mixture models for PCA have therefore to some extent been ad hoc. In this paper, PCA is formulated within a maximum-likelihood framework, based on a specific form of Gaussian latent variable model. This leads to a well-defined mixture model for probabilistic principal component analysers, whose parameters can be determined using an EM algorithm. We discuss the advantages of this model in the context of clustering, density modelling and local dimensionality reduction, and we demonstrate its application to image compression and handwritten digit recognition. Available as a postscript file: http://neural-server.aston.ac.uk/Papers/postscript/NCRG_97_003.ps.Z To access other publications by the Neural Computing Research Group, go to the group home page: http://www.ncrg.aston.ac.uk/ and click on `Publications' -- you can then obtain a list of all online NCRG publications, or search by author, title or abstract. From devries at peanut.sarnoff.com Mon Jun 16 15:39:19 1997 From: devries at peanut.sarnoff.com (Aalbert De Vries x2456) Date: Mon, 16 Jun 1997 15:39:19 -0400 Subject: NNSP97 workshop reminder Message-ID: <199706161939.PAA02240@fire.sarnoff.com> WORKSHOP REMINDER ----------------- 1997 IEEE Workshop on Neural Networks for Signal Processing 24-26 September 1997 Amelia Island Plantation, Florida ********************************************** * Advanced registration: before July 1, 1997 * ********************************************** Special Sessions: * BLIND SIGNAL PROCESSING * APPLICATIONS OF NEURAL NETWORKS TO BIOMEDICAL SIGNAL PROCESSING Invited Speakers: Dr. Simon Haykin, Chaos, Radar Clutter, and Neural Networks Dr. David Brown, Neural Networks for Medical Image Processing Dr. Yann LeCun, Neural Networks and Gradient-Based Learning in OCR Dr. Jean-Francois Cardoso, Blind Separation of Noisy Mixtures Dr. S.Y. Kung, Neural Networks for Intelligent Multimedia Processing Further Information Local Organizer Ms.
Sharon Bosarge Telephone: 352-392-2585 Fax: 352-392-0044 e-mail: sharon at ee1.ee.ufl.edu World Wide Web http://www.cnel.ufl.edu/nnsp97/ (The link to this page may be slow, so please be patient) Organization General Chairs Lee Giles (giles at research.nj.nec.com), NEC Research Nelson Morgan (morgan at icsi.berkeley.edu), UC Berkeley Proceeding Chair Elizabeth J. Wilson (bwilson at ed.ray.com), Raytheon Co. Publicity Chair Bert DeVries (bdevries at sarnoff.com), David Sarnoff Research Center Program Chair Jose Principe (principe at synapse.ee.ufl.edu), University of Florida BIO Session Chair Tulay Adali (adali at engr.umbc.edu), University of Maryland Baltimore County Program Committee Les ATLAS Charles BACHMANN Andrew BACK A. CONSTANTINIDES Federico GIROSI Lars Kai HANSEN Allen GORIN Yu-Hen HU Jenq-Neng HWANG Biing-Hwang JUANG Shigeru KATAGIRI Gary KUHN Sun-Yuan KUNG Richard LIPPMANN John MAKHOUL Elias MANOLAKOS Erkki OJA Tomaso POGGIO Tulay ADALI Volker TRESP John SORENSEN Takao WATANABE Raymond WATROUS Andreas WEIGEND Christian WELLEKENS About Amelia Island Plantation Amelia Island is in the extreme northeast Florida, across the St. Mary's river. The island is just 29 miles from Jacksonville International Airport, which is served by all major airlines. Amelia Island Plantation is a 1,250 acre resort/paradise that offers something for every traveler. The Plantation offers 33,000 square feet of workable meeting space and a staff dedicated to providing an efficient, yet relaxed atmosphere. The many amenities of the Plantation include 45 holes of championship golf, 23 Har-Tru tennis courts, modern fitness facilities, an award winning children's program, more than 7 miles of flora-filled bike and jogging trails, 21 swimming pools, diverse accommodations, exquisite dining opportunities, and of course, miles of glistening Atlantic beach front. 
From tirthank at mpce.mq.edu.au Mon Jun 16 23:23:12 1997 From: tirthank at mpce.mq.edu.au (Tirthankar Raychaudhuri) Date: Tue, 17 Jun 1997 13:23:12 +1000 (EST) Subject: PhD thesis available Message-ID: <199706170323.NAA02315@krakatoa.mpce.mq.edu.au> The following PhD thesis is now electronically available ftp://ftp.mpce.mq.edu.au/pub/comp/papers/raychaudhuri.phd_thesis.97.ps (2581 k) ftp://ftp.mpce.mq.edu.au/pub/comp/papers/raychaudhuri.phd_thesis.97.ps.Z (848 k) SEEKING THE VALUABLE DOMAIN - QUERY LEARNING IN A COST-OPTIMAL PERSPECTIVE by Tirthankar RayChaudhuri School of Mathematics, Physics, Computing and Electronics, Macquarie University, Sydney, NSW 2109 AUSTRALIA tirthank at mpce.mq.edu.au http://www.comp.mq.edu.au/~tirthank Abstract -------- Purpose This thesis is intended as a contribution to the theories and principles of active learning and dual control. It is an account of an investigation which essentially examines the problem of how a machine learning system can be designed so as to acquire useful data actively, i.e., with environment interaction and simultaneously perform cost-effectively. The primary aim of the research has been to evolve the general basis for developing an engineering solution for a self-designing system of the future. In other words, while developed theory and experimental investigations to substantiate the theory are presented in this dissertation, the learning system used herein is based on certain specific assumptions with a view to achieving a balance between extremely general theoretical abstraction and a practical method that addresses clearly-defined problem parameters. The outcome is a strategy of `learning while performing' - consisting of a set of algorithms directed to addressing a real world scenario such as an industrial manufacturing plant from which continuous data is available. Content The basic learning model used is the neural network function approximator trained by a supervised method. 
The overall theoretical framework is a system that is endowed with the capability of asking questions (querying) in order to obtain the most useful information from an environment such as a producing plant. Instead of modeling system behaviour with a state-space description, the emphasis is on significant input-output data gathered from a system with unknown internal behaviour (black-box). A locally-optimal or `greedy' strategy is used. Although a general theory for multidimensional systems has been developed, the experimental studies are concentrated within a finite boundary, two-dimensional, continuous data domain. This is primarily for the purpose of easy visualisation of results and computational efficiency. The basic principle propounded is that the notion of dollar value of a particular output ought to be incorporated within the querying criterion. In the process of developing such a querying criterion, a novel method of data subsampling/statistical jack-knifing is applied to Seung and Freund's query-by-committee philosophy. This technique is demonstrated to be highly effective in gathering both significant and adequate data samples for accurately modeling functions with sharp transition points and nonlinear functions. An extension of the strategy for addressing noisy functions leads to the derivation of an estimate of the distribution of the true label (output) for a particular input of the black-box. While such a distribution is closely related to the developed querying criterion for information gain, it is taken further ahead in the investigation and used in defining the expected value of a label as an exploitation objective - in a cost-benefit analysis perspective. A confidence interval on this expected value is then analytically derived. The size of the confidence interval determines the amount of exploration by the learner.
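The jack-knifed query-by-committee idea described in the abstract can be illustrated with a short sketch. This is not the thesis code; all names (qbc_select, fit_member) and the choice of polynomial committee members are illustrative assumptions. Each committee member is fitted on a jack-knife-style resample of the data, and the candidate input where the members' predictions disagree most (highest variance) is selected as the next query.

```python
# Minimal query-by-committee sketch with jack-knifed committees.
# Illustrative only -- not the algorithm from the thesis.
import numpy as np

rng = np.random.default_rng(0)

def fit_member(x, y, degree=3):
    # One committee member: a small polynomial regressor (a stand-in
    # for the neural network function approximator used in the thesis).
    return np.polyfit(x, y, degree)

def qbc_select(x, y, candidates, n_members=5):
    n = len(x)
    preds = []
    for _ in range(n_members):
        # Jackknife-style resample: fit on a random 80% subset.
        keep = rng.choice(n, size=int(0.8 * n), replace=False)
        coeffs = fit_member(x[keep], y[keep])
        preds.append(np.polyval(coeffs, candidates))
    # Query where the committee disagrees most.
    disagreement = np.var(np.stack(preds), axis=0)
    return candidates[np.argmax(disagreement)]

# Toy target with a sharp transition point, the kind of function the
# thesis reports this strategy handles well.
f = lambda u: np.where(u < 0.5, 0.0, 1.0)
x = rng.uniform(0.0, 1.0, 10)
y = f(x)
query = qbc_select(x, y, np.linspace(0.0, 1.0, 201))
```

In the thesis the querying criterion is further weighted by the dollar value of the output; the variance term above corresponds only to the information-gain component.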
Experimental studies with unimodal output value characteristics reveal differences in the performance of the proposed method upon monotonic and non-monotonic environments. Such performance is mostly based upon a long-term measure of yield experimentally computed within a particular finite querying boundary. Two overall approaches to examining the problem are studied - one uses the estimated distribution of a label to compute its expected value (called IMDV: indirectly modeled dollar value), the other estimates expected values by applying the jack-knifed committee method directly to dollar value labels and is called a DMDV (directly modeled dollar value) algorithm. The inherent weakness of the second approach, despite its obvious computational advantage over the other, is pointed out. It is then demonstrated that a theoretically more sound but very computationally expensive strategy of combining the strengths of the IMDV and DMDV approaches can result in a technique known as the combined modeling of dollar values (CMDV). Finally it is discussed how the theoretical premise of a dual controller is inherent in the querying philosophy proposed. This kind of dual control is observer-free and is contrasted with more conventional control methods. -------------------------------------------------------------------------- Dr Tirthankar RayChaudhuri Phone 61 -2 -850-9543 Department of Computing School of MPCE Fax 61 -2 -850-9551 Macquarie University E-mail tirthank at mpce.mq.edu.au Sydney,NSW 2109 WWW http://www-comp.mpce.mq.edu.au/~tirthank From dwang at cis.ohio-state.edu Wed Jun 18 15:02:14 1997 From: dwang at cis.ohio-state.edu (DeLiang Wang) Date: Wed, 18 Jun 1997 15:02:14 -0400 (EDT) Subject: Tech rep on medical image segmentation Message-ID: <199706181902.PAA27781@shirt.cis.ohio-state.edu> The following technical report is available on medical image segmentation. 
Comments and suggestions are welcome. __________________________________________________________________________ -------------------------------------------------------------------------- Segmentation of Medical Images Using LEGION Technical Report: OSU-CISRC-4/97-TR26, 1997 Naeem Shareef, DeLiang L. Wang and Roni Yagel Department of Computer and Information Science The Ohio State University Columbus, Ohio 43210, USA ABSTRACT Advances in computer and visualization technology allow clinicians to virtually interact with anatomical structures contained within sampled medical image datasets. A hindrance to the effective use of this technology is the problem of image segmentation. In this paper, we utilize a recently proposed oscillator network called LEGION (Locally Excitatory Globally Inhibitory Oscillator Network), whose ability to achieve fast synchrony with local excitation and desynchrony with global inhibition makes it an effective computational framework for grouping similar features and segregating dissimilar ones in an image. We describe an image segmentation algorithm derived from LEGION and introduce an adaptive scheme to choose tolerances. We show results of the algorithm applied to 2D and 3D (volume) CT and MRI medical image datasets, and compare our results with manual segmentation. LEGION's computational and architectural properties make it a promising approach for real-time medical image segmentation using multiple features.
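LEGION itself is a network of coupled relaxation oscillators, and simulating its dynamics is beyond a short sketch. The toy below captures only the grouping criterion the report builds on: neighbouring pixels join the same segment when their values fall within a tolerance of each other (the role played by local excitation), while distinct segments stay apart. All names here are illustrative, not from the report.

```python
# Toy tolerance-based region grouping -- a functional stand-in for the
# grouping effect of LEGION, NOT the oscillator network itself.
from collections import deque

def segment(image, tol):
    h, w = len(image), len(image[0])
    labels = [[0] * w for _ in range(h)]
    next_label = 0
    for si in range(h):
        for sj in range(w):
            if labels[si][sj]:
                continue
            # Start a new segment from an unlabelled seed pixel.
            next_label += 1
            labels[si][sj] = next_label
            queue = deque([(si, sj)])
            while queue:  # grow the segment through the 4-neighbourhood
                i, j = queue.popleft()
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if (0 <= ni < h and 0 <= nj < w and not labels[ni][nj]
                            and abs(image[ni][nj] - image[i][j]) <= tol):
                        labels[ni][nj] = next_label
                        queue.append((ni, nj))
    return labels

# Two intensity regions: a dark left block and a bright right column.
img = [[10, 11, 50],
       [10, 12, 52],
       [11, 11, 51]]
labs = segment(img, tol=3)
```

The report's adaptive scheme chooses the tolerance automatically per image; here it is a fixed parameter.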
A Postscript version of the report is available via ftp at FTP-host : ftp.cis.ohio-state.edu FTP-pathname: pub/tech-report/1997/TR26-DIR/ OR an HTML version can be found at HTML version: http://www.cis.ohio-state.edu/~shareef/OSUTR26/journal-1.html __________________________________________________________________________ -------------------------------------------------------------------------- From ESPAA at soc.plym.ac.uk Thu Jun 19 13:04:07 1997 From: ESPAA at soc.plym.ac.uk (espaa) Date: Thu, 19 Jun 1997 17:04:07 GMT Subject: PATTERN ANALYSIS AND APPLICATIONS JOURNAL Message-ID: C A L L F O R P A P E R S PATTERN ANALYSIS AND APPLICATIONS http://www.soc.plym.ac.uk/soc/sameer/paa.htm (SPRINGER VERLAG LIMITED) Springer-Verlag is starting the above journal in Spring 1998. The journal now invites papers on a number of topics in pattern analysis and applications. The Call for Papers may be printed out from the journal web site where paper submission instructions are also available. Attached below is the aims and scope of the journal. -------------- AIMS AND SCOPE The journal publishes high quality articles in areas of fundamental research in pattern analysis and applications. It aims to provide a forum for original research which describes novel pattern analysis techniques and industrial applications of the current technology. The main aim of the journal is to publish high quality research in intelligent pattern analysis in computer science and engineering. In addition, the journal will also publish articles on pattern analysis applications in medical imaging. The journal solicits articles that detail new technology and methods for pattern recognition and analysis in applied domains including, but not limited to, computer vision and image processing, speech analysis, robotics, multimedia, document analysis, character recognition, knowledge engineering for pattern recognition, fractal analysis, and intelligent control. 
The journal publishes articles on the use of advanced pattern recognition and analysis methods including statistical techniques, neural networks, genetic algorithms, fuzzy pattern recognition, machine learning, and hardware implementations which are either relevant to the development of pattern analysis as a research area or detail novel pattern analysis applications. Papers proposing new classifier systems or their development, pattern analysis systems for real-time applications, fuzzy and temporal pattern recognition and uncertainty management in applied pattern recognition are particularly solicited. The journal encourages the submission of original case-studies on applied pattern recognition which describe important research in the area. The journal also solicits reviews on novel pattern analysis benchmarks, evaluation of pattern analysis tools, and important research activities at international centres of excellence working in pattern analysis. AUDIENCE --------- Researchers in computer science and engineering. Research and Development Personnel in industry. Researchers/ applications where pattern analysis is used, researchers in the area of novel pattern recognition and analysis techniques and their specific applications. From Yves.Moreau at esat.kuleuven.ac.be Fri Jun 20 10:16:21 1997 From: Yves.Moreau at esat.kuleuven.ac.be (Yves Moreau) Date: Fri, 20 Jun 1997 16:16:21 +0200 Subject: TR: Dynamics of recurrent networks with and without hidden layer Message-ID: <33AA90B5.164E@esat.kuleuven.ac.be> The following technical report is available via ftp or the World Wide Web: WHEN DO DYNAMICAL NEURAL NETWORKS WITH AND WITHOUT HIDDEN LAYER HAVE IDENTICAL BEHAVIOR? 
Yves Moreau and Joos Vandewalle K.U.Leuven ESAT-SISTA K.U.Leuven, Elektrotechniek-ESAT, Technical report ESAT-SISTA TR97-51 To get it from the World Wide Web, point your browser to: ftp://ftp.esat.kuleuven.ac.be/pub/SISTA/moreau/reports/rnn_equivalence_tr97-51.ps To get it via FTP: ftp ftp.esat.kuleuven.ac.be cd pub/SISTA/moreau/reports get rnn_equivalence_tr97-51.ps ABSTRACT ======== We present conditions for the input-output equivalence between dynamical neural networks with a hidden layer and dynamical neural networks without hidden layer. We achieve this result by proving a general result on transdimensional changes of coordinates for dynamical neural networks with a hidden layer. This result is of interest because dynamical neural networks without hidden layer are more amenable to analytical studies. Keywords: dynamical neural network, input-output equivalence, transdimensional change of coordinates -------------------- Yves Moreau Department of Electrical Engineering Katholieke Universiteit Leuven Leuven, Belgium email: moreau at esat.kuleuven.ac.be homepage: http://www.esat.kuleuven.ac.be/~moreau publications: http://www.esat.kuleuven.ac.be/~moreau/publication_list.html From suem at soc.plym.ac.uk Fri Jun 20 11:29:44 1997 From: suem at soc.plym.ac.uk (Sue McCabe) Date: Fri, 20 Jun 1997 16:29:44 +0100 Subject: Lectureship/Senior Lectureship in Artificial Intelligence Message-ID: <1.5.4.32.19970620152944.00b649b4@soc.plym.ac.uk> Lectureship/Senior Lectureship in Artificial Intelligence School of Computing, University of Plymouth, UK The School of Computing invites applications for a lectureship in the field of Artificial Intelligence, from persons with a specialist interest in the application of computational intelligence techniques (neural computing, evolutionary computing) in one or more of the following areas: multimedia, agent-based systems, and financial systems.
The successful candidate will be expected to develop new courses at both undergraduate and postgraduate level in one or more of the above topics, and to carry out research, including initiating new projects, in collaboration with existing research groups in the School, in particular the Centre for Neural and Adaptive Systems (http://www.tech.plym.ac.uk/soc/research/neural/index.html). Candidates must as a minimum requirement possess, or be about to obtain, a PhD in a related area. Relevant postdoctoral experience and an established research publication record is desirable, but not essential. Applications from more senior academics/researchers are also invited and this is reflected in the salary range for the post, which is £13,480 - £32,966 pa. (Lecturer/Senior Lecturer/Principal Lecturer/Reader). Appointment will be at an appropriate level to reflect qualifications and experience. For further information on the post and on how to make formal application, please initially contact Professor Mike Denham by telephoning (+44) (0) 1752 232547, or by e-mail to mdenham at plym.ac.uk. Dr Sue McCabe School of Computing University of Plymouth Drake Circus Plymouth PL4 8AA email: suem at soc.plym.ac.uk tel: 44 1752 232610 From lbl at nagoya.bmc.riken.go.jp Tue Jun 24 04:53:37 1997 From: lbl at nagoya.bmc.riken.go.jp (Bao-Liang Lu) Date: Tue, 24 Jun 1997 17:53:37 +0900 Subject: Paper available: Task Decomposition and Modular Neural Net Message-ID: <9706240853.AA27776@xian> The following paper, which appeared in Lecture Notes in Computer Science, vol. 1240, 1997, Springer, is available via anonymous FTP.
(This work was presented at International Work-Conference on Artificial and Natural Neural Networks (IWANN'97), 4-6 June 1997, Lanzarote, Canary Islands, Spain) FTP-host:ftp.bmc.riken.go.jp FTP-file:pub/publish/Lu/lu-iwann97.ps.gz ========================================================================== TITLE: Task Decomposition Based on Class Relations: A Modular Neural Network Architecture for Pattern Classification AUTHORS: Bao-Liang Lu Masami Ito ORGANISATIONS: Bio-Mimetic Control Research Center, The Institute of Physical and Chemical Research (RIKEN) ABSTRACT: In this paper, we propose a new methodology for decomposing pattern classification problems based on the class relations among training data. We also propose two combination principles for integrating individual modules to solve the original problem. By using the decomposition methodology, we can divide a $K$-class classification problem into ${K\choose 2}$ relatively smaller two-class classification problems. If the two-class problems are still hard to learn, we can further break them down into a set of smaller and simpler two-class problems. Each of the two-class problems can be learned by a modular network independently. After learning, we can easily integrate all of the modules according to the combination principles to get the solution of the original problem.
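The decomposition step can be sketched briefly. Note the sketch below is illustrative only: the paper's own combination principles for integrating modules differ (and its modules are neural networks); here a nearest-mean discriminant stands in for each module and plain pairwise voting stands in for the combination, just to show how a $K$-class problem becomes ${K\choose 2}$ two-class problems each trained on only the two classes involved.

```python
# Pairwise (K choose 2) decomposition of a K-class problem.
# Stand-in modules and voting -- not the paper's min/max combination.
from itertools import combinations
from collections import Counter
import numpy as np

def train_pairwise(X, y):
    classifiers = {}
    for a, b in combinations(sorted(set(y)), 2):  # one module per class pair
        # Each "module" sees only the data of classes a and b.
        mu_a, mu_b = X[y == a].mean(axis=0), X[y == b].mean(axis=0)
        classifiers[(a, b)] = (mu_a, mu_b)
    return classifiers

def predict(classifiers, x):
    # Combine the pairwise modules by majority vote over class pairs.
    votes = Counter()
    for (a, b), (mu_a, mu_b) in classifiers.items():
        closer_a = np.linalg.norm(x - mu_a) < np.linalg.norm(x - mu_b)
        votes[a if closer_a else b] += 1
    return votes.most_common(1)[0][0]

# Three well-separated classes: 3-class problem -> 3 two-class modules.
X = np.array([[0., 0.], [0., 1.], [5., 5.], [5., 6.], [10., 0.], [10., 1.]])
y = np.array([0, 0, 1, 1, 2, 2])
clfs = train_pairwise(X, y)
```

Because each module is trained on a strict subset of the classes, the two-class problems can be learned independently and in parallel, which is the point of the decomposition.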
Consequently, a $K$-class classification problem can be solved effortlessly by learning a set of smaller and simpler two-class classification problems in parallel (10 pages ) Bao-Liang Lu ====================================== Bio-Mimetic Control Research Center The Institute of Physical and Chemical Research (RIKEN) Anagahora, Shimoshidami, Moriyama-ku Nagoya 463, Japan Tel: +81-52-736-5870 Fax: +81-52-736-5871 Email: lbl at bmc.riken.go.jp From bs at corn.mpik-tueb.mpg.de Tue Jun 24 07:31:37 1997 From: bs at corn.mpik-tueb.mpg.de (Bernhard Schoelkopf) Date: Tue, 24 Jun 1997 13:31:37 +0200 Subject: Support Vector Workshop Message-ID: <199706241131.NAA08477@mpik-tueb.mpg.de> ________________________________________________________________ Call for Contributions NIPS'97 Workshop on Support Vector Learning Machines ________________________________________________________________ The Support Vector (SV) learning algorithm (Boser, Guyon, Vapnik, 1992; Cortes, Vapnik, 1995; Vapnik, 1995) provides a general method for solving Pattern Recognition, Regression Estimation and Operator Inversion problems. The method is based on results in the theory of learning with finite sample sizes. The last few years have witnessed an increasing interest in SV machines, due largely to excellent results in pattern recognition, regression estimation and time series prediction experiments. The purpose of this workshop is (1) to provide an overview of recent developments in SV machines, ranging from theoretical results to applications, (2) to explore connections with other methods, and (3) to identify weaknesses, strengths and directions for future research for SVMs. We invite contributions on SV machines and related approaches, looking for empirical support wherever possible. Topics of interest to the workshop include: SV Applications Benchmarks SV Optimization and implementation issues Theory of generalization and regularization Learning methods based on Hilbert-Schmidt kernels (e.g. 
kernel PCA) Links to related methods and concepts (e.g. boosting, fat shattering) Representation of functions in SV machines (e.g. splines, anova) The workshop will be held in Breckenridge, Colorado, on December 5 or 6, 1997. We are in the process of putting together a tentative schedule. If you are interested in contributing, please contact us (mailto:smola at first.gmd.de, or faxto:+49-30-6392-1805, A. Smola). Submission of papers is not required for the workshop. We would, however, appreciate a brief description of your envisaged talk. As one of the workshop's foci will be discussions, presentation of recent and/or controversial work is encouraged. A workshop home page has been set up at http://svm.first.gmd.de. Additional information on SV research can be found at http://svm.research.bell-labs.com/ (Bell Labs SV services) and http://www.mpik-tueb.mpg.de/people/personal/bs/svm.html (annotated SV bibliography). Organizers: Leon Bottou (AT&T Research, leonb at research.att.com) Chris Burges (Bell Labs, Lucent Technologies, burges at bell-labs.com) Bernhard Schoelkopf (Max Planck Institute at Tuebingen, bs at mpik-tueb.mpg.de) Alex Smola (Technical University/GMD Berlin, smola at first.gmd.de) -- bernhard schoelkopf mailto:bs at mpik-tueb.mpg.de max-planck-institut fuer biologische kybernetik spemannstr.38, 72076 tuebingen, germany phone +49 7071 601-609, fax -616 http://www.mpik-tueb.mpg.de/people/personal/bs/bs.html From Leslie.Smith at ee.ed.ac.uk Tue Jun 24 09:24:07 1997 From: Leslie.Smith at ee.ed.ac.uk (Leslie S Smith) Date: Tue, 24 Jun 1997 14:24:07 +0100 (BST) Subject: Call for Participation: 1st European Workshop on Neuromorphic Systems Message-ID: <199706241324.OAA24298@forbes.ee.ed.ac.uk> Call for Participation. (apologies if you receive this more than once) 1st European Workshop on Neuromorphic Systems (EWNS1) 29-31 August 1997. University of Stirling, Stirling, Scotland. 
www page: http://www.cs.stir.ac.uk/~lss/Neuromorphic/Info1.html Organisers: Dept of Computing Science, University of Stirling Dept of Electrical Engineering, University of Edinburgh Neuromorphic systems are implementations in silicon of sensory and neural systems whose architecture and design are based on neurobiology. This growing area proffers exciting possibilities such as sensory systems which can compete with human senses and pattern recognition systems that can run in real-time. The area is at the intersection of many disciplines: neurophysiology, computer science and electrical engineering. The purpose of this meeting is to bring together active researchers who can discuss the problems of this developing technology. The meeting is intended to bring together researchers who might not normally meet, for example, engineers who want to implement systems based on neurobiology, and neurobiologists who want to produce engineering implementations of systems. The University is set on one of the most beautiful campuses in Europe, and is centrally located in Scotland. It is easily reached by plane (to Glasgow or Edinburgh, both 1 hour away), train, bus, or car. We are grateful to the Gatsby Foundation for their generous support of this workshop. Meeting format: The meeting will be single-track so that all attendees can attend every session. Accepted papers will be presented (presentations of about 30 minutes plus time for discussion), and there will be paper sessions and time for discussion each day. Programme The invited speakers are: Professor Rodney Douglas, Institute of Neuroinformatics, ETHZ, University of Zurich, Zurich, Switzerland. Professor Ralph Etienne-Cummings, Dept of Electrical Engineering, Southern Illinois University, USA. The provisional programme is to be found at http://www.cs.stir.ac.uk/~lss/Neuromorphic/ProvProg.html May I extend my thanks to all who submitted papers.
Book of the Conference: We have now agreed with World Scientific to produce a book from the conference. We plan to ask speakers to produce camera-ready copy soon after the meeting. The provisional title for the book is Neuromorphic Systems: Engineering Silicon from Neurobiology. Organising Committee: Leslie S. Smith, Department of Computing Science and Mathematics, University of Stirling. Alister Hamilton, Department of Electrical Engineering, University of Edinburgh. Program Committee Jim Austin(Department of Computer Science, University of York, UK) Guy Brown (Department of Computing Science, Univ of Sheffield, UK) Rodney Douglas (ETHZ, University of Zurich, Zurich, Switzerland) Wulfram Gerstner (EPFL Lausanne, Switzerland) Alister Hamilton (Department of Electrical Engineering, University of Edinburgh, Scotland) John Lazzaro (CS Division, University of California at Berkeley) Wolfgang Maass (Institute for Theoretical Computer Science, T.U. Graz, Austria) Alan Murray (Dept of Electrical Engineering, Univ of Edinburgh, Scotland) Leslie Smith (Department of Computing Science and Mathematics, University of Stirling, Scotland) Eric Vittoz (EPFL, Lausanne) Barbara Webb (Dept of Psychology, Univ of Nottingham, UK) Misha Mahowald (ETHZ, Univ of Zurich, Switzerland) had agreed to be on the committee before her untimely death. Registration Registration (includes lunches) UK Pounds 100 Student Registration (includes lunches) UK Pounds 25 Conference Dinner UK Pounds 25 Accommodation: Bed and Breakfast per night UK Pounds 20 The accommodation above is in nearby student residences. There is plenty of higher quality accommodation available: contact Stirling tourist office (+44) 1786 475019. Money is available to support a limited number of UK Ph.D. students who wish to attend the conference. Please contact Leslie Smith (lss at cs.stir.ac.uk) for further details. 
Registration Form: If you would like to register, please print and fill out the form at http://www.cs.stir.ac.uk/~lss/Neuromorphic/RegForm.html. Leslie Smith, Department of Computing Science, University of Stirling, Stirling FK9 4LA, Scotland lss at cs.stir.ac.uk Fax: (44) 1786 464551 Phone (44) 1786 467435 From D.Mareschal at exeter.ac.uk Wed Jun 25 07:41:30 1997 From: D.Mareschal at exeter.ac.uk (Denis Mareschal) Date: Wed, 25 Jun 1997 12:41:30 +0100 Subject: Lectureship in Cognitive Psychology Message-ID: Readers of this list with interests in language, development, and other aspects of cognitive psychology may be interested in the following job announcement. cheers, Denis UNIVERSITY OF EXETER: DEPARTMENT OF PSYCHOLOGY LECTURESHIP IN PSYCHOLOGY Applications are invited for a permanent lectureship which will be available from 1st October 1997, though a later start date (January or April 1998) can be arranged, as the result of the early retirement of Dr Robert Brown. The department is seeking people with strong records or great potential who will strengthen our existing research group in cognitive psychology. We are particularly keen that new lecturers should engage in collaborative research with existing staff and develop links between separate areas. Preference will be given to those whose interests are in one or more of: (a) language, particularly including psycholinguistics, cognitive neuropsychology or developmental approaches. (b) neural network modelling (connectionism), particularly its application to cognitive development, language or vision (c) animal cognition The department has a high reputation for its research, having obtained ratings of 5, 4 and 4 in the last three research assessment exercises, and has a good position within the University, having been recently recommended for School status and singled out as a growth area.
The research of the department is organised in three research groups: Cognitive Psychology and Cognitive Science; Social and Economic Psychology; Health and Clinical Psychology. Further details of the Cognitive Psychology and Cognitive Science research group's activities can be found at: http://www.exeter.ac.uk/~nejones/pg/cogpage.htm Teaching duties for the new lecturer will be arranged by the Head of Department and will obviously depend upon the interests of those appointed. It should be noted that in the first two years the new lecturer will have a substantially lighter-than-average teaching and administrative load. The Department enjoys excellent accommodation in an exceptionally attractive location, with a full range of technical facilities and support, including a first-rate audio-visual studio, a recently refurbished animal laboratory and state-of-the-art computing facilities for teaching and research. These include a suite of Macintoshes for undergraduate use, a dedicated suite of Macs and PCs for postgraduate use and a new connectionist/computational modelling laboratory consisting of Unix workstations. The University's 'mainframe' provision is Unix based. All staff have a PC for their own use and the new appointee will be given an allocation to buy new journals and books and an initial equipment budget in order to establish a firm base for their research. In addition to external and University research grants, the Department operates its own internal scheme to offer small 'pump-priming' support for new research initiatives. Further details about the department can be found at our world wide web site http://www.exeter.ac.uk/Psychology/ Informal enquiries (and visits) are encouraged and potential applicants should contact Dr Paul Webley, Head of Department, Department of Psychology, University of Exeter, Exeter EX4 4QG. 
Telephone 01392 264600 or email P.Webley at exeter.ac.uk Further information and application forms are available from Personnel, University of Exeter, Exeter EX4 4QJ, 01392-263100 (answer phone) or email Personnel at exeter.ac.uk, quoting ref. no. 4147. The closing date is Friday 29th August 1997 and it is hoped to carry out interviews during the week beginning September 22nd. The salary will be on the Lecturer A scale, 16,045 - 21,016 pounds p.a. -------------------------------------------------------------- Denis Mareschal Department of Psychology, Washington Singer Laboratories, Exeter University, Perry Road, Exeter, EX4 4QG, UK. Tel: +44 1392 264596, Fax: +44 1392 264623 WWW: http://www.ex.ac.uk/Psychology/staff/dmaresch.htm ============================================================== From aprieto at goliat.ugr.es Wed Jun 25 04:37:10 1997 From: aprieto at goliat.ugr.es (Alberto Prieto) Date: Wed, 25 Jun 1997 10:37:10 +0200 Subject: Summer Course Message-ID: <1.5.4.32.19970625083710.007206ec@goliat.ugr.es> SOME TOPICS RELATED TO INTELLIGENT SYSTEMS http://atc.ugr.es/cursos/cm.html Course organized in cooperation with the Spanish RIG of IEEE Neural Network Council. Almuñécar (Granada, Spain) 15-19 September 1997 CENTRO MEDITERRANEO DE LA UNIVERSIDAD DE GRANADA XIII Cursos Internacionales 1997 TUTORIALS A."Introduction. Computational models for intelligent systems" (2 hours) Prof. Dr. Alberto Prieto, Catedrático de Arquitectura y Tecnología de Computadores, Dto. de Electronica y Tecnología de Computadores. Univ. of Granada, Spain B."Machine learning" (4 hours) Prof. Dr. Marie Cottrell, Professor, U.F.R. de Mathematiques et d'Informatique. Universite de Paris I - Pantheon - Sorbonne, C."Complex Systems" (4 hours) Dr. Chris Langton, Santa Fe Inst., Los Alamos Nat.Lab., USA D."Evolutionary Computation: Genetic Algorithms, Applications of Genetic Algorithms, Classifier Systems, and Applications of Classifier Systems" (4 hours) Prof. Dr.
Terry Fogarty, Professor of Computing, Napier Univ., Scotland E."Unification of neuro-fuzzy systems" (4 hours) Prof. Dr. Leonardo Reyneri, Prof. Dipartimento di Elettronica. Politecnico di Torino, Italy F."Evolvable Hardware" (4 hours) Pierre Marchal. Centre Suisse d'Electronique et de Microtechnique S.A. (CSEM). Neuchâtel, Switzerland G."Microelectronics of Intelligent Systems" (3 hours) Prof. Dr.-Ing. Karl Goser, Lehrstuhl für Bauelemente der Elektrotechnik. Univ. Dortmund, Germany ORGANIZATION COMMITTEE Course Directors: Karl GOSER and Alberto PRIETO Coordinator: Juan Julian MERELO (jmerelo at kal-el.ugr.es) More information: http://atc.ugr.es/cursos/cm.html Prof. Dr. Alberto PRIETO Departamento de Electronica y Tecnologia de Computadores Facultad de Ciencias - E.T.S. Ingenieria Informatica Campus Universitario de Fuentenueva Universidad de Granada 18071 GRANADA (Spain) Phone: 34-58- 24 32 26 Fax: 34-58- 24 32 30 E-mail: aprieto at ugr.es http://atc.ugr.es/~aprieto/aprieto.html From S.W.Ellacott at bton.ac.uk Thu Jun 26 06:36:29 1997 From: S.W.Ellacott at bton.ac.uk (S.W.Ellacott@bton.ac.uk) Date: Thu, 26 Jun 1997 11:36:29 +0100 Subject: Two books Message-ID: Dear connectionists I hope this is within the spirit of the list and does not constitute just advertising! If it is, I'm sure the moderators will inform me:-) I would like to draw your attention to two recent books. The first is Mathematics of Neural Networks: Models, Algorithms and Applications Eds. ELLACOTT, S.W., MASON, J.C. and ANDERSON, I.J. Boston: Kluwer Academic Press 1997. ISBN 0-7923-9933-1 This is the proceedings of the 1995 MANNA conference now published in book form. There are invited contributions from Allinson, Amari, Cybenko, Grossberg, Hirsch, and Taylor, together with 63 submitted contributions from mathematicians working in the field, including many prominent researchers. Overall the book provides an essential snapshot of mathematical research in neural networks.
The second book has been out about a year, but not reported here before. Neural Networks: Deterministic Methods of Analysis ELLACOTT, S.W. and BOSE, D Thomson International Publishers 1996 This is a comprehensive expository text at final-year undergraduate level for mathematicians, or at postgraduate level for other disciplines. It collects together the mathematical fundamentals and relevant results from linear algebra, optimization, dynamical systems and approximation theory as applied to neural networks. Steve Ellacott -- Steve Ellacott, School of Computing and Mathematical Sciences, University of Brighton, Moulsecoomb, BN2 4GJ, UK Tel: Home (01273) 885845 Office: (01273) 642544 or 642414 Fax: Home (01273) 270183 Office: (01273) 642405 WWW: http://www.it.brighton.ac.uk/staff/swe From mmisra at adastra.Mines.EDU Thu Jun 26 20:34:36 1997 From: mmisra at adastra.Mines.EDU (Manavendra Misra) Date: Thu, 26 Jun 1997 18:34:36 -0600 Subject: Paper on Parallel Neural Network implementations in NCS Message-ID: <9706261834.ZM24447@adastra.Mines.EDU> Forwarded from Arun Jagota: Dear Connectionists: The following refereed and revised paper is now accessible from the Web site below. M. Misra, Parallel Environments for Implementing Neural Networks, Neural Computing Surveys vol 1, 48-60, 1997, 113 references http://www.icsi.berkeley.edu/~jagota/NCS Send additional references and comments to the author at mmisra at mines.edu Abstract: As artificial neural networks (ANNs) gain popularity in a variety of application domains, it is critical that these models run fast and generate results in real time. Although a number of implementations of neural networks are available on sequential machines, most of these implementations require an inordinate amount of time to train or run ANNs, especially when the ANN models are large. One approach for speeding up the implementation of ANNs is to implement them on parallel machines.
This paper surveys the area of parallel environments for the implementation of ANNs, and prescribes desired characteristics to look for in such implementations. ------------------------------------------------------------------------ Regards, Arun From mitra at cco.caltech.edu Thu Jun 26 18:32:34 1997 From: mitra at cco.caltech.edu (Partha Mitra) Date: Thu, 26 Jun 1997 15:32:34 -0700 Subject: Workgroup on Analysis of Neurobiological data Message-ID: <33B2EE02.7C7010BF@cco.caltech.edu> Analysis of Neural Data Modern methods and open issues in the analysis and interpretation of multi-variate time-series and imaging data in the neurosciences 18 August - 31 August 1997 Marine Biological Laboratories - Woods Hole, MA A working group of scientists committed to quantitative approaches to problems in neuroscience will focus their efforts on experimental and theoretical issues related to the analysis of large, single- and multi-channel data sets. Our motivation for the work group is based on issues that arise in two complementary areas critical to an understanding of brain function. The first involves advanced signal processing methods that are relevant to neuroscience, particularly those appropriate for emerging multi-site recording techniques and noninvasive imaging techniques. The second involves the development of a calculus to study the dynamical behavior of nervous systems and the computations they perform. A distinguishing feature of the work group will be the close collaboration between experimentalists and theorists, particularly with regard to the analysis of data and the planning of experiments. The work group will have a limited number of research lectures, supplemented by tutorials on relevant computational, experimental, and mathematical techniques. This work group is a means to critically evaluate techniques for the processing of multi-channel data, of which imaging forms an important category.
Such techniques are of fundamental importance for basic research and medical diagnostics. We will establish a repository of these techniques, along with benchmarks, to ensure the rapid dissemination of modern analytical techniques throughout the neuroscience community. The work group will convene on a yearly basis. For 1997, we propose to focus on topics that fall under the rubric of multivariate time-series, including analysis of point processes, e.g., spike trains, measures of correlation and variability and their interpretation in terms of underlying process models; analysis of continuous processes, e.g., field potential, optical imaging, fMRI, and MEG, and the recording of behavioral output, e.g., vocalizations; and problems that involve both point and continuous processes, e.g., spike sorting, and the relations between spike trains and sensory input and motor output. Participants: We propose to have 25 participants, both experimentalists and theorists. Experimentalists are specifically encouraged to bring data records to the work group for analysis and discussion. Appropriate computational facilities will be provided. The work group will further take advantage of interested investigators and course faculty concurrently present at the MBL. We encourage graduate students and postdoctoral fellows as well as senior researchers to apply. Participant Fee: $200. Support: National Institutes of Health - NIMH, NIA, NIAAA, NICHD/NCRR, NIDCD, NIDA, and NINDS. Organizers: David Kleinfeld (UCSD) and Partha P. Mitra (Caltech and Bell Laboratories). Application: Potential participants should send a copy of their curriculum vitae, together with a cover letter that contains a brief (ca. 200 word) paragraph on why they wish to attend the work group and a justified request for any financial aid, to Ms.
Jean Ainge Room 1D-467, Bell Laboratories, Lucent Technologies 700 Mountain Avenue Murray Hill, NJ 07974 908-582-4702 or Graduate students and postdoctoral fellows are encouraged to include a brief letter of support from their research advisor. Financial assistance: Assistance for travel, accommodations, and board is available based on need. Applications must be received by 3 July 1997; participants will be notified by 11 July. Additional information may be found at http://www-physics.ucsd.edu/research/neurodata (one level above this announcement). The MBL is an EEO AAI. From smagt at dlr.de Fri Jun 27 04:50:54 1997 From: smagt at dlr.de (Patrick van der Smagt) Date: Fri, 27 Jun 1997 10:50:54 +0200 Subject: NIPS*97 Workshop: robots & cerebellum Message-ID: <33B37EEE.925@dlr.de> NIPS'97 Postconference Workshop =============================== Can Artificial Cerebellar Models Compete to Control Robots? http://www.op.dlr.de/FF-DR-RS/CONFERENCES/nips-workshop/ Objective: ---------- Recent successes in robotics have broadened the field of application and acceptance of robots. Nevertheless, industrial robotics still has to go a long way. While the applicability of classical robots remains limited to factory floors, research lab robotics is moving towards novel actuators for constructing light-weight, compliant robot arms. For this, actuators are needed which consist of agonist-antagonist drive pairs for maintaining accurate positioning without recalibration, as well as for controlling the stiffness of a joint. However, when these joints are combined to construct a robot arm, existing algorithms can only inaccurately control the arm at low velocities. Starting from Albus' model from the 70's, neuro-computational models of the cerebellum have been advocated as possible candidates for the control of complex robot systems. Unfortunately, there have been very few applications of cerebellar models to the control of real robot manipulators with at least 6 degrees of freedom.
Cerebellar models have become more refined through specialised investigations based on details of the biological cerebellar system, while insufficient attention has been given to the applicability of these refinements in robotics. In the development of these methodologies, there are few examples of successful integration of neuro-computational models. In this workshop we want to investigate how neuro-computational models might be incorporated as a standard part of robotics. We want to address two questions: How reasonable is the desire for such an amalgamation? What prerequisites are there for having cerebellar models successfully compete with alternative approaches to control of real robots? These questions will be addressed through the presentation of papers which: 1. describe biologically plausible models of sensory-motor control; 2. apply cerebellar models to the control of robot manipulators; 3. describe robot control methodologies that can incorporate cerebellar or other neuro-computational models (e.g., vision); 4. provide a case history illustrating barriers to successful competition by cerebellar models for robot control. Call for Contributions ====================== For the workshop we will be looking for talks and papers which: * describe research applying cerebellar models to robot applications implemented on real robots; * describe biologically plausible models of sensory-motor control; * describe robot control methodologies that can incorporate cerebellar or other neuro-computational models (e.g., vision); * provide a case history illustrating barriers to successful competition by cerebellar models for robot control. Please send your proposals [with a paper or extended abstract] by August 15, 1997.
For more information about submission visit http://www.op.dlr.de/FF-DR-RS/CONFERENCES/nips-workshop/ Workshop organisers: ==================== Patrick van der Smagt Institute of Robotics and System Dynamics German Aerospace Research (DLR) Wessling, Germany mailto:smagt at dlr.de Daniel Bullock Cognitive and Neural Systems Department Boston University Boston, MA 02215 mailto:danb at cns.bu.edu -- dr Patrick van der Smagt phone +49 8153 281152 DLR/Institute of Robotics and Systems Dynamics fax +49 8153 281134 P.O. Box 1116, 82230 Wessling, Germany email From hochreit at informatik.tu-muenchen.de Fri Jun 27 07:19:09 1997 From: hochreit at informatik.tu-muenchen.de (Josef Hochreiter) Date: Fri, 27 Jun 1997 13:19:09 +0200 Subject: sensory coding with LOCOCODE Message-ID: <97Jun27.131913+0200met_dst.49112+135@papa.informatik.tu-muenchen.de> LOCOCODE Sepp Hochreiter, TUM Juergen Schmidhuber, IDSIA TR FKI-222-97 (19 pages, 23 figures, 450 KB, 4.2 MB gunzipped) Low-complexity coding and decoding (LOCOCODE) is a novel approach to sensory coding and unsupervised learning. Unlike previous methods it explicitly takes into account the information-theoretic complexity of the code generator: lococodes (1) convey information about the input data and (2) can be computed and decoded by low-complexity mappings. We implement LOCOCODE by training autoassociators with Flat Minimum Search, a recent, general method for discovering low-complexity neural nets. Experiments show: unlike codes obtained with standard autoencoders, lococodes are based on feature detectors, never unstructured, usually sparse, sometimes factorial or local (depending on the data). Although LOCOCODE's objective function does not contain an explicit term enforcing sparse or factorial codes, it extracts optimal codes for difficult versions of the "bars" benchmark problem. Unlike, e.g., independent component analysis (ICA) it does not need to know the number of independent data sources.
It produces familiar, biologically plausible feature detectors when applied to real world images. As a preprocessor for a vowel recognition benchmark problem it sets the stage for excellent classification performance. ftp://ftp.idsia.ch/pub/juergen/lococode.ps.gz ftp://flop.informatik.tu-muenchen.de/pub/fki/fki-222-97.ps.gz http://www7.informatik.tu-muenchen.de/~hochreit/pub.html http://www.idsia.ch/~juergen/onlinepub.html (invited talk at "Theoretical Aspects of Neural Computation" (TANC97), Hong Kong, May 97 - short spin-off papers to be published by Springer) Comments welcome. Sepp & Juergen From mike at stats.gla.ac.uk Fri Jun 27 10:54:07 1997 From: mike at stats.gla.ac.uk (Mike Titterington) Date: Fri, 27 Jun 1997 15:54:07 +0100 (BST) Subject: Postdoctoral Post in Glasgow Message-ID: <4800.199706271454@milkyway.stats.gla.ac.uk> The following advertisement will soon appear in the UK national press. DRAFT ADVERTISEMENT -------------------- UNIVERSITY OF GLASGOW DEPARTMENT OF STATISTICS POSTDOCTORAL RESEARCH ASSISTANT Applications are invited for a Postdoctoral Research Assistantship (IA) post in the Department of Statistics, University of Glasgow, to work with Professor D.M. Titterington for a period of up to 3 years, starting on October 1, 1997, or as soon as possible thereafter. The post is funded by the UK Engineering and Physical Sciences Research Council. The research topic is Statistical Learning Methodology in Neural Computing Problems. Applications, supported by full curriculum vitae and the names of three referees, should be sent, to arrive no later than July 27, 1997, to Professor D. M. Titterington, Department of Statistics, University of Glasgow, Glasgow G12 8QQ, Scotland, from whom further particulars are available. Informal enquiries by electronic mail (mike at stats.gla.ac.uk) are welcomed. 
From srx014 at coventry.ac.uk Fri Jun 27 13:59:22 1997 From: srx014 at coventry.ac.uk (Colin Reeves) Date: Fri, 27 Jun 1997 18:59:22 +0100 (BST) Subject: PhD studentship available Message-ID: The following research position is available: Would interested candidates please contact Colin Reeves in the first instance (CRReeves at coventry.ac.uk) with a CV. -------------------------------------------------------------------------- Project Proposal The application of artificial intelligence to the control of large, complex gas transmission systems. Control Theory and Applications Centre, and BG Transco, Coventry University System Control, Hinckley Background. BG Transco is responsible for the transportation and storage of natural gas in Britain. In general, gas is delivered to coastal terminals where it enters the Transco pipeline system. System Control manages and controls the flow of gas from these terminals to 18 million consumers. System Control is organised into 5 control centres which operate on a 24 hour basis 365 days a year. There is one National Control Centre (NCC) and 4 Area Control Centres (ACCs). In each of the control centres small teams of engineers are responsible for the provision of a secure and economic gas transportation system. To achieve this, the engineers need to develop best practices and apply these consistently to the operation of the pipeline systems. This project will investigate the possible application of Artificial Intelligence techniques to support the operation of these large, complex gas transmission systems. Methodology The Control Theory and Applications Centre at Coventry University has successfully applied AI methods, including neural nets, fuzzy systems, genetic algorithms etc, to complex control problems. It is intended in this project to extend these approaches to the gas pipeline systems of Transco. 
This will involve firstly the selection (in close collaboration with Transco System Control) of a suitable subsystem for a feasibility study. Historical records of this subsystem will be used to identify important variables, and then to extract knowledge in the form of rules. Knowledge elicitation from the experts (the operations engineers) will also be carried out. Probable techniques include the use of neuro-fuzzy methods and the application of genetic algorithms or other heuristics in mining the data archives. At appropriate times it is required that a competent report be presented to Transco. In addition there will be opportunities to present the work at conferences or to publish through technical journals. The successful candidate will register for a PhD, working under the supervision of Colin Reeves at Coventry University. Candidate profile The student should have a first degree (at least a 2(i)) or MSc in an appropriate technological or engineering subject. Good general mathematical and computing skills are more important than the specific subject of the degree. It would be an advantage to have previous knowledge of AI methods such as those mentioned above. Knowledge of control systems and databases would also be useful, and the candidate needs to possess the appropriate inter-personal skills to work in an operational engineering environment. The candidate should be eligible for UK fees, which means he/she will most likely be a citizen of a member state of the European Union - other candidates are normally subject to higher fees. Timescale The project will commence in September 1997 and will last for 3 years.
From dld at cs.monash.edu.au Sun Jun 29 09:45:56 1997 From: dld at cs.monash.edu.au (David L Dowe) Date: Sun, 29 Jun 1997 23:45:56 +1000 Subject: CFPs and CFRs: Information theory in biology Message-ID: <199706291345.XAA29992@dec11.cs.monash.edu.au> Please distribute to interested friends and colleagues: Call For Papers (CFPs) and Call For Referees (CFRs) Complexity and information-theoretic approaches to biology ---------------------------------------------------------- This is a Call For Papers and Call For Referees for the 3rd Pacific Symposium on BioComputing (PSB-3, 1998) conference stream on "Complexity and information-theoretic approaches to biology". PSB-98 will be held from 5-9 January, 1998, in Hawaii, at the Ritz Carlton Kapalua on Maui. Stream Organisers: David L. Dowe (dld at cs.monash.edu.au) and Klaus Prank. Stream submission deadline (details below): 21 July 1997. ~~~~~~~~~~~~~~~~~~~ WWW site: http://www.cs.monash.edu.au/~dld/PSB-3/PSB-3.Info.CFPs.html . Specific technical area to be covered by this stream: Approaches to biological problems using notions of information or complexity, including methods such as Algorithmic Probability, Minimum Message Length and Minimum Description Length. Two possible applications are (e.g.) protein folding and biological information processing. Kolmogorov (1965) and Chaitin (1966) studied the notions of complexity and randomness, with Solomonoff (1964), Wallace (1968) and Rissanen (1978) applying these to problems of statistical and inferential learning and to prediction. The methods of Solomonoff, Wallace and Rissanen have respectively come to be known as Algorithmic Probability (ALP), Minimum Message Length (MML) and Minimum Description Length (MDL). All of these methods relate to information theory, and can be thought of in terms of Shannon's information theory and of Boltzmann's thermodynamic entropy.
An MDL/MML perspective has been suggested by a number of authors in the context of approximating unknown functions with some parametric approximation scheme (such as a neural network). The designated measure to optimize under this scheme combines an estimate of the cost of misfit with an estimate of the cost of describing the parametric approximation (Akaike 1973, Rissanen 1978, Barron and Barron 1988). This stream invites all original papers of a biological nature which use notions of information and/or complexity, with no strong preference as to the specific application. Such work has been done in problems of, e.g., protein folding and DNA string alignment. As we describe below in some detail, such work has also been done in the analysis of temporal dynamics in biology such as neural spike trains and endocrine (hormonal) time series analysis using the MDL principle in the context of neural networks and context-free grammar complexity. To elaborate on one of the relevant topics above, in the last couple of years or so, there has been a major focus on the aspect of timing in biological information processing ranging from fields such as neuroscience to endocrinology. The latest work on information processing at the single-cell level using computational as well as experimental approaches reveals previously unimagined complexity and dynamism. Timing in biological information processing on the single-cell level as well as on the systems level has been studied by signal-processing and information-theoretic approaches, in particular in the field of neuroscience (see for an overview: Rieke et al. 1996).
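The two-part cost described above (an estimate of the cost of misfit plus the cost of describing the parametric approximation) can be illustrated with a small numeric sketch. The following is not from the CFP: the function name, the unit-variance Gaussian noise model, and the fixed 32-bit per-parameter cost are illustrative assumptions only.

```python
import math

def two_part_code_length(residuals, n_params, param_cost_bits=32.0):
    """Illustrative two-part MDL/MML-style score, in bits.

    Combines (1) the cost of describing the model, assumed here to be a
    fixed number of bits per parameter, with (2) the cost of encoding the
    data's misfit, taken as the negative log-likelihood of the residuals
    under a unit-variance Gaussian noise model.
    """
    model_bits = n_params * param_cost_bits
    # Negative log-likelihood of each residual under N(0, 1), in nats.
    nll_nats = sum(0.5 * r * r + 0.5 * math.log(2 * math.pi) for r in residuals)
    data_bits = nll_nats / math.log(2)  # convert nats to bits
    return model_bits + data_bits

# A 2-parameter fit with slightly larger residuals beats a 10-parameter
# fit with marginally smaller residuals once the model cost is counted.
simple = two_part_code_length([0.5] * 20, n_params=2)
complex_fit = two_part_code_length([0.4] * 20, n_params=10)
print(simple < complex_fit)  # True
```

The point of the sketch is the trade-off itself: shrinking the misfit term by adding parameters only pays off if the saving exceeds the extra description cost, which is how MDL/MML penalizes over-parameterized models.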
Using such approaches to the understanding of temporal complexity in biological information transfer, the maximum information rates and the precision of spike timing could be revealed by computational methods (Mainen and Sejnowski, 1995; Gabbiani and Koch 1996; Gabbiani et al., 1996). The examples given above illustrate some possible biological application domains. We invite and solicit papers in all areas of (computational) biology which make use of ALP, MDL, MML and/or other notions of information and complexity. In problems of prediction, as well as using "yes"/"no" predictions, we would encourage the authors to consider also using probabilistic prediction, where the score assigned to a probabilistic prediction is given according to the negative logarithm of the stated probability of the event. List of Frequently Asked Questions (FAQs) re PSB-98 : ----------------------------------------------------- Q1. How can my paper be included in PSB's hardbound proceedings? PSB publishes peer-reviewed full papers in an archival proceedings. Each accepted paper will be allocated 12 pages in the proceedings volume. Paper authors are required to register (and pay) for the conference by the time they submit their camera-ready copy, or the paper will not be published. Q2. How does a PSB publication compare to a journal publication? PSB papers are rigorously peer reviewed, and must report significant original material. PSB expects to be included in Index Medicus, Medline and other indexing services starting this year. All accepted full papers will be indexed just as if they had appeared in a journal. It is too early to assess the impact of a PSB paper quantitatively, but we will take every action we can to improve the visibility and significance of a PSB publication. Q3.
What if I do not want to submit a full paper to PSB, but wish to participate?

Authors who do not wish to submit a full paper are welcome to submit one-page abstracts, which will be distributed at the meeting separately from the archival proceedings, and are also welcome to display standard or computer-interactive posters.

Q4. What are the paper submission deadlines?

Papers will be due July 14, although session chairs can adjust this deadline at their discretion. Results will be announced August 22, and camera-ready copy will be due September 22. Poster abstracts will be accepted until October 1, and on a space-available basis after that. Poster space is limited, especially for interactive posters that require computer or network access.

Q5. Where should I send my submission?

All full papers must be submitted to the central PSB address so that we can track the manuscripts. Those submitting hard copy should send five copies of their paper to:

PSB-98
c/o Section on Medical Informatics
Stanford University Medical School, MSOB X215
Stanford, CA 94305-5479 USA

Electronic submission of papers is welcome. Format requirements for electronic submission will be available on the web page (http://www.cgl.ucsf.edu/psb) or from Russ Altman (altman at smi.stanford.edu). Electronic papers will be submitted directly to Dr. Altman. We prefer that all one-page abstracts be submitted electronically. Please send them to us in plain ASCII text or as a Microsoft Word file. If this is impossible, please contact Dr. Altman as soon as possible.

Q6. How can I obtain travel support to come to PSB?

We have been able to offer partial travel support to many PSB attendees in the past, including most authors of accepted full papers who request support. However, due to our sponsoring agencies' schedules, we are unable to offer travel awards before the registration (and payment) deadlines for authors. We recognize that this is inconvenient, and we are doing our best to rectify the situation.
NO ONE IS GUARANTEED TRAVEL SUPPORT. Travel support applications will be available on our web site (see Q7).

Q7. How can I get more information about the meeting?

Check our web page: http://www.cgl.ucsf.edu/psb or send email to the conference chair: hunter at nlm.nih.gov

Further comments re PSB-98:
----------------------------

PSB'98 will publish accepted full papers in an archival Proceedings. All contributed papers will be rigorously peer-reviewed by at least three referees. Each accepted full paper will be allocated up to 12 pages in the conference Proceedings. The best papers will be selected for a 30-minute oral presentation to the full assembled conference. Accepted poster abstracts will be distributed at the conference separately from the archival Proceedings. To be eligible for proceedings publication, each full paper must be accompanied by a cover letter stating that it contains original unpublished results not currently under consideration elsewhere.

IMPORTANT DATES:
Full paper submissions due (NEW deadline): July 21, 1997
Poster abstracts due: August 10, 1997
Notification of paper acceptance: August 22, 1997
Camera-ready copy due: September 22, 1997
Conference: January 5 - 8, 1998

More information about the "Complexity and information-theoretic approaches to biology" stream, including a sample list of relevant papers, is available on the WWW at http://www.cs.monash.edu.au/~dld/PSB-3/PSB-3.Info.CFPs.html . For further information about the above stream, e-mail Dr. David Dowe (dld at cs.monash.edu.au, http://www.cs.monash.edu.au/~dld/, Fax: +61 3 9905-5146) on or before 2 July (or after 19 July), or Dr. Klaus Prank (ndxdpran at rrzn-serv.de, http://sun1.rrzn-user.uni-hannover.de/~ndxdpran/) on or after 3 July.
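The probabilistic scoring rule suggested in the stream call, where a prediction is charged the negative logarithm of the probability it assigned to the outcome that actually occurred, can be sketched as follows. This is an illustrative fragment, not part of the call; the choice of base-2 logarithms (scores in bits) is ours.

```python
import math

def log_loss_score(stated_probability_of_event: float) -> float:
    """Cost (in bits) charged to a probabilistic prediction once the
    outcome is known: -log2 of the probability stated for the outcome
    that occurred. Lower is better."""
    return -math.log2(stated_probability_of_event)

# A confident correct forecast is cheap; a confident wrong one is expensive.
print(log_loss_score(0.9))   # the event was given p = 0.9 and occurred
print(log_loss_score(0.5))   # an uninformative 50/50 prediction costs 1 bit
print(log_loss_score(0.01))  # the event was given only p = 0.01 but occurred
```

This strictly proper scoring rule rewards honest probabilities: understating or overstating one's confidence raises the expected score.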
From icsc at compusmart.ab.ca Fri Jun 27 20:38:02 1997
From: icsc at compusmart.ab.ca (ICSC Canada)
Date: Sat, 28 Jun 1997 00:38:02 +0000
Subject: I&ANN'98 Workshop
Message-ID: <2.2.32.19970628003802.006ec860@mail.compusmart.ab.ca>

International Workshop on INDEPENDENCE & ARTIFICIAL NEURAL NETWORKS / I&ANN'98
at the University of La Laguna, Tenerife, Spain
February 9-10, 1998
http://www.compusmart.ab.ca/icsc/iann98.htm

This workshop will take place immediately prior to the International ICSC Symposium on ENGINEERING OF INTELLIGENT SYSTEMS / EIS'98 at the University of La Laguna, Tenerife, Spain, February 11-13, 1998: http://www.compusmart.ab.ca/icsc/eis98.htm

******************************************

TOPICS

Recent ANN research has developed from networks which find correlations in data sets to the more ambitious goal of finding independent components of data sets. The workshop will concentrate on those neural networks which find independent components, and on the associated networks whose learning rules use contextual information to organize their learning. The topic then falls into (at least) three main streams of current ANN research:

1. Independent Component Analysis, which has most recently been successfully applied to the problem of "blind separation of sources", such as the recovery of a single voice from a mixture/convolution of voices. Such methods normally use either information-theoretic criteria or higher-order statistics to perform the separation.

2. Identification of independent sources: the seminal experiment in this field is the identification of single bars from an input grid containing mixtures of bars. Factor Analysis (or generative models) has been a recent popular method for this problem.

3. Using contextual information to identify structure in data.

We envisage a single-track program over two days (February 9-10, 1998) with many opportunities for informal discussion.
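The blind-separation idea in stream 1 can be sketched numerically. The following is a minimal numpy demonstration (our own, not from the workshop materials) of higher-order-statistics separation in the FastICA style: two non-Gaussian signals are mixed, the mixtures are whitened, and independent directions are found by a kurtosis-based fixed-point iteration with deflation. The signals, mixing matrix and iteration counts are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.linspace(0, 8 * np.pi, 2000)
sources = np.vstack([np.sign(np.sin(3 * t)),       # square wave
                     rng.uniform(-1, 1, t.size)])  # uniform noise
A = np.array([[1.0, 0.6], [0.5, 1.0]])             # "unknown" mixing matrix
x = A @ sources                                    # observed mixtures

# Whiten the mixtures: zero mean, identity covariance.
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = E @ np.diag(d ** -0.5) @ E.T @ x

# One-unit fixed-point iteration with g(u) = u^3 (kurtosis contrast),
# extracting components one at a time and deflating to stay orthogonal.
W = np.zeros((2, 2))
for i in range(2):
    w = rng.normal(size=2)
    w /= np.linalg.norm(w)
    for _ in range(200):
        wx = w @ z
        w_new = (z * wx**3).mean(axis=1) - 3 * w   # E[z g(w.z)] - E[g'] w
        w_new -= W[:i].T @ (W[:i] @ w_new)         # deflation step
        w_new /= np.linalg.norm(w_new)
        converged = abs(abs(w_new @ w) - 1) < 1e-10
        w = w_new
        if converged:
            break
    W[i] = w

recovered = W @ z   # estimated sources, up to permutation and sign
```

Because ICA is blind to permutation and sign, the recovered components match the sources only up to ordering and polarity, which is why comparisons use absolute correlations.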
****************************************** INTERNATIONAL SCIENTIFIC COMMITTEE - Luis Almeida, INESC, Portugal - Tony Bell, Salk Institute, USA - Andrew Cichocki, RIKEN Institute, Japan - Colin Fyfe, University of Paisley, U.K. - Mark Girolami, University of Paisley, U.K. - Peter Hancock, University of Stirling, U.K. - Juha Karhunen, Helsinki University of Technology, Finland - Jim Kay, University of Glasgow, U.K. - Erkki Oja, Helsinki University of Technology, Finland ****************************************** INFORMATION FOR PARTICIPANTS AND AUTHORS Registrations are available for the workshop only (February 9 - 10, 1998), or combined with the EIS'98 symposium (February 11 - 13, 1998). The registration fee for the 2-day workshop is estimated at approximately Ptas. 37,000 per person and includes: - Use of facilities and equipment - Lunches, dinners and coffee breaks - Welcome wine & cheese party - Proceedings in print (workshop only) - Proceedings on CD-ROM (workshop and EIS'98 conference) - Daily transportation between hotels in Santa Cruz and workshop site The regular registration fee for the EIS'98 symposium (February 11-13, 1998) is estimated at Ptas. 59,000 per person, but a reduction will be offered to workshop participants. Separate proceedings will be printed for the workshop, but all respective papers will also be included on the CD-ROM, covering the I&ANN'98 workshop and the EIS'98 symposium. As a bonus, workshop participants will thus automatically also receive the conference proceedings (CD-ROM version). We anticipate that the proceedings will be published as a special issue of a journal. ****************************************** SUBMISSION OF PAPERS Prospective authors are requested to send a 4-6 page report of their work for evaluation by the International Scientific Committee. All reports must be written in English, starting with a succinct statement of the problem, the results achieved, their significance and a comparison with previous work. 
The report should also include: - Title of workshop (I&ANN'98) - Title of proposed paper - Authors names, affiliations, addresses - Name of author to contact for correspondence - E-mail address and fax # of contact author - Topics which best describe the paper (max. 5 keywords) Submissions may be made by airmail or electronic mail to: Dr. Colin Fyfe Department of Computing and Information Systems The University of Paisley High Street Paisley, PA1 2BE Scotland Email: fyfe0ci at paisley.ac.uk Fax: +44-141-848-3542 ****************************************** SUBMISSION DEADLINE It is the intention of the organizers to have the proceedings available for the delegates. Consequently, the submission deadline of September 15, 1997 has to be strictly respected. ****************************************** IMPORTANT DATES Submission of reports: September 15, 1997 Notification of acceptance: October 15, 1997 Delivery of full papers: November 15, 1997 I&ANN'98 Workshop: February 9 - 10, 1998 EIS'98 Conference: February 11 - 13, 1998 ****************************************** LOCAL ARRANGEMENTS For details about local arrangements, please consult the EIS'98 website at http://www.compusmart.ab.ca/icsc/eis98.htm ****************************************** FURTHER INFORMATION For further information please contact: - Dr. Colin Fyfe Department of Computing and Information Systems The University of Paisley High Street Paisley PA1 2BE Scotland E-mail: fyfe0ci at paisley.ac.uk Fax: +44-141-848-3542 or - ICSC Canada International Computer Science Conventions P.O. 
Box 279
Millet, Alberta T0C 1Z0
Canada
E-mail: icsc at compusmart.ab.ca
Fax: +1-403-387-4329
WWW: http://www.compusmart.ab.ca/icsc

Newton Institute Junior Member Grants
-------------------------------------
The Institute will make available some of its general funds specifically to support junior researchers' involvement in Institute activities. The types of involvement to be supported include (but are not limited to) attendance at workshops, conferences etc., and visits of up to 2 weeks to work or study with longer-term participants in the Institute's programmes.

Further Information and Application Form
----------------------------------------
Full details of the Junior Members Scheme, as well as an application form, are available from http://www.newton.cam.ac.uk/junior.html

Mailing List
------------
To subscribe to the Junior Members' mailing list, send an e-mail to majordomo at newton.cam.ac.uk (leaving the subject field blank) in which the body of the message contains the line

subscribe newton-jmember [your e-mail address]

You may subscribe to this list even if you are not eligible to become a Junior Member.

From malti at mpce.mq.edu.au Wed Jun 4 01:00:13 1997
From: malti at mpce.mq.edu.au (Malti Patel)
Date: Wed, 4 Jun 1997 15:00:13 +1000
Subject: Research Assistant Position available
Message-ID: <199706040500.PAA12166@krakatoa.mpce.mq.edu.au>

RESEARCH ASSISTANT
available at the Department of Computing and Department of Behavioural Sciences, Macquarie University, Sydney, Australia

A research assistant (RA) is required to work on a large ARC grant concerned with investigating semantic representations and their implementation in the Dual-Route Cascaded Reading Model (DRC). This work will be part of ongoing research into this area at Macquarie University and will build on previous results.
The RA will be involved in performing experiments to collect semantic data, in addition to running simulations using this data in the DRC. It is therefore desirable that the candidate have at least a II.1 Honours degree or equivalent postgraduate research experience, and also have experience in the following areas:

- neural networks
- cognitive modelling
- good programming skills in C/C++

The position is available for a year, starting immediately, with a possibility of extension. The salary will be approximately $34,000, depending on experience. Applications in the form of a covering letter, plus a CV including the names and addresses of two referees, should be sent or emailed to the address below by June 30th 1997.

Dr Malti Patel
Department of Computing
School of MPCE
Macquarie University
Sydney NSW 2109
Australia
malti at mpce.mq.edu.au

From omori at cc.tuat.ac.jp Wed Jun 4 04:53:03 1997
From: omori at cc.tuat.ac.jp (Takashi Omori)
Date: Wed, 4 Jun 1997 17:53:03 +0900 (JST)
Subject: Call for Paper : ICONIP'98-Kitakyushu
Message-ID: <199706040853.RAA27168@cc.tuat.ac.jp>

*****************************************
*          Call for Papers              *
*****************************************

ICONIP'98-Kitakyushu
The Fifth International Conference on Neural Information Processing
Organized by the Japanese Neural Network Society (JNNS)
Sponsored by the Asian Pacific Neural Network Assembly (APNNA)
October 21-23, 1998
Kitakyushu International Conference Center
3-9-30 Asano, Kokura-ku, Kitakyushu 802, Japan

The annual conference of the Asian Pacific Neural Network Assembly, ICONIP*98, will be held jointly with the ninth annual conference of the Japanese Neural Network Society, from 21 to 23 October 1998 in Kitakyushu, Japan. The goal of ICONIP*98 is to provide a forum for researchers and engineers from academia and industry to meet and to exchange ideas on advanced techniques and recent developments in neural information processing.
The conference further serves to stimulate local and regional interests in neural information processing and its potential applications to industries indigenous to this region.

Topics of Interest

Track I: Neurobiological Basis of Brain Functions
Track II: Mathematical Theory of Brain Functions
Track III: Cognitive and Behavioral Aspects of Brain Functions
Track IV: Theoretical and Technical Aspects of Neural Networks
Track V: Distributed Processing Systems
Track VI: Applications of Neural Networks
Track VII: Implementations of Neural Networks

Topics cover (Key Words):
Neuroscience, Neurobiology and Biophysics, Learning and Plasticity, Sensory and Motor Systems, Cognition and Perception
Algorithms and Architectures, Learning and Generalization, Memory, Neurodynamics and Chaos, Probabilistic and Statistical Methods, Neural Coding
Emotion, Consciousness and Attention, Visual and Auditory Computation, Speech and Languages, Neural Control and Robotics, Pattern Recognition and Signal Processing, Time Series Forecasting, Blind Separation, Knowledge Acquisition, Data Mining, Rule Extraction
Emergent Computation, Distributed AI Systems, Agent-Based Systems, Soft Computing, Real World Systems, Neuro-Fuzzy Systems
Neuro Device and Hardware, Neuro and Brain Computers, Software Tools, System Integration

Conference Committee
Conference Chair: Kunihiko Fukushima, Osaka University
Conference Vice-chair: Minoru Tsukada, Tamagawa University
Organizing Chair: Shuji Yoshizawa, Tokyo University
Program Chair: Shiro Usui, Toyohashi University of Technology

International Advisory Committee (tentative)
Chair: Shun-ichi Amari, Institute of Physical and Chemical Research
Members: S. Bang (Korea), J. Bezdek (USA), J. Dayhoff (USA), R. Eckmiller (Germany), W. Freeman (USA), N. Kasabov (New Zealand), H. Mallot (Germany), G. Matsumoto (Japan), N. Sugie (Japan), R. Suzuki (Japan), K. Toyama (Japan), Y. Wu (China), L.Xei (Hong Kong), J.
Zurada (USA)

Call for Papers
The Program Committee is looking for original papers on the above-mentioned topics. Authors should pay special attention to explaining the theoretical and technical choices involved, point out possible limitations and describe the current state of their work. All received papers will be reviewed by the Program Committee. Authors will be informed of the review decision by June 22, 1998. All accepted papers will be published. As the conference is a multi-disciplinary meeting, papers are required to be comprehensible to a wide rather than a very specialized audience.

Instructions to Authors
Papers must be received by March 31, 1998. Papers must be submitted in camera-ready format. Electronic or fax submission is not acceptable. Papers will be presented at the conference either in an oral or in a poster session. Please submit the complete original and five copies of the paper, written in English, with backing material, in a large mailing envelope. Do not fold or bend your paper in any way. Papers must be prepared on A4-format white paper with one-inch margins on all four sides, in two-column format, on not more than 4 pages, single-spaced, in Times or a similar font of 10 points, and printed on one side of the page only. Centered at the top of the first page should be the complete title, author(s), mailing and e-mail addresses, followed by a 100-150 word abstract and the text. Up to 2 extra pages are permitted at a cost of 5,000 yen/page. Use black ink. Do not use any other color, either in the text or illustrations. The proceedings will be printed with black ink on white paper. In the covering letter, the track and the topic of the paper according to the list above should be indicated. No changes will be possible after submission of your manuscript.
Authors may also retrieve the ICONIP style "iconip98.tex", "iconip98.sty" and "sample.eps" files (they are compressed as form.tar.gz) for the conference via WWW at URL http://jnns-www.okabe.rcast.u-tokyo.ac.jp/jnns/ICONIP98.html.

Language
The use of English is required for papers and presentations. No simultaneous interpretation will be provided.

Registration
The deadline for Registration for speakers and Early Registration for non-speakers, with remittance, will be July 31, 1998. Both the registration sheet and the fee must be received by that day. The registration fee (Japanese yen) will be as follows:

                              Deadline             General Participant   Student
  Speaker                     July 31, 1998        \40,000               \10,000
  Non-speaker, Early Regist.  July 31, 1998        \40,000               \10,000
  Non-speaker, late           after July 31, 1998  \50,000               \15,000

The registration fee for a General Participant includes attendance at the conference, proceedings, banquet and reception. The registration fee for a Student includes attendance at the conference and proceedings.

Conference Venue
Kitakyushu is a northern city on Kyushu Island, southwest of Japan's main islands. The place is one of Japan's major industrial areas, and also has a long history of two thousand years in ancient Japanese and Chinese records. There are direct flights from major Asian and American airports. You will be able to enjoy technical tours and excursions in the area.

Passport and Visa
All foreign attendees entering Japan must possess a valid passport. Those requiring visas should apply to the Japanese consular or diplomatic mission in their own country prior to departure. For details, participants are advised to consult their travel agent, airline reservation office or the nearest Japanese mission.

Accommodation
Hotel information will be available in the Second Circular.

Events
Exhibitions, poster sessions, workshops and a forum will be held at the conference. Two satellite workshops will be held just before or after the conference.
Social Events
Banquet, reception and excursion will be held at the conference. The details will be announced in the Second Circular.

Workshops
Two satellite workshops will be held. One is the "Satellite Workshop for Young Researchers on Information Processing", which will be held after the conference; details are given in the attached announcement. Another workshop, "Dynamical Brain", is being planned. This will take place at the Brain Science Research Center, Tamagawa University Research Institute. The details will be announced in the Second Circular. Please see the Second Circular for more information on these workshops, and possibly other new ones.

Important Dates for ICONIP*98
Papers Due: March 31, 1998
Notification of Paper Acceptance: June 22, 1998
Second Circular (with Registration Form): June 22, 1998
Registration of at least one author of a paper: July 31, 1998
Early Registration: July 31, 1998
Conference: October 21-23, 1998
Workshop: October 24-26, 1998

Further Information & Paper Submissions
ICONIP*98 Secretariat
Mr. Masahito Matsue
Japan Technical Information Service
Sogo Kojimachi No.3 Bldg.
1-6 Kojimachi, Chiyoda-ku, Tokyo 102, Japan
Tel: +81-3-3239-4565 Fax: +81-3-3239-4714
E-mail: KYD00356 at niftyserve.or.jp

☆ Could you suggest your friends and acquaintances who may be interested in ICONIP*98-Kitakyushu? Thank you.

-------------------------------------------------------------
ICONIP*98-Kitakyushu, 21-23 October, 1998
Tentative Registration (PLEASE PRINT)

Name: Professor / Dr. / Ms. / Mr.   Last Name   First Name   Middle Name
Affiliation:
Address:
Country:
Telephone:
Fax:
E-mail:

□ I intend to submit a paper. The tentative title of my paper is:
-------------------------------------------------------
□ I intend to attend the conference.
□ I want to receive the Second Circular.

Please mail a copy of this completed form to:
ICONIP*98 Secretariat
Mr. Masahito Matsue
Japan Technical Information Service
Sogo Kojimachi No.3 Bldg.
1-6 Kojimachi, Chiyoda-ku, Tokyo 102, Japan
Tel: +81-3-3239-4565 Fax: +81-3-3239-4714
E-mail: KYD00356 at niftyserve.or.jp

----------------------------------------------------------------
Satellite Workshop of ICONIP*98
Workshop for Young Researchers on Information Processing in the Central Nervous System
--- Experimental and Theoretical Studies ---
October 24-26, 1998

Site
Hiko-san Seminar House (Tentative)

Aim
This workshop is a satellite meeting of ICONIP'98-Kitakyushu, to be held in Kitakyushu, Japan, on October 21-23, 1998. In this workshop, invited talks by young researchers in the field of neuroscience and talks by young authors selected from among those who submitted papers to ICONIP'98 will be given. The talks will cover a broad range of topics in the field of neuroscience to promote interdisciplinary interactions and to familiarize the participants with a wide range of experimental and theoretical methodologies. About one hour and 30 minutes is assigned to each talk to allow in-depth discussion, which benefits both the speakers and the audience: better understanding of the talks for the audience, and possible improvement of the paper for the speaker. Approximately 40 researchers will be able to participate. The organizer will provide partial support for room and board. The workshop will run from morning to evening every day, and informal discussion will follow at night. The organizers expect attendees to exchange information and build new friendships through the workshop.
Topics
Information processing in single cells / Sensory information processing / Learning and memory / Motor learning and adaptive control / Computational perspective / Methodology for data analysis / Hardware implementation

Application
A young author who is interested in participating in the workshop should mark "WS" in red on all six copies of the manuscript of his or her paper submitted to ICONIP'98, or contact the workshop organizer as follows:
Shunsuke Sato
Fax: +81-6-853-1809
e-mail: sato at bpe.es.osaka-u.ac.jp

Information
A bus will leave from the gate of Kitakyushu International Conference Center for the workshop site at 18:00, Oct. 23, 1998.

--iconip98.sty------------------------------------------------
% proc.sty 24-Sep-85
%
% This style file is originally for aw93.
%
% Modified (sc) 6-July-87
% DARPA style modified for aw93 by JBH2 and RMS
% Reformatted (rms) 27-Jan-92
% Title and abstract sizes moved to .sty file (Magerman) 31-1-92
% Reformatted (rms) 15-Jan-93
% Modified by Matsuo and Kuroyanagi (Nagoya I.T.) for ICONIP97 April 97.
%
% modified by kuroyanagi
\def\large{\@setsize\large{12pt}\xpt\@xpt}
% end modified
\typeout{Document Style Option 'aw93' (1/93 version)}
\let\yes=y \let\no=y
\let\secnums=\yes
\def\sitereport {\global\let\secnums=n}
\def\thesection {}
\def\thesubsection {}
\def\thesubsubsection {}
%% Redefine these for more compact headings:
%\def\section{\@startsection {section}{1}{\z@}
%{-2.0ex plus -0.5ex minus -.2ex}{3pt plus 2pt minus 1pt}{\large\bf}}
%\def\subsection{\@startsection{subsection}{2}{\z@}
%{-2.0ex plus -0.5ex minus -.2ex}{3pt plus 2pt minus 1pt}{\large\bf}}
%\def\subsubsection{\@startsection{subsubsection}{3}{\z@}
%{-6pt plus 2pt minus 1pt}{-1em}{\normalsize\bf}}
%% Redefine margins etc.
\voffset .19in \oddsidemargin -.25in \evensidemargin -.25in % \topmargin -.17in \headheight 0pt \headsep 0pt \footheight 12pt % modified by matsuo \topmargin -.50in \headheight 0pt \headsep 0pt \footheight 12pt % end \footskip 30pt \marginparwidth 0pt \marginparsep 0pt %modified bymatsuo %\textheight 9.00truein \textwidth 7.0truein \columnsep .25truein \textheight 9.50truein \textwidth 7.0truein \columnsep .25truein % end modified % \columnseprule 0pt %% make table and figure numbers (bold) % \def\thetable{\@arabic\c at table} % \def\fnum at table{Table \thetable} % \def\thefigure{\@arabic\c at figure} % \def\fnum at figure{Figure \thefigure} %% redefine the description list environment \def\descriptionlabel#1{\raggedright\bf #1} \def\description{\list{}{ \leftmargin=5em \labelwidth=4em \labelsep=1em \let\makelabel\descriptionlabel}} \let\enddescription\endlist %% this kills page number generation \def\@oddhead{}\def\@evenhead{} \def\@oddfoot{} \def\@evenfoot{\@oddfoot} %% Set up title format \def\institution#1{\gdef\@institution{#1}} % add by matsuo \def\email#1{\gdef\@email{#1}} % end add \def\maketitle{\par \begingroup \def\thefootnote{\fnsymbol{footnote}} \def\@makefnmark{\hbox to 0pt{$^{\@thefnmark}$\hss}} \twocolumn[\@maketitle] \@thanks \endgroup \setcounter{footnote}{0} \let\maketitle\relax \let\@maketitle\relax \gdef\@thanks{}\gdef\@author{}\gdef\@title{}\let\thanks\relax} % title's vertical height is no more/no less than 2.4 inches \def\@maketitle{\vbox to 1.8in{\hsize\textwidth \linewidth\hsize \centering {\LARGE\bf \@title \par} % \vfil {\large \bf \begin{tabular}[t]{c}\@author \end{tabular}\par} % \vfil {\begin{tabular}[t]{c}\@institution \end{tabular}\par} \vspace*{3mm} {\large\it \begin{tabular}[t]{c}\@author \end{tabular}\par} % add by matsuo \vspace*{1mm} {\large\it \begin{tabular}[t]{c}Email:\@email \end{tabular}\par} % end add \vspace*{3mm} {\large\begin{tabular}[t]{c}\@institution \end{tabular}\par} \vfil}} 
\def\copyrightspace#1{\unmarkedfootnote{\begin{minipage}[t]{4.23in} \parindent 1em \input{#1}\vrule height 3pt width 0pt \end{minipage}}} \def\abstract{\let\secnums=n \section{ABSTRACT} \small} \def\endabstract{\par\@endparenv} %add by matsuo \def\keyword{\let\secnums=n \vspace*{-3mm}\subsubsection{KEYWORDS:} \small\bf} \def\endkeyword{\par\@endparenv} %add by matsuo % add by matsuo for keyword %\newbox\abstractbox %\setbox\abstractbox\hbox{} %\def\abstract{\global\setbox\abstractbox=\hbox\bgroup% %\begin{minipage}[t]{137.5mm}%11Q 50zw %{\small\baselineskip=\bf\egtgt\hskip1zw $B$"$i$^$7 (B\hskip1zw}\small\cntrm\igno %respaces} %\def\endabstract{\end{minipage}\egroup} %\newbox\keywordbox %\setbox\keywordbox\hbox{} %\def\keyword{\global\setbox\keywordbox=\hbox\bgroup% %\begin{minipage}[t]{100mm}%11Q 50zw %{\small\bf Keyword:}\small\ignorespaces} %\def\endkeyword{\end{minipage}\egroup} % end add %\def\endabstract{\par\@endparenv\vskip 0.2in\vskip -3.5ex} %\def\abstract{\section*{ABSTRACT}} %\def\endabstract{\par} % taken from twocolumn.sty 27 Jan 85 \twocolumn \sloppy \flushbottom \parindent 0em \leftmargini 2em \leftmarginv .5em \leftmarginvi .5em % format for references \def\references{\let\secnums=n \section{References} \topsep=0pt plus 2pt \partopsep=0pt plus 2pt \def\baselinestretch{0.89}\small \vskip -9pt \list{\hfill [\arabic{enumi}]} {\settowidth\labelwidth{[99]}\leftmargin\labelwidth \advance\leftmargin\labelsep \usecounter{enumi}} \def\newblock{\hskip .11em plus .33em minus -.07em} \sloppy \sfcode`\.=1000\relax} \let\endreferences=\endlist \let\bibliography\references \let\endbibliography\endreferences %section format instructions %\def\ {} %\def\thesubsection {} %\def\thesubsubsection {} %\def\l at section#1#2{\relax} %\let\l at subsection\l at section %\let\l at subsubsection\l at section \def\section#1{\@startsection{section}{1}{0pt} {1ex plus 0.3ex minus 0.3ex} {0.8ex plus 0.2ex minus 0.2ex} {\centering\large\bf{ % {\large\bf{ \ifx\secnums\yes 
\arabic{section}. \else \addtocounter{section}{-1} \fi #1}}\relax \vskip -\parskip} \def\subsection#1{\@startsection{subsection}{2}{0pt} {0.25ex plus .2ex minus .1ex} {1.5ex plus 0.1ex minus 0.1ex} {\large\bf \ifx\secnums\yes \arabic{section}.\arabic{subsection}. \else \addtocounter{subsection}{-1} \fi #1}\relax \vskip -\parskip} \def\subsubsection#1{\@startsection{subsubsection}{3}{\parindent}{0pt} {-0.8em}{\bf}*{\bf{\vphantom{Zy}#1}}} \let\paragraph\subsubsection \renewcommand{\topfraction}{.9} \renewcommand{\dbltopfraction}{.9} \renewcommand{\bottomfraction}{.5} \renewcommand{\textfraction}{.15} \renewcommand{\floatpagefraction}{.85} \renewcommand{\dblfloatpagefraction}{.85} \setcounter{totalnumber}{3} \setcounter{topnumber}{1} \setcounter{dbltopnumber}{1} \setcounter{bottomnumber}{1} \setlength{\parskip}{2ex} \def\ninept{\def\baselinestretch{.95}\let\normalsize\small\normalsize} --iconip98.tex------------------------------------------------ %\documentstyle[aw93,times,psfig]{article} \documentstyle[iconip98,epsbox]{article} \begin{document} \ninept \title{ICONIP '98---TITLE OF THE PAPER} \author{Aikon Nippu \dag and Kyuju Hachi \dag\dag} \email{Nippu at dept.univ.ac.jp} \institution{\dag Dept. of Computer Eng., Univ University\\ Town1, City1, Prefecture 777, Japan\\ \dag\dag Dept. of Information, Univ University\\ Town2, City2, Prefecture 777, Japan} \maketitle \begin{abstract} This is an example of a ICONIP '98 proceedings paper\footnote{This is a sample of footnote.}, as you should submit it. All manuscripts must be in English. Electronic or fax submission is not acceptable. Use a two column format, with the title and author information centered across both columns on the first page, staying within the designated image area. Illustrations and figures may also span both columns. Dimensions for the paper are specified in this text. Your paper should begin with an abstract of about 100-150 words. 
\end{abstract}
\begin{keyword}
keyword1, keyword2, keyword3
\end{keyword}
\section{INSTRUCTIONS FOR THE PROCEEDINGS PAPER}
You must use the double-column style specified. Please adhere to the style guidelines described in this sample paper. This allows us to maintain uniformity in the printed copy of the proceedings.
\subsection{General Instructions}
The manuscript that you prepare will be printed as it is received. Neatness is of paramount importance. Prepare up to 4 pages of text and figures for your paper. An extra 2 pages are permitted at a cost of 5000 yen/page. Use black ink. Do not use any other color, either in the text or illustrations. The proceedings will be printed with black ink on white paper. Enclose the completed full original pages and 5 copies with a cover sheet, and backing material in a large mailing envelope; do not fold or bend your paper in any way. Papers must be received before March 31, 1998 by the ICONIP '98 Secretariat (the address is listed at the end). No electronic submission is accepted. No changes will be possible after submission of your manuscript.
\subsection{Style Guidelines}
All manuscripts must be single-spaced, in the specified two-column format. A full double-column page with no illustrations should contain approximately 1000 words. All text should be in Times Roman or an equivalent font, and the main text should have a font size of 10 points. Do not overcrowd and create an unreadable paper by making the lettering or the line spacing too small. The line spacing should be approximately 2.75 lines/cm (7 lines/inch).
Authors may also retrieve the ICONIP style files ``iconip98.tex'', ``iconip98.sty'' and ``sample.eps'' (compressed as form.tar.gz) for the conference via WWW at the following URL: http://jnns-www.okabe.rcast.u-tokyo.ac.jp/jnns/ICONIP98.html
\subsubsection{Cover Sheet}
The cover sheet should include:
\begin{quote}
(1) Paper title, (2) Track number, (3) Key words, (4) Oral / Poster, (5) Contact author's name, (6) Full mailing address, (7) Telephone number, (8) Fax number and (9) Electronic mail address
\end{quote}
\subsubsection{Paper Title}
The paper title should be in 18-point boldface capital letters, centered across the top of the two columns on the first page as indicated above.
\subsubsection{Author's Name(s)}
The author's name(s) and affiliation(s) appear below the title in 12-point capital and lower-case letters. You should include a mailing address here if space permits. It is also useful to include your e-mail address.
\subsubsection{Abstract}
Each paper should contain an abstract of about 100--150 words that appears at the beginning of the first column of the first page.
\subsubsection{Major Headings}
Major headings appear in 12-point boldface capital letters, centered in the column. Examples of the various levels of headings are included in this sample paper.
\subsubsection{Sub-headings}
Sub-headings appear in 12-point boldface capital and lower case. They start at the left margin on a separate line.
\subsubsection{Sub-Sub headings}
Sub-sub headings appear in 10-point boldface capital and lower case in the text. Justify and fill the text.
\subsubsection{References}
List and number all references at the end of the paper. Number the references in the order in which they first appear in the text. When referring to them in the text, type the corresponding reference number in square brackets, as shown at the end of this paper \cite{AUTHOR1}, \cite{AUTHOR2}. %[1], [2].
\subsubsection{Illustrations}
Illustrations must appear within the designated margins, and must be part of the submitted paper.
% They may span two columns. If possible,
% position illustrations at the tops of columns, rather than the middle
% or at the bottom.
Caption and number every illustration. All halftone illustrations must be clear black-and-white prints. Line drawings may be made in black ink on white paper. Do not use any color illustrations.
\subsubsection{Sample Equations}
\begin{eqnarray}
O(t) & = & \frac{1}{1+e^{-I(t)}} \\
I(t) & = & \sum ^{n}_{i=1} W_{i} \cdot X(i,t) \label{eq1} \\
y & = & ax_1 + bx_2 + cx_3 + dx_4 + ex_5 \nonumber \\
& & { } + fx_6 + gx_7 + hx_8 + jx_9 + kx_{10}
\end{eqnarray}
%\newpage
\subsubsection{Sample Table}
This is a sample table.
\begin{table}[h]
\caption{A sample table}
\begin{center}
\begin{tabular}{|c||c|c|c|c|c|}
\hline
  & A & B & C & D & E \\ \hline\hline
A & 1.0 & 0.4 & 0.5 & 0.9 & 0.3 \\ \hline
B & 0.1 & 1.0 & 0.5 & 0.9 & 0.3 \\ \hline
C & 0.1 & 0.8 & 1.0 & 0.3 & 0.5 \\ \hline
D & 0.2 & 0.4 & 0.5 & 1.0 & 0.3 \\ \hline
E & 0.1 & 0.4 & 0.5 & 0.4 & 1.0 \\ \hline
\end{tabular}
\end{center}
\end{table}
\subsubsection{Sample Figure}
This is a sample figure.
\begin{figure}[h]
\begin{center}
\psbox[width=2.5in]{sample}
\caption{A sample figure}
\end{center}
\end{figure}
\subsection{Layout Specification}
You should use letter-size or A4-size paper, which will be printed exactly as received. Keep the top and left margins as specified. Detailed size specifications for the layout are given below. It is important that your printer produces clean, high-quality output, since there will be no sharpness-enhancing reduction of what you provide. Do not use a dot-matrix or other low-quality printer. If a suitable printer is not available, you can type or print your paper at a larger size and reduce it on a photocopying machine to the specified dimensions.
All materials (text and illustrations) must appear within a bounding box that is 180 mm wide and 215 mm high (7.1" x 8.4"). Nothing outside this area will be printed. The bounding box should be placed in the center of the paper, with equal-sized left and right margins, and equal-sized top and bottom margins. Indicate the order of your pages by numbering them on the back using a light pencil. Text should appear in two columns, each about 86 mm wide, with a 6 mm space between the columns. On the first page, the top 47 mm (1.8") of both columns is reserved for the title, author(s) and affiliation(s). These items should be centered across both columns, starting at approximately 18 mm (0.7") from the top. On the second page, the text or illustrations start at approximately 18 mm (0.7") from the top.
\subsection{Writing Style and Other Hints}
It is sometimes difficult to describe your research work in detail and stay within 4 pages. However, with proper organization of the text you will be surprised how much information you can present. Emphasize the presentation of your new contributions. Background information and previous work can be summarized in a few lines accompanied by the proper references. Obvious errors or typos can easily be avoided and yet are often overlooked. Some useful hints are given below. Try to have others who are familiar with the topic, but not necessarily a co-author, proofread the text. Spelling checkers can also be quite useful. If English is not your native language, try to have a native English speaker or a professional English editor proofread your text. Please direct any questions to the ICONIP '98 Secretariat,\\ \\ {\bf ICONIP '98 Secretariat\\ Mr.
Masahito Matsue \\ Japan Technical Information Service\\ Sogo Kojimachi No.3 Bldg., 1-6 Kojimachi,\\ Chiyoda-ku, Tokyo 102, Japan\\ Tel: +81-3-3239-4565, Fax: +81-3-3239-4714\\ E-mail: KYD00356 at niftyserve.or.jp}
\begin{references}
\bibitem{AUTHOR1} Author, O., ``This is the first Reference (Journal),'' {\it IAGNU Trans.}, {\bf Vol.E99-D}, 12, pp.123--456 (1999).
\bibitem{AUTHOR2} Author, T., {\it The Second Reference -- Book --}, Authors Press, 1900.
\bibitem{FELLOW} Fellow, A.G., ``Research about the Third Reference (Tech. Rep.),'' IEIE Tech. Rep., USO-88-123, pp.35--43 (1968).
\bibitem{FELLOW2} Fellow, A.B., ``A Model of the Fourth Reference (Proceeding),'' Proceedings of the Fourth Sample Conference on Learning, pp.1234--6543 (1957).
\bibitem{FELLOW3} Fellow, A.B., ``A Model of the Fourth Reference (Proceeding),'' Proceedings of the Fourth Sample Conference on Learning, pp.1234--6543 (1957).
\bibitem{FELLOW4} Fellow, A.B., ``A Model of the Fourth Reference (Proceeding),'' Proceedings of the Fourth Sample Conference on Learning, pp.1234--6543 (1957).
\bibitem{FELLOW5} Fellow, A.B., ``A Model of the Fourth Reference (Proceeding),'' Proceedings of the Fourth Sample Conference on Learning, pp.1234--6543 (1957).
\bibitem{FELLOW6} Fellow, A.B., ``A Model of the Fourth Reference (Proceeding),'' Proceedings of the Fourth Sample Conference on Learning, pp.1234--6543 (1957).
\bibitem{FELLOW7} Fellow, A.B., ``A Model of the Fourth Reference (Proceeding),'' Proceedings of the Fourth Sample Conference on Learning, pp.1234--6543 (1957).
\bibitem{FELLOW8} Fellow, A.B., ``A Model of the Fourth Reference (Proceeding),'' Proceedings of the Fourth Sample Conference on Learning, pp.1234--6543 (1957).
\bibitem{FELLOW9} Fellow, A.B., ``A Model of the Fourth Reference (Proceeding),'' Proceedings of the Fourth Sample Conference on Learning, pp.1234--6543 (1957).
\end{references}
\end{document}
------------------------------------------------------------
end of ICONIP '98 call for papers mail From hali at theophys.kth.se Thu Jun 5 06:12:35 1997 From: hali at theophys.kth.se (Hans Liljenstrom) Date: Thu, 5 Jun 1997 06:12:35 -0400 Subject: Workshop Announcement Message-ID: <9706051012.AA15740@> Workshop on Biology as a Source of Inspiration for Technology June 19 in Sigtuna, Sweden An informal satellite meeting to the International Conference on Engineering Applications of Neural Networks, EANN'97, is organised by the Agora for Biosystems on June 19 in Sigtuna, Sweden. The theme of the workshop, "Biology as a Source of Inspiration for Technology", will be discussed by EANN conference participants, invited speakers and others interested in the topic. The idea is to focus attention on modern technical applications that have been inspired by biology, and to promote the dialogue between theorists and practitioners in the field. The workshop is taking place at the Agora for Biosystems, located in the unique building of the Sigtunastiftelsen Guesthouse, in the small town of Sigtuna, some 45 km northwest of Stockholm, near Arlanda international airport. The workshop will start at 9 am on Thursday, June 19, 1997, and end around 5 pm the same day. Lunch and dinner can be served at cost at Sigtunastiftelsen or in any nearby restaurant. Accommodation is also possible at the Guesthouse, or in nearby hotels or hostels. Participants who want to stay at the Guesthouse should contact the organisers or make room reservations directly at the Guesthouse, phone +46-(0)8-592 589 00. Please also inform the organisers or the Guesthouse if you plan to have lunch and/or dinner at the Guesthouse. Informal presentations to initiate the discussions will be given by, among others, Prof. Jarl-Thure Ericsson, professor at the Dept. of Electrical Engineering and headmaster of Tampere University of Technology, Prof. Erik Mosekilde, Dept.
of Physics, the Technical University of Denmark, Prof. Wlodzislaw Duch, Dept. of Computer Methods, Nicholas Copernicus University, Torun (Poland), and Prof. Hans Wiksell, Comair AB, Stockholm. Anyone willing to give a short oral presentation is welcome to contact the organisers as soon as possible, and no later than June 12. For further information, please see the home page, http://bach.theophys.kth.se/~hali/eann97/eannwork.html or contact the organisers. All interested are welcome to contact the organisers for registration before June 12. Hans Liljenstrom Agora for Biosystems Box 57 S-193 22 Sigtuna Sweden Phone: +46-(0)8-592 509 01 (Agora) +46-(0)8-790 7172 (KTH) Fax: +46-(0)8-104879, Email: hali at theophys.kth.se. From marc at cogsci.ed.ac.uk Thu Jun 5 08:36:30 1997 From: marc at cogsci.ed.ac.uk (Marc Moens) Date: Thu, 05 Jun 1997 13:36:30 +0100 Subject: Job Advert -- Chairs in Informatics, Edinburgh Message-ID: <199706051236.NAA16051@stevenson.cogsci.ed.ac.uk> [Apologies if you receive this more than once.] Three Chairs in Informatics at the University of Edinburgh Chair of Artificial Intelligence Chair of Cognitive Science Chair of Computer Science The University intends to appoint to three Chairs, in Artificial Intelligence, Cognitive Science, and Computer Science. These posts are being filled to sustain and develop Edinburgh's identity as a centre of the highest international quality for research and teaching in Informatics. Applications are invited from candidates with an established international reputation in research. Salary will be on the Professorial scale.
For further details see http://www.dcs.ed.ac.uk/ipu/chairs Closing date 27th June 1997 From zwerg at libra.chaos.gwdg.de Thu Jun 5 02:57:30 1997 From: zwerg at libra.chaos.gwdg.de (Dirk Brockmann) Date: Thu, 5 Jun 1997 08:57:30 +0200 (MET DST) Subject: New paper on orientation and ocular dominance available In-Reply-To: Message-ID: "Conditions for the joint emergence of orientation and ocular dominance in a high-dimensional self-organizing map." by D. Brockmann, M. Riesenhuber, H. U. Bauer, T. Geisel (1997), at http://www.chaos.gwdg.de/~zwerg/ Abstract: We investigate a model for the joint development of orientation and ocular dominance columns which is based on the high-dimensional version of Kohonen's Self-Organizing Map algorithm. The model is driven by stimuli which have correlated left-eye/right-eye and on-center-cell/off-center-cell components. We first analyze the model using a recently proposed mathematical technique which allows us to identify regions in parameter space for which either ocular dominance or orientation columns alone develop, or both together. Guided by these results we then simulate the model and indeed find jointly developed columnar systems. The preferred angle of intersection between iso-orientation lines and the boundaries of iso-ocularity domains turns out to be close to perpendicular, consistent with experimental observations.
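[For readers unfamiliar with the underlying algorithm, the sketch below shows a generic Kohonen Self-Organizing Map update: pick the best-matching unit for each stimulus and pull it and its grid neighbours toward the stimulus, with decaying learning rate and neighbourhood width. This is only the basic SOM rule the paper builds on; the function name, grid size and decay schedules are invented here, and none of the high-dimensional stimulus structure or the parameter-space analysis of the abstract is implemented.]

```python
import numpy as np

def som_train(data, grid=(10, 10), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Train a Kohonen self-organizing map on `data` (n_samples, n_features)."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h * w, data.shape[1]))
    # (row, col) coordinate of every map unit, used by the neighbourhood function
    coords = np.array([(i, j) for i in range(h) for j in range(w)], dtype=float)
    n_steps = epochs * len(data)
    t = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            lr = lr0 * (1.0 - t / n_steps)             # linearly decaying learning rate
            sigma = sigma0 * (1.0 - t / n_steps) + 0.5  # shrinking neighbourhood width
            # best-matching unit: closest weight vector to the stimulus
            winner = np.argmin(((weights - x) ** 2).sum(axis=1))
            # Gaussian neighbourhood on the map grid, centred on the winner
            d2 = ((coords - coords[winner]) ** 2).sum(axis=1)
            nbhd = np.exp(-d2 / (2.0 * sigma ** 2))
            # move every unit toward the stimulus, weighted by the neighbourhood
            weights += lr * nbhd[:, None] * (x - weights)
            t += 1
    return weights.reshape(h, w, -1)
```

[After training on low-dimensional data the weight vectors form a topographically ordered map; the paper's model applies the same rule to high-dimensional eye/centre-surround stimulus vectors.]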
Dirk

*********************************************************************
Dirk Brockmann                        e-mail: zwerg at chaos.gwdg.de
Max-Planck-Institut fuer Stroemungsforschung  Tel: (49) 551 5176 411
Goettingen                                    FAX: (49) 551 5176 402
*********************************************************************

From devin at psy.uq.edu.au Fri Jun 6 05:03:45 1997 From: devin at psy.uq.edu.au (Devin McAuley) Date: Fri, 6 Jun 1997 19:03:45 +1000 (EST) Subject: Connectionist Models of Cognition: A Workshop Message-ID: *** Registration deadline extended to June 21st *** CONNECTIONIST MODELS OF COGNITION: A WORKSHOP Monday July 7th - Friday July 11th, 1997 University of Queensland Brisbane, Queensland 4072 Australia sponsored by the School of Psychology and the Cognitive Science Program Workshop Home Page: http://psy.uq.edu.au/~brainwav/Workshop/ BrainWave Home Page: http://psy.uq.edu.au/~brainwav/ This intensive five-day workshop at the University of Queensland provides an opportunity for faculty and students with teaching and research interests in connectionist modeling to gain hands-on experience with the BrainWave neural network simulator. The workshop has three primary objectives: * to provide training in specific connectionist models of cognition. * to introduce instructors to the BrainWave simulator and course materials. * to support the development of new models by the participants. The first two days of the workshop will provide training in specific connectionist models of cognition and introduce participants to the BrainWave simulator. Day 3 will focus on how to develop a new model using BrainWave. On days 4 and 5, the instructors will be available to assist faculty and students in the development of new models and to discuss the development of teaching materials for undergraduate and postgraduate courses on connectionist modeling.
Instructors: * Simon Dennis, School of Psychology * Devin McAuley, School of Psychology * Janet Wiles, Schools of Information Technology and Psychology Registration for the 5-day workshop includes: * Course materials: o The BrainWave Simulator for the Mac, Windows95, and Unix Platforms o An Introduction to Neural Networks and the BrainWave Simulator o Three chapters of a workbook on connectionist models of cognition * Morning and afternoon tea (Monday - Friday) * Lunch (Monday and Tuesday) The deadline for registration is June 21st. ---------------------------------------------------------------------------- Workshop Program Monday, July 7th Morning Session 8:00-9:00 Registration 9:00-10:30 Lecture: Introduction to connectionist models of cognition 11:00-1:00 Laboratory: Introduction to the BrainWave simulator Afternoon Session 2:00-3:00 Lecture: The Interactive Activation and Competition model of letter perception 3:30-5:00 Laboratory: The word superiority effect Tuesday, July 8th Morning Session 9:00-10:00 Lecture: The BackPropagation model of word and color naming 10:30-12:30 Laboratory: The Stroop effect Afternoon Session 1:30-3:00 Lecture: The Matrix model of episodic memory 3:30-5:00 Laboratory: Familiarity versus recognition Wednesday, July 9th Morning Session (using the Scheme language interface to modify BrainWave) 9:00-10:30 Lecture: Introduction to the Scheme programming language 11:00-12:30 Laboratory: How to extend models in BrainWave Afternoon Session (a case study of model development in BrainWave) 1:30-3:00 Lecture: The Deep Dyslexia network 3:30-5:00 Organisation of group projects Thursday, July 10th Morning Session 9:30-11:00 Group Work: Model design 11:00-12:30 Group Work: Model construction I Afternoon Session: Stream 1 (continued model construction) 1:30-3:00 Group Work: Model construction II 3:30-5:00 Group Work: Model construction III Afternoon Session: Stream 2 (using the Java language to modify BrainWave) 1:30-3:00 Lecture: Introduction to 
the Java programming language 3:30-5:00 Laboratory: How to add new architectures to BrainWave Friday, July 11th Morning Session 9:30-12:30 Group Work: Model construction IV (preparing your demonstration) Afternoon Session 1:30-3:30 Workshop presentations and discussion ---------------------------------------------------------------------------- Send general inquiries and registration form to: email: brainwav at psy.uq.edu.au fax: +61-7-3365-4466 (attn: Dr. Devin McAuley) postal mail: BrainWave Workshop (c/o Dr. Devin McAuley) School of Psychology University of Queensland Brisbane, Queensland 4072 Australia ---------------------------------------------------------------------------- REGISTRATION FORM Name: ________________________________________ Email: ________________________________________ Address: ________________________________________ ________________________________________ ________________________________________ ________________________________________ Student (AUS$60) ____ Academic (AUS$95) ____ Industry (AUS$295) ____ ACCOMMODATION I would like accommodation at King's College ($45 per night - private room with ensuite shared between two, bed and breakfast) Yes ____ No ____ Arrival Date: ____ Departure Date: ____ Accommodation total: AUS $ ______ I would like to be billeted: Yes ____ No ____ Arrival Date: ____ Departure Date: ____ Total payment including registration: AUS $______ FORM OF PAYMENT Cheque or Money Order ____ Visa ____ Mastercard ____ Card # ____________________________ Expiration Date _____ Please debit my credit card for AUS$_______________ Signature _________________________________________ Cheques and money orders should be made out to University of Queensland, School of Psychology. From bishopc at helios.aston.ac.uk Mon Jun 9 10:29:11 1997 From: bishopc at helios.aston.ac.uk (Prof.
Chris Bishop) Date: Mon, 09 Jun 1997 15:29:11 +0100 Subject: Postdoctoral Research Fellowship Message-ID: <17994.199706091429@sun.aston.ac.uk> ---------------------------------------------------------------------- POSTDOCTORAL RESEARCH FELLOWSHIP -------------------------------- Neural Computing Research Group Dept of Computer Science & Applied Mathematics Aston University, Birmingham, UK "Automatic Analysis of Fluorescence In-Situ Hybridization Images" ----------------------------------------------------------------- *** Full details at http://www.ncrg.aston.ac.uk/ *** A mathematically-oriented researcher is required to work on the development of robust pattern recognition algorithms for the automatic interpretation of fluorescence in situ hybridization (FISH) images arising from the analysis of human chromosomes. Candidates should have an established research background in statistical pattern recognition including the use of techniques such as neural networks. Ideally, candidates should also have experience of developing solutions to problems in biomedical image interpretation. The successful candidate will join the thriving Neural Computing Research Group at Aston University, and will work under the supervision of Professor Chris Bishop. Conditions of Service --------------------- The salary will be on the RA 1A scale, within the range 15,159 to 22,785 UK pounds. This salary scale is subject to annual increments. How to Apply ------------ If you wish to be considered for this Fellowship, please send a full CV and publications list (including full details and grades of academic qualifications), a statement of how you believe you can contribute to this project, and the names and contact information of 3 referees, to: Hanni Sondermann Neural Computing Research Group Dept. of Computer Science and Applied Mathematics Aston University Birmingham B4 7ET, U.K. 
Tel: 0121 333 4631 Fax: 0121 333 4586 e-mail: H.E.Sondermann at aston.ac.uk E-mail submission of PostScript files is welcome. Closing date: 27 June 1997. From rvicente at onsager.if.usp.br Mon Jun 9 18:34:21 1997 From: rvicente at onsager.if.usp.br (Renato Vicente) Date: Mon, 9 Jun 1997 19:34:21 -0300 (EST) Subject: preprint: Functional Optimisation in Neural Networks Message-ID: <199706092234.TAA10602@gibbs.if.usp.br> Title: Functional Optimisation of Online Algorithms in Neural Networks (submitted to J. Phys. A: Math. Gen.) Authors: Renato Vicente and Nestor Caticha. Available at: http://www.fma.if.usp.br/~rvicente/nnsp.html http://xxx.lanl.gov (cond-mat/9706015) Abstract: We study the online dynamics of learning in fully connected soft committee machines in the student-teacher scenario. The locally optimal modulation function, which determines the learning algorithm, is obtained from a variational argument in such a manner as to maximise the average generalisation error decay per example. Simulation results for the resulting algorithm are presented for a few cases. The symmetric-phase plateaux are found to be vastly reduced in comparison to those found when online backpropagation algorithms are used. A discussion of the implementation of these ideas as practical algorithms is given. Comments are welcome. Renato Vicente Instituto de Fisica, Universidade de Sao Paulo, Brazil From ted at SPENCER.CTAN.YALE.EDU Tue Jun 10 13:51:04 1997 From: ted at SPENCER.CTAN.YALE.EDU (ted@SPENCER.CTAN.YALE.EDU) Date: Tue, 10 Jun 1997 17:51:04 GMT Subject: Preprint available Message-ID: <199706101751.RAA02479@CHARCOT.CTAN.YALE.EDU> A preprint of The NEURON Simulation Environment M.L. Hines and N.T. Carnevale Neural Computation, in press is now available in PostScript for UNIX systems as ftp.neuron.yale.edu/neuron/papers/nsimenv.ps.Z and for MSDOS systems as ftp.neuron.yale.edu/neuron/papers/nsimenps.zip The only difference between these two files is the method of compression.
Length about 250K, expands to about 600K when uncompressed. An HTML formatted version has been posted at http://www.nnc.yale.edu/papers/nrnrefs.html This is graced by convenient internal and external links. The paper contains much helpful material that has not appeared elsewhere, and should be of interest both to current users and to those who are deciding what simulation tools to choose. This list of section headings gives an idea of what is covered. 1. INTRODUCTION 1.1 The problem domain 1.2 Experimental advances and quantitative modeling 2. OVERVIEW OF NEURON 3. MATHEMATICAL BASIS 3.1 The cable equation 3.2 Spatial discretization in a biological context: sections and segments 3.3 Integration methods 3.3.1 Efficiency 4. THE NEURON SIMULATION ENVIRONMENT 4.1 The hoc interpreter 4.2 A specific example 4.2.1 First step: establish model topology 4.2.2 Second step: assign anatomical and biophysical properties 4.2.3 Third step: attach stimulating electrodes 4.2.4 Fourth step: control simulation time course 4.3 Section variables 4.4 Range variables 4.5 Specifying geometry: stylized vs. 3-D 4.6 Density mechanisms and point processes 4.7 Graphical interface 4.8 Object-oriented syntax 4.8.1 Neurons 4.8.2 Networks 5. SUMMARY ACKNOWLEDGMENTS REFERENCES --Ted From ted at SPENCER.CTAN.YALE.EDU Tue Jun 10 14:17:45 1997 From: ted at SPENCER.CTAN.YALE.EDU (ted@SPENCER.CTAN.YALE.EDU) Date: Tue, 10 Jun 1997 18:17:45 GMT Subject: NEURON course at SFN 1997 Message-ID: <199706101817.SAA02517@CHARCOT.CTAN.YALE.EDU> In response to yesterday's announcement of record # of abstracts at the upcoming Society for Neuroscience meeting in New Orleans, everybody seems to be making travel arrangements early-- so we'd like to bring something to your attention, just in case it affects your travel plans. John Moore, Michael Hines & I will be presenting a 1 day course on Saturday, Oct. 25, entitled "Using the NEURON Simulation Environment." 
This is a regular Satellite Symposium, and eventually you will read the announcement in SFN's preliminary program. It will run from 9 AM to 5 PM. Coffee breaks and lunch will be provided. There will be a registration fee to cover these and related costs, e.g. AV equipment rental and handout materials. This will be in the ballpark of other 1-day courses; the exact figure depends on factors we haven't yet learned from SFN. More information about this course is posted at http://www.neuron.yale.edu/sfn.html and soon also at http://neuron.duke.edu/sfn.html --Ted From at at cogsci.soton.ac.uk Tue Jun 10 12:13:39 1997 From: at at cogsci.soton.ac.uk (Adriaan Tijsseling) Date: Tue, 10 Jun 1997 16:13:39 +0000 Subject: ANN: BackProp Simulator for Macintosh Message-ID: Release: BackBrain version 1.0 Introduction BackBrain is a Macintosh shareware program that simulates the backpropagation type of artificial neural network algorithm. This application lets you create, train, and analyze backprop nets using an intuitive interface. BackBrain also makes it possible to create 3D models of network dynamics. A step-by-step tutorial is available as an Apple Guide document (under the Balloon Help menu). You can also turn on Balloon Help to quickly assess the function of particular commands. System Requirements BackBrain only works on Power Macintoshes with System 7 or later and with the Thread Manager installed. If you have System 7.5 or later, the Thread Manager is already incorporated, but for earlier system versions you need the Thread Manager extension. You will also need QuickDraw 3D 1.5.1 or later and QuickTime. Fees BackBrain is shareware with a value of $25. Site licences will be $10 for each copy. More information can be found in the Readme file.
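[For context, the error-backpropagation algorithm such simulators implement can be sketched in a few lines. The code below is a generic textbook-style two-layer sigmoid network trained on XOR by batch gradient descent; it is not BackBrain's code, and the function name and all parameters are invented for illustration.]

```python
import numpy as np

def train_xor(epochs=5000, lr=0.5, hidden=3, seed=0):
    """Train a tiny sigmoid network on XOR with plain backpropagation."""
    rng = np.random.default_rng(seed)
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])   # inputs
    T = np.array([[0.], [1.], [1.], [0.]])                   # XOR targets
    W1 = rng.normal(0.0, 1.0, (2, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 1.0, (hidden, 1)); b2 = np.zeros(1)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        H = sigmoid(X @ W1 + b1)              # forward pass: hidden activations
        Y = sigmoid(H @ W2 + b2)              # forward pass: network output
        dY = (Y - T) * Y * (1.0 - Y)          # output-layer delta (squared error)
        dH = (dY @ W2.T) * H * (1.0 - H)      # delta propagated back to hidden layer
        W2 -= lr * H.T @ dY; b2 -= lr * dY.sum(axis=0)   # gradient steps
        W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(axis=0)
    mse = float(((Y - T) ** 2).mean())
    return Y, mse
```

[With enough epochs the four outputs typically approach the XOR targets 0, 1, 1, 0; a backprop simulator animates exactly this kind of weight and activation dynamics.]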
Download the program The BackBrain program can be found on the following site: http://www.soton.ac.uk/~agt/mac/ ______________________________________________________________ Cognitive Scientist & Mac Guru Work: Cognitive Sciences Centre / Department of Psychology, University of Southampton, Highfield, SO17 1BJ, Southampton, UK Web-Sites: Cognitive Sciences Centre: http://www.soton.ac.uk/~coglab/coglab/ Ph.D.-Research: http://www.soton.ac.uk/~coglab/coglab/formalme.html Personal WebSite: http://www.soton.ac.uk/~agt/ E-mail: mailto:at at neuro.psy.soton.ac.uk From cas-cns at cns.bu.edu Wed Jun 11 12:44:43 1997 From: cas-cns at cns.bu.edu (CAS/CNS) Date: Wed, 11 Jun 1997 11:44:43 -0500 Subject: Neural Networks: CALL FOR PAPERS Message-ID: <199706111544.LAA29783@cns.bu.edu> *****CALL FOR PAPERS***** 1998 Special Issue of Neural Networks NEURAL CONTROL AND ROBOTICS: BIOLOGY AND TECHNOLOGY Planning and executing movements is of great importance in both biological and mechanical systems. This Special Issue will bring together a broad range of invited and contributed articles that describe progress in understanding the biology and technology of movement control. Movement control covers a wide range of topics, from integration of different types of sensory information, to flexible planning of movements, to generation of motor commands, to compensation for internal and external perturbations. Of particular importance are the coordinate transformations, memory systems, and attentional and volitional mechanisms needed to implement movement control. Neural control is the study of how biological systems have solved these problems with joints, muscles, and brains. Robotics is the attempt to build mechanical systems that can solve these problems under constraints of size, weight, robustness, and cost. This Special Issue welcomes high quality articles from both fields and seeks to explore the possible synergies between them. 
CO-EDITORS: Professor Rodney Brooks, Massachusetts Institute of Technology Professor Stephen Grossberg, Boston University Dr. Lance Optican, National Institutes of Health SUBMISSION: Deadline for submission: October 31, 1997 Notification of acceptance: January 31, 1998 Format: no longer than 10,000 words; APA format ADDRESS FOR SUBMISSION: Professor Stephen Grossberg Boston University Department of Cognitive and Neural Systems 677 Beacon Street Boston, Massachusetts 02215 From suem at soc.plym.ac.uk Wed Jun 11 06:03:35 1997 From: suem at soc.plym.ac.uk (Sue McCabe) Date: Wed, 11 Jun 1997 11:03:35 +0100 Subject: Postdoctoral Research Fellowship Message-ID: <1.5.4.32.19970611100335.006b70f0@soc.plym.ac.uk> Postdoctoral Research Fellowship in Neural Network Modelling of Brain Processes Centre for Neural and Adaptive Systems (http://www.tech.plym.ac.uk/soc/research/neural/ index.html) School of Computing University of Plymouth, UK The Centre for Neural and Adaptive Systems investigates computational neural models of major brain processes underlying intelligent behaviour and uses these models to provide inspiration for the development of novel neural computing architectures and systems for intelligent sensory information processing, condition monitoring, autonomous control and robotics. The Centre carries out collaborative research with leading international experts in the UK, USA and Europe, in both the fields of computational neuroscience and neural computation. A Research Fellow is sought by the Centre to participate in its on-going research programme in the area of neural network modelling of major brain processes, in collaboration with existing researchers in the Centre. Current work in the Centre is concerned with the investigation and development of neural network models of a number of brain regions and their interactions, including the hippocampus, septum, basal ganglia, prefrontal cortex, thalamus and sensory cortex. 
Specific investigations include learning and memory in spatial navigation tasks, streaming and grouping in auditory perception, and attentional set shifting in visual discrimination tasks. Over the next few years we wish to develop our links more strongly with leading experimental neuroscientists in the field in the UK, USA and Europe, and the Research Fellow will be expected to contribute mainly to this objective. He/she will be eligible, and encouraged, to apply for UK Research Council and other grants to support his/her research, and to take on research student supervision. Candidates for the Fellowship will be expected to hold, or be about to receive, a PhD degree either in the neural network modelling of brain processes, or in experimental neuroscience. If the latter, then he/she must have a strong interest in neural modelling. Candidates should also have, or expect to achieve within two years, significant journal or conference publications in their area. A good knowledge of fundamental neuroscience principles is necessary, and candidates must either have good computational modelling skills (we currently use Matlab and C/C++), or be willing to acquire these skills. In addition to carrying out his/her research programme, the Research Fellow will be expected to undertake some teaching duties on the MSc Computational Intelligence programme, to which the Centre makes a major contribution. The salary for the post will be in the range of £18,724 to £22,471 per annum. Further details about the Fellowship can be obtained by telephoning Professor Mike Denham on (+44) (0) 1752 232547, or by e-mail to mdenham at plym.ac.uk.
Dr Sue McCabe School of Computing University of Plymouth Drake Circus Plymouth PL4 8AA email: suem at soc.plym.ac.uk tel: 44 1752 232610 From HECKATHO at a.psc.edu Wed Jun 11 12:22:51 1997 From: HECKATHO at a.psc.edu (HECKATHO@a.psc.edu) Date: Wed, 11 Jun 1997 12:22:51 -0400 Subject: PSC Computational Neuroscience Workshop on the Mbone Message-ID: <970611122251.20222e9d@CPWSCA.PSC.EDU> The Pittsburgh Supercomputing Center will broadcast the "Simulations in Computational Neuroscience" workshop, June 12-14, 1997. We are inviting interested scientists to watch this lecture over the Mbone. Information regarding the reception of the workshop at your remote site can be found at: http://www.psc.edu/training/general/remote.html Agenda, lecture notes and further information at: http://www.psc.edu/biomed/workshops/wk-97/neuro/neural.html From oby at cs.tu-berlin.de Wed Jun 11 15:36:30 1997 From: oby at cs.tu-berlin.de (Klaus Obermayer) Date: Wed, 11 Jun 1997 21:36:30 +0200 (MET DST) Subject: No subject Message-ID: <199706111936.VAA11586@pollux.cs.tu-berlin.de> Dear Connectionists, below please find the announcement of a one-day workshop on computational neuroscience. Participation is free and everybody interested is welcome. Klaus Obermayer ----------------------------------------------------------------------- Berlin Workshop on Computational Neuroscience Time: Wednesday July 9, 1997; 9.15 am - 5.00 pm Location: Lise-Meitner Hoersaal, Institute for Biochemistry, Free University of Berlin, Thielallee 63-67, 14195 Berlin, Germany Sponsor: German Science Foundation, training grant ``Signalling Chains in Living Systems'' (Prof. R. Menzel, PI) 9.15 - 9.30 Opening Remarks 9.30 - 10.20 Dr. Peter Stern, Max Planck Institute for Brain Research, Frankfurt, Clones, channels and kinetics - towards a complete model of the NMDA receptor 10.30 - 11.20 Dr. 
Klaus Pawelzik, Max Planck Institute for Fluid Dynamics, Goettingen, A model for the development and function of long range connections in cortex 11.30 - 12.20 Prof. Dr. Henning Scheich, Institute for Neurobiology, Magdeburg, Functional organisation, plasticity, and language processing in auditory cortex 12.30 - 14.00 Lunch 14.00 - 14.50 Prof. Dr. Werner von Seelen, Institute for Neuroinformatics, University of Bochum, Neural fields for the navigation of robots and for the interpretation of cortical representations 15.00 - 15.50 Prof. Dr. Martin Egelhaaf, Department of Biology, University of Bielefeld, A look into the cockpit of a fly: Neural circuits and visual orientation behavior 16.00 - 16.50 Prof. Dr. Holk Cruse, Department of Biology, University of Bielefeld, Towards an understanding of the control of complex movements: Experiments and simulations 17.00 Adjourn ----------------------------------------------------------------------- For further information please contact: Prof. Klaus Obermayer phone: 49-30-314-73442 FR2-1, KON, Informatik 49-30-314-73120 Technische Universitaet Berlin fax: 49-30-314-73121 Franklinstrasse 28/29 e-mail: oby at cs.tu-berlin.de 10587 Berlin, Germany http://kon.cs.tu-berlin.de/ From yann at research.att.com Wed Jun 11 16:08:42 1997 From: yann at research.att.com (Yann LeCun) Date: Wed, 11 Jun 1997 16:08:42 -0400 Subject: AT&T is moving Message-ID: <199706112007.QAA18443@surfcity.research.att.com> Dear Colleagues: As you may have heard, AT&T split into three companies last year: AT&T, Lucent, and NCR. AT&T Bell Laboratories, home of quite a few NN and machine learning researchers, was split between AT&T and Lucent. The part that went with Lucent kept the name Bell Labs, while the part that stayed with AT&T took the name AT&T Labs-Research. 
Most former AT&T Bell Labs machine learning and neural net researchers are now with AT&T Labs-Research, and are moving from the Holmdel and Murray Hill locations (now owned by Lucent) to new AT&T Labs locations. The following people are moving to the location below: Name phone email Yoshua Bengio (732)345-3335 yoshua at research.att.com Leon Bottou (732)345-3336 leonb at research.att.com Eric Cosatto (732)345-3337 eric at research.att.com Hans-Peter Graf (732)345-3338 hpg at research.att.com Patrick Haffner (732)345-3339 haffner at research.att.com Yann LeCun (732)345-3333 yann at research.att.com Patrice Simard (732)345-3341 patrice at research.att.com Vladimir Vapnik (732)345-3342 vlad at research.att.com Shirley Bradford (732)345-3334 shirley at research.att.com (secretary) Image Processing Services Research Lab AT&T Labs - Research 100 Schulz Drive Red Bank, NJ 07701-7033 USA fax: (732)345-3031 web: http://www.research.att.com/ Other AT&T researchers who have connections with the machine learning and connectionist communities, including John Denker, Harris Drucker, and Larry Jackel, are also moving to the above address. AT&T Labs NN/ML researchers who were previously in Murray Hill, NJ, including Corinna Cortes, Al Gorin, Michael Kearns, Esther Levin, Fernando Pereira, Daryl Pregibon, Mazin Rahim, Rob Schapire, and Yoram Singer, have moved to the following location: AT&T Labs - Research 180 Park Avenue P.O. 
Box 971 Florham Park, NJ 07932-0971 (973)360-8000 ================================================================ Yann LeCun Image Processing Services Research AT&T Labs - Research 100 Schulz Drive Red Bank, NJ 07701-7033 tel: (732)345-3333 fax: (732)345-3031 yann at research.att.com http://www.research.att.com/info/yann From scheler at ICSI.Berkeley.EDU Wed Jun 11 20:33:29 1997 From: scheler at ICSI.Berkeley.EDU (Gabriele Scheler) Date: Wed, 11 Jun 1997 17:33:29 -0700 (PDT) Subject: Preprints available Message-ID: <199706120033.RAA05823@icsib77.ICSI.Berkeley.EDU> Dear connectionists, a number of preprints on neuronal and statistical models of linguistic functions are available from my homepage now. The URL is: http://www.informatik.tu-muenchen.de/~scheler/publications.html Gabriele Scheler Institut fuer Informatik TU Muenchen D 80290 Muenchen present address: ICSI 1947 Center Street Berkeley, Ca. 94704 I include titles, abstracts and bibliographic references for two recent papers: ------------------------------------------------------------------- Scheler, Gabriele and Kerstin Fischer: The many functions of discourse particles: A computational model. to appear in Proceedings of Cognitive Science 1997. ------------------------------------------------------------------- We present a connectionist model for the interpretation of discourse particles in real dialogues that is based on neuronal principles of categorization (categorical perception, prototype formation, contextual interpretation). It can be shown that discourse particles operate just like other morphological and lexical items with respect to interpretation processes. The description proposed locates discourse particles in an elaborate model of communication which incorporates many different aspects of the communicative situation. We therefore also attempt to explore the content of the category discourse particle. 
We present a detailed analysis of the meaning assignment problem and show that 80% - 90% correctness for unseen discourse particles can be reached with the feature analysis provided. Furthermore, we show that `analogical transfer' from one discourse particle to another is facilitated if prototypes are computed and used as the basis for generalization. We conclude that the interpretation processes which are a part of the human cognitive system are very similar with respect to different linguistic items. However, the analysis of discourse particles shows clearly that any explanatory theory of language needs to incorporate a theory of communication processes. ------------------------------------------------------------------- Scheler,G. Feature-based Perception of Semantic Concepts. to appear: Freksa(ed.) Computation and Cognition, Springer 1997. ------------------------------------------------------------------- In this paper we point to some principles of neural computation as they have been derived from experimental and theoretical studies primarily on vision. We argue that these principles are well suited to explain some characteristics of the linguistic function of semantic concept recognition. Computational models built on these principles have been applied to morphological-grammatical categories (aspect), function words (determiners) and discourse particles in spoken language. We suggest a few ways in which these studies may be extended to include more detail on neural functions into the computational model. From ESPAA at soc.plym.ac.uk Thu Jun 12 10:40:07 1997 From: ESPAA at soc.plym.ac.uk (espaa) Date: Thu, 12 Jun 1997 14:40:07 GMT Subject: Pattern Analysis and Applications Message-ID: Springer Verlag Ltd is launching a new journal - Pattern Analysis and Applications (PAA) - in Spring 1998. Aims and Scope of PAA: The journal publishes high quality articles in areas of fundamental research in pattern analysis and applications. 
It aims to provide a forum for original research which describes novel pattern analysis techniques and industrial applications of the current technology. The main aim of the journal is to publish high quality research in intelligent pattern analysis in computer science and engineering. In addition, the journal will also publish articles on pattern analysis applications in medical imaging. The journal solicits articles that detail new technology and methods for pattern recognition and analysis in applied domains including, but not limited to, computer vision and image processing, speech analysis, robotics, multimedia, document analysis, character recognition, knowledge engineering for pattern recognition, fractal analysis, and intelligent control. The journal publishes articles on the use of advanced pattern recognition and analysis methods including statistical techniques, neural networks, genetic algorithms, fuzzy pattern recognition, machine learning, and hardware implementations which are either relevant to the development of pattern analysis as a research area or detail novel pattern analysis applications. Papers proposing new classifier systems or their development, pattern analysis systems for real-time applications, fuzzy and temporal pattern recognition and uncertainty management in applied pattern recognition are particularly solicited. The journal encourages the submission of original case-studies on applied pattern recognition which describe important research in the area. The journal also solicits reviews on novel pattern analysis benchmarks, evaluation of pattern analysis tools, and important research activities at international centres of excellence working in pattern analysis. Audience: Researchers in computer science and engineering. Research and Development Personnel in industry. Researchers/ applications where pattern analysis is used, researchers in the area of novel pattern recognition and analysis techniques and their specific applications. 
Full information about the journal and detailed instructions for the Call for Papers can be found at the PAA web site: http://www.soc.plym.ac.uk/soc/sameer/paa.htm Best regards Barbara Davies School of Computing University of Plymouth From kmb at pac.soton.ac.uk Fri Jun 13 04:05:11 1997 From: kmb at pac.soton.ac.uk (Kevin Bossley) Date: Fri, 13 Jun 1997 09:05:11 +0100 Subject: Neurofuzzy Modelling Thesis Available. Message-ID: <3.0.32.19970613090510.00990530@margaux.pac.soton.ac.uk> The following PhD thesis is now available! =================================================================== "Neurofuzzy Modelling Approaches in System Identification" by Kevin Bossley =================================================================== This can be obtained from: http://www.isis.ecs.soton.ac.uk/pub/theses/kmb:97/thesis.html and for immediate information I have included the abstract below. Abstract System identification is the task of constructing representative models of processes and has become an invaluable tool in many different areas of science and engineering. Due to the inherent complexity of many real world systems the application of traditional techniques is limited. In such instances more sophisticated (so-called intelligent) modelling approaches are required. Neurofuzzy modelling is one such technique, which by integrating the attributes of fuzzy systems and neural networks is ideally suited to system identification. This attractive paradigm combines the well established learning techniques of a particular form of neural network, i.e. generalised linear models, with the transparent knowledge representation of fuzzy systems, thus producing models which possess the ability to learn from real world observations and whose behaviour can be described naturally as a series of linguistic, humanly understandable rules. Unfortunately, the application of these systems is limited to low dimensional problems for which good quality expert knowledge and data are available. 
The work described in this thesis addresses this fundamental problem with neurofuzzy modelling; as a result, algorithms which are less sensitive to the quality of the a priori knowledge and empirical data are developed. The true modelling capabilities of any strategy are heavily reliant on the model's structure, and hence an important (arguably the most important) task is structure identification. Also, due to the curse of dimensionality, in high dimensional problems the size of conventional neurofuzzy models gets prohibitively large. These issues are tackled by the development of automatic neurofuzzy model identification algorithms, which exploit the available expert knowledge and empirical data. To alleviate problems associated with the curse of dimensionality, aid model generalisation and enhance model transparency, parsimonious models are identified. This is achieved by the application of additive and multiplicative neurofuzzy models which exploit structural redundancies found in conventional systems. The developed construction algorithms successfully identify parsimonious models, but as a result of noisy and poorly distributed empirical data, these models can still generalise inadequately. This problem is addressed by the application of Bayesian inferencing techniques, a form of regularisation. Smooth model outputs are assumed and superfluous model parameters are controlled, sufficiently aiding model generalisation and transparency, and data interpolation and extrapolation. By exploiting the structural decomposition of the identified neurofuzzy models, an efficient local method of regularisation is developed. All the methods introduced in this thesis are illustrated on many different examples, including simulated time series, complex functional equations, and multi-dimensional dynamical systems. 
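The regularisation step the abstract describes can be sketched compactly. The following is a hypothetical illustration, not the thesis code: Gaussian basis functions stand in for fuzzy membership functions in a generalised linear model, and a second-difference penalty on the weights plays the role of the smoothness prior. The basis centres, width and penalty strengths are invented for the example.

```python
# Hypothetical sketch of smoothness regularisation in a generalised
# linear model; fuzzy membership functions are approximated here by
# Gaussian basis functions. Not the thesis algorithms.
import numpy as np

rng = np.random.default_rng(1)
centres = np.linspace(0, 1, 12)   # assumed basis (membership) centres
width = 0.08                      # assumed basis width

def design(x):
    # One column per basis function, one row per sample.
    return np.exp(-(x[:, None] - centres[None, :]) ** 2 / (2 * width ** 2))

def fit(x, y, lam):
    """Penalised least squares: w = (Phi'Phi + lam D'D)^-1 Phi'y,
    where D takes second differences of adjacent basis weights."""
    phi = design(x)
    D = np.diff(np.eye(len(centres)), n=2, axis=0)  # (10, 12) operator
    return np.linalg.solve(phi.T @ phi + lam * (D.T @ D), phi.T @ y)

# Noisy samples of a smooth target function.
x = rng.uniform(0, 1, 40)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(40)

x_test = np.linspace(0.05, 0.95, 50)
for lam in (1e-6, 1e-2):
    w = fit(x, y, lam)
    err = np.mean((design(x_test) @ w - np.sin(2 * np.pi * x_test)) ** 2)
    print(f"lambda={lam:g}  test MSE={err:.3f}")
```

Increasing lam trades data fit for a smoother weight profile, which is the generalisation/transparency trade-off the abstract refers to.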
For many of these problems conventional neurofuzzy modelling is unsuitable, and the developed techniques have extended the range of problems to which neurofuzzy modelling can successfully be applied. Best Regards Kevin Bossley --- Real Time Systems Group Parallel Applications Centre 2 Venture Road Chilworth Southampton SO16 7NP http://www.pac.soton.ac.uk/ Tel : +44 (0) 1703 760834 Fax : +44 (0) 1703 760833 From terry at salk.edu Fri Jun 13 12:30:32 1997 From: terry at salk.edu (Terry Sejnowski) Date: Fri, 13 Jun 1997 09:30:32 -0700 (PDT) Subject: NEURAL COMPUTATION 9:5 Message-ID: <199706131630.JAA17514@helmholtz.salk.edu> Neural Computation - Contents - Volume 9, Number 5 - July 1, 1997 Article Hybrid Learning of Mapping and its Jacobian in Multilayer Neural Networks Jeong-Woo Lee and Jun-Ho Oh Notes The Joint Development of Orientation and Ocular Dominance: Role of Constraints Christian Piepenbrock, Helge Ritter and Klaus Obermayer Physiological Gain Leads to High ISI Variability in a Simple Model of a Cortical Regular Spiking Cell Todd W. Troyer and Kenneth D. Miller Letters Role of Temporal Integration and Fluctuation Detection in the Highly Irregular Firing of a Leaky Integrator Neuron Model with Partial Reset Guido Bugmann, Chris Christodoulou and John G. Taylor Shunting Inhibition Does Not Have a Divisive Effect on Firing Rates Gary R. Holt and Christof Koch Reduction of the Hodgkin-Huxley Equations to a Single-Variable Threshold Model Werner M. Kistler, Wulfram Gerstner and J. Leo van Hemmen Noise Adaptation in Integrate-and-Fire Neurons Michael E. Rudd and Lawrence G. Brown Paradigmatic Working Memory (Attractor) Cell in IT Cortex Daniel J. Amit, Stefano Fusi, and Volodya Yakovlev Noise Injection: Theoretical Prospects Yves Grandvalet, Stephane Canu and Stephane Boucheron The Faulty Behavior of Feedforward Neural Networks with Hard-Limiting Activation Function Zhiyu Tian, Ting-Ting Y. 
Lin, Shiyuan Yang and Shibai Tong Analysis of Dynamical Recognizers Alan D. Blair and Jordan B. Pollack A Bound on the Error of Cross Validation Using the Approximation and Estimation Rates, with Consequences for the Training-Test Split Michael Kearns Averaging Regularized Estimators Michiaki Taniguchi and Volker Tresp ----- ABSTRACTS - http://mitpress.mit.edu/journal-home.tcl?issn=08997667 SUBSCRIPTIONS - 1997 - VOLUME 9 - 8 ISSUES ______ $50 Student and Retired ______ $78 Individual ______ $250 Institution Add $28 for postage and handling outside USA (+7% GST for Canada). (Back issues from Volumes 1-8 are regularly available for $28 each to institutions and $14 each for individuals. Add $5 for postage per issue outside USA, +7% GST for Canada.) mitpress-orders at mit.edu MIT Press Journals, 55 Hayward Street, Cambridge, MA 02142. Tel: (617) 253-2889 FAX: (617) 258-6779 ----- From bishopc at helios.aston.ac.uk Sat Jun 14 09:03:06 1997 From: bishopc at helios.aston.ac.uk (Prof. Chris Bishop) Date: Sat, 14 Jun 1997 14:03:06 +0100 Subject: Paper on mixtures of probabilistic PCA Message-ID: <186.199706141303@sun.aston.ac.uk> Mixtures of Probabilistic Principal Component Analysers ======================================================= Michael E. Tipping and Christopher M. Bishop Neural Computing Research Group Dept. of Computer Science & Applied Mathematics Aston University, Birmingham B4 7ET Technical Report NCRG/97/003 Submitted to Neural Computation Abstract -------- Principal component analysis (PCA) is one of the most popular techniques for processing, compressing and visualising data, although its effectiveness is limited by its global linearity. While nonlinear variants of PCA have been proposed, an alternative paradigm is to capture data complexity by a combination of local linear PCA projections. However, conventional PCA does not correspond to a probability density, and so there is no unique way to combine PCA models. 
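For a single model, the maximum-likelihood reading of PCA that the report builds on has a closed-form solution via the sample-covariance eigendecomposition. Here is a minimal sketch of that building block (our illustration, not the authors' code; the data dimensions and noise level are invented), assuming the Tipping-Bishop form C = W W' + sigma^2 I:

```python
# Hedged sketch of probabilistic PCA: maximum-likelihood W and
# sigma^2 from the sample-covariance eigendecomposition.
import numpy as np

rng = np.random.default_rng(2)

def ppca_ml(X, q):
    """ML estimates for C = W W' + sigma^2 I with q latent dimensions."""
    S = np.cov(X, rowvar=False)
    vals, vecs = np.linalg.eigh(S)
    vals, vecs = vals[::-1], vecs[:, ::-1]    # descending eigenvalue order
    sigma2 = vals[q:].mean()                  # average discarded variance
    W = vecs[:, :q] * np.sqrt(np.maximum(vals[:q] - sigma2, 0.0))
    return W, sigma2

# Toy data: a 2-D latent subspace embedded in 5-D, plus isotropic noise.
Z = rng.standard_normal((500, 2))
A = rng.standard_normal((2, 5))
X = Z @ A + 0.1 * rng.standard_normal((500, 5))
W, sigma2 = ppca_ml(X - X.mean(0), q=2)
print(W.shape, sigma2)
```

The recovered sigma^2 should be close to the injected noise variance (0.01 here); in the paper's mixture, one such local model is fitted per component inside EM.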
Previous attempts to formulate mixture models for PCA have therefore to some extent been ad hoc. In this paper, PCA is formulated within a maximum-likelihood framework, based on a specific form of Gaussian latent variable model. This leads to a well-defined mixture model for probabilistic principal component analysers, whose parameters can be determined using an EM algorithm. We discuss the advantages of this model in the context of clustering, density modelling and local dimensionality reduction, and we demonstrate its application to image compression and handwritten digit recognition. Available as a postscript file: http://neural-server.aston.ac.uk/Papers/postscript/NCRG_97_003.ps.Z To access other publications by the Neural Computing Research Group, go to the group home page: http://www.ncrg.aston.ac.uk/ and click on `Publications' -- you can then obtain a list of all online NCRG publications, or search by author, title or abstract. From devries at peanut.sarnoff.com Mon Jun 16 15:39:19 1997 From: devries at peanut.sarnoff.com (Aalbert De Vries x2456) Date: Mon, 16 Jun 1997 15:39:19 -0400 Subject: NNSP97 workshop reminder Message-ID: <199706161939.PAA02240@fire.sarnoff.com> WORKSHOP REMINDER ----------------- 1997 IEEE Workshop on Neural Networks for Signal Processing 24-26 September 1997 Amelia Island Plantation, Florida ********************************************** * Advanced registration: before July 1, 1997 * ********************************************** Special Sessions: * BLIND SIGNAL PROCESSING * APPLICATIONS OF NEURAL NETWORKS TO BIOMEDICAL SIGNAL PROCESSING Invited Speakers: Dr. Simon Haykin, Chaos, Radar Clutter, and Neural Networks Dr. David Brown, Neural Networks for Medical Image Processing Dr. Yann LeCun, Neural Networks and Gradient-Based Learning in OCR Dr. Jean-Francois Cardoso, Blind Separation of Noisy Mixtures Dr. S.Y. Kung, Neural Networks for Intelligent Multimedia Processing Further Information Local Organizer Ms. 
Sharon Bosarge Telephone: 352-392-2585 Fax: 352-392-0044 e-mail: sharon at ee1.ee.ufl.edu World Wide Web http://www.cnel.ufl.edu/nnsp97/ (The link to this page may be slow, so please be patient) Organization General Chairs Lee Giles (giles at research.nj.nec.com), NEC Research Nelson Morgan (morgan at icsi.berkeley.edu), UC Berkeley Proceeding Chair Elizabeth J. Wilson (bwilson at ed.ray.com), Raytheon Co. Publicity Chair Bert DeVries (bdevries at sarnoff.com), David Sarnoff Research Center Program Chair Jose Principe (principe at synapse.ee.ufl.edu), University of Florida BIO Session Chair Tulay Adali (adali at engr.umbc.edu), University of Maryland Baltimore County Program Committee Les ATLAS Charles BACHMANN Andrew BACK A. CONSTANTINIDES Federico GIROSI Lars Kai HANSEN Allen GORIN Yu-Hen HU Jenq-Neng HWANG Biing-Hwang JUANG Shigeru KATAGIRI Gary KUHN Sun-Yuan KUNG Richard LIPPMANN John MAKHOUL Elias MANOLAKOS Erkki OJA Tomaso POGGIO Tulay ADALI Volker TRESP John SORENSEN Takao WATANABE Raymond WATROUS Andreas WEIGEND Christian WELLEKENS About Amelia Island Plantation Amelia Island is in the extreme northeast Florida, across the St. Mary's river. The island is just 29 miles from Jacksonville International Airport, which is served by all major airlines. Amelia Island Plantation is a 1,250 acre resort/paradise that offers something for every traveler. The Plantation offers 33,000 square feet of workable meeting space and a staff dedicated to providing an efficient, yet relaxed atmosphere. The many amenities of the Plantation include 45 holes of championship golf, 23 Har-Tru tennis courts, modern fitness facilities, an award winning children's program, more than 7 miles of flora-filled bike and jogging trails, 21 swimming pools, diverse accommodations, exquisite dining opportunities, and of course, miles of glistening Atlantic beach front. 
From tirthank at mpce.mq.edu.au Mon Jun 16 23:23:12 1997 From: tirthank at mpce.mq.edu.au (Tirthankar Raychaudhuri) Date: Tue, 17 Jun 1997 13:23:12 +1000 (EST) Subject: PhD thesis available Message-ID: <199706170323.NAA02315@krakatoa.mpce.mq.edu.au> The following PhD thesis is now electronically available ftp://ftp.mpce.mq.edu.au/pub/comp/papers/raychaudhuri.phd_thesis.97.ps (2581 k) ftp://ftp.mpce.mq.edu.au/pub/comp/papers/raychaudhuri.phd_thesis.97.ps.Z (848 k) SEEKING THE VALUABLE DOMAIN - QUERY LEARNING IN A COST-OPTIMAL PERSPECTIVE by Tirthankar RayChaudhuri School of Mathematics, Physics, Computing and Electronics, Macquarie University, Sydney, NSW 2109 AUSTRALIA tirthank at mpce.mq.edu.au http://www.comp.mq.edu.au/~tirthank Abstract -------- Purpose This thesis is intended as a contribution to the theories and principles of active learning and dual control. It is an account of an investigation which essentially examines the problem of how a machine learning system can be designed so as to acquire useful data actively, i.e., with environment interaction and simultaneously perform cost-effectively. The primary aim of the research has been to evolve the general basis for developing an engineering solution for a self-designing system of the future. In other words, while developed theory and experimental investigations to substantiate the theory are presented in this dissertation, the learning system used herein is based on certain specific assumptions with a view to achieving a balance between extremely general theoretical abstraction and a practical method that addresses clearly-defined problem parameters. The outcome is a strategy of `learning while performing' - consisting of a set of algorithms directed to addressing a real world scenario such as an industrial manufacturing plant from which continuous data is available. Content The basic learning model used is the neural network function approximator trained by a supervised method. 
The overall theoretical framework is a system that is endowed with the capability of asking questions (querying) in order to obtain the most useful information from an environment such as a producing plant. Instead of modeling system behaviour with a state-space description, the emphasis is on significant input-output data gathered from a system with unknown internal behaviour (black-box). A locally-optimal or `greedy' strategy is used. Although a general theory for multidimensional systems has been developed, the experimental studies are concentrated within a finite boundary, two-dimensional, continuous data domain. This is primarily for the purpose of easy visualisation of results and computational efficiency. The basic principle propounded is that the notion of dollar value of a particular output ought to be incorporated within the querying criterion. In the process of developing such a querying criterion, a novel method of data subsampling/statistical jack-knifing is applied to Seung and Freund's query-by-committee philosophy. This technique is demonstrated to be highly effective in gathering both significant and adequate data samples for accurately modeling functions with sharp transition points and nonlinear functions. An extension of the strategy for addressing noisy functions leads to derivation of an estimate of the distribution of the true label (output) for a particular input of the black-box. While such a distribution is closely related to the developed querying criterion for information gain, it is taken further ahead in the investigation and used in defining the expected value of a label as an exploitation objective - in a cost-benefit analysis perspective. A confidence interval on this expected value is then analytically derived. The size of the confidence interval determines the amount of exploration by the learner. 
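The subsampling/jack-knifing idea just described can be sketched in a few lines. This is a loose illustration of query-by-committee under invented settings (polynomial committee members, a toy sharp-transition target), not the thesis implementation:

```python
# Hypothetical sketch of query-by-committee via data subsampling:
# committee members are fitted on jack-knifed subsamples, and the
# next query is the pool point where they disagree most.
import numpy as np

rng = np.random.default_rng(0)

def fit_member(x, y, deg=3):
    # Least-squares polynomial fit acts as one committee member.
    return np.polyfit(x, y, deg)

def query_by_committee(x_pool, x_seen, y_seen, n_members=10):
    """Return the pool point with the largest committee variance."""
    n = len(x_seen)
    members = []
    for _ in range(n_members):
        # Jack-knife style subsample: leave out a random fifth of the data
        # (but keep at least deg+1 points so the fit is determined).
        idx = rng.choice(n, size=max(4, n - n // 5), replace=False)
        members.append(fit_member(x_seen[idx], y_seen[idx]))
    preds = np.array([np.polyval(m, x_pool) for m in members])
    disagreement = preds.var(axis=0)
    return x_pool[np.argmax(disagreement)]

# Toy target with a sharp transition, the kind of function the
# abstract says this strategy handles well.
f = lambda x: np.tanh(10 * (x - 0.5))
x_seen = rng.uniform(0, 1, 8)
y_seen = f(x_seen)
x_pool = np.linspace(0, 1, 101)
x_next = query_by_committee(x_pool, x_seen, y_seen)
print(float(x_next))
```

The thesis goes further by weighting this information-gain criterion with the dollar value of a label; the sketch above covers only the disagreement-driven querying step.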
Experimental studies with unimodal output value characteristics reveal differences in the performance of the proposed method upon monotonic and non-monotonic environments. Such performance is mostly based upon a long-term measure of yield experimentally computed within a particular finite querying boundary. Two overall approaches to examining the problem are studied - one uses the estimated distribution of a label to compute its expected value (called IMDV: indirectly modeled dollar value), the other estimates expected values by applying the jack-knifed committee method directly to dollar value labels and is called a DMDV (directly modeled dollar value) algorithm. The inherent weakness of the second approach, despite its obvious computational advantage over the other, is pointed out. It is then demonstrated that a theoretically more sound but very computationally expensive strategy of combining the strengths of the IMDV and DMDV approaches can result in a technique known as the combined modeling of dollar values (CMDV). Finally it is discussed how the theoretical premise of a dual controller is inherent in the querying philosophy proposed. This kind of dual control is observer-free and is contrasted with more conventional control methods. -------------------------------------------------------------------------- Dr Tirthankar RayChaudhuri Phone 61 -2 -850-9543 Department of Computing School of MPCE Fax 61 -2 -850-9551 Macquarie University E-mail tirthank at mpce.mq.edu.au Sydney,NSW 2109 WWW http://www-comp.mpce.mq.edu.au/~tirthank From dwang at cis.ohio-state.edu Wed Jun 18 15:02:14 1997 From: dwang at cis.ohio-state.edu (DeLiang Wang) Date: Wed, 18 Jun 1997 15:02:14 -0400 (EDT) Subject: Tech rep on medical image segmentation Message-ID: <199706181902.PAA27781@shirt.cis.ohio-state.edu> The following technical report is available on medical image segmentation. 
Comments and suggestions are welcome. __________________________________________________________________________ -------------------------------------------------------------------------- Segmentation of Medical Images Using LEGION Technical Report: OSU-CISRC-4/97-TR26, 1997 Naeem Shareef, DeLiang L. Wang and Roni Yagel Department of Computer and Information Science The Ohio State University Columbus, Ohio 43210, USA ABSTRACT Advances in computer and visualization technology allow clinicians to virtually interact with anatomical structures contained within sampled medical image datasets. A hindrance to the effective use of this technology is the problem of image segmentation. In this paper, we utilize a recently proposed oscillator network called LEGION (Locally Excitatory Globally Inhibitory Oscillator Network), whose ability to achieve fast synchrony with local excitation and desynchrony with global inhibition makes it an effective computational framework for grouping similar features and segregating dissimilar ones in an image. We describe an image segmentation algorithm derived from LEGION and introduce an adaptive scheme to choose tolerances. We show results of applying the algorithm to 2D and 3D (volume) CT and MRI medical image datasets, and compare them with manual segmentation. LEGION's computational and architectural properties make it a promising approach for real-time medical image segmentation using multiple features. 
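LEGION groups pixels through oscillator synchrony, but its end result on an intensity image is loosely analogous to tolerance-based region growing. As a very rough, hypothetical illustration of that grouping outcome only (not the LEGION dynamics; the image and tolerance are invented):

```python
# Loose stand-in for synchrony-based grouping: 4-connected pixels
# whose intensities differ by at most a tolerance are merged into
# one segment. LEGION reaches a comparable grouping via oscillators.
from collections import deque

def segment(image, tol):
    """Label 4-connected regions of similar intensity via flood fill."""
    h, w = len(image), len(image[0])
    labels = [[0] * w for _ in range(h)]
    next_label = 0
    for si in range(h):
        for sj in range(w):
            if labels[si][sj]:
                continue
            next_label += 1
            labels[si][sj] = next_label
            q = deque([(si, sj)])
            while q:
                i, j = q.popleft()
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    # Tolerance is relative to the neighbouring pixel,
                    # so regions can drift slowly in intensity.
                    if (0 <= ni < h and 0 <= nj < w and not labels[ni][nj]
                            and abs(image[ni][nj] - image[i][j]) <= tol):
                        labels[ni][nj] = next_label
                        q.append((ni, nj))
    return labels, next_label

img = [[10, 11, 50, 51],
       [10, 12, 52, 50],
       [90, 90, 51, 50]]
labels, n = segment(img, tol=3)
print(n)  # three regions: the dark patch, the bright patch, the two 90s
```

The report's adaptive tolerance scheme corresponds to choosing tol per region rather than globally as done here.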
A Postscript version of the report is available via ftp at FTP-host : ftp.cis.ohio-state.edu FTP-pathname: pub/tech-report/1997/TR26-DIR/ OR an HTML version can be found at HTML version: http://www.cis.ohio-state.edu/~shareef/OSUTR26/journal-1.html __________________________________________________________________________ -------------------------------------------------------------------------- From ESPAA at soc.plym.ac.uk Thu Jun 19 13:04:07 1997 From: ESPAA at soc.plym.ac.uk (espaa) Date: Thu, 19 Jun 1997 17:04:07 GMT Subject: PATTERN ANALYSIS AND APPLICATIONS JOURNAL Message-ID: C A L L F O R P A P E R S PATTERN ANALYSIS AND APPLICATIONS http://www.soc.plym.ac.uk/soc/sameer/paa.htm (SPRINGER VERLAG LIMITED) Springer-Verlag is starting the above journal in Spring 1998. The journal now invites papers on a number of topics in pattern analysis and applications. The Call for Papers may be printed out from the journal web site where paper submission instructions are also available. Attached below is the aims and scope of the journal. -------------- AIMS AND SCOPE The journal publishes high quality articles in areas of fundamental research in pattern analysis and applications. It aims to provide a forum for original research which describes novel pattern analysis techniques and industrial applications of the current technology. The main aim of the journal is to publish high quality research in intelligent pattern analysis in computer science and engineering. In addition, the journal will also publish articles on pattern analysis applications in medical imaging. The journal solicits articles that detail new technology and methods for pattern recognition and analysis in applied domains including, but not limited to, computer vision and image processing, speech analysis, robotics, multimedia, document analysis, character recognition, knowledge engineering for pattern recognition, fractal analysis, and intelligent control. 
The journal publishes articles on the use of advanced pattern recognition and analysis methods including statistical techniques, neural networks, genetic algorithms, fuzzy pattern recognition, machine learning, and hardware implementations which are either relevant to the development of pattern analysis as a research area or detail novel pattern analysis applications. Papers proposing new classifier systems or their development, pattern analysis systems for real-time applications, fuzzy and temporal pattern recognition and uncertainty management in applied pattern recognition are particularly solicited. The journal encourages the submission of original case-studies on applied pattern recognition which describe important research in the area. The journal also solicits reviews on novel pattern analysis benchmarks, evaluation of pattern analysis tools, and important research activities at international centres of excellence working in pattern analysis. AUDIENCE --------- Researchers in computer science and engineering. Research and Development Personnel in industry. Researchers/ applications where pattern analysis is used, researchers in the area of novel pattern recognition and analysis techniques and their specific applications. From Yves.Moreau at esat.kuleuven.ac.be Fri Jun 20 10:16:21 1997 From: Yves.Moreau at esat.kuleuven.ac.be (Yves Moreau) Date: Fri, 20 Jun 1997 16:16:21 +0200 Subject: TR: Dynamics of recurrent networks with and without hidden layer Message-ID: <33AA90B5.164E@esat.kuleuven.ac.be> The following technical report is available via ftp or the World Wide Web: WHEN DO DYNAMICAL NEURAL NETWORKS WITH AND WITHOUT HIDDEN LAYER HAVE IDENTICAL BEHAVIOR? 
Yves Moreau and Joos Vandewalle K.U.Leuven ESAT-SISTA K.U.Leuven, Elektrotechniek-ESAT, Technical report ESAT-SISTA TR97-51 To get it from the World Wide Web, point your browser to: ftp://ftp.esat.kuleuven.ac.be/pub/SISTA/moreau/reports/rnn_equivalence_tr97-51.ps To get it via FTP: ftp ftp.esat.kuleuven.ac.be cd pub/SISTA/moreau/reports get rnn_equivalence_tr97-51.ps ABSTRACT ======== We present conditions for the input-output equivalence between dynamical neural networks with a hidden layer and dynamical neural networks without hidden layer. We achieve this result by proving a general result on transdimensional changes of coordinates for dynamical neural networks with a hidden layer. This result is of interest because dynamical neural networks without hidden layer are more amenable to analytical studies. Keywords: dynamical neural network, input-output equivalence, transdimensional change of coordinates -------------------- Yves Moreau Department of Electrical Engineering Katholieke Universiteit Leuven Leuven, Belgium email: moreau at esat.kuleuven.ac.be homepage: http://www.esat.kuleuven.ac.be/~moreau publications: http://www.esat.kuleuven.ac.be/~moreau/publication_list.html From suem at soc.plym.ac.uk Fri Jun 20 11:29:44 1997 From: suem at soc.plym.ac.uk (Sue McCabe) Date: Fri, 20 Jun 1997 16:29:44 +0100 Subject: Lectureship/Senior Lectureship in Artificial Intelligence Message-ID: <1.5.4.32.19970620152944.00b649b4@soc.plym.ac.uk> Lectureship/Senior Lectureship in Artificial Intelligence School of Computing, University of Plymouth, UK The School of Computing invites applications for a lectureship in the field of Artificial Intelligence, from persons with a specialist interest in the application of computational intelligence techniques (neural computing, evolutionary computing) in one or more of the following areas: multimedia, agent-based systems, financial systems.
The successful candidate will be expected to develop new courses at both undergraduate and postgraduate level in one or more of the above topics, and to carry out research, including initiating new projects, in collaboration with existing research groups in the School, in particular the Centre for Neural and Adaptive Systems (http://www.tech.plym.ac.uk/soc/research/neural/index.html). Candidates must as a minimum requirement possess, or be about to obtain, a PhD in a related area. Relevant postdoctoral experience and an established research publication record are desirable, but not essential. Applications from more senior academics/researchers are also invited and this is reflected in the salary range for the post, which is £13480 - £32966 p.a. (Lecturer/Senior Lecturer/Principal Lecturer/ Reader). Appointment will be at an appropriate level to reflect qualifications and experience. For further information on the post and on how to make formal application, please initially contact Professor Mike Denham by telephoning (+44) (0) 1752 232547, or by e-mail to mdenham at plym.ac.uk. Dr Sue McCabe School of Computing University of Plymouth Drake Circus Plymouth PL4 8AA email: suem at soc.plym.ac.uk tel: 44 1752 232610 From lbl at nagoya.bmc.riken.go.jp Tue Jun 24 04:53:37 1997 From: lbl at nagoya.bmc.riken.go.jp (Bao-Liang Lu) Date: Tue, 24 Jun 1997 17:53:37 +0900 Subject: Paper available: Task Decomposition and Modular Neural Net Message-ID: <9706240853.AA27776@xian> The following paper, which appeared in Lecture Notes in Computer Science, vol. 1240, 1997, Springer, is available via anonymous FTP.
(This work was presented at the International Work-Conference on Artificial and Natural Neural Networks (IWANN'97), 4-6 June 1997, Lanzarote, Canary Islands, Spain) FTP-host: ftp.bmc.riken.go.jp FTP-file: pub/publish/Lu/lu-iwann97.ps.gz ========================================================================== TITLE: Task Decomposition Based on Class Relations: A Modular Neural Network Architecture for Pattern Classification AUTHORS: Bao-Liang Lu Masami Ito ORGANISATIONS: Bio-Mimetic Control Research Center, The Institute of Physical and Chemical Research (RIKEN) ABSTRACT: In this paper, we propose a new methodology for decomposing pattern classification problems based on the class relations among training data. We also propose two combination principles for integrating individual modules to solve the original problem. By using the decomposition methodology, we can divide a $K$-class classification problem into ${K\choose 2}$ relatively smaller two-class classification problems. If the two-class problems are still hard to learn, we can further break them down into a set of smaller and simpler two-class problems. Each of the two-class problems can be learned by a modular network independently. After learning, we can easily integrate all of the modules according to the combination principles to get the solution of the original problem.
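As an illustration of the pairwise decomposition idea, here is a minimal sketch of a one-vs-one decomposition with a min-max style combination. It is a sketch under assumptions, not the paper's method: `train_pairwise`, `fit_threshold` and `predict_min_max` are illustrative names, and a toy 1-D threshold learner stands in for the modular networks.

```python
from itertools import combinations

def train_pairwise(X, y, fit_binary):
    """Train one binary module per unordered class pair (i, j)."""
    classes = sorted(set(y))
    modules = {}
    for i, j in combinations(classes, 2):
        Xij = [x for x, t in zip(X, y) if t in (i, j)]
        yij = [1.0 if t == i else 0.0 for t in y if t in (i, j)]
        modules[(i, j)] = fit_binary(Xij, yij)
    return classes, modules

def fit_threshold(X, y):
    """Toy 1-D learner: threshold at the midpoint of the two class means."""
    pos = [x for x, t in zip(X, y) if t == 1.0]
    neg = [x for x, t in zip(X, y) if t == 0.0]
    mid = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2.0
    sign = 1.0 if sum(pos) / len(pos) > mid else -1.0
    return lambda x: 1.0 if sign * (x - mid) > 0 else 0.0

def predict_min_max(x, classes, modules):
    """Combine modules: MIN over each class's modules, then MAX across classes."""
    scores = {}
    for c in classes:
        outs = []
        for (i, j), m in modules.items():
            if c == i:
                outs.append(m(x))        # module's vote for class i
            elif c == j:
                outs.append(1.0 - m(x))  # module's vote for class j
        scores[c] = min(outs)
    return max(scores, key=scores.get)

# Three well-separated 1-D classes
X = [0, 1, 2, 10, 11, 12, 20, 21, 22]
y = [0, 0, 0, 1, 1, 1, 2, 2, 2]
classes, modules = train_pairwise(X, y, fit_threshold)
```

Each pairwise module sees only the data of its two classes, so the ${K\choose 2}$ problems can be trained independently and in parallel, which is the point of the decomposition.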
Consequently, a $K$-class classification problem can be solved effortlessly by learning a set of smaller and simpler two-class classification problems in parallel. (10 pages) Bao-Liang Lu ====================================== Bio-Mimetic Control Research Center The Institute of Physical and Chemical Research (RIKEN) Anagahora, Shimoshidami, Moriyama-ku Nagoya 463, Japan Tel: +81-52-736-5870 Fax: +81-52-736-5871 Email: lbl at bmc.riken.go.jp From bs at corn.mpik-tueb.mpg.de Tue Jun 24 07:31:37 1997 From: bs at corn.mpik-tueb.mpg.de (Bernhard Schoelkopf) Date: Tue, 24 Jun 1997 13:31:37 +0200 Subject: Support Vector Workshop Message-ID: <199706241131.NAA08477@mpik-tueb.mpg.de> ________________________________________________________________ Call for Contributions NIPS'97 Workshop on Support Vector Learning Machines ________________________________________________________________ The Support Vector (SV) learning algorithm (Boser, Guyon, Vapnik, 1992; Cortes, Vapnik, 1995; Vapnik, 1995) provides a general method for solving Pattern Recognition, Regression Estimation and Operator Inversion problems. The method is based on results in the theory of learning with finite sample sizes. The last few years have witnessed an increasing interest in SV machines, due largely to excellent results in pattern recognition, regression estimation and time series prediction experiments. The purpose of this workshop is (1) to provide an overview of recent developments in SV machines, ranging from theoretical results to applications, (2) to explore connections with other methods, and (3) to identify weaknesses, strengths and directions for future research for SVMs. We invite contributions on SV machines and related approaches, looking for empirical support wherever possible. Topics of interest to the workshop include: SV Applications; Benchmarks; SV Optimization and implementation issues; Theory of generalization and regularization; Learning methods based on Hilbert-Schmidt kernels (e.g.
kernel PCA); Links to related methods and concepts (e.g. boosting, fat shattering); Representation of functions in SV machines (e.g. splines, ANOVA). The workshop will be held in Breckenridge, Colorado, on December 5 or 6, 1997. We are in the process of putting together a tentative schedule. If you are interested in contributing, please contact us (mailto:smola at first.gmd.de, or fax to: +49-30-6392-1805, A. Smola). Submission of papers is not required for the workshop. We would, however, appreciate a brief description of your envisaged talk. As one of the workshop's foci will be discussions, presentation of recent and/or controversial work is encouraged. A workshop home page has been set up at http://svm.first.gmd.de. Additional information on SV research can be found at http://svm.research.bell-labs.com/ (Bell Labs SV services) and http://www.mpik-tueb.mpg.de/people/personal/bs/svm.html (annotated SV bibliography). Organizers: Leon Bottou (AT&T Research, leonb at research.att.com) Chris Burges (Bell Labs, Lucent Technologies, burges at bell-labs.com) Bernhard Schoelkopf (Max Planck Institute at Tuebingen, bs at mpik-tueb.mpg.de) Alex Smola (Technical University/GMD Berlin, smola at first.gmd.de) -- bernhard schoelkopf mailto:bs at mpik-tueb.mpg.de max-planck-institut fuer biologische kybernetik spemannstr.38, 72076 tuebingen, germany phone +49 7071 601-609, fax -616 http://www.mpik-tueb.mpg.de/people/personal/bs/bs.html From Leslie.Smith at ee.ed.ac.uk Tue Jun 24 09:24:07 1997 From: Leslie.Smith at ee.ed.ac.uk (Leslie S Smith) Date: Tue, 24 Jun 1997 14:24:07 +0100 (BST) Subject: Call for Participation: 1st European Workshop on Neuromorphic Systems Message-ID: <199706241324.OAA24298@forbes.ee.ed.ac.uk> Call for Participation. (apologies if you receive this more than once) 1st European Workshop on Neuromorphic Systems (EWNS1) 29-31 August 1997. University of Stirling, Stirling, Scotland.
www page: http://www.cs.stir.ac.uk/~lss/Neuromorphic/Info1.html Organisers: Dept of Computing Science, University of Stirling Dept of Electrical Engineering, University of Edinburgh Neuromorphic systems are implementations in silicon of sensory and neural systems whose architecture and design are based on neurobiology. This growing area proffers exciting possibilities, such as sensory systems which can compete with human senses and pattern recognition systems that can run in real time. The area is at the intersection of many disciplines: neurophysiology, computer science and electrical engineering. The purpose of this meeting is to bring together active researchers who can discuss the problems of this developing technology. The meeting is intended to bring together researchers who might not normally meet, for example, engineers who want to implement systems based on neurobiology, and neurobiologists who want to produce engineering implementations of systems. The University is set on one of the most beautiful campuses in Europe, and is centrally located in Scotland. It is easily reached by plane (to Glasgow or Edinburgh, both 1 hour away), train, bus, or car. We are grateful to the Gatsby Foundation for their generous support of this workshop. Meeting format: The meeting will be a single-track meeting so that all attendees can go to all the sessions. Accepted papers will be presented (presentations of about 30 minutes plus time for discussion), and there will be paper sessions and time for discussion each day. Programme The invited speakers are: Professor Rodney Douglas, Institute of Neuroinformatics, ETHZ, University of Zurich, Zurich, Switzerland. Professor Ralph Etienne-Cummings, Dept of Electrical Engineering, Southern Illinois University, USA. The provisional programme is to be found at http://www.cs.stir.ac.uk/~lss/Neuromorphic/ProvProg.html May I extend my thanks to all who submitted papers.
Book of the Conference: We have now agreed with World Scientific to produce a book from the conference. We plan to ask speakers to produce camera-ready copy soon after the meeting. The provisional title for the book is Neuromorphic Systems: Engineering Silicon from Neurobiology. Organising Committee: Leslie S. Smith, Department of Computing Science and Mathematics, University of Stirling. Alister Hamilton, Department of Electrical Engineering, University of Edinburgh. Program Committee Jim Austin (Department of Computer Science, University of York, UK) Guy Brown (Department of Computing Science, Univ of Sheffield, UK) Rodney Douglas (ETHZ, University of Zurich, Zurich, Switzerland) Wulfram Gerstner (EPFL Lausanne, Switzerland) Alister Hamilton (Department of Electrical Engineering, University of Edinburgh, Scotland) John Lazzaro (CS Division, University of California at Berkeley) Wolfgang Maass (Institute for Theoretical Computer Science, T.U. Graz, Austria) Alan Murray (Dept of Electrical Engineering, Univ of Edinburgh, Scotland) Leslie Smith (Department of Computing Science and Mathematics, University of Stirling, Scotland) Eric Vittoz (EPFL, Lausanne) Barbara Webb (Dept of Psychology, Univ of Nottingham, UK) Misha Mahowald (ETHZ, Univ of Zurich, Switzerland) had agreed to be on the committee before her untimely death. Registration Registration (includes lunches) UK Pounds 100 Student Registration (includes lunches) UK Pounds 25 Conference Dinner UK Pounds 25 Accommodation: Bed and Breakfast per night UK Pounds 20 The accommodation above is in nearby student residences. There is plenty of higher-quality accommodation available: contact Stirling tourist office (+44) 1786 475019. Money is available to support a limited number of UK Ph.D. students who wish to attend the conference. Please contact Leslie Smith (lss at cs.stir.ac.uk) for further details.
Registration Form: If you would like to register, please print and fill out the form at http://www.cs.stir.ac.uk/~lss/Neuromorphic/RegForm.html. Leslie Smith, Department of Computing Science, University of Stirling, Stirling FK9 4LA, Scotland lss at cs.stir.ac.uk Fax: (44) 1786 464551 Phone (44) 1786 467435 From D.Mareschal at exeter.ac.uk Wed Jun 25 07:41:30 1997 From: D.Mareschal at exeter.ac.uk (Denis Mareschal) Date: Wed, 25 Jun 1997 12:41:30 +0100 Subject: Lectureship in Cognitive Psychology Message-ID: Readers of this list with interests in language, development, and other aspects of cognitive psychology may be interested in the following job announcement. cheers, Denis UNIVERSITY OF EXETER: DEPARTMENT OF PSYCHOLOGY LECTURESHIP IN PSYCHOLOGY Applications are invited for a permanent lectureship which will be available from 1st October 1997, though a later start date (January or April 1998) can be arranged, as the result of the early retirement of Dr Robert Brown. The department is seeking people with strong records or great potential who will strengthen our existing research group in cognitive psychology. We are particularly keen that new lecturers should engage in collaborative research with existing staff and develop links between separate areas. Preference will be given to those whose interests are in one or more of: (a) language, particularly including psycholinguistics, cognitive neuropsychology or developmental approaches. (b) neural network modelling (connectionism), particularly its application to cognitive development, language or vision (c) animal cognition The department has a high reputation for its research, having obtained ratings of 5, 4 and 4 in the last three research assessment exercises, and has a good position within the University, having been recently recommended for School status and singled out as a growth area.
The research of the department is organised in three research groups: Cognitive Psychology and Cognitive Science; Social and Economic Psychology; Health and Clinical Psychology. Further details of the Cognitive Psychology and Cognitive Science research group's activities can be found at: http://www.exeter.ac.uk/~nejones/pg/cogpage.htm Teaching duties for the new lecturer will be arranged by the Head of Department and will obviously depend upon the interests of those appointed. It should be noted that in the first two years the new lecturer will have a substantially lighter-than-average teaching and administrative load. The Department enjoys excellent accommodation in an exceptionally attractive location, with a full range of technical facilities and support, including a first-rate audio-visual studio, a recently refurbished animal laboratory and state-of-the-art computing facilities for teaching and research. These include a suite of Macintoshes for undergraduate use, a dedicated suite of Macs and PCs for postgraduate use and a new connectionist/computational modelling laboratory consisting of Unix workstations. The University's 'mainframe' provision is Unix based. All staff have a PC for their own use and the new appointee will be given an allocation to buy new journals and books and an initial equipment budget in order to establish a firm base for their research. In addition to external and University research grants, the Department operates its own internal scheme to offer small 'pump-priming' support for new research initiatives. Further details about the department can be found at our world wide web site http://www.exeter.ac.uk/Psychology/ Informal enquiries (and visits) are encouraged and potential applicants should contact Dr Paul Webley, Head of Department, Department of Psychology, University of Exeter, Exeter EX4 4QG. 
Telephone 01392 264600 or email P.Webley at exeter.ac.uk Further information and application forms are available from Personnel, University of Exeter, Exeter EX4 4QJ, 01392-263100 (answer phone) or email Personnel at exeter.ac.uk, quoting ref. no. 4147. The closing date is Friday 29th August 1997 and it is hoped to carry out interviews during the week beginning September 22nd. The salary will be on the Lecturer A scale, 16,045 - 21,016 pounds p.a. -------------------------------------------------------------- Denis Mareschal Department of Psychology, Washington Singer Laboratories, Exeter University, Perry Road, Exeter, EX4 4QG, UK. Tel: +44 1392 264596, Fax: +44 1392 264623 WWW: http://www.ex.ac.uk/Psychology/staff/dmaresch.htm ============================================================== From aprieto at goliat.ugr.es Wed Jun 25 04:37:10 1997 From: aprieto at goliat.ugr.es (Alberto Prieto) Date: Wed, 25 Jun 1997 10:37:10 +0200 Subject: Summer Course Message-ID: <1.5.4.32.19970625083710.007206ec@goliat.ugr.es> SOME TOPICS RELATED TO INTELLIGENT SYSTEMS http://atc.ugr.es/cursos/cm.html Course organized in cooperation with the Spanish RIG of the IEEE Neural Network Council. Almuñécar (Granada, Spain) 15-19 September 1997 CENTRO MEDITERRANEO DE LA UNIVERSIDAD DE GRANADA XIII Cursos Internacionales 1997 TUTORIALS A. "Introduction. Computational models for intelligent systems" (2 hours) Prof. Dr. Alberto Prieto, Catedrático de Arquitectura y Tecnología de Computadores, Dto. de Electrónica y Tecnología de Computadores, Univ. of Granada, Spain B. "Machine learning" (4 hours) Prof. Dr. Marie Cottrell, Professor, U.F.R. de Mathématiques et d'Informatique, Université de Paris I - Panthéon - Sorbonne, C. "Complex Systems" (4 hours) Dr. Chris Langton, Santa Fe Inst., Los Alamos Nat. Lab., USA D. "Evolutionary Computation: Genetic Algorithms, Applications of Genetic Algorithms, Classifier Systems, and Applications of Classifier Systems" (4 hours) Prof. Dr.
Terry Fogarty, Professor of Computing, Napier Univ., Scotland E. "Unification of neuro-fuzzy systems" (4 hours) Prof. Dr. Leonardo Reyneri, Prof. Dipartimento di Elettronica, Politecnico di Torino, Italy F. "Evolvable Hardware" (4 hours) Pierre Marchal, Centre Suisse d'Electronique et de Microtechnique S.A. (CSEM), Neuchâtel, Switzerland G. "Microelectronics of Intelligent Systems" (3 hours) Prof. Dr.-Ing. Karl Goser, Lehrstuhl für Bauelemente der Elektrotechnik, Univ. Dortmund, Germany ORGANIZATION COMMITTEE Course Directors: Karl GOSER and Alberto PRIETO Coordinator: Juan Julian MERELO (jmerelo at kal-el.ugr.es) More information: http://atc.ugr.es/cursos/cm.html Prof. Dr. Alberto PRIETO Departamento de Electronica y Tecnologia de Computadores Facultad de Ciencias - E.T.S. Ingenieria Informatica Campus Universitario de Fuentenueva Universidad de Granada 18071 GRANADA (Spain) Phone: 34-58- 24 32 26 Fax: 34-58- 24 32 30 E-mail: aprieto at ugr.es http://atc.ugr.es/~aprieto/aprieto.html From S.W.Ellacott at bton.ac.uk Thu Jun 26 06:36:29 1997 From: S.W.Ellacott at bton.ac.uk (S.W.Ellacott@bton.ac.uk) Date: Thu, 26 Jun 1997 11:36:29 +0100 Subject: Two books Message-ID: Dear connectionists I hope this is within the spirit of the list and does not constitute just advertising! If it is, I'm sure the moderators will inform me:-) I would like to draw your attention to two recent books. The first is Mathematics of Neural Networks: Models, Algorithms and Applications Eds. ELLACOTT, S.W., MASON, J.C. and ANDERSON I.J. Boston: Kluwer Academic Press 1997. ISBN 0-7923-9933-1 This is the proceedings of the 1995 MANNA conference now published in book form. There are invited contributions from Allinson, Amari, Cybenko, Grossberg, Hirsch, and Taylor, together with 63 submitted contributions from mathematicians working in the field including many prominent researchers. Overall the book provides an essential snapshot of mathematical research in neural networks.
The second book has been out about a year, but not reported here before. Neural Networks: Deterministic Methods of Analysis ELLACOTT, S.W. and BOSE, D. Thomson International Publishers 1996 This is a comprehensive expository text at final-year undergraduate level for mathematicians, or postgraduate level for other disciplines. It collects together the mathematical fundamentals and relevant results from linear algebra, optimization, dynamical systems and approximation theory as applied to neural networks. Steve Ellacott -- Steve Ellacott, School of Computing and Mathematical Sciences, University of Brighton, Moulsecoomb, BN2 4GJ, UK Tel: Home (01273) 885845 Office: (01273) 642544 or 642414 Fax: Home (01273) 270183 Office: (01273) 642405 WWW: http://www.it.brighton.ac.uk/staff/swe From mmisra at adastra.Mines.EDU Thu Jun 26 20:34:36 1997 From: mmisra at adastra.Mines.EDU (Manavendra Misra) Date: Thu, 26 Jun 1997 18:34:36 -0600 Subject: Paper on Parallel Neural Network implementations in NCS Message-ID: <9706261834.ZM24447@adastra.Mines.EDU> Forwarded from Arun Jagota: Dear Connectionists: The following refereed and revised paper is now accessible from the Web site below. M. Misra, Parallel Environments for Implementing Neural Networks, Neural Computing Surveys vol 1, 48-60, 1997, 113 references http://www.icsi.berkeley.edu/~jagota/NCS Send additional references and comments to the author at mmisra at mines.edu Abstract: As artificial neural networks (ANNs) gain popularity in a variety of application domains, it is critical that these models run fast and generate results in real time. Although a number of implementations of neural networks are available on sequential machines, most of these implementations require an inordinate amount of time to train or run ANNs, especially when the ANN models are large. One approach for speeding up the implementation of ANNs is to implement them on parallel machines.
This paper surveys the area of parallel environments for the implementations of ANNs, and prescribes desired characteristics to look for in such implementations. ------------------------------------------------------------------------ Regards, Arun From mitra at cco.caltech.edu Thu Jun 26 18:32:34 1997 From: mitra at cco.caltech.edu (Partha Mitra) Date: Thu, 26 Jun 1997 15:32:34 -0700 Subject: Workgroup on Analysis of Neurobiological data Message-ID: <33B2EE02.7C7010BF@cco.caltech.edu> Analysis of Neural Data Modern methods and open issues in the analysis and interpretation of multi-variate time-series and imaging data in the neurosciences 18 August - 31 August 1997 Marine Biological Laboratories - Woods Hole, MA A working group of scientists committed to quantitative approaches to problems in neuroscience will focus their efforts on experimental and theoretical issues related to the analysis of large, single- and multi-channel data sets. Our motivation for the work group is based on issues that arise in two complementary areas critical to an understanding of brain function. The first involves advanced signal processing methods that are relevant to neuroscience, particularly those appropriate for emerging multi-site recording techniques and noninvasive imaging techniques. The second involves the development of a calculus to study the dynamical behavior of nervous systems and the computations they perform. A distinguishing feature of the work group will be the close collaboration between experimentalists and theorists, particularly with regard to the analysis of data and the planning of experiments. The work group will have a limited number of research lectures, supplemented by tutorials on relevant computational, experimental, and mathematical techniques. This work group is a means to critically evaluate techniques for the processing of multi-channel data, of which imaging forms an important category.
Such techniques are of fundamental importance for basic research and medical diagnostics. We will establish a repository of these techniques, along with benchmarks, to ensure the rapid dissemination of modern analytical techniques throughout the neuroscience community. The work group will convene on a yearly basis. For 1997, we propose to focus on topics that fall under the rubric of multivariate time-series, including analysis of point processes, e.g., spike trains, measures of correlation and variability and their interpretation in terms of underlying process models; analysis of continuous processes, e.g., field potential, optical imaging, fMRI, and MEG, and the recording of behavioral output, e.g., vocalizations; and problems that involve both point and continuous processes, e.g., spike sorting, and the relations between spike trains and sensory input and motor output. Participants: We propose to have 25 participants, both experimentalists and theorists. Experimentalists are specifically encouraged to bring data records to the work group for analysis and discussion. Appropriate computational facilities will be provided. The work group will further take advantage of interested investigators and course faculty concurrently present at the MBL. We encourage graduate students and postdoctoral fellows as well as senior researchers to apply. Participant Fee: $200. Support: National Institutes of Health - NIMH, NIA, NIAAA, NICHD/NCRR, NIDCD, NIDA, and NINDS. Organizers: David Kleinfeld (UCSD) and Partha P. Mitra (Caltech and Bell Laboratories). Application: Potential participants should send a copy of their curriculum vitae, together with a cover letter that contains a brief (ca. 200 word) paragraph on why they wish to attend the work group and a justified request for any financial aid, to Ms.
Jean Ainge Room 1D-467, Bell Laboratories, Lucent Technologies 700 Mountain Avenue Murray Hill, NJ 07974 908-582-4702 or Graduate students and postdoctoral fellows are encouraged to include a brief letter of support from their research advisor. Financial assistance: Assistance for travel, accommodations, and board is available based on need. Applications must be received by 3 July 1997; participants will be notified by 11 July. Additional information may be found at http://www-physics.ucsd.edu/research/neurodata (one level above this announcement). The MBL is an EEO AAI. From smagt at dlr.de Fri Jun 27 04:50:54 1997 From: smagt at dlr.de (Patrick van der Smagt) Date: Fri, 27 Jun 1997 10:50:54 +0200 Subject: NIPS*97 Workshop: robots & cerebellum Message-ID: <33B37EEE.925@dlr.de> NIPS'97 Postconference Workshop =============================== Can Artificial Cerebellar Models Compete to Control Robots? http://www.op.dlr.de/FF-DR-RS/CONFERENCES/nips-workshop/ Objective: ---------- Recent successes in robotics have broadened the field of application and acceptance of robots. Nevertheless, industrial robotics still has a long way to go. While the applicability of classical robots remains limited to factory floors, research lab robotics is moving towards novel actuators for constructing light-weight, compliant robot arms. To this end, actuators are needed which consist of agonist-antagonist drive pairs for maintaining accurate positioning without recalibration, as well as for controlling the stiffness of a joint. However, when these joints are combined to construct a robot arm, existing algorithms can only inaccurately control the arm at low velocities. Starting from Albus' model from the 70's, neuro-computational models of the cerebellum have been advocated as possible candidates for the control of complex robot systems. Unfortunately, there have been very few applications of cerebellar models to the control of real robot manipulators with at least 6 degrees of freedom.
Cerebellar models have become more refined through specialised investigations based on details of the biological cerebellar system, while insufficient attention has been given to the applicability of these refinements in robotics. In the development of these methodologies, there are few examples of successful integration of neuro-computational models. In this workshop we want to investigate how neuro-computational models might be incorporated as a standard part of robotics. We want to address two questions: How reasonable is the desire for such an amalgamation? What prerequisites are there for having cerebellar models successfully compete with alternative approaches to the control of real robots? These questions will be addressed through the presentation of papers which: 1. describe biologically plausible models of sensory-motor control; 2. apply cerebellar models to the control of robot manipulators; 3. describe robot control methodologies that can incorporate cerebellar or other neuro-computational models (e.g., vision); 4. provide a case history illustrating barriers to successful competition by cerebellar models for robot control. Call for Contributions ====================== For the workshop we will be looking for talks and papers which: * describe research applying cerebellar models to robot applications implemented on real robots; * describe biologically plausible models of sensory-motor control; * describe robot control methodologies that can incorporate cerebellar or other neuro-computational models (e.g., vision); * provide a case history illustrating barriers to successful competition by cerebellar models for robot control. Please send your proposals [with a paper or extended abstract] by August 15, 1997.
For more information about submission visit http://www.op.dlr.de/FF-DR-RS/CONFERENCES/nips-workshop/ Workshop organisers: ==================== Patrick van der Smagt Institute of Robotics and System Dynamics German Aerospace Research (DLR) Wessling, Germany mailto:smagt at dlr.de Daniel Bullock Cognitive and Neural Systems Department Boston University Boston, MA 02215 mailto:danb at cns.bu.edu -- dr Patrick van der Smagt phone +49 8153 281152 DLR/Institute of Robotics and Systems Dynamics fax +49 8153 281134 P.O. Box 1116, 82230 Wessling, Germany email From hochreit at informatik.tu-muenchen.de Fri Jun 27 07:19:09 1997 From: hochreit at informatik.tu-muenchen.de (Josef Hochreiter) Date: Fri, 27 Jun 1997 13:19:09 +0200 Subject: sensory coding with LOCOCODE Message-ID: <97Jun27.131913+0200met_dst.49112+135@papa.informatik.tu-muenchen.de> LOCOCODE Sepp Hochreiter, TUM Juergen Schmidhuber, IDSIA TR FKI-222-97 (19 pages, 23 figures, 450 KB, 4.2 MB gunzipped) Low-complexity coding and decoding (LOCOCODE) is a novel approach to sensory coding and unsupervised learning. Unlike previous methods it explicitly takes into account the information-theoretic complexity of the code generator: lococodes (1) convey information about the input data and (2) can be computed and decoded by low-complexity mappings. We implement LOCOCODE by training autoassociators with Flat Minimum Search, a recent, general method for discovering low-complexity neural nets. Experiments show: unlike codes obtained with standard autoencoders, lococodes are based on feature detectors, never unstructured, usually sparse, sometimes factorial or local (depending on the data). Although LOCOCODE's objective function does not contain an explicit term enforcing sparse or factorial codes, it extracts optimal codes for difficult versions of the "bars" benchmark problem. Unlike, e.g., independent component analysis (ICA) it does not need to know the number of independent data sources.
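The autoassociator setup the abstract describes can be sketched in a few lines. This is a generic illustration only: plain weight decay stands in here for the report's Flat Minimum Search complexity penalty, and all names (`train_autoassociator`, the toy data) are assumptions rather than details from the report.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_autoassociator(X, n_hidden, lr=0.05, decay=1e-3, steps=5000):
    """One-hidden-layer autoassociator trained to reconstruct its input.
    NOTE: plain weight decay is a crude stand-in for the complexity
    penalty; the report itself uses Flat Minimum Search."""
    n = X.shape[1]
    W1 = rng.normal(0.0, 0.1, (n, n_hidden))
    W2 = rng.normal(0.0, 0.1, (n_hidden, n))
    for _ in range(steps):
        H = np.tanh(X @ W1)                 # code layer
        E = H @ W2 - X                      # reconstruction error
        gW2 = H.T @ E + decay * W2          # gradient w.r.t. output weights
        gH = (E @ W2.T) * (1.0 - H**2)      # backprop through tanh
        gW1 = X.T @ gH + decay * W1
        W1 -= lr * gW1 / len(X)
        W2 -= lr * gW2 / len(X)
    return W1, W2

# Toy data: 3-D inputs lying on a 2-D subspace, so a 2-unit code suffices
X = rng.normal(size=(50, 2)) @ np.array([[1.0, 0.0, 1.0],
                                         [0.0, 1.0, -1.0]])
W1, W2 = train_autoassociator(X, n_hidden=2)
mse = np.mean((np.tanh(X @ W1) @ W2 - X) ** 2)
```

The interesting part of LOCOCODE is precisely what this sketch leaves out: replacing the generic weight penalty with a term that drives the code generator itself towards low description length.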
It produces familiar, biologically plausible feature detectors when applied to real world images. As a preprocessor for a vowel recognition benchmark problem it sets the stage for excellent classification performance. ftp://ftp.idsia.ch/pub/juergen/lococode.ps.gz ftp://flop.informatik.tu-muenchen.de/pub/fki/fki-222-97.ps.gz http://www7.informatik.tu-muenchen.de/~hochreit/pub.html http://www.idsia.ch/~juergen/onlinepub.html (invited talk at "Theoretical Aspects of Neural Computation" (TANC97), Hong Kong, May 97 - short spin-off papers to be published by Springer) Comments welcome. Sepp & Juergen From mike at stats.gla.ac.uk Fri Jun 27 10:54:07 1997 From: mike at stats.gla.ac.uk (Mike Titterington) Date: Fri, 27 Jun 1997 15:54:07 +0100 (BST) Subject: Postdoctoral Post in Glasgow Message-ID: <4800.199706271454@milkyway.stats.gla.ac.uk> The following advertisement will soon appear in the UK national press. DRAFT ADVERTISEMENT -------------------- UNIVERSITY OF GLASGOW DEPARTMENT OF STATISTICS POSTDOCTORAL RESEARCH ASSISTANT Applications are invited for a Postdoctoral Research Assistantship (IA) post in the Department of Statistics, University of Glasgow, to work with Professor D.M. Titterington for a period of up to 3 years, starting on October 1, 1997, or as soon as possible thereafter. The post is funded by the UK Engineering and Physical Sciences Research Council. The research topic is Statistical Learning Methodology in Neural Computing Problems. Applications, supported by full curriculum vitae and the names of three referees, should be sent, to arrive no later than July 27, 1997, to Professor D. M. Titterington, Department of Statistics, University of Glasgow, Glasgow G12 8QQ, Scotland, from whom further particulars are available. Informal enquiries by electronic mail (mike at stats.gla.ac.uk) are welcomed. 
From srx014 at coventry.ac.uk Fri Jun 27 13:59:22 1997 From: srx014 at coventry.ac.uk (Colin Reeves) Date: Fri, 27 Jun 1997 18:59:22 +0100 (BST) Subject: PhD studentship available Message-ID: The following research position is available: Would interested candidates please contact Colin Reeves in the first instance (CRReeves at coventry.ac.uk) with a CV. -------------------------------------------------------------------------- Project Proposal The application of artificial intelligence to the control of large, complex gas transmission systems. Control Theory and Applications Centre, and BG Transco, Coventry University System Control, Hinckley Background. BG Transco is responsible for the transportation and storage of natural gas in Britain. In general, gas is delivered to coastal terminals where it enters the Transco pipeline system. System Control manages and controls the flow of gas from these terminals to 18 million consumers. System Control is organised into 5 control centres which operate on a 24 hour basis 365 days a year. There is one National Control Centre (NCC) and 4 Area Control Centres (ACCs). In each of the control centres small teams of engineers are responsible for the provision of a secure and economic gas transportation system. To achieve this, the engineers need to develop best practices and apply these consistently to the operation of the pipeline systems. This project will investigate the possible application of Artificial Intelligence techniques to support the operation of these large, complex gas transmission systems. Methodology The Control Theory and Applications Centre at Coventry University has successfully applied AI methods, including neural nets, fuzzy systems, genetic algorithms etc, to complex control problems. It is intended in this project to extend these approaches to the gas pipeline systems of Transco. 
This will involve firstly the selection (in close collaboration with Transco System Control) of a suitable subsystem for a feasibility study. Historical records of this subsystem will be used to identify important variables, and then to extract knowledge in the form of rules. Knowledge elicitation from the experts (the operations engineers) will also be carried out. Probable techniques include the use of neuro-fuzzy methods and the application of genetic algorithms or other heuristics in mining the data archives. At appropriate times it is required that a competent report will be presented to Transco. In addition there will be opportunities to present the work at conferences or to publish through technical journals. The successful candidate will register for a PhD, working under the supervision of Colin Reeves at Coventry University. Candidate profile The student should have a first degree (at least a 2i) or MSc in an appropriate technological or engineering subject. Good general mathematical and computing skills are more important than the specific subject of the degree. It would be an advantage to have previous knowledge of AI methods such as those mentioned above. Knowledge of control systems and databases would also be useful, and the candidate needs to possess the appropriate inter-personal skills to work in an operational engineering environment. The candidate should be eligible for UK fees, which means he/she will most likely be a citizen of a member state of the European Union - other candidates are normally subject to higher fees. Timescale The project will commence in September 1997 and will last for 3 years. 
From dld at cs.monash.edu.au Sun Jun 29 09:45:56 1997 From: dld at cs.monash.edu.au (David L Dowe) Date: Sun, 29 Jun 1997 23:45:56 +1000 Subject: CFPs and CFRs: Information theory in biology Message-ID: <199706291345.XAA29992@dec11.cs.monash.edu.au> Please distribute to interested friends and colleagues: Call For Papers (CFPs) and Call For Referees (CFRs) Complexity and information-theoretic approaches to biology ---------------------------------------------------------- This is a Call For Papers and Call For Referees for the 3rd Pacific Symposium on BioComputing (PSB-3, 1998) conference stream on "Complexity and information-theoretic approaches to biology". PSB-98 will be held from 5-9 January, 1998, in Hawaii, at the Ritz Carlton Kapalua on Maui. Stream Organisers: David L. Dowe (dld at cs.monash.edu.au) and Klaus Prank. Stream submission deadline (details below): 21 July 1997. ~~~~~~~~~~~~~~~~~~~ WWW site: http://www.cs.monash.edu.au/~dld/PSB-3/PSB-3.Info.CFPs.html . Specific technical area to be covered by this stream: Approaches to biological problems using notions of information or complexity, including methods such as Algorithmic Probability, Minimum Message Length and Minimum Description Length. Two possible applications are, e.g., protein folding and biological information processing. Kolmogorov (1965) and Chaitin (1966) studied the notions of complexity and randomness, with Solomonoff (1964), Wallace (1968) and Rissanen (1978) applying these to problems of statistical and inferential learning and to prediction. The methods of Solomonoff, Wallace and Rissanen have respectively come to be known as Algorithmic Probability (ALP), Minimum Message Length (MML) and Minimum Description Length (MDL). All of these methods relate to information theory, and can be thought of both in terms of Shannon's information theory and in terms of Boltzmann's thermodynamic entropy.
An MDL/MML perspective has been suggested by a number of authors in the context of approximating unknown functions with some parametric approximation scheme (such as a neural network). The designated measure to optimize under this scheme combines an estimate of the cost of misfit with an estimate of the cost of describing the parametric approximation (Akaike 1973, Rissanen 1978, Barron and Barron 1988). This stream invites all original papers of a biological nature which use notions of information and/or complexity, with no strong preference as to what specific nature. Such work has been done in problems of, e.g., protein folding and DNA string alignment. As we shortly describe in some detail, such work has also been done in the analysis of temporal dynamics in biology such as neural spike trains and endocrine (hormonal) time series analysis using the MDL principle in the context of neural networks and context-free grammar complexity. To elaborate on one of the relevant topics above, in the last couple of years or so, there has been a major focus on the aspect of timing in biological information processing ranging from fields such as neuroscience to endocrinology. The latest work on information processing at the single-cell level using computational as well as experimental approaches reveals previously unimagined complexity and dynamism. Timing in biological information processing on the single-cell level as well as on the systems level has been studied by signal-processing and information-theoretic approaches in particular in the field of neuroscience (see for an overview: Rieke et al. 1996). 
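The two-part measure described above, a misfit cost combined with the cost of describing the parametric approximation, can be illustrated with a crude model-selection sketch. This is not any of the cited authors' formulations: the 16-bits-per-parameter figure and the Gaussian coding of residuals are arbitrary illustrative assumptions.

```python
import numpy as np

# Crude two-part "description length" sketch (not Akaike's, Rissanen's
# or Wallace's exact formulation): total cost = bits to state the model
# parameters plus bits to encode the residual misfit under a Gaussian.
# The 16 bits-per-parameter figure is an arbitrary assumption.
rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 50)
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(scale=0.1, size=x.size)

def description_length(degree, bits_per_param=16):
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    sigma = max(resid.std(), 1e-6)
    # misfit cost: Gaussian code length (a differential-entropy approximation)
    misfit = 0.5 * resid.size * np.log2(2 * np.pi * np.e * sigma**2)
    model = bits_per_param * (degree + 1)
    return model + misfit

best = min(range(8), key=description_length)
print(best)   # the combined cost favours the low-degree generating model
```

Higher-degree polynomials keep shrinking the misfit term slightly, but each extra coefficient adds a fixed model cost, so the minimum of the combined cost lands near the true (quadratic) complexity, which is the trade-off the MDL/MML perspective formalises.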
Using such approaches to the understanding of temporal complexity in biological information transfer, the maximum information rates and the precision of spike timing could be revealed by computational methods (Mainen and Sejnowski, 1995; Gabbiani and Koch 1996; Gabbiani et al., 1996). The examples given above illustrate some possible biological application domains. We invite and solicit papers in all areas of (computational) biology which make use of ALP, MDL, MML and/or other notions of information and complexity. In problems of prediction, as well as using "yes"/"no" predictions, we would encourage the authors to consider also using probabilistic prediction, where the score assigned to a probabilistic prediction is given according to the negative logarithm of the stated probability of the event. List of Frequently Asked Questions (FAQs) re PSB-98 : ----------------------------------------------------- Q1. How can my paper be included in PSB's hardbound proceedings? PSB publishes peer-reviewed full papers in an archival proceedings. Each accepted paper will be allocated 12 pages in the proceedings volume. Paper authors are required to register (and pay) for the conference by the time they submit their camera-ready copy, or the paper will not be published. Q2. How does a PSB publication compare to a journal publication? PSB papers are stringently peer-reviewed, and must report significant original material. PSB expects to be included in Index Medicus, Medline and other indexing services starting this year. All accepted full papers will be indexed just as if they had appeared in a journal. It is too early to assess the impact of a PSB paper quantitatively, but we will take every action we can to improve the visibility and significance of PSB publication. Q3.
What if I do not want to submit a full paper to PSB, but wish to participate? Authors who do not wish to submit a full paper are welcome to submit one-page abstracts, which will be distributed at the meeting separately from the archival proceedings, and are also welcome to display standard or computer-interactive posters. Q4. What are the paper submission deadlines? Papers will be due July 14, although session chairs can adjust this deadline at their discretion. Results will be announced August 22, and camera-ready copy will be due September 22. Poster abstracts will be accepted until October 1, and on a space-available basis after that. Poster space is limited, especially for interactive posters that require computer or network access. Q5. Where should I send my submission? All full papers must be submitted to the central PSB address so that we can track the manuscripts. Physical submitters should send five copies of their paper to: PSB-98 c/o Section on Medical Informatics Stanford University Medical School, MSOB X215 Stanford, CA 94305-5479 USA Electronic submission of papers is welcome. Format requirements for electronic submission will be available on the web page (http://www.cgl.ucsf.edu/psb) or from Russ Altman (altman at smi.stanford.edu). Electronic papers will be submitted directly to Dr. Altman. We prefer that all one-page abstracts be submitted electronically. Please send them to us in plain ASCII text or as a Microsoft Word file. If this is impossible, please contact Dr. Altman as soon as possible. Q6. How can I obtain travel support to come to PSB? We have been able to offer partial travel support to many PSB attendees in the past, including most authors of accepted full papers who request support. However, due to our sponsoring agencies' schedules, we are unable to offer travel awards before the registration (and payment) deadlines for authors. We recognize that this is inconvenient, and we are doing our best to rectify the situation.
NO ONE IS GUARANTEED TRAVEL SUPPORT. Travel support applications will be available on our web site (see Q7). Q7. How can I get more information about the meeting? Check our web page: http://www.cgl.ucsf.edu/psb or send email to the conference chair: hunter at nlm.nih.gov Further comments re PSB-98 : ---------------------------- PSB'98 will publish accepted full papers in an archival Proceedings. All contributed papers will be rigorously peer-reviewed by at least three referees. Each accepted full paper will be allocated up to 12 pages in the conference Proceedings. The best papers will be selected for a 30-minute oral presentation to the full assembled conference. Accepted poster abstracts will be distributed at the conference separately from the archival Proceedings. To be eligible for proceedings publication, each full paper must be accompanied by a cover letter stating that it contains original unpublished results not currently under consideration elsewhere. IMPORTANT DATES: Full paper submissions due (NEW deadline): July 21, 1997 Poster abstracts due: August 10, 1997 Notification of paper acceptance: August 22, 1997 Camera-ready copy due: September 22, 1997 Conference: January 5 - 8, 1998 More information about the "Complexity and information-theoretic approaches to biology" stream, including a sample list of relevant papers is available on the WWW at http://www.cs.monash.edu.au/~dld/PSB-3/PSB-3.Info.CFPs.html . For further information about the above stream, e-mail Dr. David Dowe, dld at cs.monash.edu.au , http://www.cs.monash.edu.au/~dld/ , Fax: +61 3 9905-5146 on or before 2 July (or after 19th July) or Dr. Klaus Prank, ndxdpran at rrzn-serv.de , http://sun1.rrzn-user.uni-hannover.de/~ndxdpran/ , on or after 3 July. 
From icsc at compusmart.ab.ca Fri Jun 27 20:38:02 1997 From: icsc at compusmart.ab.ca (ICSC Canada) Date: Sat, 28 Jun 1997 00:38:02 +0000 Subject: I&AAN'98 Workshop Message-ID: <2.2.32.19970628003802.006ec860@mail.compusmart.ab.ca> International Workshop on INDEPENDENCE & ARTIFICIAL NEURAL NETWORKS / I&ANN'98 at the University of La Laguna, Tenerife, Spain February 9-10, 1998 http://www.compusmart.ab.ca/icsc/iann98.htm This workshop will take place immediately prior to the International ICSC Symposium on ENGINEERING OF INTELLIGENT SYSTEMS / EIS'98 at the University of La Laguna, Tenerife, Spain February 11-13, 1998 http://www.compusmart.ab.ca/icsc/eis98.htm ****************************************** TOPICS Recent ANN research has developed from those networks which find correlations in data sets to the more ambitious goal of finding independent components of data sets. The workshop will concentrate on those neural networks which find independent components and the associated networks whose learning rules use contextual information to organize their learning. The topic falls then into (at least) three main streams of current ANN research: 1. Independent Component Analysis which has most recently been successfully applied to the problem of "blind separation of sources" such as the recovery of a single voice from a mixture/convolution of voices. Such methods normally use either information theoretic criteria or higher order statistics to perform the separation. 2. Identification of independent sources: The seminal experiment in this field is the identification of single bars from an input grid containing mixtures of bars. Factor Analysis (or generative models) has been a recent popular method for this problem. 3. Using contextual information to identify structure in data. We envisage a single track program over two days (February 9 - 10, 1998) with many opportunities for informal discussion. 
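The "blind separation of sources" idea in topic 1 can be sketched with a minimal numpy-only demo: two independent non-Gaussian signals are mixed linearly, the mixtures are whitened, and a grid search then finds the rotation extremising kurtosis, a higher-order statistic in the spirit the announcement describes. Real ICA algorithms optimise such criteria far more efficiently; the signals, mixing matrix and grid resolution below are illustrative choices.

```python
import numpy as np

# Minimal caricature of blind source separation via higher-order
# statistics. All concrete choices (signals, mixing matrix, 1-degree
# angle grid) are illustrative assumptions, not any published algorithm.
rng = np.random.default_rng(2)
n = 2000
s1 = np.sign(np.sin(np.linspace(0, 40 * np.pi, n)))  # square wave source
s2 = rng.uniform(-1, 1, n)                           # uniform noise source
A = np.array([[1.0, 0.6], [0.4, 1.0]])               # unknown mixing matrix
X = A @ np.vstack([s1, s2])                          # observed mixtures

# whiten the observations (zero mean, identity covariance)
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = np.diag(d ** -0.5) @ E.T @ X

def kurt(y):
    """Excess kurtosis of a (roughly) zero-mean signal."""
    return np.mean(y ** 4) - 3 * np.mean(y ** 2) ** 2

# grid-search rotations of the whitened data for extreme kurtosis
angles = np.linspace(0, np.pi, 180)
best = max(angles, key=lambda a: abs(kurt(np.cos(a) * Z[0] + np.sin(a) * Z[1])))
y = np.cos(best) * Z[0] + np.sin(best) * Z[1]
corr = max(abs(np.corrcoef(y, s)[0, 1]) for s in (s1, s2))
print(round(corr, 3))   # the recovered component closely matches one source
```

Because whitening reduces the remaining ambiguity to a rotation (and signs), scanning one angle suffices in two dimensions; the direction of extreme kurtosis lines up with the most strongly non-Gaussian source, here the square wave.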
****************************************** INTERNATIONAL SCIENTIFIC COMMITTEE - Luis Almeida, INESC, Portugal - Tony Bell, Salk Institute, USA - Andrew Cichocki, RIKEN Institute, Japan - Colin Fyfe, University of Paisley, U.K. - Mark Girolami, University of Paisley, U.K. - Peter Hancock, University of Stirling, U.K. - Juha Karhunen, Helsinki University of Technology, Finland - Jim Kay, University of Glasgow, U.K. - Erkki Oja, Helsinki University of Technology, Finland ****************************************** INFORMATION FOR PARTICIPANTS AND AUTHORS Registrations are available for the workshop only (February 9 - 10, 1998), or combined with the EIS'98 symposium (February 11 - 13, 1998). The registration fee for the 2-day workshop is estimated at approximately Ptas. 37,000 per person and includes: - Use of facilities and equipment - Lunches, dinners and coffee breaks - Welcome wine & cheese party - Proceedings in print (workshop only) - Proceedings on CD-ROM (workshop and EIS'98 conference) - Daily transportation between hotels in Santa Cruz and workshop site The regular registration fee for the EIS'98 symposium (February 11-13, 1998) is estimated at Ptas. 59,000 per person, but a reduction will be offered to workshop participants. Separate proceedings will be printed for the workshop, but all respective papers will also be included on the CD-ROM, covering the I&ANN'98 workshop and the EIS'98 symposium. As a bonus, workshop participants will thus automatically also receive the conference proceedings (CD-ROM version). We anticipate that the proceedings will be published as a special issue of a journal. ****************************************** SUBMISSION OF PAPERS Prospective authors are requested to send a 4-6 page report of their work for evaluation by the International Scientific Committee. All reports must be written in English, starting with a succinct statement of the problem, the results achieved, their significance and a comparison with previous work. 
The report should also include: - Title of workshop (I&ANN'98) - Title of proposed paper - Authors names, affiliations, addresses - Name of author to contact for correspondence - E-mail address and fax # of contact author - Topics which best describe the paper (max. 5 keywords) Submissions may be made by airmail or electronic mail to: Dr. Colin Fyfe Department of Computing and Information Systems The University of Paisley High Street Paisley, PA1 2BE Scotland Email: fyfe0ci at paisley.ac.uk Fax: +44-141-848-3542 ****************************************** SUBMISSION DEADLINE It is the intention of the organizers to have the proceedings available for the delegates. Consequently, the submission deadline of September 15, 1997 has to be strictly respected. ****************************************** IMPORTANT DATES Submission of reports: September 15, 1997 Notification of acceptance: October 15, 1997 Delivery of full papers: November 15, 1997 I&ANN'98 Workshop: February 9 - 10, 1998 EIS'98 Conference: February 11 - 13, 1998 ****************************************** LOCAL ARRANGEMENTS For details about local arrangements, please consult the EIS'98 website at http://www.compusmart.ab.ca/icsc/eis98.htm ****************************************** FURTHER INFORMATION For further information please contact: - Dr. Colin Fyfe Department of Computing and Information Systems The University of Paisley High Street Paisley PA1 2BE Scotland E-mail: fyfe0ci at paisley.ac.uk Fax: +44-141-848-3542 or - ICSC Canada International Computer Science Conventions P.O. Box 279 Millet, Alberta T0C 1Z0 Canada E-mail: icsc at compusmart.ab.ca Fax: +1-403-387-4329 WWW: http://www.compusmart.ab.ca/icsc