new papers in the neuroprose archive

Uli Bodenhausen uli at ira.uka.de
Thu Feb 4 12:06:41 EST 1993


The following papers have been placed in the neuroprose archive as  

	bodenhausen.application_oriented.ps.Z
	bodenhausen.architectural_learning.ps.Z

Instructions for retrieving and printing follow the abstracts.

1.)

CONNECTIONIST ARCHITECTURAL LEARNING FOR HIGH PERFORMANCE 
CHARACTER AND SPEECH RECOGNITION

Ulrich Bodenhausen and Stefan Manke

University of Karlsruhe and Carnegie Mellon University

Highly structured neural networks like the Time-Delay Neural 
Network (TDNN) can achieve very high recognition accuracies 
in real-world applications like handwritten character and speech 
recognition systems. Achieving the best possible performance 
greatly depends on the optimization of all structural parameters 
for the given task and amount of training data. We propose an 
Automatic Structure Optimization (ASO) algorithm that avoids 
time-consuming manual optimization and apply it to Multi State 
Time-Delay Neural Networks, a recent extension of the TDNN. 
We show that the ASO algorithm can construct efficient 
architectures in a single training run that achieve very high 
recognition accuracies for two handwritten character recognition 
tasks and one speech recognition task. (only 4 pages!)

To appear in the proceedings of the International Conference on 
Acoustics, Speech and Signal Processing (ICASSP) 93, Minneapolis

--------------------------------------------------------------------------
2.)

Application Oriented Automatic Structuring of Time-Delay Neural Networks for 
High Performance Character and Speech Recognition

Ulrich Bodenhausen and Alex Waibel

University of Karlsruhe and Carnegie Mellon University

Highly structured artificial neural networks have been shown 
to be superior to fully connected networks for real-world 
applications like speech recognition and handwritten character 
recognition. These structured networks have many structural 
parameters that must be tuned for best performance, which 
makes manual optimization very time consuming. 
A highly structured approach is the Multi State 
Time Delay Neural Network (MSTDNN) which uses shifted 
input windows and allows the recognition of sequences of 
ordered events that have to be observed jointly. In this paper 
we propose an Automatic Structure Optimization (ASO) 
algorithm and apply it to MSTDNN type networks. The 
ASO algorithm optimizes all relevant parameters of 
MSTDNNs automatically and was successfully tested with 
three different tasks and varying amounts of training data.
(6 pages, more detailed than the first paper)

To appear in the ICNN 93 proceedings, San Francisco.
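The shifted input windows mentioned in both abstracts are the defining
feature of time-delay networks: each hidden activation is computed from a
window of consecutive input frames, with the same weights reused at every
time step. A minimal sketch of such a layer follows; the function name,
dimensions, and tanh nonlinearity are illustrative assumptions, not taken
from the papers.

```python
import numpy as np

def time_delay_layer(frames, weights, bias):
    """One time-delay layer (illustrative sketch): each output frame is
    computed from a shifted window of `delay` consecutive input frames,
    with the same weights shared across all time steps."""
    delay, n_in, n_out = weights.shape
    T = frames.shape[0]
    out = np.empty((T - delay + 1, n_out))
    for t in range(T - delay + 1):
        window = frames[t:t + delay]  # shifted input window over time
        # contract the (delay, n_in) window against the shared weights
        out[t] = np.tanh(
            np.tensordot(window, weights, axes=([0, 1], [0, 1])) + bias
        )
    return out

# toy example: 10 input frames of 16 features, 3-frame window, 8 hidden units
rng = np.random.default_rng(0)
frames = rng.standard_normal((10, 16))
weights = rng.standard_normal((3, 16, 8)) * 0.1
bias = np.zeros(8)
hidden = time_delay_layer(frames, weights, bias)
print(hidden.shape)  # 10 - 3 + 1 = 8 output frames, 8 units each
```

Structural parameters such as the window length (here 3) and the number of
hidden units (here 8) are exactly the kind of choices the ASO algorithm is
said to optimize automatically.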

--------------------------------------------------------------------------

     unix> ftp archive.cis.ohio-state.edu (or 128.146.8.52)
     Name: anonymous
     Password: neuron
     ftp> cd pub/neuroprose
     ftp> binary
     ftp> get bodenhausen.application_oriented.ps.Z
     ftp> get bodenhausen.architectural_learning.ps.Z
     ftp> quit
     unix> uncompress  bodenhausen.application_oriented.ps.Z
     unix> uncompress  bodenhausen.architectural_learning.ps.Z
     unix> lpr -s  bodenhausen.application_oriented.ps  (or however you print postscript)
     unix> lpr -s  bodenhausen.architectural_learning.ps

Thanks to Jordan Pollack for providing this service!
