paper on 2nd order methods in Neuroprose
BATTITI@ITNVAX.CINECA.IT
Thu Nov 14 10:01:00 EST 1991
A new paper is available from the Neuroprose directory.
FILE: battiti.second.ps.Z (ftp binary, uncompress, lpr (PostScript))
TITLE: "First- and Second-Order Methods for Learning:
between Steepest Descent and Newton's Method"
AUTHOR: Roberto Battiti
ABSTRACT: On-line first-order backpropagation is sufficiently fast
and effective for many large-scale classification problems, but for
very high-precision mappings batch processing may be the method of
choice. This paper reviews first- and second-order optimization methods
for learning in feed-forward neural networks. The viewpoint is that
of optimization: many methods can be cast in the language of
optimization techniques, allowing the transfer to neural nets of
detailed results about computational complexity and of safety
procedures that ensure convergence and avoid numerical problems.
The review is not intended to deliver detailed prescriptions for the
most appropriate methods in specific applications, but to illustrate
the main characteristics of the different methods and their mutual
relations.
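To make the contrast between the two endpoints of the review concrete,
here is a minimal sketch (not taken from the paper; the quadratic,
step size, and iteration counts are arbitrary choices) of steepest
descent versus Newton's method on a convex quadratic f(w) = 0.5 w'Aw - b'w:

```python
# Hypothetical illustration: steepest descent vs. Newton's method on a
# convex quadratic f(w) = 0.5 * w^T A w - b^T w. The matrix A, vector b,
# learning rate, and iteration count are arbitrary choices, not values
# from the paper.
import numpy as np

A = np.array([[3.0, 0.5],
              [0.5, 1.0]])      # symmetric positive-definite "Hessian"
b = np.array([1.0, 2.0])
w_star = np.linalg.solve(A, b)  # exact minimizer of f

def grad(w):
    return A @ w - b

# Steepest descent: many small steps along the negative gradient.
w = np.zeros(2)
for _ in range(100):
    w = w - 0.1 * grad(w)
sd_error = np.linalg.norm(w - w_star)

# Newton's method: on a quadratic, one step w - A^{-1} grad(w)
# lands on the minimizer exactly (up to rounding error).
w = np.zeros(2)
w = w - np.linalg.solve(A, grad(w))
newton_error = np.linalg.norm(w - w_star)

print(sd_error, newton_error)
```

The sketch shows the trade-off the review is organized around: the
first-order method needs many cheap iterations, while the second-order
method pays for a linear solve per step but converges in far fewer
iterations (here, one).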
PS: the paper will be published in Neural Computation.
PPS: comments and/or new results welcome.
======================================================================
| | |
| Roberto Battiti | e-mail: battiti at itnvax.cineca.it |
| Dipartimento di Matematica | tel: (+39) - 461 - 88 - 1639 |
| 38050 Povo (Trento) - ITALY | fax: (+39) - 461 - 88 - 1624 |
| | |
======================================================================