Connectionists: On differential-geometrical methods in neural network learning

Simone G.O. FIORI sfr at unipg.it
Thu Jun 2 05:30:35 EDT 2005


Dear Colleagues,

The following three preprints, related to differential-
geometrical methods for neural network learning, are
available online.

=========================================================

*Title: 
Formulation and Integration of Learning Differential 
Equations on the Stiefel Manifold

*Author: 
Simone Fiori, University of Perugia (Italy)

*Journal: 
IEEE Transactions on Neural Networks (IEEE-TNN)

*Abstract:
The present Letter illustrates the relevance of numerical 
integration of learning differential equations on 
differential manifolds. In particular, it deals with the 
task of learning under orthonormality constraints, which 
is naturally formulated as an optimization task whose 
neural parameter space is the compact Stiefel manifold. 
Intrinsic properties of the derived learning algorithms, 
such as stability and constraint preservation, are 
illustrated through experiments on minor and independent 
component analysis.

*Keywords:
Unsupervised neural network learning; Differential 
geometry; Riemannian manifold; Riemannian gradient; 
Geodesics.

*Source:
http://www.unipg.it/sfr/publications/TNN05.pdf
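
For readers who would like a concrete, minimal illustration 
of learning under orthonormality constraints, the short 
Python/NumPy sketch below performs Riemannian-gradient 
descent on the compact Stiefel manifold for a 
minor-component-analysis cost. It uses a simple QR 
retraction rather than the geodesic/numerical-integration 
schemes analyzed in the Letter, and the cost, step size, 
and variable names are illustrative assumptions only.

# Illustrative sketch (not the algorithm of the paper): plain
# Riemannian-gradient descent on the compact Stiefel manifold
# St(n, p) = {W : W^T W = I_p}, minimizing the minor-component
# cost f(W) = trace(W^T C W).  A QR retraction replaces the
# geodesic/numerical-integration schemes analyzed in the Letter.
import numpy as np

def stiefel_step(W, C, eta):
    """One Riemannian-gradient descent step for trace(W^T C W)."""
    G = 2.0 * C @ W                              # Euclidean gradient
    R = G - W @ (W.T @ G + G.T @ W) / 2.0        # project onto the tangent space at W
    V = W - eta * R                              # step against the Riemannian gradient
    Q, Rf = np.linalg.qr(V)                      # retract back to the manifold (thin QR)
    return Q * np.sign(np.diag(Rf))              # canonical sign choice

rng = np.random.default_rng(0)
n, p = 8, 3
B = np.linalg.qr(rng.standard_normal((n, n)))[0]
C = B @ np.diag(np.arange(1.0, n + 1)) @ B.T     # symmetric, eigenvalues 1..8
W = np.linalg.qr(rng.standard_normal((n, p)))[0] # random orthonormal start
for _ in range(300):
    W = stiefel_step(W, C, eta=0.05)
print(np.allclose(W.T @ W, np.eye(p)))           # True: orthonormality preserved
print(np.trace(W.T @ C @ W))                     # close to 1 + 2 + 3 = 6 (minor subspace)

Because the iterate is retracted onto the manifold after 
every step, W^T W = I_p holds throughout the run, which is 
the constraint-preservation property emphasized in the 
abstract.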

=========================================================

*Title:
Quasi-Geodesic Neural Learning Algorithms over 
the Orthogonal Group: A Tutorial

*Author:
Simone Fiori, University of Perugia (Italy)

*Journal: 
Journal of Machine Learning Research (JMLR)

*Abstract: 
The aim of this contribution is to present a tutorial 
on learning algorithms for a single neural layer whose 
connection matrix belongs to the orthogonal group. The 
algorithms exploit geodesics, appropriately connected, 
as piecewise approximate integrals of the exact 
differential learning equation. The considered learning 
equations essentially arise from Riemannian-gradient-
based optimization theory with deterministic and 
diffusion-type gradients. The paper specifically aims at 
reviewing the relevant mathematics (presenting it as 
transparently as possible, so as to make it accessible 
to readers who do not have a background in differential 
geometry), at bringing together modern optimization 
methods on manifolds, and at comparing the different 
algorithms on a common machine learning problem. As a 
numerical case study, we consider an application to 
non-negative independent component analysis, although it 
should be recognized that Riemannian gradient methods are 
general-purpose algorithms, by no means limited to 
ICA-related applications. 

*Keywords:
Differential geometry; Diffusion-type gradient; Lie groups; 
Non-negative independent component analysis; Riemannian 
gradient.

*Source:
http://www.jmlr.org/papers/volume6/fiori05a/fiori05a.pdf
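
For a concrete, minimal illustration of geodesic learning 
over the orthogonal group (not the tutorial's quasi-geodesic 
algorithms, nor its non-negative ICA case study), the 
Python/NumPy/SciPy sketch below takes descent steps along 
exact geodesics, W <- expm(-eta*A) W with a skew-symmetric 
generator A, on a toy orthogonal Procrustes cost; the cost, 
step size, and variable names are illustrative assumptions.

# Illustrative sketch (not the tutorial's quasi-geodesic algorithms,
# nor its non-negative ICA case study): gradient descent along exact
# geodesics of the orthogonal group, W <- expm(-eta*A) @ W, where A
# is the skew-symmetric generator built from the Euclidean gradient.
# The toy cost is an orthogonal Procrustes fit of W @ X to Y.
import numpy as np
from scipy.linalg import expm

def geodesic_step(W, G, eta):
    """One geodesic descent step on O(n); G is the Euclidean gradient."""
    A = G @ W.T - W @ G.T            # skew-symmetric (Lie-algebra) direction
    return expm(-eta * A) @ W        # exactly orthogonal update

rng = np.random.default_rng(1)
n, m = 5, 200
W_true = np.linalg.qr(rng.standard_normal((n, n)))[0]
if np.linalg.det(W_true) < 0:        # keep the target in SO(n), reachable from I
    W_true[:, 0] *= -1.0
X = rng.standard_normal((n, m))
Y = W_true @ X
W = np.eye(n)
for _ in range(300):
    G = (W @ X - Y) @ X.T / m        # Euclidean gradient of 0.5*||W X - Y||^2 / m
    W = geodesic_step(W, G, eta=0.1)
print(np.allclose(W.T @ W, np.eye(n)))                # True: orthogonality preserved
print(np.linalg.norm(W @ X - Y) / np.linalg.norm(Y))  # small residual after training

Since A is skew-symmetric, expm(-eta*A) is exactly 
orthogonal, so every iterate remains on the group; this is 
the constraint-preservation behavior the tutorial 
emphasizes, with the matrix exponential computed exactly 
here rather than approximated.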

========================================================

*Title: 
Editorial: Special issue on "Geometrical Methods in Neural 
Networks and Learning"

*Authors: 
Simone Fiori, University of Perugia (Italy)
Shun-ichi Amari, Brain Science Institute (RIKEN, Japan)

*Journal: 
Neurocomputing

*Source:
http://www.unipg.it/sfr/publications/editorial_si_nng.pdf
 
 =================================================
|         Simone FIORI (Elec.Eng., Ph.D.)         |
| * Faculty of Engineering - Perugia University * |
|  * Polo Didattico e Scientifico del Ternano *   |
| Loc. Pentima bassa, 21 - I-05100 TERNI (Italy)  |
|   eMail: fiori at unipg.it - Fax: +39 0744 492925  |
|        Web: http://www.unipg.it/sfr/            |
 =================================================



