CFP: Special Session in KES'2001: Geometry and Statistics in NN

Sumio Watanabe swatanab at pi.titech.ac.jp
Wed Mar 14 20:44:29 EST 2001


Dear Connectionists,

We have a special session on Geometry and Statistics in NN
in the international conference KES'2001, which will be held 
in Osaka and Nara, Japan, on 6-8 September 2001.

http://www.bton.ac.uk/kes/kes2001/

In the session "Geometry and Statistics in Neural Network
Learning Theory", we study the statistical problems caused by
the non-identifiability of layered learning machines (see the
description below).

We would like to invite researchers to take part in this
session and present a paper.

http://www.bton.ac.uk/kes/kes2001/sessions.html

The invited authors:
Shun-Ichi Amari (RIKEN Brain Science Institute, Japan)
Kenji Fukumizu (Institute of Statistical Mathematics, Japan)
Katsuyuki Hagiwara (Mie University, Japan)
Sumio Watanabe (Tokyo Institute of Technology, Japan)

If you are interested in this special session, 
please contact the following e-mail address
by March 31st, 2001. 

swatanab at pi.titech.ac.jp

Thank you very much. 

                    Sincerely, 

Dr. Sumio Watanabe,  Associate Professor
Advanced Information Processing Division
Precision and Intelligence Laboratory
Tokyo Institute of Technology
(Fax) +81-45-924-5018
E-mail: swatanab at pi.titech.ac.jp
http://watanabe-www.pi.titech.ac.jp/~swatanab/index.html


***************Special Session ***************

Geometry and Statistics in Neural Network Learning Theory

Non-identifiability:
A learning machine p(x|w) with a parameter w is called 
identifiable if the mapping from w to p(x|w) is one-to-one.
It should be emphasized that almost all learning machines with
hidden units are not identifiable; as a result, their parameter
manifolds have singular Fisher metrics.

Problem: 
If a non-identifiable learning machine can closely approximate 
the true distribution, then, because only finitely many 
training samples are available, it is often in a nearly 
redundant state. In such a case, neither the distribution of 
the MLE nor the Bayes a posteriori distribution is 
asymptotically normal. The conventional statistical asymptotic 
theory cannot be applied to analyze the learning curves. 
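A small numerical sketch of the degeneracy behind this: for a hypothetical one-hidden-unit regression model y = a*tanh(b*x) + unit Gaussian noise, the Fisher information matrix (the expected outer product of the gradient of the regression function) is rank-deficient at a redundant parameter such as a = 0, where the hidden unit contributes nothing.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)  # inputs drawn from a standard normal

def fisher(a, b):
    # Gradient of f(x) = a*tanh(b*x) with respect to (a, b); with unit
    # Gaussian noise, the Fisher information is E[grad grad^T] over x.
    g = np.stack([np.tanh(b * x), a * x / np.cosh(b * x) ** 2])
    return g @ g.T / x.size

I_regular = fisher(a=1.0, b=0.8)    # generic point: full rank
I_singular = fisher(a=0.0, b=0.8)   # redundant unit (a = 0): rank deficient

assert np.linalg.matrix_rank(I_regular) == 2
assert np.linalg.matrix_rank(I_singular) == 1
```

At such singular points the usual quadratic (Laplace) approximation of the likelihood fails, which is why the MLE and posterior are not asymptotically normal there.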

Purpose:
We would like to clarify the relation between the learning 
curve and the geometry of neuro-manifolds when the set of 
true parameters is an analytic set (the set of zeros of 
an analytic function). 
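As a concrete illustration of such an analytic set (an assumed toy example, not from the announcement): for the one-unit model f(x; a, b) = a*tanh(b*x), the parameters realizing the zero function are exactly the zeros of the analytic function a*b, i.e. two crossing lines with a singularity at the origin rather than a smooth manifold.

```python
import numpy as np

# One-unit model f(x; a, b) = a*tanh(b*x). The true-parameter set for the
# zero target function is {(a, b) : a*b = 0} = {a = 0} union {b = 0}.
def f(x, a, b):
    return a * np.tanh(b * x)

x = np.linspace(-3.0, 3.0, 101)

assert np.allclose(f(x, 0.0, 1.7), 0.0)      # a = 0: zero function
assert np.allclose(f(x, 2.3, 0.0), 0.0)      # b = 0: zero function
assert not np.allclose(f(x, 1.0, 1.0), 0.0)  # a*b != 0: nonzero function
```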

Methods:
The following topics will be discussed. 
(1) Information Geometry of Singular Neuro-Manifolds.
(2) Theory of Order Statistics.
(3) Weak Convergence and Empirical Processes. 
(4) Conic Singularities.
(5) Algebraic Geometry and Resolution of Singularities.
(6) Zeta Functions of the Kullback Information and the Prior.

*******************************************


