Paper announcements
Nathan Intrator
nin at cns.brown.edu
Fri Aug 23 12:07:32 EDT 1996
*** Paper Announcements ***
The following papers are now available from my research page:
http://www.physics.brown.edu/people/nin/research.html
Comments are welcome.
-----------------------------------------------------------------------
Classifying Seismic Signals by Integrating Ensembles of Neural Networks
Yair Shimshoni and Nathan Intrator
ftp://cns.brown.edu/nin/papers/hong-kong.ps.Z
This paper proposes a classification scheme based on the
integration of multiple ensembles of artificial neural networks
(ANNs). It is demonstrated on a classification problem in which
seismic recordings of natural earthquakes must be distinguished
from recordings of artificial explosions. A redundant
classification environment consisting of several ensembles of
neural networks is created and trained on bootstrap sample sets,
using various data representations and architectures. The ANNs
within each ensemble are aggregated (as in bagging), while the
ensembles are integrated non-linearly, in a signal-adaptive
manner, using a posterior confidence measure based on the
agreement (variance) within each ensemble. The proposed
Integrated Classification Machine achieved 92.1% correct
classification on the seismic test data. Cross-validation
evaluations and comparisons indicate that such an integration of
a collection of ANN ensembles is a robust way to handle
high-dimensional problems with a complex, non-stationary signal
space, as in the present seismic classification problem.
To appear: Proceedings of ICONIP 96
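
For readers who want the flavor of the integration step, the
following is a minimal sketch in Python/NumPy, not the authors'
code: each ensemble's members are averaged (the bagging step), and
the ensembles are then combined with a per-signal weight derived
from the within-ensemble variance. All names (integrate_ensembles,
conf, etc.) and the toy data are illustrative only.

import numpy as np

rng = np.random.default_rng(0)

def integrate_ensembles(ensemble_preds):
    """Combine several ensembles of class-probability predictions.

    ensemble_preds: list of arrays, one per ensemble, each of shape
    (n_members, n_samples, n_classes) holding each member's
    predicted class probabilities.
    """
    combined, weights = [], []
    for preds in ensemble_preds:
        # Bagging step: average the members within one ensemble.
        mean = preds.mean(axis=0)             # (n_samples, n_classes)
        # Agreement within the ensemble: low variance across members
        # means high confidence for that particular signal.
        var = preds.var(axis=0).mean(axis=1)  # (n_samples,)
        combined.append(mean)
        weights.append(1.0 / (var + 1e-8))    # signal-adaptive weight
    combined = np.stack(combined)             # (n_ens, n_samples, n_classes)
    weights = np.stack(weights)               # (n_ens, n_samples)
    weights /= weights.sum(axis=0, keepdims=True)
    # Non-linear, per-signal weighting of the ensemble outputs.
    return (weights[..., None] * combined).sum(axis=0)

# Toy demonstration: three "ensembles" of five random members each,
# predicting probabilities over two classes for ten signals.
fake = [rng.dirichlet([1, 1], size=(5, 10)) for _ in range(3)]
final = integrate_ensembles(fake)
print(final.argmax(axis=1))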
-----------------------------------------------------------------------
Learning low dimensional representations of visual objects
with extensive use of prior knowledge
Nathan Intrator and Shimon Edelman
ftp://cns.brown.edu/nin/papers/ml1.ps.Z
Learning to recognize visual objects from examples requires the
ability to find meaningful patterns in spaces of very high
dimensionality. We present a method for dimensionality reduction
which effectively biases the learning system by combining multiple
constraints via an extensive use of class labels. The use of
multiple class labels steers the resulting low-dimensional
representation to become invariant to those directions of variation
in the input space that are irrelevant to classification; this is
done merely by making class labels independent of these directions.
We also show that prior knowledge of the proper dimensionality of
the target representation can be imposed by training a
multiple-layer bottleneck network. A series of computational
experiments involving parameterized fractal images and real human
faces indicates that the low-dimensional representation extracted
by our method leads to improved generalization in the learned
tasks, and is likely to preserve the topology of the original space.
To appear: Explanation-Based Neural Network Learning: A Lifelong
Learning Approach. Editor: Sebastian Thrun.
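
As a rough illustration of the bottleneck idea, here is a small
Python/NumPy sketch, assuming a toy synthetic dataset: a network
with a narrow hidden layer is trained purely from class labels, and
the bottleneck activations serve as the low-dimensional
representation. The architecture, data, and hyperparameters are
illustrative, not those used in the paper.

import numpy as np

rng = np.random.default_rng(1)

# Toy data: 200 points in 20 dimensions whose labels depend on only
# two underlying directions; the remaining variation is irrelevant
# to the classification.
n, d, k = 200, 20, 4
latent = rng.normal(size=(n, 2))
X = latent @ rng.normal(size=(2, d)) + 0.1 * rng.normal(size=(n, d))
y = (latent[:, 0] > 0).astype(int) + 2 * (latent[:, 1] > 0).astype(int)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Bottleneck network: 20 -> 2 -> 4. The two-unit layer forces a
# low-dimensional representation; the class labels supply the
# constraints that push it to ignore irrelevant directions.
W1 = rng.normal(scale=0.1, size=(d, 2))
W2 = rng.normal(scale=0.1, size=(2, k))
Y = np.eye(k)[y]
lr = 0.5
for _ in range(500):
    H = np.tanh(X @ W1)                      # bottleneck activations
    P = softmax(H @ W2)
    G = (P - Y) / n                          # softmax cross-entropy gradient
    gW2 = H.T @ G
    gW1 = X.T @ ((G @ W2.T) * (1 - H ** 2))  # backprop through tanh
    W2 -= lr * gW2
    W1 -= lr * gW1

print("training accuracy:", (P.argmax(axis=1) == y).mean())
# H now holds the learned 2-D representation of the 20-D inputs.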
-----------------------------------------------------------------------
Bootstrapping with Noise: An Effective Regularization Technique
Yuval Raviv and Nathan Intrator
ftp://cns.brown.edu/nin/papers/spiral.ps.Z
Bootstrap samples with noise are shown to be an effective smoothness
and capacity control technique for training feed-forward networks
and for other statistical methods such as generalized additive
models. It is shown that the noisy bootstrap performs best in
conjunction with weight-decay regularization and ensemble averaging.
The two-spiral problem, a highly non-linear, noise-free data set, is
used to demonstrate these findings.
The combination of noisy bootstrap and ensemble averaging is also
shown to be useful for generalized additive modeling, and is
demonstrated on the well-known Cleveland heart data
(Detrano et al., 1989).
To appear: Connection Science, special issue on Combining Estimators.
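
As a rough sketch of the recipe (not the paper's actual setup,
which trains feed-forward networks, e.g. on the two-spiral
problem), the following Python/NumPy fragment resamples the data
with replacement, jitters the inputs with Gaussian noise, fits an
L2-penalized model (a closed-form analogue of weight decay) to each
replicate, and averages the predictions. All names and parameter
values are illustrative.

import numpy as np

rng = np.random.default_rng(2)

# Toy 1-D regression problem standing in for the paper's experiments.
X = np.linspace(-3, 3, 40)[:, None]
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=40)

def fit_ridge(X, y, lam):
    """Least squares with an L2 penalty (the analogue of weight decay)."""
    Phi = np.hstack([X ** p for p in range(8)])   # polynomial features
    w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]),
                        Phi.T @ y)
    return lambda Z: np.hstack([Z ** p for p in range(8)]) @ w

def noisy_bootstrap_ensemble(X, y, n_models=25, noise=0.3, lam=1e-2):
    models = []
    n = len(X)
    for _ in range(n_models):
        idx = rng.integers(0, n, size=n)          # bootstrap resample
        Xb = X[idx] + noise * rng.normal(size=X[idx].shape)  # add noise
        models.append(fit_ridge(Xb, y[idx], lam))
    # Ensemble averaging of the individual predictors.
    return lambda Z: np.mean([m(Z) for m in models], axis=0)

predict = noisy_bootstrap_ensemble(X, y)
grid = np.linspace(-3, 3, 7)[:, None]
print(predict(grid))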