TRs available
Jai Choi
jai at blake.acs.washington.edu
Sat Apr 14 00:19:33 EDT 1990
To whom it may concern:
We would appreciate it if you would post the following announcement of two
technical notes. Thanks in advance.
Jai Choi.
==================================================================
Two Technical Notes Available
==================================================================
1. Query Learning Based on Boundary Search and Gradient
Computation of Trained Multilayer Perceptrons
Jenq-Neng Hwang, Jai J. Choi, Seho Oh, Robert J. Marks II
Interactive Systems Design Lab.
Department of Electrical Engr., FT-10
University of Washington
Seattle, WA 98195
****** Abstract *******
In many machine learning applications, the source of the training data
can be modeled as an oracle. An oracle has the ability, when presented
with an example (query), to give a correct classification. Efficient
query learning obtains good training data from the oracle at low
cost. This report presents a novel approach for query-based neural
network learning. Consider a layered perceptron partially trained for
binary classification. The single output neuron is trained to be either
a 0 or a 1. A test decision is made by thresholding the output at, say,
0.5. The set of inputs that produce an output of 0.5 forms the classification
boundary. We adopt an inversion algorithm for the neural network that
allows generation of this boundary. In addition, for each boundary point,
we can generate the classification gradient. The gradient provides a useful
measure of the sharpness of the multi-dimensional decision surfaces. Using
the boundary point and gradient information, conjugate input pair locations
are generated and presented to an oracle for proper classification.
This new data is used to further refine the classification boundary thereby
increasing the classification accuracy. The result can be a significant
reduction in the training set cardinality in comparison with, for example,
randomly generated data points. An application example to power security
assessment is given.
(To be presented at IJCNN'90, San Diego.)
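As a rough illustration of the inversion idea described above (a minimal sketch
on a toy two-layer perceptron with placeholder random weights, not the authors'
code), the Python fragment below drives an input toward the 0.5 output level by
gradient descent in input space and then reports the input gradient as a
sharpness measure:

# Sketch only: the weights stand in for a partially trained network.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 2)), rng.normal(size=4)   # hidden layer (assumed trained)
W2, b2 = rng.normal(size=4), rng.normal()               # single output neuron

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    h = sigmoid(W1 @ x + b1)
    return sigmoid(W2 @ h + b2), h

def output_gradient(x):
    # Gradient of the network output with respect to the input x.
    y, h = forward(x)
    dy_dh = y * (1.0 - y) * W2                 # back through output sigmoid
    dy_dx = (dy_dh * h * (1.0 - h)) @ W1       # back through hidden sigmoids
    return y, dy_dx

def invert_to_boundary(x0, target=0.5, lr=0.5, steps=500):
    # Gradient descent in input space minimizing (y - target)^2 / 2.
    x = x0.copy()
    for _ in range(steps):
        y, g = output_gradient(x)
        x -= lr * (y - target) * g
    return x

x_b = invert_to_boundary(rng.normal(size=2))
y_b, grad_b = output_gradient(x_b)
print("output at boundary point:", y_b)                  # should be close to 0.5
print("classification gradient magnitude:", np.linalg.norm(grad_b))

In the actual method, such boundary points and gradients would be used to place
the conjugate query pairs presented to the oracle.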
**********************************************************************
2. Iterative Constrained Inversion of Neural Networks and its Applications
Jenq-Neng Hwang, Chi H. Chan
****** Abstract ******
This report presents a new approach to solving constrained inverse problems
for a trained nonlinear mapping. These problems can be found in a wide variety
of applications in dynamic control of nonlinear systems and nonlinear
constrained optimization. The forward problem in a nonlinear functional
mapping is to obtain the best approximation of the output vector given the
input vector. The inverse problem, on the other hand, is to obtain the best
approximation of the input vector given a specified output vector, i.e., to
find the inverse function of the nonlinear mapping, which might not exist
unless constraints are imposed.
Most neural networks previously proposed for training the inverse mapping
adopt either one-way constraint perturbation or two-stage learning.
Both approaches are laborious and unreliable.
Instead of using two neural networks to emulate the forward
and inverse mappings separately, we apply a network inversion algorithm,
which works directly on the network trained for the forward mapping, yielding
the inverse mapping. Our approach uses one network to emulate both the forward
and inverse nonlinear mappings without explicitly characterizing and
implementing the inverse mapping. Furthermore, our single-network inversion
approach allows us to iteratively locate the optimal inverted solution that
also satisfies constraints imposed on the inputs, and allows the best
exploitation of the input-to-output sensitivity of the nonlinear mapping.
(Presented at the 24th Conf. on Information Sciences and Systems.)
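The fragment below is a hypothetical sketch of such iterative constrained
inversion with a simple box constraint on the inputs (clipping after each
gradient step); the network, weights, and bounds are illustrative assumptions,
not the TR's implementation:

# Sketch only: find an input x whose output matches a target y*, keeping x in a box.
import numpy as np

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(6, 3)), rng.normal(size=6)   # assumed trained forward map
W2, b2 = rng.normal(size=(2, 6)), rng.normal(size=2)   # 3 inputs -> 2 outputs

def tanh_layer(W, b, x):
    return np.tanh(W @ x + b)

def forward(x):
    return tanh_layer(W2, b2, tanh_layer(W1, b1, x))

def invert(y_target, x0, lo=-1.0, hi=1.0, lr=0.1, steps=2000):
    # Gradient descent on ||f(x) - y*||^2 in input space, with box projection.
    x = x0.copy()
    for _ in range(steps):
        h = tanh_layer(W1, b1, x)
        y = tanh_layer(W2, b2, h)
        e = y - y_target                       # output error
        d2 = (1.0 - y**2) * e                  # back through output tanh
        d1 = (1.0 - h**2) * (W2.T @ d2)        # back through hidden tanh
        x -= lr * (W1.T @ d1)                  # gradient with respect to the input
        x = np.clip(x, lo, hi)                 # enforce the input constraints
    return x

y_star = np.array([0.3, -0.2])
x_hat = invert(y_star, rng.normal(size=3))
print("recovered input:", x_hat)
print("forward(x_hat): ", forward(x_hat), "target:", y_star)

Because only the forward network's weights are used, the inverse mapping never
has to be represented explicitly; the projection step is one simple way to keep
the iterates feasible.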
******** For copies of the above two TRs ************
Send your physical mailing address to
Jai Choi
Dept. EE, FT-10
Univ. of Washington
Seattle, WA 98195
or "jai at blake.acs.washington.edu".