linear separability

David Lovell dlovell at s1.elec.uq.oz.au
Mon Mar 9 18:52:48 EST 1992


Sorry this is going out to everyone; I couldn't find a usable email
address for Christoph.
#
#Given a binary (-1/1) valued training set consisting of n-dimensional
#input vectors (homogeneous coordinates: n-1 inputs and a 1 for the bias
#term as the n-th dimension) and 1-dimensional target vectors. For this 2-class
#classification problem I wish to prove (non-) linear separability
#solely on the basis of the given training set (hence determine if
#the underlying problem may be solved with a 2-layer feedforward network).

Actually, you only need one neuron to separate that set of vectors (if, of
course, they are linearly separable).
#
#My approach so far: 
#Denote input vectors of the first (second) class with H1 (H2). We need 
#to find a hyperplane in n-dimensional space separating vectors of H1
#and H2. The hyperplane goes through the origin of the n-dimensional
#hypercube. Negate the vectors of H2 and the problem now states: Find the
#hyperplane with all vectors (from H1 and the negated of H2) on one side
#of the plane and none on the other. Pick one of the vectors (= a vertex of
#the hypercube) and compute the Hamming Distance to all other vectors.
#Clearly, there must be a constraint on how many of the vectors are
#allowed to be how far "away" in order for the hyperplane to be able to
#separate. E.g.: if any of the Hamming distances were n, then the
#training set would not be linearly separable.
#My problem is concerning the constraints for Distances of n-1, n-2,
#etc...
#
#Has anyone taken a similar approach?
A distressingly large number of people in the 60's actually.
(I've been beating my head against that problem for the past 6 months).
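
The Hamming-distance-n observation in the quoted approach can be checked
directly: after negating the class-2 vectors, an antipodal pair (Hamming
distance n, i.e. x = -y) rules out any separating hyperplane through the
origin, since w.x > 0 forces w.y < 0. A minimal sketch (function names
are mine, not from the original post):

```python
def hamming(x, y):
    """Number of coordinates where two +/-1 vectors differ."""
    return sum(1 for a, b in zip(x, y) if a != b)

def antipodal_pair_exists(H1, H2):
    """True if the merged set (H1 plus negated H2) contains an
    antipodal pair -- a quick certificate of non-separability."""
    n = len(H1[0])
    merged = list(H1) + [tuple(-c for c in v) for v in H2]
    return any(hamming(merged[i], merged[j]) == n
               for i in range(len(merged))
               for j in range(i + 1, len(merged)))

# The same input vector appearing in both classes becomes an
# antipodal pair after negation -- clearly not separable.
assert antipodal_pair_exists([(1, 1, 1)], [(1, 1, 1)])
```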

#Are there alternate solutions to
#linear separability? I tried linear programming, but the simplex
#algorithm uses a prohibitive number of variables for a mid-size
#problem.
#
There is a fundamental result concerning the sums of all TRUE vectors and
all FALSE vectors, but it doesn't work with only a sampling of those vectors.
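
For reference, the linear-programming formulation mentioned in the
question can be sketched with a modern solver (scipy's linprog -- not,
of course, what was available at the time; the variable count is one
weight per dimension, with one constraint per training vector):

```python
import numpy as np
from scipy.optimize import linprog

def linearly_separable(H1, H2):
    """Feasibility test: negate the class-2 vectors and ask for a
    weight vector w with (merged) w >= 1 on every row.  Such a w
    exists iff the two classes are linearly separable through the
    origin (homogeneous coordinates)."""
    A = np.vstack([np.asarray(H1, float), -np.asarray(H2, float)])
    n = A.shape[1]
    res = linprog(c=np.zeros(n),          # pure feasibility: minimise 0
                  A_ub=-A, b_ub=-np.ones(len(A)),
                  bounds=[(None, None)] * n,
                  method="highs")
    return bool(res.success)

# AND is separable, XOR is not (last coordinate is the bias input).
AND_true  = [(1, 1, 1)]
AND_false = [(-1, -1, 1), (-1, 1, 1), (1, -1, 1)]
XOR_true  = [(-1, 1, 1), (1, -1, 1)]
XOR_false = [(-1, -1, 1), (1, 1, 1)]
```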
Here is a list of papers that deal with the problem as best they can:

W.H. Highleyman, "A note on linear separation", IRE Trans. Electronic
Computers, EC-10, 777-778, 1961.

R.C. Singleton, "A test for linear separability applied to self-organizing
machines," in Self-Organizing Systems, M.C. Yovits, Ed., 503-524, 1961.

The Perceptron Convergence Procedure, Ho-Kashyap procedure and TLU
synthesis techniques by Dertouzos or Kaszerman will not converge in the
non-LS case; see:

Perceptron Convergence Procedure
R.O. Duda & P.E. Hart, "Pattern Classification and Scene Analysis," New
York: John Wiley and Sons, 1973.

Ho-Kashyap
ibid

TLU Dertouzos
M.L. Dertouzos, "Threshold Logic: A Synthesis Approach," MIT Research
Monographs, vol. 32, Cambridge, MA: MIT Press, 1965.

TLU Kaszerman
P. Kaszerman, "A Geometric Test-Synthesis Procedure for a Threshold
Device," Information and Control, 6, 381-393, 1963.

I'm not sure, but I think those last two require the functions to be
completely specified.
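
The Perceptron Convergence Procedure cited above, in its textbook form:
cycle through the merged sample (class-2 vectors negated) and add any
vector not yet strictly on the positive side. It halts in finitely many
passes iff the set is linearly separable; the epoch cutoff below is an
artificial addition of mine, since (as noted) it never converges in the
non-LS case.

```python
def perceptron(H1, H2, max_epochs=1000):
    """Return a separating weight vector, or None if none was found
    within max_epochs passes over the merged sample."""
    merged = [list(v) for v in H1] + [[-c for c in v] for v in H2]
    w = [0.0] * len(merged[0])
    for _ in range(max_epochs):
        errors = 0
        for x in merged:
            if sum(wi * xi for wi, xi in zip(w, x)) <= 0:
                w = [wi + xi for wi, xi in zip(w, x)]   # error correction
                errors += 1
        if errors == 0:
            return w          # every vector now strictly positive
    return None               # gave up: presumably not separable
```

For AND in homogeneous coordinates this finds a valid weight vector in a
couple of passes; for XOR it cycles until the cutoff.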

The approach that I was looking at is as follows.
What we are really interested in is the set of between n and
n Choose Floor(n/2) vectors that lie closest to the separating hyperplane (if
there is one). This is the union of the minimally TRUE and maximally FALSE
vectors.

If we can show that this set of vectors lies on one side of some hyperplane
then the problem is solved. The reason that the problem is not solved is
that I haven't yet been able to find a method for determining these
vectors without having the problem completely specified!
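
To illustrate what "completely specified" buys you: given the full truth
table of a positive (monotone) Boolean function on {-1,+1}^n, the
minimally TRUE vectors are the true points with no other true point
componentwise below them, and dually for maximally FALSE. A brute-force
enumeration over all 2^n points (my sketch, not a method from the post):

```python
from itertools import product

def below(x, y):
    """Componentwise x <= y on {-1,+1} vectors."""
    return all(a <= b for a, b in zip(x, y))

def extremal_vectors(f, n):
    """Minimally TRUE and maximally FALSE vectors of a completely
    specified Boolean function f on {-1,+1}^n."""
    pts = list(product((-1, 1), repeat=n))
    true_pts  = [p for p in pts if f(p)]
    false_pts = [p for p in pts if not f(p)]
    min_true  = [p for p in true_pts
                 if not any(q != p and below(q, p) for q in true_pts)]
    max_false = [p for p in false_pts
                 if not any(q != p and below(p, q) for q in false_pts)]
    return min_true, max_false

# Majority on 3 inputs: the minimally TRUE vectors are the three
# points with exactly two +1 components.
maj = lambda p: sum(p) > 0
```

The catch, of course, is the loop over all 2^n points: with only a
training sample, these extremal vectors cannot be read off this way.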

#Your help is greatly appreciated,
#
I could use a bit of help too.
Anyone interested? Please mail me if you want or can supply
more info. Don't forget to send your email address so that we can continue
our discussion in private if it does not prove to be of general interest to
the readership of connectionists.

Happy connectionisming.
-- 

David Lovell - dlovell at s1.elec.uq.oz.au  |
                                         |
Dept. Electrical Engineering             | "Oh bother! The pudding is ruined
University of Queensland                 | completely now!" said Marjory, as
BRISBANE 4072                            | Henry the dachshund leapt up and
Australia                                | into the lemon surprise.
                                         |
tel: (07) 365 3564                       |
