Connectionists: Stephen Hanson in conversation with Geoff Hinton

Danko Nikolic danko.nikolic at gmail.com
Tue Jul 19 17:08:36 EDT 2022


Dear Barak,

Thank you for the pointers. I have to read the papers. I then need to
understand why, if parity is so easy to learn, my deep learning models had
such a hard time with it: the number of parameters they needed grew
exponentially with each additional bit added to the input. Strange.
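
For concreteness, here is a minimal sketch of the kind of experiment I am
describing (not my actual setup; the architecture, width, and training
hyperparameters below are purely illustrative):

import itertools
import torch
import torch.nn as nn

def parity_dataset(n_bits):
    # All 2^n binary inputs, labeled by their parity (XOR of all bits).
    xs = torch.tensor(list(itertools.product([0.0, 1.0], repeat=n_bits)))
    ys = xs.sum(dim=1).remainder(2).unsqueeze(1)
    return xs, ys

def train_parity(n_bits, width=32, epochs=2000, lr=0.1):
    # One hidden layer of fixed width, trained on the full truth table.
    xs, ys = parity_dataset(n_bits)
    model = nn.Sequential(nn.Linear(n_bits, width), nn.ReLU(),
                          nn.Linear(width, 1), nn.Sigmoid())
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.BCELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(xs), ys).backward()
        opt.step()
    # Fraction of the truth table the trained net gets right.
    return ((model(xs) > 0.5).float() == ys).float().mean().item()

for n in range(2, 9):
    print(n, train_parity(n))  # accuracy at a fixed width as n grows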

I will report back.

Best

Danko

Dr. Danko Nikolić
www.danko-nikolic.com
https://www.linkedin.com/in/danko-nikolic/
-- I wonder, how is the brain able to generate insight? --


On Mon, Jul 18, 2022 at 1:12 PM Barak A. Pearlmutter <barak at pearlmutter.net>
wrote:

> On Mon, 18 Jul 2022 at 08:28, Danko Nikolic <danko.nikolic at gmail.com>
> wrote:
> > In short, learning mechanisms cannot discover generalized XOR functions
> with simple connectivity -- only with complex connectivity. This problem
> results in exponential growth of needed resources as the number of bits in
> the generalized XOR increases.
>
> Assuming that "generalized XOR" means parity, this must rely on some
> unusual definitions which you should probably state in order to avoid
> confusion.
>
> Parity is a poster boy for an *easy* function to learn, albeit a
> nonlinear one. This is because in the (boolean) Fourier domain its
> spectrum consists of a single nonzero coefficient, and functions that
> are sparse in that domain are very easy to learn. See N. Linial, Y.
> Mansour, and N. Nisan, "Constant depth circuits, Fourier transform, and
> learnability", FOCS 1989; or Y. Mansour, "Learning Boolean functions via
> the Fourier transform", in Theoretical Advances in Neural Computation
> and Learning, pp. 391–424, 1994, doi:10.1007/978-1-4615-2696-4_11.
>
> --Barak Pearlmutter
>
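
As a quick numerical check of the single-nonzero-coefficient claim (a
sketch, assuming the standard {-1,+1} encoding, in which n-bit parity is
just the product of the inputs):

import itertools
import math

def fourier_coefficient(f, S, n):
    # f_hat(S) = average over x in {-1,+1}^n of f(x) * chi_S(x),
    # where chi_S(x) is the product of x[i] for i in S.
    total = 0
    for x in itertools.product([-1, 1], repeat=n):
        total += f(x) * math.prod(x[i] for i in S)
    return total / 2 ** n

n = 4
parity = lambda x: math.prod(x)  # n-bit parity in the {-1,+1} encoding

# Scan all 2^n subsets S; only the full set {0,...,n-1} survives.
for r in range(n + 1):
    for S in itertools.combinations(range(n), r):
        c = fourier_coefficient(parity, S, n)
        if c != 0:
            print(S, c)  # prints (0, 1, 2, 3) 1.0 and nothing else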