<div dir="ltr">Dear Barak,<div><br></div><div>Thank you for the pointers. I have to read the papers. I need to understand then why, if parity is so easy to learn, my deep learning models had such a hard time that it led to an exponential growth in the number of needed parameters with each additional bit added to the input. Strange.</div><div><br></div><div>I will report back.</div><div><br></div><div>Best</div><div><br></div><div>Danko</div><div><br clear="all"><div><div dir="ltr" class="gmail_signature" data-smartmail="gmail_signature"><div dir="ltr">Dr. Danko Nikolić<br><a href="http://www.danko-nikolic.com" target="_blank">www.danko-nikolic.com</a><br><a href="https://www.linkedin.com/in/danko-nikolic/" target="_blank">https://www.linkedin.com/in/danko-nikolic/</a><div><span style="color:rgb(34,34,34)">-- I wonder, how is the brain able to generate insight? --</span><br></div></div></div></div><br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Mon, Jul 18, 2022 at 1:12 PM Barak A. Pearlmutter <<a href="mailto:barak@pearlmutter.net">barak@pearlmutter.net</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">On Mon, 18 Jul 2022 at 08:28, Danko Nikolic <<a href="mailto:danko.nikolic@gmail.com" target="_blank">danko.nikolic@gmail.com</a>> wrote:<br>
> > In short, learning mechanisms cannot discover generalized XOR functions with simple connectivity -- only with complex connectivity. This problem results in exponential growth of needed resources as the number of bits in the generalized XOR increases.
>
> Assuming that "generalized XOR" means parity, this must rely on some
> unusual definitions which you should probably state in order to avoid
> confusion.
>
> Parity is a poster boy for an *easy* function to learn, albeit a
> nonlinear one. This is because in the (boolean) Fourier domain its
> spectrum consists of a single nonzero coefficient, and functions that
> are sparse in that domain are very easy to learn. See N. Linial, Y.
> Mansour, and N. Nisan, "Constant depth circuits, Fourier Transform and
> learnability", FOCS 1989, or Mansour, Y. (1994). Learning Boolean
> Functions via the Fourier Transform. Theoretical Advances in Neural
> Computation and Learning, 391-424. doi:10.1007/978-1-4615-2696-4_11
>
> --Barak Pearlmutter
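
For concreteness, a minimal sketch of the point in the quoted message, assuming the usual {-1, +1} encoding of bits: n-bit parity is then just the product of the coordinates, i.e. it is itself a single Fourier character, so its Boolean Fourier spectrum has exactly one nonzero coefficient. The short Python snippet below (illustrative only; the choice n = 4 and all names are arbitrary) checks this by brute-force enumeration of the coefficients f_hat(S) = E_x[f(x) * prod_{i in S} x_i].

    from itertools import product, combinations
    import math

    n = 4  # number of bits; a small arbitrary choice for the demonstration

    # Inputs live in {-1, +1}^n; parity is the product of the coordinates.
    inputs = list(product([-1, 1], repeat=n))

    def parity(x):
        return math.prod(x)

    def chi(S, x):
        # Fourier character indexed by the subset S of coordinates
        return math.prod(x[i] for i in S)

    def fourier_coefficient(f, S):
        # f_hat(S): average of f(x) * chi_S(x) over the uniform distribution on {-1,+1}^n
        return sum(f(x) * chi(S, x) for x in inputs) / len(inputs)

    nonzero = [
        (S, c)
        for r in range(n + 1)
        for S in combinations(range(n), r)
        if abs(c := fourier_coefficient(parity, S)) > 1e-12
    ]
    print(nonzero)  # -> [((0, 1, 2, 3), 1.0)]: a single nonzero coefficient

The brute-force enumeration is of course exponential in n; it is only meant to exhibit the spectrum, not to serve as a learning procedure.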
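
Separately, a rough sketch of the kind of experiment described at the top of the thread, to make the "parameters versus bits" question concrete. This is an assumed setup, not the original code: it trains a small ReLU MLP on all 2^n labelled parity examples and sweeps the hidden width, reporting the smallest width (and parameter count) that fits the data for each n. The architecture, widths, and optimizer settings are illustrative guesses.

    from itertools import product
    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    def parity_dataset(n):
        # All 2^n bit strings with 0/1 parity labels.
        X = torch.tensor(list(product([0.0, 1.0], repeat=n)))
        y = X.sum(dim=1).remainder(2).unsqueeze(1)
        return X, y

    def train_mlp(n, hidden, epochs=3000, lr=0.01):
        X, y = parity_dataset(n)
        model = nn.Sequential(nn.Linear(n, hidden), nn.ReLU(), nn.Linear(hidden, 1))
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        loss_fn = nn.BCEWithLogitsLoss()
        for _ in range(epochs):
            opt.zero_grad()
            loss = loss_fn(model(X), y)
            loss.backward()
            opt.step()
        acc = ((model(X) > 0).float() == y).float().mean().item()
        n_params = sum(p.numel() for p in model.parameters())
        return acc, n_params

    for n in range(2, 7):
        for hidden in (2, 4, 8, 16, 32, 64):
            acc, n_params = train_mlp(n, hidden)
            if acc == 1.0:
                print(f"n={n}: fitted with hidden width {hidden} ({n_params} parameters)")
                break
        else:
            print(f"n={n}: not fitted with hidden widths up to 64")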