Dear all,

I am an external observer of your interesting discussions. It has been a bit surprising to me that the work of Prof. Nikolic does not comment on the work of Fuster (J. M. Fuster, Cortex and Mind: Unifying Cognition, Oxford University Press, 2005), which I feel is a highly relevant predecessor of his work.

Best regards,
Miroslav Karny
https://www.utia.cas.cz/people/karny

Danko Nikolic wrote:
Dear Gary and everyone,

I am continuing the discussion from where we left off a few months ago. Back then, some of us agreed that the problem of understanding remains unsolved.

As a reminder, the challenge for connectionism was to 1) learn with few examples and 2) apply the knowledge to a broad set of situations.

I am happy to announce that I have now finished a draft of a paper in which I propose how the brain is able to achieve that. The manuscript requires a bit of patience for two reasons: the first is that the reader may be encountering certain aspects of brain physiology for the first time; the second is that it may take some effort to understand the counterintuitive implications of the new ideas (this requires a different way of thinking than the one we are used to from connectionism).

In short, I am suggesting that, instead of the connectionist paradigm, we adopt transient selection of subnetworks. The mechanisms that transiently select brain subnetworks are distributed all over the nervous system and, I argue, are our main machinery for thinking/cognition. The surprising outcome is that neural activation, which was central in connectionism, now plays only a supportive role, while the real 'workers' within the brain are the mechanisms for transient selection of subnetworks.

I also explain how I think transient selection achieves learning with only a few examples and how the learned knowledge can be applied to a broad set of situations.
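A minimal, purely illustrative sketch (Python/NumPy) of the general shape of that idea as it reads from the paragraphs above: a context signal transiently selects which subset of a fixed network takes part in a given computation, while the weights themselves stay put. The gating rule, sizes, and names are assumptions for illustration only, not anything taken from the manuscript.

    import numpy as np

    rng = np.random.default_rng(0)

    # One fixed pool of units; what changes from moment to moment is which
    # subset of them is allowed to take part in the computation.
    n_units = 8
    weights = rng.normal(size=(n_units, n_units))

    def select_subnetwork(context):
        # Hypothetical transient selection: a context signal switches a subset
        # of units on for the duration of one computation, then releases them.
        return (context > 0.5).astype(float)

    def run(x, context):
        gate = select_subnetwork(context)
        # Only the selected units participate; activation itself plays a
        # supporting role, the selection does the "work".
        return np.tanh((weights * np.outer(gate, gate)) @ (x * gate))

    x = rng.normal(size=n_units)
    print(run(x, rng.uniform(size=n_units)))  # situation A: one subnetwork
    print(run(x, rng.uniform(size=n_units)))  # situation B: another subnetwork, same weights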
The manuscript is made available to everyone and can be downloaded here:
https://bit.ly/3IFs8Ug
(I apologize for the neuroscience lingo, which I tried to minimize.)

It will likely take a wide effort to implement these concepts as an AI technology, provided my ideas do not have a major flaw in the first place. Does anyone see a flaw?

Thanks.

Danko

Dr. Danko Nikolić
http://www.danko-nikolic.com
https://www.linkedin.com/in/danko-nikolic/

On Thu, Feb 3, 2022 at 5:25 PM Gary Marcus <gary.marcus@nyu.edu> wrote:

> Dear Danko,
>
> Well said. I had a somewhat similar response to Jeff Dean's 2021 TED talk, in which he said (paraphrasing from memory, because I don't remember the precise words) that the famous 2012 Quoc Le unsupervised model [https://static.googleusercontent.com/media/research.google.com/en//archive/unsupervised_icml2012.pdf] had learned the concept of a cat. In reality the model had clustered together some catlike images based on the image statistics that it had extracted, but it was a long way from a full, counterfactual-supporting concept of a cat, much as you describe below.
>
> I fully agree with you that the reason for even having a semantics is, as you put it, "to 1) learn with a few examples and 2) apply the knowledge to a broad set of situations." GPT-3 sometimes gives the appearance of having done so, but it falls apart under close inspection, so the problem remains unsolved.
>
> Gary
>
> On Feb 3, 2022, at 3:19 AM, Danko Nikolic <danko.nikolic@gmail.com> wrote:
>
> G. Hinton wrote: "I believe that any reasonable person would admit that if you ask a neural net to draw a picture of a hamster wearing a red hat and it draws such a picture, it understood the request."
>
> I would like to suggest why drawing a hamster with a red hat does not necessarily imply understanding of the statement "hamster wearing a red hat".
> To understand "hamster wearing a red hat" would mean inferring, in newly emerging situations of this hamster, all the real-life implications that the red hat brings to the little animal.
>
> What would happen to the hat if the hamster rolls on its back? (Would the hat fall off?)
> What would happen to the red hat when the hamster enters its lair? (Would the hat fall off?)
> What would happen to that hamster when it goes foraging? (Would the red hat have an influence on finding food?)
> What would happen in a situation of being chased by a predator? (Would it be easier for predators to spot the hamster?)
>
> ...and so on.
>
> Countless questions can be asked. One has understood "hamster wearing a red hat" only if one can answer reasonably well many such real-life questions. Similarly, a student has understood the material in a class only if they can apply it in real-life situations (e.g., applying Pythagoras' theorem).
> If a student gives a correct answer to a multiple-choice question, we don't know whether the student understood the material or whether this was just rote learning (often, it is rote learning).
>
> I also suggest that understanding comes together with effective learning: we store new information in such a way that we can recall it later and use it effectively, i.e., make good inferences in newly emerging situations based on this knowledge.
>
> In short: understanding makes us humans able to 1) learn with a few examples and 2) apply the knowledge to a broad set of situations.
>
> No neural network today has such capabilities, and we don't know how to give them such capabilities. Neural networks need large amounts of training examples that cover a large variety of situations, and then the networks can only deal with what the training examples have already covered. Neural networks cannot extrapolate in that 'understanding' sense.
>
> I suggest that understanding truly extrapolates from a piece of knowledge. It is not about satisfying a task such as translation between languages or drawing hamsters with hats. It is about how you got the capability to complete the task: did you have only a few examples that covered something different but related, and then you extrapolated from that knowledge? If yes, this is going in the direction of understanding. Have you seen countless examples and then interpolated among them? Then perhaps it is not understanding.
>
> So, for the case of drawing a hamster wearing a red hat, understanding perhaps would have taken place if the following happened before that:
>
> 1) first, the network learned about hamsters (not many examples)
> 2) after that, the network learned about red hats (outside the context of hamsters and without many examples)
> 3) finally, the network learned about drawing (outside of the context of hats and hamsters, not many examples)
>
> After that, the network is asked to draw a hamster with a red hat. If it does it successfully, maybe we have started cracking the problem of understanding.
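A hedged sketch of that test protocol as code: the model interface (learn/respond), the example sets, and the judge are hypothetical placeholders introduced only for illustration; they are not an existing API, nor an implementation proposed anywhere in the thread.

    def understanding_test(model, hamster_examples, red_hat_examples, drawing_examples, judge):
        # Three small, separate lessons, learned one after another.
        model.learn(hamster_examples)    # 1) hamsters only, few examples
        model.learn(red_hat_examples)    # 2) red hats only, few examples, no hamsters
        model.learn(drawing_examples)    # 3) drawing only, few examples, no hats or hamsters

        # ...followed by a compositional request never seen during training.
        picture = model.respond("Draw a hamster wearing a red hat.")

        # Sequential learning must not have erased the earlier lessons
        # (no catastrophic forgetting).
        retained = all(judge(model.respond(q))
                       for q in ("Describe a hamster.", "Describe a red hat."))
        return judge(picture) and retained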
> Note also that this requires the network to learn sequentially without exhibiting catastrophic forgetting of the previous knowledge, which is possibly also a consequence of human learning by understanding.
>
> Danko
>
> Dr. Danko Nikolić
> http://www.danko-nikolic.com
> https://www.linkedin.com/in/danko-nikolic/
> --- A progress usually starts with an insight ---
>
> On Thu, Feb 3, 2022 at 9:55 AM Asim Roy <ASIM.ROY@asu.edu> wrote:
>
>> Without getting into the specific dispute between Gary and Geoff, I think with approaches similar to GLOM, we are finally headed in the right direction. There's plenty of neurophysiological evidence for single-cell abstractions and multisensory neurons in the brain, which one might claim correspond to symbols.
>> And I think we can finally reconcile the decades-old dispute between Symbolic AI and Connectionism.
>>
>> GARY: (Your GLOM, which as you know I praised publicly, is in many ways an effort to wind up with encodings that effectively serve as symbols in exactly that way, guaranteed to serve as consistent representations of specific concepts.)
>>
>> GARY: I have *never* called for dismissal of neural networks, but rather for some hybrid between the two (as you yourself contemplated in 1991); the point of the 2001 book was to characterize exactly where multilayer perceptrons succeeded and broke down, and where symbols could complement them.
>>
>> Asim Roy
>> Professor, Information Systems
>> Arizona State University
>> Lifeboat Foundation Bios: Professor Asim Roy (https://lifeboat.com/ex/bios.asim.roy)
>> Asim Roy | iSearch (asu.edu) (https://isearch.asu.edu/profile/9973)
>>
>> From: Connectionists <connectionists-bounces@mailman.srv.cs.cmu.edu> On Behalf Of Gary Marcus
>> Sent: Wednesday, February 2, 2022 1:26 PM
>> To: Geoffrey Hinton <geoffrey.hinton@gmail.com>
>> Cc: AIhub <aihuborg@gmail.com>; connectionists@mailman.srv.cs.cmu.edu
>> Subject: Re: Connectionists: Stephen Hanson in conversation with Geoff Hinton
>>
>> Dear Geoff, and interested others,
>>
>> What, for example, would you make of a system that often drew the red-hatted hamster you requested, and perhaps a fifth of the time gave you utter nonsense? Or say one that you trained to create birds but sometimes output stuff like this:
>>
>> [image001.png]
>>
>> One could
>>
>> a. avert one's eyes and deem the anomalous outputs irrelevant
>> or
>> b. wonder if it might be possible that sometimes the system gets the right answer for the wrong reasons (e.g., partial historical contingency), and wonder whether another approach might be indicated.
>>
>> Benchmarks are harder than they look; most of the field has come to recognize that. The Turing Test has turned out to be a lousy measure of intelligence, easily gamed. It has turned out empirically that the Winograd Schema Challenge did not measure common sense as well as Hector might have thought.
>> (As it happens, I am a minor coauthor of a very recent review on this very topic: https://arxiv.org/abs/2201.02387.) But its conquest in no way means machines now have common sense; many people from many different perspectives recognize that (including, e.g., Yann LeCun, who generally tends to be more aligned with you than with me).
>>
>> So: on the goalpost of the Winograd schema, I was wrong, and you can quote me; but what you said about me and machine translation remains your invention, and it is inexcusable that you simply ignored my 2019 clarification. On the essential goal of trying to reach meaning and understanding, I remain unmoved; the problem remains unsolved.
>>
>> All of the problems LLMs have with coherence, reliability, truthfulness, misinformation, etc. stand witness to that fact. (Their persistent inability to filter out toxic and insulting remarks stems from the same.) I am hardly the only person in the field to see that progress on any given benchmark does not inherently mean that the deep underlying problems have been solved. You, yourself, in fact, have occasionally made that point.
>>
>> With respect to embeddings: embeddings are very good for natural language *processing*; but NLP is not the same as NL*U* – when it comes to *understanding*, their worth is still an open question. Perhaps they will turn out to be necessary; they clearly aren't sufficient. In their extreme, they might even collapse into being symbols, in the sense of uniquely identifiable encodings, akin to the ASCII code, in which a specific set of numbers stands for a specific word or concept. (Wouldn't that be ironic?)
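A tiny illustrative sketch of that "collapse": snapping continuous embeddings to the nearest entry of a fixed codebook yields uniquely identifiable, symbol-like codes. The vectors, names, and the codebook itself are made up for illustration and come from no real system.

    import numpy as np

    # Hypothetical two-dimensional codebook: one fixed code per concept.
    codebook = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0]])
    names = ["hamster", "hat", "red"]

    def to_symbol(embedding):
        # Nearest-codebook lookup: a continuous vector collapses to a discrete,
        # uniquely identifiable code, i.e., something symbol-like.
        return names[int(np.argmin(np.linalg.norm(codebook - embedding, axis=1)))]

    print(to_symbol(np.array([0.9, 0.1])))   # -> "hamster"
    print(to_symbol(np.array([0.2, 0.8])))   # -> "hat"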
>> (Your GLOM, which as you know I praised publicly, is in many ways an effort to wind up with encodings that effectively serve as symbols in exactly that way, guaranteed to serve as consistent representations of specific concepts.)
>>
>> Notably absent from your email is any kind of apology for misrepresenting my position. It's one thing to say that "many people thirty years ago once thought X" and another to say "Gary Marcus said X in 2015", when I didn't. I have consistently felt throughout our interactions that you have mistaken me for Zenon Pylyshyn; indeed, you once (at NeurIPS 2014) apologized to me for having made that error. I am still not he.
>>
>> Which maybe connects to the last point; if you read my work, you would see thirty years of arguments *for* neural networks, just not in the way that you want them to exist. I have ALWAYS argued that there is a role for them; characterizing me as a person "strongly opposed to neural networks" misses the whole point of my 2001 book, which was subtitled "Integrating Connectionism and Cognitive Science."
>>
>> In the last two decades or so you have insisted (for reasons you have never fully clarified, so far as I know) on abandoning symbol-manipulation, but the reverse is not the case: I have *never* called for dismissal of neural networks, but rather for some hybrid between the two (as you yourself contemplated in 1991); the point of the 2001 book was to characterize exactly where multilayer perceptrons succeeded and broke down, and where symbols could complement them. It's a rhetorical trick (which is what the previous thread was about) to pretend otherwise.
>>
>> Gary
>>
>> On Feb 2, 2022, at 11:22, Geoffrey Hinton <geoffrey.hinton@gmail.com> wrote:
>>
>> Embeddings are just vectors of soft feature detectors and they are very good for NLP. The quote on my webpage from Gary's 2015 chapter implies the opposite.
>>
>> A few decades ago, everyone I knew then would have agreed that the ability to translate a sentence into many different languages was strong evidence that you understood it.
>>
>> But once neural networks could do that, their critics moved the goalposts. An exception is Hector Levesque, who defined the goalposts more sharply by saying that the ability to get pronoun references correct in Winograd sentences is a crucial test. Neural nets are improving at that but still have some way to go. Will Gary agree that when they can get pronoun references correct in Winograd sentences they really do understand? Or does he want to reserve the right to weasel out of that too?
>>
>> Some people, like Gary, appear to be strongly opposed to neural networks because they do not fit their preconceived notions of how the mind should work.
>>
>> I believe that any reasonable person would admit that if you ask a neural net to draw a picture of a hamster wearing a red hat and it draws such a picture, it understood the request.
>>
>> Geoff
>>
>> On Wed, Feb 2, 2022 at 1:38 PM Gary Marcus <gary.marcus@nyu.edu> wrote:
>>
>> Dear AI Hub, cc: Stephen Hanson and Geoffrey Hinton, and the larger neural network community,
>>
>> There has been a lot of recent discussion on this list about framing and scientific integrity. Often the first step in restructuring narratives is to bully and dehumanize critics. The second is to misrepresent their position. People in positions of power are sometimes tempted to do this.
>>
>> The Hinton-Hanson interview that you just published is a real-time example of just that.
>> It opens with a needless and largely content-free personal attack on a single scholar (me), with the explicit intention of discrediting that person. Worse, the only substantive thing it says is false.
>>
>> Hinton says "In 2015 he [Marcus] made a prediction that computers wouldn't be able to do machine translation."
>>
>> I never said any such thing.
>>
>> What I predicted, rather, was that multilayer perceptrons, as they existed then, would not (on their own, absent other mechanisms) *understand* language. Seven years later, they still haven't, except in the most superficial way.
>>
>> I made no comment whatsoever about machine translation, which I view as a separate problem, solvable to a certain degree by correspondence without semantics.
>>
>> I specifically tried to clarify Hinton's confusion in 2019, but, disappointingly, he has continued to purvey misinformation despite that clarification. Here is what I wrote privately to him then, which should have put the matter to rest:
>>
>> You have taken a single out-of-context quote [from 2015] and misrepresented it. The quote, which you have prominently displayed at the bottom of your own web page, says:
>>
>> Hierarchies of features are less suited to challenges such as language, inference, and high-level planning. For example, as Noam Chomsky famously pointed out, language is filled with sentences you haven't seen before. Pure classifier systems don't know what to do with such sentences. The talent of feature detectors -- in identifying which member of some category something belongs to -- doesn't translate into understanding novel sentences, in which each sentence has its own unique meaning.
>>
>> It does *not* say "neural nets would not be able to deal with novel sentences"; it says that hierarchies of feature detectors (on their own, if you read the context of the essay) would have trouble *understanding* novel sentences.
>>
>> Google Translate does not yet *understand* the content of the sentences it translates. It cannot reliably answer questions about who did what to whom, or why; it cannot infer the order of the events in paragraphs; it can't determine the internal consistency of those events; and so forth.
>>
>> Since then, a number of scholars, such as the computational linguist Emily Bender, have made similar points, and indeed current LLM difficulties with misinformation, incoherence and fabrication all follow from these concerns.
>> Quoting from Bender's prizewinning 2020 ACL article on the matter with Alexander Koller, https://aclanthology.org/2020.acl-main.463.pdf, also emphasizing issues of understanding and meaning:
>>
>> *The success of the large neural language models on many NLP tasks is exciting. However, we find that these successes sometimes lead to hype in which these models are being described as "understanding" language or capturing "meaning". In this position paper, we argue that a system trained only on form has a priori no way to learn meaning. ... a clear understanding of the distinction between form and meaning will help guide the field towards better science around natural language understanding.*
>>
>> Her later article with Gebru on language models as "stochastic parrots" is in some ways an extension of this point; machine translation requires mimicry, true understanding (which is what I was discussing in 2015) requires something deeper than that.
>>
>> Hinton's intellectual error here is in equating machine translation with the deeper comprehension that robust natural language understanding will require; as Bender and Koller observed, the two appear not to be the same. (There is a longer discussion of the relation between language understanding and machine translation, and why the latter has turned out to be more approachable than the former, in my 2019 book with Ernest Davis.)
>>
>> More broadly, Hinton's ongoing dismissiveness of research from perspectives other than his own (e.g., linguistics) has done the field a disservice.
>>
>> As Herb Simon once observed, science does not have to be zero-sum.
>>
>> Sincerely,
>> Gary Marcus
>> Professor Emeritus
>> New York University
>>
>> On Feb 2, 2022, at 06:12, AIhub <aihuborg@gmail.com> wrote:
>>
>> Stephen Hanson in conversation with Geoff Hinton
>>
>> In the latest episode of this video series for AIhub.org, Stephen Hanson talks to Geoff Hinton about neural networks, backpropagation, overparameterization, digit recognition, voxel cells, syntax and semantics, Winograd sentences, and more.
>>
>> You can watch the discussion, and read the transcript, here:
>> https://aihub.org/2022/02/02/what-is-ai-stephen-hanson-in-conversation-with-geoff-hinton/
>> About AIhub:
>> AIhub is a non-profit dedicated to connecting the AI community to the public by providing free, high-quality information through AIhub.org (https://aihub.org/). We help researchers publish the latest AI news, summaries of their work, opinion pieces, tutorials and more. We are supported by many leading scientific organizations in AI, namely AAAI, NeurIPS, ICML, AIJ/IJCAI, ACM SIGAI, EurAI/AICOMM, CLAIRE and RoboCup.
>>
>> Twitter: @aihuborg