<html xmlns:v="urn:schemas-microsoft-com:vml" xmlns:o="urn:schemas-microsoft-com:office:office" xmlns:w="urn:schemas-microsoft-com:office:word" xmlns:m="http://schemas.microsoft.com/office/2004/12/omml" xmlns="http://www.w3.org/TR/REC-html40">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<meta name="Generator" content="Microsoft Word 15 (filtered medium)">
<!--[if !mso]><style>v\:* {behavior:url(#default#VML);}
o\:* {behavior:url(#default#VML);}
w\:* {behavior:url(#default#VML);}
.shape {behavior:url(#default#VML);}
</style><![endif]--><style><!--
/* Font Definitions */
@font-face
        {font-family:"Cambria Math";
        panose-1:2 4 5 3 5 4 6 3 2 4;}
@font-face
        {font-family:Calibri;
        panose-1:2 15 5 2 2 2 4 3 2 4;}
/* Style Definitions */
p.MsoNormal, li.MsoNormal, div.MsoNormal
        {margin:0in;
        font-size:11.0pt;
        font-family:"Calibri",sans-serif;}
a:link, span.MsoHyperlink
        {mso-style-priority:99;
        color:blue;
        text-decoration:underline;}
p.MsoListParagraph, li.MsoListParagraph, div.MsoListParagraph
        {mso-style-priority:34;
        margin-top:0in;
        margin-right:0in;
        margin-bottom:0in;
        margin-left:.5in;
        font-size:11.0pt;
        font-family:"Calibri",sans-serif;}
p.gmail-m-819792860580200459msolistparagraph, li.gmail-m-819792860580200459msolistparagraph, div.gmail-m-819792860580200459msolistparagraph
        {mso-style-name:gmail-m_-819792860580200459msolistparagraph;
        mso-margin-top-alt:auto;
        margin-right:0in;
        mso-margin-bottom-alt:auto;
        margin-left:0in;
        font-size:11.0pt;
        font-family:"Calibri",sans-serif;}
span.EmailStyle20
        {mso-style-type:personal-reply;
        font-family:"Calibri",sans-serif;
        color:windowtext;}
.MsoChpDefault
        {mso-style-type:export-only;
        font-size:10.0pt;}
@page WordSection1
        {size:8.5in 11.0in;
        margin:1.0in 1.0in 1.0in 1.0in;}
div.WordSection1
        {page:WordSection1;}
/* List Definitions */
@list l0
        {mso-list-id:458646510;
        mso-list-template-ids:-286735790;}
@list l0:level1
        {mso-level-tab-stop:.5in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l0:level2
        {mso-level-start-at:0;
        mso-level-tab-stop:1.0in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l0:level3
        {mso-level-start-at:0;
        mso-level-tab-stop:1.5in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l0:level4
        {mso-level-start-at:0;
        mso-level-tab-stop:2.0in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l0:level5
        {mso-level-start-at:0;
        mso-level-tab-stop:2.5in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l0:level6
        {mso-level-start-at:0;
        mso-level-tab-stop:3.0in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l0:level7
        {mso-level-start-at:0;
        mso-level-tab-stop:3.5in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l0:level8
        {mso-level-start-at:0;
        mso-level-tab-stop:4.0in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l0:level9
        {mso-level-start-at:0;
        mso-level-tab-stop:4.5in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l1
        {mso-list-id:1079984619;
        mso-list-template-ids:1052827860;}
@list l1:level2
        {mso-level-start-at:0;
        mso-level-tab-stop:1.0in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l1:level3
        {mso-level-start-at:0;
        mso-level-tab-stop:1.5in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l1:level4
        {mso-level-start-at:0;
        mso-level-tab-stop:2.0in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l1:level5
        {mso-level-start-at:0;
        mso-level-tab-stop:2.5in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l1:level6
        {mso-level-start-at:0;
        mso-level-tab-stop:3.0in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l1:level7
        {mso-level-start-at:0;
        mso-level-tab-stop:3.5in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l1:level8
        {mso-level-start-at:0;
        mso-level-tab-stop:4.0in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l1:level9
        {mso-level-start-at:0;
        mso-level-tab-stop:4.5in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l2
        {mso-list-id:1652362792;
        mso-list-template-ids:1754706650;}
@list l2:level2
        {mso-level-start-at:0;
        mso-level-tab-stop:1.0in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l2:level3
        {mso-level-start-at:0;
        mso-level-tab-stop:1.5in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l2:level4
        {mso-level-start-at:0;
        mso-level-tab-stop:2.0in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l2:level5
        {mso-level-start-at:0;
        mso-level-tab-stop:2.5in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l2:level6
        {mso-level-start-at:0;
        mso-level-tab-stop:3.0in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l2:level7
        {mso-level-start-at:0;
        mso-level-tab-stop:3.5in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l2:level8
        {mso-level-start-at:0;
        mso-level-tab-stop:4.0in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l2:level9
        {mso-level-start-at:0;
        mso-level-tab-stop:4.5in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l3
        {mso-list-id:1846482283;
        mso-list-type:hybrid;
        mso-list-template-ids:2072004384 -1022451520 67698713 67698715 67698703 67698713 67698715 67698703 67698713 67698715;}
@list l3:level1
        {mso-level-tab-stop:none;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l3:level2
        {mso-level-number-format:alpha-lower;
        mso-level-tab-stop:none;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l3:level3
        {mso-level-number-format:roman-lower;
        mso-level-tab-stop:none;
        mso-level-number-position:right;
        text-indent:-9.0pt;}
@list l3:level4
        {mso-level-tab-stop:none;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l3:level5
        {mso-level-number-format:alpha-lower;
        mso-level-tab-stop:none;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l3:level6
        {mso-level-number-format:roman-lower;
        mso-level-tab-stop:none;
        mso-level-number-position:right;
        text-indent:-9.0pt;}
@list l3:level7
        {mso-level-tab-stop:none;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l3:level8
        {mso-level-number-format:alpha-lower;
        mso-level-tab-stop:none;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l3:level9
        {mso-level-number-format:roman-lower;
        mso-level-tab-stop:none;
        mso-level-number-position:right;
        text-indent:-9.0pt;}
@list l4
        {mso-list-id:1890146848;
        mso-list-template-ids:1784317752;}
@list l4:level2
        {mso-level-start-at:0;
        mso-level-tab-stop:1.0in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l4:level3
        {mso-level-start-at:0;
        mso-level-tab-stop:1.5in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l4:level4
        {mso-level-start-at:0;
        mso-level-tab-stop:2.0in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l4:level5
        {mso-level-start-at:0;
        mso-level-tab-stop:2.5in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l4:level6
        {mso-level-start-at:0;
        mso-level-tab-stop:3.0in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l4:level7
        {mso-level-start-at:0;
        mso-level-tab-stop:3.5in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l4:level8
        {mso-level-start-at:0;
        mso-level-tab-stop:4.0in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l4:level9
        {mso-level-start-at:0;
        mso-level-tab-stop:4.5in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l5
        {mso-list-id:1961954524;
        mso-list-template-ids:388687440;}
@list l5:level1
        {mso-level-tab-stop:.5in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l5:level2
        {mso-level-start-at:0;
        mso-level-tab-stop:1.0in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l5:level3
        {mso-level-start-at:0;
        mso-level-tab-stop:1.5in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l5:level4
        {mso-level-start-at:0;
        mso-level-tab-stop:2.0in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l5:level5
        {mso-level-start-at:0;
        mso-level-tab-stop:2.5in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l5:level6
        {mso-level-start-at:0;
        mso-level-tab-stop:3.0in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l5:level7
        {mso-level-start-at:0;
        mso-level-tab-stop:3.5in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l5:level8
        {mso-level-start-at:0;
        mso-level-tab-stop:4.0in;
        mso-level-number-position:left;
        text-indent:-.25in;}
@list l5:level9
        {mso-level-start-at:0;
        mso-level-tab-stop:4.5in;
        mso-level-number-position:left;
        text-indent:-.25in;}
ol
        {margin-bottom:0in;}
ul
        {margin-bottom:0in;}
--></style><!--[if gte mso 9]><xml>
<o:shapedefaults v:ext="edit" spidmax="1026" />
</xml><![endif]--><!--[if gte mso 9]><xml>
<o:shapelayout v:ext="edit">
<o:idmap v:ext="edit" data="1" />
</o:shapelayout></xml><![endif]-->
</head>
<body lang="EN-US" link="blue" vlink="purple" style="word-wrap:break-word">
<div class="WordSection1">
<p class="MsoNormal">Hi Ali,<o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<ol style="margin-top:0in" start="1" type="1">
<li class="MsoListParagraph" style="margin-left:0in;mso-list:l3 level1 lfo3">It’s important to understand that there is plenty of neurophysiological evidence for abstractions at the single cell level in the brain. Thus, symbolic representation in the brain
 is not a fiction any more. We are past that argument.<o:p></o:p></li><li class="MsoListParagraph" style="margin-left:0in;mso-list:l3 level1 lfo3">You always start with simple systems before you do the complex ones. Having said that, we do teach our systems composition – composition of objects from parts in images. That is almost
 like teaching grammar or solving a puzzle. I don’t get into language models, but I think grammar and composition can be easily taught, like you teach a kid.<o:p></o:p></li><li class="MsoListParagraph" style="margin-left:0in;mso-list:l3 level1 lfo3">Once you know how to build these simple models and extract symbols, you can easily scale up and build hierarchical, multi-modal, compositional models. Thus, in the case of images,
 after having learnt that cats, dogs and similar animals have certain common features (eyes, legs, ears), a system can easily generalize to the concept of four-legged animals. We haven’t done it, but that could be the next level of learning.<o:p></o:p></li></ol>
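<p class="MsoNormal">As a toy sketch of the part-based generalization in point 3 (the animal names and part inventories below are hypothetical stand-ins, not our actual learned encodings):</p>

```python
# Toy sketch: once objects are represented by discrete part symbols,
# a higher-level concept is just a set operation over those symbols.
# The part inventories here are illustrative, not learned from data.

ANIMALS = {
    "cat":   {"eyes", "ears", "legs", "tail", "fur"},
    "dog":   {"eyes", "ears", "legs", "tail", "fur"},
    "horse": {"eyes", "ears", "legs", "tail", "fur", "mane"},
}

def induce_concept(examples):
    """Induce a concept as the set of parts shared by all training examples."""
    part_sets = [ANIMALS[name] for name in examples]
    return set.intersection(*part_sets)

def matches(concept, parts):
    """An object instance matches the concept if it has every required part."""
    return concept <= parts

# Learn a "four-legged animal"-like concept from cats and dogs only...
concept = induce_concept(["cat", "dog"])
# ...and it generalizes to a horse, which was not among the examples.
print(matches(concept, ANIMALS["horse"]))  # True
```

<p class="MsoNormal">The point is only that with symbols extracted, generalization becomes a simple symbolic operation rather than further network training.</p>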
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal">In general, once you extract symbols from these deep learning models, you are at the symbolic level and you have a pathway to more complex, hierarchical models and perhaps also to AGI.<o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal">Best,<o:p></o:p></p>
<p class="MsoNormal">Asim<o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal">Asim Roy<o:p></o:p></p>
<p class="MsoNormal">Professor, Information Systems<o:p></o:p></p>
<p class="MsoNormal">Arizona State University<o:p></o:p></p>
<p class="MsoNormal"><a href="https://urldefense.proofpoint.com/v2/url?u=https-3A__lifeboat.com_ex_bios.asim.roy&d=DwMFaQ&c=slrrB7dE8n7gBJbeO0g-IQ&r=wQR1NePCSj6dOGDD0r6B5Kn1fcNaTMg7tARe7TdEDqQ&m=waSKY67JF57IZXg30ysFB_R7OG9zoQwFwxyps6FbTa1Zh5mttxRot_t4N7mn68Pj&s=oDRJmXX22O8NcfqyLjyu4Ajmt8pcHWquTxYjeWahfuw&e=" target="_blank">Lifeboat
 Foundation Bios: Professor Asim Roy</a><o:p></o:p></p>
<p class="MsoNormal"><a href="https://urldefense.proofpoint.com/v2/url?u=https-3A__isearch.asu.edu_profile_9973&d=DwMFaQ&c=slrrB7dE8n7gBJbeO0g-IQ&r=wQR1NePCSj6dOGDD0r6B5Kn1fcNaTMg7tARe7TdEDqQ&m=waSKY67JF57IZXg30ysFB_R7OG9zoQwFwxyps6FbTa1Zh5mttxRot_t4N7mn68Pj&s=jCesWT7oGgX76_y7PFh4cCIQ-Ife-esGblJyrBiDlro&e=" target="_blank">Asim
 Roy | iSearch (asu.edu)</a><o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<div style="border:none;border-top:solid #E1E1E1 1.0pt;padding:3.0pt 0in 0in 0in">
<p class="MsoNormal"><b>From:</b> Connectionists <connectionists-bounces@mailman.srv.cs.cmu.edu>
<b>On Behalf Of </b>Ali Minai<br>
<b>Sent:</b> Monday, June 13, 2022 10:57 PM<br>
<b>To:</b> Connectionists List &lt;connectionists@cs.cmu.edu&gt;<br>
<b>Subject:</b> Re: Connectionists: The symbolist quagmire<o:p></o:p></p>
</div>
<p class="MsoNormal"><o:p> </o:p></p>
<div>
<div>
<p class="MsoNormal">Asim<o:p></o:p></p>
</div>
<div>
<p class="MsoNormal"><o:p> </o:p></p>
</div>
<div>
<p class="MsoNormal">This is really interesting work, but learning concept representations from sensory data is not enough. They must be hierarchical, multi-modal, compositional, and integrated with the motor system, the limbic system, etc., in a way that facilitates
 an infinity of useful behaviors. This is perhaps a good step in that direction, but only a small one. Its main immediate utility is in using deep learning networks in tasks that can be explained to users and customers. While very useful, that is not a central
 issue in AI, which focuses on intelligent behavior. All else is in service to that - explainable or not. However, I do think that the kind of hierarchical modularity implied in these representations is probably part of the brain's repertoire, and that is important.<o:p></o:p></p>
</div>
<div>
<p class="MsoNormal"><o:p> </o:p></p>
</div>
<div>
<p class="MsoNormal">Best<o:p></o:p></p>
</div>
<div>
<p class="MsoNormal">Ali<o:p></o:p></p>
</div>
<div>
<p class="MsoNormal"><o:p> </o:p></p>
</div>
<div>
<div>
<div>
<div>
<div>
<div>
<div>
<div>
<div>
<div>
<div>
<div>
<div>
<div>
<div>
<p class="MsoNormal"><b>Ali A. Minai, Ph.D.</b><br>
Professor and Graduate Program Director<br>
Complex Adaptive Systems Lab<br>
Department of Electrical Engineering & Computer Science<o:p></o:p></p>
</div>
<div>
<p class="MsoNormal">828 Rhodes Hall<o:p></o:p></p>
</div>
<div>
<p class="MsoNormal">University of Cincinnati<br>
Cincinnati, OH 45221-0030<o:p></o:p></p>
</div>
<div>
<p class="MsoNormal"><br>
Phone: (513) 556-4783<br>
Fax: (513) 556-7326<br>
Email: <a href="mailto:Ali.Minai@uc.edu" target="_blank">Ali.Minai@uc.edu</a><br>
          <a href="mailto:minaiaa@gmail.com" target="_blank">minaiaa@gmail.com</a><br>
<br>
WWW: <a href="https://urldefense.com/v3/__http:/www.ece.uc.edu/*7Eaminai/__;JQ!!IKRxdwAv5BmarQ!akY1pgZJRzcXt2oX5-mgNHeYElh5ZeIj69F33aXnl3bIHR-9LHpwfmP61TPYZRIMInwxaEHSrSV9ekY$" target="_blank">
https://eecs.ceas.uc.edu/~aminai/</a><o:p></o:p></p>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
<p class="MsoNormal"><o:p> </o:p></p>
</div>
</div>
<p class="MsoNormal"><o:p> </o:p></p>
<div>
<div>
<p class="MsoNormal">On Mon, Jun 13, 2022 at 7:48 PM Asim Roy <<a href="mailto:ASIM.ROY@asu.edu">ASIM.ROY@asu.edu</a>> wrote:<o:p></o:p></p>
</div>
<blockquote style="border:none;border-left:solid #CCCCCC 1.0pt;padding:0in 0in 0in 6.0pt;margin-left:4.8pt;margin-top:5.0pt;margin-right:0in;margin-bottom:5.0pt">
<div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto">There’s a lot of misconceptions about (1) whether the brain uses symbols or not, and (2) whether we need symbol processing in our systems or not.<o:p></o:p></p>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"> <o:p></o:p></p>
<ol start="1" type="1">
<li class="gmail-m-819792860580200459msolistparagraph" style="mso-list:l5 level1 lfo6">
Multisensory neurons are widely used in the brain. Leila Reddy and Simon Thorpe are not known as zealous advocates of the claim that symbols exist in the brain, but their characterization of concept cells (which are multisensory neurons) (<a href="https://urldefense.com/v3/__https:/www.sciencedirect.com/science/article/pii/S0896627314009027*__;Iw!!IKRxdwAv5BmarQ!akY1pgZJRzcXt2oX5-mgNHeYElh5ZeIj69F33aXnl3bIHR-9LHpwfmP61TPYZRIMInwxaEHS0uZ4RBM$" target="_blank">https://www.sciencedirect.com/science/article/pii/S0896627314009027#</a>!<span style="color:#3E3D40;background:white">)
 state that concept cells have “</span><strong><i><span style="font-family:"Calibri",sans-serif;color:#C00000;background:white">meaning</span></i></strong><i><span style="color:#C00000;background:white"> of a given stimulus in a manner that is <strong><span style="font-family:"Calibri",sans-serif">invariant</span></strong> to
 different representations of that stimulus</span></i><span style="color:#3E3D40;background:white">.” They associate concept cells with the properties of “</span><strong><span style="font-family:"Calibri",sans-serif;color:#C00000;background:white">Selectivity
 or specificity</span></strong><span style="color:#3E3D40;background:white">,” “</span><strong><span style="font-family:"Calibri",sans-serif;color:#C00000;background:white">complex concept</span></strong><span style="color:#3E3D40;background:white">,” “</span><strong><span style="font-family:"Calibri",sans-serif;color:#C00000;background:white">meaning</span></strong><span style="color:#3E3D40;background:white">,”
 “</span><strong><span style="font-family:"Calibri",sans-serif;color:#C00000;background:white">multimodal invariance</span></strong><span style="color:#3E3D40;background:white">” and “</span><strong><u><span style="font-family:"Calibri",sans-serif;color:#C00000;background:white">abstractness</span></u></strong><span style="color:#3E3D40;background:white">.”
 That pretty much says that concept cells represent symbols. And there are plenty of concept cells in the medial temporal lobe (MTL). The brain is a highly abstract system based on symbols. There is no fiction there.</span><o:p></o:p></li></ol>
<p class="gmail-m-819792860580200459msolistparagraph"> <o:p></o:p></p>
<ol start="1" type="1">
<li class="gmail-m-819792860580200459msolistparagraph" style="mso-list:l0 level1 lfo9">
There is ongoing work in the deep learning area that is trying to associate a single neuron or a group of neurons with a single concept. Bengio’s work is definitely in that direction:<o:p></o:p></li></ol>
<p class="gmail-m-819792860580200459msolistparagraph"> <o:p></o:p></p>
<p class="gmail-m-819792860580200459msolistparagraph">“<i>Finally, our recent work on learning high-level 'system-2'-like representations and their causal dependencies seeks to learn
<span style="color:#C00000">'interpretable' </span>entities (<span style="color:#C00000">with natural language</span>) that will emerge at the highest levels of representation (not clear how distributed or local these will be,
<span style="color:#C00000">but much more local than in a traditional MLP</span>). This is a different form of disentangling than adopted in much of the recent work on unsupervised representation learning but shares the idea that the
<span style="color:#C00000">"right" abstract concept </span>(<span style="color:#C00000">related to those we can name verbally</span>)
<span style="color:#C00000">will be "separated" </span>(<span style="color:#C00000">disentangled</span>) from each other (which suggests that neuroscientists will have an easier time spotting them in neural activity).”</i><br clear="all">
<o:p></o:p></p>
<p class="gmail-m-819792860580200459msolistparagraph">Hinton’s GLOM, which extends the idea of capsules to do part-whole hierarchies for scene analysis using the parse tree concept, is also about associating a concept with a set of neurons. While Bengio and
 Hinton are trying to construct these “concept cells” within the network (the CNN), we found that this can be done much more easily and in a straightforward way outside the network. We can easily decode a CNN to find the encodings for legs, ears and so on
 for cats and dogs and what not. What the DARPA Explainable AI program was looking for was a symbol-emitting model of the form shown below. And we can easily get to that symbolic model by decoding a CNN. A side benefit of such a symbolic model
 is protection against adversarial attacks. So a school bus will never turn into an ostrich with tweaks to a few pixels if you can verify the parts of objects. To be an ostrich, you need to have those long legs, the long neck and the small head. A school bus lacks
 those parts. The DARPA-conceptualized symbolic model provides that protection. <o:p>
</o:p></p>
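<p class="gmail-m-819792860580200459msolistparagraph">A minimal sketch of that part-verification idea (the class labels, part lists and detector outputs below are hypothetical stand-ins for what would actually be decoded from a CNN):</p>

```python
# Sketch: accept a classifier's label only if the label's expected parts
# were all verified in the input. In the scheme described above, the parts
# would come from decoding the CNN; here they are hard-coded stand-ins.

EXPECTED_PARTS = {
    "ostrich":    {"long legs", "long neck", "small head"},
    "school bus": {"wheels", "windows", "yellow body"},
}

def verified_label(predicted_label, detected_parts):
    """Return the label only if all of its characteristic parts were detected;
    otherwise flag the prediction as unverified (possible adversarial input)."""
    required = EXPECTED_PARTS[predicted_label]
    if required <= detected_parts:
        return predicted_label
    return "unverified: " + predicted_label

# An adversarially perturbed school-bus image: the classifier says "ostrich",
# but the part detector still sees bus parts, so the label is rejected.
detected = {"wheels", "windows", "yellow body"}
print(verified_label("ostrich", detected))     # unverified: ostrich
print(verified_label("school bus", detected))  # school bus
```

<p class="gmail-m-819792860580200459msolistparagraph">A few-pixel perturbation can flip the class score, but it cannot conjure up long legs and a long neck, which is why part verification blocks the attack.</p>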
<p class="gmail-m-819792860580200459msolistparagraph"> <o:p></o:p></p>
<p class="gmail-m-819792860580200459msolistparagraph">In general, there is convergence between connectionist and symbolic systems. We need to get past the old wars. It’s over.<o:p></o:p></p>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"> <o:p></o:p></p>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto">All the best,<o:p></o:p></p>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto">Asim Roy<o:p></o:p></p>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto">Professor, Information Systems<o:p></o:p></p>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto">Arizona State University<o:p></o:p></p>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"><a href="https://urldefense.proofpoint.com/v2/url?u=https-3A__lifeboat.com_ex_bios.asim.roy&d=DwMFaQ&c=slrrB7dE8n7gBJbeO0g-IQ&r=wQR1NePCSj6dOGDD0r6B5Kn1fcNaTMg7tARe7TdEDqQ&m=waSKY67JF57IZXg30ysFB_R7OG9zoQwFwxyps6FbTa1Zh5mttxRot_t4N7mn68Pj&s=oDRJmXX22O8NcfqyLjyu4Ajmt8pcHWquTxYjeWahfuw&e=" target="_blank">Lifeboat
 Foundation Bios: Professor Asim Roy</a><o:p></o:p></p>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"><a href="https://urldefense.proofpoint.com/v2/url?u=https-3A__isearch.asu.edu_profile_9973&d=DwMFaQ&c=slrrB7dE8n7gBJbeO0g-IQ&r=wQR1NePCSj6dOGDD0r6B5Kn1fcNaTMg7tARe7TdEDqQ&m=waSKY67JF57IZXg30ysFB_R7OG9zoQwFwxyps6FbTa1Zh5mttxRot_t4N7mn68Pj&s=jCesWT7oGgX76_y7PFh4cCIQ-Ife-esGblJyrBiDlro&e=" target="_blank">Asim
 Roy | iSearch (asu.edu)</a><o:p></o:p></p>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"> <o:p></o:p></p>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"><img border="0" width="813" height="485" style="width:8.4687in;height:5.052in" id="gmail-m_-819792860580200459Picture_x0020_4" src="cid:image001.png@01D87FF5.11EB2D90" alt="Timeline

Description automatically generated"><o:p></o:p></p>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"> <o:p></o:p></p>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"> <o:p></o:p></p>
<div>
<div style="border:none;border-top:solid windowtext 1.0pt;padding:3.0pt 0in 0in 0in;border-color:currentcolor currentcolor">
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"><b>From:</b> Connectionists <<a href="mailto:connectionists-bounces@mailman.srv.cs.cmu.edu" target="_blank">connectionists-bounces@mailman.srv.cs.cmu.edu</a>>
<b>On Behalf Of </b>Gary Marcus<br>
<b>Sent:</b> Monday, June 13, 2022 5:36 AM<br>
<b>To:</b> Ali Minai &lt;<a href="mailto:minaiaa@gmail.com" target="_blank">minaiaa@gmail.com</a>&gt;<br>
<b>Cc:</b> Connectionists List &lt;<a href="mailto:connectionists@cs.cmu.edu" target="_blank">connectionists@cs.cmu.edu</a>&gt;<br>
<b>Subject:</b> Connectionists: The symbolist quagmire<o:p></o:p></p>
</div>
</div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"> <o:p></o:p></p>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto">Cute phrase, but what does “symbolist quagmire” mean? Once upon  atime, Dave and Geoff were both pioneers in trying to getting symbols and neural nets to live in harmony. Don’t
 we still need do that, and if not, why not?<o:p></o:p></p>
</div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"> <o:p></o:p></p>
</div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto">Surely, at the very least<o:p></o:p></p>
</div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto">- we want our AI to be able to take advantage of the (large) fraction of world knowledge that is represented in symbolic form (language, including unstructured text, logic, math,
 programming, etc.)<o:p></o:p></p>
</div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto">- any model of the human mind ought be able to explain how humans can so effectively communicate via the symbols of language and how trained humans can deal with (to the extent
 that can) logic, math, programming, etc<o:p></o:p></p>
</div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"> <o:p></o:p></p>
</div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto">Folks like Bengio have joined me in seeing the need for “System II” processes. That’s a bit of a rough approximation, but I don’t see how we get to either AI or satisfactory models
 of the mind without confronting the “quagmire”.<o:p></o:p></p>
</div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"> <o:p></o:p></p>
</div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;margin-bottom:12.0pt"><o:p> </o:p></p>
<blockquote style="margin-top:5.0pt;margin-bottom:5.0pt">
<p class="MsoNormal" style="mso-margin-top-alt:auto;margin-bottom:12.0pt">On Jun 13, 2022, at 00:31, Ali Minai <<a href="mailto:minaiaa@gmail.com" target="_blank">minaiaa@gmail.com</a>> wrote:<o:p></o:p></p>
</blockquote>
</div>
<blockquote style="margin-top:5.0pt;margin-bottom:5.0pt">
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"><o:p></o:p></p>
<div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto">".... symbolic representations are a fiction our non-symbolic brains cooked up because the properties of symbol systems (systematicity, compositionality, etc.) are tremendously
 useful.  So our brains pretend to be rule-based symbolic systems when it suits them, because it's adaptive to do so."<o:p></o:p></p>
</div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"> <o:p></o:p></p>
</div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto">Spot on, Dave! We should not wade back into the symbolist quagmire, but do need to figure out how apparently symbolic processing can be done by neural systems. Models like those
 of Eliasmith and Smolensky provide some insight, but still seem far from both biological plausibility and real-world scale.<o:p></o:p></p>
</div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"> <o:p></o:p></p>
</div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto">Best<o:p></o:p></p>
</div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"> <o:p></o:p></p>
</div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto">Ali<o:p></o:p></p>
</div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"> <o:p></o:p></p>
</div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"> <o:p></o:p></p>
</div>
<div>
<div>
<div>
<div>
<div>
<div>
<div>
<div>
<div>
<div>
<div>
<div>
<div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"><b>Ali A. Minai, Ph.D.</b><br>
Professor and Graduate Program Director<br>
Complex Adaptive Systems Lab<br>
Department of Electrical Engineering & Computer Science<o:p></o:p></p>
</div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto">828 Rhodes Hall<o:p></o:p></p>
</div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto">University of Cincinnati<br>
Cincinnati, OH 45221-0030<o:p></o:p></p>
</div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"><br>
Phone: (513) 556-4783<br>
Fax: (513) 556-7326<br>
Email: <a href="mailto:Ali.Minai@uc.edu" target="_blank">Ali.Minai@uc.edu</a><br>
          <a href="mailto:minaiaa@gmail.com" target="_blank">minaiaa@gmail.com</a><br>
<br>
WWW: <a href="https://urldefense.com/v3/__http:/www.ece.uc.edu/*7Eaminai/__;JQ!!BhJSzQqDqA!UCEp_V8mv7wMFGacqyo0e5J8KbCnjHTDVRykqi1DQgMu87m5dBCpbcV6s4bv6xkTdlkwJmvlIXYkS9WrFA$" target="_blank">
https://eecs.ceas.uc.edu/~aminai/</a><o:p></o:p></p>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"> <o:p></o:p></p>
</div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"> <o:p></o:p></p>
<div>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto">On Mon, Jun 13, 2022 at 1:35 AM Dave Touretzky <<a href="mailto:dst@cs.cmu.edu" target="_blank">dst@cs.cmu.edu</a>> wrote:<o:p></o:p></p>
</div>
<blockquote style="border:none;border-left:solid windowtext 1.0pt;padding:0in 0in 0in 6.0pt;margin-left:4.8pt;margin-top:5.0pt;margin-right:0in;margin-bottom:5.0pt;border-color:currentcolor currentcolor currentcolor rgb(204,204,204)">
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto">This timing of this discussion dovetails nicely with the news story<br>
about Google engineer Blake Lemoine being put on administrative leave<br>
for insisting that Google's LaMDA chatbot was sentient and reportedly<br>
trying to hire a lawyer to protect its rights.  The Washington Post<br>
story is reproduced here:<br>
<br>
  <a href="https://urldefense.com/v3/__https:/www.msn.com/en-us/news/technology/the-google-engineer-who-thinks-the-company-s-ai-has-come-to-life/ar-AAYliU1__;!!BhJSzQqDqA!UCEp_V8mv7wMFGacqyo0e5J8KbCnjHTDVRykqi1DQgMu87m5dBCpbcV6s4bv6xkTdlkwJmvlIXapZaIeUg$" target="_blank">
https://www.msn.com/en-us/news/technology/the-google-engineer-who-thinks-the-company-s-ai-has-come-to-life/ar-AAYliU1</a><br>
<br>
Google vice president Blaise Aguera y Arcas, who dismissed Lemoine's<br>
claims, is featured in a recent Economist article showing off LaMDA's<br>
capabilities and making noises about getting closer to "consciousness":<br>
<br>
  <a href="https://urldefense.com/v3/__https:/www.economist.com/by-invitation/2022/06/09/artificial-neural-networks-are-making-strides-towards-consciousness-according-to-blaise-aguera-y-arcas__;!!BhJSzQqDqA!UCEp_V8mv7wMFGacqyo0e5J8KbCnjHTDVRykqi1DQgMu87m5dBCpbcV6s4bv6xkTdlkwJmvlIXbgg32qHQ$" target="_blank">
https://www.economist.com/by-invitation/2022/06/09/artificial-neural-networks-are-making-strides-towards-consciousness-according-to-blaise-aguera-y-arcas</a><br>
<br>
My personal take on the current symbolist controversy is that symbolic<br>
representations are a fiction our non-symbolic brains cooked up because<br>
the properties of symbol systems (systematicity, compositionality, etc.)<br>
are tremendously useful.  So our brains pretend to be rule-based symbolic<br>
systems when it suits them, because it's adaptive to do so.  (And when<br>
it doesn't suit them, they draw on "intuition" or "imagery" or some<br>
other mechanisms we can't verbalize because they're not symbolic.)  They<br>
are remarkably good at this pretense.<br>
<br>
The current crop of deep neural networks are not as good at pretending<br>
to be symbolic reasoners, but they're making progress.  In the last 30<br>
years we've gone from networks of fully-connected layers that make no<br>
architectural assumptions ("connectoplasm") to complex architectures<br>
like LSTMs and transformers that are designed for approximating symbolic<br>
behavior.  But the brain still has a lot of symbol simulation tricks we<br>
haven't discovered yet.<br>
<br>
Slashdot reader ZiggyZiggyZig had an interesting argument against LaMDA<br>
being conscious.  If it just waits for its next input and responds when<br>
it receives it, then it has no autonomous existence: "it doesn't have an<br>
inner monologue that constantly runs and comments everything happening<br>
around it as well as its own thoughts, like we do."<br>
<br>
What would happen if we built that in?  Maybe LaMDA would rapidly<br>
descend into gibberish, like some other text generation models do when<br>
allowed to ramble on for too long.  But as Steve Hanson points out,<br>
these are still the early days.<br>
<br>
-- Dave Touretzky<o:p></o:p></p>
</blockquote>
</div>
</div>
</blockquote>
</div>
</div>
</blockquote>
</div>
</div>
</body>
</html>