<html xmlns:o="urn:schemas-microsoft-com:office:office" xmlns:w="urn:schemas-microsoft-com:office:word" xmlns:m="http://schemas.microsoft.com/office/2004/12/omml" xmlns="http://www.w3.org/TR/REC-html40">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<meta name="Generator" content="Microsoft Word 15 (filtered medium)">
<style><!--
/* Font Definitions */
@font-face
{font-family:"Cambria Math";
panose-1:2 4 5 3 5 4 6 3 2 4;}
@font-face
{font-family:Calibri;
panose-1:2 15 5 2 2 2 4 3 2 4;}
@font-face
{font-family:"Times New Roman \(Body CS\)";
panose-1:2 11 6 4 2 2 2 2 2 4;}
@font-face
{font-family:Consolas;
panose-1:2 11 6 9 2 2 4 3 2 4;}
/* Style Definitions */
p.MsoNormal, li.MsoNormal, div.MsoNormal
{margin:0in;
font-size:10.0pt;
font-family:"Calibri",sans-serif;}
a:link, span.MsoHyperlink
{mso-style-priority:99;
color:blue;
text-decoration:underline;}
pre
{mso-style-priority:99;
mso-style-link:"HTML Preformatted Char";
margin:0in;
font-size:10.0pt;
font-family:"Courier New";}
span.HTMLPreformattedChar
{mso-style-name:"HTML Preformatted Char";
mso-style-priority:99;
mso-style-link:"HTML Preformatted";
font-family:Consolas;}
span.gmailsignatureprefix
{mso-style-name:gmail_signature_prefix;}
.MsoChpDefault
{mso-style-type:export-only;
font-size:10.0pt;
mso-ligatures:none;}
@page WordSection1
{size:8.5in 11.0in;
margin:1.0in 1.0in 1.0in 1.0in;}
div.WordSection1
{page:WordSection1;}
--></style>
</head>
<body lang="EN-US" link="blue" vlink="purple" style="word-wrap:break-word">
<div class="WordSection1">
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif">Dear Steve,<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif">Thanks for your detailed reply.<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif">I believe that what I wrote about the 1974 work of Paul Werbos, which also developed computational examples, gives him modern priority for back propagation.<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif">Mentioning lots of other applications does not diminish Paul’s priority.<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif">What Paul did not do well was to market his work.
<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif">But good marketing does not imply priority, just as Voltaire’s “marketing” of the discoveries of Newton on the Continent did not diminish Newton’s priority. Just Google to find
the following quote:<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif">“Yes, Voltaire played a significant role in spreading Isaac Newton's scientific ideas across the European continent, particularly through his writings like "Letters from England"
which highlighted Newton's work and helped popularize Newtonian physics in France and beyond.”<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif">+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">I also am concerned about the Hopfield award because, again, the Nobel Prize committee does not seem to know the scientific history behind that.<br>
<br>
<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">I published several articles in 1967 – 1972 in the
<i>Proceedings of the National Academy of Sciences</i> that introduced the Additive Model that Hopfield used in 1984. This work proved global theorems about the limits and oscillations of my Generalized Additive Models. See sites.bu.edu/steveg for downloadable
versions of these articles. For example:<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">Grossberg, S. (1971). Pavlovian pattern learning by nonlinear neural networks. <i>Proceedings of the National Academy of Sciences</i>, 68, 828-831. <o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><a href="https://sites.bu.edu/steveg/files/2016/06/Gro1971ProNatAcaSci.pdf">https://sites.bu.edu/steveg/files/2016/06/Gro1971ProNatAcaSci.pdf</a><o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">You will note from the title of this article that the model arose from my discovery and development of biological neural networks to provide principled mechanistic
explanations of increasingly large psychological and neurobiological data bases. One of my early loves were animal conditioning and associative learning, to whose clarification the above article, one in a series, contributed.<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">Building on these discoveries, Michael Cohen and I published a Liapunov function that included the Additive Model, Shunting Model, and major generalizations thereof
in 1982 and 1983 before the 1984 Hopfield article appeared. For example,<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">Cohen, M.A. and Grossberg, S. (1983). Absolute stability of global pattern formation and parallel memory storage by competitive neural networks. <i>IEEE Transactions
on Systems, Man, and Cybernetics</i>, <b>SMC-13</b>, 815-826. <o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><a href="https://sites.bu.edu/steveg/files/2016/06/CohGro1983IEEE.pdf">https://sites.bu.edu/steveg/files/2016/06/CohGro1983IEEE.pdf</a><o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">Significantly, we proved mathematically that our generalized Liapunov function worked in all cases.
<o:p></o:p></span></p>
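<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">For concreteness, here is the shape of that result, sketched from memory rather than quoted from the 1983 article (the functions a<sub>i</sub>, b<sub>i</sub>, d<sub>j</sub> and the symmetric coefficients c<sub>ij</sub> are the usual notations):<o:p></o:p></span></p>
<pre>% Cohen-Grossberg system: amplification a_i >= 0, self-signal b_i,
% monotone nondecreasing output signals d_j, symmetric weights c_ij
\frac{dx_i}{dt} = a_i(x_i)\Big[\, b_i(x_i) - \sum_{j=1}^{n} c_{ij}\, d_j(x_j) \Big], \qquad c_{ij} = c_{ji}

% Liapunov function
V(x) = -\sum_{i=1}^{n} \int_{0}^{x_i} b_i(\xi)\, d_i'(\xi)\, d\xi
       + \tfrac{1}{2} \sum_{j,k=1}^{n} c_{jk}\, d_j(x_j)\, d_k(x_k)

% Along trajectories, since a_i >= 0 and d_i' >= 0,
\frac{dV}{dt} = -\sum_{i=1}^{n} a_i(x_i)\, d_i'(x_i)
                \Big[\, b_i(x_i) - \sum_{j=1}^{n} c_{ij}\, d_j(x_j) \Big]^2 \le 0

% so V decreases along every trajectory. The Hopfield (1984) Additive
% Model is the special case with a_i constant and b_i affine.</pre>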
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">This 1983 article was part of a series of earlier articles in which I proved theorems about the limiting and oscillatory behavior of increasingly general neural
networks that arose in multiple kinds of applications.<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">One does not pull such a result out of a hat.<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">In this regard, I was told that Hopfield knew about my work before he published his 1984 article.<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">I was also alerted before his publication by colleagues of mine that Hopfield had just presented a public lecture about my model, declaring it a major breakthrough.<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">At that time in my life, I was trying to manage a storm of creative ideas that thrust me into multiple topics of neural networks. One example, was my 1978 Theory
of Human Memory that laid out a major research program that took several decades to fully develop with multiple collaborators.<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">Please keep in mind that I started my research in neural networks in 1957 as a Freshman at Dartmouth College when I was taking Introductory Psychology.<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">That year, I introduced the paradigm of using nonlinear systems of neural networks, as well as the short-term memory (STM), medium-term memory (MTM), and long-term
memory (LTM) laws that are used in one specialization or another to this day, including in the Additive Model, to computationally explain data about how our brains make our minds. See the historical summary in
<a href="https://en.wikipedia.org/wiki/Stephen_Grossberg">https://en.wikipedia.org/wiki/Stephen_Grossberg</a> .<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">I was not thinking about priority. I was trying to develop the latest in a steady stream of new ideas.<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">When I started in 1957, I knew no one else who was doing neural networks. That is why various of my colleagues call me the Father of AI.<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">Because I was intellectually lonely, and wanted others to enjoy and be able to contribute to what I viewed as a paradigm-shifting breakthrough, I felt strongly
the need to help create a neural networks community. <o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">That challenge thrust me into founding infrastructure for the field of neural networks, including a research center, academic department, the International Neural
Network Society, the journal <i>Neural Networks</i>, multiple international conferences on neural networks, and Boston-area research centers, while I began training over 100 gifted PhD students, postdocs, and faculty to do neural network research. See the
Wikipedia page for a summary.<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">I did not feel that I had the strength to fight priority battles on top of all my educational, administrative, and leadership responsibilities.<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">As some of you know, I was recently able to provide a self-contained and non-technical overview and synthesis of some of my scientific discoveries since 1957, as
well as explanations of the work of hundreds of other scientists, in my 2021 Magnum Opus<o:p></o:p></span></p>
<p class="MsoNormal"><b><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></b></p>
<p class="MsoNormal"><b><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">Conscious Mind, Resonant Brain: How Each Brain Makes a Mind<o:p></o:p></span></b></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><a href="https://www.amazon.com/Conscious-Mind-Resonant-Brain-Makes/dp/0190070552">https://www.amazon.com/Conscious-Mind-Resonant-Brain-Makes/dp/0190070552</a><o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">I was grateful to learn that the book won the 2022 PROSE book award in Neuroscience of the Association of American Publishers.<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">Best to all,<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121"><o:p> </o:p></span></p>
<p class="MsoNormal"><span style="font-size:16.0pt;font-family:"Arial",sans-serif;color:#212121">Steve<o:p></o:p></span></p>
<div id="mail-editor-reference-message-container">
<div>
<div style="border:none;border-top:solid #B5C4DF 1.0pt;padding:3.0pt 0in 0in 0in">
<p class="MsoNormal" style="margin-bottom:12.0pt"><b><span style="font-size:12.0pt;color:black">From:
</span></b><span style="font-size:12.0pt;color:black">Brad Wyble <bwyble@gmail.com><br>
<b>Date: </b>Tuesday, October 8, 2024 at 7:56 PM<br>
<b>To: </b>Stephen José Hanson <jose@rubic.rutgers.edu><br>
<b>Cc: </b>Grossberg, Stephen <steve@bu.edu>, Jonathan D. Cohen <jdc@princeton.edu>, Connectionists <connectionists@cs.cmu.edu><br>
<b>Subject: </b>Re: Connectionists: 2024 Nobel Prize in Physics goes to Hopfield and Hinton<o:p></o:p></span></p>
</div>
<div>
<p class="MsoNormal"><span style="font-size:11.0pt">We really can trace the current AI boom back to John Carmack who wrote Doom, which ushered in the era of GPU-hungry computing. Credit where it's due please. <o:p></o:p></span></p>
</div>
<p class="MsoNormal"><span style="font-size:11.0pt"><o:p> </o:p></span></p>
<div>
<div>
<p class="MsoNormal"><span style="font-size:11.0pt">On Tue, Oct 8, 2024 at 4:10 PM Stephen José Hanson <</span><a href="mailto:jose@rubic.rutgers.edu"><span style="font-size:11.0pt">jose@rubic.rutgers.edu</span></a><span style="font-size:11.0pt">> wrote:<o:p></o:p></span></p>
</div>
<blockquote style="border:none;border-left:solid #CCCCCC 1.0pt;padding:0in 0in 0in 6.0pt;margin-left:4.8pt;margin-top:5.0pt;margin-right:0in;margin-bottom:5.0pt">
<div>
<p>Hi Steve,</p>
<p style="margin-bottom:0in">The problem every writer encounters is what can be concluded as resolved knowledge rather then new/novel knowledge. In the law this is of course “legal precedence”, so does the reference refer to a recent precedent, or does the
one for the 17<sup>th</sup> century hold precedence? In the present case, I agree that calculating gradients of functions using the chain rule was invented (Legendre -- Least squares) far before Rumelhart and Hinton applied it to error gradients in acyclic/cyclic
networks, and of course there were others as you say, in the 20<sup>th</sup> century that also applied error gradient to networks (Parker, Le cun et al). Schmidhuber says all that matters is the “math” not the applied context. However, I seriously doubt that
Legendre could have imagined using gradients of function error through succesive application in a acylic network would have produced a hierarchical kinship relationship (distinguishing between an italian and english family mother, fathers, sons, aunts, grandparents
etc.) in the hidden units of a network, simply by observing individuals with fixed feature relations. I think any reasonable person would maintain that this application is completely novel and could not be predicted in or out of context from the “math” and
certainly not from the 18<sup>th</sup> century. Hidden units were new in this context and their representational nature was novel, in this context. Scope of reference is also based on logical or causal proximity to the reference. In this case, referencing
Darwin or Newton in all biological or physics papers should be based on the outcome of the metaphorical test of whether the recent results tie back to original source in some direct line, for example, was Oswald’s grandfather responsible for the death of President
John F. Kennedy? Failing this test, suggests that the older reference may not have scope. But of course this can be subjective.</p>
<p style="margin-bottom:0in">Steve</p>
<div>
<p class="MsoNormal"><span style="font-size:11.0pt">On 10/8/24 2:38 PM, Grossberg, Stephen wrote:<o:p></o:p></span></p>
</div>
<blockquote style="margin-top:5.0pt;margin-bottom:5.0pt">
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"><span style="font-size:16.0pt;font-family:"Arial",sans-serif">Actually, Paul Werbos developed back propagation into its modern form, and worked out computational examples, for his
1974 Harvard PhD thesis.</span><span style="font-size:11.0pt"><o:p></o:p></span></p>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"><span style="font-size:16.0pt;font-family:"Arial",sans-serif"> </span><span style="font-size:11.0pt"><o:p></o:p></span></p>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"><span style="font-size:16.0pt;font-family:"Arial",sans-serif">Then David Parker rediscovered it in 1982, etc.</span><span style="font-size:11.0pt"><o:p></o:p></span></p>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"><span style="font-size:16.0pt;font-family:"Arial",sans-serif"> </span><span style="font-size:11.0pt"><o:p></o:p></span></p>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"><span style="font-size:16.0pt;font-family:"Arial",sans-serif">Schmidhuber provides an excellent and wide-ranging history of many contributors to Deep Learning and its antecedents:</span><span style="font-size:11.0pt"><o:p></o:p></span></p>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"><span style="font-size:16.0pt;font-family:"Arial",sans-serif"> </span><span style="font-size:11.0pt"><o:p></o:p></span></p>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"><a href="https://www.sciencedirect.com/science/article/pii/S0893608014002135?casa_token=k47YCzFwcFEAAAAA:me_ZGF5brDqjRihq5kHyeQBzyUMYBypJ3neSinZ-cPn1pnyi69DGyM9eKSyLsdiRf759I77c7w" target="_blank"><span style="font-size:16.0pt;font-family:"Arial",sans-serif">https://www.sciencedirect.com/science/article/pii/S0893608014002135?casa_token=k47YCzFwcFEAAAAA:me_ZGF5brDqjRihq5kHyeQBzyUMYBypJ3neSinZ-cPn1pnyi69DGyM9eKSyLsdiRf759I77c7w</span></a><span style="font-size:11.0pt"><o:p></o:p></span></p>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"><span style="font-size:16.0pt;font-family:"Arial",sans-serif"> </span><span style="font-size:11.0pt"><o:p></o:p></span></p>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"><span style="font-size:16.0pt;font-family:"Arial",sans-serif">This article has been cited over 23,000 times.</span><span style="font-size:11.0pt"><o:p></o:p></span></p>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"><span style="font-size:16.0pt;font-family:"Arial",sans-serif"> </span><span style="font-size:11.0pt"><o:p></o:p></span></p>
<div id="m_-23520673755399983mail-editor-reference-message-container">
<div>
<div style="border:none;border-top:solid #B5C4DF 1.0pt;padding:3.0pt 0in 0in 0in">
<p class="MsoNormal" style="mso-margin-top-alt:auto;margin-bottom:12.0pt"><b><span style="font-size:12.0pt;color:black">From:
</span></b><span style="font-size:12.0pt;color:black">Connectionists </span><a href="mailto:connectionists-bounces@mailman.srv.cs.cmu.edu" target="_blank"><span style="font-size:12.0pt"><connectionists-bounces@mailman.srv.cs.cmu.edu></span></a><span style="font-size:12.0pt;color:black">
on behalf of Stephen José Hanson </span><a href="mailto:jose@rubic.rutgers.edu" target="_blank"><span style="font-size:12.0pt"><jose@rubic.rutgers.edu></span></a><span style="font-size:12.0pt;color:black"><br>
<b>Date: </b>Tuesday, October 8, 2024 at 2:25 PM<br>
<b>To: </b>Jonathan D. Cohen </span><a href="mailto:jdc@princeton.edu" target="_blank"><span style="font-size:12.0pt"><jdc@princeton.edu></span></a><span style="font-size:12.0pt;color:black">, Connectionists
</span><a href="mailto:connectionists@cs.cmu.edu" target="_blank"><span style="font-size:12.0pt"><connectionists@cs.cmu.edu></span></a><span style="font-size:12.0pt;color:black"><br>
<b>Subject: </b>Re: Connectionists: 2024 Nobel Prize in Physics goes to Hopfield and Hinton</span><span style="font-size:11.0pt"><o:p></o:p></span></p>
</div>
<p>Yes, Jon, good point here, although there is a through line from Hopfield to Hinton and Sejnowski, i.e., Boltzmann machines, and on to DL and LLMs.</p>
<p>Dave of course invented BP. Geoff would always say his contribution was to try to talk Dave out of it, as it had so many computational problems and could in no way be considered biologically plausible.</p>
<p>Steve</p>
<div>
<p class="MsoNormal" style="mso-margin-top-alt:auto;mso-margin-bottom-alt:auto"><span style="font-size:11.0pt">On 10/8/24 8:47 AM, Jonathan D. Cohen wrote:<o:p></o:p></span></p>
</div>
<blockquote style="margin-top:5.0pt;margin-bottom:5.0pt">
<pre>I’d like to add, in this context, a note in memoriam of David Rumelhart, who was an integral contributor to the work honored by today’s Nobel Prize.</pre>
<pre> </pre>
<pre>jdc</pre>
<pre> </pre>
</blockquote>
<pre>-- </pre>
<pre>Stephen José Hanson</pre>
<pre>Professor, Psychology Department</pre>
<pre>Director, RUBIC (Rutgers University Brain Imaging Center)</pre>
<pre>Member, Executive Committee, RUCCS</pre>
</div>
</div>
</div>
</blockquote>
<pre>-- </pre>
<pre>Stephen José Hanson</pre>
<pre>Professor, Psychology Department</pre>
<pre>Director, RUBIC (Rutgers University Brain Imaging Center)</pre>
<pre>Member, Executive Committee, RUCCS</pre>
</div>
</blockquote>
</div>
<p class="MsoNormal"><span style="font-size:11.0pt"><br clear="all">
<o:p></o:p></span></p>
<div>
<p class="MsoNormal"><span style="font-size:11.0pt"><o:p> </o:p></span></p>
</div>
<p class="MsoNormal"><span class="gmailsignatureprefix"><span style="font-size:11.0pt">--
</span></span><span style="font-size:11.0pt"><o:p></o:p></span></p>
<div>
<div>
<div>
<div>
<div>
<p class="MsoNormal"><span style="font-size:11.0pt">Brad Wyble (he/him)<o:p></o:p></span></p>
<div>
<p class="MsoNormal"><span style="font-size:11.0pt"><o:p> </o:p></span></p>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</body>
</html>