Connectionists: 2024 Nobel Prize in Physics goes to Hopfield and Hinton

David B bisant at umbc.edu
Thu Oct 10 16:12:08 EDT 2024


Colleagues,
I have had the benefit of learning from many of the pioneers in the field,
including LeCun, Werbos, Widrow, and others, and also of spending two
years working with David Rumelhart's research group.

David worked in a number of areas, including the modeling of cognition and
engineering applications, and he was even one of the first to map brain
activity using magnetic resonance. Besides his contributions to back
propagation, he made another primary contribution: the development of
forward models, in collaboration with Jordan. Though many people took
David's developments and extended them, or even applied them commercially,
he was always supportive and happy that they built upon his work. He did
this without asking for anything, and he never felt that his contributions
were diminished by others extending his work.
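
Since "forward models" may be unfamiliar to some readers, here is a
minimal sketch of the idea, not David and Jordan's actual formulation:
fit a differentiable model of an unknown plant, then pick an action by
passing the distal error back through that learned model. The toy 1-D
plant and all names below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(1)

# Unknown "plant": the learner only sees (action, outcome) samples.
plant = np.sin

# 1) Fit a differentiable forward model f(a) ~ plant(a) on sampled data,
#    here with a tiny polynomial basis (a, a^3) and least squares.
actions = rng.uniform(-1.5, 1.5, 200)
basis = np.stack([actions, actions**3], axis=1)
w, *_ = np.linalg.lstsq(basis, plant(actions), rcond=None)

def forward_model(a):
    return w[0] * a + w[1] * a**3

def d_forward_model(a):            # analytic derivative of the model
    return w[0] + 3.0 * w[1] * a**2

# 2) Choose an action by gradient descent on the distal error,
#    backpropagated through the learned model (not the plant itself).
target, a = 0.6, 0.0
for _ in range(200):
    err = forward_model(a) - target
    a -= 0.5 * err * d_forward_model(a)

print(f"action {a:.3f} -> outcome {plant(a):.3f} (target {target})")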

I think Dave would have been happy to see these researchers build upon his
work in achieving their award.
David Bisant, PhD

On Tue, Oct 8, 2024 at 3:53 PM Stephen José Hanson <jose at rubic.rutgers.edu>
wrote:

> Hi Steve,
>
> The problem every writer encounters is what can be counted as settled
> knowledge rather than new/novel knowledge. In the law this is of course
> “legal precedent”: does the reference refer to a recent precedent, or
> does one from the 17th century hold precedence? In the present case, I
> agree that calculating gradients of functions using the chain rule was
> invented (Legendre -- least squares) far before Rumelhart and Hinton
> applied it to error gradients in acyclic/cyclic networks, and of course
> there were others, as you say, in the 20th century who also applied
> error gradients to networks (Parker, LeCun, et al.). Schmidhuber says
> all that matters is the “math,” not the applied context. However, I
> seriously doubt that Legendre could have imagined that the successive
> application of error gradients through an acyclic network would produce
> hierarchical kinship relations (distinguishing between the mothers,
> fathers, sons, aunts, grandparents, etc. of an Italian and an English
> family) in the hidden units of a network, simply from observing
> individuals with fixed feature relations. I think any reasonable person
> would maintain that this application is completely novel and could not
> be predicted, in or out of context, from the “math,” and certainly not
> from the 18th century. Hidden units were new in this context, and their
> representational nature was novel in this context. Scope of reference is
> also based on logical or causal proximity to the reference. In this
> case, referencing Darwin or Newton in all biological or physics papers
> should be based on the outcome of the metaphorical test of whether the
> recent results tie back to the original source in some direct line; for
> example, was Oswald’s grandfather responsible for the death of President
> John F. Kennedy? Failing this test suggests that the older reference may
> not have scope. But of course this can be subjective.
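>
> As a minimal sketch of what is meant here (not Rumelhart and Hinton’s
> actual 1986 program; the one-hot “kinship-like” data below are purely
> illustrative), error gradients are pushed back through a small acyclic
> network by successive application of the chain rule:
>
> import numpy as np
>
> rng = np.random.default_rng(0)
> X = np.eye(8)                      # 8 one-hot "person" inputs (toy)
> Y = np.eye(8)[rng.permutation(8)]  # arbitrary "relative" targets (toy)
>
> W1 = rng.normal(0.0, 0.5, (8, 3))  # 3 hidden units as a bottleneck
> W2 = rng.normal(0.0, 0.5, (3, 8))
>
> def sigmoid(z):
>     return 1.0 / (1.0 + np.exp(-z))
>
> for _ in range(10000):
>     H = sigmoid(X @ W1)            # forward pass through the acyclic net
>     P = sigmoid(H @ W2)
>     dP = (P - Y) * P * (1 - P)     # chain rule at the output layer
>     dH = (dP @ W2.T) * H * (1 - H) # ...applied again at the hidden layer
>     W2 -= 0.5 * H.T @ dP           # gradient descent on the squared error
>     W1 -= 0.5 * X.T @ dH
>
> print("hidden codes:\n", np.round(sigmoid(X @ W1), 2))
>
> With the three-unit bottleneck, the hidden codes that emerge are
> distributed re-encodings of the one-hot inputs, which is the sense in
> which the representational nature of the hidden units was novel.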
>
> Steve
>
> On 10/8/24 2:38 PM, Grossberg, Stephen wrote:
>
> Actually, Paul Werbos developed back propagation into its modern form,
> and worked out computational examples, for his 1974 Harvard PhD thesis.
>
> Then David Parker rediscovered it in 1982, etc.
>
> Schmidhuber provides an excellent and wide-ranging history of many
> contributors to Deep Learning and its antecedents:
>
> https://www.sciencedirect.com/science/article/pii/S0893608014002135?casa_token=k47YCzFwcFEAAAAA:me_ZGF5brDqjRihq5kHyeQBzyUMYBypJ3neSinZ-cPn1pnyi69DGyM9eKSyLsdiRf759I77c7w
>
> This article has been cited over 23,000 times.
>
> From: Connectionists <connectionists-bounces at mailman.srv.cs.cmu.edu>
> on behalf of Stephen José Hanson <jose at rubic.rutgers.edu>
> Date: Tuesday, October 8, 2024 at 2:25 PM
> To: Jonathan D. Cohen <jdc at princeton.edu>, Connectionists
> <connectionists at cs.cmu.edu>
> Subject: Re: Connectionists: 2024 Nobel Prize in Physics goes to
> Hopfield and Hinton
>
> Yes, Jon, good point here, although there is also a through line from
> Hopfield to Hinton and Sejnowski, i.e., Boltzmann machines, and on to DL
> and LLMs.
>
> Dave of course invented BP; Geoff would always say his contribution was
> to try to talk Dave out of it, as it had so many computational problems
> and could in no way be considered biologically plausible.
>
> Steve
>
> On 10/8/24 8:47 AM, Jonathan D. Cohen wrote:
>
> I’d like to add, in this context, a note in memoriam of David Rumelhart, who was an integral contributor to the work honored by today’s Nobel Prize.
>
> jdc
>
> --
> Stephen José Hanson
> Professor, Psychology Department
> Director, RUBIC (Rutgers University Brain Imaging Center)
> Member, Executive Committee, RUCCS