Connectionists: Annotated History of Modern AI and Deep Learning
Schmidhuber Juergen
juergen at idsia.ch
Thu Jan 26 08:29:29 EST 2023
And in 1967-68, the same Shun-Ichi Amari trained multilayer perceptrons (MLPs) with many layers by stochastic gradient descent (SGD) in end-to-end fashion. See Sec. 7 of the survey: https://people.idsia.ch/~juergen/deep-learning-history.html#2nddl
Amari's implementation [GD2,GD2a] (with his student Saito) learned internal representations in a five-layer MLP with two modifiable layers, which was trained to classify non-linearly separable pattern classes.
Back then compute was billions of times more expensive than today.
To my knowledge, this was the first implementation of learning internal representations through SGD-based deep learning.
If anyone knows of an earlier one, please let me know :)
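The setup described above can be illustrated with a minimal modern sketch: a small MLP with two modifiable weight layers, trained by stochastic gradient descent to classify XOR, a classic non-linearly separable problem. The architecture, learning rate, and step count here are my own illustrative assumptions, not a reconstruction of Amari and Saito's original program.

```python
# Hedged toy sketch: an MLP with two modifiable layers trained by SGD on XOR.
# All hyperparameters are illustrative assumptions.
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# XOR: no single linear boundary separates the two classes.
data = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0),
        ([1.0, 0.0], 1.0), ([1.0, 1.0], 0.0)]

n_in, n_hid = 2, 4
# Weight rows; the last entry of each row is a bias weight.
W1 = [[random.uniform(-1, 1) for _ in range(n_in + 1)] for _ in range(n_hid)]
W2 = [random.uniform(-1, 1) for _ in range(n_hid + 1)]
lr = 0.5

def forward(x):
    h = [sigmoid(sum(w * v for w, v in zip(row, x + [1.0]))) for row in W1]
    y = sigmoid(sum(w * v for w, v in zip(W2, h + [1.0])))
    return h, y

def mse():
    return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

mse_before = mse()
for step in range(30000):
    x, t = random.choice(data)          # stochastic: one pattern per update
    h, y = forward(x)
    # Propagate the squared-error gradient through both sigmoid layers.
    dy = (y - t) * y * (1.0 - y)
    for j in range(n_hid):
        dh = dy * W2[j] * h[j] * (1.0 - h[j])
        for i in range(n_in):
            W1[j][i] -= lr * dh * x[i]
        W1[j][n_in] -= lr * dh          # hidden-layer bias
    for j in range(n_hid):
        W2[j] -= lr * dy * h[j]
    W2[n_hid] -= lr * dy                # output bias
mse_after = mse()
print(mse_before, mse_after)
```

With enough update steps the mean squared error falls well below its initial value; since plain SGD on XOR can stall in a poor local minimum, a different seed or restart may occasionally be needed.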
Jürgen
> On 25. Jan 2023, at 16:44, Schmidhuber Juergen <juergen at idsia.ch> wrote:
>
> Some are not aware of this historic tidbit in Sec. 4 of the survey: half a century ago, Shun-Ichi Amari published a learning recurrent neural network (1972) which was later called the Hopfield network.
>
> https://people.idsia.ch/~juergen/deep-learning-history.html#rnn
>
> Jürgen
>
>
>
>
>> On 13. Jan 2023, at 11:13, Schmidhuber Juergen <juergen at idsia.ch> wrote:
>>
>> Machine learning is the science of credit assignment. My new survey credits the pioneers of deep learning and modern AI (supplementing my award-winning 2015 survey):
>>
>> https://arxiv.org/abs/2212.11279
>>
>> https://people.idsia.ch/~juergen/deep-learning-history.html
>>
>> This was already reviewed by several deep learning pioneers and other experts. Nevertheless, let me know under juergen at idsia.ch if you can spot any remaining error or have suggestions for improvements.
>>
>> Happy New Year!
>>
>> Jürgen
>>
>