Connectionists: Deep Learning Overview Draft

Juyang Weng weng at cse.msu.edu
Thu Apr 17 16:07:58 EDT 2014


Dear Juergen,

Congratulations on the draft and its 600+ references!  Thank you very much 
for asking for references. Cresceptron generated heated debate at the time 
(e.g., Takeo Kanade's comments).   Some people commented that 
Cresceptron started learning for computer vision from cluttered 
scenes.  Of course, it had many problems then. To save your time, I have 
cut and pasted the major characterization of Cresceptron from my web page:

1991 (IJCNN 1992) - 1997 (IJCV): Cresceptron 
<http://www.cse.msu.edu/%7Eweng/research/cresceptron.html>.

- It appeared to be the first deep learning network that adapted its 
connection structure.
- It appeared to be the first visual learning program for both detecting 
and recognizing general objects in cluttered, complex natural backgrounds.
- It also did segmentation, but in a separate top-down segmentation 
phase during which the network did not do recognition.
- The number of neural planes grew dynamically and incrementally from 
interactive experience, but the number of layers (15 in the experiments) 
was determined by the image size.
- All the internal network learning was fully automatic --- there was no 
need for manual intervention once the learning (development) had started.
- It required pre-segmentation for teaching: A human outlined the object 
contours for supervised learning. This avoided learning background.
- Its internal features were automatically grouped through last-layer 
motor supervision (class labels), but the learning of internal features 
was entirely unsupervised.
- It used a local match-and-maximization paired-layer architecture, 
which corresponds to logic-AND and logic-OR in multivalued logic 
(Tomaso Poggio later used the term HMAX).
- The network's intrinsic convolution mechanism provided both shift 
invariance and distortion tolerance. (Later WWNs are better at learning 
location as one of the concepts.)
- It is a cascade network: features in a layer are learned from features 
of the previous layer, but not from earlier layers. (This cascade 
restriction was overcome by later WWNs.)
- It was inspired by the Neocognitron (K. Fukushima, 1975), which was 
designed for recognizing individual characters against a uniform background.
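The paired match (logic-AND) and maximization (logic-OR) layers mentioned above can be sketched roughly as follows. This is an illustrative simplification, not the actual Cresceptron implementation; the normalized-correlation match and 2x2 max pooling are my own assumptions:

```python
import numpy as np

def match_layer(image, template):
    """AND-like stage (assumed form): slide the template over the image
    and compute a normalized correlation at every position, so a unit
    responds strongly only when all parts of the pattern are present."""
    th, tw = template.shape
    h, w = image.shape
    out = np.zeros((h - th + 1, w - tw + 1))
    t = template - template.mean()
    tn = np.linalg.norm(t) + 1e-9
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i:i + th, j:j + tw]
            p = patch - patch.mean()
            out[i, j] = (p * t).sum() / (np.linalg.norm(p) * tn + 1e-9)
    return out

def max_layer(resp, pool=2):
    """OR-like stage: keep the maximum response in each pool x pool
    neighborhood, giving tolerance to small shifts of the pattern."""
    h, w = resp.shape
    out = np.zeros((h // pool, w // pool))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = resp[i * pool:(i + 1) * pool,
                             j * pool:(j + 1) * pool].max()
    return out
```

Because the max stage discards exact position within each pool, a pattern shifted by a pixel or two still yields a strong pooled response, which is the shift-invariance mechanism referred to above.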

If you are so kind as to cite it, I guess it probably belongs in your 
section 5.9, 1991-: Deep Hierarchy of Recurrent NNs.
If it does not fit your article, please accept my apology for wasting 
your time.

Just my 2 cents' worth. :)

Best regards,

-John

On 4/17/14 11:40 AM, Schmidhuber Juergen wrote:
> Dear connectionists,
>
> here the preliminary draft of an invited Deep Learning overview:
>
> http://www.idsia.ch/~juergen/DeepLearning17April2014.pdf
>
> Abstract. In recent years, deep neural networks (including recurrent ones) have won numerous contests in pattern recognition and machine learning. This historical survey compactly summarises relevant work, much of it from the previous millennium. Shallow and deep learners are distinguished by the depth of their credit assignment paths, which are chains of possibly learnable, causal links between actions and effects. I review deep supervised learning (also recapitulating the history of backpropagation), unsupervised learning, reinforcement learning & evolutionary computation, and indirect search for short programs encoding deep and large networks.
>
> The draft mostly consists of references (about 600 entries so far). Many important citations are still missing though. As a machine learning researcher, I am obsessed with credit assignment. In case you know of references to add or correct, please send brief explanations and bibtex entries to juergen at idsia.ch (NOT to the entire list), preferably together with URL links to PDFs for verification. Please also do not hesitate to send me additional corrections / improvements / suggestions / Deep Learning success stories with feedforward and recurrent neural networks. I'll post a revised version later.
>
> Thanks a lot!
>
> Juergen Schmidhuber
> http://www.idsia.ch/~juergen/
> http://www.idsia.ch/~juergen/whatsnew.html
>
>
>

-- 
Juyang (John) Weng, Professor
Department of Computer Science and Engineering
MSU Cognitive Science Program and MSU Neuroscience Program
428 S Shaw Ln Rm 3115
Michigan State University
East Lansing, MI 48824 USA
Tel: 517-353-4388
Fax: 517-432-1061
Email: weng at cse.msu.edu
URL: http://www.cse.msu.edu/~weng/
----------------------------------------------
