Connectionists: Stephen Hanson in conversation with Geoff Hinton

Tsvi Achler achler at gmail.com
Wed Jul 20 03:51:59 EDT 2022


I believe one way to address this problem, namely that learning with
different types of features may give up the other virtues (of Transformers
etc.), is to scale better by:

1) reducing the cost of learning, so that the same information can be
learned over more feature types at once;

2) allowing new features and new learning to be added modularly to the
model (e.g. avoiding catastrophic forgetting; see the sketch below);

3) not deciding ahead of time which features are most important;

4) taking a shotgun approach and learning with as many features as possible.

These goals can be better achieved if the network's learning (or at least
the top layer's learning) does not require i.i.d. (independent and
identically distributed) rehearsal and is highly scalable.

Feedforward methods (e.g. current neural networks) have issues with 1 and
2, while most other methods, such as Bayesian networks, have problems with
1 and with scalability.
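
To make points 1 and 2 concrete, below is a minimal, hypothetical sketch
(it is not my actual method, just an illustration) of a top layer that
keeps only per-class running means over fixed features. Adding a new class
never touches the parameters stored for the other classes, updates can
arrive in any order, and no rehearsal of old data is needed. The class
name NearestMeanTopLayer and the toy data are made up for illustration.

import numpy as np

class NearestMeanTopLayer:
    """Top-layer classifier over fixed (frozen) features.

    Each class is summarized by a running mean of its feature vectors, so
    new classes can be added modularly: updating one class never modifies
    what is stored for the others, and examples can arrive in any order
    (no i.i.d. rehearsal of old data is required).
    """

    def __init__(self):
        self.sums = {}    # class label -> summed feature vector
        self.counts = {}  # class label -> number of examples seen

    def partial_fit(self, x, label):
        """Fold one (feature vector, label) pair into the running stats."""
        if label not in self.sums:
            self.sums[label] = np.zeros_like(x, dtype=float)
            self.counts[label] = 0
        self.sums[label] += x
        self.counts[label] += 1

    def predict(self, x):
        """Return the label whose mean feature vector is closest to x."""
        return min(
            self.sums,
            key=lambda c: np.linalg.norm(x - self.sums[c] / self.counts[c]),
        )

# Usage: stream examples of class "a", then later add class "b"
# without revisiting any "a" data.
clf = NearestMeanTopLayer()
for x in np.random.randn(50, 8) + 2.0:
    clf.partial_fit(x, "a")
for x in np.random.randn(50, 8) - 2.0:
    clf.partial_fit(x, "b")              # modular addition; "a" untouched
print(clf.predict(np.full(8, 1.8)))      # expected: "a"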

My 2c,
-Tsvi


On Tue, Jul 19, 2022 at 12:09 AM Gary Marcus <gary.marcus at nyu.edu> wrote:

> sure, but this goes back to Tom’s point about representation; I addressed
> this sort of thing at length in Chapter 3 of The Algebraic Mind.
>
> You can solve this one problem in that way but then give up most of the
> other virtues of Transformers etc if you build a different network and
> representational scheme for each problem you encounter.
>
> > On Jul 18, 2022, at 9:19 AM, Barak A. Pearlmutter <barak at pearlmutter.net>
> wrote:
> >
> > On Mon, 18 Jul 2022 at 17:02, Gary Marcus <gary.marcus at nyu.edu> wrote:
> >>
> >> sure, but a person can learn [n-bit parity] from a few examples with
> >> a small number of bits, generalizing it to large values of n. Most current
> >> systems learn it for a certain number of bits and don’t generalize beyond
> >> that number of bits.
> >
> > Really? Because I would not think that induction of a two-state DFA
> > over a two-symbol alphabet would be beyond the current state of the
> > art.
> >
> > --Barak Pearlmutter.
>
>
>
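
P.S. On the parity exchange above: Barak's point, as I read it, is that
n-bit parity is recognized exactly by a two-state DFA over the alphabet
{0, 1}, so the concept itself does not grow with n. A purely illustrative
sketch (not from the thread):

def parity_dfa(bits):
    """Two-state DFA over {0, 1} that recognizes odd parity.

    State 0 = even number of 1s seen so far, state 1 = odd. Because the
    machine carries only this single bit of state, it handles inputs of
    any length n; the concept stays tiny even as bit strings grow.
    """
    state = 0
    for b in bits:
        if b == 1:
            state = 1 - state   # flip on a 1, stay on a 0
    return state                # 1 if the count of 1s is odd, else 0

print(parity_dfa([1, 0, 1, 1]))          # 1 (three ones -> odd)
print(parity_dfa([0] * 1000 + [1, 1]))   # 0; input length plays no role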