Connectionists: Weird beliefs about consciousness

Ali Minai minaiaa at gmail.com
Mon Feb 14 10:42:18 EST 2022


Gary

I agree with all of that completely. However, I do have a further question.
How will we ever know whether even a fully embodied, high-performing
intelligent machine that can “explain” its decisions has the capacity for
actual self-reflection? How can we be sure that other humans have the
capacity for self-reflection, beyond the fact that they are like us, give
us understandable reports of their inner thinking, and tickle our mirror
system in the right way?

This is not to deny that humans can do self-reflection; it is about whether
we have an objective method to know that another species not like us is
doing it. At some point in what I think will be a fairly distant future, an
nth-generation descendant of GPT-3 may be so sophisticated that the
question will become moot. That is the inevitable consequence of a
non-dualistic view of intelligence that we all share, and will come to pass
unless the current narrow application-driven view of AI derails the entire
project.

I do think that such a system will have a very different neural
architecture much closer to the brain’s, will be embodied, will not need to
learn using billions of data points and millions of gradient-descent
iterations, and will be capable of actual mental growth (not just learning
more stuff but learning new modes of understanding) over its lifetime. It
will also not be able to explain its inner thoughts perfectly, nor feel the
need to do so because it will not be our obedient servant. In other words,
it will not be anything like GPT-3.

Ali



On Mon, Feb 14, 2022 at 10:06 AM Gary Marcus <gary.marcus at nyu.edu> wrote:

> Also true: Many AI researchers are very unclear about what consciousness
> is and also very sure that ELIZA doesn’t have it.
>
> Neither ELIZA nor GPT-3 has
> - anything remotely related to embodiment
> - any capacity to reflect upon themselves
>
> Hypothesis: neither keyword matching nor tensor manipulation, even at
> scale, suffices in itself to qualify for consciousness.
>
> - Gary
>
> > On Feb 14, 2022, at 00:24, Geoffrey Hinton <geoffrey.hinton at gmail.com>
> wrote:
> >
> > Many AI researchers are very unclear about what consciousness is and
> also very sure that GPT-3 doesn’t have it. It’s a strange combination.
> >
> >
>
--
*Ali A. Minai, Ph.D.*
Professor and Graduate Program Director
Complex Adaptive Systems Lab
Department of Electrical Engineering & Computer Science
828 Rhodes Hall
University of Cincinnati
Cincinnati, OH 45221-0030

Past-President (2015-2016)
International Neural Network Society

Phone: (513) 556-4783
Fax: (513) 556-7326
Email: Ali.Minai at uc.edu
          minaiaa at gmail.com

WWW: https://eecs.ceas.uc.edu/~aminai/