Connectionists: Weird beliefs about consciousness

Daniel Polani daniel.polani at gmail.com
Mon Feb 14 12:04:02 EST 2022


There are quite a few researchers spending a lot of effort trying to
understand the origins of consciousness and whether and how the
subjective experience of consciousness can be captured in a descriptive
and ideally mathematical manner. Tononi, Albantakis, Seth, O'Regan, just
to name a few; one does not have to agree with them, but this question
has been given a lot of attention and it is worth having a look before
discussing it in a vacuum. Also worth reading, amongst others, Dennett
and Chalmers (just as a side remark: some of you may remember the latter,
as he actually ran a nice Evolutionary Algorithm experiment in the 90s
showing how the Widrow-Hoff rule emerged as the "optimal" learning rule
in a neural-type learning scenario).
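
For those who do not recall it, the Widrow-Hoff (or delta) rule simply
nudges a linear unit's weights in proportion to the prediction error.
Below is a minimal, hypothetical sketch of that update on a toy task;
the variable names and the task itself are my own illustration, not
Chalmers' original experimental setup.

    import numpy as np

    # Widrow-Hoff / delta rule: w <- w + eta * (target - prediction) * x
    # Toy linear unit learning a random linear mapping (illustration only).
    rng = np.random.default_rng(0)
    true_w = rng.normal(size=3)        # unknown target weights
    w = np.zeros(3)                    # learned weights
    eta = 0.1                          # learning rate

    for step in range(1000):
        x = rng.normal(size=3)         # random input pattern
        target = true_w @ x            # desired output
        prediction = w @ x             # current output of the linear unit
        w += eta * (target - prediction) * x   # delta-rule update

    print(np.allclose(w, true_w, atol=1e-2))   # should print True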

The notion that consciousness is an exclusively human ability (as is
often insinuated) is probably no longer seriously discussed; it is
pretty clear that even self-awareness extends significantly beyond
humans, to say nothing of subjective experience, which does not even
require self-reflection. It is certainly far safer to assume that
consciousness in the animal kingdom is a matter of degree than to claim
that it is either present or absent. It seems that even our
understanding of elementary experiences must be revised, as e.g.
lobsters may actually feel pain.

Thus we should be careful about making sweeping statements regarding
the presence of consciousness in the biological realm. It is indeed a
very interesting question to what extent (if at all) an artificial
system can have such experiences, too; what if the artificial system is
a specifically designed, but growing, biological neuron culture on an
agar plate? If the answer is yes for the latter, but no for the former,
what is the core difference? Is it the recurrence that matters? The
embodiment? Some aspect of its biological makeup? Something else?

I do not think we have good answers for this at this stage, only some
vague hints.

On Mon, Feb 14, 2022 at 4:22 PM Iam Palatnik <iam.palat at gmail.com> wrote:

> A somewhat related question, just out of curiosity.
>
> Imagine the following:
>
> - An automatic solar panel that tracks the position of the sun.
> - A group of single celled microbes with phototaxis that follow the
> sunlight.
> - A jellyfish (animal without a brain) that follows/avoids the sunlight.
> - A cockroach (animal with a brain) that avoids the sunlight.
> - A drone with onboard AI that flies to regions of more intense sunlight
> to recharge its batteries.
> - A human that dislikes sunlight and actively avoids it.
>
> Can any of these, besides the human, be said to be aware or conscious of
> the sunlight, and why?
> What is most relevant? Being a biological life form, having a brain, being
> able to make decisions based on the environment? Being taxonomically close
> to humans?
>
>
> On Mon, Feb 14, 2022 at 12:06 PM Gary Marcus <gary.marcus at nyu.edu> wrote:
>
>> Also true: Many AI researchers are very unclear about what consciousness
>> is and also very sure that ELIZA doesn’t have it.
>>
>> Neither ELIZA nor GPT-3 have
>> - anything remotely related to embodiment
>> - any capacity to reflect upon themselves
>>
>> Hypothesis: neither keyword matching nor tensor manipulation, even at
>> scale, suffice in themselves to qualify for consciousness.
>>
>> - Gary
>>
>> > On Feb 14, 2022, at 00:24, Geoffrey Hinton <geoffrey.hinton at gmail.com>
>> wrote:
>> >
>> > Many AI researchers are very unclear about what consciousness is and
>> also very sure that GPT-3 doesn’t have it. It’s a strange combination.
>> >
>> >
>>
>>