<div dir="ltr"><div>There are quite a few researchers spending a lot of effort trying to understand the origins of consciousness and whether the subjective experience of consciousness can be captured in a descriptive and ideally mathematical manner. Tononi, Albantakis, Seth, O'Regan, just to name a few; one does not have to agree with them, but this question has received a lot of attention, and it is worth having a look before discussing it in a vacuum. Also worth reading, amongst others, are Dennett and Chalmers (as a side remark: some of you may remember the latter, as he actually ran a nice Evolutionary Algorithm experiment in the 90s showing how the Widrow-Hoff rule emerged as the "optimal" learning rule in a neural-type learning scenario). <br></div><div><br></div><div>The claim that consciousness is an exclusively human ability (as is often insinuated) is probably no longer seriously discussed; it is pretty clear that even self-awareness extends significantly beyond humans, to say nothing of subjective experience, which does away with the requirement of self-reflection. It is certainly far safer to assume that consciousness comes in degrees across the animal kingdom than to claim that it is either present or absent. It seems that even our understanding of elementary experiences must be revised, as, e.g., lobsters may actually feel pain.</div><div><br></div><div>Thus we should be careful about making sweeping statements about the presence of consciousness in the biological realm. It is indeed a very interesting question to what extent (if at all) an artificial system can have such experiences, too; what if the artificial system is a specifically designed, but growing, biological neuron culture on an agar plate? If the answer is yes for the latter but no for the former, what is the core difference? Is it the recurrence that matters? The embodiment? Some aspect of its biological makeup? 
Something else?<br></div><div><br></div><div>I do not think we have good answers for this at this stage, but only some vague hints.<br></div><div><br></div><div><br></div><div><br></div><div><br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Mon, Feb 14, 2022 at 4:22 PM Iam Palatnik <<a href="mailto:iam.palat@gmail.com" target="_blank">iam.palat@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div>A somewhat related question, just out of curiosity.</div><div><br></div><div>Imagine the following:</div><div><br></div><div></div><div>- An automatic solar panel that tracks the position of the sun.<br></div><div><div>- A group of single celled microbes with phototaxis that follow the sunlight.</div><div>- A jellyfish (animal without a brain) that follows/avoids the sunlight.</div><div>
- A cockroach (animal with a brain) that avoids the sunlight.</div><div>- A drone with onboard AI that flies to regions of more intense sunlight to recharge its batteries.<br></div><div>- A human who dislikes sunlight and actively avoids it.<br></div><div><br></div><div>Can any of these, besides the human, be said to be aware or conscious of the sunlight, and why?</div><div>What is most relevant? Being a biological life form, having a brain, being able to make decisions based on the environment? Being taxonomically close to humans?<br></div><div><br></div><div><br>
</div><div><br></div><div><br></div><div><br></div><div><br></div></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Mon, Feb 14, 2022 at 12:06 PM Gary Marcus <<a href="mailto:gary.marcus@nyu.edu" target="_blank">gary.marcus@nyu.edu</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Also true: Many AI researchers are very unclear about what consciousness is and also very sure that ELIZA doesn’t have it.<br>
<br>
Neither ELIZA nor GPT-3 has<br>
- anything remotely related to embodiment<br>
- any capacity to reflect upon themselves<br>
<br>
Hypothesis: neither keyword matching nor tensor manipulation, even at scale, suffice in themselves to qualify for consciousness.<br>
<br>
- Gary<br>
<br>
> On Feb 14, 2022, at 00:24, Geoffrey Hinton <<a href="mailto:geoffrey.hinton@gmail.com" target="_blank">geoffrey.hinton@gmail.com</a>> wrote:<br>
> <br>
> Many AI researchers are very unclear about what consciousness is and also very sure that GPT-3 doesn’t have it. It’s a strange combination.<br>
> <br>
> <br>
<br>
</blockquote></div>
</blockquote></div>