zlacker

15 comments
1. oska+(OP)[view] [source] 2023-11-18 10:31:13
I'm defining intelligence in the usual way, and intelligence requires understanding, which is not possible without consciousness.

I follow Roger Penrose's thinking here. [1]

[1] https://www.youtube.com/watch?v=2aiGybCeqgI&t=721s

replies(3): >>concor+Z1 >>wilder+ia >>Zambyt+Zt
2. concor+Z1[view] [source] 2023-11-18 10:46:57
>>oska+(OP)
> intelligence requires understanding which is not possible without consciousness

How are you defining "consciousness" and "understanding" here? Because a feedback loop into an LLM would meet the most common definition of consciousness (possessing a phonological loop). And having an accurate internal predictive model of a system is the normal definition of understanding, and a good LLM has that too.

replies(1): >>Feepin+nv
3. wilder+ia[view] [source] 2023-11-18 11:52:44
>>oska+(OP)
It’s cool to see people recognizing this basic fact — consciousness is a prerequisite for intelligence. GPT is a philosophical zombie.
replies(1): >>bagofs+fD
4. Zambyt+Zt[view] [source] 2023-11-18 14:01:00
>>oska+(OP)
I think answering this may illuminate the division in schools of thought: do you believe life was created by a higher power?
replies(1): >>oska+jv
5. oska+jv[view] [source] [discussion] 2023-11-18 14:08:56
>>Zambyt+Zt
My beliefs aren't really important here, but I don't believe in 'creation' (i.e. no life -> life); I believe that life has always existed.
replies(2): >>concor+wy >>Zambyt+P63
6. Feepin+nv[view] [source] [discussion] 2023-11-18 14:09:05
>>concor+Z1
No, you're not supposed to actually have an empirical model of consciousness. "Consciousness" is just "that thing that computers don't have".
7. concor+wy[view] [source] [discussion] 2023-11-18 14:25:46
>>oska+jv
Now that is so rare I've never even heard of someone expressing that view before...

Materialists normally believe in a big bang (which has no life) and religious people normally think a higher being created the first life.

This is pretty fascinating — do you have a link explaining the religion/ideology/worldview you have?

replies(1): >>nprate+hI
8. bagofs+fD[view] [source] [discussion] 2023-11-18 14:51:08
>>wilder+ia
Problem is, we have no agreed-upon operational definition of consciousness. Arguably, it's the secular equivalent of the soul: something everyone believes they have, but which is not testable, locatable, or definable.

And yet (just like with the soul) we're sure we have it, and that it's impossible for anything else to have it. Perhaps consciousness is simply a hallucination that makes us feel special about ourselves.

replies(2): >>howrar+1M >>wilder+lW
9. nprate+hI[view] [source] [discussion] 2023-11-18 15:25:06
>>concor+wy
Buddhism
10. howrar+1M[view] [source] [discussion] 2023-11-18 15:47:58
>>bagofs+fD
You can't even know that other people have it. We just assume they do because they look and behave like us, and we know that we have it ourselves.
11. wilder+lW[view] [source] [discussion] 2023-11-18 16:46:17
>>bagofs+fD
I disagree. There is a simple test for consciousness: empathy.

Empathy is the ability to emulate the contents of another consciousness.

While an agent could mimic empathetic behaviors (and words), given enough interrogation and testing you would encounter an out-of-training case on which it would fail.

replies(2): >>concor+t21 >>int_19+N82
12. concor+t21[view] [source] [discussion] 2023-11-18 17:14:27
>>wilder+lW
Uh... so is it autistic people or non-autistic people who lack consciousness? (Generally, autistic people emulate other autistic people better, and non-autistic people emulate non-autistic people better.)

> given enough interrogation and testing you would encounter an out-of-training case that it would fail

This is also the case with regular humans.

13. int_19+N82[view] [source] [discussion] 2023-11-18 23:29:58
>>wilder+lW
For one thing, this would imply that clinical psychopaths aren't conscious, which would be a very weird takeaway.

But also, how do you know that LLMs aren't empathic? By your own admission they do "mimic empathetic behaviors", but you reject this as the real thing because you claim that with enough testing you would encounter a failure. This raises all kinds of "no true Scotsman" flags, not to mention that empathy failure is not exactly uncommon among humans. So how exactly do you actually test your hypothesis?

replies(1): >>wilder+vo4
14. Zambyt+P63[view] [source] [discussion] 2023-11-19 06:32:51
>>oska+jv
Do you believe:

1) Earth has an infinite past that has always included life

2) The Earth as a planet has a finite past, but it (along with what made up the Earth) is in some sense alive, and life as we know it emerged from that life

3) The Earth has a finite past, and life has transferred to Earth from somewhere else in space

4) We are the Universe, and the Universe is alive

Or something else? I will try to tie it back to computers after this short intermission :)

15. wilder+vo4[view] [source] [discussion] 2023-11-19 17:07:17
>>int_19+N82
Great point and great question! Yes, it does imply that people who lack the capacity for empathy (as opposed to those who do not utilize their capacity for empathy) may lack conscious experience. Empathy failure here means lacking the data empathy provides, rather than ignoring the data empathy provides (which, as you note, is common).

I've got a few prompts that are somewhat promising in terms of clearly showing that GPT4 is unable to correctly predict human behavior driven by human empathy. The prompts are basic thought experiments where a person has two choices: an irrational yet empathic choice, and a rational yet non-empathic choice. GPT4 does not seem able to predict that smart humans do dumb things due to empathy, unless it is prompted with such a suggestion. If it had empathy itself, it would not need to be prompted about empathy.
replies(1): >>int_19+dv5
16. int_19+dv5[view] [source] [discussion] 2023-11-19 22:19:37
>>wilder+vo4
Can you give some examples of such prompts?