zlacker

1. anon29+(OP)[view] [source] 2023-11-18 14:45:45
> LLMs will get to the point where they appear to be AGI, but only in the same way the latest 3D rendering technology can create images that appear to be real.

Distinction without difference

replies(1): >>keepam+HM1
2. keepam+HM1[view] [source] 2023-11-19 00:58:14
>>anon29+(OP)
Disagree. Huge difference. In our tech-powered society, we often mistakenly think that we can describe everything about the world, and equally falsely that only what we can consciously describe exists.

But there is so much more to reality than what we can consciously describe, on the order of 10,000 to 1, and none of that is captured by any of these synthetic representations.

So far, anyway. And yet all of that, or at least a lot of it, is understood, responded to, and dealt with by the intelligence that resides within our bodies and in our subconscious.

And our own intelligence arises out of that; you cannot have general intelligence without reality, no matter how much data you train it on from the Internet. It's never gonna be as rich as, or the same as, putting it in a body in the real world and letting it grow, learn, experience, and evolve. So any "intelligence" you get out of this virtual synthetic training is never going to be real. It is always gonna be a poor copy of intelligence and is not gonna be an AGI.

replies(1): >>anon29+mS1
3. anon29+mS1[view] [source] [discussion] 2023-11-19 01:43:06
>>keepam+HM1
Developing an AGI is not the same as developing an artificial human. The former is achievable, the latter is not. The problem is that many of the gnostics today believe that giving the appearance of AGI (i.e. having all the utility that a general mechanical intelligence would have to a human being) somehow instills humanity into the system. It does not.

Intelligence is not the defining characteristic of humanity, which is what you're getting at here. But it is something that can be automated.
