zlacker

[parent] [thread] 3 comments
1. paxys+(OP)[view] [source] 2024-01-08 23:00:52
AGI doesn't have to mean superintelligence/singularity (which seems to be what you are describing).
replies(1): >>endisn+g2
2. endisn+g2[view] [source] 2024-01-08 23:12:24
>>paxys+(OP)
What is your definition for AGI that isn't already met?
replies(1): >>paxys+54
3. paxys+54[view] [source] [discussion] 2024-01-08 23:21:11
>>endisn+g2
Intelligence involves self-learning and self-correction. AIs today are trained for specific tasks on specific data sets and cannot expand beyond that. If you give an LLM a question it cannot answer, and it goes and figures out how to answer it without additional help, that would be behavior qualifying it as AGI.
replies(1): >>endisn+v5
4. endisn+v5[view] [source] [discussion] 2024-01-08 23:27:59
>>paxys+54
by that definition, what you realize is that it's the same as what I said, since it can easily be reduced to "anything any human can do," and your definition says AGI can go figure out how to do it. you extrapolate this onto future tasks and voilà.

as I mention in another post, this is why I do not make any distinction between AGI and superintelligence. I believe they are the same thing. a thought experiment - what would it mean for a human to be superintelligent? presumably it would mean learning things with the least possible amount of exposure (not omniscience, necessarily).
