zlacker

[return to "Thousands of AI Authors on the Future of AI"]
1. endisn+uc[view] [source] 2024-01-08 22:23:55
>>treebr+(OP)
Maybe I'm too pessimistic, but I doubt we will have AGI even by 2100. I define AGI as a non-human intelligence that can do anything any human has ever done or will ever do, even with the aid of technology, so long as that technology does not include the AGI itself.*

* By this definition I also mean that humanity would no longer be able to contribute in any meaningful qualitative way to intellectual tasks (e.g. AGI > human; AGI > human + computer; AGI > human + internet; AGI > human + LLM).

Fundamentally, I believe AGI will never happen without a body. Intelligence requires constraints, and the ultimate constraint is life. Some omniscient, immortal thing sounds neat, but I doubt it would be as smart, since it lacks any constraints to drive its growth.

◧◩
2. paxys+Ck[view] [source] 2024-01-08 23:00:52
>>endisn+uc
AGI doesn't have to mean superintelligence/singularity (which seems to be what you are describing).
◧◩◪
3. endisn+Sm[view] [source] 2024-01-08 23:12:24
>>paxys+Ck
What is your definition of AGI that isn't already met?
◧◩◪◨
4. paxys+Ho[view] [source] 2024-01-08 23:21:11
>>endisn+Sm
Intelligence involves self-learning and self-correction. Today's AIs are trained for specific tasks on specific data sets and cannot expand beyond that. If you give an LLM a question it cannot answer, and it goes and figures out how to answer it without additional help, that would be behavior qualifying it as AGI.
◧◩◪◨⬒
5. endisn+7q[view] [source] 2024-01-08 23:27:59
>>paxys+Ho
By that definition, what you realize is that it's the same as what I said, since it reduces down to anything any human can do, and your definition says AGI can go figure out how to do it. Extrapolate this onto future tasks and voilà.

As I mention in another post, this is why I make no distinction between AGI and superintelligence; I believe they are the same thing. A thought experiment: what would it mean for a human to be superintelligent? Presumably it would mean learning things from the least possible amount of exposure (not necessarily omniscience).
