zlacker

1. rglove+(OP) 2023-07-06 01:43:50
> The extinction risk from unaligned superintelligent AGI is real

Yes, but for a reason no one seems to be looking at: skill atrophy. As more and more people buy into the premise that AI is "super intelligent," they will cede more and more cognitive work to it.

Extrapolate that trend ~10-20 years out and AI doesn't kill us by taking over all of our work; people just get too lazy (read: over-dependent on AI doing "all the things") and subsequently too dumb to do the work themselves. Idiocracy, but the M. Night Shyamalan version.

As we approach that point, systems that require some form of conscious human input will begin to fail, and the bubble will burst.
