zlacker

[return to "Introducing Superalignment"]
1. skepti+pe[view] [source] 2023-07-05 17:55:07
>>tim_sw+(OP)
Why are they starting to sound more and more cult-like? This is an incredibly unscientific blog post. I get that they are a private company now, but why even release something like this without further details?
◧◩
2. batman+vn[view] [source] 2023-07-05 18:26:25
>>skepti+pe
Because the "AGI" pursuit is at least as much a faith movement as it is a rational engineering program. If you examine it more deeply, the faith object isn't even the conjectured inevitable AGI; it's exponential growth curves. (That is of course true for startup culture more generally, from which the current AI boom is an outgrowth.) For my money, The Singularity is Near still counts as the ur-text that the true believers will never let go of, even though Kurzweil was summarizing earlier belief trends.

It's just a pity that the creepy doomer weirdos so thoroughly squatted on the term "rationalist." It would be interesting to see the perspective on these people 100 years hence, or even 50. I don't doubt there will still be remnant believers who end up moderating and sanitizing their beliefs, much like the Seventh-day Adventists or the Mormons.

◧◩◪
3. zzzzzz+7F[view] [source] 2023-07-05 19:36:20
>>batman+vn
you don't need to believe in exponential growth per se - all you really need to believe is that humans aren't that capable relative to what could in principle be built. It's entirely possible that logistic growth is more than enough to get us very far past human ability once the right paradigm is discovered.
◧◩◪◨
4. trasht+Me1[view] [source] 2023-07-05 22:33:36
>>zzzzzz+7F
Exactly. All that is needed for AGI to eventually be developed is that humans do NOT have some magical or divine essence that sets us apart from the material world.

Now, the _timeline_ for AGI could have been anything from a few years to millennia, at least as estimated 40 years ago. Today, though, it really doesn't seem very distant.
