zlacker

[return to "Jan Leike Resigns from OpenAI"]
1. ganzuu+z41[view] [source] 2024-05-15 14:26:26
>>Jimmc4+(OP)
I bet superalignment is indistinguishable from religion (the spiritual kind, not the manipulative kind), so proponents get frequency-pulled into the well-established cult-leader pipeline. It's a quagmire to navigate, so we can't have discussions about what is going on that are both open and enlightening.
◧◩
2. uLogMi+Y41[view] [source] 2024-05-15 14:28:22
>>ganzuu+z41
I thought the whole point of making a transparent organization to lead the charge on AI was so that we could prevent this sort of ego and the other risks that come with it.
◧◩◪
3. ganzuu+q71[view] [source] 2024-05-15 14:40:29
>>uLogMi+Y41
Say I have intelligence x and a superintelligence has 10x. Then I get stuck in local minima that the 10x is able to get out of. To me, the local minimum looks "good", so when I see the 10x climb out of my "good", I'm most likely looking at something that appears "evil" to me, even if that is just my limited perspective.
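
A toy sketch of that intuition (everything here is made up for illustration: the landscape values and the horizon knob are arbitrary). A greedy searcher that only evaluates adjacent points stalls in a shallow minimum, while a wider-lookahead searcher walks uphill through worse values to reach a deeper one. Judged one step at a time, those uphill moves look strictly bad:

    # Toy landscape: shallow local minimum at i=2, deeper global minimum at i=7.
    LANDSCAPE = [5, 3, 1, 4, 6, 4, 2, 0, 3, 5]

    def f(i):
        return LANDSCAPE[i]

    def step_greedy(i):
        # "x" searcher: move to the better adjacent point, else stay put.
        neighbors = [j for j in (i - 1, i + 1) if 0 <= j < len(LANDSCAPE)]
        best = min(neighbors, key=f)
        return best if f(best) < f(i) else i

    def step_lookahead(i, horizon=5):
        # "10x" searcher: scan up to `horizon` steps out, then take one
        # step toward the best point in view, even if that step is uphill.
        window = [j for j in range(i - horizon, i + horizon + 1)
                  if 0 <= j < len(LANDSCAPE)]
        best = min(window, key=f)
        return i if best == i else i + (1 if best > i else -1)

    greedy = look = 0
    for _ in range(10):
        greedy, look = step_greedy(greedy), step_lookahead(look)

    print(greedy, f(greedy))  # 2 1: stuck in the "good"-looking minimum
    print(look, f(look))      # 7 0: got there by first climbing through f=4 and f=6

From the greedy searcher's one-step view, the move from f=1 up to f=4 is indistinguishable from a plain bad move; it only reads as "good" from the wider view.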

It's one hell of a problem.

◧◩◪◨
4. esmeva+ze1[view] [source] 2024-05-15 15:12:28
>>ganzuu+q71
Short version:

I agree it's a problem, but it isn't incumbent on the 'x' peers to solve it. That burden falls to any supposed '10x'.

Long version:

I agree with you, though I would add that a superintellect at '10x' that couldn't look at the 'x' baseline of those around it and navigate it effectively (in other words, couldn't organize its thoughts and present them in a way that seems safe or good) is just plain never going to function sustainably at a '10x' level in an ecosystem full of normal 'x' peers.

I think the whole point of Stranger in a Strange Land is about this. The Martian is (generally) not only completely ascendant, he's also incredibly effective at leveraging his ascendancy. Repeatedly, characters who find him abhorrent at a distance chill out as they begin to grok him.

The reality is that this is an ecosystem of normal 'x' peers, and the '10x', as the abnormality, needs "functional and effective in an ecosystem of 'x' peers" as part of its core skill set, or else none of us (not even the '10x' itself) can ever recognize or utilize its supposed '10x' capacity.

◧◩◪◨⬒
5. ganzuu+gj1[view] [source] 2024-05-15 15:35:05
>>esmeva+ze1
That's what I meant, once you apply what happens in practice to the theory. It's a response to a comment about ego and cults, so I tried to be as politic as I could... which just isn't sufficient. My entire premise is that this subject is something familiar and controversial in a new guise, so there are going to be a lot of knee-jerk reactions as soon as you bring up anything that looks like a pain point.

For reference, I think most of us are '10x' in one particular field, and that is our talent. Society-in-scarcity rewards talents unequally, so we get status and ego, which result in a host of dark patterns. I think AI can ease scarcity, so I keep betting on this horse to solve the real problem, which is ego.

[go to top]