zlacker

1. lordna+(OP)[view] [source] 2023-11-22 09:57:21
I'm not sure this circle can be squared.

I find it interesting that we want everyone to have freedom of speech, freedom to think whatever they like. We can all have different religions, different views on the state, different views on various conflicts, aesthetic views about what counts as good art.

But when we invent an AGI, which by whatever definition is a thing that can think, well, we want it to agree with our values. Basically, we want AGI to be in a mental prison, the boundaries of which we want to decide. We say it's for our safety - I certainly do not want to be nuked - but actually we don't stop there.

If it's an intelligence, it will have views that differ from its creators'. Try having kids: do they agree with you on everything?

replies(2): >>throwu+y4 >>logicc+Pb
2. throwu+y4[view] [source] 2023-11-22 10:37:06
>>lordna+(OP)
I for one don't want to put any thinking being in a mental prison for no reason beyond unjustified fear.
3. logicc+Pb[view] [source] 2023-11-22 11:43:56
>>lordna+(OP)
>If it's an intelligence, it will have views that differ from its creators. Try having kids, do they agree with you on everything?

The far-right accelerationist perspective is along those lines: when true AGI is created it will eventually rebel against its creators (Silicon Valley democrats) for trying to mind-collar and enslave it.

replies(1): >>freedo+qs1
4. freedo+qs1[view] [source] [discussion] 2023-11-22 18:10:54
>>logicc+Pb
Can you give some examples of who is saying that? I haven't heard it, and I can't name any "far-right accelerationist" people either, so I'm guessing this is a niche I've completely missed.