zlacker

[parent] [thread] 4 comments
1. olddus+(OP)[view] [source] 2023-05-16 19:25:12
You miss my point. Just because you want to turn it off doesn't mean the person who wants to acquire billions, rule the world, or destroy humanity does.

The people who profit from a killer AI will fight to defend it.

replies(1): >>tomrod+s1
2. tomrod+s1[view] [source] 2023-05-16 19:30:11
>>olddus+(OP)
And they will be subject to the same risks they point their killer robots at, as well as being vulnerable themselves.

Eminent domain lays out a similar pattern that can be followed. The existence of risk is not a deterrent to creation, simply an acknowledgement that guides the requirements.

replies(1): >>olddus+j2
3. olddus+j2[view] [source] [discussion] 2023-05-16 19:34:52
>>tomrod+s1
So the person who wants to kill himself and take all humanity along with him is subject to the same risk as everyone else?

Well, that's hardly reassuring. Do you not understand what I'm saying, or do you not care?

replies(1): >>tomrod+94
4. tomrod+94[view] [source] [discussion] 2023-05-16 19:42:10
>>olddus+j2
At this comment level, mostly don't care -- you're asserting that preventing AI from being built, because base people exist, is the preferable course of action, which ignores that the barn is on fire and the horses are already out.

Though there is also an element of your comments being too brief, hence the "mostly". Say, 2% vs 38%.

That accounts for 40% of my introspection about the current state of this discussion. The remaining 60% is simply confidence that your position represents a dominated strategy.

replies(1): >>olddus+Ms
5. olddus+Ms[view] [source] [discussion] 2023-05-16 21:55:15
>>tomrod+94
Ok, so you don't get it. Read "Use of Weapons" and realise that AI is a weapon. That's a good use of your time.