1. a_wild+(OP) 2024-03-01 22:55:23
Self-annihilation fails precisely because of nuclear proliferation, i.e. MAD. So your conclusion is backward.

But that's irrelevant anyway, because nukes are a terrible analogy. If you insist on sci-fi speculation, use an analogy that's at least remotely similar -- perhaps compare the development of AI to that of modern medicine. They're both very general technologies with incredible benefits and serious dangers (e.g. superbugs).

replies(1): >>TeMPOr+zf
2. TeMPOr+zf 2024-03-02 00:55:22
>>a_wild+(OP)
If you insist on a sci-fi analogy, then try the protomolecule from The Expanse. Or a runaway grey goo scenario triggered by a biotech or nanotech accident.

Artificial general intelligence is not a stick you can wield to threaten other countries. It's a process, complex beyond our understanding.
