And it isn't a strong argument, for the same reason it isn't a good argument when used to claim we should allow human cloning and just focus on regulating the more direct causal links: non-clone employment loss from mass-produced hyper-intelligent clones, ensuring clones have legal rights, and having proper oversight and non-clone human accountability.
Maybe those things could all make ethical human cloning viable. But I think the world coming together and being like "holy shit this is happening too fast. Our institutions aren't ready at all nor will they adapt fast enough. Global ban" was the right call.
It is not impossible that a similar call is also appropriate here with AI. I personally dunno what the right call is, but I'm pretty skeptical of any strong claim that it could never be the right call to outright ban some forms of advanced AI research just like we did with some forms of advanced genetic engineering research.
This isn't like banning numbers at all. The fact that blame falls on the corner-cutters doesn't mean the right call is always just to tell them not to cut corners. In some cases the right call is to take away their corner-cutting tool instead.
At least until our institutions can catch up.
But even then, that’s a linear diffusion: one person, one body mod. I guess you could say their descendants would proliferate and multiply, so the alteration slowly grows exponentially over the generations... but the FUD I hear from AI decelerationists is that it would be an explosive diffusion of harms, like, as soon as the day after tomorrow. One architect, up to billions of victims, allegedly. Not that I think precaution with new and mighty technologies is unwise, but what is it that some people are so worried about that they’re willing to ban all research, and choke off all the good that has come from it, already? Maybe it’s just a symptom of the underlying growing mistrust in the social contract...
Totally agree we could be witnessing a growing mistrust in the social contract.