I agree that we should legislate against the aggressors, that's why I'm pointing out the limitations of technical solutions like watermarks. We need extensions to things like revenge pornography laws, if we're talking about legislation, and I don't see any harm in outlawing services that automate the creation of deepfakes.
Of course the only "solution" is that we would universally get behind teaching young boys that they are not entitled to women's bodies or their sexuality, but so many grown men apparently disagree that I can't see it happening quickly enough.
It's a nice applause line though.
Edit: you disagree that men aren't entitled to women's sexuality?
Edit: I misinterpreted what was being disagreed with.
This article talks a bit about the lack of legal power to fight against deepfakes: https://mcolaw.com/theres-not-much-we-can-legally-do-about-d...
I think the central issue here is: what restrictions, if any, should be placed around creating and distributing a likeness of another person? Are we just looking to prohibit pornographic likenesses, or do you think the restrictions should be broader? What's the threshold rule you would apply? Should these rules be medium-specific, or should we also prohibit people from, say, painting a likeness of another person without their consent?
I guess in a US context you'd also have to consider whether it's constitutional to restrict freedom of expression, even the distasteful ones, in this manner.
Edit: Just saw your edit suggesting that I think "men are entitled to women's bodies" (whatever that means). I think I'll end my participation here, not interested in having a bad faith discussion.
Personally, the limits I'd draw are similar to that, as I'm most interested in fighting sexual harassment. Legislation against revenge pornography already confronts and tackles the questions of what constitutes pornography and when disseminating pornographic images of others becomes illegal, so it's not an intractable problem.
Indeed, we also have precedents for limiting the use of tools for certain purposes. Using deepfake technology to generate images akin to CSAM would already be illegal in the UK, but other broader and everyday examples exist like speed limits for cars.
Edit to respond to yours: I said above that we should teach boys they're not entitled to women's sexuality, but that many men disagree. You said you were one of them. I meant that the disagreement was with the entitlement itself, but I'm now considering that you took it to mean disagreement with teaching boys about entitlement. It was a misunderstanding, and I was responding in good faith. I didn't suggest anything about you; I asked whether my interpretation of your response was correct.
The precedents you raise are worth considering. They're related but not completely analogous to deepfake porn of real people in my view. CSAM is criminalised due to the direct harm its production inflicts on minors and the deep injury to society that follows. Deepfake CSAM, I presume, has more of an 'obscenity' rationale as there is no actual direct harm inflicted on minors in that case. I suppose you could have a similar obscenity rationale for criminalising deepfake porn but you would then have to accept that pornography in general should be outlawed. An obscenity rationale would also be more supportive of criminal sanctions, as acts of obscenity injure society in addition to individual subjects.
I think revenge pornography is the best analogy here. I assume the policy rationale / theory of harm for criminalising 'revenge porn' (i.e. distributing true intimate private images of another person) is one of two things: (1) violation of the subject's privacy or (2) infliction of psychological harm on the subject. If the policy rationale is (1), then I don't think there's a sound analogy to deepfake porn - deepfakes are fictional and so do not violate the privacy of the subject.
If the rationale is (2), psychological harm, then I could see a similar policy rationale for legislating against deepfake porn. But if psychological harm is your policy rationale, then wouldn't it make more sense to directly criminalise the infliction of psychological harm on others regardless of the method used? If we were regulating on a principled and universal basis, we should pass a law that criminalises any act, publication or utterance that inflicts psychological harm on another person, rather than using the law to address single instances of this class of offences. Although I'd strongly oppose such a law due to the chilling effect it would have on all forms of speech, expression and public commentary, I think there's at least a principled argument to be had.
But if you legislate on this principle then you have to grapple with its far-reaching implications. If someone writes a smutty (but fictional) erotic story about me that I find psychologically distressing, should they be thrown in jail? What if they say hurtful things to me that I find psychologically harmful? What if they insult a religion or political candidate, party or ideology that I strongly identify with? We all inflict psychological harm on others from time to time - what should the minimum harm threshold be?
Personally, I don't think the criminal law is the answer in either the deepfake or revenge porn cases if the rationale is 'psychological harm'. Although I'm not sure where I stand on the following, I think a civil tort for infliction of psychological harm would be the sanest option if we feel the need to regulate against infliction of psychological harm. It would be analogous to defamation and libel torts, but instead of having to prove economic harm the plaintiff would have to prove some minimum threshold level of psychological harm to become entitled to compensation from the defendant in proportion to the actual provable injury sustained.
My thoughts aside, what is your general theory of harm / principled policy rationale here and, on that basis, what do you think the state's regulatory response should be?