zlacker

[parent] [thread] 9 comments
1. AlecSc+(OP)[view] [source] 2024-02-01 12:14:46
Ease of use and accessibility. Think of how we control access to guns even though a baseball bat could also be used to kill or maim someone.

I agree that we should legislate against the aggressors, that's why I'm pointing out the limitations of technical solutions like watermarks. We need extensions to things like revenge pornography laws, if we're talking about legislation, and I don't see any harm in outlawing services that automate the creation of deepfakes.

Of course the only "solution" is for us to universally get behind teaching young boys that they are not entitled to women's bodies or their sexuality, but so many grown men apparently disagree that I can't see it happening quickly enough.

replies(1): >>spangr+o7
2. spangr+o7[view] [source] 2024-02-01 13:18:08
>>AlecSc+(OP)
I'm one of the grown men who disagree. I don't think treating half the population as pre-criminals, when in reality it's an extremely tiny minority who act in this way, is a particularly good solution. If we were to apply this kind of "solution" to all undesirable behaviours exhibited by deviant minorities of both men and women, I doubt there'd be any time for any actual K-12 formal education.

It's a nice applause line though.

replies(1): >>AlecSc+k9
◧◩
3. AlecSc+k9[view] [source] [discussion] 2024-02-01 13:34:46
>>spangr+o7
So no to technical solutions, no to legislative solutions and no to education. What do you suggest?

Edit: you disagree that men aren't entitled to women's sexuality?

Edit: I misinterpreted what was being disagreed with.

replies(2): >>kj99+6a >>spangr+gd
◧◩◪
4. kj99+6a[view] [source] [discussion] 2024-02-01 13:38:34
>>AlecSc+k9
Presumably enforcement of existing laws.
replies(1): >>AlecSc+4c
◧◩◪◨
5. AlecSc+4c[view] [source] [discussion] 2024-02-01 13:54:34
>>kj99+6a
Which laws do you mean and where do they apply?

This article talks a bit about the lack of legal power to fight against deepfakes: https://mcolaw.com/theres-not-much-we-can-legally-do-about-d...

replies(1): >>kj99+2C
◧◩◪
6. spangr+gd[view] [source] [discussion] 2024-02-01 14:02:19
>>AlecSc+k9
I suggest we clearly specify the class of problem we're trying to solve and come up with a principled solution that would make sense when applied consistently and universally to all problems in that class. I prefer this over coming up with knee-jerk moral panic patches (e.g. "censor generative models so they can't generate this particular thing I find distasteful") or with overly abstract and tangential solutions (e.g. "just teach men not to be big jerks").

I think the central issue here is: what restrictions, if any, should be placed around creating and distributing a likeness of another person? Are we just looking to prohibit pornographic likenesses, or do you think the restrictions should be broader? What's the threshold rule you would apply? Should these rules be medium-specific, or should we also prohibit people from, say, painting a likeness of another person without their consent?

I guess in a US context you'd also have to consider whether it's constitutional to restrict freedom of expression, even the distasteful ones, in this manner.

Edit: Just saw your edit suggesting that I think "men are entitled to women's bodies" (whatever that means). I think I'll end my participation here, not interested in having a bad faith discussion.

replies(1): >>AlecSc+Xe
◧◩◪◨
7. AlecSc+Xe[view] [source] [discussion] 2024-02-01 14:11:14
>>spangr+gd
I'm not in the US but I understand that there are already laws which limit your expression, such as in cases of CSAM or revenge pornography, which might be the closest analogue.

Personally, the limits I'd draw are similar to those, as I'm most interested in fighting sexual harassment. Legislation against revenge pornography already tackles the questions of what constitutes pornography and when it becomes illegal to disseminate pornographic images of others, so it's not an intractable problem.

Indeed, we also have precedents for limiting the use of tools for certain purposes. Using deepfake technology to generate images akin to CSAM would already be illegal in the UK, but other broader and everyday examples exist like speed limits for cars.

Edit to respond to yours: I said above that we should teach boys they're not entitled to women's sexuality, but that many men disagree. You said you were one of them. I meant that the disagreement was with the entitlement itself, but I now see you may have taken it to mean disagreement with the education about entitlement. It was a misunderstanding, and I was responding in good faith. I didn't suggest anything about you; I asked whether my interpretation of your response was correct.

replies(1): >>spangr+dk2
◧◩◪◨⬒
8. kj99+2C[view] [source] [discussion] 2024-02-01 16:12:13
>>AlecSc+4c
Good information. Then the solution would be to improve harassment legislation rather than limiting the availability of tools. Just as assault is illegal but we don’t require all hammers to be made out of foam.
replies(1): >>AlecSc+4K
◧◩◪◨⬒⬓
9. AlecSc+4K[view] [source] [discussion] 2024-02-01 16:50:07
>>kj99+2C
Indeed.
◧◩◪◨⬒
10. spangr+dk2[view] [source] [discussion] 2024-02-02 02:11:53
>>AlecSc+Xe
Fair enough on your edit, I accept it was a misinterpretation and appreciate the clarification.

The precedents you raise are worth considering. They're related but not completely analogous to deepfake porn of real people in my view. CSAM is criminalised due to the direct harm its production inflicts on minors and the deep injury to society that follows. Deepfake CSAM, I presume, has more of an 'obscenity' rationale as there is no actual direct harm inflicted on minors in that case. I suppose you could have a similar obscenity rationale for criminalising deepfake porn but you would then have to accept that pornography in general should be outlawed. An obscenity rationale would also be more supportive of criminal sanctions, as acts of obscenity injure society in addition to individual subjects.

I think revenge pornography is the best analogy here. I assume the policy rationale / theory of harm for criminalising 'revenge porn' (i.e. distributing true intimate private images of another person) is one of two things: (1) violation of the subject's privacy or (2) infliction of psychological harm on the subject. If the policy rationale is (1) then I don't think there's a sound analogy to deepfake porn - deepfakes are fictional and so do not violate the privacy of the subject.

If the rationale is (2), psychological harm, then I could see a similar policy rationale for legislating against deepfake porn. But if psychological harm is your policy rationale then wouldn't it make more sense to directly criminalise the infliction of psychological harm on others regardless of the method used? If we were regulating on a principled and universal basis we should pass a law that criminalises any act, publication or utterance that inflicts psychological harm on another person, rather than using the law to solve single instances of this class of offences. Although I'd strongly disagree with such a law due to the chilling effect it would have on all forms of speech, expression and public commentary I think there's at least a principled argument to be had.

But if you legislate on this principle then you have to grapple with the far-reaching implications of such a law - if someone writes some smutty (but fictional) erotic story about me that I find psychologically distressing, should they then be thrown in jail? What if they say hurtful things to me that I find psychologically harmful? What if they insult a religion or political candidate, party or ideology that I strongly identify with? We all inflict psychological harm on others from time to time - what should the minimum harm threshold be?

Personally, I don't think the criminal law is the answer in either the deepfake or revenge porn cases if the rationale is 'psychological harm'. Although I'm not sure where I stand on the following, I think a civil tort for infliction of psychological harm would be the sanest option if we feel the need to regulate against infliction of psychological harm. It would be analogous to defamation and libel torts, but instead of having to prove economic harm the plaintiff would have to prove some minimum threshold level of psychological harm to become entitled to compensation from the defendant in proportion to the actual provable injury sustained.

My thoughts aside, what is your general theory of harm / principled policy rationale here and, on that basis, how do you think the state should regulate?
