zlacker

[parent] [thread] 27 comments
1. api+(OP)[view] [source] 2025-07-03 18:33:35
True, but I'm also not convinced that a ten-year-old being able to come face to face with hardcore BDSM and incest fetish porn within 40 seconds of opening a web browser is healthy.

I don't like this, but I don't have a solution other than the porn industry self-policing, which isn't promising.

replies(7): >>rvnx+32 >>mystif+p2 >>add-su+X7 >>Spivak+9a >>djoldm+db >>burnt-+1y >>csomar+AT
2. rvnx+32[view] [source] 2025-07-03 18:48:32
>>api+(OP)
Now take an intentionally extreme opposite (as a thought experiment): if we imposed the death penalty on people who participate in distributing or relaying such content, could all of that be solved without the "internet pass" and the IDing of your internet history?
replies(3): >>treyd+x2 >>api+i4 >>wbl+bk
3. mystif+p2[view] [source] 2025-07-03 18:50:57
>>api+(OP)
Well, you don't have another solution. That doesn't immediately mean that the one presented in the post is the correct one. Far from it.
replies(1): >>Matteo+k3
4. treyd+x2[view] [source] [discussion] 2025-07-03 18:51:57
>>rvnx+32
Maybe, but even this is broken with the internet being international. You'd need a system much more advanced than even the GFW.
replies(1): >>rvnx+T2
5. rvnx+T2[view] [source] [discussion] 2025-07-03 18:55:17
>>treyd+x2
Somehow this works when dealing with pedophilic content, so the tech already exists.

For example, on Discord, all your messages are scanned for such content. The same is true on Cloudflare (and has been for over 5 years).

For now it means they have no interest in removing such content unless coerced or pressured by public opinion.

This would destroy all the content, though, not just access for minors.

Absurd, but it works in North Korea (death penalty), Iran (death penalty), and China (10-year prison terms), and it also protects victims from rape, or "rape" under financial pressure.

The alternative is to leave it to parents to install a web filter for their kids, and let everyone else live freely on the internet, without sharing their history or IDing them.

In reality, TikTok also has really traumatizing content, yet it engages tons of kids and teenagers; IDing won't solve that, but good parents can.

replies(1): >>treyd+z3
◧◩
6. Matteo+k3[view] [source] [discussion] 2025-07-03 18:58:35
>>mystif+p2
The post does not present a solution to that problem. Governments around the world, especially in Europe, have legislated the solution, and the solution they have picked is a privacy nightmare. This post solves the privacy problem, which is strictly better than the status quo. We (Google) do not decide what should or should not be regulated.
7. treyd+z3[view] [source] [discussion] 2025-07-03 19:01:09
>>rvnx+T2
I agree that it does work, but the parameters there are different in ways that make it worth the tradeoff of policing it that strongly, like the size of the audience and the much more severe real harm caused by its production and distribution.
replies(2): >>rvnx+A4 >>trollb+gc
8. api+i4[view] [source] [discussion] 2025-07-03 19:05:08
>>rvnx+32
Adults should be allowed to look at porn. I don't think it's necessarily good for people, but adults are also allowed to binge drink and smoke and eat ultra-processed foods and a lot of other things that are worse for you than porn.

CP is an edge case, but that's because it's almost impossible to make CP without abusing children, and you could view CP as an incitement to violence -- as an incitement to abuse children.

Parents should ultimately monitor what their kids do. I have a pi-hole that subscribes to lists with millions of porn domains, but I'm a technical person. Non-technical parents are helpless, and kids can easily access it at friends' houses etc. The industry has not empowered non-technical parents to do this, probably because there's a conflict of interest. Lots of parents would use such options to keep kids off social media, and like all addictive things social media wants to hook them early. (I think kids should be off social media too, but it's not quite as nuts as letting them watch fetish porn.)

Porn is different now too. It's worse in a way. Like everything else, it's subject to pressure to get "edgier" to maximize engagement. So today's porn is loaded with simulated incest, simulated rape, extreme BDSM, etc., things that young children are not equipped to properly contextualize. (Some adults aren't either, but at least with adults you can say it's their fault, not the porn's fault. The line cuts differently with children, which is why children can't smoke, get tattoos, buy alcohol, get credit cards, etc.) If you want to see the consequence of young kids (mostly boys) being raised with unfettered porn access, go visit any women-coded space on the Internet (like Reddit) and search for threads discussing why so many men want to choke their girlfriends. Where did this sudden choking fetish come from?

replies(2): >>rvnx+S4 >>Spivak+Pp
9. rvnx+A4[view] [source] [discussion] 2025-07-03 19:06:22
>>treyd+z3
I genuinely don't know what to think on this :|

I just pushed this idea as a "solution" to see what others think, but I don't know. Again, perhaps it comes down to educating parents about how to teach kids about the dangers of the internet, and perhaps a web filter for kids.

This is actually one place where AI could be useful: doing dynamic local content classification (instead of relying on a blocklist), especially if integrated directly into Android / iPhone.

Like https://support.apple.com/en-us/105121 but more dynamic.

10. rvnx+S4[view] [source] [discussion] 2025-07-03 19:08:11
>>api+i4
I agree with you; in the end, I think it could work if we promote better local solutions (e.g. better tooling on the iPhone), rather than having the server authenticate the user.

Perhaps find a way to force Windows / Android / iOS to include such a "firewall"/web filter by default.

11. add-su+X7[view] [source] 2025-07-03 19:33:00
>>api+(OP)
Teen pregnancy rates are down since the mass adoption of the internet. A kid learning a few years early that sexualities other than the default one exist will affect them much less than losing internet privacy and anonymity for life.
12. Spivak+9a[view] [source] 2025-07-03 19:48:06
>>api+(OP)
What web browser are you using?! I think this says more about you than about the internet if this is what you're seeing.
13. djoldm+db[view] [source] 2025-07-03 19:55:25
>>api+(OP)
For kids with a guardian, the answer is enabling and empowering the guardian to control what the child can access.

Somehow we've inappropriately shifted responsibility away from parents/guardians in some areas like internet access.

In other areas, like letting your kid go outside by themselves, we've criminalized reasonable caregiver actions.

It's a wild world.

replies(3): >>trollb+3c >>koalam+Dc >>ranger+OD
14. trollb+3c[view] [source] [discussion] 2025-07-03 20:02:23
>>djoldm+db
Isn’t that the same argument as “Parents should keep kids away from cigarettes” by tobacco companies who were simultaneously marketing to children?

And parents aren’t in control of children 24/7. Schools tend to provide tablets and laptops everywhere, and how much trust should parents have that things like a content filter are adequate to keep children from accessing objectionable pornography, hate sites teaching misogyny, and so forth?

replies(1): >>djoldm+qd
15. trollb+gc[view] [source] [discussion] 2025-07-03 20:04:03
>>treyd+z3
I think it’s pretty damned important that my 8 year old son doesn’t run across Andrew Tate or similar stuff.
16. koalam+Dc[view] [source] [discussion] 2025-07-03 20:06:50
>>djoldm+db
Another way of looking at it is that when you put the responsibility of protecting a child from harmful content on the parent, you're deciding to only protect the children with the right kind of parent.
replies(3): >>djoldm+Ad >>djrj47+5H >>derang+mF3
17. djoldm+qd[view] [source] [discussion] 2025-07-03 20:13:14
>>trollb+3c
> Isn’t that the same argument as “Parents should keep kids away from cigarettes” by tobacco companies who were simultaneously marketing to children?

I think most would agree that there's a significant difference between a physical product that shortens the lifespan of virtually all humans who use it, and looking at images and video, no matter how extreme.

> And parents aren’t in control of children 24/7. Schools tend to provide tablets and laptops everywhere, and how much trust should parents have that things like a content filter are adequate to keep children from accessing objectionable pornography, hate sites teaching misogyny, and so forth?

Agreed.

Parents and guardians should definitely be aware of and concerned about what internet filters are in place at schools.

replies(2): >>_w1tm+gg >>CJeffe+tN
18. djoldm+Ad[view] [source] [discussion] 2025-07-03 20:14:43
>>koalam+Dc
What's the right kind of parent?
19. _w1tm+gg[view] [source] [discussion] 2025-07-03 20:36:06
>>djoldm+qd
> Parents and guardians should definitely be aware of and concerned about what internet filters are in place at schools.

Neither of the words you used gives parents any control over the situation. Legislation is the circumspect way parents are exerting control over websites that are unable to police themselves.

replies(1): >>djoldm+ti
20. djoldm+ti[view] [source] [discussion] 2025-07-03 20:52:45
>>_w1tm+gg
Fair enough. Sounds like legislation may be a good way to enforce internet filtering on school computers.

Schools have traditionally been ground zero for culture war in the USA, so this fits.

21. wbl+bk[view] [source] [discussion] 2025-07-03 21:05:37
>>rvnx+32
You mean like the SF city government? This is stuff that a lot of people enjoy doing and taking photos of. The headquarters of a lot of startups are in what used to be the leather neighborhood.
22. Spivak+Pp[view] [source] [discussion] 2025-07-03 22:02:26
>>api+i4
Reddit being considered a space for women is the funniest take I've heard in a while. But regardless, you didn't adequately take into account that being choked is one of the top sexual fantasies of women. Whatever explanation you put forth also has to explain why it's highly desirable to be on the receiving end.

The "porn has been giving men violent sexual fantasies" line has existed since before I was born, but it always ignores that these are the top fantasies among women too. Among my friend group the more common refrain is women who want to be choked but whose boyfriends are uncomfortable doing it.

23. burnt-+1y[view] [source] 2025-07-03 23:44:34
>>api+(OP)
This is a parenting problem, not a "technology and everyone else" problem.
24. ranger+OD[view] [source] [discussion] 2025-07-04 01:22:53
>>djoldm+db
> reasonable

I think the real issue is that the definition of "reasonable" is subjective and often changes with time/culture/people in charge at the moment.

25. djrj47+5H[view] [source] [discussion] 2025-07-04 02:17:36
>>koalam+Dc
I'm fine with that. I'd rather parents make "bad" decisions about protecting their own children than have the government force its own opinions on them.
26. CJeffe+tN[view] [source] [discussion] 2025-07-04 03:52:09
>>djoldm+qd
I do agree there is a significant difference. The images and video are much worse -- one particularly bad video can scar people for months, even years; one cigarette isn't that bad.
27. csomar+AT[view] [source] 2025-07-04 05:29:20
>>api+(OP)
The parents bear the responsibility. Don't baby-proof the Internet, the same way we don't baby-proof the streets, subways, or anything else.
28. derang+mF3[view] [source] [discussion] 2025-07-05 11:40:23
>>koalam+Dc
Is the “right kind of parent” here synonymous with those who regulate what their children see online?