Create a better, standardized, open-source parental control tool that is installed by default on every type of device that can connect to the web.
The internet aspect of the parental control should be a whitelist system rather than a blacklist. The parents should be the ones to decide which domains are whitelisted for their kids, and government bodies could contribute curated lists to help establish a base.
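A minimal sketch of what the lookup behind that could look like, assuming a parent-managed list merged with a government-curated base list (all domain names, list contents, and function names below are hypothetical):

```typescript
// Hypothetical sketch of the core whitelist decision: merge the
// parent-managed list with a curated base list, default-deny otherwise.
const parentWhitelist = new Set(["wikipedia.org", "bbc.co.uk"]);
const curatedBaseList = new Set(["khanacademy.org", "gov.uk"]);

function isAllowed(hostname: string): boolean {
  const merged = new Set([...parentWhitelist, ...curatedBaseList]);
  // Walk up the labels so "en.wikipedia.org" matches "wikipedia.org".
  const labels = hostname.toLowerCase().split(".");
  for (let i = 0; i < labels.length - 1; i++) {
    if (merged.has(labels.slice(i).join("."))) return true;
  }
  return false;
}

console.log(isAllowed("en.wikipedia.org")); // true  (covered by a list)
console.log(isAllowed("example.com"));      // false (default deny)
```

The important design choice is the default-deny: anything not on a list stays blocked, which is what makes it a whitelist rather than a blacklist.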
Yes, there would be some gray-area sites, like search-engine image search or social media sites like Twitter, where you can stumble into pornography. That is why devices with the software turned ON should send a token through the browser saying "Parental Control". It would be easier for websites to implement a blanket block of certain parts of their site than to expect them to build whole ID-check systems, plus the security needed to make sure no leaks occur (look at the TEA app), the way the UK is expecting everyone to do.
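On the website side, honouring such a token could be as simple as one request check rather than a full ID-check system. A minimal sketch, assuming a "Prefer: safe"-style request header (RFC 8674 already defines a "safe" preference along these lines); the paths and port below are made up:

```typescript
import { createServer } from "node:http";

// Hypothetical server-side handling of a device-sent "parental control"
// signal: if the preference is present, blanket-block the adult sections.
const ADULT_SECTIONS = ["/images/nsfw", "/explore/adult"];

const server = createServer((req, res) => {
  const prefer = String(req.headers["prefer"] ?? "").toLowerCase();
  const parentalControl = prefer.includes("safe"); // set by the device-level tool

  if (parentalControl && ADULT_SECTIONS.some((p) => (req.url ?? "").startsWith(p))) {
    res.writeHead(403, { "Content-Type": "text/plain" });
    res.end("This section is hidden by your parental-control settings.");
    return;
  }
  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end("OK");
});

server.listen(8080);
```

The device-level tool would attach the token to every request it lets through, so sites never learn who the user is, only that the preference is set.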
Also, I'm for teenagers (not little children) having access to pornography. I was once a teenager, every adult was, and we know that masturbation is a natural thing, which for most people includes consuming pornography in some way. Repressing their desires and their sexuality, and making this private aspect of their life difficult, isn't the way. Yes, yes, there is nuance to it (very hardcore material, addiction, etc.), but it should be up to the parents to decide, given the tools, whether they trust their kid to consume such a thing.
As for the tool itself: of course we have parental tools, but they can be pretty garbage, they're all different, they're out of the way, and I understand that many people simply don't know how to operate them. That's why I believe creating a standardized open-source project that multiple governments can directly contribute to and advertise to parents is the way, because at the end of the day it should be up to the parents to decide these things, and up to the government to facilitate that choice.
Obviously, besides the internet aspect, the tool should have all the bells and whistles that you'd expect from one, but that's not the topic.
And yes, some children would find a way, just like they're doing now with the currently implemented ID checks. It's not lost on me that VPNs with free plans suddenly saw downloads explode by four-digit percentages. A lot of those are tiny people who are smart enough, or who use an app like a game to trick facial-recognition software.
Also, I'd be remiss not to point out a very obvious fact. This, and I'm not just referring to the UK, isn't about children, it's not about terrorism, it's not about public safety. It's about control, it's about tracking, it's about documenting, it's about power over the masses. I know some people will hand-wave this away, but we have been seeing a very obvious, very fast rise of authoritarianism since COVID and, later, the war in Ukraine. It's not a new trend, but it is one that got accelerated at those points and has been progressively getting worse worldwide.
I'm against it: pornography, as found in search results, is generally quite bad. Sexism, racial stereotyping, misrepresentations of queer issues: and that's just the titles. Page 3 has nothing on porn sites.
Maybe I'm judging a book by its SEO spammers here, but I've not read anything that'd disabuse me of this notion: indeed, people raise concerns about unreasonable body image expectations, normalising extreme sex acts like choking without normalising enthusiastic consent practices, the sites allowing CSAM and "revenge porn" that they've already taken down to be re-uploaded…
That said, I routinely come across nudes / sexualised imagery on the Fediverse, and that's… not an issue? Sometimes I find it a bit squicky (which teaches me not to play lift-the-flap with clearly-marked content warnings – I don't know what I expected), but the only times I've seen something viscerally offensive has been people re-posting from porn aggregation sites. (I've blocked three or four accounts for that, and I don't see it any more.)
If porn sites had the kind of stuff that queer / disabled techies post on main on niche social media sites, then I'd be absolutely fine with teenagers accessing porn. As you say, a safe environment for adolescents to explore their sexuality is unequivocally a good thing. I just don't think commercial porn sites provide that.
This is what concerns me the most about the Online Safety Act. It's shutting down the aforementioned queer / disabled techies on their social media sites, and surely plenty of other pro-social sex communities I don't even know about, but it's not going to do a thing about the large aggregators that are the real problem. It in fact makes the whole problem worse.
I said "if porn sites had the kind of stuff": your paraphrase adds an implication I vehemently disagree with. The impersonal nature of a website (or magazine, or whatever) is important. Children shouldn't be looking at porn on social media sites, because they should have neither social nor parasocial relationships with sex workers qua sex workers (lumping amateurs in with professionals, here): this is a (non-central) special case of "adults should not have sexual relationships with children". We can't ignore the power dynamics.
That's one of the things I think the OSA got right: if you read between the lines, each measure does seem to be motivated by an actual problem, some of which aren't obvious to non-experts like me. I'd love to get access to the NSPCC's recommendations for the OSA, before they got translated into this awful implementation: that'd make it much easier to try to design alternative, more effectual implementations.
Note also, the queer/disabled techies I mentioned? They take pains to ensure that minors do not interact with them in a sexual context: some of them explain why, and others make blanket prohibitions without explanation. It is generally understood that consent and boundaries are respected. And, from what I can tell looking at public social graphs, this works: nobody I know to be a child is interacting with nudes, risqué posts, erotica, or accounts dedicated to that purpose, even if they're otherwise quite close in the social graph. (Maybe I should do a study? But analysing people's social graphs without their consent doesn't feel ethical. Perhaps interviews would be a better approach.)
There is the occasional post from a child (youngest I've observed was 16) complaining about these policies, because they think they don't need protection. That they're complaining, rather than just bypassing the technical barriers (as everybody in my school knew how to do), is perhaps another indication that this approach works.
(I'm a degree of separation away from the communities that post sexy stuff online, so my observations may not be representative of what actually happens. I'm also seeing the situation after moderation, a few minutes delayed due to federation latency: I know that "remove the consequences of a child's foolishness from the public sphere as quickly as possible" is a priority in online moderation, so this selection bias might be quite heavy. Further research is needed.)