This says it so well, acknowledging the work of a misguided bureaucracy.
Looks like it now requires an online community to have its own bureaucracy in place, standing by pre-emptively, ready to interact in new ways with a powerful, long-established, authoritarian government bureaucracy of overwhelming size and increasing overreach.
Measures like this are promulgated in such a way that only large, prosperous outfits can justify keeping their own bureaucracies ready to spring into action full-time, with enough staff to engage the government bureaucracy at its own scale, and to handle concerns that mattered naught before. Especially when there are new open-ended provisions for unpredictable show-stoppers, now fiercely codified to the distinct disadvantage of so many non-bureaucrats simply because they are online.
If you think you are going to be able to rise to the occasion and dutifully establish your own embryonic bureaucracy for the first time to cope with this kind of unstable landscape, you are mistaken.
It was already bad enough before without a newly imposed, bigger moving target than everything else combined :\
Nope, regulations like these only leave room for firms that already have a prominent, well-funded, full-time bureaucracy of their own, long established after growing in response to the less onerous mandates of the past. Anyone else who cannot just take this in stride without batting an eye need not apply.
What do you mean by bureaucracy in this case? Doing the risk assessment?
> 1. Individual accountable for illegal content safety duties and reporting and complaints duties
> 2. Written statements of responsibilities
> 3. Internal monitoring and assurance
> 4. Tracking evidence of new and increasing illegal harm
> 5. Code of conduct regarding protection of users from illegal harm
> 6. Compliance training
> 7. Having a content moderation function to review and assess suspected illegal content
> 8. Having a content moderation function that allows for the swift take down of illegal content
> 9. Setting internal content policies
> 10. Provision of materials to volunteers
> 11. (Probably this because of file attachments; rough sketch after the list) Using hash matching to detect and remove CSAM
> 12. (Probably this, but could implement Google Safe Browsing) Detecting and removing content matching listed CSAM URLs
> ...
> the list goes on.
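For items 11 and 12, here's a minimal sketch of what the purely technical side could look like for a small forum, assuming you could even get access to a vetted hash list and URL list in the first place (the list names and variables below are placeholders, not a real integration):

```python
import hashlib
import re
from urllib.parse import urlsplit

# Hypothetical inputs: in reality the hash list and URL list come from a
# vetted provider (e.g. IWF/NCMEC programmes), and getting access to them
# is itself an application-and-compliance process.
KNOWN_HASHES: set[str] = set()   # hex digests of known illegal files
BLOCKED_URLS: set[str] = set()   # normalised URLs from a supplied block list

URL_RE = re.compile(r"https?://\S+", re.IGNORECASE)


def attachment_matches(data: bytes) -> bool:
    """Exact-match check of an uploaded attachment against the hash list.

    Note: real deployments use perceptual hashing (PhotoDNA-style), which
    survives re-encoding and resizing; a plain SHA-256 only catches
    byte-identical copies. This is only to show the shape of the duty.
    """
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES


def normalise(url: str) -> str:
    """Lowercase scheme and host, drop query/fragment, so lookups are consistent."""
    parts = urlsplit(url)
    return f"{parts.scheme.lower()}://{parts.netloc.lower()}{parts.path}"


def post_contains_blocked_url(text: str) -> bool:
    """Scan a post body for any URL that appears on the supplied block list."""
    return any(normalise(u) in BLOCKED_URLS for u in URL_RE.findall(text))
```

The code is the easy part. The point upthread is that sourcing those lists, documenting the process, training people on it, and evidencing all of it to Ofcom is where the bureaucracy comes in.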