What Reddit did get a lot of negative publicity for were subreddits focused on sharing non-explicit photos of minors, accompanied by loads of sexually charged comments. Nobody would really object to the images themselves in isolation, but the discussions surrounding them were all lewd. So not CSAM, but still creepy and something Reddit rightly decided it didn't want on the site.
Mmkay.
https://en.wikipedia.org/wiki/Twitter_under_Elon_Musk#Child_...
"As of June 2023, an investigation by the Stanford Internet Observatory at Stanford University reported "a lapse in basic enforcement" against child porn by Twitter within "recent months". The number of staff on Twitter's trust and safety teams were reduced, for example, leaving one full-time staffer to handle all child sexual abuse material in the Asia-Pacific region in November 2022."
"In 2024, the company unsuccessfully attempted to avoid the imposition of fines in Australia regarding the government's inquiries about child safety enforcement; X Corp reportedly said they had no obligation to respond to the inquiries since they were addressed to "Twitter Inc", which X Corp argued had "ceased to exist"."