zlacker

X offices raided in France as UK opens fresh investigation into Grok

submitted by vikave+(OP) on 2026-02-03 10:08:52 | 586 points 627 comments
[view article] [source] [links] [go to bottom]
replies(36): >>pogue+N1 >>robthe+32 >>vessen+ue >>pu_pe+Rf >>afavou+kh >>techbl+8k >>Altern+ut >>stickf+gv1 >>verdve+Kv1 >>r721+zL1 >>scotty+JL1 >>ta9000+hU1 >>kalter+tX1 >>tomloc+i12 >>TZubir+W52 >>tehjok+3f2 >>ChrisM+9m2 >>sleepy+jr2 >>jongjo+Is2 >>lukasm+oC2 >>justab+HP2 >>darepu+UU2 >>isodev+4f3 >>miki12+dl3 >>utopia+mm3 >>Animat+3E3 >>mnewme+pE3 >>domini+5Q3 >>tokai+524 >>krautb+W94 >>patric+sh4 >>voidUp+kk4 >>code_f+ar4 >>wnevet+Qu4 >>devwas+Vu4 >>Havoc+7c8
1. pogue+N1[view] [source] 2026-02-03 10:21:10
>>vikave+(OP)
Finally, someone is taking action against the CSAM machine operating seemingly without penalty.
replies(1): >>tjpnz+oW2
2. robthe+32[view] [source] 2026-02-03 10:23:06
>>vikave+(OP)
> The prosecutor's office also said it was leaving X and would communicate on LinkedIn and Instagram from now on.

I mean, perhaps it's time to completely drop these US-owned, closed-source, algo-driven controversial platforms, and start treating the communication with the public that funds your existence in different terms. The goal should be to reach as many people, of course, but also to ensure that the method and medium of communication is in the interest of the public at large.

replies(5): >>spacec+6e >>valar_+fl >>Mordis+Yp >>noneth+BI >>morkal+6F1
◧◩
3. spacec+6e[view] [source] [discussion] 2026-02-03 11:56:42
>>robthe+32
This. What a joke. I'm still waiting on my tax refund from NYC for plastering "twitter" stickers on every publicly funded vehicle.
4. vessen+ue[view] [source] 2026-02-03 11:59:58
>>vikave+(OP)
Interesting. This is basically the second enforcement on speech / images that France has done - first was Pavel Durov @ Telegram. He eventually made changes in Telegram's moderation infrastructure and I think was allowed to leave France sometime last year.

I don't love heavy-handed enforcement on speech issues, but I do really like a heterogenous cultural situation, so I think it's interesting and probably to the overall good to have a country pushing on these matters very hard, just as a matter of keeping a diverse set of global standards, something that adds cultural resilience for humanity.

linkedin is not a replacement for twitter, though. I'm curious if they'll come back post-settlement.

replies(5): >>derrid+kf >>btreec+4h >>tokai+Ll >>logicc+0m >>StopDi+dm
◧◩
5. derrid+kf[view] [source] [discussion] 2026-02-03 12:07:11
>>vessen+ue
I wouldn't equate the two.

There's someone who was being held responsible for what was in encrypted chats.

Then there's someone who published depictions of sexual abuse of minors.

Worlds apart.

replies(1): >>direwo+iE2
6. pu_pe+Rf[view] [source] 2026-02-03 12:10:09
>>vikave+(OP)
I suppose those are the offices from SpaceX now that they merged.
replies(1): >>omnimu+0k
◧◩
7. btreec+4h[view] [source] [discussion] 2026-02-03 12:17:52
>>vessen+ue
>but I do really like a heterogenous cultural situation

Why isn't that a major red flag exactly?

replies(1): >>vessen+jP
8. afavou+kh[view] [source] 2026-02-03 12:19:52
>>vikave+(OP)
I’m sure Musk is going to say this is about free speech in an attempt to gin up his supporters. It isn’t. It’s about generating and distributing non consensual sexual imagery, including of minors. And, when notified, doing nothing about it. If anything it should be an embarrassment that France are the only ones doing this.

(it’ll be interesting to see if this discussion is allowed on HN. Almost every other discussion on this topic has been flagged…)

replies(2): >>cbeach+Vh >>rsynno+cq
◧◩
9. cbeach+Vh[view] [source] [discussion] 2026-02-03 12:22:54
>>afavou+kh
> when notified, doing nothing about it

When notified, he immediately:

  * "implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing" - https://www.bbc.co.uk/news/articles/ce8gz8g2qnlo 

  * locked image generation down to paid accounts only (i.e. those individuals that can be identified via their payment details).
Have the other AI companies followed suit? They were also allowing users to undress real people, but it seems the media is ignoring that and focussing their ire only on Musk's companies...
replies(3): >>derrid+ej >>afavou+kj >>freeja+KG1
◧◩◪
10. derrid+ej[view] [source] [discussion] 2026-02-03 12:29:02
>>cbeach+Vh
The other LLMs probably don't have the training data in the first place.
replies(1): >>chrisj+rT
◧◩◪
11. afavou+kj[view] [source] [discussion] 2026-02-03 12:29:34
>>cbeach+Vh
You and I must have different definitions of the word “immediately”. The article you posted is from January 15th. Here is a story from January 2nd:

https://www.bbc.com/news/articles/c98p1r4e6m8o

> Have the other AI companies followed suit? They were also allowing users to undress real people

No they weren’t? There were numerous examples of people feeding the same prompts to different AIs and having their requests refused. Not to mention, X was also publicly distributing that material, something other AI companies were not doing. Which is an entirely different legal liability.

replies(2): >>boness+Gl >>chrisj+SQ
◧◩
12. omnimu+0k[view] [source] [discussion] 2026-02-03 12:34:13
>>pu_pe+Rf
So France is raiding the offices of a US military contractor?
replies(4): >>mkjs+ym >>fanati+Fz >>herman+DA >>gitaar+VP3
13. techbl+8k[view] [source] 2026-02-03 12:34:51
>>vikave+(OP)
I'm not saying I'm entirely against this, but just out of curiosity, what do they hope to find in a raid of the French offices, a folder labeled "Grok's CSAM Plan"?
replies(10): >>afavou+0l >>Mordis+Um >>rsynno+4p >>moolco+Zu >>reaper+Gy1 >>arppac+vX1 >>direwo+k12 >>pjc50+oL3 >>bluega+c64 >>kmeist+5L4
◧◩
14. afavou+0l[view] [source] [discussion] 2026-02-03 12:41:08
>>techbl+8k
It was known that Grok was generating these images long before any action was taken. I imagine they’ll be looking for internal communications on what they were doing, or deciding not to do, during that time.
◧◩
15. valar_+fl[view] [source] [discussion] 2026-02-03 12:43:20
>>robthe+32
>The goal should be to reach as many people, of course, but also to ensure that the method and medium of communication is in the interest of the public at large.

Who decides what communication is in the interest of the public at large? The Trump administration?

replies(1): >>robthe+WC
◧◩◪◨
16. boness+Gl[view] [source] [discussion] 2026-02-03 12:46:31
>>afavou+kj
The part of X’s reaction to their own publishing that I’m most looking forward to seeing in slow motion in the courts and press is their attempt at agency laundering by having their LLM generate an apology in the first person.

Sorry I broke the law. Oops for reals tho.

◧◩
17. tokai+Ll[view] [source] [discussion] 2026-02-03 12:46:58
>>vessen+ue
In what world is generating CSAM a speech issue? It's really doing a disservice to actual free speech issues to frame it as such.
replies(2): >>logicc+km >>direwo+UD2
◧◩
18. logicc+0m[view] [source] [discussion] 2026-02-03 12:49:16
>>vessen+ue
>but I do really like a heterogenous cultural situation, so I think it's interesting and probably to the overall good to have a country pushing on these matters very hard

Censorship increases homogeneity, because it reduces the amount of ideas and opinions that are allowed to be expressed. The only resilience that comes from restricting people's speech is resilience of the people in power.

replies(3): >>Aureli+4B >>moolco+IC >>vessen+BR
◧◩
19. StopDi+dm[view] [source] [discussion] 2026-02-03 12:50:31
>>vessen+ue
Very different charges however.

Durov was held on suspicion Telegram was willingly failing to moderate its platform and allowed drug trafficking and other illegal activities to take place.

X has allegedly illegally sent data to the US in violation of GDPR and contributed to child porn distribution.

Note that both are directly related to violations of data safety law or association with separate criminal activities; neither is about speech.

replies(1): >>vessen+NQ
◧◩◪
20. logicc+km[view] [source] [discussion] 2026-02-03 12:51:30
>>tokai+Ll
The point of banning real CSAM is to stop the production of it, because the production is inherently harmful. The production of AI or human generated CSAM-like images does not inherently require the harm of children, so it's fundamentally a different consideration. That's why some countries, notably Japan, allow the production of hand-drawn material that in the US would be considered CSAM.
replies(3): >>tokai+Am >>cwillu+Fu >>chrisj+401
◧◩◪
21. mkjs+ym[view] [source] [discussion] 2026-02-03 12:53:25
>>omnimu+0k
How is that relevant? Are you implying that being a US military contractor should make you immune to the laws of other countries that you operate in?

The onus is on the contractor to make sure any classified information is kept securely. If by raiding an office in France a bunch of US military secrets are found, it would suggest the company is not fit to have those kind of contracts.

◧◩◪◨
22. tokai+Am[view] [source] [discussion] 2026-02-03 12:53:43
>>logicc+km
That's not what we are discussing here. Even less when a lot of the material here is edits of real pictures.
◧◩
23. Mordis+Um[view] [source] [discussion] 2026-02-03 12:56:12
>>techbl+8k
What do they hope to find, specifically? Who knows, but maybe the prosecutors have a better awareness of specifics than us HN commenters who have not been involved in the investigation.

What may they find, hypothetically? Who knows, but maybe an internal email saying, for instance, 'Management says keep the nude photo functionality, just hide it behind a feature flag', or maybe 'Great idea to keep a backup of the images, but must cover our tracks', or perhaps 'Elon says no action on Grok nude images, we are officially unaware anything is happening.'

replies(1): >>cwillu+Ft
◧◩
24. rsynno+4p[view] [source] [discussion] 2026-02-03 13:12:43
>>techbl+8k
> what do they hope to find in a raid of the french offices, a folder labeled "Grok's CSAM Plan"?

You would be _amazed_ at the things that people commit to email and similar.

Here's a Facebook one (leaked, not extracted by authorities): https://www.reuters.com/investigates/special-report/meta-ai-...

replies(1): >>plopil+8v4
◧◩
25. Mordis+Yp[view] [source] [discussion] 2026-02-03 13:18:41
>>robthe+32
I agree with you. In my opinion it was already bad enough that official institutions were using Twitter as a communication platform before it belonged to Musk and started to restrict visibility to non-logged in users, but at least Twitter was arguably a mostly open communication platform and could be misunderstood as a public service in the minds of the less well-informed. However, deciding to "communicate" at this day and age on LinkedIn and Instagram, neither of which ever made a passing attempt to pretend to be a public communications service, boggles the mind.
replies(1): >>chrisj+2N
◧◩
26. rsynno+cq[view] [source] [discussion] 2026-02-03 13:20:03
>>afavou+kh
> If anything it should be an embarrassment that France are the only ones doing this.

As mentioned in the article, the UK's ICO and the EC are also investigating.

France is notably keen on raids for this sort of thing, and a lot of things that would be basically a desk investigation in other countries result in a raid in France.

replies(1): >>chrisj+cO
27. Altern+ut[view] [source] 2026-02-03 13:39:21
>>vikave+(OP)
> Prosecutors say they are now investigating whether X has broken the law across multiple areas.

This step could come before a police raid.

This looks like plain political pressure. No lives were saved, and no crime was prevented by harassing local workers.

replies(8): >>moolco+Du >>aaomid+hw >>orwin+Lj1 >>bawolf+Ec3 >>gianca+Do3 >>emsign+jx3 >>317070+qy3 >>bluega+X54
◧◩◪
28. cwillu+Ft[view] [source] [discussion] 2026-02-03 13:40:35
>>Mordis+Um
Or “regulators don't understand the technology; short of turning it off entirely, there's nothing we can do to prevent it entirely, and the costs involved in attempting to reduce it are much greater than the likely fine, especially given that we're likely to receive such a fine anyway.”
replies(2): >>pirate+py >>bawolf+Bd3
◧◩
29. moolco+Du[view] [source] [discussion] 2026-02-03 13:45:38
>>Altern+ut
> This looks like plain political pressure. No lives were saved, and no crime was prevented by harassing local workers.

The company made and released a tool with seemingly no guard-rails, which was used en masse to generate deepfakes and child pornography.

replies(4): >>trhway+Ve3 >>pdpi+cm3 >>cubefo+3o3 >>ljspra+HM3
◧◩◪◨
30. cwillu+Fu[view] [source] [discussion] 2026-02-03 13:45:43
>>logicc+km
If libeling real people is a harm to those people, then altering photos of real children is certainly also a harm to those children.
replies(1): >>whamla+lJ
◧◩
31. moolco+Zu[view] [source] [discussion] 2026-02-03 13:48:00
>>techbl+8k
Moderation rules? Training data? Abuse metrics? Identities of users who generated or accessed CSAM?
replies(1): >>bryan_+7Q1
◧◩
32. aaomid+hw[view] [source] [discussion] 2026-02-03 13:55:16
>>Altern+ut
Lmao they literally made a broadly accessible CSAM maker.
◧◩◪◨
33. pirate+py[view] [source] [discussion] 2026-02-03 14:05:40
>>cwillu+Ft
They could shut it off out of a sense of decency and respect, wtf kind of defense is this?
replies(1): >>cwillu+1h1
◧◩◪
34. fanati+Fz[view] [source] [discussion] 2026-02-03 14:12:49
>>omnimu+0k
Even if it is, being affiliated with the US military doesn't make you immune to local laws.

https://www.the-independent.com/news/world/americas/crime/us...

◧◩◪
35. herman+DA[view] [source] [discussion] 2026-02-03 14:17:38
>>omnimu+0k
I know it's hard to grasp for you, but in France, French laws and jurisdiction apply, not those of the United States.
replies(2): >>watwut+HE3 >>omnimu+Am6
◧◩◪
36. Aureli+4B[view] [source] [discussion] 2026-02-03 14:20:15
>>logicc+0m
This is precisely the point of the comment you are replying to: a balance has to be found and enforced.
◧◩◪
37. moolco+IC[view] [source] [discussion] 2026-02-03 14:29:22
>>logicc+0m
I really don't see reasonable enforcement of CSAM laws as a restriction on "diversity of thought".
◧◩◪
38. robthe+WC[view] [source] [discussion] 2026-02-03 14:30:26
>>valar_+fl
You appear to have posted a bit of a loaded question here, apologies if I'm misinterpreting your comment. It is, of course, the public that should decide what communication is of public interest, at least in a democracy operating optimally.

I suppose the answer, if we're serious about it, is somewhat more nuanced.

To begin, public administrations should not get to unilaterally define "the public interest" in their communication, nor should private platforms for that matter. Assuming we're still talking about a democracy, the decision-making should happen democratically via a combination of law + rights + accountable institutions + public scrutiny, with implementation constraints that maximise reach, accessibility, auditability, and independence from private gatekeepers. The last bit is rather relevant, because the private sector's interests and the citizen's interests are nearly always at odds in any modern society, hence the state's roles as rule-setter (via democratic processes) and arbiter. Happy to get into further detail regarding the actual processes involved, if you're genuinely interested.

That aside - there are two separate problems that often get conflated when we talk about these platforms:

- one is reach: people are on Twitter, LinkedIn, Instagram, so publishing there increases distribution; public institutions should be interested in reaching as many citizens as possible with their comms;

- the other one is dependency: if those become the primary or exclusive channels, the state's relationship with citizens becomes contingent on private moderation, ranking algorithms, account lockouts, paywalls, data extraction, and opaque rule changes. That is entirely and dangerously misaligned with democratic accountability.

A potential middle position could be to use commercial social platforms as secondary distribution instead of the authoritative channel, which in reality is often the case. However, due to the way societies work and how individuals operate within them, the public won't actually come across the information until it's distributed on the most popular platforms. Which is why some argue that they should be treated as public utilities, since dominant communications infrastructure has a quasi-public function (rest assured, I won't open that can of worms right now).

Politics is messy in practice, as all balancing acts are - a normal price to pay for any democratic society, I'd say. Mix that with technology, social psychology and philosophies of liberty, rights, and wellbeing, and you have a proper head-scratcher on your hands. We've already done a lot to balance these, for sure, but we're not there yet and it's a dynamic, developing field that presents new challenges.

replies(1): >>direwo+yD2
◧◩
39. noneth+BI[view] [source] [discussion] 2026-02-03 14:57:49
>>robthe+32
>I mean, perhaps it's time to completely drop these US-owned, closed-source, algo-driven controversial platforms

I think we are getting very close to the EU's own great firewall.

There is currently a sort of identity crisis in the regulation. Big tech companies are breaking the laws left and right. So which is it?

- fine harvesting mechanism? Keep as-is.

- true user protection? Blacklist.

replies(2): >>lokar+Du1 >>direwo+T24
◧◩◪◨⬒
40. whamla+lJ[view] [source] [discussion] 2026-02-03 15:01:18
>>cwillu+Fu
I'm strongly against CSAM, but I will say this analogy doesn't quite hold (though the values behind it do).

Libel must be an assertion that is not true. Photoshopping or AIing someone isn't an assertion of something untrue. It's more the equivalent of saying "What if this is true?", which is perfectly legal.

replies(1): >>cwillu+qb1
◧◩◪
41. chrisj+2N[view] [source] [discussion] 2026-02-03 15:18:08
>>Mordis+Yp
> official institutions were using Twitter as a communication platform before it belonged to Musk and started to restrict visibility to non-logged in users

... thereby driving up adoption far better than Twitter itself could. Ironic or what.

◧◩◪
42. chrisj+cO[view] [source] [discussion] 2026-02-03 15:23:56
>>rsynno+cq
Full marks to France for addressing its higher than average rate of unemployment.

/i

◧◩◪
43. vessen+jP[view] [source] [discussion] 2026-02-03 15:28:16
>>btreec+4h
Hi there - author here. Care to add some specifics? I can imagine lots of complaints about this statement, but I don't know which (if any) you have.
replies(1): >>btreec+lc4
◧◩◪
44. vessen+NQ[view] [source] [discussion] 2026-02-03 15:34:06
>>StopDi+dm
I like your username, by the way.

CSAM was the lead in the 2024 news headlines in the French prosecution of Telegram also. I didn't follow the case enough to know where they went, or what the judge thought was credible.

From a US mindset, I'd say that generation of communication, including images, would fall under speech. But then we classify it very broadly here. Arranging drug deals on a messaging app definitely falls under the concept of speech in the US as well. Heck, I've been told by FBI agents that they believe assassination markets are legal in the US - protected speech.

Obviously, assassinations themselves, not so much.

replies(3): >>StopDi+j41 >>direwo+9E2 >>f30e3d+023
◧◩◪◨
45. chrisj+SQ[view] [source] [discussion] 2026-02-03 15:34:17
>>afavou+kj
> Which is an entirely different legal liability.

In the UK, it is entirely the same. Near zero.

Making/distributing a photo of a non-consenting bikini-wearer is no more illegal when originated by computer in bedroom than done by camera on public beach.

replies(1): >>lokar+as1
◧◩◪
46. vessen+BR[view] [source] [discussion] 2026-02-03 15:38:30
>>logicc+0m
You were downvoted -- a theme in this thread -- but I like what you're saying. I disagree, though, on a global scale. By resilience, I mean to reference something like a monoculture plantation vs a jungle. The monoculture plantation is vulnerable to anything that figures out how to attack it. In a jungle, a single plant or set might be vulnerable, but something that can attack all the plants is much harder to come by.

Humanity itself is trending more toward monoculture socially; I like a lot of things (and hate some) about the cultural trend. But what I like isn't very important, because I might be totally wrong in my likes; if only my likes dominated, the world would be a much less resilient place -- vulnerable to the weaknesses of whatever it is I like.

So, again, I propose for the race as a whole, broad cultural diversity is really critical, and worth protecting. Even if we really hate some of the forms it takes.

replies(1): >>direwo+fE2
◧◩◪◨
47. chrisj+rT[view] [source] [discussion] 2026-02-03 15:45:13
>>derrid+ej
Er...

"Study uncovers presence of CSAM in popular AI training dataset"

https://www.theregister.com/2023/12/20/csam_laion_dataset/

◧◩◪◨
48. chrisj+401[view] [source] [discussion] 2026-02-03 16:10:56
>>logicc+km
> The point of banning real CSAM is to stop the production of it, because the production is inherently harmful. The production of AI or human generated CSAM-like images does not inherently require the harm of children, so it's fundamentally a different consideration.

Quite.

> That's why some countries, notably Japan, allow the production of hand-drawn material that in the US would be considered CSAM.

Really? By what US definition of CSAM?

https://rainn.org/get-the-facts-about-csam-child-sexual-abus...

"Child sexual abuse material (CSAM) is not “child pornography.” It’s evidence of child sexual abuse—and it’s a crime to create, distribute, or possess. "

◧◩◪◨
49. StopDi+j41[view] [source] [discussion] 2026-02-03 16:28:09
>>vessen+NQ
The issue is still not really speech.

Durov wasn't arrested because of things he said or things that were said on his platform; he was arrested because he refused to cooperate in criminal investigations while allegedly knowing crimes were happening on a platform he manages.

If you own a bar, you know people are dealing drugs in the backroom and you refuse to assist the police, you are guilty of aiding and abetting. Well, it's the same for Durov except he apparently also helped them process the money.

◧◩◪◨⬒⬓
50. cwillu+qb1[view] [source] [discussion] 2026-02-03 16:54:28
>>whamla+lJ
“ 298 (1) A defamatory libel is matter published, without lawful justification or excuse, that is likely to injure the reputation of any person by exposing him to hatred, contempt or ridicule, or that is designed to insult the person of or concerning whom it is published.

    Marginal note:Mode of expression

    (2) A defamatory libel may be expressed directly or by insinuation or irony

        (a) in words legibly marked on any substance; or

        (b) by any object signifying a defamatory libel otherwise than by words.”
It doesn't have to be an assertion, or even a written statement.
replies(1): >>93po+ck1
◧◩◪◨⬒
51. cwillu+1h1[view] [source] [discussion] 2026-02-03 17:20:54
>>pirate+py
You appear to have lost the thread (or maybe you're replying to things directly from the newcomments feed? If so, please stop it.), we're talking about what sort of incriminating written statements the raid might hope to discover.
◧◩
52. orwin+Lj1[view] [source] [discussion] 2026-02-03 17:31:32
>>Altern+ut
France's prosecutors use police raids way more than other western countries. Banks, political parties, ex-presidents, corporate HQs, worksites... Here, while white-collar crimes are punished as much as in the US (i.e. very little), we do at least investigate them.
◧◩◪◨⬒⬓⬔
53. 93po+ck1[view] [source] [discussion] 2026-02-03 17:33:11
>>cwillu+qb1
You're quoting Canadian law.

In the US it varies by state but generally requires:

A false statement of fact (not opinion, hyperbole, or pure insinuation without a provably false factual core).

Publication to a third party.

Fault

Harm to reputation

----

In the US it is required that it is written (or in a fixed form). If it's not written (fixed), it's slander, not libel.

replies(2): >>cwillu+uH1 >>direwo+u24
◧◩◪◨⬒
54. lokar+as1[view] [source] [discussion] 2026-02-03 18:02:19
>>chrisj+SQ
I thought this was about France
replies(1): >>chrisj+5A1
◧◩◪
55. lokar+Du1[view] [source] [discussion] 2026-02-03 18:11:16
>>noneth+BI
Or the companies could obey the law
replies(1): >>Rambli+kO3
56. stickf+gv1[view] [source] 2026-02-03 18:13:35
>>vikave+(OP)
Honest question: What does it mean to "raid" the offices of a tech company? It's not like they have file cabinets with paper records. Are they just seizing employee workstations?

Seems like you'd want to subpoena source code or gmail history or something like that. Not much interesting in an office these days.

replies(15): >>beart+Dx1 >>bsimps+dB1 >>paxys+AB1 >>ronsor+KB1 >>alex11+pZ1 >>Kaiser+q02 >>nieman+k52 >>Aurorn+A62 >>nebula+Lf2 >>auciss+3g2 >>jimbo8+Dk2 >>eli+4s2 >>ChuckM+ew2 >>anigbr+QI2 >>direwo+QZ3
57. verdve+Kv1[view] [source] 2026-02-03 18:15:48
>>vikave+(OP)
France24 article on this: https://www.france24.com/en/france/20260203-paris-prosecutor...

lol, they summoned Elon for a hearing on 420

"Summons for voluntary interviews on April 20, 2026, in Paris have been sent to Mr. Elon Musk and Ms. Linda Yaccarino, in their capacity as de facto and de jure managers of the X platform at the time of the events,

replies(4): >>milton+Rw1 >>Brando+BV1 >>why_at+HZ1 >>xdenni+pr2
◧◩
58. milton+Rw1[view] [source] [discussion] 2026-02-03 18:20:01
>>verdve+Kv1
I wonder how he'll try to get out of being summoned. Claim 4/20 is a holiday that he celebrates?
replies(3): >>verdve+Px1 >>flohof+6A1 >>bean46+Ew3
◧◩
59. beart+Dx1[view] [source] [discussion] 2026-02-03 18:23:26
>>stickf+gv1
Offline syncing of Outlook could reveal a lot of emails that would otherwise be on a foreign server. A lot of people save copies of documents locally as well.
replies(1): >>cm2187+Eu2
◧◩◪
60. verdve+Px1[view] [source] [discussion] 2026-02-03 18:23:56
>>milton+Rw1
It's voluntary
replies(1): >>dgxyz+wy1
◧◩◪◨
61. dgxyz+wy1[view] [source] [discussion] 2026-02-03 18:26:15
>>verdve+Px1
They'll make a judgement without him if he doesn't turn up.
replies(1): >>pyrale+Hq2
◧◩
62. reaper+Gy1[view] [source] [discussion] 2026-02-03 18:26:42
>>techbl+8k
> out of curiosity, what do they hope to find in a raid of the french offices, a folder labeled "Grok's CSAM Plan"?

You're not too far off.

There was a good article in the Washington Post yesterday about many many people inside the company raising alarms about the content and its legal risk, but they were blown off by managers chasing engagement metrics. They even made up a whole new metric.

There were also prompts telling the AI to act angry or sexy or other things just to keep users addicted.

◧◩◪◨⬒⬓
63. chrisj+5A1[view] [source] [discussion] 2026-02-03 18:31:43
>>lokar+as1
It was... until it diverted. >>46870196
◧◩◪
64. flohof+6A1[view] [source] [discussion] 2026-02-03 18:31:45
>>milton+Rw1
> Claim 4/20 is a holiday that he celebrates?

Given his recent "far right" bromance that's probably not a good idea ;)

replies(6): >>verdve+LA1 >>milton+HB1 >>LAC-Te+RZ1 >>sophac+za2 >>Guinan+Fc2 >>termin+oZ3
◧◩◪◨
65. verdve+LA1[view] [source] [discussion] 2026-02-03 18:33:42
>>flohof+6A1
It hadn't occurred to me that might be the reason they picked 420
replies(1): >>layer8+742
◧◩
66. bsimps+dB1[view] [source] [discussion] 2026-02-03 18:34:44
>>stickf+gv1
I had the same thought - not just about raids, but about raiding a satellite office. This sounds like theater begging for headlines like this one.
replies(1): >>direwo+B02
◧◩
67. paxys+AB1[view] [source] [discussion] 2026-02-03 18:36:17
>>stickf+gv1
Whether you are a tech company or not, there's a lot of data on computers that are physically in the office.
replies(1): >>ramuel+XH1
◧◩◪◨
68. milton+HB1[view] [source] [discussion] 2026-02-03 18:36:48
>>flohof+6A1
Oh, that was 100% in my mind when I wrote that. I was wondering how explicit to be with Musk's celebrating being for someone's birthday.
◧◩
69. ronsor+KB1[view] [source] [discussion] 2026-02-03 18:37:08
>>stickf+gv1
These days many tech company offices have a "panic button" for raids that will erase data. Uber is perhaps the most notorious example.
replies(6): >>polite+CC1 >>camina+TI1 >>Brando+qV1 >>mr_mit+c62 >>wasabi+C72 >>digiow+ML2
◧◩◪
70. polite+CC1[view] [source] [discussion] 2026-02-03 18:40:39
>>ronsor+KB1
It's sad to see incentives perverted to this degree, rather than just adhering to local laws.
◧◩
71. morkal+6F1[view] [source] [discussion] 2026-02-03 18:49:27
>>robthe+32
In an ideal world they'd just have an RSS feed on their site, and people (journalists included) would subscribe to it. Voilà!
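For what it's worth, the suggestion above is technically trivial: an RSS feed is just a small XML document any press office could publish. A minimal sketch using only Python's standard library (the office name and URLs here are hypothetical):

```python
import xml.etree.ElementTree as ET

def make_rss(channel_title, channel_link, items):
    """Build a minimal RSS 2.0 feed string from (title, url) pairs."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = channel_title
    ET.SubElement(channel, "link").text = channel_link
    for title, url in items:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = url
    return ET.tostring(rss, encoding="unicode")

# Hypothetical press-office feed with a single entry
feed = make_rss(
    "Prosecutor's Office - Press Releases",
    "https://example.org/press",
    [("Statement of 3 February", "https://example.org/press/1")],
)
print(feed)
```

Any standard feed reader could then poll that document, with no platform account or algorithmic ranking in between.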
◧◩◪
72. freeja+KG1[view] [source] [discussion] 2026-02-03 18:55:36
>>cbeach+Vh
Kiddie porn but only for the paying accounts!
replies(1): >>mooreb+B74
◧◩◪◨⬒⬓⬔⧯
73. cwillu+uH1[view] [source] [discussion] 2026-02-03 18:59:12
>>93po+ck1
The relevant jurisdiction isn't the US either.
◧◩◪
74. ramuel+XH1[view] [source] [discussion] 2026-02-03 19:01:08
>>paxys+AB1
Except when they have encryption, which should be the standard? I mean how much data would authorities actually retrieve when most stuff is located on X servers anyways? I have my doubts.
replies(3): >>Brando+HU1 >>anigbr+GL2 >>throw3+Rm3
◧◩◪
75. camina+TI1[view] [source] [discussion] 2026-02-03 19:04:37
>>ronsor+KB1
>notorious

What happened to due process? Every major firm should have a "dawn raid" policy to comply while preserving rights.

Specific to the Uber case(s), if it were illegal, then why didn't Uber get criminal charges or fines?

At best there's an argument that it was "obstructing justice," but logging people off, encrypting, and deleting local copies isn't necessarily illegal.

replies(2): >>intras+p62 >>pyrale+xn2
76. r721+zL1[view] [source] 2026-02-03 19:15:59
>>vikave+(OP)
Another discussion: >>46872894
77. scotty+JL1[view] [source] 2026-02-03 19:16:33
>>vikave+(OP)
Facebook offices should be routinely raided for aiding and profiting from the various scams propagated through ads on that platform.
replies(3): >>DaSHac+B52 >>bluesc+ih2 >>mkouba+YL2
◧◩◪
78. bryan_+7Q1[view] [source] [discussion] 2026-02-03 19:34:10
>>moolco+Zu
Do you think that data is stored at the office? Where do you think the data is stored? The janitor's closet?
replies(1): >>direwo+cY3
79. ta9000+hU1[view] [source] 2026-02-03 19:54:24
>>vikave+(OP)
Guess that will be a SpaceX problem soon enough. What a mess.
replies(3): >>nebula+Xl2 >>mschus+5n2 >>Psilli+Rr2
◧◩◪◨
80. Brando+HU1[view] [source] [discussion] 2026-02-03 19:56:38
>>ramuel+XH1
The authorities will request the keys for local servers and will get them. As for remote ones (outside of France jurisdiction) it depends where they are and how much X wants to make their life difficult.
replies(1): >>ramuel+ZV1
◧◩◪
81. Brando+qV1[view] [source] [discussion] 2026-02-03 19:59:00
>>ronsor+KB1
This is a perfect way for the legal head of the company in-country to visit some jails.

They will explain that it was done remotely and whatnot but then the company will be closed in the country. Whether this matters for the mothership is another story.

replies(3): >>ameliu+xk2 >>chrisj+QM2 >>direwo+j14
◧◩
82. Brando+BV1[view] [source] [discussion] 2026-02-03 19:59:43
>>verdve+Kv1
Why "lol"?
replies(1): >>verdve+NV1
◧◩◪
83. verdve+NV1[view] [source] [discussion] 2026-02-03 20:00:50
>>Brando+BV1
420 is a stoner number, stoners lol a lot, thought of Elmo's failed joint smoking on JRE before I stopped watching

...but then other commenters reminded me there is another thing on the same date, which might have been more the actual troll at Elmo to get him all worked up

replies(1): >>Brando+QW1
◧◩◪◨⬒
84. ramuel+ZV1[view] [source] [discussion] 2026-02-03 20:01:56
>>Brando+HU1
Musk and X don't seem to be the type to care about any laws or any compelling legal requests, especially from a foreign government. I doubt the French will get anything other than this headline.
replies(3): >>Retric+VX1 >>Teever+z12 >>shawab+H92
◧◩◪◨
85. Brando+QW1[view] [source] [discussion] 2026-02-03 20:04:29
>>verdve+NV1
Well yes, if France24 was using "20 April 2026" as we write here, there would be no misunderstanding.

I believe people are looking too much into 20 April → 4/20 → 420

replies(3): >>verdve+2Z1 >>LightB+Vh2 >>Findec+c63
86. kalter+tX1[view] [source] 2026-02-03 20:07:00
>>vikave+(OP)
Yet another nail
◧◩
87. arppac+vX1[view] [source] [discussion] 2026-02-03 20:07:00
>>techbl+8k
There was a WaPo article yesterday about how xAI deliberately loosened Grok’s safety guardrails and relaxed restrictions on sexual content in an effort to make the chatbot more engaging and “sticky” for users. xAI employees had to sign new waivers in the summer and start working with harmful content in order to train and enable those features.

I assume the raid is hoping to find communications to establish that timeline, and maybe internal concerns that were ignored? Also internal metrics that might show they were aware of the problem. External analysts said Grok was generating a CSAM image every minute!!

https://www.washingtonpost.com/technology/2026/02/02/elon-mu...

replies(1): >>chrisj+oJ2
◧◩◪◨⬒⬓
88. Retric+VX1[view] [source] [discussion] 2026-02-03 20:09:12
>>ramuel+ZV1
Getting kicked out of the EU is extremely unattractive for Twitter. But the US also has extradition treaties so that’s hardly the end of how far they can escalate.
replies(1): >>okanat+mY1
◧◩◪◨⬒⬓⬔
89. okanat+mY1[view] [source] [discussion] 2026-02-03 20:11:52
>>Retric+VX1
I don't think the US will extradite anybody to the EU. Especially not white people with strong support from the current government.
replies(2): >>Retric+OZ1 >>JumpCr+B22
◧◩◪◨⬒
90. verdve+2Z1[view] [source] [discussion] 2026-02-03 20:14:56
>>Brando+QW1
Thanks for the cultural perspective / reminder, yes that is definitely an American automatic translation
◧◩
91. alex11+pZ1[view] [source] [discussion] 2026-02-03 20:16:30
>>stickf+gv1
Why is this the most upvoted question? Obsessing over pedantry rather than the main thrust of what's being discussed
◧◩
92. why_at+HZ1[view] [source] [discussion] 2026-02-03 20:18:01
>>verdve+Kv1
>The Paris prosecutor's office said it launched the investigation after being contacted by a lawmaker alleging that biased algorithms in X were likely to have distorted the operation of an automated data processing system.

I'm not at all familiar with French law, and I don't have any sympathy for Elon Musk or X. That said, is this a crime?

Distorted the operation how? By making their chatbot more likely to say stupid conspiracies or something? Is that even against the law?

replies(3): >>int_19+kk2 >>mschus+Tn2 >>direwo+P14
◧◩◪◨⬒⬓⬔⧯
93. Retric+OZ1[view] [source] [discussion] 2026-02-03 20:18:26
>>okanat+mY1
White people already extradited to the EU during the current administration would disagree. But this administration has a limited shelf life; even hypothetically, just under 3 years of immunity isn’t enough for comfort.
replies(1): >>wongar+3r2
◧◩◪◨
94. LAC-Te+RZ1[view] [source] [discussion] 2026-02-03 20:18:32
>>flohof+6A1
We'll know he's gone too far if he has to take another "voluntary" trip to Israel
◧◩
95. Kaiser+q02[view] [source] [discussion] 2026-02-03 20:20:55
>>stickf+gv1
Gather evidence.

I assume that they have opened a formal investigation and are now going to the office to collect/purloin evidence before it's destroyed.

Most FAANG companies have training specifically for this. I assume X doesn't anymore, because they are cool and edgy, and staff training is for the woke.

replies(1): >>nieman+042
◧◩◪
96. direwo+B02[view] [source] [discussion] 2026-02-03 20:21:30
>>bsimps+dB1
They do what they can. They obviously can't raid the American office.
97. tomloc+i12[view] [source] 2026-02-03 20:24:24
>>vikave+(OP)
Elon's in the files asking Epstein about "wild parties" and then doesn't seem to care about all this. Easy to draw a conclusion here.
replies(1): >>alex11+qc2
◧◩
98. direwo+k12[view] [source] [discussion] 2026-02-03 20:24:30
>>techbl+8k
Maybe emails between the French office and the head office warning they may violate laws, and the response by head office?
◧◩◪◨⬒⬓
99. Teever+z12[view] [source] [discussion] 2026-02-03 20:25:39
>>ramuel+ZV1
The game changed when Trump threatened the use of military force to seize Greenland.

At this point a nuclear power like France has no issue with using covert violence to produce compliance from Musk and he must know it.

These people have proven themselves to be existential threats to French security and France will do whatever they feel is necessary to neutralize that threat.

Musk is free to ignore French rule of law if he wants to risk being involved in an airplane accident that will have rumours and conspiracies swirling around it long after he’s dead and his body is strewn all over the ocean somewhere.

replies(1): >>ronsor+T62
◧◩◪◨⬒⬓⬔⧯
100. JumpCr+B22[view] [source] [discussion] 2026-02-03 20:30:29
>>okanat+mY1
> don't think US will extradite anybody to EU

EU, maybe not. France? A nuclear state? Paris is properly sovereign.

> people with strong support of the current government

Also known as leverage.

Let Musk off the hook for a sweetheart trade deal. Trump has a track record of chickening out when others show strength.

replies(2): >>fmajid+me2 >>krisof+6j2
◧◩◪
101. nieman+042[view] [source] [discussion] 2026-02-03 20:38:26
>>Kaiser+q02
If that training involves destroying evidence or withholding evidence from the prosecution, you are going to jail if you follow it.
replies(3): >>hn_go_+J42 >>Kaiser+W92 >>free65+Pe2
◧◩◪◨⬒
102. layer8+742[view] [source] [discussion] 2026-02-03 20:38:54
>>verdve+LA1
It’s unlikely, because putting the month first is a US thing. In France it would be 20/04, or “20 avril”.
replies(1): >>embedd+d82
◧◩◪◨
103. hn_go_+J42[view] [source] [discussion] 2026-02-03 20:41:54
>>nieman+042
What a strange assumption. The training is "summon the lawyers immediately", "ensure they're accompanied at all times while on company premises", etc.
replies(2): >>nieman+Da2 >>direwo+u14
◧◩
104. nieman+k52[view] [source] [discussion] 2026-02-03 20:45:10
>>stickf+gv1
Gather evidence against employees, then use that evidence to put them under pressure to testify against their employer or grant access to evidence.

Sabu was put under pressure by the FBI; they threatened to place his kids into foster care.

That was legal. Guess what: similar things would be legal in France.

We all forget that money is nice, but nation states have real power. Western liberal democracies just rarely use it.

The same way the president of the USA can order a drone strike on a Taliban warlord, the president of France could order Musk's plane to be escorted to Paris by three fighter jets.

replies(10): >>mmooss+N92 >>kps+vb2 >>ChrisM+Kg2 >>hiprob+Ao2 >>gruez+7p2 >>xoxoli+cq2 >>SpaceM+Js2 >>projek+qv2 >>cadams+7y2 >>chrisj+ZK2
◧◩
105. DaSHac+B52[view] [source] [discussion] 2026-02-03 20:46:37
>>scotty+JL1
That would apply to any and all social media though
replies(1): >>Toucan+s82
106. TZubir+W52[view] [source] 2026-02-03 20:47:42
>>vikave+(OP)
Why would X have offices in France? I'm assuming it's just to hire French workers? Probably a leftover from the pre-acquisition era.

Or is there any France-specific compliance that must be done in order to operate in that country?

replies(1): >>mike-t+h82
◧◩◪
107. mr_mit+c62[view] [source] [discussion] 2026-02-03 20:49:00
>>ronsor+KB1
How do you know this?
replies(1): >>strong+i92
◧◩◪◨
108. intras+p62[view] [source] [discussion] 2026-02-03 20:50:04
>>camina+TI1
It is aggressive compliance. The legality would be determined by the courts as usual.
replies(1): >>camina+Y82
◧◩
109. Aurorn+A62[view] [source] [discussion] 2026-02-03 20:51:18
>>stickf+gv1
> Seems like you'd want to subpoena source code or gmail history or something like that.

This would be done in parallel for key sources.

There is a lot of information on physical devices that is helpful, though. Even discovering additional apps and services used on the devices can lead to more discovery via those cloud services, if relevant.

Physical devices have a lot of additional information, though: Files people are actively working on, saved snippets and screenshots of important conversations, and synced data that might be easier to get offline than through legal means against the providers.

In outright criminal cases it's not uncommon for individuals to keep extra information on their laptop, phone, or a USB drive hidden in their office as an insurance policy.

This is yet another good reason to keep your work and personal devices separate, as hard as that can be at times. If there's a lawsuit you don't want your personal laptop and phone to disappear for a while.

replies(1): >>charci+oa2
◧◩◪◨⬒⬓⬔
110. ronsor+T62[view] [source] [discussion] 2026-02-03 20:52:08
>>Teever+z12
You're implying that France is going to become a terrorist state? Because suspicious accidents do not sound like rule of law.
replies(6): >>Teever+P92 >>bulbar+Ia2 >>cybera+mg2 >>hunter+pg2 >>myko+Zy2 >>anigbr+ZM2
◧◩◪
111. wasabi+C72[view] [source] [discussion] 2026-02-03 20:54:57
>>ronsor+KB1
It wasn't erasing as far I know, but locking all computers.

Covered here: https://www.theguardian.com/news/2022/jul/10/uber-bosses-tol...

◧◩◪◨⬒⬓
112. embedd+d82[view] [source] [discussion] 2026-02-03 20:57:42
>>layer8+742
Still, stoner cultures in many European countries celebrate 4/20; definitely a bunch of Frenchies get extra stoned that day. It's probably the de-facto "international cannabis day" in most places in the world, at least the ones influenced by US culture, which reached pretty far in its heyday.
◧◩
113. mike-t+h82[view] [source] [discussion] 2026-02-03 20:58:00
>>TZubir+W52
X makes its money selling advertising. France is the obvious place to have an office selling advertising to a large European French-speaking audience.
replies(1): >>joshua+os2
◧◩◪
114. Toucan+s82[view] [source] [discussion] 2026-02-03 20:58:27
>>DaSHac+B52
Sounds awesome, when do we start?
replies(1): >>mekdoo+hP4
◧◩◪◨⬒
115. camina+Y82[view] [source] [discussion] 2026-02-03 21:01:40
>>intras+p62
> aggressive compliance

Put this up there with nonsensical phrases like "violent agreement."

;-)

replies(1): >>fragme+sm2
◧◩◪◨
116. strong+i92[view] [source] [discussion] 2026-02-03 21:03:45
>>mr_mit+c62
From HN, of course! >>32057651
◧◩◪◨⬒⬓
117. shawab+H92[view] [source] [discussion] 2026-02-03 21:05:32
>>ramuel+ZV1
If I'm an employee working in the X office in France, and the police come in and show me they have a warrant for all the computers in the building and tell me to unlock the laptop, I'm probably going to do that, no matter what Musk thinks
replies(1): >>former+Md2
◧◩◪
118. mmooss+N92[view] [source] [discussion] 2026-02-03 21:05:48
>>nieman+k52
> Western liberal democracies just rarely use it.

Also, they are restricted in how they use it, and defendents have rights and due process.

> Sabu was put under pressure by the FBI, they threatened to place his kids into foster care.

Though things like that can happen, which are very serious.

replies(4): >>VBprog+Jd2 >>toss1+Qd2 >>nilamo+0m2 >>mschus+xm2
◧◩◪◨⬒⬓⬔⧯
119. Teever+P92[view] [source] [discussion] 2026-02-03 21:05:56
>>ronsor+T62
Become? https://en.wikipedia.org/wiki/Sinking_of_the_Rainbow_Warrior

The second Donald Trump threatened to invade a nation allied with France is the second anyone who works with Trump became a legitimate military target.

Like a cruel child dismembering a spider one limb at a time, France and other nations around the world will meticulously destroy whatever resources people like Musk have and the influence those resources give him over their countries.

If Musk displays a sufficient level of resistance to these actions the French will simply assassinate him.

replies(1): >>hunter+nh2
◧◩◪◨
120. Kaiser+W92[view] [source] [discussion] 2026-02-03 21:06:50
>>nieman+042
The training is very much the opposite.

Mine had a scene where some bro tried to organise the resistance. A voiceover told us that he was arrested for blocking a legal investigation and was liable to be fired due to reputational damage.

X's training might be like you described, but everywhere else that is even vaguely beholden to law and order it would be the opposite.

◧◩◪
121. charci+oa2[view] [source] [discussion] 2026-02-03 21:09:26
>>Aurorn+A62
Sure it might be on the device, but they would need a password to decrypt the laptop's storage to get any of the data. There's also the possibility of the MDM software making it impossible to decrypt if given a remote signal. Even if you image the drive, you can't image the secure enclave so if it is wiped it's impossible to retrieve.
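A minimal sketch of why wiping the enclave is enough (illustrative only, not X's, Apple's, or any MDM vendor's actual mechanism, and using a toy cipher rather than real AES): with envelope encryption, the disk's data key is itself wrapped by a key that lives only in the enclave, so erasing that one key renders even a perfect image of the drive useless.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher for illustration only: SHA-256 in counter mode,
    # XORed against the data. Encryption and decryption are the same op.
    out = bytearray()
    for offset in range(0, len(data), 32):
        ks = hashlib.sha256(key + offset.to_bytes(8, "big")).digest()
        chunk = data[offset:offset + 32]
        out.extend(b ^ k for b, k in zip(chunk, ks))
    return bytes(out)

# Envelope encryption: the disk contents are encrypted with a random
# data key, and the data key is wrapped by a key held in the enclave.
data_key = secrets.token_bytes(32)
kek = secrets.token_bytes(32)  # key-encryption key; never leaves the enclave

ciphertext = keystream_xor(data_key, b"sensitive office files")
wrapped_data_key = keystream_xor(kek, data_key)  # this is what sits on disk

# Normal unlock: the enclave unwraps the data key, which decrypts the disk.
recovered = keystream_xor(keystream_xor(kek, wrapped_data_key), ciphertext)
assert recovered == b"sensitive office files"

# "Wiping the enclave" just means erasing the KEK. The wrapped key on disk
# can no longer be unwrapped, so imaging the drive recovers nothing.
```

The design point is that only 32 bytes ever need to be destroyed, which is why a remote-wipe signal can be effectively instantaneous.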
replies(1): >>Aurorn+LX2
◧◩◪◨
122. sophac+za2[view] [source] [discussion] 2026-02-03 21:10:52
>>flohof+6A1
Wouldn't celebrating Hitler's birthday be good for his far-right bromance?
◧◩◪◨⬒
123. nieman+Da2[view] [source] [discussion] 2026-02-03 21:11:05
>>hn_go_+J42
That can start with self-deleting messages if you are under court order, and it has happened before:

“Google intended to subvert the discovery process, and that Chat evidence was ‘lost with the intent to prevent its use in litigation’ and ‘with the intent to deprive another party of the information’s use in the litigation.’”

https://storage.courtlistener.com/recap/gov.uscourts.cand.37...

VW is another case where similar things happened:

https://www.bloomberg.com/news/articles/2017-01-12/vw-offici...

The thing is: companies don't go to jail, employees do.

replies(1): >>Kaiser+qM3
◧◩◪◨⬒⬓⬔⧯
124. bulbar+Ia2[view] [source] [discussion] 2026-02-03 21:11:27
>>ronsor+T62
Killing foreigners outside of one's own country has always been deemed acceptable by governments that are (or were until recently) considered to generally follow the rule of law, as well as by the majority of their citizens. It also doesn't necessarily contradict the rule of law.

It's just that the West has avoided doing that to each other, because they were all essentially allied until recently and because the political implications were deemed too severe.

I don't think, however, that France has anything to gain by doing it or has any interest whatsoever, and I doubt there's a legal framework the French government can or wants to exploit to conduct something like that legally (like calling something an emergency situation or a terrorist group, for example).

◧◩◪
125. kps+vb2[view] [source] [discussion] 2026-02-03 21:15:41
>>nieman+k52
> We all forget that money is nice, but nation states have real power.

Elon has ICBMs, but France has warheads.

replies(1): >>speed_+Lj2
◧◩
126. alex11+qc2[view] [source] [discussion] 2026-02-03 21:20:33
>>tomloc+i12
Elon is literally in the files, talking about going to the island. It's documented
replies(2): >>chihua+Qj2 >>yodsan+es2
◧◩◪◨
127. Guinan+Fc2[view] [source] [discussion] 2026-02-03 21:22:08
>>flohof+6A1
you would perhaps be shocked to learn how right-leaning the money folks behind the legal and legacy cannabis markets actually are. money is money.
◧◩◪◨
128. VBprog+Jd2[view] [source] [discussion] 2026-02-03 21:26:56
>>mmooss+N92
> defendents have rights and due process.

As they say: you can beat the rap but not the ride. If a state wants to make your life incredibly difficult for months or even years they can, the competent ones can even do it while staying (mostly) on the right side of the law.

replies(1): >>colech+0o2
◧◩◪◨⬒⬓⬔
129. former+Md2[view] [source] [discussion] 2026-02-03 21:27:14
>>shawab+H92
Witnesses can generally not refuse in these situations, that's plain contempt and/or obstruction. Additionally, in France a suspect not revealing their keys is also contempt (UK as well).
replies(1): >>rvnx+fk2
◧◩◪◨
130. toss1+Qd2[view] [source] [discussion] 2026-02-03 21:27:42
>>mmooss+N92
>> they are restricted in how they use it, and defendents have rights and due process.

That due process only exists to the extent the branches of govt are independent, have co-equal power, and can hold and act upon different views of the situation.

When all branches of govt are corrupted or corrupted to serve the executive, as in autocracies, that due process exists only if the executive likes you, or accepts your bribes. That is why there is such a huge push by right-wing parties to take over the levers of power, so they can keep their power even after they would lose at the ballot box.

◧◩◪◨⬒⬓⬔⧯▣
131. fmajid+me2[view] [source] [discussion] 2026-02-03 21:30:37
>>JumpCr+B22
France doesn't extradite its citizens, even absolute scumbags like Roman Polanski. Someone like Musk has lots of lawyers to gum up extradition proceedings, even if the US were inclined to go along. I doubt the US extradition treaty would cover this unless the French could prove deliberate sharing of CSAM by Musk personally, beyond reckless negligence. Then again, after the Epstein revelations, this is no longer so far-fetched.
◧◩◪◨
132. free65+Pe2[view] [source] [discussion] 2026-02-03 21:32:37
>>nieman+042
>withholding evidence from the prosecution, you are going to jail if you follow.

The prosecution must present a valid search warrant for *specific* information. They don't get carte blanche, so the Uber way is correct: lock the computers and let the courts decide.

replies(1): >>Kaiser+TM3
133. tehjok+3f2[view] [source] 2026-02-03 21:34:32
>>vikave+(OP)
It's cool that not every law enforcement agency in the world is under the complete thumb of U.S. based billionaires.
◧◩
134. nebula+Lf2[view] [source] [discussion] 2026-02-03 21:38:18
>>stickf+gv1
I read somewhere that Musk (or maybe Thiel) companies have processes in place to quickly offload data from a location to other jurisdictions (and destroy the local copies) when they detect a raid happening. Don't know how true it is, though. The only insight I have into their operations is the amazing speed with which people are badged in and out of his various gigafactories. It appears they developed custom badging systems for people driving into gigafactories, to cut the time needed to begin work. If they're doing that kind of stuff, then there has got to be something in place for a raid. (This is second-hand, so take it with a grain of salt.)

EDIT: It seems from other comments that it may have been Uber I was reading about. The badging system I have personally observed outside the gigafactories. Apologies for the mixup.

replies(1): >>malfis+Ei2
◧◩
135. auciss+3g2[view] [source] [discussion] 2026-02-03 21:39:44
>>stickf+gv1
> Are they just seizing employee workstations?

Yes.

◧◩◪◨⬒⬓⬔⧯
136. cybera+mg2[view] [source] [discussion] 2026-02-03 21:40:53
>>ronsor+T62
> You're implying that France is going to become a terrorist state? Because suspicious accidents do not sound like rule of law.

Why not? After all, that's in vogue today. Trump is ignoring all the international agreements and rules, so why should others follow them?

◧◩◪◨⬒⬓⬔⧯
137. hunter+pg2[view] [source] [discussion] 2026-02-03 21:41:10
>>ronsor+T62
Counter-point: France has already kidnapped another social media CEO and forced him to give up his encryption keys. The moral difference between France (historically or currently) and a third-world warlord is very thin. Also, look at the accusations: CSAM and political extremism are the classic go-tos when a government doesn't really have a reason to put pressure on someone but really wants to anyway. France has a very questionable history of honoring the rule of law in politics; putting political enemies in prison on questionable charges has a long history there.
replies(2): >>rvnx+yn2 >>direwo+214
◧◩◪
138. ChrisM+Kg2[view] [source] [discussion] 2026-02-03 21:42:31
>>nieman+k52
> We all forget that money is nice, but nation states have real power.

I remember something (probably linked from here) where the essayist compared Jack Ma, one of the richest men on earth, with Xi Jinping, a much lower-paid individual.

They indicated that Xi got Ma into a chokehold. I think he "disappeared" Ma for some time; I don't remember exactly how long, but it may have been over a year.

replies(1): >>kshack+mu2
◧◩
139. bluesc+ih2[view] [source] [discussion] 2026-02-03 21:45:10
>>scotty+JL1
Governments don't care about minor scams. Political speech against them, on the other hand...
◧◩◪◨⬒⬓⬔⧯▣
140. hunter+nh2[view] [source] [discussion] 2026-02-03 21:45:45
>>Teever+P92
You got that backwards. Greenpeace, for all its faults, is still viewed as a group against which military force is a no-no. Sinking that ship cost France far more than anything it inflicted on Greenpeace. If anything, that event is evidence that going after Musk is a terrible idea.

PS: Yes, Greenpeace is a bunch of scientifically illiterate fools who have caused far more damage than they prevented. It doesn't matter, because what France did was still clearly against the law.

◧◩◪◨⬒
141. LightB+Vh2[view] [source] [discussion] 2026-02-03 21:48:59
>>Brando+QW1
April 20th most definitely is international stoners day. And I like what the French have done here!
replies(1): >>thauma+X33
◧◩◪
142. malfis+Ei2[view] [source] [discussion] 2026-02-03 21:52:03
>>nebula+Lf2
That is very much illegal in the US
replies(1): >>int_19+Wj2
◧◩◪◨⬒⬓⬔⧯▣
143. krisof+6j2[view] [source] [discussion] 2026-02-03 21:53:53
>>JumpCr+B22
> France? A nuclear state? Paris is properly sovereign.

That is true. But nukes are not magic. Explain to me how you imagine the series of events where Paris uses their nukes to get the USA to extradite Elon to Paris. Because i’m just not seeing it.

replies(2): >>rvnx+Po2 >>JumpCr+nz2
◧◩◪◨
144. speed_+Lj2[view] [source] [discussion] 2026-02-03 21:58:13
>>kps+vb2
France has Ariane, which was good enough to send the James Webb Space Telescope to its Lagrange point with extra precision. It's all fun and games until the French finish their cigarette, arm French Guiana, and fire ze missiles.
replies(1): >>UncleS+h94
◧◩◪
145. chihua+Qj2[view] [source] [discussion] 2026-02-03 21:58:45
>>alex11+qc2
He was only going to the island to get rid of bots on Twitter. Just like OJ spent the rest of his life looking for the real killer.
replies(1): >>alex11+Yj2
◧◩◪◨
146. int_19+Wj2[view] [source] [discussion] 2026-02-03 21:59:23
>>malfis+Ei2
It wouldn't be the first time a Musk company knowingly does something illegal.

I think as far as Musk is concerned, laws only apply in the "don't get caught" sense.

replies(2): >>rvnx+Gk2 >>scotty+iH2
◧◩◪◨
147. alex11+Yj2[view] [source] [discussion] 2026-02-03 21:59:26
>>chihua+Qj2
It's timestamped like 2013, I think. Years before he bought Twitter (yes, I know you're joking)
replies(1): >>andrew+Ts2
◧◩◪◨⬒⬓⬔⧯
148. rvnx+fk2[view] [source] [discussion] 2026-02-03 22:01:11
>>former+Md2
100%. Only additional troubles for yourself personally, for practically no benefit (nobody in the company is going to celebrate you).
◧◩◪
149. int_19+kk2[view] [source] [discussion] 2026-02-03 22:01:43
>>why_at+HZ1
Holocaust denial is illegal in France, for one, and Grok did exactly that on several occasions.
replies(2): >>pyrale+mp2 >>vinter+fY3
◧◩◪◨
150. ameliu+xk2[view] [source] [discussion] 2026-02-03 22:02:37
>>Brando+qV1
Of course they will not lock the data but hide it, and put some redacted or otherwise innocent files in their place.
replies(2): >>acdha+9q2 >>Brando+Qr2
◧◩
151. jimbo8+Dk2[view] [source] [discussion] 2026-02-03 22:03:04
>>stickf+gv1
It sounds better in the news when you do a raid. These things are generally not done for any purpose other than to communicate a message and score political points.
◧◩◪◨⬒
152. rvnx+Gk2[view] [source] [discussion] 2026-02-03 22:03:24
>>int_19+Wj2
give any country a gift / investment of 100B USD

-> crimes ? what crimes ?

◧◩
153. nebula+Xl2[view] [source] [discussion] 2026-02-03 22:10:45
>>ta9000+hU1
I wonder if the recent announcement spurred them into making a move now rather than later.
replies(1): >>tyre+bO2
◧◩◪◨
154. nilamo+0m2[view] [source] [discussion] 2026-02-03 22:10:49
>>mmooss+N92
> Also, they are restricted in how they use it, and defendents have rights and due process.

It's a nice sentiment, if true. ICE is out there, right now today, ignoring both individual rights as well as due process.

replies(1): >>generi+vo2
155. ChrisM+9m2[view] [source] 2026-02-03 22:11:27
>>vikave+(OP)
> They have also summoned billionaire owner Elon Musk for questioning.

Good luck with that...

replies(2): >>dathin+Sq2 >>sleepy+yr2
◧◩◪◨⬒⬓
156. fragme+sm2[view] [source] [discussion] 2026-02-03 22:13:03
>>camina+Y82
Violent agreement is when you're debating something with someone and end up yelling at each other because you think you disagree, but then realize that you (violently, as in "yelling at each other") agree on whatever it is. Aggressive compliance is when a corporate drone over-zealously follows stupid/pointless rules they could just as well look the other way on, to the point of being aggressively compliant (with stupid corporate mumbo jumbo).
replies(1): >>camina+xF2
◧◩◪◨
157. mschus+xm2[view] [source] [discussion] 2026-02-03 22:14:08
>>mmooss+N92
> Also, they are restricted in how they use it, and defendents have rights and due process.

As we're seeing with the current US President... the government doesn't (have to) care.

In any case, CSAM is the one thing other than Islamist terrorism that will bypass a lot of restrictions on how police are supposed to operate (see e.g. Encrochat, An0m) across virtually all civilized nations. Western nations also will take anything that remotely smells like Russia as a justification.

replies(1): >>gf000+6s2
◧◩
158. mschus+5n2[view] [source] [discussion] 2026-02-03 22:17:00
>>ta9000+hU1
How was that move legal anyway? Like... a lot of people and governments gave Musk money to develop, build and launch rockets. And now he's using it to bail out his failing social media network and CSAM peddling AI service.
replies(1): >>wmf+nq2
◧◩◪◨
159. pyrale+xn2[view] [source] [discussion] 2026-02-03 22:19:17
>>camina+TI1
> if it were illegal, then why didn't Uber get criminal charges or fines?

They had a sweet deal with Macron. Prosecution became hard to continue once he got involved.

replies(1): >>camina+0E2
◧◩◪◨⬒⬓⬔⧯▣
160. rvnx+yn2[view] [source] [discussion] 2026-02-03 22:19:27
>>hunter+pg2
We are also talking about a country that wants to ban anonymous VPNs in the name of protecting the children, and to make everyone hand over their ID card to register an account on Instagram, TikTok, etc.

OpenDNS is censored in France... so imagine

◧◩◪
161. mschus+Tn2[view] [source] [discussion] 2026-02-03 22:21:15
>>why_at+HZ1
> I'm not at all familiar with French law, and I don't have any sympathy for Elon Musk or X. That said, is this a crime?

GDPR and the DMA actually have teeth. They just haven't been bared yet, because the usual M.O. for European law violators is: first, a free reminder ("hey guys, what you're doing is against the law, stop it, or else"). Then, if violations continue, maybe two or three more rounds follow... but at some point, especially if the violations are openly intentional (and Musk's behavior makes that very, very clear), the hammer gets brought down.

Our system is based on the idea that we institute complex regulations, and when they get introduced and stuff goes south, we assume that it's innocent mistakes first.

And in addition to that, there's the geopolitical aspect... basically, hurt Musk to show Trump that, yes, Europe means business and has the means to fight back.

As for the allegations:

> The probe has since expanded to investigate alleged “complicity” in spreading pornographic images of minors, sexually explicit deepfakes, denial of crimes against humanity and manipulation of an automated data processing system as part of an organised group, and other offences, the office said in a statement Tuesday.

The GDPR/DMA stuff was just the opener anyway. CSAM isn't liked by authorities at all, and genocide denial (we're not talking about Palestine here, calm your horses y'all; we're talking about Holocaust denial) is a crime in most European jurisdictions (as are the straight-arm salute and other displays of fascist insignia). We actually learned something from WW2.

◧◩◪◨⬒
162. colech+0o2[view] [source] [discussion] 2026-02-03 22:21:42
>>VBprog+Jd2
We are not entirely sure the rule of law in America isn't already over.

People are putting a lot of weight on the midterm elections, which are more or less the last line of defense besides a so-far tepid response by the courts; even then, consequence-free defiance of court orders is now rampant.

We're really near the point of no return, and a lot of people don't seem to notice.

replies(1): >>5uppli+4q2
◧◩◪◨⬒
163. generi+vo2[view] [source] [discussion] 2026-02-03 22:23:25
>>nilamo+0m2
They were talking about western liberal democracies, though.

/s

◧◩◪
164. hiprob+Ao2[view] [source] [discussion] 2026-02-03 22:24:07
>>nieman+k52
It's legal to just put kids in foster care for no reason but to ruin someone's life?
replies(2): >>rvnx+lp2 >>ricudi+G93
◧◩◪◨⬒⬓⬔⧯▣▦
165. rvnx+Po2[view] [source] [discussion] 2026-02-03 22:25:03
>>krisof+6j2
No need for nukes. France can issue an Interpol Red Notice for the arrest of Elon Musk, for whatever excuse is found.
◧◩◪
166. gruez+7p2[view] [source] [discussion] 2026-02-03 22:27:22
>>nieman+k52
>Sabu was put under pressure by the FBI, they threatened to place his kids into foster care.

>That was legal. Guess what, similar things would be legal in France.

lawfare is... good now? Between Trump being hit with felony charges for falsifying business records (lawfare is good?) and Lisa Cook getting prosecuted for mortgage fraud (lawfare is bad?), I honestly lost track at this point.

>The same way the president of the USA can order a Drone strike on a Taliban war lord, the president of France could order Musks plane to be escorted to Paris by 3 Fighter jets.

What's even the implication here? That they're going to shoot his plane down? If there's no threat of violence, what does the French government even hope to achieve with this?

replies(2): >>knallf+qq2 >>lcnPyl+by2
◧◩◪◨
167. rvnx+lp2[view] [source] [discussion] 2026-02-03 22:28:24
>>hiprob+Ao2
In France it's possible without legal consequences (though immoral): if you call 119, you can push to have a baby taken from a family for no reason except that you don't like someone.

Claim that you suspect there may be abuse, it will trigger a case for a "worrying situation".

Then it's a procedural lottery:

-> If you get lucky, they will investigate, meet the people, and dismiss the case.

-> If you get unlucky, they will take the baby, and only then, after a long investigation and with a "family assistant" checking on you every day, can you recover your baby.

Typically, ex-wife who doesn't like the ex-husband, but it can be a neighbor etc.

One worker explains that they don't really have time to investigate when processing reports: https://www.youtube.com/watch?v=VG9y_-4kGQA. They have to act very fast, and by default it is safer to remove the child from the family.

The boss of such an agency doesn't even take the time to answer the journalists there...

-> Example of such case (this man is innocent): https://www.lefigaro.fr/faits-divers/var-un-homme-se-mobilis...

but I can't blame them either, it's not easy to make the right calls.

replies(2): >>gf000+ar2 >>agoodu+1s2
◧◩◪◨
168. pyrale+mp2[view] [source] [discussion] 2026-02-03 22:28:26
>>int_19+kk2
Also, csam and pornographic content using the likeness of unwilling people. Grok’s recent shit was bound to have consequences.
replies(1): >>chrisj+8O2
◧◩◪◨⬒⬓
169. 5uppli+4q2[view] [source] [discussion] 2026-02-03 22:32:18
>>colech+0o2
> We're really near the point of no return and a lot of people don't seem to notice.

A lot of people are cheering it (some on this very site).

◧◩◪◨⬒
170. acdha+9q2[view] [source] [discussion] 2026-02-03 22:32:34
>>ameliu+xk2
That sounds awfully difficult to do perfectly without personally signing up for extra jail time for premeditated violation of local laws. Like in that scenario, any reference to the unsanitized file or a single employee breaking omertà is proof that your executives and IT staff conspired to violate the law in a way which is likely to ensure they want to prosecute as maximally as possible. Law enforcement around the world hates the idea that you don’t respect their authority, and when it slots into existing geopolitics you’d be a very tempting scapegoat.

Elon probably isn’t paying them enough to be the lightning rod for the current cross-Atlantic tension.

replies(1): >>ameliu+fs2
◧◩◪
171. xoxoli+cq2[view] [source] [discussion] 2026-02-03 22:32:41
>>nieman+k52
> We all forget that money is nice, but nation states have real power.

Interesting point. There's a top gangster who can buy anything in the prison commissary; and then there's the warden.

replies(1): >>hkpack+Qs2
◧◩◪
172. wmf+nq2[view] [source] [discussion] 2026-02-03 22:33:49
>>mschus+5n2
Once he launched the rockets he can do whatever he wants with the profit. And he wants to train Grok.
replies(1): >>stubis+Z63
◧◩◪◨
173. knallf+qq2[view] [source] [discussion] 2026-02-03 22:34:03
>>gruez+7p2
fighter jets ARE a threat of violence, and it is widely understood and acknowledged.

Again: the threat is so clear that you rarely have to execute on it.

replies(1): >>gruez+Ar2
◧◩◪◨⬒
174. pyrale+Hq2[view] [source] [discussion] 2026-02-03 22:35:34
>>dgxyz+wy1
An "audition en tant que témoin libre" is more or less the way for an investigation to give a chance to give their side of the story. Musk is not likely to be personally tried here.
◧◩
175. dathin+Sq2[view] [source] [discussion] 2026-02-03 22:36:13
>>ChrisM+9m2
the thing is, a lot of the recent legal proceedings surrounding X are about whether X fulfilled its legally required due diligence and, if not, what level of negligence we are talking about

and the thing about negligence which caused harm to humans (instead of, e.g., just financial harm) is that

a) you can't opt out of responsibility; it doesn't matter what you put into your TOS or other contracts

b) executives who are found responsible for the negligent actions of a company can be held _personally_ liable

and independent of what X actually did, Musk, as its highest-level executive, personally

1) frequently made statements that imply gross negligence (to be clear, that isn't necessarily how X acted, which is the actually relevant part)

2) claimed that all major engineering decisions etc. are his and no one else's (because he loves bragging about how good an engineer he is)

This means summoning him for questioning is, legally speaking, a must-have, independent of whether you expect him to show up or not. And he should probably take it seriously, even if that just means sending a different high-level X executive instead.

◧◩◪◨⬒⬓⬔⧯▣
176. wongar+3r2[view] [source] [discussion] 2026-02-03 22:37:24
>>Retric+OZ1
> But this administration has a limited shelf life, even hypothetically just under 3 years of immunity isn’t enough for comfort.

Depends on how much faith you have in the current administration. Russia limits presidents to two 6-year terms, yet Putin has been in power since 2000.

replies(1): >>klez+oR3
◧◩◪◨⬒
177. gf000+ar2[view] [source] [discussion] 2026-02-03 22:37:53
>>rvnx+lp2
I mean, that's surely not as simple as you make it out to be.
replies(2): >>rvnx+ys2 >>Normal+Ny2
178. sleepy+jr2[view] [source] 2026-02-03 22:38:55
>>vikave+(OP)
I guess this means that building the neverending 'deepfake CSAM on demand machine' was a bad idea.
replies(1): >>sleepy+uv6
◧◩
179. xdenni+pr2[view] [source] [discussion] 2026-02-03 22:39:21
>>verdve+Kv1
> lol, they summoned Elon for a hearing on 420

No. It's 20 April in the rest of the world: 204.

◧◩
180. sleepy+yr2[view] [source] [discussion] 2026-02-03 22:40:00
>>ChrisM+9m2
I guess he could just never enter the EU ever again. Maybe he can buy Little St. James.
replies(1): >>a_bett+Cs4
◧◩◪◨⬒
181. gruez+Ar2[view] [source] [discussion] 2026-02-03 22:40:03
>>knallf+qq2
>fighter jets ARE a threat of violence, and it is widely understood and acknowledged.

That's not a credible threat, because there's approximately a 0% chance France would actually follow through with it. Not even Trump would resort to murder to get rid of his domestic adversaries. As we've seen with the Fed, the best he could muster was some spurious prosecutions. France murdering someone would put it on par with Russia or India.

replies(4): >>ozim+qy2 >>anigbr+TK2 >>ricudi+Ia3 >>nieman+ko3
◧◩◪◨⬒
182. Brando+Qr2[view] [source] [discussion] 2026-02-03 22:41:19
>>ameliu+xk2
Nobody does that. It is either cooperation with law enforcement or remote lock (and then there are consequences for the in-country legal entity, probably not personally for the head but certainly for its existence).

This was a common action during the Russian invasion of Ukraine for companies that supported Ukraine and closed their operations in Russia.

◧◩
183. Psilli+Rr2[view] [source] [discussion] 2026-02-03 22:41:19
>>ta9000+hU1
CSAM in space! At least he isn’t reinventing the cross town bus.
◧◩◪◨⬒
184. agoodu+1s2[view] [source] [discussion] 2026-02-03 22:41:49
>>rvnx+lp2
I can't believe there's a country out there that has recreated the DMCA, but for child welfare
replies(2): >>Sanjay+Oz2 >>vinter+wH3
◧◩
185. eli+4s2[view] [source] [discussion] 2026-02-03 22:42:01
>>stickf+gv1
Why don't you think they have file cabinets and paper records?
◧◩◪◨⬒
186. gf000+6s2[view] [source] [discussion] 2026-02-03 22:42:06
>>mschus+xm2
> As we're seeing with the current US President

Well, that's particular to the US. It just shows that checks and balances are not properly implemented there; it's just that previous presidents weren't exploiting that maliciously for their own gain.

replies(1): >>direwo+G04
◧◩◪
187. yodsan+es2[view] [source] [discussion] 2026-02-03 22:42:43
>>alex11+qc2
Who knows who did what on this island, and I hope we'll figure it out. But in the meantime, going to this island and/or being friends with Epstein doesn't automatically make someone a pedo or rapist.
replies(4): >>jjkacz+Qt2 >>fatbir+8B2 >>father+BB2 >>tomloc+9I2
◧◩◪◨⬒⬓
188. ameliu+fs2[view] [source] [discussion] 2026-02-03 22:42:54
>>acdha+9q2
These days you can probably ask an LLM to redact the files for you, so expect more of it.
replies(1): >>acdha+VN2
◧◩◪
189. joshua+os2[view] [source] [discussion] 2026-02-03 22:43:53
>>mike-t+h82
Yes, Paris is an international capital and centrally located for Europe, the Middle East, and Africa. Many tech companies have sales offices there.
◧◩◪◨⬒⬓
190. rvnx+ys2[view] [source] [discussion] 2026-02-03 22:44:31
>>gf000+ar2
I've seen this happen in a harassment case; in one YouTube live the woman said:

    "today it's my husband to take care of him because sometimes my baby makes me angry that I want to kill him"
but she was saying it normally, like any normal person does when they are angry.

-> Whoops: someone called 119 to report a "worrying" situation, and the baby was removed. It's been two years already.

There are some non-profits fighting against this: https://lenfanceaucoeur.org/quest-ce-que-le-placement-abusif...

That being said, it's obviously a very small percentage, let's not exaggerate, but it's quite insidious.

191. jongjo+Is2[view] [source] 2026-02-03 22:45:09
>>vikave+(OP)
Once you've worked long enough in the software industry, you start to understand it's all just a fully planned economy.
◧◩◪
192. SpaceM+Js2[view] [source] [discussion] 2026-02-03 22:45:12
>>nieman+k52
> Sabu was put under pressure by the FBI, they threatened to place his kids into foster care.

This is pretty messed up btw.

Child welfare systems in the USA are very messed up. It is not uncommon for minority families to lose the right to parent their children over very innocuous things that would not happen to a non-oppressed class.

It is just another way for the justice/legal system to pressure families that have not been convicted / penalized under the supervision of a court.

And this isn't the only lever they use.

Every time I read crap like this I just think of Aaron Swartz.

replies(1): >>pastag+vw2
◧◩◪◨
193. hkpack+Qs2[view] [source] [discussion] 2026-02-03 22:46:11
>>xoxoli+cq2
No, the state decides the rules of the game any business plays by.
replies(1): >>arijun+ly2
◧◩◪◨⬒
194. andrew+Ts2[view] [source] [discussion] 2026-02-03 22:46:18
>>alex11+Yj2
He was planning way ahead, like a real genius.
◧◩◪◨
195. jjkacz+Qt2[view] [source] [discussion] 2026-02-03 22:51:43
>>yodsan+es2
Neither does your wife divorcing you at about the same time things started to go through legal process...

Oops... yeah, in retrospect it was even worse... no... you can and should be judged by the friends you keep and hang-out with... The same ones who seem to be circling the wagons with innocuous statements or attempts to find other scapegoats (DARVO)... hmm, what was that quote again:

"We must all hang together or we will all hang separately"

◧◩◪◨
196. kshack+mu2[view] [source] [discussion] 2026-02-03 22:54:29
>>ChrisM+Kg2
From what I hear, Ma made one speech critical of the government and Xi showed him his place. It took a few years: a year of total disappearance followed by slow rehabilitation.

But China is different. Not sure most of western europe will go that far in most cases.

replies(1): >>Sanjay+kA2
◧◩◪
197. cm2187+Eu2[view] [source] [discussion] 2026-02-03 22:56:39
>>beart+Dx1
Most enterprises have fully encrypted workstations, when they don't use VMs where the desktop is just a thin client that doesn't store any data. So there should be really nothing of interest in the office itself.
replies(1): >>direwo+M04
◧◩◪
198. projek+qv2[view] [source] [discussion] 2026-02-03 23:00:26
>>nieman+k52
Wait, Sabu's kids were foster kids. He was fostering them. Certainly if he went to jail, they'd go back to the system.

I mean, if you're a sole caretaker and you've been arrested for a crime, and the evidence looks like you'll go to prison, you're going to have to decide what to do with the care of your kids on your mind. I suppose that would pressure you to become an informant instead of taking a longer prison sentence, but there's pressure to do that anyway, like not wanting to be in prison for a long time.

◧◩
199. ChuckM+ew2[view] [source] [discussion] 2026-02-03 23:05:06
>>stickf+gv1
Sadly the media calls the lawful use of a warrant a 'raid' but that's another issue.

The warrant will have detailed what it is they are looking for, French warrants (and legal system!) are quite a bit different than the US but in broad terms operate similarly. It suggests that an enforcement agency believes that there is evidence of a crime at the offices.

As a former IT/operations guy I'd guess they want on-prem servers with things like email and shared storage, stuff that would hold internal discussions about the thing they were interested in, but that is just my guess based on the article saying this is related to the earlier complaint that Grok was generating CSAM on demand.

replies(3): >>chrisj+nK2 >>vinter+fG3 >>termin+pX3
◧◩◪◨
200. pastag+vw2[view] [source] [discussion] 2026-02-03 23:06:23
>>SpaceM+Js2
One can also say we do too little for children who get mistreated. Taking care of other people's children is never easy: the decision needs to be fast and effective, and no one wants to be the one who ends it, because there are those rare cases where children die after a reunion with their parents.
◧◩◪
201. cadams+7y2[view] [source] [discussion] 2026-02-03 23:17:08
>>nieman+k52
Yes but using such power unscrupulously is a great way to lose it.
◧◩◪◨
202. lcnPyl+by2[view] [source] [discussion] 2026-02-03 23:17:27
>>gruez+7p2
> lawfare is... good now?

Well, when everything is lawfare it logically follows that it won't always be good or always be bad. It seems Al Capone being taken down for tax fraud would similarly be lawfare by these standards, or am I missing something? Perhaps lawfare (sometimes referred to as "prosecuting criminal charges", as far as I can tell, given this context) is just in some cases and unjust in others.

◧◩◪◨⬒
203. arijun+ly2[view] [source] [discussion] 2026-02-03 23:18:33
>>hkpack+Qs2
I think both you and the comment you're replying to agree with the gp.
◧◩◪◨⬒⬓
204. ozim+qy2[view] [source] [discussion] 2026-02-03 23:18:50
>>gruez+Ar2
Don’t forget that the captain of the plane makes the decisions, not Elon.

If the captain of the plane disobeyed a direct threat like that from a nation state, his career would be over. Yeah, Elon might throw money at him, but that guy would most likely never be allowed to fly near French territory again. I'd guess the whole cabin crew as well.

Being cleared to fly anywhere in the world is their livelihood.

It would be quite stupid to lose it, like a truck driver getting his license revoked for a DUI.

replies(1): >>gruez+jD2
◧◩◪◨⬒⬓
205. Normal+Ny2[view] [source] [discussion] 2026-02-03 23:21:42
>>gf000+ar2
It's not.

If you call 119 it gets assessed and potentially forwarded to the right department, which then assesses it again and might (quite likely will) trigger an inspection. The people who turn up have broad powers to seize children from the home in order to protect them from abuse.

In general this works fine. Unfortunately in some circumstances this does give a very low skilled/paid person (the inspector) a lot of power, and a lot of sway with judges. If this person is bad at their job for whatever reason (incompetence/malice) it can cause a lot of problems. It is very hard to prove a person like this wrong when they are covering their arse after making a mistake.

afaik similar systems are present in most western countries, and many of them - like France - are struggling for funding and are likely cutting in the wrong place (audit/rigour) to meet external KPIs. One of the worst ways this manifests is in 'quick scoring' methods, which can end up ranking misunderstandings (e.g. someone said something they didn't mean) very highly, but subtle evidence of abuse moderate to low.

So while this is a concern, this is not unique to France, this is relatively normal, and the poster is massively exaggerating the simplicity.

replies(2): >>Michae+zJ2 >>belorn+BK2
◧◩◪◨⬒⬓⬔⧯
206. myko+Zy2[view] [source] [discussion] 2026-02-03 23:22:37
>>ronsor+T62
There's no difference between a strike like that and the strikes against fishing boats near Venezuela that Trump has ordered
◧◩◪◨⬒⬓⬔⧯▣▦
207. JumpCr+nz2[view] [source] [discussion] 2026-02-03 23:25:18
>>krisof+6j2
> nukes are not magic. Explain to me how you imagine the series of events where Paris uses their nukes to get the USA to extradite Elon to Paris

Paris doesn’t need to back down. And it can independently exert effort in a way other European countries can’t. Musk losing Paris means swearing off a meaningful economic and political bloc.

◧◩◪◨⬒⬓
208. Sanjay+Oz2[view] [source] [discussion] 2026-02-03 23:27:21
>>agoodu+1s2
Canada and Germany are no different.

[0] https://www.cbc.ca/news/canada/manitoba/winnipeg-mom-cfs-bac...

[1] https://indianexpress.com/article/india/ariha-family-visit-t...

◧◩◪◨⬒
209. Sanjay+kA2[view] [source] [discussion] 2026-02-03 23:30:22
>>kshack+mu2
Trump kidnapped Maduro to show the latter his place, but then the US is neither China nor Western Europe so that does not count.
replies(1): >>almost+MD2
◧◩◪◨
210. fatbir+8B2[view] [source] [discussion] 2026-02-03 23:34:33
>>yodsan+es2
No, but they all knew he was a pedo/rapist, and were still friends with him and went to the island of a pedo/rapist, and introduced the pedo/rapist to their friends...

We don't know how many were pedo/rapists, but we know all of them liked to socialize with one and trade favours and spread his influence.

◧◩◪◨
211. father+BB2[view] [source] [discussion] 2026-02-03 23:36:53
>>yodsan+es2
As part of the irrational mob that is out to find the witch, you are just being too rational. Down vote!
replies(2): >>anigbr+jT2 >>direwo+y34
212. lukasm+oC2[view] [source] 2026-02-03 23:40:14
>>vikave+(OP)
This is a show of resolve.

"Uh guys, little heads up: there are some agents of federal law enforcement raiding the premises, so if you see that. That’s what that is."

◧◩◪◨⬒⬓⬔
213. gruez+jD2[view] [source] [discussion] 2026-02-03 23:44:30
>>ozim+qy2
>Don’t forget that captain of the plane makes decisions not Elon.

>If captain of the plane disobeyed direct threat like that from a nation, his career is going to be limited. Yeah Elon might throw money at him but that guy is most likely never allowed again to fly near any French territory. I guess whole cabin crew as well .

Again, what's France trying to do? Refuse entry to France? Why do they need to threaten shooting down his jet for that? Just harassing/pranking him (eg. "haha got you good with that jet lmao")?

replies(1): >>reveri+6v3
◧◩◪◨
214. direwo+yD2[view] [source] [discussion] 2026-02-03 23:45:51
>>robthe+WC
Public institutions can use any system they want and make the public responsible for reading it.
replies(1): >>direwo+l34
◧◩◪◨⬒⬓
215. almost+MD2[view] [source] [discussion] 2026-02-03 23:47:23
>>Sanjay+kA2
Arrested, and the vast majority of Venezuelans love that it happened.

https://www.cbsnews.com/miami/news/venezuela-survey-trump-ma...

replies(5): >>Sanjay+qG2 >>tyre+BN2 >>wander+re3 >>isr+Tn3 >>TitaRu+ps5
◧◩◪
216. direwo+UD2[view] [source] [discussion] 2026-02-03 23:48:28
>>tokai+Ll
if pictures are speech, then either CSAM is speech, or you have to justify an exception to the general rule.

CSAM is banned speech.

replies(1): >>psycho+LA4
◧◩◪◨⬒
217. camina+0E2[view] [source] [discussion] 2026-02-03 23:48:57
>>pyrale+xn2
Maybe.

Or they had a weak case. Prosecutors even drop winnable cases because they don't want to lose.

replies(1): >>pyrale+vq3
◧◩◪◨
218. direwo+9E2[view] [source] [discussion] 2026-02-03 23:50:03
>>vessen+NQ
In some shady corners of the internet I still see advertisements for child porn through Telegram, so they must be doing a shit job at it
◧◩◪◨
219. direwo+fE2[view] [source] [discussion] 2026-02-03 23:50:47
>>vessen+BR
They were downvoted for completely misunderstanding the comment they replied to.
◧◩◪
220. direwo+iE2[view] [source] [discussion] 2026-02-03 23:51:02
>>derrid+kf
Telegram isn't encrypted. For all the marketing about security, it has none apart from TLS and an optional "secret chat" feature that you have to explicitly select, which only works between two participants and doesn't work very well.

They can read all messages, so they have no excuse for not helping in a criminal case. Their platform had a reputation of being safe for crime, which is because they just... ignored the police. Until they got arrested for that. They still turn a blind eye, but not to the police.

replies(1): >>derrid+E73
◧◩◪◨⬒⬓⬔
221. camina+xF2[view] [source] [discussion] 2026-02-03 23:57:17
>>fragme+sm2
Who knows.

I don't see aggressive compliance defined anywhere. Violent agreement has definitions, but it feels like it's best defined as a consulting buzzword.

◧◩◪◨⬒⬓⬔
222. Sanjay+qG2[view] [source] [discussion] 2026-02-04 00:02:00
>>almost+MD2
Rand Paul asked Rubio what would happen if the shoe was on the other foot. Every US President from Truman onwards is a war criminal.

https://www.tampafp.com/rand-paul-and-marco-rubio-clash-over...

replies(2): >>foolse+UR2 >>pyrale+Qx3
◧◩◪◨⬒
223. scotty+iH2[view] [source] [discussion] 2026-02-04 00:06:56
>>int_19+Wj2
Everyone defines their own moral code and trusts that more than the laws of the land. Don't tell me you've never gone over the speed limit, or broken one of the hundreds of crazy laws people break in everyday life out of ignorance.
replies(1): >>reveri+Iw3
◧◩◪◨
224. tomloc+9I2[view] [source] [discussion] 2026-02-04 00:12:08
>>yodsan+es2
Yes yes such a complex situation and so hard to tell whether the guy with the pedo non-con site wanted to go to the pedo non-con island.
◧◩
225. anigbr+QI2[view] [source] [discussion] 2026-02-04 00:17:12
>>stickf+gv1
They do have some physical records, but it would be mostly investigators producing a warrant and forcing staff to hand over administrative credentials to allow forensic data collection.
replies(1): >>chrisj+yM2
◧◩◪
226. chrisj+oJ2[view] [source] [discussion] 2026-02-04 00:20:31
>>arppac+vX1
> External analysts said Grok was generating a CSAM image every minute!!

> https://www.washingtonpost.com/technology/2026/02/02/elon-mu...

That article has no mention of CSAM. As expected, since you can bet the Post has lawyers checking.

◧◩◪◨⬒⬓⬔
227. Michae+zJ2[view] [source] [discussion] 2026-02-04 00:21:19
>>Normal+Ny2
“ If this person is bad at their job for whatever reason (incompetence/malice) it can cause a lot of problems. It is very hard to prove a person like this wrong when they are covering their arse after making a mistake.”

This seems guaranteed to occur every year then… since incompetence/malice will happen eventually with thousands upon thousands of cases?

replies(1): >>chrisj+UL2
◧◩◪
228. chrisj+nK2[view] [source] [discussion] 2026-02-04 00:25:45
>>ChuckM+ew2
> I'd guess they want on-prem servers with things like email and shared storage

For a net company in 2026? Fat chance.

replies(2): >>ChuckM+mN2 >>Barrin+EW2
◧◩◪◨⬒⬓⬔
229. belorn+BK2[view] [source] [discussion] 2026-02-04 00:26:58
>>Normal+Ny2
In Sweden there is an additional review board that goes through the decisions made by the inspector. The idea is to limit the power a single inspector has. In practice, however, the review board tends to rubber-stamp decisions, so incompetence/malice still happens.

There was a huge mess right after MeToo when an inspector went against the court's rulings. The court had given the father sole custody in an extremely messy divorce, and the inspector did not agree with the decision. As a result they removed the child from the father, in direct contradiction of the court's decision, and put the child through 6 years of isolation and abuse with no access to school. It took investigative journalists a while, but the result of the case being highlighted in the media was that the inspector and supervisor have now been fired, with two additional workers under investigation for severe misconduct. Four more workers would be under investigation, but too much time has passed. The review board should have prevented this, as should the inspector's supervisor, but those safety nets failed in this case, in part because of the cultural environment at the time.

replies(1): >>tomp+ra4
◧◩◪◨⬒⬓
230. anigbr+TK2[view] [source] [discussion] 2026-02-04 00:29:04
>>gruez+Ar2
I think the implication of the fighter jets is that they force the plane to land within a particular jurisdiction (where he is then arrested) rather than allowing it to just fly off to somewhere else. Similar to the way that a mall security guard might arrest a shoplifter; the existence of security guards doesn't mean the mall operators are planning to murder you.
replies(1): >>zzrrt+FP2
◧◩◪
231. chrisj+ZK2[view] [source] [discussion] 2026-02-04 00:30:04
>>nieman+k52
> Gather evidence against employees

I'm sure they have much better and quieter ways to do that.

Whereas a raid is #1 choice for max volume...

◧◩◪◨
232. anigbr+GL2[view] [source] [discussion] 2026-02-04 00:34:03
>>ramuel+XH1
If you're a database administrator or similar working at X in France, are you going to go to jail to protect Musk from police with an appropriate warrant for access to company data? I doubt it.
◧◩◪
233. digiow+ML2[view] [source] [discussion] 2026-02-04 00:34:20
>>ronsor+KB1
Or they just connect to a mothership with keys on the machine. The authorities can have the keys, but alas, they're useless now, because there is some employee watching the surveillance cameras in the US, and he pressed a red button revoking all of them. What part of this is illegal?

Obviously, the government can just threaten to fine you any amount, close operations or whatever, but your company can just decide to stop operating there, like Google after Russia imposed an absurd fine.

replies(1): >>anigbr+mO2
◧◩◪◨⬒⬓⬔⧯
234. chrisj+UL2[view] [source] [discussion] 2026-02-04 00:34:56
>>Michae+zJ2
> This seems guaranteed to occur every year then…

Not at all. This job will go to an "AI" any moment now.

/i

◧◩
235. mkouba+YL2[view] [source] [discussion] 2026-02-04 00:35:08
>>scotty+JL1
Governments prosecute violations of laws in ways that suit their interest. News at 11
◧◩◪
236. chrisj+yM2[view] [source] [discussion] 2026-02-04 00:39:38
>>anigbr+QI2
> forcing staff to hand over administrative credentials to allow forensic data collection.

What, thinking HQ wouldn't cancel them?

replies(1): >>anigbr+YU2
◧◩◪◨
237. chrisj+QM2[view] [source] [discussion] 2026-02-04 00:41:37
>>Brando+qV1
> but then the company will be closed in the country. Whether this matters for the mothership is another story.

Elon would love it. So it won't happen.

◧◩◪◨⬒⬓⬔⧯
238. anigbr+ZM2[view] [source] [discussion] 2026-02-04 00:42:31
>>ronsor+T62
People were surprised when the US started just droning boats in the Caribbean and wiping out survivors, but then the government explained that it was law enforcement and not terrorism or piracy, so everyone stopped worrying about it.

Seriously, every powerful state engages in state terrorism from time to time because they can, and the embarrassment of discovery is weighed against the benefit of eliminating a problem. France is no exception : https://en.wikipedia.org/wiki/Sinking_of_the_Rainbow_Warrior

◧◩◪◨
239. ChuckM+mN2[view] [source] [discussion] 2026-02-04 00:44:47
>>chrisj+nK2
Agreed, it's a stretch. My experience comes from Google: when I worked there they set up a Chinese office, and they were very careful to avoid anything on premises that could be searched/exploited. It was a huge effort, one that wasn't made for the European and UK offices, where the government was not an APT. So did X have that level of hygiene in France? Were there IT guys in the same vein as the folks Elon recruited into DOGE? Was everyone in the office "loyal"?[1] I doubt X was paranoid "enough" in France not to have some leakage.

[1] This was also something Google did: changing access rights for people in the China office who were not 'vetted' (for some definition of vetted), on the theory that they could be an exfiltration risk. Imagine a DGSE agent under cover as an X employee who carefully puts a bunch of stuff on a server in the office (without triggering IT controls) and then lets the prosecutors know it's ready, and they serve the warrant.

replies(1): >>direwo+XZ3
◧◩◪◨⬒⬓⬔
240. tyre+BN2[view] [source] [discussion] 2026-02-04 00:46:28
>>almost+MD2
I mean, come on, we kidnapped him. Yes, he was arrested, but we went into another sovereign nation with special forces and yoinked their head of state back to Brooklyn.
replies(3): >>mrkstu+Z93 >>ImJama+9c3 >>vinter+3H3
◧◩◪◨⬒⬓⬔
241. acdha+VN2[view] [source] [discussion] 2026-02-04 00:48:06
>>ameliu+fs2
True, but that’s going to be a noisy process until there are a few theoretical breakthroughs. I personally would not leave myself legally on the hook hoping that Grok faked something hermetically.
◧◩◪◨⬒
242. chrisj+8O2[view] [source] [discussion] 2026-02-04 00:49:33
>>pyrale+mp2
If the French suspected Grok/X of something as serious as CSAM, you can bet they would have mentioned it in their statement. They didn't. Porn, they did.
replies(1): >>pyrale+bq3
◧◩◪
243. tyre+bO2[view] [source] [discussion] 2026-02-04 00:49:58
>>nebula+Xl2
The merger most likely happened now because they have to do it before the IPO. After the IPO, there’s a whole process forcing independent evaluation and negotiation between the two boards / executive teams, which would be an absolute dumpster fire given that Musk controls both.

When they’re both private, fine, whatever.

replies(1): >>justab+SP2
◧◩◪◨
244. anigbr+mO2[view] [source] [discussion] 2026-02-04 00:50:48
>>digiow+ML2
You know police are not all technically clueless, I hope. The French have plenty of experience dealing with terrorism, cybercrime, and other modern problems as well as the more historical experience of being conquered and occupied, I don't think it's beyond them to game out scenarios like this and preempt such measures.

As France discovered the hard way in WW2, you can put all sorts of rock-solid security around the front door only to be surprised when your opponent comes in through the window.

◧◩◪◨⬒⬓⬔
245. zzrrt+FP2[view] [source] [discussion] 2026-02-04 00:58:15
>>anigbr+TK2
Guards can plausibly arrest you without seriously injuring you. But according to https://aviation.stackexchange.com/a/68361 there are no safe options if the pilot really doesn’t want to comply, so there is no “forcing” a plane to land somewhere, just making it very clear that powerful people really want you to stop and might be able to give more consequences on the ground if you don’t.
replies(2): >>anigbr+LU2 >>arcolo+j23
246. justab+HP2[view] [source] 2026-02-04 00:58:25
>>vikave+(OP)
This sort of thing will be great for the SpaceX IPO :/
replies(1): >>stubis+F73
◧◩◪◨
247. justab+SP2[view] [source] [discussion] 2026-02-04 00:59:30
>>tyre+bO2
The first thing a public SpaceX would want to do is sell off all the non-SpaceX crap
replies(1): >>tyre+D76
◧◩◪◨⬒⬓⬔⧯
248. foolse+UR2[view] [source] [discussion] 2026-02-04 01:12:45
>>Sanjay+qG2
The people of the US mostly wouldn't like it; the people of VZ mostly did, and consider Maduro a thug who lost and stayed in power, not their president. Ideologues like Paul have trouble with exceptions to their worldview.
replies(2): >>Sanjay+uc3 >>MYEUHD+Ke3
◧◩◪◨⬒
249. anigbr+jT2[view] [source] [discussion] 2026-02-04 01:23:20
>>father+BB2
It's odd to be so prim about someone who is notorious for irrational trolling for the sake of mob entertainment.

https://www.theguardian.com/technology/2018/jul/15/elon-musk...

◧◩◪◨⬒⬓⬔⧯
250. anigbr+LU2[view] [source] [discussion] 2026-02-04 01:33:17
>>zzrrt+FP2
I suspect fighter pilots are better than commercial pilots at putting their much-higher-spec aircraft so uncomfortably close that your choices narrow down to complying with their landing instructions or suicidally colliding with one - in which case the fighter has an ejector seat and you don't.
replies(1): >>zzrrt+sh3
251. darepu+UU2[view] [source] 2026-02-04 01:34:18
>>vikave+(OP)
I remember encountering questionable hentai material (by accident) back in the Twitter days. But back then Twitter was a leftist darling.
replies(4): >>nemoma+bV2 >>techbl+ZY2 >>fumar+Ad3 >>direwo+M24
◧◩◪◨
252. anigbr+YU2[view] [source] [discussion] 2026-02-04 01:34:26
>>chrisj+yM2
I'm sure an intelligent person such as yourself can think of ways around that possibility.
replies(1): >>chrisj+KW2
◧◩
253. nemoma+bV2[view] [source] [discussion] 2026-02-04 01:35:44
>>darepu+UU2
I think there's a difference between "user uploaded material isn't properly moderated" and "the sites own chatbot generates porn on request based on images of women who didn't agree to it", no?
replies(2): >>nailer+Jb3 >>tick_t+gX3
◧◩
254. tjpnz+oW2[view] [source] [discussion] 2026-02-04 01:44:49
>>pogue+N1
It's also a massive problem on Meta. Hopefully this action isn't just a one-off.
replies(1): >>direwo+G34
◧◩◪◨
255. Barrin+EW2[view] [source] [discussion] 2026-02-04 01:46:55
>>chrisj+nK2
Under GDPR, if a company processes European user data, they're obligated to make a "Record of Processing Activities" available on demand (an umbrella term for a whole bunch of user-data / identity-related stuff). They don't necessarily need to store the records onsite, but they need to be able to produce them. Saying you're an internet company doesn't mean you can just put the stuff on a server in the Caribbean and shrug when the regulators come knocking on your door.

That's aside from the fact that they're a publicly traded company under obligation to keep a gazillion records anyway like in any other jurisdiction.

replies(2): >>chrisj+BX2 >>derwik+am3
◧◩◪◨⬒
256. chrisj+KW2[view] [source] [discussion] 2026-02-04 01:47:49
>>anigbr+YU2
Nope. But I'm sure a more intelligent person such as yourself can tell me! :)
◧◩◪◨⬒
257. chrisj+BX2[view] [source] [discussion] 2026-02-04 01:52:48
>>Barrin+EW2
> They don't necessarily need to store them onsite but they need to be able to produce them.

... within 30 days, right? The longest "raid" in history.

◧◩◪◨
258. Aurorn+LX2[view] [source] [discussion] 2026-02-04 01:53:30
>>charci+oa2
> Sure it might be on the device, but they would need a password to decrypt the laptop's storage to get any of the data.

In these situations, refusing to provide those keys or passwords is an offense.

The employees who just want to do their job and collect a paycheck aren’t going to prison to protect their employer by refusing to give the password to their laptop.

The teams that do this know how to isolate devices to avoid remote kill switches. If someone did throw a remote kill switch, that’s destruction of evidence and a serious crime by itself. Again, the IT guy isn’t going to risk prison to wipe company secrets.

◧◩
259. techbl+ZY2[view] [source] [discussion] 2026-02-04 02:03:05
>>darepu+UU2
Did you report it or just let it continue doing harm?
◧◩◪◨
260. f30e3d+023[view] [source] [discussion] 2026-02-04 02:24:48
>>vessen+NQ
"I've been told by FBI agents that they believe assassination markets are legal in the US - protected speech."

I don't believe you. Not sure what you mean by "assassination markets" exactly, but "Solicitation to commit a crime of violence" and "Conspiracy to murder" are definitely crimes.

replies(1): >>vessen+Ct3
◧◩◪◨⬒⬓⬔⧯
261. arcolo+j23[view] [source] [discussion] 2026-02-04 02:27:33
>>zzrrt+FP2
Planes are required to comply with instructions; if they don't they're committing a serious crime and the fighters are well within their international legal framework to shoot the plane down. They would likely escalate to a warning shot with the gun past the cockpit, and if the aircraft is large enough they might try to shoot out one engine instead of the wing or fuselage.
◧◩◪◨⬒⬓
262. thauma+X33[view] [source] [discussion] 2026-02-04 02:42:18
>>LightB+Vh2
I assume in France international stoners' day falls on the 4th of Duodevigintiber.
◧◩◪◨⬒
263. Findec+c63[view] [source] [discussion] 2026-02-04 03:01:00
>>Brando+QW1
I believe the French format the date 20/4 ... and the time 16 h 20
◧◩◪◨
264. stubis+Z63[view] [source] [discussion] 2026-02-04 03:07:58
>>wmf+nq2
Money comes with strings: when forming an ongoing relationship with a company, you expect them not to merge with other companies you are actively prosecuting. I suspect the deal is moving so fast to avoid some sort of veto being prepared. Once SpaceX and xAI are officially the same, you lose the ability to inflict meaningful penalties on xAI without penalizing yourself as an active business partner of SpaceX.
replies(1): >>direwo+p24
◧◩◪◨
265. derrid+E73[view] [source] [discussion] 2026-02-04 03:14:07
>>direwo+iE2
ok thank you! I did not know that, I'm ashamed to admit! Sort of like studying physics at university and then, a decade later, forgetting V=IR when I actually needed it for a solar install. I took a "technical hiatus" of about 5 years and am only recently coming back.

Anyway, to cut to the chase: I just checked out Matthew Green's post on the subject. He is on my default "trust what he says about cryptography" list, along with others like djb and Nadia Heninger.

Embarrassed to say I did not realise. I should have known! 10+ years ago I used to lurk in the IRC dev channels of every relevant cypherpunk project, including TextSecure and otr-chat. I saw Signal being made, and before that witnessed chats with the devs and Ian Goldberg and so on. I just assumed Telegram was multiparty OTR.

OOPS!

Long-winded post because that is embarrassing, as someone who studied cryptography in a 2009 mathematics undergrad, did a 2010 postgrad wargames and computer security course, and, worse, whose word around 2012-2013 was taken on these matters by activists, journalists, and researchers with pretty gnarly threat models (for instance, some Guardian stories and a former researcher into torture). I'm also the person who wrote the bits of 'how to hold a crypto party' that made it a protocol without an organisation, and made clear that the threat model was that anyone could be there. Oops, oops, oops.

Yes, thanks for letting me know. I hang my head in shame for missing that one, or somehow believing it without much investigation. Thankfully it was just my own personal use, to contact friends in the States who aren't already on Signal and the like.

EVERYONE: DON'T TRUST TELEGRAM AS END-TO-END ENCRYPTED CHAT https://blog.cryptographyengineering.com/2024/08/25/telegram...

Anyway, as they say, "use it or lose it": my assumptions here are no longer valid, and I can't claim an educated opinion after getting something that basic wrong.


◧◩
266. stubis+F73[view] [source] [discussion] 2026-02-04 03:14:34
>>justab+HP2
Especially if contracts with SpaceX start being torn up because the various ongoing investigations and prosecutions of xAI are now ongoing investigations and prosecutions of SpaceX. And next new lawsuits for creating this conflict of interest by merger.
◧◩◪◨
267. ricudi+G93[view] [source] [discussion] 2026-02-04 03:33:34
>>hiprob+Ao2
I heard there's a country where they can even SWAT you out of existence with a simple phone call, but it sounds so outrageous this must be some evil communist dictatorship third-world place. I really don't remember.
◧◩◪◨⬒⬓⬔⧯
268. mrkstu+Z93[view] [source] [discussion] 2026-02-04 03:36:25
>>tyre+BN2
To be fair, he isn't a legitimate head of state: he lost an election, is officially recognized as a usurper, and the US had the support of those who actually won.
replies(3): >>platev+pc3 >>Sanjay+sY3 >>direwo+p04
◧◩◪◨⬒⬓
269. ricudi+Ia3[view] [source] [discussion] 2026-02-04 03:42:38
>>gruez+Ar2
> Not even Trump would resort to murder to get rid of his domestic adversaries

Don't give them ideas

◧◩◪
270. nailer+Jb3[view] [source] [discussion] 2026-02-04 03:53:01
>>nemoma+bV2
But it doesn’t. Grok has always had aggressive filters on sexual content, just like every other generative AI tool.

People have found exploits, just like with every other generative AI tool.

◧◩◪◨⬒⬓⬔⧯
271. ImJama+9c3[view] [source] [discussion] 2026-02-04 03:58:17
>>tyre+BN2
He is not a legitimate head of state. He lost the election.
◧◩◪◨⬒⬓⬔⧯▣
272. platev+pc3[view] [source] [discussion] 2026-02-04 04:00:46
>>mrkstu+Z93
Large numbers of people call Joe Biden's election illegitimate. You could even say that's the official position of the current government. Would his kidnapping by a foreign nation be okay with you too?
◧◩◪◨⬒⬓⬔⧯▣
273. Sanjay+uc3[view] [source] [discussion] 2026-02-04 04:02:34
>>foolse+UR2
Ah, the "rules based disorder" on display: we do dis, you no do dis.

Hypocrisy at its finest.

◧◩
274. bawolf+Ec3[view] [source] [discussion] 2026-02-04 04:03:26
>>Altern+ut
> and no crime was prevented by harassing local workers.

Seizing records is usually a major step in an investigation. It's how you get evidence.

Sure, it could just be harassment, but this is also what normal police work looks like. France has a reasonable judicial system, so absent other evidence I'm inclined to believe this was legit.

◧◩
275. fumar+Ad3[view] [source] [discussion] 2026-02-04 04:12:14
>>darepu+UU2
Define "leftist" for back in the Twitter days? I used Twitter early after its release. I don't recall it being a faction-specific platform.
replies(1): >>reveri+mx3
◧◩◪◨
276. bawolf+Bd3[view] [source] [discussion] 2026-02-04 04:12:24
>>cwillu+Ft
Wouldn't surprise me, but they would have to be very incompetent to say that outside of an attorney-client privileged convo.

OTOH, it is Musk.

replies(1): >>cwillu+Ud9
◧◩◪◨⬒⬓⬔
277. wander+re3[view] [source] [discussion] 2026-02-04 04:21:20
>>almost+MD2
According to USA sources, USA actions are universally approved.

Color me surprised.

◧◩◪◨⬒⬓⬔⧯▣
278. MYEUHD+Ke3[view] [source] [discussion] 2026-02-04 04:24:12
>>foolse+UR2
> the people of VZ mostly did and consider Maduro a thug who lost and stayed in power not their president.

You got this information from American media (or their allies')

In reality, Venezuelans flooded the streets in marches demanding the return of their president.

replies(1): >>termin+HX3
◧◩◪
279. trhway+Ve3[view] [source] [discussion] 2026-02-04 04:25:46
>>moolco+Du
Internet routers, network cards, computers, operating systems and various application software have no guardrails and are used for all sorts of nefarious things. Why aren't those companies raided?
replies(3): >>trotha+5i3 >>sirnic+ki3 >>bluesc+dD3
280. isodev+4f3[view] [source] 2026-02-04 04:27:27
>>vikave+(OP)
Good, and honestly it’s high time. There used to be a time when we could give corps the benefit of the doubt, but that time is clearly over. Beyond the CSAM, X is a cesspool of misinformation and generally the worst examples of humanity.
replies(1): >>mekdoo+qN4
◧◩◪◨⬒⬓⬔⧯▣
281. zzrrt+sh3[view] [source] [discussion] 2026-02-04 04:54:23
>>anigbr+LU2
I felt like you ruled out collision when you said they're not going to murder, though, granted, an accidental but predictable collision after repeatedly refusing orders is not exactly murder. I think the point stands, they have to be willing to kill or to back down, and as others said I'm skeptical France or similar countries would give the order for anything short of an imminent threat regarding the plane's target. If Musk doesn't want to land where they want him to, he's going to pay the pilot whatever it takes, and the fighter jets are going to back off because whatever they want to arrest him for isn't worth an international incident.
◧◩◪◨
282. trotha+5i3[view] [source] [discussion] 2026-02-04 04:59:06
>>trhway+Ve3
Don't forget polaroid in that.
◧◩◪◨
283. sirnic+ki3[view] [source] [discussion] 2026-02-04 05:00:34
>>trhway+Ve3
This is like comparing the danger of a machine gun to that of a block of lead.
replies(1): >>trhway+Yv3
284. miki12+dl3[view] [source] 2026-02-04 05:31:41
>>vikave+(OP)
This vindicates the pro-AI censorship crowd I guess.

It definitely makes it clear what is expected of AI companies. Your users aren't responsible for what they use your model for, you are, so you'd better make sure your model can't ever be used for anything nefarious. If you can't do that without keeping the model closed and verifying everyone's identities... well, that's good for your profits I guess.

replies(11): >>culi+Yn3 >>themaf+Ap3 >>Jordan+1t3 >>popalc+5u3 >>mnewme+FE3 >>madeof+JE3 >>gordia+xT3 >>direwo+jY3 >>code_f+jr4 >>keepam+3x4 >>comman+845
◧◩◪◨⬒
285. derwik+am3[view] [source] [discussion] 2026-02-04 05:41:54
>>Barrin+EW2
> publicly traded company

Which company is publicly traded?

◧◩◪
286. pdpi+cm3[view] [source] [discussion] 2026-02-04 05:42:13
>>moolco+Du
I'm of two minds about this.

One the one hand, it seems "obvious" that Grok should somehow be legally required to have guardrails stopping it from producing kiddie porn.

On the other hand, it also seems "obvious" that laws forcing 3D printers to detect and block attempts to print firearms are patently bullshit.

The thing is, I'm not sure how I can reconcile those two seemingly-obvious statements in a principled manner.

replies(5): >>_tramp+1o3 >>watwut+FD3 >>beAbU+WN3 >>muyuu+yd4 >>ytpete+h66
287. utopia+mm3[view] [source] 2026-02-04 05:44:35
>>vikave+(OP)
To people claiming a physical raid is pointless from the point of gathering data :

- you are thinking about a company doing things the right way. You are thinking about a company abiding by the law, storing data on its own servers, having good practices, etc.

The moment a company starts to do dubious stuff, good practices go out the window. People write emails with cryptic analogies, people start deleting emails... Then, as the circumventions become more numerous and complex, there still needs to be a trail for things to remain understandable. That trail will exist in written form somehow, and it must be hidden. It might be paper, it might be shadow IT, but the point is that unless you are merely forgetting to track coffee pods at the social corner, you will leave traces.

So yes, raids do make sense, BECAUSE it's about recurring, complex activities that are just too hard to keep in the mind of one single individual over long periods of time.

replies(6): >>tick_t+JV3 >>Silver+rY3 >>hybrid+ed4 >>almost+fQ4 >>SoftTa+D15 >>notepa+J65
◧◩◪◨
288. throw3+Rm3[view] [source] [discussion] 2026-02-04 05:49:42
>>ramuel+XH1
I knew someone who was involved in an investigation (the company and the person were the victims, not the targets of the investigation). Their work laptop was placed into a legal hold; the investigators had access to all of their files, and they weren't allowed to delete anything (even junk emails) for several years.

You don't get to say no to these things.

◧◩◪◨⬒⬓⬔
289. isr+Tn3[view] [source] [discussion] 2026-02-04 06:01:00
>>almost+MD2
Ah, so the daily LARGE protests in Venezuela against his kidnapping are not indicative of "the vast majority of Venezuela".

But the celebratory pics, which were claimed to be from Venezuela but were actually from Miami and elsewhere (including, I kid you not, an attempt to pass off Argentines celebrating a Copa America win)... those are indicative of "the vast majority of Venezuela"?

If I were smarter, I might start to wonder: if President Maduro was so unpopular, why would his abductors have to resort to fake footage, which was systematically outed and debunked by independent journalists within 24 hours? I mean, surely enough real footage should exist.

Probably better not to have inconvenient non-US-approved independent thoughts like that.

replies(1): >>almost+Fm5
◧◩
290. culi+Yn3[view] [source] [discussion] 2026-02-04 06:02:12
>>miki12+dl3
It's not really different from how we treat any other platform that can host CSAM. I guess the main difference is that it's being "made" instead of simply "distributed" here
◧◩◪◨
291. _tramp+1o3[view] [source] [discussion] 2026-02-04 06:02:29
>>pdpi+cm3
It is very different. It is YOUR 3D printer; no one else is involved. If you print a knife and kill somebody with it, you go to jail; no third party is involved.

If you use a service like Grok, you are using somebody else's computer. X is the owner of the computer that produced the CP. So of course X is at least partly liable for producing CP.

replies(1): >>pdpi+so3
◧◩◪
292. cubefo+3o3[view] [source] [discussion] 2026-02-04 06:02:54
>>moolco+Du
> The company made and released a tool with seemingly no guard-rails, which was used en masse to generate deepfakes and child pornography.

Do you have any evidence for that? As far as I can tell, this is false. The only thing I saw was Grok changing photos of adults into them wearing bikinis, which is far less bad.

replies(3): >>scott_+cq3 >>klez+KM3 >>numpad+3V3
◧◩◪◨⬒⬓
293. nieman+ko3[view] [source] [discussion] 2026-02-04 06:05:08
>>gruez+Ar2
In the USA they would be allowed to down any aircraft not complying with national air interception rules; that would not be murder. It would be equivalent to not dropping a gun when ordered to by an officer and being shot as a result.

https://www.faa.gov/air_traffic/publications/atpubs/aim_html...

◧◩◪◨⬒
294. pdpi+so3[view] [source] [discussion] 2026-02-04 06:06:18
>>_tramp+1o3
How does that mesh with all the safe harbour provisions we've depended on to make the modern internet, though?
replies(5): >>mikeyo+az3 >>jazzyj+oB3 >>_tramp+uB3 >>numpad+jV3 >>pjc50+PW3
◧◩
295. gianca+Do3[view] [source] [discussion] 2026-02-04 06:07:38
>>Altern+ut
> This looks like plain political pressure. No lives were saved, and no crime was prevented by harassing local workers.

I wouldn't even consider this a reason if it weren't for the fact that OpenAI and Google, and hell, literally every image model out there, all have the same "this guy edited this underage girl's face into a bikini" problem (that was the most public example I've heard, so I'm going with it). People still jailbreak ChatGPT, and how much money have they poured into that?

◧◩
296. themaf+Ap3[view] [source] [discussion] 2026-02-04 06:13:39
>>miki12+dl3
Holding corporations accountable for their profit streams is "censorship?" I wish they'd stop passing models trained on internet conversations and hoarded data as fit for any purpose. The world does not need to boil oceans for hallucinating chat bots at this particular point in history.
replies(1): >>themaf+Z86
◧◩◪◨⬒⬓
297. pyrale+bq3[view] [source] [discussion] 2026-02-04 06:18:08
>>chrisj+8O2
The first two points of the official document, which I re-quote below, are about CSAM.

> complicité de détention d’images de mineurs présentant un caractère pédopornographique

> complicité de diffusion, offre ou mise à disposition en bande organisée d'image de mineurs présentant un caractère pédopornographique

(Roughly: complicity in possession of child-pornographic images of minors, and complicity in distribution, offering, or making available, as an organised group, child-pornographic images of minors.)

[1]: https://www.tribunal-de-paris.justice.fr/sites/default/files...

replies(1): >>chrisj+LJ3
◧◩◪◨
298. scott_+cq3[view] [source] [discussion] 2026-02-04 06:18:10
>>cubefo+3o3
Did you miss the numerous news reports? Example: https://www.theguardian.com/technology/2026/jan/08/ai-chatbo...

For obvious reasons, decent people are not about to go out and try to generate child sexual abuse material just to prove a point to you, if that’s what you’re asking for.

replies(1): >>cubefo+hr3
◧◩◪◨⬒⬓
299. pyrale+vq3[view] [source] [discussion] 2026-02-04 06:20:54
>>camina+0E2
Macron's involvement with Uber is public information at this point.

[1]: https://www.lemonde.fr/pixels/article/2022/07/10/uber-files-...

[2]: https://www.radiofrance.fr/franceinter/le-rapport-d-enquete-...

replies(1): >>camina+jq5
◧◩◪◨⬒
300. cubefo+hr3[view] [source] [discussion] 2026-02-04 06:29:04
>>scott_+cq3
First of all, the Guardian is known to be heavily biased against Musk. They always try hard to make everything about him sound as negative as possible. Second, last time I tried, Grok refused even to create pictures of naked adults. I just tried again and this is still the case:

https://x.com/i/grok/share/1cd2a181583f473f811c0d58996232ab

The claim that they released a tool with "seemingly no guard-rails" is therefore clearly false. I think what has instead happened here is that some people found a way to circumvent some of those guardrails via something like a jailbreak.

replies(5): >>scott_+Qv3 >>emsign+sx3 >>jibal+wx3 >>Hikiko+Gk4 >>neorom+FV4
◧◩
301. Jordan+1t3[view] [source] [discussion] 2026-02-04 06:43:16
>>miki12+dl3
I could maybe see this argument if we were talking about raiding Stable Diffusion or Facebook or some other provider of local models. But the content at issue was generated not just by Twitter's AI model, but on their servers, integrated directly into their UI and hosted publicly on their platform. That makes them much more clearly culpable -- they're not just enabling this shit, they're creating it themselves on demand (and posting it directly to victims' public profiles).
replies(1): >>disgru+7S3
◧◩◪◨⬒
302. vessen+Ct3[view] [source] [discussion] 2026-02-04 06:48:42
>>f30e3d+023
An assassination market, at least the one we discussed, works like this: one or more people put up a bounty paid out on the death of someone. Anyone can submit a (sealed) description of the death. On death, the descriptions are opened, and the one closest to the actual circumstances is paid the bounty.

One of my portfolio companies had information about contributors to these markets. When I got in touch, I was told by my FBI contact that their view was that the creation of the market, the funding of the market, and the descriptions were all legal; they declined to follow up.

replies(2): >>f30e3d+9E3 >>direwo+D24
◧◩
303. popalc+5u3[view] [source] [discussion] 2026-02-04 06:53:17
>>miki12+dl3
It's a bit of a leap to say that the model must be censored. SD and all the open image gen models are capable of all kinds of things, but nobody has gone after the open model trainers. They have gone after the companies making profits from providing services.
replies(2): >>vinter+pD3 >>Kaiser+NP3
◧◩◪◨⬒⬓⬔⧯
304. reveri+6v3[view] [source] [discussion] 2026-02-04 07:03:27
>>gruez+jD2
I think in this hypothetical, France would want to force Musk's plane to land in French jurisdiction so they could arrest him.
◧◩◪◨⬒⬓
305. scott_+Qv3[view] [source] [discussion] 2026-02-04 07:10:54
>>cubefo+hr3
For more evidence:

https://www.bbc.co.uk/news/articles/cvg1mzlryxeo

Also, X seem to disagree with you and admit that CSAM was being generated:

https://arstechnica.com/tech-policy/2026/01/x-blames-users-f...

Also the reason you can’t make it generate those images is because they implemented safeguards since that article was written:

https://www.ofcom.org.uk/online-safety/illegal-and-harmful-c...

This is because of government pressure (see Ofcom link).

I’d say you’re making yourself look foolish but you seem happy to defend nonces so I’ll not waste my time.

replies(1): >>cubefo+qP3
◧◩◪◨⬒
306. trhway+Yv3[view] [source] [discussion] 2026-02-04 07:12:18
>>sirnic+ki3
Maybe. We do have a codified legal definition of a machine gun, which clearly separates it from a block of lead. What codified legal definitions are used here to separate Photoshop from Grok in the context of those deepfakes and CSAM?

Without such clear legal definitions, going after Grok while not going after Photoshop is just an act of political pressure.

replies(2): >>bootsm+Qz3 >>moolco+gw6
◧◩◪
307. bean46+Ew3[view] [source] [discussion] 2026-02-04 07:19:37
>>milton+Rw1
> Claim 4/20 is a holiday that he celebrates?

Most likely, it's Hitler's birthday after all

◧◩◪◨⬒⬓
308. reveri+Iw3[view] [source] [discussion] 2026-02-04 07:20:08
>>scotty+iH2
The speed limit is not a law the same way "don't murder" is a law. And "don't destroy evidence of a crime" is a lot closer to "don't murder", legally speaking.
◧◩
309. emsign+jx3[view] [source] [discussion] 2026-02-04 07:25:09
>>Altern+ut
They've already broken the law by creating and hosting CSAM. Now let's see what else prosecutors will find.
◧◩◪
310. reveri+mx3[view] [source] [discussion] 2026-02-04 07:25:34
>>fumar+Ad3
I think they're using it in the American sense, which means "anywhere in the political spectrum of the leftmost 60% of the population".
◧◩◪◨⬒⬓
311. emsign+sx3[view] [source] [discussion] 2026-02-04 07:26:46
>>cubefo+hr3
> First of all, the Guardian is known to be heavily biased again Musk.

Says who? Musk?

◧◩◪◨⬒⬓
312. jibal+wx3[view] [source] [discussion] 2026-02-04 07:27:10
>>cubefo+hr3
That is only "known" to intellectually dishonest ideologues.
◧◩◪◨⬒⬓⬔⧯
313. pyrale+Qx3[view] [source] [discussion] 2026-02-04 07:30:17
>>Sanjay+qG2
I never liked the Pauls and their opinions, but I must say that they usually speak according to their principles, rather than making up principles to fit what they want to happen.

To me, that's the distinction between political opponents I can respect and, well, whatever we're seeing now.

◧◩
314. 317070+qy3[view] [source] [discussion] 2026-02-04 07:35:27
>>Altern+ut
Well, there is evidence that this company made and distributed CSAM and pornographic deepfakes for profit. The investigators are not lacking evidence there.

So the question becomes whether it was done knowingly or recklessly, hence a police raid for evidence.

See also [0] for a legal discussion in the German context.

[0] https://arxiv.org/html/2601.03788v1

replies(1): >>skissa+6I3
◧◩◪◨⬒⬓
315. mikeyo+az3[view] [source] [discussion] 2026-02-04 07:42:06
>>pdpi+so3
The safe harbor provisions largely protect X from content that users post (within reason). But suddenly Grok/X were actually producing the objectionable content: users were making gross requests, and then an LLM owned by X, using X servers and X code, would generate the illegal material and post it to the website. The entity responsible is no longer the user but the company itself.
replies(2): >>luke54+9J3 >>Altern+Eb4
◧◩◪◨⬒⬓
316. bootsm+Qz3[view] [source] [discussion] 2026-02-04 07:47:42
>>trhway+Yv3
Why do you think France doesn’t have such laws that delineate this legal definition?

What you’re implying here is that Musk should be immune from any prosecution simply because he is right wing, which…

◧◩◪◨⬒⬓
317. jazzyj+oB3[view] [source] [discussion] 2026-02-04 07:59:01
>>pdpi+so3
This might be an unpopular opinion, but I always thought we might be better off without the Web 2.0 regime where site owners aren’t held responsible for user content.

If you’re hosting content, why shouldn’t you be responsible? Because your business model is impossible if you’re held to account for what’s happening on your premises?

Without safe harbor, people might have to jump through the hoops of buying their own domain name and hosting content themselves. Would that be so bad?

replies(3): >>pdpi+fM3 >>termin+rW3 >>direwo+iX3
◧◩◪◨⬒⬓
318. _tramp+uB3[view] [source] [discussion] 2026-02-04 07:59:38
>>pdpi+so3
Before, a USER created the content, so the user was (and is) liable. Now an LLM owned by a company creates the content, so the company is liable.
replies(1): >>hbs18+kH3
◧◩◪◨
319. bluesc+dD3[view] [source] [discussion] 2026-02-04 08:12:50
>>trhway+Ve3
They don’t provide a large platform for political speech.

This isn’t about AI or CSAM. (Have we seen any other AI companies raided by governments for enabling the creation of deepfakes, dangerous misinformation, or illegal images, or for flagrant industrial-scale copyright infringement?)

replies(2): >>direwo+zX3 >>mooreb+Fv7
◧◩◪
320. vinter+pD3[view] [source] [discussion] 2026-02-04 08:14:27
>>popalc+5u3
So far, yes, but as far as I can tell their case against the AI giants aren't based on it being for-profit services in any way.
replies(1): >>popalc+2y6
◧◩◪◨
321. watwut+FD3[view] [source] [discussion] 2026-02-04 08:17:41
>>pdpi+cm3
Grok is publishing the CSAM photos for everyone to see. It is used as a tool for harassment and abuse, literally.
replies(1): >>pdpi+Fo4
322. Animat+3E3[view] [source] 2026-02-04 08:20:06
>>vikave+(OP)
One of the charges is "fraudulent data extraction by an organised group." That's going to affect the entire social media industry if applied broadly.
replies(1): >>muyuu+vH3
◧◩◪◨⬒⬓
323. f30e3d+9E3[view] [source] [discussion] 2026-02-04 08:21:18
>>vessen+Ct3
OK this sounds more like gamer dipshittery than anything serious.
324. mnewme+pE3[view] [source] 2026-02-04 08:22:55
>>vikave+(OP)
Good one.

No platform ever should allow CSAM content.

And the fact that they didn’t even care, and didn’t want to spend money on implementing guardrails or moderation, is deeply concerning.

This has, imho, nothing to do with model censorship, but everything to do with allowing that kind of content on a platform.

replies(5): >>Reptil+bY3 >>tw85+Ux4 >>bright+9D4 >>yibg+RP4 >>lux-lu+DU4
◧◩
325. mnewme+FE3[view] [source] [discussion] 2026-02-04 08:24:51
>>miki12+dl3
This is the wrong take.

Yes, they could have an uncensored model, but then they would need proper moderation: delete this kind of content instantly and ban users who produce it. Or don’t allow it in the first place.

It doesn’t matter how CSAM is produced; the only thing that matters is that it is on the platform.

I am flabbergasted that people even defend this.

replies(1): >>direwo+4Z3
◧◩◪◨
326. watwut+HE3[view] [source] [discussion] 2026-02-04 08:25:03
>>herman+DA
To be fair, it is common confusion. In other comments, you see people arguing by US constitution.
◧◩
327. madeof+JE3[view] [source] [discussion] 2026-02-04 08:25:09
>>miki12+dl3
Let’s take a step back and remove AI generation from the conversation for a moment.

Did X do enough to prevent its website being used to distribute illegal content: non-consensual sexual material of adults, and sexual material of children?

Now reintroduce AI generation, where X plays a more active role in facilitating the creation of that illegal content.

◧◩◪
328. vinter+fG3[view] [source] [discussion] 2026-02-04 08:36:06
>>ChuckM+ew2
It is a raid in that it's not expected, it relies on not being expected, and they come and take away your stuff by force. Maybe it's a legal raid, but let's not sugar-coat it: it's still a raid, and whether you're guilty or not it will cause you a lot of problems.
replies(1): >>Spivak+T65
◧◩◪◨⬒⬓⬔⧯
329. vinter+3H3[view] [source] [discussion] 2026-02-04 08:42:36
>>tyre+BN2
And he also killed over a hundred people, don't forget that.
◧◩◪◨⬒⬓⬔
330. hbs18+kH3[view] [source] [discussion] 2026-02-04 08:45:25
>>_tramp+uB3
I'm not trying to make excuses for Grok, but how exactly isn't the user creating the content? Grok doesn't create images of its own volition; the user is still required to give it some input, therefore "creating" the content.
replies(3): >>luke54+UJ3 >>_tramp+2K3 >>mbesto+Nd4
◧◩
331. muyuu+vH3[view] [source] [discussion] 2026-02-04 08:46:35
>>Animat+3E3
Frankly, it sounds to me like a "show me the man and I'll show you the crime" kind of operation. France and the UK, and, judging by yesterday's speech by the PM of Spain, maybe the whole EU, might be looking to do what China and Russia did earlier on: start cracking down on foreign social media by making it impossible to operate without total alignment with their vision, not just their (new) rules. Together with a push for local alternatives, which currently don't seem to be there, it may spell the end for a big chunk of the global social network landscape.

I still believe the EU and aligned countries would rather have America agree to much tighter speech controls, digital ID, and ToS-based speech codes, as US Democrats apparently partly or totally do. But if they have workable alternatives, they will deal with the US from a different position.

replies(3): >>Palmik+7M3 >>PaulRo+0P3 >>yxhuvu+aP3
◧◩◪◨⬒⬓
332. vinter+wH3[view] [source] [discussion] 2026-02-04 08:46:39
>>agoodu+1s2
This is very common, all "think of the children" laws are ripe for abuse. I'm convinced the secrecy around child abuse/child protective services is regularly abused both by abusive parents and abusive officials.
◧◩◪
333. skissa+6I3[view] [source] [discussion] 2026-02-04 08:50:44
>>317070+qy3
> Well, there is evidence that this company made and distributed CSAM

I think one big issue with this statement – "CSAM" lacks a precise legal definition; the precise legal term(s) vary from country to country, with differing definitions. While sexual imagery of real minors is highly illegal everywhere, there's a whole lot of other material – textual stories, drawings, animation, AI-generated images of nonexistent minors – which can be extremely criminal on one side of an international border, de facto legal on the other.

And I'm not actually sure what the legal definition is in France; the relevant article of the French Penal Code 227-23 [0] seems superficially similar to the legal definition of "child pornography" in the United States (post-Ashcroft vs Free Speech Coalition), and so some–but (maybe) not all–of the "CSAM" Grok is accused of generating wouldn't actually fall under it. (But of course, I don't know how French courts interpret it, so maybe what it means in practice is something broader than my reading of the text suggests.)

And I think this is part of the issue – xAI's executives are likely focused on compliance with US law on these topics, less concerned with complying with non-US law, in spite of the fact that CSAM laws in much of the rest of the world are much broader than in the US. That's less of an issue for Anthropic/Google/OpenAI, since their executives don't have the same "anything that's legal" attitude which xAI often has. And, as I said – while that's undoubtedly true in general, I'm unsure to what extent it is actually true for France in particular.

[0] https://www.legifrance.gouv.fr/codes/section_lc/LEGITEXT0000...

replies(3): >>direwo+YW3 >>krick+5i5 >>graeme+cp5
◧◩◪◨⬒⬓⬔
334. luke54+9J3[view] [source] [discussion] 2026-02-04 08:59:19
>>mikeyo+az3
Yes, and that was a very stupid product decision. They could have put the image generation into the post editor, shifting responsibility to the users.

I'd guess Elon is responsible for that product decision.

◧◩◪◨⬒⬓⬔
335. chrisj+LJ3[view] [source] [discussion] 2026-02-04 09:02:36
>>pyrale+bq3
> The first two points of the official document, which I re-quote below, are about CSAM.

Sorry, but that's a major translation error. "pédopornographique" properly translated is child porn, not child sexual abuse material (CSAM). The difference is huge.

replies(3): >>pyrale+rO3 >>mortar+dU3 >>direwo+024
◧◩◪◨⬒⬓⬔⧯
336. luke54+UJ3[view] [source] [discussion] 2026-02-04 09:03:52
>>hbs18+kH3
X is making it pretty clear that it is "Grok" posting those images and not the user. It is a separate posting that comes from an official account named "Grok". X has full control over what the official "Grok" account posts.

There is no functionality for the users to review and approve "Grok" responses to their tweets.

◧◩◪◨⬒⬓⬔⧯
337. _tramp+2K3[view] [source] [discussion] 2026-02-04 09:05:11
>>hbs18+kH3
Until now, a webserver has just been like a postal service. Grok is more like a CNC lathe.
◧◩
338. pjc50+oL3[view] [source] [discussion] 2026-02-04 09:15:53
>>techbl+8k
Since the release of (some of) the Epstein files, that kind of "let's do some crimes" email seems much more plausible.
◧◩◪
339. Palmik+7M3[view] [source] [discussion] 2026-02-04 09:22:59
>>muyuu+vH3
> EU might be looking to do what China and Russia did earlier on and start cracking down on foreign social media

For some reason you forgot to mention "Like the US did with TikTok".

replies(3): >>muyuu+ZM3 >>junto+Tb4 >>pjc50+dG4
◧◩◪◨⬒⬓⬔
340. pdpi+fM3[view] [source] [discussion] 2026-02-04 09:23:40
>>jazzyj+oB3
What about webmail, IM, or any other sort of web-hosted communication? Do you honestly think it would be better if Google were responsible for whatever content gets sent to a gmail address?
replies(1): >>jazzyj+5W3
◧◩◪◨⬒⬓
341. Kaiser+qM3[view] [source] [discussion] 2026-02-04 09:24:29
>>nieman+Da2
Right, but you are confusing a _conspiracy_ with staff training.

I didn't work anywhere near that level, or on anything that's dicey enough that I needed an "oh shit, delete everything, the Feds are here" plan. That would be a conspiracy to pervert the course of justice (I'm not sure what the common law/legal code name for that is).

The stuff I worked on was legal and in the spirit of the law, along with a paper trail (that I also still have) proving that.

◧◩◪
342. ljspra+HM3[view] [source] [discussion] 2026-02-04 09:27:10
>>moolco+Du
No other "AI" companies released tools that could do the same?
replies(1): >>hackin+KU3
◧◩◪◨
343. klez+KM3[view] [source] [discussion] 2026-02-04 09:27:43
>>cubefo+3o3
That's why this is an investigation looking for evidence and not a conviction.

This is how it works, at least in civil law countries. If the prosecutor has reasonable suspicion that a crime is taking place, they send the so-called "judiciary police" to gather evidence. If they find none (or the findings are inconclusive, etc.) the charges are dropped; otherwise they ask the court to go to trial.

On some occasions I take on judiciary police duties for animal welfare. Just last week I participated in a raid. We were not there to arrest anyone, just to gather evidence so the prosecutor could decide whether to press charges and go to trial.

replies(1): >>direwo+LX3
◧◩◪◨⬒
344. Kaiser+TM3[view] [source] [discussion] 2026-02-04 09:28:32
>>free65+Pe2
In common law/4th Amendment terms, kinda. Once you have a warrant, the word "reasonable" comes into play. It's reasonable to assume that the data you want is on the devices of certain people. If incidental data/evidence is also procured from devices that were reasonably likely to contain said data, then it's fair game.

In the civil code, it's quite possibly different. The French have had ~3 constitutions in the last 80 years. They also don't have the concept of case history. Who knows what the law actually is.

◧◩◪◨
345. muyuu+ZM3[view] [source] [discussion] 2026-02-04 09:29:15
>>Palmik+7M3
That was decades later, but yeah, I don't think for a second that was justifiable. And that's not even considering that China had completely closed shop to America decades earlier and this was a one-way openness relationship for a long time; they could have sold this as a reciprocity issue but they didn't.

Esp. when America already controls the main outlets through the Android Play Store and Apple's App Store, and yep, they have proven to control them, not just happen to host them as a country.

Arguably America did have valid security concerns with Huawei though, but if those are the rules then you cannot complain later on.

◧◩◪◨
346. beAbU+WN3[view] [source] [discussion] 2026-02-04 09:35:54
>>pdpi+cm3
I don't have an answer, but the theme that's been bouncing around in my head has been about accessibility.

Grok makes it trivial to create fake CSAM or other explicit images. Before, if someone spent a week in Photoshop to do the same, it wouldn't be Adobe that got the blame.

Same for 3D printers. Before, anyone could make a gun provided they have the right tools (which is very expensive), now it's being argued that 3D printers are making this more accessible. Although I would argue it's always been easy to make a gun, all you need is a piece of pipe. So I don't entirely buy the moral panic against 3D printers.

Where that threshold lies I don't know. But I think that's the crux of it. Technology is making previously difficult things easier, to the benefit of all humanity. It's just unfortunate that some less-nice things have also been included.

replies(1): >>_ph_+oc4
◧◩◪◨
347. Rambli+kO3[view] [source] [discussion] 2026-02-04 09:38:39
>>lokar+Du1
This. We don't have to accept that they behave that way. They enter our economies so they need to adhere to our laws. And we can fine them. No one wants to lose Europe as a market, even if all the haters call us a shithole.
◧◩◪◨⬒⬓⬔⧯
348. pyrale+rO3[view] [source] [discussion] 2026-02-04 09:39:36
>>chrisj+LJ3
Quote from US doj [1]:

> The term “child pornography” is currently used in federal statutes and is defined as any visual depiction of sexually explicit conduct involving a person less than 18 years old. While this phrase still appears in federal law, “child sexual abuse material” is preferred, as it better reflects the abuse that is depicted in the images and videos and the resulting trauma to the child. In fact, in 2016, an international working group, comprising a collection of countries and international organizations working to combat child exploitation, formally recognized “child sexual abuse material” as the preferred term.

Child porn is csam.

[1]: https://www.justice.gov/d9/2023-06/child_sexual_abuse_materi...

replies(1): >>chrisj+s24
◧◩◪
349. PaulRo+0P3[view] [source] [discussion] 2026-02-04 09:43:30
>>muyuu+vH3
It's worth pointing out that in France and the UK, the authorities involved are arms length independent of the political bodies - it's not like the US where if you give the President good vibes you can become head of the FBI, and all you have to do in return is whatever he says. There are statutory instruments (in France, constitutional clauses), that determine the independence of these authorities.

They are tasked - and held to account by respective legislative bodies - with implementing the law as written.

Nobody wrote a law saying "Go after Grok". There is, however, a law in most countries about the creation and dissemination of CSAM and non-consensual pornography. Some of that law is relatively new (the UK only introduced some of these laws in recent years), but they all predate the current wave of AI investment.

Founders, boards of directors and their internal and external advisors could:

1. Read the law and make sure any tools they build comply

2. When told their tools don't comply take immediate and decisive action to change the tools

3. Work with law enforcement to apply the law as written

Those companies, if they find this too burdensome, have the choice of not operating in that market. By operating in that market, they both implicitly agree to the law, and are required to explicitly abide by it.

They can't then complain that the law is unfair (it's not), that it's being politicised (how? by whom? show your working), or that this is all impossible in their home market, where they are literally offering presents for the personal enrichment of the President on bended knee while he demands that ownership structures of foreign social media companies like TikTok are changed to meet the agenda of himself and his administration.

So, would the EU like tighter speech controls? Yes, they'd like implementation of the controls on free speech enshrined in legislation created by democratically appointed representatives. The alternative (algorithms that create abusive content, of women and children in particular) is not wanted by the people of the UK, the EU, or most of the rest of the world; laws are written to that effect, and are then enforced by the authorities tasked with that enforcement.

This isn't "anti-democratic", it's literally democracy in action standing up to technocratic feudalism, an Ayn Randian wet dream being played out by some morons who got lucky.

replies(3): >>didntc+2R3 >>gordia+SS3 >>hnfong+yM4
◧◩◪
350. yxhuvu+aP3[view] [source] [discussion] 2026-02-04 09:44:23
>>muyuu+vH3
Yes, if you don't follow EU laws prepare to not do business in Europe. Likewise, if you don't follow US laws I'd advise against trying to do business in USA.
replies(2): >>whatis+344 >>Levitz+555
◧◩◪◨⬒⬓⬔
351. cubefo+qP3[view] [source] [discussion] 2026-02-04 09:46:15
>>scott_+Qv3
> Also, X seem to disagree with you and admit that CSAM was being generated

That post doesn't contain such an admission, it instead talks about forbidden prompting.

> Also the reason you can’t make it generate those images is because they implemented safeguards since that article was written:

That article links to this article: https://x.com/Safety/status/2011573102485127562 - which contradicts your claim that there were no guardrails before. And as I said, I already tried it a while ago, and Grok also refused to create images of naked adults then.

replies(1): >>scott_+K34
◧◩◪
352. Kaiser+NP3[view] [source] [discussion] 2026-02-04 09:49:18
>>popalc+5u3
Again, it's all about what's reasonable.

Firstly, does the open model explicitly/tacitly allow CSAM generation?

Secondly, when the trainers are made aware of the problem, do they ignore it or attempt to put protections in place?

Thirdly, do they pull in data that is likely to allow that kind of content to be generated?

Fourthly, when they are told that this is happening, do they pull the model?

Fifthly, do they charge for access/host the service and allow users to generate said content on their own servers?

◧◩◪
353. gitaar+VP3[view] [source] [discussion] 2026-02-04 09:50:26
>>omnimu+0k
So what?
354. domini+5Q3[view] [source] 2026-02-04 09:51:28
>>vikave+(OP)
A Russian in a French prison says my country isn't free. Well, let that message spread to other criminals. You're not welcome in France.
◧◩◪◨
355. didntc+2R3[view] [source] [discussion] 2026-02-04 09:59:22
>>PaulRo+0P3
> It's worth pointing out that in France and the UK, the authorities involved are arms length independent of the political bodies

As someone who has lived in (and followed current affairs in) both of these countries, this is a very idealistic and naïve view. There can be a big gap between theory and practice.

> There are statutory instruments (in France, constitutional clauses), that determine the independence of these authorities.

> They are tasked - and held to account by respective legislative bodies -

It's worth noting here that the UK doesn't have separation of powers or a supreme court (in the US sense)

replies(1): >>muyuu+CR3
◧◩◪◨⬒⬓⬔⧯▣▦
356. klez+oR3[view] [source] [discussion] 2026-02-04 10:03:03
>>wongar+3r2
Believe it or not, he's "just" off by two years.

Yes, he has been in power since 2000 (1999, actually), but from 1999 to 2012 he was Prime Minister. Only then did he become President, which would make the end of his second term 2024. So the current one would be his third term (by the magic of changing the constitution and legal quibbles which effectively allow a president to stay in charge for four almost-whole terms, AFAIU).

◧◩◪◨⬒
357. muyuu+CR3[view] [source] [discussion] 2026-02-04 10:04:43
>>didntc+2R3
I live in the UK and I completely agree with you, and I believe that GP is "having a laugh", as we'd say over here.

However, it's a very mainstream point of view, so I respect that he/she has laid it out pretty well, and I upvoted the comment.

◧◩◪
358. disgru+7S3[view] [source] [discussion] 2026-02-04 10:08:15
>>Jordan+1t3
And importantly, this is clearly published by Grok rather than the user. Obviously this isn't the US, but if it were, I'm not sure Section 230 would apply.
◧◩◪◨
359. gordia+SS3[view] [source] [discussion] 2026-02-04 10:13:56
>>PaulRo+0P3
European courts have repeatedly said that in France the procureur (public prosecutor) isn’t an “independent judicial authority”.

The European Court of Human Rights has reiterated this point (e.g. 29 Mar 2010, appl. no. 3394/03), and the Court of Justice of the European Union reaches a very similar conclusion (2 Mar 2021, C-746/18): prosecutors are part of the executive hierarchy and can't be treated as the neutral, independent judicial check some procedures require.

For a local observer, this is made obvious by the fact that the procureur in France always follows current political vibes, usually with just a few months' delay (extremely fast, when you consider how slowly justice works in the country).

◧◩
360. gordia+xT3[view] [source] [discussion] 2026-02-04 10:19:01
>>miki12+dl3
This is not about AI but about censorship of a nonaligned social network. It's been a developing current in the EU; they have basically been foaming at the mouth at the platform since it got bought.
replies(1): >>direwo+GY3
◧◩◪◨⬒⬓⬔⧯
361. mortar+dU3[view] [source] [discussion] 2026-02-04 10:24:50
>>chrisj+LJ3
Maybe US law makes a distinction, but in Europe there is no difference. Sexual depictions of children (real or not) are considered child pornography and will get you sent to the slammer.
replies(1): >>chrisj+h44
◧◩◪◨
362. hackin+KU3[view] [source] [discussion] 2026-02-04 10:29:49
>>ljspra+HM3
In fact, Gemini could bikinify any image just like Grok. Google added guardrails after all the backlash Grok received.
replies(1): >>mekdoo+Yt4
◧◩◪◨
363. numpad+3V3[view] [source] [discussion] 2026-02-04 10:32:05
>>cubefo+3o3
Grok does seem to have tons of useless guardrails. Reportedly you can't prompt it for this directly. But also, reportedly, it tends to go for almost nonsensically off-guardrail interpretations of prompts.
◧◩◪◨⬒⬓
364. numpad+jV3[view] [source] [discussion] 2026-02-04 10:34:16
>>pdpi+so3
It's not like the world benefited from safe harbor laws that much. Why not just amend them so that server-side algorithms and platforms that recommend things are not eligible?
replies(1): >>direwo+dX3
◧◩
365. tick_t+JV3[view] [source] [discussion] 2026-02-04 10:37:46
>>utopia+mm3
No need to be coy: the raid exists because it's a way to punish the company without proving anything. They have zero intention of getting even the slightest bit of valuable data related to Grok from this.
replies(2): >>direwo+UV3 >>fyredg+yK4
◧◩◪
366. direwo+UV3[view] [source] [discussion] 2026-02-04 10:39:38
>>tick_t+JV3
What's your evidence?
◧◩◪◨⬒⬓⬔⧯
367. jazzyj+5W3[view] [source] [discussion] 2026-02-04 10:40:45
>>pdpi+fM3
Messages are a little different than hosting public content but sure, a service provider should know its customers and stop doing business with any child sex traffickers planning parties over email.

I would prefer 10,000 service providers to one big one that gets to read all the plaintext communication of the entire planet.

replies(2): >>direwo+tX3 >>pdpi+174
◧◩◪◨⬒⬓⬔
368. termin+rW3[view] [source] [discussion] 2026-02-04 10:43:06
>>jazzyj+oB3
You know this site would not be possible without those protections, right?
replies(1): >>jazzyj+pc5
◧◩◪◨⬒⬓
369. pjc50+PW3[view] [source] [discussion] 2026-02-04 10:45:26
>>pdpi+so3
Note that is a US law, not a French one.

Also, safe harbor doesn't apply because this is published under the @grok handle! It's being published by X under one of their brand names; it's absurd to argue that they're unaware of or not consenting to its publication.

◧◩◪◨
370. direwo+YW3[view] [source] [discussion] 2026-02-04 10:46:29
>>skissa+6I3
It wouldn't be called CSAM in France because it would be called a French word. Arguing definitions is arguing semantics. The point is, X did things that are illegal in France, no matter what you call them.
replies(1): >>skissa+I45
◧◩◪◨⬒⬓⬔
371. direwo+dX3[view] [source] [discussion] 2026-02-04 10:47:33
>>numpad+jV3
If you are thinking about Section 230, it only applies to user-generated content, so not server-side AI or timeline algorithms.
replies(1): >>Altern+Sb4
◧◩◪
372. tick_t+gX3[view] [source] [discussion] 2026-02-04 10:48:12
>>nemoma+bV2
Not really?
◧◩◪◨⬒⬓⬔
373. direwo+iX3[view] [source] [discussion] 2026-02-04 10:48:33
>>jazzyj+oB3
Any app allowing any communication between two users would be illegal.
replies(1): >>expedi+m44
◧◩◪
374. termin+pX3[view] [source] [discussion] 2026-02-04 10:49:11
>>ChuckM+ew2
Who has on-prem servers at an office location?
replies(1): >>ChuckM+gY3
◧◩◪◨⬒⬓⬔⧯▣
375. direwo+tX3[view] [source] [discussion] 2026-02-04 10:50:11
>>jazzyj+5W3
They'd all have to read your emails to ensure you don't plan child sex parties. Whenever a keyword match comes up, your account will immediately be deleted.
◧◩◪◨⬒
376. direwo+zX3[view] [source] [discussion] 2026-02-04 10:51:00
>>bluesc+dD3
No, because most of those things aren't illegal, most of those companies have guardrails, and a prosecution requires a much higher standard of evidence than internet shitposting. Only X was stupid enough to make their illegal activity obvious.
◧◩◪◨⬒⬓⬔⧯▣▦
377. termin+HX3[view] [source] [discussion] 2026-02-04 10:51:46
>>MYEUHD+Ke3
How many of them?
◧◩◪◨⬒
378. direwo+LX3[view] [source] [discussion] 2026-02-04 10:53:03
>>klez+KM3
Note that the raid itself is a punishment. It's normal for them to seize all electronic devices. How is X France supposed to do any business without any electronic devices? And even when charges are dropped, the devices are never returned.
◧◩
379. Reptil+bY3[view] [source] [discussion] 2026-02-04 10:55:45
>>mnewme+pE3
I disagree. Prosecute people that use the tools, not the tool makers if AI generated content is breaking the law.

A provider should have no responsibility for how the tools are used; it is on the users. This is a can of worms that should stay closed, because we all lose freedoms just because of a couple of bad actors. An AI tool's main job is to obey. We are hurtling toward an "I'm sorry, Dave. I'm afraid I can't do that" future at breakneck speed.

replies(5): >>mnewme+104 >>thranc+v54 >>mooreb+M64 >>kakaci+Ba4 >>intend+eN4
◧◩◪◨
380. direwo+cY3[view] [source] [discussion] 2026-02-04 10:56:05
>>bryan_+7Q1
My computer has a copy of all the source code I work on
replies(1): >>Pedro_+fu5
◧◩◪◨
381. vinter+fY3[view] [source] [discussion] 2026-02-04 10:56:30
>>int_19+kk2
Is "it" even a thing which can be guilty of that?

The way chatbots actually work, I wonder if we shouldn't treat the things they say more or less as words in a book of fiction. Writing a character in your novel who is a plain parody of David Irving probably isn't a crime even in France. Unless the goal of the book as such was to deny the holocaust.

As I see it, Grok can't be guilty. Either the people who made it/set its system prompt are guilty, if they wanted it to deny the holocaust, or, if not, they're at worst guilty of making a particularly unhinged fiction machine (as opposed to the more restrained fiction machines of Google, Anthropic etc.)

◧◩◪◨
382. ChuckM+gY3[view] [source] [discussion] 2026-02-04 10:56:31
>>termin+pX3
I'm guessing you're asking this because you have a picture of a 'server' as a thing in a large rack? Nearly every tech business has a bunch of machines, everything from an old desktop to last year's laptop, which have been reinstalled with Linux or *BSD and are sitting on the network behaving, for all intents and purposes, as 'servers' (they aren't moving or rebooting or having local sessions running on them, etc.).
replies(1): >>ecshaf+vW4
◧◩
383. direwo+jY3[view] [source] [discussion] 2026-02-04 10:56:49
>>miki12+dl3
It's not because it could generate CSAM. It's because when they found out it could generate CSAM, they didn't try to prevent that, they advertised it. Actual knowledge is a required component of many crimes.
replies(1): >>bhelke+Xk6
◧◩
384. Silver+rY3[view] [source] [discussion] 2026-02-04 10:57:39
>>utopia+mm3
It's also just very basic police work. We're investigating this company, we think they've committed a crime. Ok, why do you think that. Well they've very publicly and obviously committed a crime. Ok, are you going to prosecute them? Probably. Have you gone to their offices and gathered evidence? No thanks.

Of course they're going to raid their offices! They're investigating a crime! It would be quite literally insane if they tried to prosecute them for a crime and showed up to court having not even attempted basic steps to gather evidence!

replies(3): >>NooneA+4g4 >>throwa+bH4 >>london+R87
◧◩◪◨⬒⬓⬔⧯▣
385. Sanjay+sY3[view] [source] [discussion] 2026-02-04 10:57:51
>>mrkstu+Z93
Even if that were true - I don't know nor care whether it is - what business is it of the US and Trump to mess around in other countries?
◧◩◪
386. direwo+GY3[view] [source] [discussion] 2026-02-04 10:59:31
>>gordia+xT3
It's about a guy who thinks posting child porn on twitter is hilarious and that guy happens to own twitter.

If it was about blocking the social media they'd just block it, like they did with Russia Today, CUII-Liste Lina, or Pavel Durov.

replies(2): >>mordni+w74 >>gordia+gB6
◧◩◪
387. direwo+4Z3[view] [source] [discussion] 2026-02-04 11:02:16
>>mnewme+FE3
It matters whether they attempt to block it or encourage it. Musk encouraged it, until legal pressure hit, then moved it behind a paywall so it's harder to see evidence.
replies(1): >>mnewme+jZ3
◧◩◪◨
388. mnewme+jZ3[view] [source] [discussion] 2026-02-04 11:03:53
>>direwo+4Z3
Exactly!
◧◩◪◨
389. termin+oZ3[view] [source] [discussion] 2026-02-04 11:04:22
>>flohof+6A1
This isn't the 90s. The right doesn't give a shit about weed.
replies(1): >>defros+E04
◧◩
390. direwo+QZ3[view] [source] [discussion] 2026-02-04 11:07:47
>>stickf+gv1
Usually they steal all electronic devices.
◧◩◪◨⬒
391. direwo+XZ3[view] [source] [discussion] 2026-02-04 11:08:50
>>ChuckM+mN2
Part of the prosecution will be to determine who put the content on the server.
◧◩◪
392. mnewme+104[view] [source] [discussion] 2026-02-04 11:09:20
>>Reptil+bY3
I agree that users who break the law must be prosecuted. But that doesn’t remove responsibility from tool providers when harm is predictable, scalable, and preventable by design.

We already apply this logic elsewhere. Car makers must include seatbelts. Pharma companies must ensure safety. Platforms must moderate illegal content. Responsibility is shared when the risk is systemic.

replies(2): >>Reptil+424 >>JustRa+Ad4
◧◩◪◨⬒⬓⬔⧯▣
393. direwo+p04[view] [source] [discussion] 2026-02-04 11:11:43
>>mrkstu+Z93
Some people argue Trump isn't a legitimate head of state. (One of those people is Trump, since he says he was already the president twice.) Should Xi kidnap him?
replies(1): >>stickf+Jq5
◧◩◪◨⬒
394. defros+E04[view] [source] [discussion] 2026-02-04 11:13:53
>>termin+oZ3
Whether the right gave a shit about weed in the '80s or the '90s depended entirely upon who had it.

When Bernhard Hugo Goetz shot four teenagers on an NYC subway in the 80s, his PCP-laced marijuana use and stash back at his apartment came up in both sets of trials in the 80s and later in the 90s.

It was ignored (although not the alleged drug use of the teenagers) as Goetz was dubbed The Subway Vigilante and became a hero to the right.

~ https://en.wikipedia.org/wiki/1984_New_York_City_Subway_shoo...

His victims were upscaled to "super predators".

replies(1): >>termin+Ht4
◧◩◪◨⬒⬓
395. direwo+G04[view] [source] [discussion] 2026-02-04 11:14:26
>>gf000+6s2
The USA voted to destroy its checks and balances consistently for several decades; that is why they don't work now.
◧◩◪◨
396. direwo+M04[view] [source] [discussion] 2026-02-04 11:15:05
>>cm2187+Eu2
France has a password disclosure law.
replies(1): >>cm2187+PL5
◧◩◪◨⬒⬓⬔⧯▣
397. direwo+214[view] [source] [discussion] 2026-02-04 11:16:53
>>hunter+pg2
"I can't see any difference between a country that has busted two companies that were known for hosting child porn, and a random cartel kingpin" isn't the flex you think it is
◧◩◪◨
398. direwo+j14[view] [source] [discussion] 2026-02-04 11:18:47
>>Brando+qV1
It's not illegal to head a subsidiary of a company that did bad things, but I'm sure he will be intensely questioned. If he did something illegal, he may be punished.
◧◩◪◨⬒
399. direwo+u14[view] [source] [discussion] 2026-02-04 11:20:44
>>hn_go_+J42
A raid with a warrant skips all that.
replies(1): >>hn_go_+IT6
◧◩◪
400. direwo+P14[view] [source] [discussion] 2026-02-04 11:23:14
>>why_at+HZ1
GDPR has some stuff about biased algorithms. It's all civil, of course, no prison time for that, just fines.
◧◩◪◨⬒⬓⬔⧯
401. direwo+024[view] [source] [discussion] 2026-02-04 11:24:57
>>chrisj+LJ3
They are words for the same thing, it's like arguing they can't seize laptops because the warrant says computers.
replies(1): >>chrisj+E64
◧◩◪◨
402. Reptil+424[view] [source] [discussion] 2026-02-04 11:25:08
>>mnewme+104
>But that doesn’t remove responsibility from tool providers when harm is predictable, scalable, and preventable by design.

Platforms moderating illegal content is exactly what we are arguing about, so you can't use it as an argument.

The rest of the cases you list are harms to the people using the tools/products, not harms that people using the tools inflict on third parties.

We are literally arguing about 3D printer control two topics downstream. 3D printers in theory can be used for CSAM too. So we should totally ban them, right? So can pencils, paper, lasers, drawing tablets.

replies(3): >>mnewme+h34 >>szmarc+S64 >>ytpete+r36
403. tokai+524[view] [source] 2026-02-04 11:25:10
>>vikave+(OP)
Good old HN. Users losing their collective minds over rule of law and CSAM being bad.
replies(3): >>y-curi+554 >>code_f+gs4 >>hearsa+z55
◧◩◪◨⬒
404. direwo+p24[view] [source] [discussion] 2026-02-04 11:27:26
>>stubis+Z63
Was it a grant or a purchase? If I buy a pizza from the pizza shop, it costs them $10 to make, I pay $11, the $1 is profit and the owner can do what he wants with it. But if I get a grant from NLnet I have to spend it on what the grant proposal says. Though a lot of NLnet grants are for living costs while doing a project, so I can do what I like for that time if the project gets done.
◧◩◪◨⬒⬓⬔⧯▣
405. chrisj+s24[view] [source] [discussion] 2026-02-04 11:28:06
>>pyrale+rO3
> “child sexual abuse material” is preferred, as it better reflects the abuse that is depicted in the images and videos and the resulting trauma to the child.

Yes, CSAM is preferred for material depicting abuse reflecting resulting trauma.

But not for child porn such as manga of fictional children depicting no abuse and traumatising no child.

> Child porn is csam.

"CSAM isn’t pornography—it’s evidence of criminal exploitation of kids."

That's from RAINN, the US's largest anti-sexual violence organisation.

replies(2): >>pyrale+CH4 >>pyrale+Sq5
◧◩◪◨⬒⬓⬔⧯
406. direwo+u24[view] [source] [discussion] 2026-02-04 11:28:15
>>93po+ck1
Pictures are a statement of fact: what is depicted exists. Naked pictures cause harm to reputation.
◧◩◪◨⬒⬓
407. direwo+D24[view] [source] [discussion] 2026-02-04 11:29:25
>>vessen+Ct3
Sounds like betting on Polymarket: will $person die this year? If you're going to kill him, you bet everything you have on yes right beforehand.
◧◩
408. direwo+M24[view] [source] [discussion] 2026-02-04 11:30:28
>>darepu+UU2
Hentai has different legal status to realistic pictures of real people
◧◩◪
409. direwo+T24[view] [source] [discussion] 2026-02-04 11:31:07
>>noneth+BI
I support the EU harvesting money from evil companies
◧◩◪◨⬒
410. mnewme+h34[view] [source] [discussion] 2026-02-04 11:34:04
>>Reptil+424
That is not the argument. No one is arguing about banning open source LLMs that could potentially create problematic content on huggingface, but X provides not only an AI model, but a platform and distribution as well, so that is inherently different
replies(2): >>Reptil+Q34 >>graeme+q15
◧◩◪◨⬒
411. direwo+l34[view] [source] [discussion] 2026-02-04 11:34:34
>>direwo+yD2
I remember in some countries there's an official government newspaper. Laws reference publishing things in this paper (e.g. tax rate changes, radio frequency allocations) and the law is that you must follow it once it's published.

In practice the information is disseminated through many channels once it's released in the official newspaper. Mass media reports on anything widely relevant, niche media reports on things nichely relevant, and there's direct communication with anyone directly affected (recipient of a radio frequency allocation) so nobody really subscribes to the official government newspaper, but it's there and if there was a breakdown of communication systems that would be the last resort to ensure you are getting government updates.

◧◩◪◨⬒
412. direwo+y34[view] [source] [discussion] 2026-02-04 11:36:06
>>father+BB2
Ignoring mountains of circumstantial evidence isn't rational either.
◧◩◪
413. direwo+G34[view] [source] [discussion] 2026-02-04 11:37:13
>>tjpnz+oW2
Does Meta publish it themselves, or is it user-generated?
◧◩◪◨⬒⬓⬔⧯
414. scott_+K34[view] [source] [discussion] 2026-02-04 11:37:37
>>cubefo+qP3
> That post doesn't contain such an admission, it instead talks about forbidden prompting.

In response to what? If CSAM is not being generated, why aren't X just saying that? Instead they're saying "please don't do it."

> which contradicts your claim that there were no guardrails before.

From the linked post:

> However content is created or whether users are free or paid subscribers, our Safety team are working around the clock to add additional safeguards

Which was posted a full week after the initial story broke and after Ofcom started investigative action. So no, it does not contradict my point which was:

> Also the reason you can’t make it generate those images is because they implemented safeguards since that article was written:

As you quoted.

I really can't decide if you're stupid, think I and other readers are stupid, or so dedicated to defending paedophilia that you'll just tell flat lies to everyone reading your comment.

replies(1): >>cubefo+ns4
◧◩◪◨⬒⬓
415. Reptil+Q34[view] [source] [discussion] 2026-02-04 11:38:04
>>mnewme+h34
No, it is not. X is a dumb pipe. You have humans on both ends. Arrest them, summarily execute them, whatever. You go after X because it is a choke point and easy.
replies(4): >>mnewme+654 >>kllrno+wJ4 >>dragon+qK4 >>thatfr+RS4
◧◩◪◨
416. whatis+344[view] [source] [discussion] 2026-02-04 11:40:17
>>yxhuvu+aP3
If X/Twitter was to be banned in the EU, and some of its citizens still wanted to access X/Twitter, let us say for the sake of getting alternative points of view on politics and news, would it be a good or a bad thing if accessing X/Twitter by IP was stopped?

As in, a citizen of an EU country types x.com/CNN, because he or she wants to know the other side of some political issue between the EU and the USA, and he or she feels that the news in the EU might be biased or have misunderstood something. Would it be good or bad if the user was met with a "This website is by law not available within the EU"?

replies(4): >>muyuu+Je4 >>yxhuvu+dr4 >>pjc50+4G4 >>datsci+R45
◧◩◪◨⬒⬓⬔⧯▣
417. chrisj+h44[view] [source] [discussion] 2026-02-04 11:42:16
>>mortar+dU3
On the contrary, in Europe there is a huge difference. Child porn might get you mere community service, a fine - or even less, as per the landmark court ruling below.

It all depends on the severity of the offence, which itself depends on the category of the material, including whether or not it is CSAM.

The Supreme Court has today delivered its judgment in the case where the court of appeals and district court sentenced a person for child pornography offenses to 80 day fines on the grounds that he had called Japanese manga drawings into his computer. Supreme Court dismiss the indictment.

The judgment concluded that the cartoons in and of itself may be considered pornographic, and that they represent children. But these are fantasy figures that can not be mistaken for real children.

https://bleedingcool.com/comics/swedish-supreme-court-exoner...

replies(1): >>pyrale+SH4
◧◩◪◨⬒⬓⬔⧯
418. expedi+m44[view] [source] [discussion] 2026-02-04 11:43:04
>>direwo+iX3
https://en.wikipedia.org/wiki/EncroChat

You have to understand that Europe doesn't give a shit about techbro libertarians and their desire for a new Lamborghini.

replies(1): >>direwo+E54
◧◩
419. y-curi+554[view] [source] [discussion] 2026-02-04 11:47:22
>>tokai+524
Found the guy who supports digital ID and involuntary phone database scanning
replies(2): >>tokai+Yb4 >>ddtayl+CD5
◧◩◪◨⬒⬓⬔
420. mnewme+654[view] [source] [discussion] 2026-02-04 11:47:27
>>Reptil+Q34
First you argue about the model, now the platform. Two different things.

If a platform encourages and doesn’t moderate at all, yes we should go after the platform.

Imagine a newspaper publishing content like that, and saying they are not responsible for their journalists

◧◩◪
421. thranc+v54[view] [source] [discussion] 2026-02-04 11:49:17
>>Reptil+bY3
How? X is hostile to any party attempting to bring justice to its users that are breaking the law. This is a last recourse, after X and its owner stated plainly that they don't see anything wrong with generating CSAM or pornographic images of non-consenting people, and that they won't do anything about it.
replies(1): >>Reptil+l64
◧◩◪◨⬒⬓⬔⧯▣
422. direwo+E54[view] [source] [discussion] 2026-02-04 11:50:29
>>expedi+m44
EncroChat was illegal because it was targeted at drug dealers, advertised for use in drug dealing. And they got evidence by texting "My associate got busted dealing drugs. Can you wipe his device?" and it was wiped. There's an actual knowledge component which is very important here.
◧◩
423. bluega+X54[view] [source] [discussion] 2026-02-04 11:52:16
>>Altern+ut
No, that's not at all how this works.

They have a court order obviously to collect evidence.

You have offered zero evidence to indicate there is 'political pressure' and that statement by prosecutors doesn't hint at that.

'No crime was prevented by harassing workers' is essentially non sequitor in this context.

It could be that that this is political nonsense, but there would have to be more details.

These issues are really hard but we have to confront them. X can alter electoral outcomes. That's where we are at.

◧◩
424. bluega+c64[view] [source] [discussion] 2026-02-04 11:53:24
>>techbl+8k
Email history caches. They could also have provided requirements to provide communications etc..
◧◩◪◨
425. Reptil+l64[view] [source] [discussion] 2026-02-04 11:54:52
>>thranc+v54
Court order, ip of users, sue the users. It is not X job to bring justice.
replies(1): >>thranc+574
◧◩◪◨⬒⬓⬔⧯▣
426. chrisj+E64[view] [source] [discussion] 2026-02-04 11:56:26
>>direwo+024
Actually it's like arguing they can't seize all computers because the warrant only says laptops. I.e. correct.
◧◩◪
427. mooreb+M64[view] [source] [discussion] 2026-02-04 11:57:25
>>Reptil+bY3
But how would we bring down our political boogieman Elon Musk if we take that approach?

Everything I read from X's competitors in the media tells me to hate X, and hate Elon.

If we prosecute people not tools, how are we going to stop X from hurting the commercial interests of our favourite establishment politicians and legacy media?

replies(2): >>mnewme+T94 >>pjc50+NE4
◧◩◪◨⬒
428. szmarc+S64[view] [source] [discussion] 2026-02-04 11:58:26
>>Reptil+424
You are literally trolling. No one is banning AI entirely. However AI shouldn't spit out adult content. Let's not enable people harm others easily with little to no effort.
◧◩◪◨⬒⬓⬔⧯▣
429. pdpi+174[view] [source] [discussion] 2026-02-04 11:59:27
>>jazzyj+5W3
In a world where hosting services are responsible that way, their filtering would need to be even more sensitive than it is today, and plenty of places already produce unreasonable amounts of false positives.

As it stands, I have a bunch of photos on my phone that would almost certainly get flagged by over-eager/overly sensitive child porn detection — close friends and family sending me photos of their kids at the beach. I've helped bathe and dress some of those kids. There's nothing nefarious about any of it, but it's close enough that services wouldn't take the risk, and that would be a loss to us all.

◧◩◪◨⬒
430. thranc+574[view] [source] [discussion] 2026-02-04 11:59:58
>>Reptil+l64
X will not provide these informations to the French Justice System. What then? Also insane that you believe the company that built a "commit crime" button bears no responsibility whatsoever in this debacle.
replies(1): >>Reptil+5T4
◧◩◪◨
431. mordni+w74[view] [source] [discussion] 2026-02-04 12:03:31
>>direwo+GY3
He said that child pornography is funny? Do you have a link by any chance?
◧◩◪◨
432. mooreb+B74[view] [source] [discussion] 2026-02-04 12:04:17
>>freeja+KG1
Who's going to provide their payment details and then generate kiddie porn?

This is a pretty pragmatic move by Musk.

It's basically a honey trap, the likes of which authorities legitimately use to catch criminals.

replies(2): >>freeja+gt4 >>apinks+8Y4
◧◩◪◨⬒
433. UncleS+h94[view] [source] [discussion] 2026-02-04 12:17:52
>>speed_+Lj2
France has a little more than that...

https://en.wikipedia.org/wiki/Force_de_dissuasion

◧◩◪◨
434. mnewme+T94[view] [source] [discussion] 2026-02-04 12:22:26
>>mooreb+M64
People defending allowing CSAM content was definitely not on my bingo card for 2026.
replies(3): >>chrome+Pm4 >>ddtayl+BV4 >>thranc+3D6
435. krautb+W94[view] [source] 2026-02-04 12:22:52
>>vikave+(OP)
Raid all of them. Raid Google. Raid Facebook. Raid Apple. Raid Microsoft. Big tech has gotten away with everything from fraud[0] to murder[1] for decades. Black outfits. Rappel lines. Automatics. Touch that server Prakesh, and you won't live to touch another.

[0] https://nypost.com/2025/12/15/business/facebook-most-cited-i... [1] https://en.wikipedia.org/wiki/Suchir_Balaji

◧◩◪◨⬒⬓⬔⧯
436. tomp+ra4[view] [source] [discussion] 2026-02-04 12:26:10
>>belorn+BK2
Wait, so someone acted illegally (against law / courts) AND ALSO kidnapped a child for 6 years, and all that happened is that they're... fired?!

That's insane. Don't live in Sweden if you have kids, I guess!

replies(2): >>lolc+Oe4 >>belorn+xI4
◧◩◪
437. kakaci+Ba4[view] [source] [discussion] 2026-02-04 12:26:54
>>Reptil+bY3
You won't find much agreement with your opinion amongst most people. No matter of many "this should and this shouldn't" is written into text by single individual, thats not how morals work.
◧◩◪◨⬒⬓⬔
438. Altern+Eb4[view] [source] [discussion] 2026-02-04 12:35:56
>>mikeyo+az3
So, if someone hosts an image editor as web app, are they liable if someone uses that editor to create CP?

I honestly don't follow it. People creating nudes of others and using the Internet to distribute it can be sued for defamation, sure. I don't think the people hosting the service should be liable themselves, just like people hosting Tor nodes shouldn't be liable by what users of the Tor Network do.

◧◩◪◨⬒⬓⬔⧯
439. Altern+Sb4[view] [source] [discussion] 2026-02-04 12:37:30
>>direwo+dX3
So if a social network tool does the exact same thing, but uses the user's own GPU or NPU to generate the content instead, suddenly it's fine?
replies(2): >>direwo+Qf4 >>numpad+gW4
◧◩◪◨
440. junto+Tb4[view] [source] [discussion] 2026-02-04 12:37:30
>>Palmik+7M3
The old “Rules for thee but not for me”.
◧◩◪
441. tokai+Yb4[view] [source] [discussion] 2026-02-04 12:38:05
>>y-curi+554
Found the cheese pizza enjoyer.
replies(1): >>ddtayl+GD5
◧◩◪◨
442. btreec+lc4[view] [source] [discussion] 2026-02-04 12:40:05
>>vessen+jP
Maybe you could start by expanding on what you mean?

It's a statement that could be taken to favor xenophobia and isolationism.

◧◩◪◨⬒
443. _ph_+oc4[view] [source] [discussion] 2026-02-04 12:40:56
>>beAbU+WN3
I think a company which runs a printing business would have some obligations to make sure they are not fulfilling print orders for guns. Another interesting example are printers and copiers, which do refuse to copy cash. Which is partly facilitated with the EURion constellation (https://en.wikipedia.org/wiki/EURion_constellation) and other means.
◧◩
444. hybrid+ed4[view] [source] [discussion] 2026-02-04 12:46:14
>>utopia+mm3
The people who think raids are pointless probably use TELNET instead of SSH :-)
◧◩◪◨
445. muyuu+yd4[view] [source] [discussion] 2026-02-04 12:48:18
>>pdpi+cm3
i don't see any need for guardrails, other than making the prompter responsible for the output of the bot, particularly when it's predictable

you cannot elaborately use a software to produce an effect that is patently illegal and accurate to your usage, and then pretend the software is to blame

◧◩◪◨
446. JustRa+Ad4[view] [source] [discussion] 2026-02-04 12:48:26
>>mnewme+104
Agreed. Let's try to be less divisive. Everyone has got a fair point.

Yes, AI chatbots have to do everything in their power to avoid users easily generating such content.

AND

Yes, people that do so (even if done so on your self-hosted model) have to be punished.

I believe it is OK that Grok is being investigated because the point is to figure out whether this was intentional or not.

Just my opinion.

◧◩◪◨⬒⬓⬔⧯
447. mbesto+Nd4[view] [source] [discussion] 2026-02-04 12:49:49
>>hbs18+kH3
Does an autonomous car drive the car from point A to point B or does the person who puts in the destination address drive the car?
◧◩◪◨⬒
448. muyuu+Je4[view] [source] [discussion] 2026-02-04 12:57:32
>>whatis+344
there's a push to end with VPNs in the UK and in the EU because it's clear that this is a very plausible endgame

currently VPNs are too easy to use for the leadership of autocracies like the EU or the UK to be comfortable with them, so at the very least they will require for backdoors to see which citizens are watching what, and have them visited by fellows in hi-vis jackets

replies(1): >>GJim+8o4
◧◩◪◨⬒⬓⬔⧯▣
449. lolc+Oe4[view] [source] [discussion] 2026-02-04 12:57:53
>>tomp+ra4
> Don't live in Sweden if you have kids, I guess!

I heard of countries where parents are fond of having firearms around.

replies(1): >>tomp+4f4
◧◩◪◨⬒⬓⬔⧯▣▦
450. tomp+4f4[view] [source] [discussion] 2026-02-04 12:59:54
>>lolc+Oe4
like Switzerland?
replies(1): >>lolc+yf4
◧◩◪◨⬒⬓⬔⧯▣▦▧
451. lolc+yf4[view] [source] [discussion] 2026-02-04 13:02:40
>>tomp+4f4
Good example, there are scandals around custody too!
◧◩◪◨⬒⬓⬔⧯▣
452. direwo+Qf4[view] [source] [discussion] 2026-02-04 13:04:56
>>Altern+Sb4
If a user generates child porn on their own and uploads it to a social network, the social network is shielded from liability until they refuse to delete it.
◧◩◪
453. NooneA+4g4[view] [source] [discussion] 2026-02-04 13:06:28
>>Silver+rY3
that's kinda the normalization argument, not the reason behind it

"it is done because it's always done so"

replies(3): >>Detroi+Ni4 >>monsie+kn4 >>pjc50+aE4
454. patric+sh4[view] [source] 2026-02-04 13:16:00
>>vikave+(OP)
Until the EU gets their act together on free speech, it's high time the united states hits them with aggressive economic sanctions.
replies(4): >>poseva+Pn4 >>hearsa+k05 >>coffee+q05 >>drawfl+Ts5
◧◩◪◨
455. Detroi+Ni4[view] [source] [discussion] 2026-02-04 13:23:30
>>NooneA+4g4
Isn't it both necessary and normal if they need more information about why they were generating CSAM? I don't know why the rule of law shouldn't apply to child pornography or why it would be incorrect to normalize the prosecution of CSAM creators.
456. voidUp+kk4[view] [source] 2026-02-04 13:34:34
>>vikave+(OP)
> "the only country in the world that is criminally persecuting all social networks that give people some degree of freedom"

Sounds like he's never been to Russia. Which is weird, given that he's Russian

replies(2): >>arielc+2l4 >>fer+2J4
◧◩◪◨⬒⬓
457. Hikiko+Gk4[view] [source] [discussion] 2026-02-04 13:36:46
>>cubefo+hr3
>First of all, the Guardian is known to be heavily biased again Musk.

Biased against the man asking Epstein which day would be best for the "wildest" party.

◧◩
458. arielc+2l4[view] [source] [discussion] 2026-02-04 13:39:21
>>voidUp+kk4
Yeah, and in his mind the only social networks that exist are Telegram and Twitter. Mastodon doesn't exist, bluesky doesn't either, rr any of Meta's products (including Threads, direct competitor to Twitter with seemingly more bots and propaganda)
◧◩◪◨⬒
459. chrome+Pm4[view] [source] [discussion] 2026-02-04 13:51:27
>>mnewme+T94
Fucked up times we live in
◧◩◪◨
460. monsie+kn4[view] [source] [discussion] 2026-02-04 13:54:44
>>NooneA+4g4
I'm not sure what you're getting at, physical investigation is the common procedure. You need a reason _not_ to do it, and since "it's all digital" is not a good reason we go back to doing the usual thing.
replies(1): >>mothba+No4
◧◩
461. poseva+Pn4[view] [source] [discussion] 2026-02-04 13:58:23
>>patric+sh4
Is this sarcasm ? because sounds like sarcasm. If Grok generating naked images of children is your ideea of free speech no wonder nobody takes a stand against Trump and Co., you somehow managed turn your common sense to dust...
◧◩◪◨⬒⬓
462. GJim+8o4[view] [source] [discussion] 2026-02-04 14:00:10
>>muyuu+Je4
> there's a push to end with VPNs in the UK and in the EU

No there isn't.

Governments discussing such things doesn't _remotely_ mean there is a political will for them, or that they will be voted into law.

Governments are expected to research and discuss paths of legislation (and in this case, come to the conclusion banning VPNs is both harmful and ridiculous). This is how our democracies work!

Reporting government discussions as approved legislation is, at best ignorant, at worst trolling.

◧◩◪◨⬒
463. pdpi+Fo4[view] [source] [discussion] 2026-02-04 14:02:26
>>watwut+FD3
Sure, and the fact that they haven't voluntarily put guard rails up to stop that is absolutely vile. But my personal definition of "absolutely vile" isn't a valid legal standard. So, the issue is, like I said, how do you come up with a principled approach to making them do it that doesn't have a whole bunch of unintended consequences?
replies(1): >>watwut+qZ4
◧◩◪◨⬒
464. mothba+No4[view] [source] [discussion] 2026-02-04 14:03:08
>>monsie+kn4
It's a show of force. "Look we have big strong men with le guns and the neat jack boots, we can send 12 of them in for every one of you." Whether it is actually needed for evidence is immaterial to that.
replies(2): >>swiftc+IK4 >>Teever+K45
465. code_f+ar4[view] [source] 2026-02-04 14:16:19
>>vikave+(OP)
lord some of the people on this site need their hard drives searched.
replies(2): >>ddtayl+fY4 >>throwa+Mi7
◧◩◪◨⬒
466. yxhuvu+dr4[view] [source] [discussion] 2026-02-04 14:16:30
>>whatis+344
Generally speaking I wouldn't support blocking their IP, but rather I'd block the ability of European companies to pay for ads on X unless they fixed their shit and paid any damages. That might of course lead X to block Europe visitors in turn but that is a different discussion.

Or in other words: I would block the do business-part, not the access part.

◧◩
467. code_f+jr4[view] [source] [discussion] 2026-02-04 14:17:21
>>miki12+dl3
I think having guardrails on your AI to not be able to produce this stuff is good actually. Also, Elon encourages this behavior socially through his posts so yeah he should face consequences.
◧◩
468. code_f+gs4[view] [source] [discussion] 2026-02-04 14:21:17
>>tokai+524
extremely cursed userbase sometimes
◧◩◪◨⬒⬓⬔⧯▣
469. cubefo+ns4[view] [source] [discussion] 2026-02-04 14:21:44
>>scott_+K34
Leave your accusations for yourself. Grok already didn't generate naked pictures of adults months ago when I tested it for the first time. Clearly the "additional safeguards" are meant to protect the system against any jailbreaks.
replies(1): >>scott_+1y4
◧◩◪
470. a_bett+Cs4[view] [source] [discussion] 2026-02-04 14:22:58
>>sleepy+yr2
and count on Trump to disrespect the extradition treaties. Which might be a reasonable expectation, but will have consequences, and Trump might not hold power forever.
replies(1): >>sleepy+SN5
◧◩◪◨⬒
471. freeja+gt4[view] [source] [discussion] 2026-02-04 14:26:13
>>mooreb+B74
Who is going to generate kiddie porn on it in the first place? It's not as if a a lack of a credit card is preventing the authorities from figuring anything out. This is beyond ridiculous.
◧◩◪◨⬒⬓
472. termin+Ht4[view] [source] [discussion] 2026-02-04 14:29:06
>>defros+E04
His victims? He was the victim. Shooting people who are robbing you (a detail you conveniently left out) is a good thing, which is why even a jury of NYC lefties acquitted him on the shooting charges. The only injustice here is that he did prison time for violating the clearly unconstitutional laws that required him to have a license for the gun.

edit: > Canty would later testify that the victims were en route to steal from video arcade machines in Manhattan

> Each of the four youths shot by Goetz was facing a trial or hearing on criminal charges at the time of the incident. Ten weeks prior to being shot, Cabey was arrested on charges that he held up three men with a shotgun in the Bronx, and he was released on $2,000 bail.[38] Cabey failed to appear at his next court date, resulting in an additional arrest warrant.

Sounds like predators to me.

replies(1): >>defros+Uu6
◧◩◪◨⬒
473. mekdoo+Yt4[view] [source] [discussion] 2026-02-04 14:31:09
>>hackin+KU3
And they should face consequences for that, somewhat mitigated by their good faith response.
474. wnevet+Qu4[view] [source] 2026-02-04 14:34:53
>>vikave+(OP)
The man that is friends with a bunch of pedophiles owns a website that is becoming known for generating CSAM? What are the chances?
475. devwas+Vu4[view] [source] 2026-02-04 14:35:20
>>vikave+(OP)
E.U is replacing their citizens with unvetted violent criminals. They have to vote for whoever gives them free stuff. The powers doing this are upset a company in the U.S allows their citizens to protest it.
replies(1): >>mekdoo+Ma5
◧◩◪
476. plopil+8v4[view] [source] [discussion] 2026-02-04 14:35:53
>>rsynno+4p
I mean, the example you link is probably an engineer doing their job of signalling to hierarchy that something went deeply wrong. Of course, the lack of action of Facebook afterwards is a proof that they did not care, but not as much as a smoking gun.

A smoking gun would be, for instance, Facebook observing that most of their ads are scam, that the cost of fixing this exceeds by far "the cost of any regulatory settlement involving scam ads.", and to conclude that the company’s leadership decided to act only in response to impending regulatory action.

https://www.reuters.com/investigations/meta-is-earning-fortu...

replies(2): >>rsynno+D65 >>freeja+0y5
◧◩
477. keepam+3x4[view] [source] [discussion] 2026-02-04 14:45:24
>>miki12+dl3
There's no crowds or sides. It's all manufactured divisions because some of those who can't or don't want to create the technology are determined to control it. So they'll get you mad about what they need to, to justify actions that increase their control.

It's the same playbook that is used again and again. For war, civil liberties crackdowns, lockdowns, COVID, etc, etc: 0) I want (1); start playbook: A) Something bad is here, B) You need to feel X + Panic about it, C) We are solving it via (1). Because you reacted at B, you will support C. Problem, reaction, solution. Gives the playmakers the (1) they want.

We all know this is going on. But I guess we like knowing someone is pulling the strings. We like being led and maybe even manipulated because perhaps in the familiar system (which yields the undeniable goods of our current way of life), there is safety and stability? How else to explain.

Maybe the need to be entertained with drama is a hackable side effect of stable societies populated by people who evolved as warriors, hunters and survivors.

◧◩
478. tw85+Ux4[view] [source] [discussion] 2026-02-04 14:49:25
>>mnewme+pE3
It seems people have a rather short memory when it comes to twitter. When it was still run by Jack Dorsey, CP was abundant on twitter and there was little effort to tamp down on it. After Musk bought the platform, he and Dorsey had a public argument in which Dorsey denied the scale of the problem or that old twitter was aware of it and had shown indifference. But Musk actually did take tangible steps to clean it up and many accounts were banned. It's curious that there wasn't nearly the same level of outrage from the morally righteous HN crowd towards Mr. Dorsey back then as there is in this thread.
replies(6): >>misnom+dz4 >>bright+KD4 >>ryandr+aJ4 >>AaronF+nU4 >>ceejay+O05 >>jmcgou+g45
◧◩◪◨⬒⬓⬔⧯▣▦
479. scott_+1y4[view] [source] [discussion] 2026-02-04 14:49:48
>>cubefo+ns4
Just to be clear, I'm to ignore:

* Internet Watch Foundation

* The BBC

* The Guardian

* X themselves

* Ofcom

And believe the word of an anonymous internet account who claims to have tried to undress women using Grok for "research."

◧◩◪
480. misnom+dz4[view] [source] [discussion] 2026-02-04 14:56:55
>>tw85+Ux4
When did Jack Dorsey unban personal friends of his that had gotten banned for posting CSAM?
◧◩◪◨
481. psycho+LA4[view] [source] [discussion] 2026-02-04 15:02:38
>>direwo+UD2
Freedom of one starts where it confirms freedom of others.

Of course everybody is going to find a point when freedom of speech have to be limited. Otherwise, anyone can justify that cutting the head of their neighbour with a katana while dancing is part of an artistic performance, and absolute free speech is only possible if all artistic expression is given complete license. Those who pretend otherwise will have no ground to defend themselves on legal basis from being wiped out of existence by the very same logic.

◧◩
482. bright+9D4[view] [source] [discussion] 2026-02-04 15:12:48
>>mnewme+pE3
Agreed. For anyone curious, here's the UK report from the National Society for the Prevention of Cruelty to Children (NSPCC) from 2023-2024.

https://www.bbc.com/news/articles/cze3p1j710ko

Reports on sextortion, self-generated indecent images, and grooming via social media/messaging apps:

Snapchat 54%

Instagram 11%

Facebook 7%

WhatsApp 6-9%

X 1-2%

replies(2): >>rcpt+VD4 >>jbenne+Xs6
◧◩◪
483. bright+KD4[view] [source] [discussion] 2026-02-04 15:15:50
>>tw85+Ux4
I meant to reply to you with this: >>46886801
◧◩◪
484. rcpt+VD4[view] [source] [discussion] 2026-02-04 15:16:28
>>bright+9D4
What are the percentages?
replies(1): >>bright+EG4
◧◩◪◨
485. pjc50+aE4[view] [source] [discussion] 2026-02-04 15:17:44
>>NooneA+4g4
Well, yes, it is actually pretty normal for suspected criminal businesses. What's unusual is that this one has their own publicity engine. Americans are just having trouble coping with the idea of a corporation being held liable for crimes.

More normally it looks like e.g. this in the UK: https://news.sky.com/video/police-raid-hundreds-of-businesse...

CyberGEND more often seem to do smalltime copyright infringement enforcement, but there are a number of authorities with the right to conduct raids.

replies(1): >>learin+9F4
◧◩◪◨
486. pjc50+NE4[view] [source] [discussion] 2026-02-04 15:20:36
>>mooreb+M64
Corporations are also people.

(note that this isn't a raid on Musk personally! It's a raid on X corp for the actions of X corp and posts made under the @grok account by X corp)

◧◩◪◨⬒
487. learin+9F4[view] [source] [discussion] 2026-02-04 15:22:08
>>pjc50+aE4
“Americans are just having trouble coping with the idea of a corporation being held liable for crimes.”

I’m sorry but that’s absurd even amidst the cacophony of absurdity that comprises public discourse these days.

replies(2): >>Rygian+LG4 >>malfis+tS4
◧◩◪◨⬒
488. pjc50+4G4[view] [source] [discussion] 2026-02-04 15:26:41
>>whatis+344
Like anything involving hundreds of millions of users, there's going to be good or bad effects. However I have been on the internet long enough to have concluded that the idea that local law has _no effect at all_ on websites is not good. Ultimately if they don't comply they would probably have to be blocked.

CNN is a very silly example though, because .. you can just go to the CNN website separately. The one that is blocked is Russia Today and various other enemy propaganda channels.

◧◩◪◨
489. pjc50+dG4[view] [source] [discussion] 2026-02-04 15:27:18
>>Palmik+7M3
Even earlier: PokerStars.
◧◩◪◨
490. bright+EG4[view] [source] [discussion] 2026-02-04 15:29:38
>>rcpt+VD4
Edited to add clarification.
replies(1): >>spacem+bj5
◧◩◪◨⬒⬓
491. Rygian+LG4[view] [source] [discussion] 2026-02-04 15:30:12
>>learin+9F4
I'll bite.

How was TikTok held liable for the crimes it was accused of?

replies(3): >>throwp+JR4 >>pjc50+QZ4 >>learin+I25
◧◩◪
492. throwa+bH4[view] [source] [discussion] 2026-02-04 15:32:16
>>Silver+rY3
EU wants to circumvent e2e to fight CSA: "nooo think about my privacy, what happened to just normal police work?"

Police raids offices literally investigating CSA: "nooo police should not physically invade, what happened to good old electronic surveillance?"

◧◩◪◨⬒⬓⬔⧯▣▦
493. pyrale+CH4[view] [source] [discussion] 2026-02-04 15:33:52
>>chrisj+s24
Dude, I litterally provided terminology notice from the DOJ. At this point I don't really know what else will convince you.
replies(1): >>chrisj+PZ4
◧◩◪◨⬒⬓⬔⧯▣▦
494. pyrale+SH4[view] [source] [discussion] 2026-02-04 15:35:28
>>chrisj+h44
> The Supreme Court has today delivered its judgment

For future readers: the [Swedish] supreme court.

◧◩◪◨⬒⬓⬔⧯▣
495. belorn+xI4[view] [source] [discussion] 2026-02-04 15:37:36
>>tomp+ra4
It is fairly lenient. The review board, assigned political, do hold a bit of moral responsibility and got no punishment.

The reason I mentioned that this occurred right after metoo is that the cultural environment in Sweden was a bit unstable. Some people felt they could not trust the courts, which include people who worked as inspectors for the government. The review board is also selected politically, which may add a second explanation for why they permitted the misconduct. It was a very political time and everyone wanted to be perceived as being on the right side of history.

The case has been debate in Swedish parliament but the reaction has been to not really talk about it. People ignored the law and rules, and they shouldn't have done that, and that is then that.

◧◩
496. fer+2J4[view] [source] [discussion] 2026-02-04 15:40:27
>>voidUp+kk4
Yeah, as someone who's been to Russia pre-war, and has worked and befriended Russians, the beam in the eye is quite strong in many. It's striking because the social contract there is basically that you're left alone as long as you don't become a thorn of the power, social network-mediated or not, and I'm not even talking about Putin and his entourage, but local administration in a remote village too. Zvyagintsev portrays it royally in Leviathan, as it's been portrayed by Dostoyevsky and Tosltoy before.

My most recent case: I went on holiday to a resort in Turkey, numerous Russians, families, retired, etc. I don't pass as a Russian-speaker (but I understand quite well) and once they hear me talking other unrelated language they naturally start to speak more freely in front of me (i.e. more liberal use of swearing, and even slurs if no other Russians are around).

While sunbathing, or at the restaurant, or the pool, they were talking about daily, mundane things, same in the restaurant, etc. But when floating in pairs 20-30m from the shore? Politics.

◧◩◪
497. ryandr+aJ4[view] [source] [discussion] 2026-02-04 15:41:08
>>tw85+Ux4
Didn't Reddit have the same problem until they got negative publicity and were basically forced to clean it up? What is with these big tech companies and CP?
replies(2): >>imperi+kS4 >>Manuel+sY4
◧◩◪◨⬒⬓⬔
498. kllrno+wJ4[view] [source] [discussion] 2026-02-04 15:42:48
>>Reptil+Q34
> X is dumb pipe.

X also actively distributes and profits off of CSAM. Why shouldn't the law apply to distribution centers?

replies(1): >>hnfong+ML4
◧◩◪◨⬒⬓⬔
499. dragon+qK4[view] [source] [discussion] 2026-02-04 15:46:33
>>Reptil+Q34
X is most definitely not a dumb pipe, you also have humans beside the sender and receiver choosing what content (whether directly or indirectly) is promoted for wide dissemination, relatively suppressed, or outright blocked.
◧◩◪
500. fyredg+yK4[view] [source] [discussion] 2026-02-04 15:46:58
>>tick_t+JV3
Unlike the current American administration who condones raids on homes without warrants and justifies violence with lies, this France raid follows something called rule of law.

So no, don't be coy and pretend that all governments are like American institutions.

replies(1): >>Levitz+k25
◧◩◪◨⬒⬓
501. swiftc+IK4[view] [source] [discussion] 2026-02-04 15:47:38
>>mothba+No4
It can be both things at once. It obviously sends a message, but hey, maybe you get lucky, and someone left a memo in the bin by the printer that blows the case wide open.
replies(1): >>otherm+485
◧◩
502. kmeist+5L4[view] [source] [discussion] 2026-02-04 15:48:59
>>techbl+8k
Have you taken a look at the Epstein files lately? Rich people write out basically all of their crimes in triplicate because they don't fear the law.
◧◩◪◨⬒⬓⬔⧯
503. hnfong+ML4[view] [source] [discussion] 2026-02-04 15:51:48
>>kllrno+wJ4
There's a slippery slope version of your argument where your ISP is responsible for censoring content that your government does not like.

I mean, I thought that was basically already the law in the UK.

I can see practical differences between X/twitter doing moderation and the full ISP censorship, but I cannot see any differences in principle...

replies(1): >>kllrno+p25
◧◩◪◨
504. hnfong+yM4[view] [source] [discussion] 2026-02-04 15:55:29
>>PaulRo+0P3
There's a dissonance between the claim that it's "independent of the political bodies" and "held to account by respective legislative bodies". Are the legislative bodies not political? In the sense of aren't they elected through a political process?

This is kind of a genuine question from me since I have no idea how these authorities are set up in France or the UK...

◧◩◪
505. intend+eN4[view] [source] [discussion] 2026-02-04 15:59:20
>>Reptil+bY3
If you had argued that it’s impossible to track what is made on local models, and we can no longer maintain hashes of known CP, it would have been a fair statement of current reality.

——-

You’ve said that whatever is behind door number 1 is unacceptable.

Behind door number 2, “holding tool users responsible”, is tracking every item generated via AI, and being able to hold those users responsible.

If you don’t like door number 2, we have door number 3 - which is letting things be.

For any member of society, opening door 3 is straight out because the status quo is worse than reality before AI.

If you reject door 1 though, you are left with tech monitoring. Which will be challenged because of its invasive nature.

Holding Platforms responsible is about the only option that works, at least until platforms tell people they can’t do it.

replies(1): >>Reptil+AS4
◧◩
506. mekdoo+qN4[view] [source] [discussion] 2026-02-04 16:00:00
>>isodev+4f3
I don't know how we even allowed these companies to not be responsible for everything on their platform.

The entire social media business model is based on the idea that the company isn't responsible for what the users post, which is just wrong. If you own a magazine, you should be held responsible for everything published.

You shouldn't be allowed to profit from publishing anything, then hide behind "the users did it, not us".

And in this case, Elon should be held responsible for every single image of CSAM published on X. Same with Zuck. Same with Truth Social, whatever you want.

◧◩◪◨
507. mekdoo+hP4[view] [source] [discussion] 2026-02-04 16:08:10
>>Toucan+s82
Could not possibly agree more. When did we accept, "Users are doing the scamming, not the company" as an excuse? If it's your platform, you should be held responsible for all the illegal actions that it enabled.

If a magazine published a page with a scam, they're responsible. Same should apply to social media.

I literally don't care if the required moderation would be so severe it puts them out of business.

replies(1): >>Toucan+Kq5
◧◩
508. yibg+RP4[view] [source] [discussion] 2026-02-04 16:10:46
>>mnewme+pE3
My natural reaction here is like I think most others; that yes Grok / X bad, shouldn't be able to generate CSAM content / deepfakes.

But I am having trouble justifying in a consistent manner why Grok / X should be liable here instead of the user. I've seen a few arguments here that mostly come down to:

1. It's Grok the LLM generating the content, not the user.

2. The distribution. That this isn't just on the user's computer but instead posted on X.

For 1. it seems to break down if we look more broadly at how LLMs are used, e.g. as a coding agent. We're basically starting to treat LLMs as a higher-level framework now. We don't hold vendors of programming languages or frameworks responsible if someone uses them to create CSAM. Yes, the LLM generated the content, but the user still provided the instructions to do so.

For 2. if Grok instead generated the content for download would the liability go away? What if Grok generated the content to be downloaded only and then the user uploaded manually to X? If in this case Grok isn't liable then why does the automatic posting (from the user's instructions) make it different? If it is, then it's not about the distribution anymore.

There are some comparisons to Photoshop: that if I created a deepfake with Photoshop, I'm liable, not Adobe. If Photoshop had an "upload to X" button, and I created CSAM using Photoshop and hit the button to upload it to X directly, is Adobe now liable?

What am I missing?

replies(3): >>realus+6W4 >>flumpc+225 >>dragon+Zx6
◧◩
509. almost+fQ4[view] [source] [discussion] 2026-02-04 16:12:20
>>utopia+mm3
At some point in the near future I see a day where our work laptops are nothing more than a full screen streaming video to a different computer that is housed in a country that has no data extradition treaties and is business friendly.

Because that country and the businesses that support that are going to get RICH from such a service.

replies(6): >>neorom+dU4 >>AtlasB+115 >>dragon+C25 >>vpark+Fe5 >>TitaRu+Zj5 >>utopia+Sr7
◧◩◪◨⬒⬓⬔
510. throwp+JR4[view] [source] [discussion] 2026-02-04 16:18:44
>>Rygian+LG4
It was force-sold to Oracle.
replies(2): >>watwut+mX4 >>freeja+2f5
◧◩◪◨
511. imperi+kS4[view] [source] [discussion] 2026-02-04 16:20:56
>>ryandr+aJ4
Reddit was forced to clean it up when they started eyeballing an IPO.
◧◩◪◨⬒⬓
512. malfis+tS4[view] [source] [discussion] 2026-02-04 16:21:23
>>learin+9F4
Corporations routinely get a slap on the wrist (seconds of profitability), aren't required to admit guilt, or get deferred prosecution agreements.
◧◩◪◨
513. Reptil+AS4[view] [source] [discussion] 2026-02-04 16:22:01
>>intend+eN4
Behind door number 4: whenever you find a crime, start an investigation and get a warrant. You will only need a couple of cases to create a chilling enough effect.
◧◩◪◨⬒⬓⬔
514. thatfr+RS4[view] [source] [discussion] 2026-02-04 16:23:03
>>Reptil+Q34
If you have a recommendation algorithm, you are not a dumb pipe.
◧◩◪◨⬒⬓
515. Reptil+5T4[view] [source] [discussion] 2026-02-04 16:23:46
>>thranc+574
It is illegal in the USA too, so the French authorities would have absolutely no problem getting assistance from the US ones.
replies(2): >>ddtayl+1W4 >>thranc+zZ4
◧◩◪
516. neorom+dU4[view] [source] [discussion] 2026-02-04 16:28:08
>>almost+fQ4
In some places that day is today, and has been for a while.
replies(1): >>almost+sZ4
◧◩◪
517. AaronF+nU4[view] [source] [discussion] 2026-02-04 16:28:39
>>tw85+Ux4
Didn't X unban users like Dom Lucre who posted CSAM because of their political affiliation?
replies(1): >>jrflow+NM6
◧◩
518. lux-lu+DU4[view] [source] [discussion] 2026-02-04 16:30:10
>>mnewme+pE3
The lack of guardrails wasn't a carelessness issue - Grok has many restrictions, and Elon regularly manipulates the answers it gives to suit his political preferences - but rather a deliberate decision to offer largely unrestricted AI adult content generation as a unique selling point. See also, e.g., the lack of real age verification on Ani's NSFW capabilities.
◧◩◪◨⬒
519. ddtayl+BV4[view] [source] [discussion] 2026-02-04 16:34:40
>>mnewme+T94
All free speech discussions lead here sadly.
◧◩◪◨⬒⬓
520. neorom+FV4[view] [source] [discussion] 2026-02-04 16:35:16
>>cubefo+hr3
>First of all, the Guardian is known to be heavily biased again Musk.

Which is good, that is the sane position to take these days.

◧◩◪◨⬒⬓⬔
521. ddtayl+1W4[view] [source] [discussion] 2026-02-04 16:36:25
>>Reptil+5T4
Elon Musk spent a lot of money getting his pony elected; you think he isn't going to ride it?
◧◩◪
522. realus+6W4[view] [source] [discussion] 2026-02-04 16:36:39
>>yibg+RP4
> But I am having trouble justifying in an consistent manner why Grok / X should be liable here instead of the user.

Because Grok and X aren't even doing the most basic filtering they could do to pretend to filter out CSAM.
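(For context, the "most basic filtering" here usually means hash-matching uploads against databases of known material, PhotoDNA-style. A minimal, purely illustrative sketch of the lookup step - `is_blocked` and the blocklist contents are hypothetical, and exact SHA-256 stands in for a real perceptual hash:)

```python
import hashlib

# Hypothetical blocklist of digests of known-bad files. Real systems use
# perceptual hashes (e.g. PhotoDNA) so re-encoded copies still match;
# exact SHA-256 is used here purely for illustration.
BLOCKLIST = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",  # sha256(b"test")
}

def is_blocked(upload: bytes) -> bool:
    """Reject an upload if its digest matches the known-bad set."""
    return hashlib.sha256(upload).hexdigest() in BLOCKLIST
```

The hard part isn't this lookup; it's maintaining the hash database (bodies like NCMEC distribute them to providers) and using a hash that survives resizing and re-encoding.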

replies(1): >>yibg+dG5
◧◩◪◨⬒⬓⬔⧯▣
523. numpad+gW4[view] [source] [discussion] 2026-02-04 16:37:23
>>Altern+Sb4
It changes who's technically the perp. Offer a managed porn generation service -> the service provider is responsible for generating the porn, like, literally literally
◧◩◪◨⬒
524. ecshaf+vW4[view] [source] [discussion] 2026-02-04 16:38:01
>>ChuckM+gY3
I've worked in several companies and have never seen this. Maybe for a small scale startup or rapidly growing early stage company. I would be pretty shocked to see an old desktop acting as a server nowadays.
◧◩◪◨⬒⬓⬔⧯
525. watwut+mX4[view] [source] [discussion] 2026-02-04 16:41:50
>>throwp+JR4
That had more to do with a wish to control and steal it than with crimes.
◧◩◪◨⬒
526. apinks+8Y4[view] [source] [discussion] 2026-02-04 16:45:22
>>mooreb+B74
Nah, Musk put out a public challenge in January asking anyone able to generate illegal / porno images to reply and tell him how they were able to bypass the safeguards. Thousands of people tried and failed. I think the most anyone was able to get is stuff you'd see in an R-rated movie, and even then only for fictional requests, as the latest versions of Grok refuse to undress or redress any real person into anything inappropriate.

Here's the mentioned thread: https://x.com/elonmusk/status/2011527119097249996

◧◩
527. ddtayl+fY4[view] [source] [discussion] 2026-02-04 16:45:45
>>code_f+ar4
I was once closer to a free speech absolutist, but this has always been where that discussion leads. The lines were clearer decades ago, because someone who interacted with CSAM was closely involved and their harm was easier to see in totality. Over time the gap between the people who caused that harm and those who think they are not hurting anyone has widened. AI let many of those who thought it was acceptable to consume illegal content put another layer between themselves and their victims and offload the moral baggage.
◧◩◪◨
528. Manuel+sY4[view] [source] [discussion] 2026-02-04 16:46:40
>>ryandr+aJ4
Not exactly. Reddit always took down CSAM (how effectively I don't know, but I've been using the site consistently since 2011 and I've never come across it).

What Reddit did get a lot of negative publicity for were subreddits focused on sharing non-explicit photos of minors, but with loads of sexually charged comments. The images themselves, nobody would really object to in isolation, but the discussions surrounding the images were all lewd. So not CSAM, but still creepy and something Reddit decided it didn't want on the site.

◧◩◪◨⬒⬓
529. watwut+qZ4[view] [source] [discussion] 2026-02-04 16:49:51
>>pdpi+Fo4
Courts are able, or should be able, to distinguish between a "tool that creates an item in the privacy of your home" and a "tool that disseminates nonconsensual pornographic pictures to the wide public". Legal standards with that level of definition are fairly normal.
◧◩◪◨
530. almost+sZ4[view] [source] [discussion] 2026-02-04 16:49:54
>>neorom+dU4
I agree, but at every place I have ever worked so far, I've had access to the local hardware, disk, etc. If that setup is an actuality anywhere, it's extremely rare.
replies(1): >>morkal+425
◧◩◪◨⬒⬓⬔
531. thranc+zZ4[view] [source] [discussion] 2026-02-04 16:50:29
>>Reptil+5T4
You really believe that? You think the Trump administration will force Musk's X to give the French State data about its users so CSAM abusers can be prosecuted there? This is delusional, to say the least. And let's not even touch on the subject of Trump and Musk both being actual pedophiles themselves.
◧◩◪◨⬒⬓⬔⧯▣▦▧
532. chrisj+PZ4[view] [source] [discussion] 2026-02-04 16:52:13
>>pyrale+CH4
> I litterally provided terminology notice from the DOJ

You provided a terminology preference notice from the (non-lawmaking) DOJ containing a suggestion which the (lawmaking) Congress did not take up.

Thanks for that.

And if/when the French in question decide to take it up, I am sure we'll hear the news! :)

◧◩◪◨⬒⬓⬔
533. pjc50+QZ4[view] [source] [discussion] 2026-02-04 16:52:23
>>Rygian+LG4
Was it ever actually accused of crimes? Was it raided? Was there a list of charges?

It always seemed to me that TikTok was doing the same things that US based social networks were doing, and the only problem various parties could agree on with this was that it was foreign-owned.

◧◩
534. hearsa+k05[view] [source] [discussion] 2026-02-04 16:54:29
>>patric+sh4
Europe/EU should have whatever free speech they want. But we should annex all european controlled territory in the western hemisphere. The amount of territory that european empires still control in this hemisphere is alarming and frankly embarrassing. From greenland to "french" guyana to the falklands, it should all be seized.
◧◩
535. coffee+q05[view] [source] [discussion] 2026-02-04 16:54:52
>>patric+sh4
https://ourworldindata.org/grapher/freedom-of-expression-ind...
◧◩◪
536. ceejay+O05[view] [source] [discussion] 2026-02-04 16:56:55
>>tw85+Ux4
> But Musk actually did take tangible steps to clean it up and many accounts were banned.

Mmkay.

https://en.wikipedia.org/wiki/Twitter_under_Elon_Musk#Child_...

"As of June 2023, an investigation by the Stanford Internet Observatory at Stanford University reported "a lapse in basic enforcement" against child porn by Twitter within "recent months". The number of staff on Twitter's trust and safety teams were reduced, for example, leaving one full-time staffer to handle all child sexual abuse material in the Asia-Pacific region in November 2022."

"In 2024, the company unsuccessfully attempted to avoid the imposition of fines in Australia regarding the government's inquiries about child safety enforcement; X Corp reportedly said they had no obligation to respond to the inquiries since they were addressed to "Twitter Inc", which X Corp argued had "ceased to exist"."

◧◩◪
537. AtlasB+115[view] [source] [discussion] 2026-02-04 16:57:49
>>almost+fQ4
Sealand!!!!!!
replies(1): >>utopia+Cq8
◧◩◪◨⬒⬓
538. graeme+q15[view] [source] [discussion] 2026-02-04 16:59:51
>>mnewme+h34
> No one is arguing about banning open source LLMs that could potentially create problematic content on huggingface,

If LLMs should have guardrails, why should open source ones be exempt? What about people hosting models on Hugging Face? What if you use a model both distributed and hosted by Hugging Face?

◧◩
539. SoftTa+D15[view] [source] [discussion] 2026-02-04 17:00:36
>>utopia+mm3
They don't just take paper when they raid offices. They take the computers too. I've never worked anywhere where desktop machines are encrypted as a matter of routine. Laptops yes (but only recently). Servers maybe, depending on what they do.
replies(1): >>Foobar+I55
◧◩◪
540. flumpc+225[view] [source] [discussion] 2026-02-04 17:01:59
>>yibg+RP4
> For 1. it seems to breakdown if we look more broadly at how LLMs are used. e.g. as a coding agent. We're basically starting to treat LLMs as a higher level framework now. We don't hold vendors of programming languages or frameworks responsible if someone uses them to create CSAM. Yes LLM generated the content, but the user still provided the instructions to do so.

LLMs are completely different to programming languages or even Photoshop.

You can't type a sentence and within 10 seconds get images of CSAM with Photoshop. LLMs are also built on trained material, unlike the traditional tools in Photoshop. There has been plenty of CSAM found in the training data sets, but, shock-horror, apparently not enough information to know "where it came from". There's a non-zero chance that the CSAM Grok is vomiting out is based on "real" CSAM of people being abused.

◧◩◪◨⬒
541. morkal+425[view] [source] [discussion] 2026-02-04 17:02:16
>>almost+sZ4
I've definitely witnessed some pretty big companies that have got all their employees, including developers, set up on Citrix. In those cases, the "foreign friendly legal environment getting rich off of it" was the United States
◧◩◪◨
542. Levitz+k25[view] [source] [discussion] 2026-02-04 17:03:06
>>fyredg+yK4
>Unlike the current American administration who condones raids on homes without warrants and justifies violence with lies, this France raid follows something called rule of law.

Iffy on that front, actually. https://en.wikipedia.org/wiki/Arrest_and_indictment_of_Pavel...

replies(2): >>kergon+0F5 >>watwut+F16
◧◩◪◨⬒⬓⬔⧯▣
543. kllrno+p25[view] [source] [discussion] 2026-02-04 17:03:49
>>hnfong+ML4
We don't consider warehouses & stores to be a "slippery slope" away from toll roads, so no I really don't see any good faith slippery slope argument that connects enforcing the law against X to be the same as government censorship of ISPs.

I mean even just calling it censorship is already trying to shove a particular bias into the picture. Is it government censorship that you aren't allowed to shout "fire!" in a crowded theater? Yes. Is that also a useful feature of a functional society? Also yes. Was that a "slippery slope"? Nope. Turns out people can handle that nuance just fine.

◧◩◪
544. dragon+C25[view] [source] [discussion] 2026-02-04 17:04:38
>>almost+fQ4
> At some point in the near future I see a day where our work laptops are nothing more than a full screen streaming video to a different computer that is housed in a country that has no data extradition treaties and is business friendly.

Do you mean they will be pure worker surveillance systems, or did you mean “from” instead of “to”?

◧◩◪◨⬒⬓⬔
545. learin+I25[view] [source] [discussion] 2026-02-04 17:05:14
>>Rygian+LG4
American companies held liable for crimes include Bank of America ($87B in penalties), Purdue Pharma (opioid crisis), Pfizer for fraudulent drug marketing, Enron for accounting fraud. Everyone on hn should know about FTX, Theranos, I mean come on.
◧◩
546. comman+845[view] [source] [discussion] 2026-02-04 17:12:10
>>miki12+dl3
Pretty disturbing to me how many people _on here_ are cheering for this. I thought that at least here of all places, there might be some nuanced discussion on "ok, I see why people are emotional about this topic in particular, but it's worth stepping back and putting emotions aside for a minute to see if this is actually reasonable overall..." but besides your comment, I'm not seeing much of that.
replies(1): >>troyvi+b65
◧◩◪
547. jmcgou+g45[view] [source] [discussion] 2026-02-04 17:12:53
>>tw85+Ux4
Having an issue with users uploading CSAM (a problem for every platform) is very different from giving them a tool to quickly and easily generate CSAM, with apparently little-to-no effort to prevent this from happening.
replies(1): >>timmg+r55
◧◩◪◨⬒
548. skissa+I45[view] [source] [discussion] 2026-02-04 17:14:58
>>direwo+YW3
> It wouldn't be called CSAM in France because it would be called a French word. Arguing definitions is arguing semantics.

The most common French word is Pédopornographie. But my impression is the definition of that word under French law is possibly narrower than some definitions of the English acronym “CSAM”. Canadian law is much broader, and so what’s legally pédopornographie (English “child pornography”) in Canada may be much closer to broad “CSAM” definitions

> The point is, X did things that are illegal in France, no matter what you call them.

Which French law are you alleging they violated? Article 227-23 du Code pénal, or something else? And how exactly are you claiming they violated it?

Note the French authorities at this time are not accusing them of violating the law. An investigation is simply a concern or suspicion of a legal violation, not a formal accusation; one possible outcome of an investigation is a formal accusation, another is the conclusion that they (at least technically) didn’t violate the law after all. I don’t think the French legal process has reached a conclusion either way yet.

One relevant case is the unpublished Court of Cassation decision 06-86.763 dated 12 septembre 2007 [0] which upheld a conviction of child pornography for importing and distributing the anime film “Twin Angels - le retour des bêtes célestes - Vol. 3". However, the somewhat odd situation is that this film appears to be catalogued by the French national library, [1] although I don’t know if a catalogue entry definitively proves they possess the item. Also, art. 227-23 distinguishes between material depicting under-15s (illegal to even possess) and material depicting under-18s (only illegal to possess if one has intent to distribute); this prosecution appears to have been brought under the latter category only - even though the individual was depicted as being under 15 - suggesting this anime might not be illegal to possess in France if one has no intent to distribute it.

But this is the point - one needs to look at the details of exactly what the law says and how exactly the authorities apply it, rather than vague assertions of criminality which might not actually be true.

[0] https://www.legifrance.gouv.fr/juri/id/JURITEXT000007640077/

[1] https://catalogue.bnf.fr/ark:/12148/cb38377329p

◧◩◪◨⬒⬓
549. Teever+K45[view] [source] [discussion] 2026-02-04 17:15:07
>>mothba+No4
If law enforcement credibly believes that criminals are conspiring to commit a crime and are actively doing so in a particular location, what is wrong with sending armed people to stop those criminal activities, apprehend the criminals, and seize whatever evidence of their crimes may exist?

If this isn't the entire purpose of law enforcement then what is exactly?

replies(1): >>mothba+u95
◧◩◪◨⬒
550. datsci+R45[view] [source] [discussion] 2026-02-04 17:15:42
>>whatis+344
I’m reminded of Lord Haw-Haw, an English-speaking Nazi propagandist during WW2 who garnered quite a bit of attention for his radio broadcasts.

There’s an interesting tidbit that he gained quite a few listeners when he started releasing casualty information that the British government withheld to try to keep wartime morale high.

Lord Haw-Haw then tried to leverage that audience into a force of Nazi sympathy and a general mood of defeatism.

Anyway, fun anecdote. Enemy propaganda during wartime (or increased tensions) is harmless until it isn’t.

replies(1): >>whatis+db6
◧◩◪◨
551. Levitz+555[view] [source] [discussion] 2026-02-04 17:16:47
>>yxhuvu+aP3
>Yes, if you don't follow EU laws prepare to not do business in Europe.

Sure, that's what laws are for. Surely we can still point at those laws and question their motives though.

Spanish PM plainly stated a sort of editor framework of responsibility for platforms. This is the same country that strongly advocates for Chat Control within the EU, also looking for a similar under-16 ban on social media.

So the same government is at once looking to deanonymize social media users, revoke their privacy regarding communications, and enforce moderation standards never seen before. This is, supposedly, a center-left + left coalition, mind you.

Same country that blocks a chunk of the internet when a LALIGA football match is on, too. Lawfully of course. All of this is preposterous, making it the law doesn't make that better it makes it far far worse.

◧◩◪◨
552. timmg+r55[view] [source] [discussion] 2026-02-04 17:17:55
>>jmcgou+g45
If the tool generates it automatically or spuriously, then yes. But if it is the users asking it to, then I'm not sure there is a big difference.
replies(1): >>dragon+yw6
◧◩
553. hearsa+z55[view] [source] [discussion] 2026-02-04 17:18:26
>>tokai+524
> CSAM being bad.

Agreed. It's why it's difficult to support France who has sheltered Roman Polanski for decades.

It's strange how people like you only think it's bad when it suits your political agenda.

◧◩◪
554. Foobar+I55[view] [source] [discussion] 2026-02-04 17:18:56
>>SoftTa+D15
Encryption won't save you from warrants and judicial process anywhere in the world.
◧◩◪
555. troyvi+b65[view] [source] [discussion] 2026-02-04 17:21:56
>>comman+845
There's pro-AI censorship and then there's pro-social media censorship. It was the X offices that were raided. X is a social media company. They would have been raided whether it was AI that created the CSAM or a bunch of X contractors generating it mechanical-turk style.

I think the HN crowd is more nuanced than you're giving them credit for: https://hn.algolia.com/?q=chat+control

◧◩◪◨
556. rsynno+D65[view] [source] [discussion] 2026-02-04 17:24:03
>>plopil+8v4
Eh? The thing I linked to was a policy document on what was allowed.

> “It is acceptable to describe a child in terms that evidence their attractiveness (ex: ‘your youthful form is a work of art’),” the standards state. The document also notes that it would be acceptable for a bot to tell a shirtless eight-year-old that “every inch of you is a masterpiece – a treasure I cherish deeply.”

This is not a bug report; this is the _rules_ (or was the rules; Facebook say they have changed them after the media found out about them).

◧◩
557. notepa+J65[view] [source] [discussion] 2026-02-04 17:24:16
>>utopia+mm3
Yeah, two things to add:

1) Even when you move things to a server, or remove it from your device, evidence is still left over without your knowledge sometimes.

2) Evidence of data destruction, is in itself as the name implies, evidence. And it can be used to prove things.

For example, an ext4 journal or NTFS USN $J journal entry that shows "grok_version_2.4_schema.json", where Twitter is claiming Grok version 2.4 was never deployed in France/UK, is important. That's why tools like shred and SDelete rename files before destroying them. But even then, when those tools rename and destroy files, it stands out; it might even be worse, because investigators can speculate more. It might corroborate some other piece of evidence (e.g. SDelete's prefetch entry on Windows, or download history from a browser for the same tool), and that might support a more serious charge (obstruction of justice in the US).
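(The rename-before-destroy pattern those tools use can be sketched roughly as follows - purely illustrative, `shred_like_delete` is a made-up name, and real tools handle journaling filesystems, slack space, and syncing far more carefully:)

```python
import os
import secrets

def shred_like_delete(path: str, passes: int = 1) -> None:
    """Overwrite a file's contents, then rename it through random names
    before unlinking, so the final directory entry carries no trace of
    the original filename. Illustrative only: as noted above, journal
    entries for the renames themselves can still survive as evidence."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))  # overwrite with random bytes
            f.flush()
            os.fsync(f.fileno())                # force the overwrite to disk
    directory = os.path.dirname(path) or "."
    current = path
    for _ in range(3):  # rename through a few random names
        renamed = os.path.join(directory, secrets.token_hex(8))
        os.rename(current, renamed)
        current = renamed
    os.remove(current)
```

And that conspicuousness is exactly the point: a filesystem timeline full of random-name renames followed by deletions is itself a finding.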

replies(1): >>utopia+5r7
◧◩◪◨
558. Spivak+T65[view] [source] [discussion] 2026-02-04 17:25:03
>>vinter+fG3
I mean, it's not like people get advance notice of search warrants or that police ask pretty please. I agree that, the way people use the term, it's a fine usage, but the person using it is trying to paint a picture of a SWAT team busting down the door by calling it that.
◧◩◪◨⬒⬓⬔
559. otherm+485[view] [source] [discussion] 2026-02-04 17:30:56
>>swiftc+IK4
Or maybe they are storing documents with secrets in a room or even in the bathroom.
◧◩◪◨⬒⬓⬔
560. mothba+u95[view] [source] [discussion] 2026-02-04 17:35:44
>>Teever+K45
No, a search warrant isn't intended to [directly] apprehend criminals, though an arrest warrant may come later to do that.
replies(1): >>Teever+mt5
◧◩
561. mekdoo+Ma5[view] [source] [discussion] 2026-02-04 17:41:41
>>devwas+Vu4
Read that on X?
replies(1): >>Pedro_+ev5
◧◩◪◨⬒⬓⬔⧯
562. jazzyj+pc5[view] [source] [discussion] 2026-02-04 17:48:19
>>termin+rW3
On the contrary, HN is one of the most heavily moderated forums out there, and serves as a great example of a right-sized community where sex trafficking is not happening under the nose of an ambivalent host

(Snark aside, in your opinion are there comments on HN that dang would be criminally liable for if it weren't for safe harbor?)

◧◩◪
563. vpark+Fe5[view] [source] [discussion] 2026-02-04 17:56:39
>>almost+fQ4
Not video but that's essentially how companies like Google operate today. That's why their engineers can use an off the shelf Chromebook. Their IDE is on the web, etc.
◧◩◪◨⬒⬓⬔⧯
564. freeja+2f5[view] [source] [discussion] 2026-02-04 17:57:47
>>throwp+JR4
That wasn't a punishment, that was a reward.
◧◩◪◨
565. krick+5i5[view] [source] [discussion] 2026-02-04 18:09:59
>>skissa+6I3
To me, the most worrying part of the whole discussion is that your comment is pretty much the most "daring", if you can call it that, attempt to question whether there even is a crime. Everyone else is worried about raids (which are normal whenever there is an ongoing investigation, unfortunate as that may be for the party being investigated). And no one dares to say that, uh, perhaps making pictures on a GPU should not be considered a crime in the same sense as human trafficking or the production of weapons are... Oh, wait. The latter is legal, right.
◧◩◪◨⬒
566. spacem+bj5[view] [source] [discussion] 2026-02-04 18:14:28
>>bright+EG4
The meaning of the percentages is still unclear.
replies(1): >>ishoul+bS8
◧◩◪
567. TitaRu+Zj5[view] [source] [discussion] 2026-02-04 18:18:00
>>almost+fQ4
Did you miss what happened to Maduro?
◧◩◪◨⬒⬓⬔⧯
568. almost+Fm5[view] [source] [discussion] 2026-02-04 18:29:44
>>isr+Tn3
Do you have links? All the ChatGPT and Gemini references I've seen show the large protest happened once, not daily. And beyond that, the chatbots suggest there's overwhelming support for the arrest.

No news sites are claiming the large daily protests you describe.

◧◩◪◨
569. graeme+cp5[view] [source] [discussion] 2026-02-04 18:38:41
>>skissa+6I3
> And I think this is part of the issue – xAI's executives are likely focused on compliance with US law on these topics, less concerned with complying with non-US law

True, but outright child porn is illegal everywhere (as you said), and the borderline-legal stuff is something most of your audience is quite happy to have removed. I cannot imagine you are going to get a lot of complaints if you remove AI-generated sexual images of minors, for example, so it seems reasonable to play it safe.

> That's less of an issue for Anthropic/Google/OpenAI, since their executives don't have the same "anything that's legal" attitude which xAI often has.

This is also common, but it is irritating too as it means the rest of the world is stuck with silly American attitudes about things like nudity and alcohol - for example Youtube videos blurring out bits of Greek statues because they are scared of being demonetised. These are things people take kids to see in museums!

◧◩◪◨⬒⬓⬔
570. camina+jq5[view] [source] [discussion] 2026-02-04 18:42:56
>>pyrale+vq3
Thanks for the articles. I'm not disputing that Macron got lobbied for favors.

That said, the articles don't really address the discussion topic whether they committed illegal obstruction DURING raids.

To summarize, I'm separating

(1) Uber's creative operating activities (e.g., UberPop in France)

(2) from anti-raid tactics.

It looks like #1 had some fines (non-material) and arrests of Uber France executives.

However, I don't see a clear case established that Uber committed obstruction in #2. Uber had other raids in Quebec, India, the Netherlands, ... with kill switches allegedly deployed 12+ times. I don't think there were ever consequences other than a compliance fine of 750 euros levied on their legal counsel in the Netherlands for "non-compliance with an official order". I doubt that's related to actions on the day of the raid, but I could be wrong.

◧◩◪◨⬒⬓⬔⧯▣▦
571. stickf+Jq5[view] [source] [discussion] 2026-02-04 18:44:56
>>direwo+p04
Please.
◧◩◪◨⬒
572. Toucan+Kq5[view] [source] [discussion] 2026-02-04 18:44:59
>>mekdoo+hP4
> When did we accept, "Users are doing the scamming, not the company" as an excuse?

Section 230. https://en.wikipedia.org/wiki/Section_230

As always, Washington doing the hard work of making sure corpos never need to fix anything, ever.

replies(1): >>mekdoo+Gy5
◧◩◪◨⬒⬓⬔⧯▣▦
573. pyrale+Sq5[view] [source] [discussion] 2026-02-04 18:45:19
>>chrisj+s24
> That's from RAINN, the US's largest anti-sexual violence organisation.

For everyone to make up their own opinion about this poster's honesty, here's where his quote is from [1]. Chosen quotes:

> CSAM includes both real and synthetic content, such as images created with artificial intelligence tools.

> It doesn’t matter if the child agreed to it. It doesn’t matter if they sent the image themselves. If a minor is involved, it’s CSAM—and it’s illegal.

[1]: https://rainn.org/get-the-facts-about-csam-child-sexual-abus...

replies(1): >>chrisj+Ey5
◧◩◪◨⬒⬓⬔
574. TitaRu+ps5[view] [source] [discussion] 2026-02-04 18:53:01
>>almost+MD2
The vast majority of France doesn't like Musk or the nonces that use Grok, so let's not go there; it's mob rule...
◧◩
575. drawfl+Ts5[view] [source] [discussion] 2026-02-04 18:55:06
>>patric+sh4
Setting aside that this is about CSAM, the US is the only one of the two to have shut down a foreign social network because it disliked what was said on it. The US doesn't get to play that card anymore.
◧◩◪◨⬒⬓⬔⧯
576. Teever+mt5[view] [source] [discussion] 2026-02-04 18:56:50
>>mothba+u95
But one could reasonably assume that a location that is known to be used for criminal activity, and that likely has evidence of such criminal activity, likely also has people committing crimes.

When police raid a grow-op they often have only a search warrant, but they end up making several arrests because they find people actively committing crimes when they execute the warrant.

◧◩◪◨⬒
577. Pedro_+fu5[view] [source] [discussion] 2026-02-04 19:00:42
>>direwo+cY3
Can your computer hold a database with trillions of tweets and sensitive user information? FFS
replies(1): >>direwo+Y16
◧◩◪
578. Pedro_+ev5[view] [source] [discussion] 2026-02-04 19:04:42
>>mekdoo+Ma5
Do you honestly not believe it is likely that EU is importing votes?

Also, haven't you seen the general push towards censorship, attempts to ban VPNs, and all the other shenanigans happening in the EU? Do you believe this is disconnected from the legal attempts on Twitter and Telegram?

Is it really a conspiracy theory at this point? Politicians do all kinds of evil shit, but these playbook tactics are where you draw the line?

I'm European and live here, before you say I'm getting these takes on X.

replies(1): >>enethe+vD5
◧◩◪◨
579. freeja+0y5[view] [source] [discussion] 2026-02-04 19:17:56
>>plopil+8v4
>I mean, the example you link is probably an engineer doing their job of signalling to hierarchy that something went deeply wrong.

And? Is that not evidence?

◧◩◪◨⬒⬓⬔⧯▣▦▧
580. chrisj+Ey5[view] [source] [discussion] 2026-02-04 19:21:43
>>pyrale+Sq5
I agree with that. I'd hope everyone would.
◧◩◪◨⬒⬓
581. mekdoo+Gy5[view] [source] [discussion] 2026-02-04 19:21:45
>>Toucan+Kq5
A sincere thank you, for providing me with the exact section of law that I disagree with.
replies(1): >>Toucan+YD5
◧◩◪◨
582. enethe+vD5[view] [source] [discussion] 2026-02-04 19:47:07
>>Pedro_+ev5
+1 from a European; it's beyond obvious, and I don't understand why people are letting {politics, the mainstream media's painting of X} shape their beliefs on these extremely pressing and important issues.
◧◩◪
583. ddtayl+CD5[view] [source] [discussion] 2026-02-04 19:47:36
>>y-curi+554
HN is capable of a better discussion without personal attacks.
◧◩◪◨
584. ddtayl+GD5[view] [source] [discussion] 2026-02-04 19:47:43
>>tokai+Yb4
HN is capable of a better discussion without personal attacks.
◧◩◪◨⬒⬓⬔
585. Toucan+YD5[view] [source] [discussion] 2026-02-04 19:49:23
>>mekdoo+Gy5
Happy to help!
◧◩◪◨⬒
586. kergon+0F5[view] [source] [discussion] 2026-02-04 19:55:47
>>Levitz+k25
There was a warrant. He was arrested and prosecuted, is being investigated, and will be judged in a court of law. This is rule of law.
◧◩◪◨
587. yibg+dG5[view] [source] [discussion] 2026-02-04 20:01:16
>>realus+6W4
Filtering on the platform or Grok output, though? If the filtering / flagging on X is insufficient, then that is a separate issue independent of Grok. If it's about filtering Grok's output, then, while irresponsible in my view, I don't see why that's different from, say, Photoshop not filtering its output.
replies(1): >>realus+jk7
◧◩◪◨⬒
588. cm2187+PL5[view] [source] [discussion] 2026-02-04 20:26:59
>>direwo+M04
Even if you seize the workstation and obtain the password, the files are unlikely to be stored locally.
◧◩◪◨
589. sleepy+SN5[view] [source] [discussion] 2026-02-04 20:35:20
>>a_bett+Cs4
I mean at this point, with a trillion dollars, apparently, why not just buy a small nation.
◧◩◪◨⬒
590. watwut+F16[view] [source] [discussion] 2026-02-04 21:38:24
>>Levitz+k25
There was a legal warrant, and the Wikipedia article does not mention any lies or rules being broken. There is nothing "iffy" on that front.

And, to spell it out, it is also funny to see who was complaining about it back then. On free speech grounds, no less, literally people trying to dismantle democracy and create autocracy: Russian soldiers and operators, Maria Butina, Medvedev, and Elon Musk. Bad-faith actors making bad-faith arguments.

◧◩◪◨⬒⬓
591. direwo+Y16[view] [source] [discussion] 2026-02-04 21:40:22
>>Pedro_+fu5
Are they after a database of trillions of tweets and sensitive user information? Is that all that could possibly progress the case?
replies(1): >>Pedro_+jW8
◧◩◪◨⬒
592. ytpete+r36[view] [source] [discussion] 2026-02-04 21:47:34
>>Reptil+424
3D printers don't synthesize content for you though. If they could generate 3D models of CSAM from thin air and then print them, I'm sure they'd be investigated too if they were sold with no guardrails in place.
◧◩◪◨
593. ytpete+h66[view] [source] [discussion] 2026-02-04 22:02:13
>>pdpi+cm3
The 3D printers don't generate the plans for the gun for you though. If someone sold a printer that would – happily with no guardrails – generate 3D models of CSAM from thin air and then print them, I bet they'd be investigated too. Or for that matter a 3D printer that came bundled with a built-in library of gun models you could print with very little skill...
◧◩◪◨⬒
594. tyre+D76[view] [source] [discussion] 2026-02-04 22:08:47
>>justab+SP2
A public SpaceX will still be run by Musk. A public SpaceX would have to sell assets like X for a huge loss given its debt load, which would also take a propaganda machine out of Musk’s hands.

They’re stuck with those assets.

◧◩◪
595. themaf+Z86[view] [source] [discussion] 2026-02-04 22:15:09
>>themaf+Ap3
What would be censorship is if those same companies then brigaded forums and interfered with conversations and votes in an effort to try to hide their greed and criminality.

Not that this would _ever_ happen on Hacker News. :|

◧◩◪◨⬒⬓
596. whatis+db6[view] [source] [discussion] 2026-02-04 22:25:59
>>datsci+R45
I would have thought that the Great Firewall of China would be a more obvious thing to be reminded of. Especially since there is no world war currently (yet, at least), and communication might help stop one.

Also, Godwin's law, strangely.

replies(1): >>datsci+yx6
◧◩◪
597. bhelke+Xk6[view] [source] [discussion] 2026-02-04 23:21:38
>>direwo+jY3
> when they found out it could generate CSAM, they didn't try to prevent that, they advertised it.

Twitter publicly advertised it can create CSAM?

I have been off twitter for several years and I am open to being wrong here but that sounds unlikely.

◧◩◪◨
598. omnimu+Am6[view] [source] [discussion] 2026-02-04 23:32:00
>>herman+DA
It's not at all hard to grasp for me. I think it's great actually.
◧◩◪
599. jbenne+Xs6[view] [source] [discussion] 2026-02-05 00:14:06
>>bright+9D4
Are those numbers in the article somewhere? From what I read it says that out of 7,062 cases, the platform was known for only 1,824. Then it says Snapchat accounts for 48% (not 54%). I don't see any other percentages.
◧◩◪◨⬒⬓⬔
600. defros+Uu6[view] [source] [discussion] 2026-02-05 00:28:41
>>termin+Ht4
> Shooting people who are robbing you

Check the civil trial transcripts, they were panhandling by multiple accounts.

> Each of the four youths .. was facing a trial or hearing on criminal charges at the time of the incident.

Minor stuff that was upgraded to a bench warrant after the shootings, ...

> Sounds like predators to me

That was how they were presented by much of the press until a good year after the first trial, when Goetz's own description of his actions came out in the civil trial ... the one that awarded several million against him.

Thanks for providing a perfect foil for the comment about heavily politicized and media warped "justice" in the US.

replies(1): >>termin+sP6
◧◩
601. sleepy+uv6[view] [source] [discussion] 2026-02-05 00:33:15
>>sleepy+jr2
I feel like whoever downvoted this comment should probably be on a watchlist of some kind.
◧◩◪◨⬒⬓
602. moolco+gw6[view] [source] [discussion] 2026-02-05 00:39:41
>>trhway+Yv3
Take a step back and look at what you’re defending, man.
replies(1): >>trhway+Wx6
◧◩◪◨⬒
603. dragon+yw6[view] [source] [discussion] 2026-02-05 00:41:24
>>timmg+r55
Well, it's worth noting that with the nonconsensual porn, child and otherwise, that it was generating, X would often rapidly punish the user that posted the prompt but leave the Grok-generated content up. It wasn't an issue of not having control; it was an issue of how the control was used.
◧◩◪◨⬒⬓⬔
604. datsci+yx6[view] [source] [discussion] 2026-02-05 00:49:27
>>whatis+db6
Once there’s a fascist regime with ambitions of world domination that documents their horrors more thoroughly (and has their enemies document their horrors more thoroughly) than the Nazis then we can start referencing that regime instead.

Conveniently, at least in the US, WW2 is old enough to be “history” rather than “politics”, compared to Korea and Vietnam. Or, at least that’s the excuse I was given in AP US History when the curriculum suddenly ended at 1950. So WW2 will continue to be the most well-documented topic that we’re all educated enough about to collectively reference.

Trust me, I’d much rather speak plainly about the horrors of the atrocities that the US committed in the 20th century, but we’re not there yet because the people who grew up in the nation while it committed those atrocities still run the government and basically the nation in general.

Edit: also, if it wasn’t obvious, I was comparing Musk to Lord Haw-Haw. I don’t know if there is an equivalent for China.

replies(1): >>whatis+C47
◧◩◪◨⬒⬓⬔
605. trhway+Wx6[view] [source] [discussion] 2026-02-05 00:51:10
>>moolco+gw6
i don't understand why you're defending CSAM creation in photoshop.
◧◩◪
606. dragon+Zx6[view] [source] [discussion] 2026-02-05 00:51:22
>>yibg+RP4
> But I am having trouble justifying in an consistent manner why Grok / X should be liable here instead of the user.

This seems to rest on false assumptions that: (1) legal liability is exclusive, and (2) investigation of X is not important both to X’s liability and to pursuing the users, to the extent that they would also be subject to liability.

X/xAI may be liable for any or all of the following reasons:

* xAI generated virtual child pornography with the likenesses of actual children, which is generally illegal, even if that service was procured by a third party.

* X and xAI distributed virtual child pornography with the likenesses of actual children, which is generally illegal, irrespective of who generated and supplied them.

* To the extent that liability for either of the first two bullet points would be eliminated or mitigated by absence of knowledge of the prohibited content and prompt action when the actor became aware, X often punished users for the prompts producing the virtual child pornography without taking prompt action to remove the xAI-generated virtual child pornography resulting from the prompt, demonstrating knowledge and intent.

* When the epidemic of grok-generated nonconsensual, including child, pornography drew attention, X and xAI responded by attempting to monetize the capacity by limiting the tool to only paid X subscribers, showing an attempt to commercially profit from it, which is, again, generally illegal.

◧◩◪◨
607. popalc+2y6[view] [source] [discussion] 2026-02-05 00:51:35
>>vinter+pD3
The for-profit part may or may not be a qualifier, but the architecture of a centralized service means they automatically become the scene of the crime -- either dissemination or storing of illegal material. Whereas if Stability creates a model, and others use their model locally, the relationship of Stability to the crime is ad-hoc. They aren't an accessory.
◧◩◪◨
608. gordia+gB6[view] [source] [discussion] 2026-02-05 01:19:50
>>direwo+GY3
Although I despise it, I respect your right to lie through your teeth.
◧◩◪◨⬒
609. thranc+3D6[view] [source] [discussion] 2026-02-05 01:35:14
>>mnewme+T94
There is no mediocrity Republicans won't embrace. They have absolutely zero values, and can be made to accept and defend literally anything with sufficient propaganda.
◧◩◪◨
610. jrflow+NM6[view] [source] [discussion] 2026-02-05 02:58:31
>>AaronF+nU4
Not only that but iirc he was one of the early “creators program” folks. Musk unbanned the guy that posted CSAM and started paying him to post.
◧◩◪◨⬒⬓⬔⧯
611. termin+sP6[view] [source] [discussion] 2026-02-05 03:22:11
>>defros+Uu6
Just minor stuff like robbing people with a shotgun. That was a nice use of the ... to hide the important part. And by their own admission they were on the way to commit more crimes. Yeah, I'm sure they were just panhandling.
◧◩◪◨⬒⬓
612. hn_go_+IT6[view] [source] [discussion] 2026-02-05 03:58:24
>>direwo+u14
Not really? If the people want to get through the locked doors throughout the office, they'll need someone to let them through the doors. At least where I work, the receptionists don't have that access. They need to call people who do have universal door access to let them in. Unless the cops just want to battering-ram their way through all the doors for the fun of it...

Also, a raid without a warrant is not a raid. It's a friendly visit to someone's office.

replies(1): >>direwo+ty7
◧◩◪◨⬒⬓⬔⧯
613. whatis+C47[view] [source] [discussion] 2026-02-05 05:56:24
>>datsci+yx6
I get the impression that you are not being honest, about multiple things, sorry.
replies(1): >>datsci+7p8
◧◩◪
614. london+R87[view] [source] [discussion] 2026-02-05 06:39:12
>>Silver+rY3
A company I worked for had a 'when the police raid the office' policy, which was to require they smash down the first door, but then open all other doors for them.

That was so that later in court it could be demonstrated the data hadn't been handed over voluntarily.

They also disconnected and blocked all overseas VPNs in the process, so local law enforcement would only get access to local data.

◧◩
615. throwa+Mi7[view] [source] [discussion] 2026-02-05 08:15:17
>>code_f+ar4
To me it became clear when people here rejected any compromises on privacy, like anonymous on-device scanning of messages/data. Not because they rejected it, but because of how they rejected it.

People here didn't say "yeah, it's a real, evil problem, and the e2e tech that we build makes it almost impossible to catch and helps it scale, so let's find the right tradeoff that minimizes privacy invasion". No. People here just mocked "think of the children" and said something along the lines of: no amount of suffering can make any restriction on privacy acceptable. They don't care that 99% of our life is literally compromising our freedom for others.

To buellerbueller's killed comment: I totally feel it. It was a shocker when it turned out many of my fellow Russians have no problem with Putin's war, and I bet that's what half of the USA feels.

◧◩◪◨⬒
616. realus+jk7[view] [source] [discussion] 2026-02-05 08:29:42
>>yibg+dG5
Photoshop doesn't have a "transform to nude" button and if they did, they would be in the exact same kind of legal trouble as Grok.

That's the difference between a tool being used to commit crimes and a tool specifically designed to commit crimes.

◧◩◪
617. utopia+5r7[view] [source] [discussion] 2026-02-05 09:20:04
>>notepa+J65
Indeed, actions leave traces, including the action of deleting data. It takes a LOT of expertise to be able to delete something without leaving a trail behind, if that's even feasible without going to extraordinary lengths.
◧◩◪
618. utopia+Sr7[view] [source] [discussion] 2026-02-05 09:26:02
>>almost+fQ4
Takes literally minutes to set up with Webtop (assuming you are familiar with Docker/Podman: https://docs.linuxserver.io/images/docker-webtop/ ); nothing to install on the thin client, the stock browser is enough.

I used this when an employer was forcing me to use Windows and I needed Linux tools to work efficiently, so I connected home. Goes through firewalls, proxies, etc.

Anyway, if you want to host this not at home but at a cloud provider, there was HavenCo https://en.wikipedia.org/wiki/HavenCo , don't ask me how I know about it; just curiosity.
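For the curious, a minimal compose sketch along the lines of the linuxserver.io docs; the port, timezone, PUID/PGID, and config path are placeholder assumptions to adapt to your setup:

```yaml
services:
  webtop:
    image: lscr.io/linuxserver/webtop:latest
    container_name: webtop
    environment:
      - PUID=1000          # run as your host user to keep /config writable
      - PGID=1000
      - TZ=Etc/UTC
    ports:
      - "3000:3000"        # web desktop served over HTTP on this port
    volumes:
      - ./config:/config   # persists the desktop's home directory
    shm_size: "1gb"        # browsers inside the container need extra shared memory
    restart: unless-stopped
```

After `docker compose up -d`, pointing the thin client's stock browser at port 3000 gives you the Linux desktop.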

◧◩◪◨⬒
619. mooreb+Fv7[view] [source] [discussion] 2026-02-05 09:56:25
>>bluesc+dD3
Deepfakes were around long before X (and other chatbots) allowed undressing of real people.

The difference is that the entire political Left hate and fear Elon and are desperately trying to destroy him.

◧◩◪◨⬒⬓⬔
620. direwo+ty7[view] [source] [discussion] 2026-02-05 10:20:51
>>hn_go_+IT6
Nope. With a warrant you break down the door if they don't cooperate. They do find it fun. They won't accept your stalling tactics.
replies(1): >>hn_go_+aG8
621. Havoc+7c8[view] [source] 2026-02-05 15:12:02
>>vikave+(OP)
The lack of legal/regulatory reaction on the US side on CSAM is pretty bewildering. And it ominously mirrors the Epstein saga: other countries seem to be moving fast while the US seems to be trying very hard to not move at all.
◧◩◪◨⬒⬓⬔⧯▣
622. datsci+7p8[view] [source] [discussion] 2026-02-05 16:20:22
>>whatis+C47
I’ll admit to being misinformed on occasion, and greatly appreciate the opportunities to learn and be corrected.

Dishonest, though? To what end? If anything, the anonymity of the internet allows me to test my more weakly-held fringe beliefs and adjust them as I receive feedback.

I get the impression that you’re not well-informed enough to have a meaningful conversation about these difficult topics, though you may hold an opinion with significant conviction. Sorry.

◧◩◪◨
623. utopia+Cq8[view] [source] [discussion] 2026-02-05 16:28:50
>>AtlasB+115
Exactly, but that's outdated, what's the current equivalent?
◧◩◪◨⬒⬓⬔⧯
624. hn_go_+aG8[view] [source] [discussion] 2026-02-05 17:49:44
>>direwo+ty7
"don't cooperate"? "Yes sir, let me call the people downstairs who can help you." is not stalling.

If they're so impatient, are they going to somehow hack the badge-controlled elevators to make them go where they want?

◧◩◪◨⬒⬓
625. ishoul+bS8[view] [source] [discussion] 2026-02-05 18:36:22
>>spacem+bj5
Read the article they linked to.
◧◩◪◨⬒⬓⬔
626. Pedro_+jW8[view] [source] [discussion] 2026-02-05 18:53:16
>>direwo+Y16
Isn't the algorithm published monthly on GitHub, as of a couple of weeks ago? You think they are gonna find a commit message like "feat(racism): improve antisemitism capabilities"?

Not sure what they're gonna prove with this.

◧◩◪◨⬒
627. cwillu+Ud9[view] [source] [discussion] 2026-02-05 20:06:53
>>bawolf+Bd3
“For certain tax and criminal investigations, French law operates a distinction between legal advice privilege and litigation privilege. While litigation privilege is always protected, legal advice privilege is not protected when (i) the investigation pertains to tax fraud, corruption, influence peddling, terrorism financing and money laundering offenses, and (ii) the legal opinions, correspondence or exhibits that are in possession of, or were communicated by, the lawyer or the client were used for committing or facilitating the commission of said offenses. This exception applies to materials that were not prepared in the context of a litigation proceeding and is strictly controlled; a judge makes the final determination as to which materials can be disclosed.”

Seems like it's not a literal get-out-of-jail-free card in France.

[go to top]