zlacker

[parent] [thread] 13 comments
1. vbezhe+(OP)[view] [source] 2026-02-06 19:55:10
Put a private key into every digital camera and hash/sign every frame. The private key is accompanied by a manufacturer signature and can't be easily extracted. Mark all unsigned media as suspicious.
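A minimal sketch of what per-frame signing could look like. HMAC-SHA256 stands in for the asymmetric signature a real scheme would need (verifiers must not hold the device secret); all key material and names here are hypothetical.

```python
import hashlib
import hmac

# Hypothetical device secret, imagined as baked into camera hardware.
DEVICE_KEY = b"secret-baked-into-camera-hardware"

def sign_frame(frame_bytes: bytes) -> bytes:
    """Hash the frame, then sign the digest with the device key."""
    digest = hashlib.sha256(frame_bytes).digest()
    return hmac.new(DEVICE_KEY, digest, hashlib.sha256).digest()

def verify_frame(frame_bytes: bytes, signature: bytes) -> bool:
    """Recompute the signature and compare in constant time."""
    return hmac.compare_digest(sign_frame(frame_bytes), signature)

frame = b"\x00\x01\x02"  # stand-in for raw sensor data
sig = sign_frame(frame)
assert verify_frame(frame, sig)             # untampered frame verifies
assert not verify_frame(frame + b"x", sig)  # any edit invalidates it
```

With a real asymmetric scheme (e.g. Ed25519), `verify_frame` would use only the public key, so playback software could check provenance without ever seeing the secret.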
replies(6): >>gabrie+L >>acesso+Z >>widder+21 >>malfis+f1 >>eqvino+j1 >>catera+P1
2. gabrie+L[view] [source] 2026-02-06 19:59:07
>>vbezhe+(OP)
Isn't that similar to this?

https://en.wikipedia.org/wiki/Content_Authenticity_Initiativ...

3. acesso+Z[view] [source] 2026-02-06 20:00:15
>>vbezhe+(OP)
More surveillance and tracking won't be the solution.
4. widder+21[view] [source] 2026-02-06 20:00:32
>>vbezhe+(OP)
"and can't be easily extracted" is doing a lot of work there. People are very good at reverse-engineering. There would soon be a black market for 'clean' private keys that could be used to sign any video you want.
replies(2): >>munk-a+32 >>nerdsn+i3
5. malfis+f1[view] [source] 2026-02-06 20:01:42
>>vbezhe+(OP)
That certainly won't be used to violate someone's privacy.
6. eqvino+j1[view] [source] 2026-02-06 20:01:52
>>vbezhe+(OP)
"can't easily be extracted" = "the number of people who can extract it is small but still non-zero"

And those people now have the power to put you in jail, by putting your camera's signature on illegal content.

You've also just made journalism 3 notches harder. Like documenting atrocities in, say, North Korea. Or for whistleblowers in your home steel mill run by a corporate slavedriver.

Oh. Also. Why are you choosing the camera side to put this on? Why not the AI side? Require watermarks and signatures for anything created in such a way…

…of course that has its own set of intractable problems.

replies(1): >>nerdsn+L3
7. catera+P1[view] [source] 2026-02-06 20:04:01
>>vbezhe+(OP)
That makes it easy to prove authenticity (has signature), but doesn’t solve the “prove it’s fake” problem.
replies(1): >>nerdsn+l2
8. munk-a+32[view] [source] [discussion] 2026-02-06 20:05:34
>>widder+21
There would also be a requirement for all playback to actually properly check the private keys and for all the parties involved in the process to be acting in good faith. Not only would you have a black market for individuals to scalp clean keys but you'd likely have nation states with interests putting pressure on local manufacturers to give them backdoors.

We'd probably hit a lot of that with SSL if it wasn't so unimportant from a political perspective[1]... but if the thing we were trying to secure is going to boost or damage some prominent politician directly then the level of pressure is going to be on a whole different scale.

1. And we might still have that corruption of SSL when it comes to targeted phishing attacks.

replies(1): >>cheeze+w3
9. nerdsn+l2[view] [source] [discussion] 2026-02-06 20:07:26
>>catera+P1
Ideally, the prosecutor bears the burden of proof. We generally shouldn't impose systems that require defendants to prove a negative. I recognize that reality does not necessarily match this ideal.
replies(1): >>wat100+z9
10. nerdsn+i3[view] [source] [discussion] 2026-02-06 20:12:03
>>widder+21
There's also always the "analog loophole". Display the AI-generated video on a sufficiently high-resolution / color gamut display and record it on whatever device has convenient specs for making the recording, then do some light post-processing to fix moire/color/geometry. This would likely be detectable, but could shift the burden of (dis-)proof to the defendant, who might not have the money for the expert witnesses required to properly argue the technical merits of their case.

More likely, the signing would have to use compression-resistant steganography, otherwise it's pretty easy to just remux/re-encode the video to strip the metadata.
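A toy illustration of the remux problem: if the signature rides along as container metadata, re-encoding the frames into a fresh container silently drops it. The container format and function names here are hypothetical.

```python
import hashlib

def make_container(frames):
    """Pack frames plus a provenance signature stored as metadata."""
    payload = b"".join(frames)
    return {"frames": frames,
            "metadata": {"sig": hashlib.sha256(payload).hexdigest()}}

def reencode(container):
    """A re-encode keeps (approximately) the pictures, not the metadata."""
    return {"frames": list(container["frames"]), "metadata": {}}

original = make_container([b"frame1", b"frame2"])
stripped = reencode(original)
assert "sig" in original["metadata"]
assert "sig" not in stripped["metadata"]  # provenance is gone after remux
```

Hence the suggestion above: a scheme that survives re-encoding would have to embed the mark in the pixels themselves (steganography), not the container.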

11. cheeze+w3[view] [source] [discussion] 2026-02-06 20:13:25
>>munk-a+32
> There would also be a requirement for all playback to actually properly check the private keys

I don't think that's true. Verification would only be needed by someone who wanted to prove authenticity by checking the signature. No private keys would be exposed (except those that were hacked.)

If Netflix and Amazon can't keep their 4k HDR webrips from being leaked (supposedly via extracted licenses from Nvidia Shields), I have no idea how we'd expect all camera manufacturers to do it. Maybe iPhones and flagship Apple devices, but even then we'd find vulns in older devices over time.

replies(1): >>munk-a+X6
12. nerdsn+L3[view] [source] [discussion] 2026-02-06 20:14:37
>>eqvino+j1
Ideally, the keys would be per-manufacturer, like HDCP or (DVD-)CSS. Personally I don't think I'd love the idea of any kind of attestation like this, but if TPTB did implement it, I'd prefer a key per-manufacturer rather than each unit having its own unique signing key. We do have precedent, in the form of printer tracking dots, which were kept 'secret' from the public for 20 years. [0]

0: https://en.wikipedia.org/wiki/Printer_tracking_dots
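A sketch of the privacy difference being described, again with HMAC standing in for real signatures and all keys hypothetical: under a per-manufacturer key every camera of that make signs identically, so a verified clip reveals only the maker, whereas per-unit keys tie a clip to one specific device.

```python
import hashlib
import hmac

MAKER_KEY = b"acme-shared-key"           # hypothetical: one key per manufacturer
UNIT_KEYS = {"cam-0001": b"unit-key-1",  # hypothetical: one key per device
             "cam-0002": b"unit-key-2"}

def sign(key: bytes, data: bytes) -> bytes:
    return hmac.new(key, data, hashlib.sha256).digest()

clip = b"video-bytes"

# Per-manufacturer: two different cameras of the same make produce
# identical signatures, so the clip is attributable only to "Acme".
cam_a_sig = sign(MAKER_KEY, clip)
cam_b_sig = sign(MAKER_KEY, clip)
assert cam_a_sig == cam_b_sig

# Per-unit: signatures differ, so the clip is traceable to one unit.
assert sign(UNIT_KEYS["cam-0001"], clip) != sign(UNIT_KEYS["cam-0002"], clip)
```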

13. munk-a+X6[view] [source] [discussion] 2026-02-06 20:34:17
>>cheeze+w3
I was thinking more about the spread of disinformation at large - but yeah, that playback requirement would only be necessary for anything that wanted to be considered a potential source, and trying to protect against disinformation platforms is a much larger problem than technology can solve on its own.
14. wat100+z9[view] [source] [discussion] 2026-02-06 20:49:37
>>nerdsn+l2
It's ultimately up to juries to decide whether a defendant's assertion that evidence is fake is enough to constitute reasonable doubt in the absence of hard evidence for it. I imagine that's going to be very context-dependent. It would probably work if I were accused of this, with no history of anything like this, versus a guy who does this frequently, posts videos of himself doing it regularly, and never gave any indication they're fake until he got in trouble.