Finding an original copy on a GoPro would likely be pretty compelling evidence, but this (and the scarier, politically centered questions like it) is why I wish we had a way to build a durable chain of custody into these technologies. It seems infeasible from everything I've seen, but it would be a big win for society.
Do you? Consider for a moment all the dissidents and protesters who would then be ensnared by their own devices, with no "it was all AI" defense available.
https://en.wikipedia.org/wiki/Content_Authenticity_Initiativ...
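To give a rough sense of the core mechanism, here's a toy sketch (Python with the "cryptography" package and an Ed25519 per-device key; this is emphatically not the real C2PA/CAI manifest format): the camera signs a hash of the captured bytes, and anyone holding the device's public key can check that nothing changed after capture.

    # Toy sketch of device-side signing and third-party verification.
    # NOT the C2PA/CAI format, just the core idea: a per-device Ed25519
    # key signs a hash of the captured bytes.
    import hashlib
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    device_key = Ed25519PrivateKey.generate()   # would live in the camera's secure element
    public_key = device_key.public_key()        # published/certified by the manufacturer

    def sign_capture(video_bytes: bytes) -> bytes:
        digest = hashlib.sha256(video_bytes).digest()
        return device_key.sign(digest)          # signature travels with the file

    def verify_capture(video_bytes: bytes, signature: bytes) -> bool:
        digest = hashlib.sha256(video_bytes).digest()
        try:
            public_key.verify(signature, digest)
            return True
        except InvalidSignature:
            return False                        # bytes changed after capture (or wrong device key)

    footage = b"raw video bytes from the sensor"  # placeholder stand-in
    sig = sign_capture(footage)
    print(verify_capture(footage, sig))          # True
    print(verify_capture(footage + b"!", sig))   # False: any edit breaks the chain

The hard parts the sketch glosses over are exactly the ones being argued about here: keeping the device key from being extracted, and deciding who certifies the public keys.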
I don't think the lack of a durable chain of custody really provides any protection - that protection needs to come from a strong legal system and a social contract that protects whistleblowers. If you're thinking of, say, an Iranian smuggling out protest footage, they're already taking an extreme risk against a state using numerous tools to try to track them down - but the lack of a durable chain gives the authorities plenty of room to cast doubt on the truth.
I think your question is interesting to ponder, and there are arguments in both directions - but my mind keeps coming back to the Tank Man photo being smuggled out of China, and how much more difficult it would be in the modern world for a single image to carry that kind of weight.
And those people now have the power to put you in jail by putting your camera's signature on illegal content.
You've also just made journalism three notches harder. Like documenting atrocities in, say, North Korea. Or whistleblowing at the steel mill back home run by a corporate slavedriver.
Oh. Also. Why put this burden on the camera side? Why not the AI side? Require watermarks and signatures for anything created that way…
…of course that has its own set of intractable problems.
That social contract is pretty hit-and-miss if you look at countries across the globe. Same for the strong legal system. Other concerns aside, doesn't that make the whole approach a non-starter?
We'd probably see a lot of that with SSL if it weren't so unimportant from a political perspective[1]... but if the thing we're trying to secure can directly boost or damage some prominent politician, the pressure is going to be on a whole different scale.
[1] And we might still see that kind of corruption of SSL when it comes to targeted phishing attacks.
"I extracted and added the noise profile to the AI generated video with a goPro to make it look legitimate"
Nah. People who do something like this can't help but brag. They'll voluntarily incriminate themselves within seconds.
More likely, the signing would have to use compression-resistant steganography; otherwise it's pretty easy to just remux or re-encode the video to strip the metadata.
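To illustrate why metadata-level signing is so fragile (a sketch assuming ffmpeg is installed; the filenames are placeholders): a lossless remux copies the video stream bit-for-bit but drops the container metadata where a signature would typically ride.

    import subprocess

    # Lossless remux: streams are copied untouched, global metadata is dropped.
    subprocess.run(
        [
            "ffmpeg", "-i", "signed_original.mp4",
            "-map_metadata", "-1",   # discard container-level metadata (where a signature might sit)
            "-c", "copy",            # no re-encode, so the pixels come through byte-identical
            "stripped_copy.mp4",
        ],
        check=True,
    )

Which is why the signal would have to live in the pixels themselves and survive re-encoding - a much harder problem.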
I don't think that's true. Only someone who wanted to prove authenticity would need to grab the signature. No private keys would be exposed (except those that were hacked).
If Netflix and Amazon can't keep their 4K HDR web rips from being leaked (supposedly via licenses extracted from Nvidia Shields), I have no idea how we'd expect every camera manufacturer to manage it. Maybe Apple could with iPhones and its other flagship devices, but even then we'd find vulns in older devices over time.
Creating a fake video about a protest is going to upset those who want to hide the protest just as much.