https://en.wikipedia.org/wiki/Content_Authenticity_Initiative
And those people now have the power to put you in jail, by putting your camera's signature on illegal content.
You've also just made journalism 3 notches harder. Like documenting atrocities in, say, North Korea. Or for whistleblowers in your hometown steel mill run by a corporate slavedriver.
Oh. Also. Why are you choosing the camera side to put this on? Why not the AI side? Require watermarks and signatures for anything created in such a way…
…of course that has its own set of intractable problems.
We'd probably hit a lot of that with SSL if it weren't so unimportant from a political perspective[1]... but if the thing we're trying to secure could directly boost or damage some prominent politician, the level of pressure would be on a whole different scale.
1. And we might still have that corruption of SSL when it comes to targeted phishing attacks.
More likely, the signing would have to use compression-resistant steganography; otherwise it's pretty easy to just remux or re-encode the video and strip the metadata.
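To make that concrete, here's a rough Python sketch (using the `cryptography` package; the camera key and byte strings are made up for illustration, and this is not how CAI/C2PA actually packages signatures). A signature computed over the encoded bytes verifies fine until the file is re-encoded, at which point the bytes change and verification fails; and if the signature only lives in container metadata, a plain remux can drop it entirely. That's the argument for embedding the proof in the pixels themselves.

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Hypothetical camera key pair; in a real device the private key would sit in
# a secure element and never leave the camera.
camera_key = Ed25519PrivateKey.generate()
public_key = camera_key.public_key()

original_video = b"\x00\x01\x02"    # stand-in for the encoded bitstream
signature = camera_key.sign(original_video)  # stored as metadata / sidecar

reencoded_video = b"\x00\x01\x03"   # same pixels conceptually, different bytes

def verifies(data: bytes, sig: bytes) -> bool:
    # Verification needs only the public key, the data, and the signature.
    try:
        public_key.verify(sig, data)
        return True
    except InvalidSignature:
        return False

print(verifies(original_video, signature))   # True
print(verifies(reencoded_video, signature))  # False: the signature covers bytes, not content
```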
I don't think that's true. Only someone who wants to prove authenticity needs to hold on to the signature. No private keys would be exposed (except those that were hacked).
If Netflix and Amazon can't keep their 4k HDR webrips from being leaked (supposedly via extracted licenses from Nvidia Shields), I have no idea how we'd expect all camera manufacturers to do it. Maybe iPhones and flagship Apple devices, but even then we'd find vulns in older devices over time.