zlacker

1. JaySta+ (OP) 2022-05-24 02:39:22
Huh, I had never thought of that. It makes it seem like a small window of authenticity is closing.

The irony is that if you had a great discriminator to separate the wheat from the chaff, it would probably make its way into the next model and no longer be useful.
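
(Concretely, a toy sketch of that dynamic in PyTorch, with made-up sizes: once a detector's score is available and differentiable, it just becomes the generator's training objective, GAN-style. This isn't how any real lab trains anything, it's only an illustration.)

    import torch
    import torch.nn as nn

    latent_dim, img_dim = 64, 784  # toy dimensions, purely illustrative

    # The "great discriminator" someone released to tell real images from generated ones.
    detector = nn.Sequential(nn.Linear(img_dim, 256), nn.ReLU(), nn.Linear(256, 1))

    # The next image model simply trains against it.
    generator = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                              nn.Linear(256, img_dim), nn.Tanh())

    opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
    bce = nn.BCEWithLogitsLoss()

    for step in range(1_000):
        z = torch.randn(32, latent_dim)
        fake = generator(z)
        # Push the detector's verdict on generated images toward "real" (label 1).
        loss = bce(detector(fake), torch.ones(32, 1))
        opt.zero_grad()
        loss.backward()
        opt.step()

    # After enough steps the detector is, by construction, exactly the thing the
    # new generator has learned to fool, so it stops being a useful authenticity test.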

My only recommendation is that OpenAI et al. should tag every generated image as synthetic in its metadata. That would be a really interesting flag for media file formats to carry (native format support would be much better than metadata, though) and probably useful across a lot of domains.
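
(For illustration, a minimal sketch of what that tag could look like as a PNG text chunk, using Pillow; the "Synthetic" and "Generator" keys here are made-up placeholders, not any real standard or anything OpenAI actually writes.)

    from PIL import Image, PngImagePlugin

    def tag_as_synthetic(src_path: str, dst_path: str) -> None:
        """Re-save an image as PNG with a text chunk marking it as machine-generated."""
        img = Image.open(src_path)
        meta = PngImagePlugin.PngInfo()
        meta.add_text("Synthetic", "true")
        meta.add_text("Generator", "example-model")  # hypothetical provenance field
        img.save(dst_path, format="PNG", pnginfo=meta)

    def is_tagged_synthetic(path: str) -> bool:
        """Check for the tag; absence proves nothing, since metadata is trivially stripped."""
        img = Image.open(path)
        return getattr(img, "text", {}).get("Synthetic") == "true"

Of course, anything stored as metadata only survives until the first re-encode or strip, which is exactly why native format support would be better.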

replies(1): >>joshsp+m5
2. joshsp+m5 2022-05-24 03:47:39
>>JaySta+(OP)
The OpenAI access agreement actually says that you must add (or keep?) a watermark on any generated images, so you’re in good company with that line of thinking.