zlacker

[parent] [thread] 2 comments
1. manimi+(OP)[view] [source] 2022-12-15 13:09:44
I am not allowed to print $100 bills with my general-purpose printer. Many printing and copy machines come with built-in safeguards to prevent users from even trying.

It's quite possible to apply the same kind of protections to generative models. (I hope this does not happen, but it is fully possible.)

replies(1): >>bootsm+N7
2. bootsm+N7[view] [source] 2022-12-15 13:50:37
>>manimi+(OP)
Entirely different scales apply here. You can hardcode into a printer the handful of bills each country puts out, no problem, but you cannot hardcode the billions of "original" art pieces that the model is supposed to check against during training; it's just infeasible.
replies(1): >>Curiou+OC
3. Curiou+OC[view] [source] [discussion] 2022-12-15 15:49:59
>>bootsm+N7
Not exactly true. Given an image, you can find the closest point in the latent space that the image corresponds to. It's totally feasible to do this for every image in the training set and record each resulting point in a set of "disallowed" latent points; any generation that lands too close to one of those points gets refused. This wouldn't fly for local generation, as the process would take a long time and produce a multi-gigabyte (maybe even terabyte) "disallowed" database, but for online image generators it's not insane.
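The blocklist idea above can be sketched in a few lines. This is a toy illustration, not anything a real generator does: the "latents" here are just random unit vectors standing in for whatever an encoder/inversion step would produce, the 1000x64 scale is arbitrary, and `is_blocked` and the 0.95 cosine threshold are made-up names and numbers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the "disallowed" database: one latent point per training
# image (toy scale: 1000 images, 64-dim latents), normalized to unit length.
disallowed = rng.normal(size=(1000, 64))
disallowed /= np.linalg.norm(disallowed, axis=1, keepdims=True)

def is_blocked(latent, threshold=0.95):
    """Refuse a generation whose latent is too close (by cosine
    similarity) to any disallowed point. Brute-force scan; at real
    scale you'd use an approximate nearest-neighbor index instead."""
    latent = np.asarray(latent, dtype=float)
    latent = latent / np.linalg.norm(latent)
    return bool(np.max(disallowed @ latent) >= threshold)

# A latent identical to a training image's point is refused;
# an unrelated random latent passes.
print(is_blocked(disallowed[0]))            # exact match -> blocked
print(is_blocked(rng.normal(size=64)))      # random point -> allowed
```

The brute-force matrix product is what makes the multi-gigabyte database painful locally; an online service can amortize it with an ANN index shared across all users.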