zlacker

1. Gigach+(OP) 2022-12-15 12:46:52
I’m not too sure how it works, but someone commented that you can take the model and “resume training” it on the extra dataset you want to add.

Given most of the heavy lifting is already done, this seems like a pretty easy thing for anyone to do.
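
For what it’s worth, a minimal sketch of what “resume training” could look like with Keras (the checkpoint path and the extra data below are placeholders, not anything from the thread):

    import numpy as np
    from tensorflow import keras

    # Load the already-trained model from disk (hypothetical path).
    model = keras.models.load_model("pretrained_model.keras")

    # Stand-in for the extra dataset you want to add; in practice this is
    # your own examples, shaped to match the model's inputs and outputs.
    extra_x = np.random.rand(32, 160, 160, 3)
    extra_y = np.random.randint(0, 2, size=(32, 1))

    # Calling fit() again picks up from the current weights, i.e. training resumes.
    model.fit(extra_x, extra_y, epochs=1)
    model.save("updated_model.keras")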

replies(2): >>gpdere+f1 >>mejuto+xs
2. gpdere+f1 2022-12-15 12:53:04
>>Gigach+(OP)
https://dreambooth.github.io/

edit: the examples are all about objects, but my understanding is that it is capable of style transfers as well.

3. mejuto+xs 2022-12-15 14:56:34
>>Gigach+(OP)
It is called fine-tuning or transfer learning: you usually freeze the pretrained layers and train only the last layer (or a small new head) on your own data.

Here is an example for Keras (a popular ML framework): https://keras.io/guides/transfer_learning/
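
Roughly, the recipe from that guide looks like this (a minimal sketch; the backbone, input size, and hyperparameters are just placeholders):

    from tensorflow import keras

    # Pretrained backbone without its classification head, with weights frozen.
    base = keras.applications.MobileNetV2(
        input_shape=(160, 160, 3), include_top=False, weights="imagenet")
    base.trainable = False

    # Small trainable head on top; this is the "last layer" that gets trained.
    inputs = keras.Input(shape=(160, 160, 3))
    x = keras.applications.mobilenet_v2.preprocess_input(inputs)
    x = base(x, training=False)
    x = keras.layers.GlobalAveragePooling2D()(x)
    outputs = keras.layers.Dense(1)(x)  # e.g. a binary classifier for the new data
    model = keras.Model(inputs, outputs)

    model.compile(optimizer=keras.optimizers.Adam(1e-3),
                  loss=keras.losses.BinaryCrossentropy(from_logits=True),
                  metrics=["accuracy"])

    # new_ds is whatever tf.data pipeline holds the new dataset (placeholder name).
    # model.fit(new_ds, epochs=5)

Once the head has converged, the guide also shows an optional fine-tuning step: unfreeze the base model and keep training the whole thing with a very low learning rate.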
