zlacker
sangwu (OP)
2025-07-31 23:20:25
The architecture is the same, so we found that some LoRAs work out of the box, but some don't. In those cases, I'd expect people to re-run their LoRA fine-tuning with the trainer they used.
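To illustrate why a matching architecture lets a LoRA apply out of the box: a LoRA adapter stores low-rank factors per target weight, and merging it is just adding a scaled low-rank delta to the base matrix. The sketch below is illustrative only (the function and shapes are hypothetical, not from the commenter); an adapter transfers when the target module names and weight shapes line up, and fails when they don't.

```python
import numpy as np

def apply_lora(base_weight, lora_A, lora_B, alpha=16, rank=8):
    # Merged weight: W' = W + (alpha / rank) * B @ A
    # This only works when B @ A has the same shape as base_weight,
    # i.e. the adapter was trained against a matching architecture.
    return base_weight + (alpha / rank) * (lora_B @ lora_A)

d_out, d_in, r = 6, 4, 2
W = np.zeros((d_out, d_in))   # stand-in base weight
A = np.ones((r, d_in))        # LoRA "A" factor, shape (rank, d_in)
B = np.ones((d_out, r))       # LoRA "B" factor, shape (d_out, rank)

W_merged = apply_lora(W, A, B, alpha=16, rank=r)
# Every entry of B @ A is r = 2, scaled by alpha/rank = 8, giving 16.
```

When the shapes or module names differ between the base models, the delta cannot be added, which is the case where re-running the fine-tuning with the original trainer becomes necessary.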