zlacker
1 comment
1. swyx+(OP)
2025-07-31 22:43:07
do LoRAs conflict with your distillation?
replies(1):
>>sangwu+15
2. sangwu+15
2025-07-31 23:20:25
>>swyx+(OP)
The architecture is the same, so we found that some LoRAs work out of the box, but others don't. In those cases, I would expect people to re-run their LoRA finetuning with the trainer they used.
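A minimal sketch of the point above: a LoRA adapter is a low-rank update W + scale·(B@A), so it can be *applied* to any model whose layer shapes match, but its learned update was optimized against the original base weights, not the distilled ones (the function name and dimensions here are illustrative, not from any specific trainer):

```python
import numpy as np

def apply_lora(W, A, B, scale=1.0):
    """Merge a LoRA adapter (factors A: r x in, B: out x r) into weight W."""
    if B.shape[0] != W.shape[0] or A.shape[1] != W.shape[1]:
        raise ValueError("adapter shapes do not match the base layer")
    return W + scale * (B @ A)

rng = np.random.default_rng(0)
out_dim, in_dim, rank = 8, 16, 2

W_base = rng.normal(size=(out_dim, in_dim))       # layer from the original model
W_distilled = rng.normal(size=(out_dim, in_dim))  # same architecture, different weights
A = rng.normal(size=(rank, in_dim))
B = np.zeros((out_dim, rank))  # LoRA initializes B to zero, so the update starts as a no-op

# The adapter applies to the distilled weights because only the shapes must agree...
merged = apply_lora(W_distilled, A, B)
# ...but once trained, B@A was fit against W_base, so quality on the distilled
# model is not guaranteed -- hence re-running the finetune can be necessary.
```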