zlacker

[parent] [thread] 1 comments
1. faeyan+(OP)[view] [source] 2026-01-23 19:21:28
what would a knowledge distillation prompt even look like, and how could I make sure I don't accidentally fall into this trap?
replies(1): >>ddtayl+684
2. ddtayl+684[view] [source] 2026-01-25 09:57:30
>>faeyan+(OP)
My guess is something that looks like the "teacher and student" setup. I know there were methods in the past that use the teacher's token distribution to "retrain" one model with another, kind of like automated fine-tuning, but AFAIK those only work offline, since you need access to the full token distribution. There do appear to be similar methods for online-only models, though I'm not sure how those work.
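To make the "token distribution" idea concrete, here is a minimal sketch of the classic soft-label distillation objective: the student is trained to match the teacher's temperature-softened probability distribution over tokens via KL divergence (this is the formulation from Hinton et al.'s distillation paper, including the T² scaling). The example logit values are made up for illustration; a real pipeline would compute this loss per token position and backpropagate through the student.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over a list of raw logits."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions.

    This is the offline 'soft label' objective: it requires the teacher's
    full per-token logit vector, which API-only (online) models typically
    do not expose -- hence the offline restriction mentioned above.
    """
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    # Scale by T^2 (as in Hinton et al.) so gradient magnitudes stay
    # comparable across different temperature settings.
    return temperature ** 2 * sum(
        pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0
    )

# Hypothetical logits for one token position, purely illustrative:
teacher = [4.0, 1.0, 0.2]
student = [3.0, 1.5, 0.5]
print(round(distillation_loss(teacher, student), 4))
```

A student whose logits exactly match the teacher's incurs zero loss; the further the distributions diverge, the larger the penalty.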