zlacker

[return to "I was banned from Claude for scaffolding a Claude.md file?"]
1. ddtayl+ry2 2026-01-23 14:55:42
>>hugoda+(OP)
You are probably triggering their knowledge distillation checks.
2. faeyan+7r3 2026-01-23 19:21:28
>>ddtayl+ry2
what would a knowledge distillation prompt even look like, and how could I make sure I don't accidentally fall into this trap?
3. ddtayl+dz7 2026-01-25 09:57:30
>>faeyan+7r3
My guess is anything that looks like the "teacher and student" setup. I know there were methods that use the teacher's token distribution to "retrain" another model, kind of like automatic fine-tuning, but AFAIK those only work with offline models since you need the full token distribution. There do appear to be similar methods for API-only models, though.
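For context, the token-distribution method described above is the classic "soft label" distillation loss: the student is trained to match the teacher's temperature-softened output distribution, not just its top token. A minimal sketch with NumPy (names and the toy logits are illustrative, not from any actual detection system):

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over the vocabulary axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened token distributions.

    This is the classic offline recipe: it needs the teacher's full
    per-token logit vector, which API-only models generally don't expose.
    """
    p = softmax(teacher_logits, temperature)  # teacher's soft targets
    q = softmax(student_logits, temperature)
    kl = np.sum(p * (np.log(p) - np.log(q)), axis=-1)
    # scale by T^2 so gradient magnitude matches the hard-label loss
    return float(np.mean(kl) * temperature ** 2)

# toy example: one token position, vocabulary of 4
teacher = np.array([[2.0, 1.0, 0.1, -1.0]])
student = np.array([[1.5, 1.2, 0.0, -0.5]])
loss = distillation_loss(teacher, student)
```

The key point is the dependence on `teacher_logits`: without the full distribution you can only do sampling-based imitation (train on generated text alone), which is the "online-only" variant mentioned above.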