A logical mistake might imply a blind spot inherent to the model, one that might not be present in all models.
Would it be better to just double the size of one of the models rather than host both?
Genuine question
https://www.reddit.com/r/LocalLLaMA/comments/17vcr9d/llm_com...