
1. vages+(OP) 2026-02-02 07:25:07
Which local AI models do you use? I'm local-curious, but I don't know which models to try, since people mention local models by name far less often than their cloud counterparts.
2. stego-+4b2 2026-02-02 21:22:13
>>vages+(OP)
I rotate and experiment frequently, specifically because I don't want to be dependent on a single model when everything changes week to week; I focus on foundations, not processes. Right now I've got a Ministral 3 14B reasoning model and a Qwen3 8B model on my MacBook Pro; I think my RTX 3090 rig defaults to a slightly larger, less-quantized Ministral model, and it juggles the older Gemini/OpenAI "open weights" models as new ones are released.
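
If you want to kick the tires on one of these, a minimal sketch with Hugging Face transformers looks roughly like this. The "Qwen/Qwen3-8B" repo id, the prompt, and the generation settings are my assumptions, not recommendations; swap in whatever model and quantization fits your hardware:

    # Minimal local-inference sketch using Hugging Face transformers.
    # Assumes the "Qwen/Qwen3-8B" repo id; needs torch and accelerate installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Qwen/Qwen3-8B"  # assumed repo id; pick any local-friendly model
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="auto",   # let transformers pick a dtype for the hardware
        device_map="auto",    # place layers on the GPU (e.g. an RTX 3090) or Apple silicon
    )

    # Build a chat prompt and generate a short reply.
    messages = [{"role": "user", "content": "Give me one reason to run models locally."}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=128)
    # Decode only the newly generated tokens, not the prompt.
    print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))

On the Mac I'd normally reach for a quantized GGUF build instead of full weights, but the same "load, chat-template, generate" shape applies.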