zlacker

[return to "Qwen3-Coder-Next"]
1. simonw+l3[view] [source] 2026-02-03 16:15:21
>>daniel+(OP)
This GGUF is 48.4GB - https://huggingface.co/Qwen/Qwen3-Coder-Next-GGUF/tree/main/... - which should be usable on higher end laptops.

I still haven't experienced a local model that fits on my 64GB MacBook Pro and can run a coding agent like Codex CLI or Claude Code well enough to be useful.

Maybe this will be the one? This Unsloth guide from a sibling comment suggests it might be: https://unsloth.ai/docs/models/qwen3-coder-next

2. codazo+fm1[view] [source] 2026-02-03 22:00:07
>>simonw+l3
I can't get Codex CLI or Claude Code to do tool use with small local models. Those CLIs speak XML-style tool calls, while the small local models have JSON tool use baked into them, and no amount of prompting can fix that mismatch.

In a day or two I'll release my answer to this problem. But I'm curious: have you had a different experience where tool use works in one of these CLIs with a small local model?

3. regula+do1[view] [source] 2026-02-03 22:10:54
>>codazo+fm1
Surely the answer is a very small proxy server between the two?
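The core of such a proxy would be a single translation function between the two tool-call dialects. A minimal sketch in Python; both the JSON shape and the XML shape here are invented for illustration, not the exact wire format of any real CLI or model:

```python
import json

def json_tool_call_to_xml(call: dict) -> str:
    """Translate a JSON-style tool call (the kind small local models
    emit) into an XML-style invocation (the kind an agent CLI might
    expect). Both formats are hypothetical stand-ins."""
    name = call["function"]["name"]
    # Arguments arrive as a JSON-encoded string in this assumed shape.
    args = json.loads(call["function"]["arguments"])
    params = "".join(
        f"<parameter name={json.dumps(k)}>{v}</parameter>"
        for k, v in args.items()
    )
    return f"<invoke name={json.dumps(name)}>{params}</invoke>"

# Example: a JSON tool call as a small local model might produce it.
call = {
    "function": {
        "name": "read_file",
        "arguments": '{"path": "src/main.py"}',
    }
}
print(json_tool_call_to_xml(call))
# → <invoke name="read_file"><parameter name="path">src/main.py</parameter></invoke>
```

A real proxy would wrap this in an HTTP server that sits on the CLI's configured endpoint, rewrites requests and responses in both directions, and handles streaming, but the format conversion itself is this small.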
4. codazo+No1[view] [source] 2026-02-03 22:14:27
>>regula+do1
That might work, but I keep seeing people talk about this, so there must be a simple solution that I'm overlooking. My solution is to write my own minimal, experimental CLI that talks JSON tools.