zlacker

[parent] [thread] 3 comments
1. river_+(OP)[view] [source] 2026-01-12 10:41:06
Next level up is self-hosting your LLM! I put LM Studio on a Mac mini at home and have been extremely happy with it. Then you can use a tool like opencode to connect to that LLM, and boom: the Claude Code dependency is removed and you're even more self-hosted. For what you're using Claude Code for, a smaller open-weight model would probably work fine.
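
The wiring is roughly this (a sketch, untested; assumes LM Studio's default OpenAI-compatible server on port 1234 with a model already loaded):

    # Talk to a local LM Studio server through its OpenAI-compatible API.
    # Assumes the server is running on the default port (1234) and a model
    # is loaded; the api_key value is just a placeholder the client requires.
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

    resp = client.chat.completions.create(
        model="local-model",  # use the identifier of whatever model you loaded
        messages=[{"role": "user", "content": "Summarize my TODO list."}],
    )
    print(resp.choices[0].message.content)

opencode does the same thing under the hood: you point it at the local endpoint instead of Anthropic's API.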
replies(1): >>NicoJu+G
2. NicoJu+G[view] [source] 2026-01-12 10:45:52
>>river_+(OP)
Well, to a limit. I have an RTX 3090 (24 GB), which enables a lot of use cases.

But for the agent work I'm doing right now, Claude Code is the tool to go with.

replies(1): >>river_+Qj
3. river_+Qj[view] [source] [discussion] 2026-01-12 13:02:22
>>NicoJu+G
Makes sense. You could look at something like https://github.com/musistudio/claude-code-router if at some point you're interested in going down that path. I've been using gpt-oss-20b, which would fit on your GPU, and I've found it useful for basic tasks like recipe creation and agentic tool use (I use it with Notion MCP tools). A rough sketch of what a tool call looks like is below.
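
To give a feel for the agentic side, this is the shape of a single tool call against a local OpenAI-compatible endpoint (illustrative only: the port, model identifier, and get_recipe tool are stand-ins, not my actual Notion setup):

    # Minimal tool-calling round trip against a local OpenAI-compatible server.
    # The endpoint, model name, and get_recipe tool are made up for illustration.
    import json
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:1234/v1", api_key="local")

    tools = [{
        "type": "function",
        "function": {
            "name": "get_recipe",
            "description": "Look up a recipe by dish name",
            "parameters": {
                "type": "object",
                "properties": {"dish": {"type": "string"}},
                "required": ["dish"],
            },
        },
    }]

    messages = [{"role": "user", "content": "Find me a carbonara recipe."}]
    resp = client.chat.completions.create(
        model="openai/gpt-oss-20b", messages=messages, tools=tools
    )

    # If the model decided to call the tool, its arguments arrive as JSON.
    call = resp.choices[0].message.tool_calls[0]
    print(call.function.name, json.loads(call.function.arguments))
    # ...run the real tool here, append its output as a "tool" message,
    # and call the model again to get the final answer.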
replies(1): >>NicoJu+oh2
4. NicoJu+oh2[view] [source] [discussion] 2026-01-12 23:03:08
>>river_+Qj
It's a really good model for its size, but the limited context length is a serious issue: hallucination gets hard to avoid once the window fills up.
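
The blunt mitigation I've settled on is capping the history I send (a sketch; the character budget is arbitrary and just a cheap stand-in for real token counting):

    # Keep the system prompt plus the most recent turns that fit a rough
    # character budget, so the request never overruns the model's context.
    def trim_history(messages, budget_chars=32000):
        system, rest = messages[:1], messages[1:]
        kept, used = [], 0
        for msg in reversed(rest):
            used += len(msg["content"])
            if used > budget_chars:
                break
            kept.append(msg)
        return system + list(reversed(kept))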