Next level up is self-hosting your LLM! I put LM Studio on a Mac mini at home and have been extremely happy with it. Then you can use a tool like opencode to connect to that LLM and boom, the Claude Code dependency is removed and you're even more self-hosted. For what you're using Claude Code for, a smaller open-weight model would probably work fine.
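For anyone curious what "connecting to it" looks like: LM Studio serves an OpenAI-compatible API (by default at http://localhost:1234/v1), so any OpenAI-style client can talk to it. A minimal sketch in Python, just stdlib; the model name below is an assumption, use whatever identifier LM Studio shows for your loaded model:

```python
import json
import urllib.request

# LM Studio's default local endpoint (assumption: default port, no auth)
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_request(prompt, model="gpt-oss-20b"):
    """Build the JSON body for an OpenAI-style chat completion call."""
    return json.dumps({
        "model": model,  # hypothetical model id; match your loaded model
        "messages": [{"role": "user", "content": prompt}],
    })

def ask(prompt):
    """Send the prompt to the local LM Studio server and return the reply."""
    req = urllib.request.Request(
        LMSTUDIO_URL,
        data=build_request(prompt).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Tools like opencode do essentially this under the hood once you point them at the same base URL, so nothing ever leaves your network.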
>>NicoJu+G
Makes sense. You could look at something like https://github.com/musistudio/claude-code-router if at some point you're interested in going down that path. I've been using gpt-oss-20b, which would fit on your GPU and which I've found useful for basic tasks like recipe creation and agentic tool usage (I use it with Notion MCP tools).
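To give a feel for the "agentic tool usage" part: over an OpenAI-compatible server, tools are advertised to the model as JSON schemas in the request's "tools" field, and the model replies with tool calls for the client to execute. A hedged sketch; the tool name and schema below are hypothetical stand-ins, not the actual Notion MCP definitions:

```python
def make_tool(name, description, properties, required):
    """Wrap a function spec in the OpenAI-style "tools" format."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": required,
            },
        },
    }

# Hypothetical tool, just to show the shape the model sees
tools = [make_tool(
    "search_pages",
    "Search pages by title keyword",
    {"query": {"type": "string"}},
    ["query"],
)]

# This goes into the "tools" field of the chat-completions request body;
# the model responds with tool_calls that the client runs and feeds back.
request_body = {"model": "gpt-oss-20b", "messages": [], "tools": tools}
```

An MCP client does the same dance, just with the tool list discovered from the MCP server instead of hand-written.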