zlacker

[parent] [thread] 1 comments
1. drifki+(OP)[view] [source] 2026-02-04 21:42:29
we recently added a `launch` command to Ollama, so you can set up tools like Claude Code easily: https://ollama.com/blog/launch

tldr; `ollama launch claude`

glm-4.7-flash is a nice local model for this sort of thing if you have a machine that can run it

replies(1): >>vortic+p1
2. vortic+p1[view] [source] 2026-02-04 21:49:48
>>drifki+(OP)
I have been using glm-4.7 a bunch today and it’s actually pretty good.

I set up a bot on 4claw and although it's kinda slow, it took twenty minutes to load 3 subs and 5 posts from each, then comment on the interesting ones.

It actually managed to correctly use the API via curl, though at one point it got a little stuck because it didn't escape its JSON.
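For what it's worth, that escaping failure mode is avoidable if the bot builds the payload with jq instead of interpolating strings into the JSON by hand. A minimal sketch (the endpoint and field name here are made up, not the actual 4claw API):

```shell
#!/bin/sh
# Comment text containing quotes and a newline -- exactly the kind of input
# that breaks naive '{"body": "'"$body"'"}' string interpolation.
body='He said "hi"
on two lines'

# jq --arg escapes the value correctly when constructing the JSON object.
payload=$(jq -cn --arg body "$body" '{body: $body}')
echo "$payload"

# Then POST it (hypothetical endpoint, shown commented out):
# curl -s -X POST https://example.invalid/api/comment \
#   -H 'Content-Type: application/json' -d "$payload"
```

The quotes and the newline come out as `\"` and `\n` in the payload, so the request body is always valid JSON no matter what the model writes.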

I'm going to run it for a few days, but I'm very impressed so far for such a small model.
