zlacker

[return to "Tell HN: I cut Claude API costs from $70/month to pennies"]
1. LTL_FT+mC 2026-01-26 07:03:37
>>ok_orc+(OP)
It sounds like you don't need immediate LLM responses and can batch process your data nightly? Have you considered running a local LLM? You may not need to pay for API calls at all. Today's local models are quite good. I started off on CPU, and even that was fine for my pipelines.
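To make the suggestion concrete, here is a minimal sketch of that nightly batch idea, assuming an Ollama server running on localhost with its default endpoint; the model name, batch size, and record format are placeholders, not anything OP described:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def chunk(records, size):
    """Split records into fixed-size batches for the nightly run."""
    return [records[i:i + size] for i in range(0, len(records), size)]

def ask_local(prompt, model="llama3.2"):
    """Send one prompt to the local model -- no API key, no per-call cost."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example nightly loop (requires a running Ollama instance):
#   for batch in chunk(all_records, 32):
#       results = [ask_local(f"Summarize: {r}") for r in batch]
```

Since nothing here is latency-sensitive, even a slow CPU-only model works: the loop just runs longer overnight.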
2. kreetx+TV 2026-01-26 10:20:51
>>LTL_FT+mC
Though I haven't done any extensive testing, I could personally get by easily with current local models. The only reason I don't is that the hosted ones all have free tiers.