zlacker

[return to "My AI skeptic friends are all nuts"]
1. cesarb+Zl[view] [source] 2025-06-02 23:25:17
>>tablet+(OP)
This article does not touch on the thing which worries me the most with respect to LLMs: the dependence.

Unless you can run the LLM locally, on a computer you own, you are now completely dependent on a remote centralized system to do your work. Whoever controls that system can arbitrarily raise prices, subtly manipulate the outputs, store and do whatever they want with your inputs, or even suddenly cease to operate. And since, according to this article, only the latest and greatest LLM is acceptable (I saw that exact same argument six months ago), running locally is not viable; in a recent discussion, someone mentioned a home server with something like 384 GB of RAM just to run one LLM locally.

To those of us who like Free Software because of the freedom it gives us, this is a severe regression.

2. dabock+YH2[view] [source] 2025-06-03 19:07:29
>>cesarb+Zl
You can get 90%+ of the way there with a tiny “coder” LLM on an Ollama backend, an editor extension like RooCode, and a ton of MCP tools.
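
For a concrete flavor, here’s a minimal Python sketch of talking to a local Ollama server directly over its REST API; the model tag is just an example, swap in whatever you’ve pulled:

    # Minimal sketch: query a local Ollama server over its REST API.
    # Assumes `ollama serve` is running on the default port and a small
    # coder model has been pulled (the tag below is just an example).
    import json
    import urllib.request

    OLLAMA_URL = "http://localhost:11434/api/generate"

    def ask(prompt, model="qwen2.5-coder:14b"):
        payload = json.dumps({
            "model": model,
            "prompt": prompt,
            "stream": False,  # one complete response, not a token stream
        }).encode("utf-8")
        req = urllib.request.Request(
            OLLAMA_URL,
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    print(ask("Write a Python function that reverses a string."))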

In fact, MCP is so groundbreaking that I consider it the actual meat and potatoes of coding AIs. Large models are too monolithic, and knowledge is forever changing. Better to just use a small 14B model (or even an 8B in some cases!) with some MCP search tools, a good knowledge graph for memory, and a decent front end for everything. Let it teach itself based on the current context.
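
To show what one of those MCP tools actually looks like, here’s a tiny server sketch using the official Python SDK (pip install mcp); the search_notes tool is a hypothetical stand-in for a real search or memory backend:

    # Minimal MCP server sketch (official Python SDK, `pip install mcp`).
    # The search_notes tool is a made-up example of a local search tool
    # a small model could call instead of memorizing everything itself.
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("local-tools")

    @mcp.tool()
    def search_notes(query: str) -> str:
        """Return lines from a local notes file that match the query."""
        with open("notes.txt", encoding="utf-8") as f:
            hits = [line.strip() for line in f if query.lower() in line.lower()]
        return "\n".join(hits) or "no matches"

    if __name__ == "__main__":
        mcp.run()  # speaks MCP over stdio; point a client like RooCode at it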

And all of that can run on an off-the-shelf $1k gaming computer from Costco. It’ll be super slow compared to a cloud system (HDD vs. SSD levels of slowness), but it will at least run, and you’ll get *something* out of it.

3. esaym+803[view] [source] 2025-06-03 20:52:40
>>dabock+YH2
Why don't you elaborate on your setup then?
4. xandri+kc3[view] [source] 2025-06-03 22:13:30
>>esaym+803
Because you can easily look it up: Jan, GPT4All, etc.

It's not black magic anymore.
