zlacker

[return to "My AI skeptic friends are all nuts"]
1. cesarb+Zl 2025-06-02 23:25:17
>>tablet+(OP)
This article does not touch on the thing that worries me most about LLMs: the dependence.

Unless you can run the LLM locally, on a computer you own, you are completely dependent on a remote centralized system to do your work. Whoever controls that system can arbitrarily raise prices, subtly manipulate the outputs, store and do anything they want with your inputs, or even suddenly cease to operate. And since, according to this article, only the latest and greatest LLM is acceptable (I saw that exact same argument six months ago), running locally is not viable: in a recent discussion, someone mentioned a home server with something like 384G of RAM just to run one LLM locally.

To those of us who like Free Software because of the freedom it gives us, this is a severe regression.

2. elever+g61 2025-06-03 07:19:00
>>cesarb+Zl
It's also why local models, even if less powerful, are so important. The gap between "state of the art" and "good enough for a lot of workflows" is narrowing fast.
3. mplanc+CC4 2025-06-04 14:00:02
>>elever+g61
Yeah, I am very excited for local models to get good enough to be properly useful. I'll admit I'm a bit of an AI skeptic, but I'm much more of a skeptic of SV venture-backed companies. The idea of being heavily reliant on such a company, plus needing to be online, plus needing to pay money just to get some coding done, is pretty unpalatable to me.