zlacker

[return to "My AI skeptic friends are all nuts"]
1. baobun+44 2025-06-02 21:34:31
>>tablet+(OP)
What about the privacy aspect and the other security risks, though? So far, all the praise I hear about productivity comes from people using cloud-hosted models.

Claude, Gemini, Copilot, and ChatGPT are non-starters for privacy-minded folks.

So far, local experiments with agents have left me underwhelmed. I've tried everything on ollama that can run on my dedicated Ryzen 8700G with 96GB of DDR5. I'm ready to blow ~10-15k USD on a better rig if I see value in it, but extrapolating from current results, I expect it'll be another CPU generation before properly, securely run local models give me a positive productivity return once the setup and the meta-work around it are factored in.
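
For reference, my experiments are basically variations on hitting ollama's local HTTP API (the default port 11434); the model name below is just an example of something that fits in 96GB, not a recommendation:

    # Rough sketch of a single coding request against a local ollama server.
    # Assumes ollama is already serving on its default port (11434) and that
    # a quantized coding model has been pulled; "qwen2.5-coder:32b" is only
    # an example of something that fits in ~96GB of system RAM.
    import json
    import urllib.request

    def ask_local_model(prompt: str, model: str = "qwen2.5-coder:32b") -> str:
        payload = json.dumps({
            "model": model,
            "prompt": prompt,
            "stream": False,  # return one JSON object instead of a token stream
        }).encode("utf-8")
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    if __name__ == "__main__":
        print(ask_local_model("Write a Python function that parses an ISO 8601 date."))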

2. storus+Ap 2025-06-02 23:52:12
>>baobun+44
A Mac Studio with 512GB of RAM starts at around 10k, and quantized DeepSeek R1 671B needs around 400GB of RAM, so it should cover your needs. It produced some outstanding code on many of the tasks I tried (and some not-so-outstanding code as well).
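
The ~400GB figure lines up with back-of-the-envelope math; the bits-per-weight values below are approximations for common quantization levels, not exact numbers for any particular R1 build:

    # Back-of-the-envelope memory estimate for a quantized 671B-parameter model.
    # Bits-per-weight figures are rough approximations; KV cache and runtime
    # overhead come on top of the weight storage shown here.
    PARAMS = 671e9

    def weights_gb(bits_per_weight: float) -> float:
        """Raw weight storage in GB (1 GB = 1e9 bytes), excluding KV cache."""
        return PARAMS * bits_per_weight / 8 / 1e9

    for label, bpw in [("~4-bit", 4.0), ("~4.8-bit", 4.8), ("~8-bit", 8.0)]:
        print(f"{label:>8}: ~{weights_gb(bpw):.0f} GB for weights alone")
    # ~4-bit   : ~336 GB -> fits in 512 GB with headroom for KV cache/context
    # ~4.8-bit : ~403 GB -> roughly the quoted ~400 GB figure
    # ~8-bit   : ~671 GB -> does not fit in 512 GB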