zlacker

[return to "My AI skeptic friends are all nuts"]
1. baobun+44 2025-06-02 21:34:31
>>tablet+(OP)
What about the privacy aspect and other security risks, though? So far, all the praise I hear on productivity is from people using cloud-hosted models.

Claude, Gemini, Copilot and ChatGPT are non-starters for privacy-minded folks.

So far, local experiments with agents have left me underwhelmed. I've tried everything on ollama that can run on my dedicated Ryzen 8700G with 96GB DDR5. I'm ready to blow ~10-15k USD on a better rig if I see value in it, but extrapolating from current results, I believe it'll be another CPU generation before I can expect a positive productivity return from running local models properly and securely, once setup and related overhead are factored in.
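
To be concrete, the experiments were along these lines (a minimal sketch using the ollama Python client; the model name and prompt are just placeholders):

    # Minimal local-inference sketch via the ollama Python client.
    # Assumes `ollama serve` is running locally and the model has been pulled.
    import ollama

    response = ollama.chat(
        model="qwen2.5-coder:32b",  # placeholder: any coding model that fits in RAM
        messages=[{"role": "user", "content": "Refactor this function to be iterative."}],
    )
    print(response["message"]["content"])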

2. storus+Ap 2025-06-02 23:52:12
>>baobun+44
A Mac Studio with 512GB RAM starts at around $10k, and quantized DeepSeek R1 671B needs around 400GB of RAM, so it would be usable for your needs. It produced some outstanding code on many of the tasks I tried (and some not-so-outstanding code as well).
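
Rough back-of-the-envelope for where the ~400GB figure comes from (my own numbers; the quantization width is an assumption):

    # A ~4-5 bit quantization stores roughly 0.5-0.6 bytes per parameter.
    params = 671e9                # DeepSeek R1 total parameter count
    bytes_per_param = 0.55        # assumed average for a ~4.5-bit quant
    weights_gb = params * bytes_per_param / 1e9   # ~369 GB of weights
    overhead_gb = 30              # rough allowance for KV cache and runtime buffers
    print(f"~{weights_gb + overhead_gb:.0f} GB total")  # lands near 400 GB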

3. baobun+Gr 2025-06-03 00:09:06
>>storus+Ap
Am I right in assuming that running Linux (or anything other than macOS) on the Mac Studio is experimental at best?

I'd be looking for something that can run offline and receive system updates from an internal mirror on the airgapped network. Needing to tie an Apple ID to the machine and allow it internet access for OS updates is a hard sell. Am I wrong in thinking that keeping an airgapped macOS installation up to date would require additional infrastructure, and some enterprise contract with Apple?

4. storus+it 2025-06-03 00:23:02
>>baobun+Gr
IIRC you can download the OS update/installation DMG from Apple, put it on a USB key, and run it on the airgapped system. I don't think you even need an Apple ID. macOS with Homebrew works more or less like Linux; at least the tooling is basically the same. You won't be able to install any Linux on the M3 Ultra.
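
Something like this is the route I mean, done on a machine that does have internet access and then carried over on the USB key (standard Apple tooling wrapped in Python for illustration; the version number and volume path are placeholders):

    # Fetch the full macOS installer and write it to a USB volume.
    import subprocess

    # Downloads the "Install macOS <name>.app" bundle into /Applications.
    subprocess.run(
        ["softwareupdate", "--fetch-full-installer", "--full-installer-version", "14.5"],
        check=True,
    )

    # Turns an erased USB volume into bootable install media (needs sudo).
    subprocess.run(
        ["sudo",
         "/Applications/Install macOS Sonoma.app/Contents/Resources/createinstallmedia",
         "--volume", "/Volumes/MyUSB"],
        check=True,
    )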