1. crysta+(OP)
2026-02-02 03:13:17
Seems much more likely that the cost will go down by 99%. With open-source models and architectural innovations, something like Claude will run on a local machine for free.
2. walter+G9
2026-02-02 05:00:36
>>crysta+(OP)
How much RAM and SSD storage would future local inference need to be competitive with present cloud inference?
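For a rough sense of scale, here is a back-of-envelope sketch (my own assumptions, nothing established in the thread): weight memory is roughly parameter count times bytes per weight, plus some overhead for the KV cache and runtime buffers. The model sizes and quantization levels below are purely illustrative.

    # Back-of-envelope RAM estimate for local inference (illustrative only).
    # Assumption: memory ~= parameter_count * bytes_per_weight, plus ~20%
    # overhead for KV cache, activations, and runtime buffers.
    def estimated_ram_gib(params_billions: float, bits_per_weight: float,
                          overhead: float = 0.20) -> float:
        weight_bytes = params_billions * 1e9 * (bits_per_weight / 8)
        return weight_bytes * (1 + overhead) / 2**30

    # Hypothetical model sizes at common quantization levels.
    for params, bits in [(8, 4), (70, 4), (70, 16), (400, 4)]:
        print(f"{params}B @ {bits}-bit: ~{estimated_ram_gib(params, bits):.0f} GiB")

By that estimate a 70B-parameter model at 4-bit quantization needs on the order of 40 GiB, so the answer hinges mostly on how large, and how heavily quantized, future local models turn out to be.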