zlacker
2 comments
1. storys+(OP)
2026-01-28 12:24:54
I thought the RTX 6000 Ada was 48GB? If you have 96GB available, that implies a dual-card setup, so you must be relying on tensor parallelism to shard the model weights across the pair.
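For illustration, a minimal sketch of what a two-card tensor-parallel setup might look like if you were serving with vLLM (I'm assuming vLLM here since the serving stack isn't stated, and the model name below is just a placeholder):

    # Hypothetical sketch: pool two 48GB cards by sharding the weights.
    # Assumes vLLM is installed; the model name is a placeholder, not from this thread.
    from vllm import LLM, SamplingParams

    llm = LLM(
        model="meta-llama/Llama-3.1-70B-Instruct",  # placeholder model
        tensor_parallel_size=2,  # split each layer's weight matrices across 2 GPUs
        dtype="bfloat16",
    )
    out = llm.generate(["Hello"], SamplingParams(max_tokens=32))
    print(out[0].outputs[0].text)

With tensor parallelism the two cards' memory is pooled, but every forward pass needs cross-GPU communication, so it's not the same as having one 96GB card.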
replies(1):
>>embedd+d2
2. embedd+d2
2026-01-28 12:40:08
>>storys+(OP)
RTX Pro 6000 - 96GB VRAM - single card (it's the newer Blackwell card, not the 48GB RTX 6000 Ada).
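If you want to sanity-check the figure on the box itself, a quick PyTorch snippet will do it (assuming CUDA and PyTorch are installed; this just reads the memory the driver reports):

    # Print name and total memory of each visible GPU; ~96 GB expected here.
    import torch

    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(props.name, round(props.total_memory / 1024**3), "GB")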