zlacker

[parent] [thread] 4 comments
1. knicho+(OP)[view] [source] 2023-11-17 22:13:03
When I was mining with a bunch of RTX 3080s and RTX 3090s, the electricity cost (admittedly) was about $20/month per card. Running a 70B model takes 3-4 cards, so if you're pushing those cards to their max, that's roughly $60-80/mo. Then again, ChatGPT is pretty awesome and is likely running something larger than a 70B model (I think I heard it was an ensemble of models), so that's at least a ballpark.
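The back-of-envelope above can be sketched out. The wattage and electricity rate below are my assumptions (not stated in the comment), chosen so a single card lands near the quoted $20/month:

```python
# Back-of-envelope GPU electricity cost, mirroring the numbers above.
# Assumed: ~350 W sustained draw per RTX 3080/3090 and ~$0.08/kWh
# retail, which comes out to roughly $20/card/month.

WATTS_PER_CARD = 350        # assumed sustained draw under full load
RATE_USD_PER_KWH = 0.08     # assumed retail electricity rate
HOURS_PER_MONTH = 24 * 30

def monthly_cost(cards: int) -> float:
    kwh = cards * WATTS_PER_CARD / 1000 * HOURS_PER_MONTH
    return kwh * RATE_USD_PER_KWH

print(f"1 card:  ${monthly_cost(1):.2f}/mo")   # ~$20
print(f"4 cards: ${monthly_cost(4):.2f}/mo")   # ~$80
```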
replies(3): >>sodali+13 >>Sebb76+89 >>698969+O11
2. sodali+13[view] [source] 2023-11-17 22:27:59
>>knicho+(OP)
Batched inference makes these calculations hard: as I understand it, a single inference takes roughly the same amount of power and time as 30 run together in a batch.
3. Sebb76+89[view] [source] 2023-11-17 22:54:13
>>knicho+(OP)
Datacenters probably do not pay retail rates for electricity, so they might actually run quite a bit cheaper (or more expensive if they pay for highly available power, though that seems like overkill for pure compute).
replies(1): >>015a+Fy
4. 015a+Fy[view] [source] [discussion] 2023-11-18 00:57:52
>>Sebb76+89
Sure, but everything else about a data center is more expensive (real estate, operations people, networking, equipment). There's a reason AWS is so expensive.
5. 698969+O11[view] [source] 2023-11-18 04:24:22
>>knicho+(OP)
Presumably your miner was running 24/7 throughout the month. That's not the case for ChatGPT: a single person might run maybe 10 sessions a day at most, with long pauses between queries.