zlacker
1. s17n+(OP)
2022-05-24 00:57:57
Running inference on one of these models takes like a GPU minute, so they can't just let the public use them.
2. throwa+31
2022-05-24 01:06:50
>>s17n+(OP)
They absolutely can, if they charge the public the cost of the GPU time.
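A rough sense of what "charging the cost of GPU time" would mean, as a back-of-envelope sketch. The hourly rate below is a hypothetical placeholder, not a figure from the thread; only the "GPU minute" per inference comes from the parent comment.

```python
# Back-of-envelope cost of one inference, assuming it occupies one GPU
# for about a minute (per the parent comment) and a hypothetical cloud
# rental rate of $1.00 per GPU-hour.
GPU_HOURLY_RATE = 1.00       # USD per GPU-hour (assumed, for illustration)
SECONDS_PER_INFERENCE = 60   # "a GPU minute"

cost_per_inference = GPU_HOURLY_RATE * SECONDS_PER_INFERENCE / 3600
print(f"${cost_per_inference:.4f} per inference")  # prints $0.0167 per inference
```

At rates in that ballpark, a per-query charge of a few cents would cover the raw compute, which is the commenter's point.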
3. yeldar+b6
2022-05-24 01:52:52
>>s17n+(OP)
It can't be that; Google Colab gives out tons of free GPU usage.
4. astran+as
2022-05-24 06:02:57
>>yeldar+b6
Google has a lot of GPUs, but even so Colab seems like it’s a lot cheaper than it should be. You can get some very good GPUs on the paid plan.