zlacker

[parent] [thread] 29 comments
1. rektid+(OP)[view] [source] 2023-03-05 06:13:59
Because it's exactly like the parent said: Nvidia is, & always has been, a tightfisted tightwad that makes everything they do ultra-proprietary. Nvidia never creates standards or participates in them.

Sometimes, like with CUDA, they just have an early enough lead that they entrench.

Vile player. They're worse than IBM. Soulless & domineering to the max, to every extent possible. What a sad story.

replies(5): >>jemmyw+I1 >>mschue+R1 >>rabf+13 >>DeathA+D8 >>agumon+Bd
2. jemmyw+I1[view] [source] 2023-03-05 06:34:38
>>rektid+(OP)
I think any company that feels it's in the lead with something competitive would do the same. The ones who open their standards were behind to begin with, and that's their way of combating the proprietary competition.
replies(1): >>rektid+j2
3. mschue+R1[view] [source] 2023-03-05 06:37:27
>>rektid+(OP)
> Sometimes, like with CUDA, they just have an early enough lead that they entrench.

The problem in the case of CUDA isn't just that NVIDIA was there early; it's that AMD and Khronos still offer no viable alternative after more than a decade. I switched to CUDA half a year ago, after trying to avoid it for years because it's proprietary. Unfortunately, I discovered that CUDA is absolutely amazing: it's easy to get started, it's developer-friendly in that it "just works" (which is never the case for Khronos APIs and environments), and it's incredibly powerful - kind of like programming in C++17 for 80 x 128 SIMD processors. I wish there were a platform-independent alternative, but OpenCL, SYCL, and ROCm aren't it.
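
To illustrate the "easy to get started" part, here's a minimal sketch of a complete CUDA program (the kernel name, sizes, and values are arbitrary illustration, nothing canonical):

    // a complete CUDA program; this is the whole "getting started" experience.
    #include <cstdio>

    __global__ void scale(float* v, float s) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        v[i] *= s;
    }

    int main() {
        const int N = 1024;
        float* v;
        cudaMallocManaged(&v, N * sizeof(float));  // unified memory: no manual copies
        for (int i = 0; i < N; i++) v[i] = 1.0f;
        scale<<<N / 256, 256>>>(v, 2.0f);          // 4 blocks of 256 threads
        cudaDeviceSynchronize();
        printf("%f\n", v[0]);                      // prints 2.000000
        cudaFree(v);
        return 0;
    }

"nvcc scale.cu && ./a.out" and that's it; the equivalent OpenCL host setup is pages of boilerplate.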

replies(1): >>ribs+a8
◧◩
4. rektid+j2[view] [source] [discussion] 2023-03-05 06:44:18
>>jemmyw+I1
Belief in your own technology, even if it is good, often turns out to be insufficient to really win. At some point in computing you need some ecosystem buy-in, and you almost certainly will not be able to go it alone.

Nvidia seems utterly uninterested in learning these lessons, decades in now: they just get more and more competitive, less and less participatory. It's wild. On the one hand, they do a great job maintaining products like the Nvidia Shield TV. On the other hand, if you try anything other than Linux4Tegra (L4T) on most of their products, it probably won't work at all or will be miserable (the Android devices won't work at all for anything but Android, btw).

Nvidia has one of the weirdest moats: being open-source-like & providing ok-ish open source mini-worlds, but you have to stay within 100m of the keep or it all falls apart. And yeah, a lot of people simply don't notice. Nvidia has attracted a large camp-followers group, semi-tech folk, whom they enable, but who don't really grasp the weird, limited context they're confined to.

replies(2): >>fud101+R6 >>bsder+UH1
5. rabf+13[view] [source] 2023-03-05 06:54:54
>>rektid+(OP)
Nvidia has had driver parity for Linux, FreeBSD and Windows for many, many years. No other graphics card manufacturer has come close to the quality of their software stack across platforms. For that they have my gratitude.
replies(1): >>foxhil+m4
◧◩
6. foxhil+m4[view] [source] [discussion] 2023-03-05 07:11:57
>>rabf+13
DLSS was windows-only for some time.

linux’s amdgpu is far better than the nvidia driver.

replies(3): >>rabf+75 >>alanfr+z6 >>Gordon+B7
◧◩◪
7. rabf+75[view] [source] [discussion] 2023-03-05 07:25:03
>>foxhil+m4
ATI drivers were a horror show for the longest time on Windows, never mind Linux. What Nvidia did was have basically the same driver code for all operating systems, with a compatibility shim. If you were using any sort of professional 3D software over the previous two decades, Nvidia was the only viable solution.

Source: was burned by ATI, Matrox, and 3Dlabs before finally coughing up the cash for Nvidia.

replies(2): >>foxhil+vc >>nick__+oZ
◧◩◪
8. alanfr+z6[view] [source] [discussion] 2023-03-05 07:44:46
>>foxhil+m4
amdgpu is better now, but it was terrible for years, probably 2000-2015. That’s what GP is saying.
replies(2): >>hulitu+7b >>foxhil+7c
◧◩◪
9. fud101+R6[view] [source] [discussion] 2023-03-05 07:48:07
>>rektid+j2
What do they get right with the Shield?
◧◩◪
10. Gordon+B7[view] [source] [discussion] 2023-03-05 07:57:14
>>foxhil+m4
Except it doesn't do GPU compute stuff, so it's no use for anything except games.
replies(1): >>foxhil+cc
◧◩
11. ribs+a8[view] [source] [discussion] 2023-03-05 08:04:09
>>mschue+R1
I keep hearing that ROCm is DOA, but there are a lot of supercomputing labs heavily investing in it, with engineers who are quite in favor of it.
replies(4): >>mschue+oa >>pixele+Ec >>doikor+3f >>pjmlp+9g
12. DeathA+D8[view] [source] 2023-03-05 08:10:42
>>rektid+(OP)
How is NVIDIA different from Apple?
replies(1): >>verall+SU
◧◩◪
13. mschue+oa[view] [source] [discussion] 2023-03-05 08:40:22
>>ribs+a8
I hope it takes off; a platform-independent alternative to CUDA would be great. But if they want it to be successful outside of supercomputing labs, it needs to be as easy to use as CUDA. And I'd say being successful outside of supercomputing labs is important for overall adoption and success. For me personally, it would also need fast runtime compilation, so that you can modify and hot-reload ROCm programs at runtime.
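
For concreteness, by "fast runtime compilation" I mean roughly what NVRTC already gives you on the CUDA side. A rough sketch, with error handling omitted and an arbitrary kernel string (link with -lnvrtc -lcuda):

    // runtime-compile a kernel string with NVRTC, the feature a hot-reload
    // workflow depends on; recompile + reload the module on every file change.
    #include <nvrtc.h>
    #include <cuda.h>

    int main() {
        const char* src =
            "extern \"C\" __global__ void scale(float* v, float s) {"
            "    v[threadIdx.x] *= s;"
            "}";
        nvrtcProgram prog;
        nvrtcCreateProgram(&prog, src, "scale.cu", 0, nullptr, nullptr);
        nvrtcCompileProgram(prog, 0, nullptr);     // fast enough to run per edit
        size_t n;
        nvrtcGetPTXSize(prog, &n);
        char* ptx = new char[n];
        nvrtcGetPTX(prog, ptx);

        cuInit(0);
        CUdevice dev; CUcontext ctx; CUmodule mod; CUfunction fn;
        cuDeviceGet(&dev, 0);
        cuCtxCreate(&ctx, 0, dev);
        cuModuleLoadData(&mod, ptx);               // hot-swap the module here
        cuModuleGetFunction(&fn, mod, "scale");
        // ... launch fn with cuLaunchKernel as usual ...
        return 0;
    }

(ROCm does ship a hiprtc counterpart nowadays; the question is whether the whole loop is fast and reliable enough to iterate on.)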
◧◩◪◨
14. hulitu+7b[view] [source] [discussion] 2023-03-05 08:51:51
>>alanfr+z6
Huh? Compared to the open source nvidia driver, which could do nothing?

I had a Riva TNT2 card. The only "accelerated" thing it could do in X was DGA (direct graphics access). I switched to ATI and never looked back. Of course, you could use the proprietary driver - if you had enough time to solve installation problems and didn't mind frequent crashes.

replies(3): >>badsec+qg >>onphon+jz >>anthk+QP
◧◩◪◨
15. foxhil+7c[view] [source] [discussion] 2023-03-05 09:06:49
>>alanfr+z6
amdgpu is new. you may be thinking about fglrx: a true hell.
replies(1): >>alanfr+Gna
◧◩◪◨
16. foxhil+cc[view] [source] [discussion] 2023-03-05 09:08:17
>>Gordon+B7
it doesn’t do CUDA, but it does do opencl and vulkan compute
replies(1): >>Gordon+No
◧◩◪◨
17. foxhil+vc[view] [source] [discussion] 2023-03-05 09:13:18
>>rabf+75
yes, i am very familiar with that pain. fglrx was hell compared to nvidia.

nvidia being the only viable solution for 3d on linux is a bit of an exaggeration imo (source: i did it for 5 years), but that was a long time ago: we have amdgpu, which is far superior to nvidia’s closed source driver.

◧◩◪
18. pixele+Ec[view] [source] [discussion] 2023-03-05 09:15:34
>>ribs+a8
If you want to run compute on AMD GPU hardware on Linux, it does work. However, it's not as portable as CUDA: you practically have to compile your code for every AMD GPU architecture, whereas with CUDA the nvidia drivers give you an abstraction layer (ish - it's really PTX which provides it) that is forwards and backwards compatible, which makes it trivial to support new cards / generations of cards without recompiling anything.
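
Roughly what that looks like at build time - the flags are real, but the architecture lists are just examples:

    // same trivial kernel under both build models
    __global__ void scale(float* v, float s) { v[threadIdx.x] *= s; }

    // CUDA: embed PTX alongside the binary; the driver JIT-compiles it
    // for GPUs that didn't exist when you shipped:
    //   nvcc -gencode arch=compute_70,code=compute_70 scale.cu
    //
    // ROCm/HIP: bake a code object per GPU architecture; a card missing
    // from the list simply won't run the binary:
    //   hipcc --offload-arch=gfx906 --offload-arch=gfx90a scale.cpp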
19. agumon+Bd[view] [source] 2023-03-05 09:31:46
>>rektid+(OP)
Some say the NURBS model also didn't fit with the culture at the time, and wasn't supported by modeling or texturing tools. Game devs would get faster results with triangles than with NURBS. Not sure who should have footed the bill, game studios or nvidia.
◧◩◪
20. doikor+3f[view] [source] [discussion] 2023-03-05 09:54:16
>>ribs+a8
With supercomputers you write your code for that specific supercomputer, and in such an environment ROCm works OK. Trying to make a piece of ROCm code work across different cards/setups is a real pain (and not that easy with CUDA either, if you want good performance).
◧◩◪
21. pjmlp+9g[view] [source] [discussion] 2023-03-05 10:11:33
>>ribs+a8
Some random HPC lab with enough weight to have an AMD team drop by isn't the same thing as the average Joe and Jane developer.
◧◩◪◨⬒
22. badsec+qg[view] [source] [discussion] 2023-03-05 10:16:07
>>hulitu+7b
> Compared to the open source nvidia driver, which could do nothing?

Compared to the official Nvidia driver.

> if you had enough time to solve installation problems and didn't mind frequent crashes

I used Nvidia GPUs from ~2001 to ~2018 on various machines with various GPUs, and I never had any such issues on Linux. I always used the official driver installer and it worked perfectly fine.

◧◩◪◨⬒
23. Gordon+No[view] [source] [discussion] 2023-03-05 12:00:10
>>foxhil+cc
Maybe, but nothing really uses that, at least for video.
◧◩◪◨⬒
24. onphon+jz[view] [source] [discussion] 2023-03-05 13:36:36
>>hulitu+7b
Did people not try the nvidia driver back then? Even to a casual user at the time, it was miles ahead - but it wasn’t open source.
◧◩◪◨⬒
25. anthk+QP[view] [source] [discussion] 2023-03-05 15:38:12
>>hulitu+7b
DGA and later XV.
◧◩
26. verall+SU[view] [source] [discussion] 2023-03-05 16:07:34
>>DeathA+D8
Nvidia makes superior graphics cards which are for dirty gamers while Apple makes superior webshit development machines.
◧◩◪◨
27. nick__+oZ[view] [source] [discussion] 2023-03-05 16:34:02
>>rabf+75
I was a big Matrox fan, mostly because I knew someone there, and was able to upgrade their products at a significant discount. This was important for me as a teenager whose only source of income was power washing eighteen-wheelers and their associated semi-trailers. It was a dirty and somewhat dangerous job, but I fondly remember my first job. Anyway, I digress, so let's get back to the topic of Matrox cards.

The MGA Millennium had unprecedented image quality, and its RAMDAC was in a league of its own. The G200 had the best 3D image quality when it was released, but it was really slow and somewhat buggy outside of Direct3D, where it shined. However, even with my significant discount and my fanboyism, when the G400 was released I defected to NVIDIA, since the G400's relative performance was abysmal.

replies(1): >>antod+wU1
◧◩◪
28. bsder+UH1[view] [source] [discussion] 2023-03-05 20:58:56
>>rektid+j2
As much as I hate Nvidia, AMD and Intel have done themselves zero favors in the space.

It's not that hard--you must provide a way to use CUDA on your hardware. Either support it directly, transcompile it, emulate it, provide shims--anything. After that, you can provide your own APIs that take advantage of every extra molecule of performance.

And neither AMD nor Intel has thrown down the money to do it. That's all it is: money. You have an army of folks in the space who would love to use anything other than Nvidia, and who would do all the work if you just threw money at them.
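
For a sense of how mechanical the "transcompile" route can be, here's a sketch of the rename that AMD's hipify tools automate (names and sizes are arbitrary illustration):

    // HIP is a near-1:1 rename of the CUDA runtime API
    #include <hip/hip_runtime.h>

    __global__ void scale(float* v, float s) { v[threadIdx.x] *= s; }

    int main() {
        float* v;
        hipMalloc(&v, 256 * sizeof(float));    // was: cudaMalloc
        scale<<<1, 256>>>(v, 2.0f);            // same launch syntax under hipcc
        hipDeviceSynchronize();                // was: cudaDeviceSynchronize
        hipFree(v);                            // was: cudaFree
        return 0;
    }

The renaming is the easy part; the per-architecture binaries and the driver stack are where the money has to go.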

◧◩◪◨⬒
29. antod+wU1[view] [source] [discussion] 2023-03-05 22:12:43
>>nick__+oZ
One use case Matrox kept doing well was X11 multi-monitor desktops. The G400 era was about the time I was drifting away from games and moving to full-time Linux, so they suited me at least.
◧◩◪◨⬒
30. alanfr+Gna[view] [source] [discussion] 2023-03-08 10:08:42
>>foxhil+7c
No, I was thinking about amdgpu. amdgpu, the open source driver, has for the last 4-5 years been better than nvidia's closed source driver (excluding the CUDA vs OpenCL/ROCm debacle, of course).

fglrx was indeed always a terrible experience, so back then AMD was no match for nvidia's closed source driver.

So, once upon a time (I'd say 2000-2015) the best Linux driver for discrete GPUs was nVidia's closed source one. Nowadays it's the AMD open source one. Intel has always been good, but doesn't provide the right amount of power.

[go to top]