zlacker

[return to "3dfx: So powerful it’s kind of ridiculous"]
1. ChuckM+25[view] [source] 2023-03-05 05:41:02
>>BirAda+(OP)
My first video accelerator was the Nvidia NV-1, because a friend of mine was on the design team and he assured me that NURBS were going to be the dominant rendering model, since you could do a sphere with just 6 of them, whereas triangles needed like 50 and it still looked like crap. But Nvidia was so tight-fisted with development details and all their "secret sauce" that none of my programs ever worked on it.
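
For the curious: the reason patches can do a sphere exactly is that rational quadratics trace true circular arcs, which no finite pile of triangles can. A little C sketch of the textbook quarter-circle construction (nothing NV-1-specific, just the math):

    /* Quadratic rational Bezier arc tracing an exact quarter circle:
     * control points P0=(1,0), P1=(1,1), P2=(0,1), weights {1, sqrt(2)/2, 1}.
     * Every sample lands on x^2 + y^2 = 1; a mesh only approximates that. */
    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        const double px[3] = {1.0, 1.0, 0.0};
        const double py[3] = {0.0, 1.0, 1.0};
        const double w[3]  = {1.0, sqrt(2.0) / 2.0, 1.0};

        for (int i = 0; i <= 8; i++) {
            double t = i / 8.0;
            /* Quadratic Bernstein basis functions. */
            double bf[3] = {(1 - t) * (1 - t), 2 * t * (1 - t), t * t};
            double nx = 0, ny = 0, den = 0;
            for (int k = 0; k < 3; k++) {
                nx  += w[k] * bf[k] * px[k];
                ny  += w[k] * bf[k] * py[k];
                den += w[k] * bf[k];
            }
            printf("t=%.3f  (%.6f, %.6f)  radius=%.9f\n",
                   t, nx / den, ny / den, hypot(nx / den, ny / den));
        }
        return 0;
    }

Every printed radius comes out 1.000000000: the curve sits on the circle exactly, at any sampling density you like.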

Then I bought a 3dfx Voodoo card and started using Glide, and it was night and day. I had something up the first day, and every day thereafter it seemed to get more and more capable. That was a lot of fun.
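
To give a sense of how little ceremony Glide needed, here is roughly what a first program looked like. This is a from-memory sketch of the Glide 2.x C API (exact signatures and constant names varied between SDK releases), so treat it as illustrative rather than compile-ready:

    /* One Gouraud-shaded triangle on a Voodoo, Glide 2.x style. */
    #include <glide.h>

    int main(void)
    {
        GrHwConfiguration hw;
        GrVertex a = {0}, b = {0}, c = {0};

        grGlideInit();
        if (!grSstQueryHardware(&hw))
            return 1;                     /* no Voodoo board found */
        grSstSelect(0);
        grSstWinOpen(0, GR_RESOLUTION_640x480, GR_REFRESH_60Hz,
                     GR_COLORFORMAT_ABGR, GR_ORIGIN_UPPER_LEFT, 2, 1);

        /* Shade with the iterated per-vertex color. */
        grColorCombine(GR_COMBINE_FUNCTION_LOCAL, GR_COMBINE_FACTOR_NONE,
                       GR_COMBINE_LOCAL_ITERATED, GR_COMBINE_OTHER_NONE,
                       FXFALSE);
        grBufferClear(0x00000000, 0, GR_WDEPTHVALUE_FARTHEST);

        /* Vertices are already in screen space: x, y in pixels. */
        a.x = 320; a.y = 100; a.r = 255; a.g = 0;   a.b = 0;
        b.x = 500; b.y = 380; b.r = 0;   b.g = 255; b.b = 0;
        c.x = 140; c.y = 380; c.r = 0;   c.g = 0;   c.b = 255;

        grDrawTriangle(&a, &b, &c);
        grBufferSwap(1);

        grGlideShutdown();
        return 0;
    }

No device enumeration, no state objects: compare that with the page of COM boilerplate an early Direct3D program needed for the same triangle.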

In my opinion, DirectX was what killed it, more than anything else. OpenGL was well supported on the Voodoo cards, and Microsoft was determined to kill anyone using OpenGL (which they didn't control) to program games if they could. After about 5 years (DirectX 7 or 8) it had reached feature parity, but long before that the "co-marketing" dollars Microsoft used to enforce their monopoly had done most of the work.

Sigh.

2. Razeng+K5[view] [source] 2023-03-05 05:58:18
>>ChuckM+25
> he assured me that NURBs were going to be the dominant rendering model

Wow, this sounds like one of those little cases where a few different decisions could easily have led us into an alternate parallel world :)

Can someone expand on why NURBS didn't/don't win out against polygons?

Could this be like AI/ML/VR/functional programming, where the idea had been around for decades but could only be practically implemented once we had sufficient hardware and advances in other fields?

3. rektid+07[view] [source] 2023-03-05 06:13:59
>>Razeng+K5
Because it's exactly like the parent said: Nvidia is, and always has been, a tight-fisted tightwad that makes everything they do ultra-proprietary. Nvidia never creates standards or participates in them.

Sometimes, like with CUDA, they just have an early enough lead that they entrench.

Vile player. They're worse than IBM. Soulless & domineering to every extent possible. What a sad story.

4. rabf+1a[view] [source] 2023-03-05 06:54:54
>>rektid+07
Nvidia has had driver parity across Linux, FreeBSD and Windows for many, many years. No other graphics card manufacturer has come close to the quality of their software stack across platforms. For that they have my gratitude.
5. foxhil+mb[view] [source] 2023-03-05 07:11:57
>>rabf+1a
DLSS was Windows-only for some time.

Linux's amdgpu is far better than the nvidia driver.

6. alanfr+zd[view] [source] 2023-03-05 07:44:46
>>foxhil+mb
amdgpu is better now, but it was terrible for years, probably 2000-2015. That's what GP is saying.
7. foxhil+7j[view] [source] 2023-03-05 09:06:49
>>alanfr+zd
amdgpu is new. You may be thinking of fglrx: a true hell.
8. alanfr+Gua[view] [source] 2023-03-08 10:08:42
>>foxhil+7j
No, I was thinking about amdgpu. For the last 4-5 years amdgpu, the open source driver, has been better than Nvidia's closed source driver (excluding the CUDA vs OpenCL/ROCm debacle, ofc).

fglrx was indeed always a terrible experience, so AMD was no match for Nvidia's closed source driver.

So, once upon a time (I'd say 2000-2015) the best Linux driver for discrete GPUs was Nvidia's closed source one. Nowadays it's AMD's open source one. Intel's has always been good, but their GPUs don't provide enough power.
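
If anyone wants to check which of these drivers is actually bound to a card on a given box, the sysfs driver symlink says so. A minimal C sketch (Linux-only; assumes the usual /sys/class/drm layout):

    /* Print the kernel driver bound to each DRM GPU node, e.g.
     * "card0 -> amdgpu" or "card0 -> nvidia", by resolving the
     * /sys/class/drm/cardN/device/driver symlink. */
    #include <stdio.h>
    #include <limits.h>
    #include <unistd.h>
    #include <glob.h>
    #include <libgen.h>

    int main(void)
    {
        glob_t g;
        /* card[0-9] skips connector nodes like card0-HDMI-A-1. */
        if (glob("/sys/class/drm/card[0-9]", 0, NULL, &g) != 0) {
            fprintf(stderr, "no DRM devices found\n");
            return 1;
        }
        for (size_t i = 0; i < g.gl_pathc; i++) {
            char link[PATH_MAX], target[PATH_MAX];
            snprintf(link, sizeof link, "%s/device/driver", g.gl_pathv[i]);
            ssize_t n = readlink(link, target, sizeof target - 1);
            if (n < 0)
                continue;                 /* no driver bound */
            target[n] = '\0';
            printf("%s -> %s\n", g.gl_pathv[i], basename(target));
        }
        globfree(&g);
        return 0;
    }

On a current AMD box this prints amdgpu; with Nvidia's proprietary stack it prints nvidia.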

[go to top]