zlacker

[return to "3dfx: So powerful it’s kind of ridiculous"]
1. ChuckM+25[view] [source] 2023-03-05 05:41:02
>>BirAda+(OP)
My first video accelerator was the Nvidia NV-1 because a friend of mine was on the design team and he assured me that NURBs were going to be the dominant rendering model since you could do a sphere with just 6 of them, whereas triangles needed like 50 and it still looked like crap. But Nvidia was so tight fisted with development details and all their "secret sauce" none of my programs ever worked on it.
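(The "6 patches vs. ~50 triangles" comparison can be made concrete with a quick back-of-the-envelope count for a UV-tessellated sphere. This is just a sketch — exact numbers depend on the tessellation scheme, and `uv_sphere_triangles` is a made-up helper name, not anything from the NV-1 SDK.)

```python
def uv_sphere_triangles(stacks: int, slices: int) -> int:
    """Triangle count for a UV sphere: each interior stack/slice
    quad splits into 2 triangles; the two pole caps are fans of
    single triangles."""
    quads = (stacks - 2) * slices   # interior bands
    caps = 2 * slices               # top and bottom fans
    return 2 * quads + caps

# A coarse 5x6 sphere already needs 48 triangles and still
# looks faceted, while a NURBS sphere is exact with 6 patches.
print(uv_sphere_triangles(5, 6))   # 48
print(uv_sphere_triangles(16, 32)) # 960 for a reasonably smooth one
```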

Then I bought a 3DFx Voodoo card and started using Glide and it was night and day. I had something up the first day and every day thereafter it seemed to get more and more capable. That was a lot of fun.

In my opinion, DirectX was what killed it most. OpenGL was well supported on the Voodoo cards, and Microsoft was determined to kill anyone using OpenGL (which they didn't control) to program games if they could. After about 5 years (DirectX 7 or 8) it had reached feature parity, but long before that the "co-marketing" dollars Microsoft used to enforce their monopoly had done most of the work.

Sigh.

◧◩
2. Razeng+K5[view] [source] 2023-03-05 05:58:18
>>ChuckM+25
> he assured me that NURBs were going to be the dominant rendering model

Wow, this sounds like those little cases where a few different decisions could have easily led us down into an alternate parallel world :)

Can someone expand on why NURBs didn't/don't win out against polygons?

Could this be like AI/ML/VR/Functional Programming, where the idea had been around for decades but could only be practically implemented now after we had sufficient hardware and advances in other fields?

◧◩◪
3. rektid+07[view] [source] 2023-03-05 06:13:59
>>Razeng+K5
Because it's exactly like the parent said: Nvidia always has been a tightfisted tightwad that makes everything they do ultra-proprietary. Nvidia never creates standards or participates in them.

Sometimes, like with CUDA, they just have an early enough lead that they entrench.

Vile player. They're worse than IBM. Soulless & domineering to the max, to every extent possible. What a sad story.

◧◩◪◨
4. rabf+1a[view] [source] 2023-03-05 06:54:54
>>rektid+07
Nvidia has had driver parity for linux, freebsd and windows for many many years. No other graphics card manufacturer has come close to the quality of their software stack across platforms. For that they have my gratitude.
◧◩◪◨⬒
5. foxhil+mb[view] [source] 2023-03-05 07:11:57
>>rabf+1a
DLSS was windows only for some time.

linux’s amdgpu is far better than the nvidia-driver.

◧◩◪◨⬒⬓
6. rabf+7c[view] [source] 2023-03-05 07:25:03
>>foxhil+mb
ATI drivers were a horror show for the longest time on windows, never mind linux. What Nvidia did was have basically the same driver code for all operating systems with a compatibility shim. If you were using any sort of professional 3d software over the previous 2 decades, Nvidia was the only viable solution.

Source: Was burned by ATI, Matrox, 3dlabs before finally coughing up the cash for Nvidia.

◧◩◪◨⬒⬓⬔
7. foxhil+vj[view] [source] 2023-03-05 09:13:18
>>rabf+7c
yes, i am very familiar with that pain. fglrx was hell compared to nvidia.

nvidia being the only viable solution for 3d on linux is a bit of an exaggeration imo (source: i did it for 5 years), but that was a long time ago: we have amdgpu, which is far superior to nvidia’s closed source driver.
