zlacker

3dfx: So powerful it’s kind of ridiculous
1. ChuckM+25 2023-03-05 05:41:02
>>BirAda+(OP)
My first video accelerator was the Nvidia NV-1, because a friend of mine was on the design team and he assured me that NURBS were going to be the dominant rendering model: you could do a sphere with just 6 of them, whereas triangles needed something like 50 and it still looked like crap. But Nvidia was so tight-fisted with development details and all their "secret sauce" that none of my programs ever worked on it.

Then I bought a 3dfx Voodoo card and started using Glide, and it was night and day. I had something on screen the first day, and every day thereafter it seemed to get more capable. That was a lot of fun.
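For a sense of how little ceremony Glide required, here is a minimal sketch from memory of the Glide 2.x API: open the screen, clear it, and draw one Gouraud-shaded triangle. Constant names and exact signatures are from recollection and may differ slightly between Glide releases, so treat this as approximate rather than authoritative.

    /* Minimal Glide 2.x sketch (from memory; constants and signatures
     * may vary slightly between Glide releases). */
    #include <glide.h>

    int main(void)
    {
        GrHwConfiguration hw;
        GrVertex a = { 0 }, b = { 0 }, c = { 0 };

        grGlideInit();
        if (!grSstQueryHardware(&hw))   /* no Voodoo board present */
            return 1;
        grSstSelect(0);                 /* use the first board */
        grSstWinOpen(0, GR_RESOLUTION_640x480, GR_REFRESH_60Hz,
                     GR_COLORFORMAT_ABGR, GR_ORIGIN_UPPER_LEFT, 2, 1);

        /* Shade with the iterated per-vertex color. */
        grColorCombine(GR_COMBINE_FUNCTION_LOCAL, GR_COMBINE_FACTOR_NONE,
                       GR_COMBINE_LOCAL_ITERATED, GR_COMBINE_OTHER_NONE,
                       FXFALSE);

        /* Glide takes pre-transformed screen-space vertices: no
         * matrices, no state machine to fight, which is why a first
         * triangle came together in an afternoon. */
        a.x = 160.0f; a.y = 120.0f; a.r = 255.0f;
        b.x = 480.0f; b.y = 120.0f; b.g = 255.0f;
        c.x = 320.0f; c.y = 360.0f; c.b = 255.0f;

        grBufferClear(0, 0, GR_WDEPTHVALUE_FARTHEST);
        grDrawTriangle(&a, &b, &c);
        grBufferSwap(1);                /* show it */

        grGlideShutdown();
        return 0;
    }

Early Direct3D famously needed a page of COM and execute-buffer boilerplate to get the same triangle up, which is a big part of why Glide felt so good by comparison.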

In my opinion, DirectX is what killed it. OpenGL was well supported on the Voodoo cards, and Microsoft was determined to kill off anyone using OpenGL (which they didn't control) to program games if they could. After about five years (DirectX 7 or 8) Direct3D had reached feature parity, but long before that the "co-marketing" dollars Microsoft used to enforce its monopoly had done most of the work.

Sigh.

2. flohof+Uo 2023-03-05 10:36:12
>>ChuckM+25
Microsoft pushing D3D was a good thing. OpenGL drivers were an even bigger mess back then than they are today, and drivers for popular 3D accelerators only implemented the 'happy path' needed to run GLQuake; the rest of the API was either very slow or sloppily implemented.

D3D was a terribly designed API in the beginning, but it caught up fast, and from around DX7 on it was objectively the better API. Microsoft also forced GPU vendors to actually ship conformant, performant drivers.

3. ChuckM+Ov1 2023-03-05 18:54:00
>>flohof+Uo
I see it a bit differently, but there is a lesson in here.

Microsoft pushed D3D to serve their own self-interest (which is a totally expected/okay thing for them to do), and the way they evolved it made it both Windows-only and ultimately incredibly complex: a lot of underlying GPU design leaks through the API into user code (or it did; I haven't written D3D code since DX10).

The lesson, though, is that APIs "succeed", regardless of quality, based on how many engineers are invested in having them succeed. Microsoft created a system whereby a GPU vendor could not only build a new feature into their GPU but also get Microsoft to make it part of the "standard" (see the discussion of the GeForce drivers elsewhere). That incentivizes manufacturers both to keep writing drivers for Microsoft's standard and to push developers toward it, which keeps their product in demand.

This is an old lesson (think rail-gauge standards as a means of making one company's locomotives the "right" ones to buy), and we see it repeated often. One of the places open source could make a huge impact on the world is standards. It isn't quite there yet, but I can see inklings of people coming around to that point of view.
