zlacker

[return to "3dfx: So powerful it’s kind of ridiculous"]
1. ChuckM+25[view] [source] 2023-03-05 05:41:02
>>BirAda+(OP)
My first video accelerator was the Nvidia NV-1 because a friend of mine was on the design team, and he assured me that NURBS were going to be the dominant rendering model since you could do a sphere with just 6 of them, whereas triangles needed like 50 and it still looked like crap. But Nvidia was so tight-fisted with development details and all their "secret sauce" that none of my programs ever worked on it.

Then I bought a 3dfx Voodoo card and started using Glide, and it was night and day. I had something up the first day, and every day thereafter it seemed to get more and more capable. That was a lot of fun.

In my opinion, DirectX was what killed it most. OpenGL was well supported on the Voodoo cards, and Microsoft was determined to kill anyone using OpenGL (which they didn't control) to program games if they could. After about 5 years (DirectX 7 or 8) it had reached feature parity, but long before that the "co-marketing" dollars Microsoft used to enforce their monopoly had done most of the work.

Sigh.

◧◩
2. rabf+E9[view] [source] 2023-03-05 06:48:52
>>ChuckM+25
Part of the success of DirectX over OpenGL was that very few graphics card companies seemed capable of producing a fully functional OpenGL driver; for the longest time Nvidia was the only option.

I recall ATI and Matrox both failing in this regard despite repeated promises.

◧◩◪
3. averev+Gc[view] [source] 2023-03-05 07:32:46
>>rabf+E9
A fully functional OpenGL driver was not exactly the issue, or not the only one.

OpenGL was stagnating at the time, so vendors started a feature war. OpenGL lets you have vendor-specific extensions, because it was meant for tightly integrated hardware and software, and vendors started leaning heavily on extensions to one-up each other.

The Khronos Group took ages to catch up and standardize modern features.

By that time GL extension checks had become nightmarishly complicated, and cross-compatibility was further damaged by vendors lying about their actual extension support: drivers started claiming support for extensions, but actually using them would leave the scene looking wrong or crash outright.
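
(Roughly the kind of probing code this turned into; a minimal sketch in old-school C against GL 1.x, with illustrative extension names and hand-rolled string matching, not anyone's actual engine code:)

    /* Parse the space-separated GL_EXTENSIONS string and branch per vendor.
     * glGetString/GL_EXTENSIONS are the real GL 1.x API; the branching below
     * is just an example of the per-vendor paths described above. */
    #include <GL/gl.h>
    #include <string.h>

    static int has_extension(const char *name)
    {
        const char *exts = (const char *)glGetString(GL_EXTENSIONS);
        const char *p = exts;
        size_t len = strlen(name);
        while (p && (p = strstr(p, name)) != NULL) {
            /* avoid substring hits, e.g. GL_EXT_texture vs GL_EXT_texture3D */
            if ((p == exts || p[-1] == ' ') && (p[len] == ' ' || p[len] == '\0'))
                return 1;
            p += len;
        }
        return 0;
    }

    void pick_multitexture_path(void)
    {
        if (has_extension("GL_ARB_multitexture")) {
            /* the eventually-standardized path */
        } else if (has_extension("GL_SGIS_multitexture")) {
            /* older vendor-specific variant */
        } else {
            /* multipass fallback -- and none of these checks tell you whether
             * the driver's claim actually works on the hardware in front of you */
        }
    }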

Developers looked at that, and no wonder they didn't want to take part in any of it.

This all beautifully exploded a few years later when Compiz started gaining a foothold, required this or that GL extension, and finally caused enough rage to get Khronos working on bringing the mess back under control.

By that time MS was already at DirectX 9, you could use XNA to target different architectures, and it brought networking and I/O libraries with it, making it a very convenient development environment.

*This is all a recollection from the late nineties/early 2000s and it's by now a bit blurred, so it's hard to fill in the details on the specific extensions. Nvidia was the one producing the most, but it's not like the blame is on them; Matrox's and ATI's wonky support while playing catch-up was overall more damaging. MS didn't really need to do much to win hearts with DX.

◧◩◪◨
4. qwerto+5i[view] [source] 2023-03-05 08:51:26
>>averev+Gc
Plus, MS was also trying to offer more than just graphics by adding audio and networking to the stack, which kind of started to make the whole ecosystem attractive, even if it was painful to program against.

I had my share of fun with DirectMusic.

◧◩◪◨⬒
5. mrguyo+wn8[view] [source] 2023-03-07 18:27:26
>>qwerto+5i
DirectInput (and later XInput) had its faults, but it's probably the only reason you can just plug random first- and third-party controllers into a USB port and expect everything to just work.
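
(To make the "just works" part concrete, here's a quick sketch, real xinput.h calls but otherwise just an illustration, of how any XInput-class pad, whoever made it, gets polled through the same four slots with the same fixed layout:)

    /* Poll the four XInput controller slots; every XInput-class pad, first or
     * third party, reports through the same XINPUT_STATE structure. */
    #include <windows.h>
    #include <xinput.h>
    #include <stdio.h>

    int main(void)
    {
        for (DWORD i = 0; i < XUSER_MAX_COUNT; i++) {   /* four slots */
            XINPUT_STATE state;
            ZeroMemory(&state, sizeof(state));
            if (XInputGetState(i, &state) == ERROR_SUCCESS) {
                /* same struct regardless of who made the pad */
                printf("pad %lu: buttons=0x%04x LX=%d LY=%d\n",
                       (unsigned long)i,
                       state.Gamepad.wButtons,
                       state.Gamepad.sThumbLX,
                       state.Gamepad.sThumbLY);
            }
        }
        return 0;   /* link against xinput.lib (or Xinput9_1_0.lib) */
    }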
[go to top]