zlacker

[return to "3dfx: So powerful it’s kind of ridiculous"]
1. ChuckM+25 2023-03-05 05:41:02
>>BirAda+(OP)
My first video accelerator was the Nvidia NV1 because a friend of mine was on the design team, and he assured me that NURBS were going to be the dominant rendering model since you could do a sphere with just 6 of them, whereas triangles needed like 50 and it still looked like crap. But Nvidia was so tight-fisted with development details and all their "secret sauce" that none of my programs ever worked on it.
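
To put his "6 versus like 50" in perspective: the pitch made sense on paper, since rational patches can represent a sphere exactly while a triangle mesh only approximates it. A back-of-the-envelope sketch of the triangle side, assuming a standard UV-sphere tessellation (the helper name is made up; it's just the arithmetic):

    /* Triangles in a standard UV-sphere tessellation: two fan caps of
     * `slices` triangles each, plus (stacks - 2) quad bands of
     * 2 * slices triangles each. */
    int uv_sphere_triangles(int slices, int stacks) {
        return 2 * slices + 2 * slices * (stacks - 2);
    }

    /* uv_sphere_triangles(5, 6) == 50, and a 5x6 sphere is visibly
     * faceted; that's the "looked like crap" part. */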

Then I bought a 3dfx Voodoo card and started using Glide, and it was night and day. I had something up the first day, and every day thereafter it seemed to get more and more capable. That was a lot of fun.
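
For anyone who never touched it, Glide was a thin C layer over the Voodoo hardware: no transform stage, you hand it screen-space vertices and it rasterizes them. A minimal sketch from memory of the Glide 2.x SDK (grGlideInit, grSstWinOpen, grDrawTriangle and friends are real entry points, but treat the exact arguments as approximate):

    #include <glide.h>

    int main(void) {
        GrVertex a = { 0 }, b = { 0 }, c = { 0 };

        grGlideInit();
        grSstSelect(0);   /* first (usually only) Voodoo board */
        grSstWinOpen(0, GR_RESOLUTION_640x480, GR_REFRESH_60Hz,
                     GR_COLORFORMAT_ARGB, GR_ORIGIN_UPPER_LEFT, 2, 1);

        /* draw the iterated per-vertex color, no texturing */
        grColorCombine(GR_COMBINE_FUNCTION_LOCAL, GR_COMBINE_FACTOR_NONE,
                       GR_COMBINE_LOCAL_ITERATED, GR_COMBINE_OTHER_NONE,
                       FXFALSE);

        grBufferClear(0x00000000, 0, GR_WDEPTHVALUE_FARTHEST);

        /* vertices are already in pixels: no matrices anywhere */
        a.x = 160; a.y = 120; a.r = 255; a.g = 0;   a.b = 0;
        b.x = 480; b.y = 120; b.r = 0;   b.g = 255; b.b = 0;
        c.x = 320; c.y = 360; c.r = 0;   c.g = 0;   c.b = 255;
        grDrawTriangle(&a, &b, &c);

        grBufferSwap(1);  /* flip on the next vertical retrace */
        grGlideShutdown();
        return 0;
    }

That is more or less the whole program, which is why "something up the first day" was realistic.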

In my opinion, DirectX did the most to kill it. OpenGL was well supported on the Voodoo cards, and Microsoft was determined to kill anyone using OpenGL (which it didn't control) to program games if it could. After about five years (DirectX 7 or 8) DirectX had reached feature parity, but long before that the "co-marketing" dollars Microsoft used to enforce its monopoly had done most of the work.

Sigh.

2. samsta+yF1 2023-03-05 19:55:48
>>ChuckM+25
Don't forget the co-marketing from Intel's DRG (the developer relations group I worked in), which started around 1995 with game optimization work on SIMD and, later, AGP, OpenGL, and the Unreal Engine (our lab had some of the very first iterations of it). NURBS were a major topic in the lab too, especially for OpenGL render tests, if you recall the NURBS dolphin benchmark/demo.
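
The SIMD side of that work was hand-vectorizing the hot loops games live in. A hypothetical sketch of the kind of rewrite involved, using the SSE intrinsics that arrived with the Pentium III (earlier rounds targeted MMX integer code; the function and its assumptions are mine, for illustration):

    #include <xmmintrin.h>  /* SSE intrinsics, Pentium III era */

    /* Scale an array of vertex coordinates: four floats per instruction
     * instead of one. Assumes n is a multiple of 4 and v is 16-byte
     * aligned, as hand-tuned code of the era typically did. */
    void scale_verts(float *v, int n, float s)
    {
        __m128 scale = _mm_set1_ps(s);
        for (int i = 0; i < n; i += 4) {
            __m128 x = _mm_load_ps(v + i);
            _mm_store_ps(v + i, _mm_mul_ps(x, scale));
        }
    }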

Intel would offer up to $1 million in marketing funds (I can't recall if that was a base figure, a set amount, or a cap) if my buddy and I ran our objective and subjective gaming tests between the two, looking for a subjective feel that the games ran better on Intel.

The objective tests were to determine if the games were actually using the SIMD instructions...
