zlacker

[return to "3dfx: So powerful it’s kind of ridiculous"]
1. former+DX 2023-03-05 15:43:13
>>BirAda+(OP)
3dfx is another example of "worse is better", or "slope vs intercept". This article doesn't quite spell it out, but as far as I can tell, 3dfx had one GPU architecture that was tweaked from 1995 until their end. For a few years it was much better than what the competitors put out, but the competitors kept iterating: Nvidia created several generations of GPUs in the same time frame. 3dfx started higher but had no slope. Everyone else started much lower but had a lot of slope (Nvidia and ATI in particular). Two of those went on to create a new "fastest ever GPU" every other year for a quarter century; the other kept putting more of the same GPU on bigger boards and folded.
2. djmips+a22 2023-03-05 22:15:14
>>former+DX
Yes, I agree wholeheartedly

Nvidia had a supercomputer and great hardware design software tools that were a trade secret, kept basically behind an off-limits curtain in the center of their office, and that helped them get chips out rapidly and on first turn. First turn means the first silicon coming back is good, without requiring fixes and another costly turn.

I'd say 3dfx weren't poised to industrialize as well as Nvidia, and they just couldn't keep up in the evolutionary race.

I'm not sure I understand where your "worse is better" idiom fits, because 3dfx was better and Nvidia was worse but iterated until it was better than 3dfx and won the day. If worse were truly better in this case, wouldn't 3dfx still be around?

On the other hand, triangle-based rendering is a case of worse is better, and Nvidia learned that and switched course from their early attempts with NURBS-based primitives.

3. johnwa+5j3 2023-03-06 10:58:46
>>djmips+a22
As soon as other cards had 32-bit color, Voodoo cards with 16-bit color looked a lot worse.
4. rasz+lB3 2023-03-06 13:30:39
>>johnwa+5j3
Nvidia's 16-bit color depth up to the GF256 was rendered internally at 16-bit precision and looked really bad, while their 32-bit mode was just a marketing bullet point due to the 100% performance hit when enabled. 3dfx had a 16-bit framebuffer, but rendered internally at 32 bits and dithered the output, resulting in >20-bit effective color: https://www.beyond3d.com/content/articles/59/
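
A minimal sketch in C of what "render at higher precision, then dither down to 16 bits" means (my own ordered-dither quantizer for illustration, not 3dfx's actual hardware logic):

    #include <stdint.h>

    /* 4x4 Bayer matrix: spreads quantization error spatially, so smooth
       gradients turn into fine noise instead of visible banding. */
    static const uint8_t bayer4[4][4] = {
        {  0,  8,  2, 10 },
        { 12,  4, 14,  6 },
        {  3, 11,  1,  9 },
        { 15,  7, 13,  5 },
    };

    /* Quantize one 8-bit channel to `bits` bits, adding a
       position-dependent dither bias before truncating. */
    static uint8_t dither_channel(uint8_t c, int bits, int x, int y)
    {
        int step = 256 >> bits;   /* size of one quantization step */
        int v = c + (bayer4[y & 3][x & 3] * step) / 16;
        if (v > 255) v = 255;
        return (uint8_t)(v >> (8 - bits));
    }

    /* Pack a higher-precision internal RGB color into a dithered
       16-bit RGB565 framebuffer pixel. */
    uint16_t to_rgb565_dithered(uint8_t r, uint8_t g, uint8_t b, int x, int y)
    {
        uint16_t r5 = dither_channel(r, 5, x, y);
        uint16_t g6 = dither_channel(g, 6, x, y);
        uint16_t b5 = dither_channel(b, 5, x, y);
        return (uint16_t)((r5 << 11) | (g6 << 5) | b5);
    }

Plain truncation to 5/6/5 bands smooth gradients; the Bayer bias trades that banding for high-frequency noise the eye averages back out, which is why the dithered output reads as more than 16 bits.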

"Voodoo2 Graphics uses a programmable color lookup table to allow for programmable gamma correction. The 16-bit dithered color data from the frame buffer is used an an index into the gamma-correction color table -- the 24-bit output of the gamma-correction color table is then fed to the monitor or Television."
