zlacker

[return to "3dfx: So powerful it’s kind of ridiculous"]
1. ChuckM+25[view] [source] 2023-03-05 05:41:02
>>BirAda+(OP)
My first video accelerator was the Nvidia NV1, because a friend of mine was on the design team and he assured me that NURBS were going to be the dominant rendering model, since you could do a sphere with just 6 of them whereas triangles needed like 50 and it still looked like crap. But Nvidia was so tight-fisted with development details and all their "secret sauce" that none of my programs ever worked on it.

Then I bought a 3dfx Voodoo card and started using Glide, and it was night and day. I had something up the first day, and every day thereafter it seemed to get more and more capable. That was a lot of fun.

In my opinion, DirectX is what killed it most. OpenGL was well supported on the Voodoo cards, and Microsoft was determined to kill anyone using OpenGL (which they didn't control) to program games if they could. After about five years (DirectX 7 or 8) it had reached feature parity, but long before that the "co-marketing" dollars Microsoft used to enforce their monopoly had done most of the work.

Sigh.

2. Aardwo+wp[view] [source] 2023-03-05 10:44:00
>>ChuckM+25
Around 1999 we had a PC with both a Riva TNT and a Voodoo 2. The main games I played were Half-Life and Unreal 1 (in addition to various games that came bundled with hardware, like Monster Truck Madness and Urban Assault). I found the Riva TNT worked much better than the Voodoo 2 for those games: in the game options, the D3D or OpenGL renderers had fewer glitches, better-looking translucency in Unreal, etc. than the options that used the Voodoo card. In addition, the Riva TNT supported 32-bit color, while the Voodoo 2 only had 16-bit color and that awkward passthrough cable.

Maybe 1999 was just a little bit too late to still fully appreciate 3dfx; modern-day D3D and OpenGL took over around that time, so I just missed the proper Voodoo era by a hair.

Note that by OpenGL here I mean OpenGL on the Riva TNT (I assume the Voodoo card's renderer must have been labeled Glide or 3dfx in the settings). I've always seen D3D and OpenGL existing side by side, performing very similarly in most games I played and supporting the same cards, including the GeForce cards that came later. I mainly game using Wine/Proton on Linux now, by the way.

3. flohof+Op[view] [source] 2023-03-05 10:47:54
>>Aardwo+wp
Yep, as soon as the TNT came out it was pretty much over for 3dfx. Quake II on a Riva TNT running at 1024x768 was a sight to behold.
4. ChuckN+es[view] [source] 2023-03-05 11:21:11
>>flohof+Op
3D gaming at 1024x768 in 1998 would be like 8K gaming today.
5. orbita+bN[view] [source] 2023-03-05 14:35:42
>>ChuckN+es
More like 4K gaming today. A PII at 450 MHz with a Riva 128ZX or TNT easily ran Half-Life at 1152x864. (FPS expectations were also lower, however; nobody expected 100fps+ like we do today.)
6. Aardwo+bO1[view] [source] 2023-03-05 20:53:59
>>orbita+bN
> nobody expected 100fps+

Not for in-game framerates, indeed, but I did set my CRT monitor to 120 Hz to avoid eyestrain. You could effortlessly switch between refresh rates from 60 Hz to 160 Hz or so on those monitors, and it was just a simple setting.

Today there are LCD monitors that can do (much) more than 60 Hz, but somehow it all comes with vendor-lock-in-sounding brand names that make it sound a bit unreliable [in the sense of overcomplicated and vendor-dependent] compared to back then, when it was just a number you could configure, a logical part of how the stuff worked.

7. synthe+QF2[view] [source] 2023-03-06 03:17:07
>>Aardwo+bO1
With respect to raw refresh rates, it's mostly the connectivity standards that are at fault. After VGA, things got a bit out of hand, with one connector after another in various form factors.

The part you're probably thinking of is GSync vs Freesync, which is a feature for tear-free dynamic refresh rates, something that was simply impossible in the CRT days but which does add some perceptual smoothness and responsiveness in games. Not using a compatible monitor just means you're syncing with the traditional fixed-rate system.

What has gotten way more complex is the software side of things, because we're in a many-core, many-thread world and a game can't expect to time its updates exactly to hit a target refresh, so things get buffers on top of buffers, and in-game configuration reflects that with various internal refresh-rate settings.
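
If it helps to see the difference in numbers, here's a toy calculation (plain C++, made-up frame times and a hypothetical 144 Hz panel, not any real graphics API) that prints when each finished frame would actually reach the screen under classic fixed 60 Hz vsync versus a VRR panel:

    // Toy comparison of fixed-refresh vsync vs. variable refresh rate (VRR).
    // Not any real graphics API -- just arithmetic showing when a finished
    // frame would be scanned out to the screen under each scheme.
    #include <algorithm>
    #include <cmath>
    #include <cstdio>
    #include <vector>

    int main() {
        const double fixed_period_ms = 1000.0 / 60.0;     // classic fixed 60 Hz vsync
        const double vrr_min_period_ms = 1000.0 / 144.0;  // hypothetical 144 Hz VRR panel

        // Made-up per-frame render times (ms) for an uneven workload.
        const std::vector<double> render_ms = {12.0, 18.0, 15.0, 22.0, 9.0, 17.0};

        double ready = 0.0;        // when the current frame finishes rendering
        double vrr_scanout = 0.0;  // when the previous frame was scanned out (VRR)

        std::printf("%5s %11s %16s %10s\n", "frame", "ready (ms)", "fixed 60Hz (ms)", "VRR (ms)");
        for (std::size_t i = 0; i < render_ms.size(); ++i) {
            ready += render_ms[i];

            // Fixed refresh: the frame waits for the next 60 Hz tick (or it tears).
            const double fixed_scanout = std::ceil(ready / fixed_period_ms) * fixed_period_ms;

            // VRR: scan out as soon as the frame is ready, limited only by the
            // panel's fastest refresh. (Real panels also have a slowest refresh
            // and repeat frames below it; this toy ignores that.)
            vrr_scanout = std::max(ready, vrr_scanout + vrr_min_period_ms);

            std::printf("%5zu %11.1f %16.1f %10.1f\n", i, ready, fixed_scanout, vrr_scanout);
        }
        return 0;
    }

With VRR the scanout tracks the render time directly instead of snapping to the next 16.7 ms tick, which is where the extra perceived smoothness comes from; in the fixed-rate column you can see frames piling up onto whole refresh periods.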

8. rasz+Mw3[view] [source] 2023-03-06 12:57:58
>>synthe+QF2
>GSync vs Freesync, which is a feature for tear-free dynamic refresh rates, something that was simply impossible in the CRT days

fun fact: the very same technique used by Freesync, delaying the vsync, works with CRTs

Genericness of Variable Refresh Rate (VRR works on any video source including DVI and VGA, even on MultiSync CRT tubes) https://forums.blurbusters.com/viewtopic.php?f=7&t=8889
