Maybe 1999 was just a little too late to fully appreciate 3dfx; modern-day D3D and OpenGL took over around that time, so I missed the proper Voodoo era by a hair.
Note that by OpenGL here I meant OpenGL on the Riva TNT (I assume the Voodoo card drivers must have been called Glide or 3dfx in the settings). I've always seen D3D and OpenGL existing side by side, performing very similarly in most games I played and supporting the same cards, including the GeForce cards that came later. I mainly game using Wine/Proton on Linux now, by the way.
Like here, the V3 seems to have a pretty handy lead in most cases https://www.anandtech.com/show/288/14 - but that's a TNT2 (not a TNT2 Ultra), and it's all at 16-bit colour depth (32-bit not being supported by the V3).
It was certainly an interesting time, and as a V3 owner I did envy that 32 bit colour depth on the TNT2 and the G400 MAX's gorgeous bump mapping :D
For years it was so hard to find LCDs that came close to the resolution I had on my CRT for a reasonable price.
Everyone had monitors that could do 1024x768 (and usually up to 1600x1200), whereas today's 8K monitors are much less ubiquitous in comparison.
It was also the first PC I ever installed Linux on. My dad would not let me do such a risky operation as dual booting Linux on the family computer. I don’t even remember what distro at this point.
Whatever. In late 1996, I got a PowerMac 8500/180DP (PowerPC 604e) and a 1024x768 monitor. The 8500 didn't have a separate graphics card; video was integrated on the motherboard with 4MB of VRAM (plus S-video and composite video in and out). It came bundled with Bungie's Marathon[1] (1994), which filled the screen in 16-bit color.
Edit: @throwawayx38: I 100% agree with you! Thanks for your reply.
Unless they had a dedicated hard drive, they would also need to resize existing FAT32/NTFS partitions and add a couple of new partitions for Linux. That process carried a certain amount of risk.
Maybe I'm still misinterpreting, but this was where my mind went. Man, I wish I could've appreciated how distinct the 90s were as a kid, but I was too young and dumb to have a shred of hope of being that aware!
1920x1080 would be 800x600 back in the day: the resolution everyone used for a viable (not just usable) desktop, comfortable for daily tasks such as browsing and word processing. Not top-end, but most games looked nice enough at 800x600; Unreal, Deus Ex and Max Payne looked great at that resolution.
32-bit on TNT ran at half the framerate; the performance hit was brutal. 16-bit on TNT was ugly AF due to poor internal precision, while 3dfx pulled off some dithering-based "~22-bit" magic.
"Voodoo2 Graphics uses a programmable color lookup table to allow for programmable gamma correction. The 16-bit dithered color data from the frame buffer is used an an index into the gamma-correction color table -- the 24-bit output of the gamma-correction color table is then fed to the monitor or Television."
but that's exactly what you got on a Voodoo2 with a P2 450 back then at 640x480 https://www.bluesnews.com/benchmarks/081598.html
Not for the games' framerates indeed, but I did set my CRT monitor to 120 Hz to avoid eyestrain. You could effortlessly switch between many refresh rates, from 60 Hz to 160 Hz or so, on those monitors; it was just a simple setting.
Today there are LCD monitors that can do (much) more than 60 Hz, but somehow they come wrapped in all those vendor-lock-in-sounding brand names that make it all sound a bit unreliable [in the sense of overcomplicated and vendor-dependent] compared to back then, when it was just a number you could configure, a logical part of how the stuff worked.
I never 'totally borked' a PC with LILO but definitely had to fix some things at least once, and that was with a nice thick Slackware 7.1 book to guide me.
GRUB, IIRC, vastly improved things, but it took a little while to get there and become truly 'easy-peasy'.
16-bit on TNT was fine for most of what I played at the time, although that was mostly Quake/Quake2 and a few other games. Admittedly I was much more into 2D (especially strategy) games back then, so 2D perf (and good VESA compat for DOS trash and emulators) was more important to me for the most part.
I think 3dfx had a good product but lost the plot somewhere in a combination of cutting 3rd parties out of the market and not integrating as deeply or as quickly, leaning instead on multi-chip designs and binning. VSA-100 was a good idea in theory, but the notion that they could make a working board with 4 chips in sync at an affordable cost was too bold, and probably a sign they needed to do some soul searching before going down that path.
Now, it's possible that's only discernible in hindsight. After all, these folks had seemed like engineering geniuses given what they had already pulled off. OTOH, when we consider the cost jump of a '1 to 2 to 4 CPU' system back then... maybe everyone was a bit too optimistic.
The part you're probably thinking of is G-Sync vs FreeSync, which is a feature for tear-free dynamic refresh rates, something that was simply impossible in the CRT days; it does add some perceptual smoothness and responsiveness in games. Not using a compatible monitor just means you're syncing with the traditional fixed-rate system.
What has gotten way more complex is the software side of things. We're in a many-core, many-thread world, and a game can't expect to time its updates exactly to hit a target refresh, so we end up with buffers on top of buffers, and in-game configuration reflects that with various internal refresh rate settings.
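As a minimal sketch of why those layers pile up (my own illustration, assuming a 120 Hz internal update rate and a small frame queue, not any engine's actual code): the simulation steps at its own fixed rate while finished frames queue up for the display to drain at whatever refresh it runs.

    import time
    from collections import deque

    SIM_HZ = 120                    # assumed internal update rate
    SIM_DT = 1.0 / SIM_HZ
    MAX_QUEUED_FRAMES = 3           # a "triple buffering"-style queue

    def run(seconds=1.0):
        frame_queue = deque(maxlen=MAX_QUEUED_FRAMES)
        accumulator = 0.0
        state = 0                   # stand-in for game state
        previous = time.monotonic()
        deadline = previous + seconds
        while time.monotonic() < deadline:
            now = time.monotonic()
            accumulator += now - previous
            previous = now
            # Step the simulation in fixed increments, independent of the display.
            while accumulator >= SIM_DT:
                state += 1
                accumulator -= SIM_DT
            # "Render" the latest state into the queue; the display/compositor
            # would drain this queue at its own refresh rate.
            frame_queue.append(state)
        print(f"simulated {state} ticks, {len(frame_queue)} frames still queued")

    run()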
Ditto with today's 1920x1080 desktop resolution on my Intel NUC and games at 1280x720.
But I could run 1280x1024@60 if I wanted, and a lot of games would run fine at 1024x768.
We could write to buffers at 60 Hz effortlessly with computers from 1999; speeds have increased more than enough to write to buffers at 120 Hz and beyond, even with 16x more pixels.
1/120th of a second is a huge amount of time in CPU/GPU clock ticks, more than enough to compute a frame and write it to a double buffer to swap, and more threads should make that easier, not harder: more threads can compute pixels, so the buffer fills faster.
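Rough back-of-the-envelope numbers for that claim; the clock speed, core count and resolution below are illustrative assumptions, not measurements:

    CLOCK_HZ = 3.0e9                     # assume a ~3 GHz core
    REFRESH_HZ = 120
    CORES = 8                            # assumed core count
    PIXELS = 3840 * 2160                 # a 4K framebuffer

    cycles_per_frame = CLOCK_HZ / REFRESH_HZ           # 25,000,000 cycles per core
    cycles_per_pixel = cycles_per_frame * CORES / PIXELS

    print(f"{cycles_per_frame:,.0f} cycles per 1/120 s frame on one core")
    print(f"~{cycles_per_pixel:.0f} cycles per 4K pixel across {CORES} cores")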
If there are problems with connector standards, with the software side of things, or with multithreading making all this require third-party complexity, then that's a problem of those connector standards, the software, and things like the LCD monitors themselves trying to be too smart and adding delay. Take also, for example, the AI upscaling done on NVidia cards now: it adds yet more latency (since it needs multiple frames to compute) and complexity (and I've seen it create artefacts too; I'd rather just have a predictable bicubic or Lanczos upscaling).
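For reference, that kind of "predictable" upscaling is just a classic single-frame resampling filter with no temporal state; a tiny sketch with Pillow (filenames here are placeholders):

    from PIL import Image

    def upscale(path_in, path_out, scale=2):
        img = Image.open(path_in)
        target = (img.width * scale, img.height * scale)
        # Deterministic per-frame scaling; swap LANCZOS for BICUBIC if preferred.
        img.resize(target, resample=Image.Resampling.LANCZOS).save(path_out)

    # Example (placeholder filenames):
    # upscale("frame_1080p.png", "frame_4k.png")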
Same with audio: why do people tolerate such latency with Bluetooth audio? aptX had much less latency, but the latest headphones don't support it anymore; all you get is a huge delay.
fun fact: the very same technique used by Freesync, delaying the vsync, works with CRTs
Genericness of Variable Refresh Rate (VRR works on any video source including DVI and VGA, even on MultiSync CRT tubes) https://forums.blurbusters.com/viewtopic.php?f=7&t=8889
Until 2020, this was always a myth. When matching features and performance, the price of a Mac was always within $100 of a PC that was its equal. Not anymore with Apple Silicon: now, when matching performance and features, you'll have a PC costing twice as much or more.
Between a Celeron 333A running at 550MHz and a dual Voodoo2 setup, you could drive games at pretty ridiculous frame rates.
Unfortunately, for mine the foolery didn't work perfectly: I recall the card was a bit unstable when running that way, or maybe it just didn't achieve the best performance possible for one of those fiddled M64s.
Not really a huge surprise, IIRC it was a super cheap card...
You were smarter than me. I wanted all those free compilers so badly I just went and installed Red Hat on the family PC. Ask me how well that conversation went with the old man...
The "Apple is expensive"-myth has been perpetuated since the days of 8-bit computing. Less expensive computers are cheaper because they have fewer features, use inferior parts, and are simply not as performant. But all that is behind us with Apple Silicon. Now you'd be hard-pressed to find a PC that performs half as well as the current line up of low-end Macs for their price.
For most entry-level stuff, performance is not that important, so that's not the metric customers focus on (price is). A desktop all-in-one from e.g. Lenovo starts at 600 euros; the cheapest iMac starts at 1500. A reasonable Windows laptop starts at around 400 euros, while a MacBook Air starts at 1000. It's not that the Apple machines aren't better, it's just that lots of folks here don't want to pay the entry fee.
Same reason most people here don't drive BMWs but cheaper cars.