For years it was so hard to find LCDs that came close to the resolution I had on my CRT for a reasonable price.
Everyone had monitors that could do 1024x768 (often up to 1600x1200), whereas 8K monitors are far less ubiquitous today by comparison.
Whatever. In late 1996, I got a PowerMac 8500/180DP (PowerPC 604e) and a 1024x768 monitor. The 8500 didn't even have a graphics card; graphics were integrated on the motherboard with 4MB of dedicated VRAM (plus S-video and composite video in and out). It came bundled with Bungie's Marathon[1] (1994), which filled the screen in 16-bit color.
1920x1080 would be 800x600 back in the day: the resolution everyone used for a viable (not just usable) desktop, comfortable for daily tasks such as browsing and using a word processor. Not top-end, but most games looked great at 800x600, such as Unreal, Deus Ex and Max Payne.
But that's exactly what you got on a Voodoo2 with a P2 450 back then, at 640x480: https://www.bluesnews.com/benchmarks/081598.html
Not for game framerates, indeed, but I did set my CRT monitor to 120 Hz to avoid eyestrain. You could effortlessly switch between refresh rates from 60 Hz to 160 Hz or so on those monitors; it was just a simple setting.
Today there are LCD monitors that can do (much) more than 60 Hz, but somehow it all comes wrapped in vendor-lock-in-sounding brand names that make it feel a bit unreliable [in the sense of overcomplicated and vendor-dependent] compared to back then, when it was just a number you configured, a logical part of how the hardware worked.
The part you're probably thinking of is G-Sync vs FreeSync, which is a feature for a tear-free dynamic refresh rate, something that was simply impossible in the CRT days but does add some perceptual smoothness and responsiveness in games. Not using a compatible monitor just means you're syncing with the traditional fixed-rate system.
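To illustrate the difference, a toy calculation (not how any particular driver or panel actually schedules things, and the frame times are made up): with a fixed 60 Hz display a finished frame waits for the next vblank, while a VRR display refreshes when the frame is ready.

    import math

    refresh_ms = 1000 / 60   # fixed 60 Hz: a new image only goes out every ~16.7 ms
    for render_ms in (12.0, 17.5, 14.0, 22.0, 15.5):   # hypothetical per-frame render times
        fixed = math.ceil(render_ms / refresh_ms) * refresh_ms   # wait for the next vblank
        vrr = render_ms                                          # display waits for the frame instead
        print(f"rendered in {render_ms:4.1f} ms -> shown after {fixed:4.1f} ms (fixed vsync) vs {vrr:4.1f} ms (VRR)")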
What has gotten way more complex is the software side of things, because we're in a many-core, many-thread world and a game can't expect to achieve exact timing of its updates to hit a target refresh, so things are getting buffers on top of buffers, and in-game configuration reflects that with various internal refresh rate settings.
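Those internal rate settings usually boil down to something like the classic fixed-timestep loop, where the simulation ticks at its own rate and rendering happens whenever the display is ready. A minimal sketch with placeholder update/render functions (names and the 120 Hz tick rate are just assumptions for the example):

    import time

    TICK_HZ = 120              # internal simulation rate, independent of the display
    TICK_DT = 1.0 / TICK_HZ

    def update(dt):            # placeholder simulation step
        pass

    def render(alpha):         # placeholder draw; alpha blends between the last two sim states
        pass

    accumulator = 0.0
    start = previous = time.perf_counter()
    while time.perf_counter() - start < 1.0:   # run the sketch for one second
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        # Run as many fixed-size simulation steps as fit into the elapsed time...
        while accumulator >= TICK_DT:
            update(TICK_DT)
            accumulator -= TICK_DT
        # ...then render at whatever rate the display side can take.
        render(accumulator / TICK_DT)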
Ditto with today's 1920x1080 desktop resolution on my Intel NUC and games at 1280x720.
But I could run 1280x1024@60 if I wanted. And a lot of games would run fine at 1024x768.
We could write to buffers at 60 Hz effortlessly with computers from 1999; speeds have increased more than enough to write to buffers at 120 Hz and beyond, even with 16x more pixels.
1/120th of a second is a huge amount of time in CPU/GPU clock ticks, more than enough to compute a frame, write it to the back buffer and swap. And more threads should make that easier, not harder: more threads can compute pixels, so pixels land in the buffer faster.
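Back-of-the-envelope, assuming a 3 GHz clock and a 1920x1080 frame (numbers picked just for illustration):

    clock_hz = 3e9                     # assumed 3 GHz clock
    refresh_hz = 120
    pixels = 1920 * 1080
    cycles_per_frame = clock_hz / refresh_hz
    print(f"{cycles_per_frame:,.0f} cycles per frame")                       # 25,000,000
    print(f"{cycles_per_frame / pixels:.1f} cycles per pixel, single core")  # ~12.1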
If there are problems with connector standards, the software side of things, or multithreading making it require third-party complexity, then that's a problem with those connector standards, the software, the LCD monitors themselves trying to be too smart and adding delay, etc. Take also the AI upscaling done on Nvidia cards now: it adds yet more latency (since it needs multiple frames to compute) and complexity (and I've seen it create artefacts too; I'd rather just have predictable bicubic or Lanczos upscaling).
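For comparison, a predictable Lanczos upscale is a couple of lines with an image library like Pillow (filenames here are hypothetical):

    from PIL import Image   # Pillow

    img = Image.open("frame_720p.png")              # hypothetical input frame
    up = img.resize((1920, 1080), Image.LANCZOS)    # deterministic Lanczos resample, no frame history needed
    up.save("frame_1080p.png")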
Same with audio: why do people tolerate such latency with Bluetooth audio? aptX had much less latency, but the latest headphones don't support it anymore, only codecs with huge delay.
Fun fact: the very same technique used by FreeSync, delaying the vsync, works with CRTs.
Genericness of Variable Refresh Rate (VRR works on any video source including DVI and VGA, even on MultiSync CRT tubes) https://forums.blurbusters.com/viewtopic.php?f=7&t=8889
Until 2020, this was always a myth. When matching features and performance, the price of a Mac was always within $100 of a PC that was its equal. Not anymore with Apple Silicon. Now, when matching performance and features, you'll have a PC costing twice as much or more.
Between a Celeron 333A running at 550 MHz and a dual Voodoo2 setup, you could drive games at pretty ridiculous frame rates.
The "Apple is expensive"-myth has been perpetuated since the days of 8-bit computing. Less expensive computers are cheaper because they have fewer features, use inferior parts, and are simply not as performant. But all that is behind us with Apple Silicon. Now you'd be hard-pressed to find a PC that performs half as well as the current line up of low-end Macs for their price.
For most entry-level stuff, performance is not that important, so that's not the metric customers focus on (price is). A desktop all-in-one from e.g. Lenovo starts at 600 euros; the cheapest iMac starts at 1500. A reasonable Windows laptop starts at around 400 euros, while the MacBook Air starts at 1000 euros. It's not that the Apple machines aren't better, it's just that lots of folks here don't want to pay the entry fee.
Same reason most people here don't drive BMWs but cheaper cars.