zlacker

[parent] [thread] 31 comments
1. flohof+(OP)[view] [source] 2023-03-05 10:47:54
Yep, as soon as the TNT came out it was pretty much over for 3dfx. Quake II on a Riva TNT running in 1024x768 was a sight to behold.
replies(4): >>ChuckN+q2 >>smcl+o6 >>rasz+q71 >>OOPMan+897
2. ChuckN+q2[view] [source] 2023-03-05 11:21:11
>>flohof+(OP)
3D gaming at 1024x768 in 1998 would be like 8K gaming today.
replies(6): >>jonhoh+Zf >>orbita+nn >>kodt+sr >>jbvers+jt >>Maursa+ju >>anthk+qU
3. smcl+o6[view] [source] 2023-03-05 12:04:09
>>flohof+(OP)
I feel like 3dfx still had a slight edge until the tail end of the Voodoo3 era (when the GF256 came out and blew everything away). But I think it depends what you prioritise.

Like here, the V3 seems to have a pretty handy lead in most cases https://www.anandtech.com/show/288/14 - but that's a TNT2 (not a TNT2 Ultra) and it's all at 16-bit colour depth (the V3 didn't support 32-bit).

It was certainly an interesting time, and as a V3 owner I did envy that 32 bit colour depth on the TNT2 and the G400 MAX's gorgeous bump mapping :D

replies(1): >>dwater+u02
4. jonhoh+Zf[view] [source] [discussion] 2023-03-05 13:31:47
>>ChuckN+q2
I had a Rage 128 in 1999 and easily gamed at 1024x768 - and ran Windows 98 at 1600x1200!

For years it was so hard to find LCDs that came close to the resolution I had on my CRT for a reasonable price.

replies(1): >>double+Pl
5. double+Pl[view] [source] [discussion] 2023-03-05 14:24:07
>>jonhoh+Zf
And the LCDs of the day really struggled with contrast ratios and ghosting.
6. orbita+nn[view] [source] [discussion] 2023-03-05 14:35:42
>>ChuckN+q2
More like 4K gaming today. PII 450MHz + Riva 128ZX or TNT easily ran Half-Life at 1152x864. (FPS expectations were also lower, however - nobody expected 100fps+ like we do today)
replies(5): >>eecc+hF >>rasz+S71 >>Aardwo+no1 >>ajnin+cr3 >>argleb+Zt3
7. kodt+sr[view] [source] [discussion] 2023-03-05 15:03:12
>>ChuckN+q2
I would say 4K, as it was the resolution those with a powerful PC could handle.

Everyone had monitors that could do 1024x768 (usually up to 1600x1200), whereas 8K monitors are far less ubiquitous today in comparison.

8. jbvers+jt[view] [source] [discussion] 2023-03-05 15:16:48
>>ChuckN+q2
Exactly. But now we are drowning in overhead.
9. Maursa+ju[view] [source] [discussion] 2023-03-05 15:22:14
>>ChuckN+q2
> 3D gaming at 1024x768 in 1998 would be like 8K gaming today.

Whatever. In late 1996, I got a PowerMac 8500/180DP (PowerPC 604e) and a 1024x768 monitor. The 8500 didn't have a separate graphics card; graphics were integrated on the motherboard with 4MB of VRAM (plus S-video and composite video in and out). It came bundled with Bungie's Marathon[1] (1994), which filled the screen in 16-bit color.

[1] https://en.wikipedia.org/wiki/Marathon_(video_game)

replies(1): >>bzzzt+yJ2
10. eecc+hF[view] [source] [discussion] 2023-03-05 16:25:13
>>orbita+nn
I had a Riva 128 and it was garbage, I was constantly kicking myself for cheaping out and not getting a Voodoo2
11. anthk+qU[view] [source] [discussion] 2023-03-05 17:46:06
>>ChuckN+q2
No, 4K. 1280x720 would be 640x480, the minimum usable for a Windows 95/98 desktop and most multimedia software. Today that resolution is about the minimum for 720p video and modern gaming.

1920x1080 would be 800x600 back in the day: what everyone used for a viable (not just usable) desktop, comfortable for daily tasks such as browsing and using a word processor. Not top-end, but most games would look nice enough at 800x600 - Unreal, Deus Ex and Max Payne all looked great there.
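For what it's worth, the pixel-count ratios behind these "what would X be today" mappings are easy to check. A quick sketch (the pairings are just the ones proposed in this thread, and the labels are mine):

```python
# Pixel counts behind the era-to-era resolution analogies.
RES = {
    "640x480": (640, 480),
    "800x600": (800, 600),
    "1024x768": (1024, 768),
    "1280x720": (1280, 720),
    "1920x1080": (1920, 1080),
    "3840x2160 (4K)": (3840, 2160),
}

def pixels(name):
    """Total pixel count for a named resolution."""
    w, h = RES[name]
    return w * h

# How much bigger is each modern resolution than its late-90s counterpart?
for old, new in [("640x480", "1280x720"),
                 ("800x600", "1920x1080"),
                 ("1024x768", "3840x2160 (4K)")]:
    ratio = pixels(new) / pixels(old)
    print(f"{new} has {ratio:.1f}x the pixels of {old}")
# -> 3.0x, 4.3x, and 10.5x respectively
```

So by raw pixel count the 1024x768-to-4K mapping is the closest of the three, which fits the "4K, not 8K" argument.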

replies(1): >>int_19+cz2
12. rasz+q71[view] [source] 2023-03-05 19:03:31
>>flohof+(OP)
>Quake II on a Riva TNT running in 1024x768

40 fps, or 70 fps on Voodoo2 SLI

https://www.bluesnews.com/benchmarks/081598.html

13. rasz+S71[view] [source] [discussion] 2023-03-05 19:06:17
>>orbita+nn
>nobody expected 100fps+

but that's exactly what you got on a Voodoo2 with a P2 450 back then at 640x480 https://www.bluesnews.com/benchmarks/081598.html

14. Aardwo+no1[view] [source] [discussion] 2023-03-05 20:53:59
>>orbita+nn
> nobody expected 100fps+

Not for in-game framerates, indeed, but I did set my CRT monitor to 120 Hz to avoid eyestrain. You could effortlessly switch between refresh rates from 60 Hz to 160 Hz or so on those monitors; it was just a simple setting.

Today there are LCD monitors that can do (much) more than 60 Hz, but somehow it all comes wrapped in vendor-lock-in-sounding brand names that make it sound a bit unreliable [in the sense of overcomplicated and vendor dependent] compared to back then, when it was just a number you could configure, a logical part of how the stuff worked.

replies(1): >>synthe+2g2
15. dwater+u02[view] [source] [discussion] 2023-03-06 00:57:38
>>smcl+o6
I worked at CompUSA the summer of 1999 and there were demo machines for the Voodoo3 and TNT2, and what I remember most was that the Voodoo looked muddy while the TNT2 looked crisp. Frame rates weren't different enough to have a clear winner, since there were 3 different Voodoo models and Nvidia had the Ultra. I ended up getting a TNT2 Ultra and loved it. Never had any compatibility issues that I remember.
replies(1): >>dbspin+mc3
16. synthe+2g2[view] [source] [discussion] 2023-03-06 03:17:07
>>Aardwo+no1
With respect to raw refresh rates, it's mostly the connectivity standards at fault. After VGA things got a bit out of hand with one connector after another in various form factors.

The part you're probably thinking of is G-Sync vs FreeSync, a feature for tear-free dynamic refresh rates, something that was simply impossible in the CRT days but does add some perceptual smoothness and responsiveness in games. Not using a compatible monitor just means you're doing sync with the traditional fixed-rate system.

What has gotten way more complex is the software side of things because we're in a many-core, many-thread world and a game can't expect to achieve exact timing of their updates to hit a target refresh, so things are getting buffers on top of buffers and in-game configuration reflects that with various internal refresh rate settings.
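The traditional fixed-rate side of this can be sketched in a few lines. A toy pacing loop, with names and numbers that are mine rather than from any real engine: do the frame's work, then wait out the rest of the refresh interval.

```python
import time

def frame_loop(refresh_hz=60, frames=5, work=lambda: None):
    """Naive fixed-rate frame pacing: run the frame's work, then sleep
    out the remainder of the refresh interval. Real engines layer
    buffering on top because `work` can overrun the budget unpredictably."""
    budget = 1.0 / refresh_hz
    for _ in range(frames):
        start = time.perf_counter()
        work()                             # simulate the game/render update
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)   # idle until the next "vblank"
        # if elapsed > budget, the refresh was missed: a stutter or tear
```

Variable refresh rate effectively lets the display wait for the frame instead of the other way around, which is why the miss-the-deadline branch above stops being a hard failure.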

replies(2): >>Aardwo+kN2 >>rasz+Y63
17. int_19+cz2[view] [source] [discussion] 2023-03-06 07:19:39
>>anthk+qU
It was much more common to run games at a lower resolution than the regular desktop, though. Even as 1024x768 became the norm for the desktop, the crazy iteration rate of 3D hardware meant that most people who couldn't afford a new card every year would stick to 640x480 for the more recent 3D games.
replies(1): >>anthk+7B2
18. anthk+7B2[view] [source] [discussion] 2023-03-06 07:41:43
>>int_19+cz2
Yes, I did that even in the Geforce 2MX days. Games maxed @800x600, desktop at 1024x768.

Ditto with today's 1920x1080 desktop resolution on my Intel NUC and games at 1280x720.

But I could run 1280x1024@60 if I wanted. And a lot of games would run fine at 1024x768.

19. bzzzt+yJ2[view] [source] [discussion] 2023-03-06 09:06:50
>>Maursa+ju
Probably cost a heck of a lot more than a comparable PC gaming setup in those days. Not to say it's not a great game, but Marathon uses a Doom-like 2.5D engine which doesn't need a 3D accelerator, just enough memory bandwidth to draw each frame (which the PowerMac obviously had). Life gets a lot more complicated when you have to render perspective-correct triangles with filtered textures, lighting and z-buffers in a true 3D space.
replies(1): >>Maursa+p93
20. Aardwo+kN2[view] [source] [discussion] 2023-03-06 09:49:18
>>synthe+2g2
I don't buy that it has to be this complex.

We could write to buffers at 60 Hz effortlessly with computers from 1999, speeds have increased more than enough to write to buffers at 120 Hz and more, even with 16x more pixels.

1/120th of a second is a huge amount of time in CPU/GPU clock ticks, more than enough to compute a frame and write it to a double buffer to swap, and more threads should make that easier to do, not harder: more threads can compute pixels so pixels can be put in the buffer faster.

If there are problems with connector standards, the software side of things, or multithreading requiring third-party complexity, then that's a problem of those connector standards, the software, things like the LCD monitors themselves trying to be too smart and add delay, etc... Take also for example the AI upscaling done in Nvidia cards now: adding yet more latency (since it needs multiple frames to compute) and complexity (and I've seen it create artefacts too; then I'd rather just have a predictable bicubic or Lanczos upscaling).

Same with audio: why do people tolerate such latency with Bluetooth audio? aptX had much less latency, but the latest headphones don't support it anymore, only huge delay.
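The "1/120th of a second is a huge amount of time" point above is easy to put numbers on. A back-of-the-envelope sketch, with clock speeds that are illustrative assumptions rather than measurements:

```python
def cycles_per_frame(clock_hz, refresh_hz):
    """Clock ticks available between two refreshes of the display."""
    return clock_hz / refresh_hz

# A PII 450 driving a 60 Hz CRT in 1999 vs a ~4 GHz core at 120 Hz today.
p2_1999 = cycles_per_frame(450e6, 60)
modern = cycles_per_frame(4e9, 120)

print(f"1999: {p2_1999 / 1e6:.1f}M cycles per frame")   # 7.5M
print(f"now:  {modern / 1e6:.1f}M cycles per frame")    # 33.3M
```

So even at double the refresh rate, a single modern core gets roughly four times the cycles per frame that the 1999 machine had, before counting the GPU at all, which is the crux of the "it shouldn't have to be this complex" argument.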

21. rasz+Y63[view] [source] [discussion] 2023-03-06 12:57:58
>>synthe+2g2
>GSync vs Freesync which is a feature for making a tear-free dynamic refresh rate, something that was simply impossible in the CRT days

fun fact: the very same technique used by FreeSync, delaying the vsync, works with CRTs

Genericness of Variable Refresh Rate (VRR works on any video source including DVI and VGA, even on MultiSync CRT tubes) https://forums.blurbusters.com/viewtopic.php?f=7&t=8889

22. Maursa+p93[view] [source] [discussion] 2023-03-06 13:17:19
>>bzzzt+yJ2
> Probably cost a heck of a lot more than a comparable PC gaming setup in those days.

Until 2020, this was always a myth. When matching features and performance, the price of a Mac was always within $100 of an equivalent PC. Not anymore with Apple Silicon. Now when matching performance and features you'll have a PC costing twice as much or more.

replies(1): >>bzzzt+ltb
23. dbspin+mc3[view] [source] [discussion] 2023-03-06 13:35:12
>>dwater+u02
I owned a Voodoo 3, while all my friends had TNT2s. My experience was the opposite. Not sure if it was the handling of anisotropic filtering, or some kind of texture filtering - but my Voodoo 3 was notably sharper on all the games of the time.
replies(1): >>smcl+lA3
24. ajnin+cr3[view] [source] [discussion] 2023-03-06 14:53:07
>>orbita+nn
I remember that the framerate was pretty important in Quake. Because player positions were interpolated linearly, you could miss the top of the parabola when jumping if your framerate was too low. I don't remember any numbers but getting a high framerate was definitely a concern when playing a bit competitively.
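A toy simulation of the effect described here (the constants are Quake-ish guesses, not the real movement code): if positions are only evaluated at frame boundaries, the highest rendered point of a jump can fall short of the true apex at low framerates.

```python
def peak_sampled_height(fps, v0=270.0, g=800.0):
    """Sample a jump parabola at frame intervals and return the highest
    position actually seen. v0 (launch speed) and g (gravity) are
    hypothetical Quake-like units, not taken from the game source."""
    dt = 1.0 / fps
    t, best = 0.0, 0.0
    while True:
        h = v0 * t - 0.5 * g * t * t   # ballistic height at time t
        if h < 0:
            break                      # back on the ground
        best = max(best, h)
        t += dt
    return best

true_apex = 270.0 ** 2 / (2 * 800.0)   # v0^2 / 2g, the analytic peak
print(true_apex)
print(peak_sampled_height(20))    # low fps: visibly below the apex
print(peak_sampled_height(120))   # high fps: nearly the full height
```

At low framerates the frames simply straddle the top of the parabola, which is one way a framerate-dependent jump shortfall can arise; the actual Quake behaviour also involved how the physics step itself was tied to the framerate.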
replies(1): >>orbita+KY5
25. argleb+Zt3[view] [source] [discussion] 2023-03-06 15:08:13
>>orbita+nn
My 200+ fps quake 2 config sure did. If you weren't running 200+ fps for online quake 2 you were in for a bad time.

Between a Celeron 333A running at 550MHz and a dual voodoo2 you could drive games at pretty ridiculous frame rates.

26. smcl+lA3[view] [source] [discussion] 2023-03-06 15:36:12
>>dbspin+mc3
It may even vary depending on the game, the settings, whether D3D, OpenGL or GLide were used or which versions of the drivers were installed.
replies(1): >>dbspin+0de
27. orbita+KY5[view] [source] [discussion] 2023-03-07 04:10:16
>>ajnin+cr3
Competitive play, sure, especially at low resolutions and everything disabled. Single player games were running at pretty low framerates though, with a few exceptions like Quake/Half-Life. Even 60fps was more of a luxury than a rule.
28. OOPMan+897[view] [source] 2023-03-07 14:50:32
>>flohof+(OP)
I had one of those weird TNT2 M64 cards that you could fool into thinking it was a proper TNT2.

Unfortunately for mine the foolery didn't work perfectly and I recall the card was a bit unstable when running that way, or maybe it didn't achieve the optimal performance possible for one of those fiddled M64s.

Not really a huge surprise, IIRC it was a super cheap card...

29. bzzzt+ltb[view] [source] [discussion] 2023-03-08 18:11:19
>>Maursa+p93
Here in the Netherlands Macs were outrageously expensive in the 90's. I only knew a few people who bothered to buy them (mostly because they wanted something simpler than a PC or because of Adobe stuff). Macs also used 'better' components at the time (SCSI drives instead of slow IDE, better sound, etc) so yes, if you wanted a comparable PC you had to pay up. But most people here had a much cheaper PC...
replies(1): >>Maursa+OJc
30. Maursa+OJc[view] [source] [discussion] 2023-03-09 01:51:47
>>bzzzt+ltb
Depending on options, there are at least 2-4 different US-made SUV models that cost half as much as a BMW X1, which is not exactly expensive as far as BMWs go.

The "Apple is expensive"-myth has been perpetuated since the days of 8-bit computing. Less expensive computers are cheaper because they have fewer features, use inferior parts, and are simply not as performant. But all that is behind us with Apple Silicon. Now you'd be hard-pressed to find a PC that performs half as well as the current line up of low-end Macs for their price.

replies(1): >>bzzzt+LJg
31. dbspin+0de[view] [source] [discussion] 2023-03-09 16:30:08
>>smcl+lA3
For sure, and I'm pretty certain I also tweaked the hell out of 3dfx tools too.
32. bzzzt+LJg[view] [source] [discussion] 2023-03-10 10:42:56
>>Maursa+OJc
There are workloads where a high-end PC outclasses a Mac for the same money (think top-end GPU or lots of memory, Apple doesn't have the first and wants a king sized markup for the second).

For most entry-level stuff performance is not that important, so that's not the metric customers focus on (price is). A desktop all-in-one from e.g. Lenovo starts at 600 euros; the cheapest iMac starts at 1500. A reasonable Windows laptop starts at around 400 euros while a MacBook Air starts at 1000. It's not that the Apple machines aren't better, it's just that lots of folks here don't want to pay the entry fee.

Same reason most people here don't drive BMWs but cheaper cars.
