zlacker

[return to "3dfx: So powerful it’s kind of ridiculous"]
1. ChuckM+25 2023-03-05 05:41:02
>>BirAda+(OP)
My first video accelerator was the Nvidia NV-1 because a friend of mine was on the design team and he assured me that NURBS were going to be the dominant rendering model, since you could do a sphere with just 6 of them whereas triangles needed like 50 and it still looked like crap. But Nvidia was so tight-fisted with development details and all their "secret sauce" that none of my programs ever worked on it.

Then I bought a 3dfx Voodoo card and started using Glide, and it was night and day. I had something up the first day, and every day thereafter it seemed to get more and more capable. That was a lot of fun.
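For anyone who never saw it, here's roughly what a minimal Glide 2.x program looked like. This is a sketch from memory, so treat the constants, default render state, and GrVertex layout as approximate:

    /* Rough Glide 2.x sketch: open the Voodoo, draw one Gouraud-shaded
       triangle in screen space, swap buffers. From memory; details
       varied between SDK releases. */
    #include <glide.h>

    int main(void)
    {
        GrVertex a = {0}, b = {0}, c = {0};

        grGlideInit();
        grSstSelect(0);                        /* first (only) Voodoo board */
        grSstWinOpen(0, GR_RESOLUTION_640x480, GR_REFRESH_60Hz,
                     GR_COLORFORMAT_ARGB, GR_ORIGIN_UPPER_LEFT, 2, 1);

        /* iterate the per-vertex colors across the triangle */
        grColorCombine(GR_COMBINE_FUNCTION_LOCAL, GR_COMBINE_FACTOR_NONE,
                       GR_COMBINE_LOCAL_ITERATED, GR_COMBINE_OTHER_NONE,
                       FXFALSE);

        a.x = 160; a.y = 120; a.r = 255;       /* pixel coords, 0..255 color */
        b.x = 480; b.y = 120; b.g = 255;
        c.x = 320; c.y = 360; c.b = 255;

        grBufferClear(0, 0, GR_WDEPTHVALUE_FARTHEST);
        grDrawTriangle(&a, &b, &c);
        grBufferSwap(1);                       /* wait one vertical retrace */

        grGlideShutdown();
        return 0;
    }

No window system to negotiate with and almost no state to set up before pixels hit the screen, which is a big part of why it felt like night and day.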

In my opinion, DirectX was what killed it most. OpenGL was well supported on the Voodoo cards, and Microsoft was determined to kill off anyone using OpenGL (which they didn't control) to program games if they could. After about 5 years (DirectX 7 or 8) it had reached feature parity, but long before that the "co-marketing" dollars Microsoft used to enforce their monopoly had done most of the work.

Sigh.

2. Razeng+K5 2023-03-05 05:58:18
>>ChuckM+25
> he assured me that NURBS were going to be the dominant rendering model

Wow, this sounds like one of those cases where a few different decisions could easily have led us down into an alternate, parallel world :)

Can someone expand on why NURBS didn't/don't win out against polygons?

Could this be like AI/ML/VR/functional programming, where the idea had been around for decades but could only be practically implemented once we had sufficient hardware and advances in other fields?

3. rektid+07 2023-03-05 06:13:59
>>Razeng+K5
Because it's exactly like the parent said: Nvidia always has been, and still is, a tight-fisted tightwad that makes everything they do ultra-proprietary. Nvidia never creates open standards or participates in them.

Sometimes, like with CUDA, they just have an early enough lead that they entrench.

Vile player. They're worse than IBM: soulless & domineering to every extent possible. What a sad story.

4. rabf+1a 2023-03-05 06:54:54
>>rektid+07
Nvidia has had driver parity across Linux, FreeBSD and Windows for many, many years. No other graphics card manufacturer has come close to the quality of their software stack across platforms. For that they have my gratitude.
5. foxhil+mb 2023-03-05 07:11:57
>>rabf+1a
DLSS was Windows-only for some time.

Linux's amdgpu is far better than the nvidia-driver.

6. rabf+7c 2023-03-05 07:25:03
>>foxhil+mb
ATI drivers were a horror show for the longest time on Windows, never mind Linux. What Nvidia did was have basically the same driver code for all operating systems behind a compatibility shim (roughly the pattern sketched below). If you were using any sort of professional 3D software over the previous two decades, Nvidia was the only viable solution.

Source: Was burned by ATI, Matrox and 3dlabs before finally coughing up the cash for Nvidia.
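To illustrate the pattern (a made-up sketch, not Nvidia's actual code, which has never been public): a shared, OS-agnostic core is written against a small table of OS services, and each platform only supplies a thin shim that fills in that table.

    /* Hypothetical "one driver core + per-OS shim" layout;
       every name here is invented for illustration. */

    /* os_shim.h -- the only surface the shared core ever sees */
    typedef struct OsShim {
        void *(*map_mmio)(unsigned long phys, unsigned long len);
        void  (*free_mmio)(void *va, unsigned long len);
        void  (*log)(const char *msg);
    } OsShim;

    /* core.c -- byte-for-byte identical on Windows, Linux and FreeBSD */
    int core_init(const OsShim *os, unsigned long bar0)
    {
        void *regs = os->map_mmio(bar0, 16ul << 20);
        if (!regs) {
            os->log("BAR0 mapping failed");
            return -1;
        }
        /* ... all the real GPU programming lives here ... */
        return 0;
    }

    /* linux_shim.c, win_shim.c, bsd_shim.c each just fill in an OsShim
       with native primitives (ioremap, MmMapIoSpace, ...) */

The upside is one battle-tested core everywhere; the downside is that the shim layer tends to fight the host OS's own conventions, which is roughly the complaint people have about the out-of-tree Linux driver.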

7. nick__+o61 2023-03-05 16:34:02
>>rabf+7c
I was a big Matrox fan, mostly because I knew someone there and was able to upgrade their products at a significant discount. This was important for me as a teenager whose only source of income was power-washing eighteen-wheelers and their associated semi-trailers. It was a dirty and somewhat dangerous job, but I remember my first job fondly. Anyway, I digress; let's get back to the topic of Matrox cards.

The MGA Millennium had unprecedented image quality, and its RAMDAC was in a league of its own. The G200 had the best 3D image quality when it was released, but it was really slow and somewhat buggy outside of Direct3D, where it shined. However, when the G400 was released, its relative performance was so abysmal that, even with my significant discount and my fanboyism, I defected to NVIDIA.

8. antod+w12 2023-03-05 22:12:43
>>nick__+o61
One use case Matrox kept doing well was X11 multi-monitor desktops. The G400 era was about the time I was drifting away from games and moving to full-time Linux, so they suited me, at least.