zlacker

[parent] [thread] 14 comments
1. rabf+(OP)[view] [source] 2023-03-05 06:48:52
Part of the success of DirectX over OpenGL was that very few graphics card companies seemed capable of producing a fully functional OpenGL driver; for the longest time Nvidia was the only option.

I recall ATI and Matrox both failing in this regard despite repeated promises.

replies(2): >>averev+23 >>tanepi+w83
2. averev+23[view] [source] 2023-03-05 07:32:46
>>rabf+(OP)
A fully functional OpenGL driver was not exactly the issue, or at least not the only one.

OpenGL was stagnating at the time, so vendors started a feature war. On OpenGL you can have vendor-specific extensions, because it was meant for tightly integrated hardware and software. Vendors started leaning heavily on extensions to one-up each other.

The Khronos Group took ages to catch up and standardize modern features.

By that time gl_ext checks had become nightmarishly complicated, and cross-compatibility was further damaged by vendors lying about their actual gl_ext support: drivers started claiming support for things the hardware could nominally do, but using the extension caused the scene to render wrong or outright crash.
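To give an idea, a typical check from that era looked roughly like this (a sketch from memory; the helper and the chosen extensions are just illustrative):

    // Naive extension probing, as commonly done at the time.
    #include <GL/gl.h>
    #include <cstring>

    static bool HasExtension(const char* name) {
        // glGetString(GL_EXTENSIONS) returns one big space-separated string.
        const char* exts = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        return exts && std::strstr(exts, name) != nullptr;  // naive substring match
    }

    void PickRenderPath() {
        // Typical fallback ladder: vendor-specific path first, then ARB, then give up.
        bool useNvCombiners = HasExtension("GL_NV_register_combiners");
        bool useArbMultitex = HasExtension("GL_ARB_multitexture");
        // ...and even when the string said "yes", the driver didn't always deliver.
    }

Multiply that by every vendor path a game wanted to support and it got ugly fast.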

Developers looked at that, and no wonder they didn't want to take part in any of it.

This all beautifully exploded a few years later when Compiz started gaining a foothold, which required this or that gl_ext and finally caused enough rage to get Khronos working on bringing the mess back under control.

By that time MS was already at DirectX 9, you could use XNA to target different architectures, and it brought networking and IO libraries with it, making it a very convenient development environment.

*This is all a recollection from the late nineties / early 2000s and it's by now a bit blurred, so it's hard to fill in the details on the specific exts. Nvidia was the one producing the most, but it's not like the blame is on them; Matrox's and ATI's wonky support to play catch-up was overall more damaging. MS didn't really need to do much to win hearts with DX.

replies(3): >>moring+k8 >>qwerto+r8 >>pjmlp+ld
3. moring+k8[view] [source] [discussion] 2023-03-05 08:48:13
>>averev+23
How did Microsoft solve the "extensions problem"? Did they publish standardized APIs in time so vendors wouldn't come up with any extensions? Even then, how did MS prevent them from having the driver lie about the card's features to make it look better than it is?
replies(3): >>averev+f9 >>flohof+Eg >>moth-f+CB1
4. qwerto+r8[view] [source] [discussion] 2023-03-05 08:51:26
>>averev+23
Plus MS was also trying to offer more than just graphics by adding audio and networking to the stack, which kind of started to make the whole ecosystem attractive, even if it was painful to program against.

I had my share of fun with DirectMusic.

replies(1): >>mrguyo+Sd8
5. averev+f9[view] [source] [discussion] 2023-03-05 09:03:47
>>moring+k8
The DirectX API, and a driver certification program.
6. pjmlp+ld[view] [source] [discussion] 2023-03-05 10:10:05
>>averev+23
They haven't learned much from it; see how many Vulkan extensions exist already.
7. flohof+Eg[view] [source] [discussion] 2023-03-05 10:55:53
>>moring+k8
MS had a rigorous certification and internal testing process, new D3D versions came out quickly to support new hardware features, and through the Xbox Microsoft had more real world experience for what games actually need than the GPU vendors themselves, which probably helped to rein in some of the more bizarre ideas of the GPU designers.

I don't know how the D3D design process worked in detail, but it is obvious that Microsoft had a 'guiding hand' (or maybe rather 'iron fist') to harmonize new hardware features across GPU vendors.

Over time there have been a handful of 'sanctioned' extensions that had to be activated with magic fourcc codes, but those were soon integrated into the core API (IIRC hardware instancing started like this).
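For the curious, the instancing one worked roughly like this in D3D9. This is a sketch from memory, so treat the details (and the helper name) as illustrative rather than authoritative:

    #include <d3d9.h>

    // Hypothetical helper; assumes d3d9 and device were already created elsewhere.
    void EnableInstancingHackIfPresent(IDirect3D9* d3d9, IDirect3DDevice9* device) {
        // The 'INST' fourcc was exposed as a fake surface format by some drivers.
        const D3DFORMAT instFourCC = (D3DFORMAT)MAKEFOURCC('I', 'N', 'S', 'T');

        HRESULT hr = d3d9->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                             D3DFMT_X8R8G8B8,   // adapter format
                                             0,                 // no special usage
                                             D3DRTYPE_SURFACE, instFourCC);
        if (SUCCEEDED(hr)) {
            // "Switch it on" by writing the fourcc into an otherwise unrelated render state.
            device->SetRenderState(D3DRS_POINTSIZE, (DWORD)instFourCC);
        }
    }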

Also, one decision that was controversial at the time but worked really well in hindsight was that D3D was an entirely new API in each new major version, which allowed leaving historical baggage behind quickly and keeping the API clean (while still supporting those 'frozen' old D3D versions in new Windows versions).

replies(1): >>moring+2v
8. moring+2v[view] [source] [discussion] 2023-03-05 13:22:26
>>flohof+Eg
> Also, one decision that was controversial at the time but worked really well in hindsight was that D3D was an entirely new API in each new major version, which allowed leaving historical baggage behind quickly and keeping the API clean (while still supporting those 'frozen' old D3D versions in new Windows versions).

This is interesting. I have always wondered if that is a viable approach to API evolution, so it is good to know that it worked for MS. We will probably add a (possibly public) REST API to a service at work in the near future, and versioning / evolution is certainly going to be an issue there. Thanks!

replies(2): >>becuri+CZ >>jamesf+AI2
9. becuri+CZ[view] [source] [discussion] 2023-03-05 16:47:49
>>moring+2v
It's a COM-based API, so everything is an interface described via an IDL. If you add a new member or change the parameters of a method, you must create a new version of the interface with a new GUID. You can query any interface for the other interfaces it supports, so it's easy for clients to check for newer functionality in incremental versions of DirectX.

In practice for DirectX you just use the header files that are in the SDK.
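The probing pattern is just plain COM, roughly like this (illustrative sketch; device creation and error handling omitted, and the helper name is made up):

    #include <d3d9.h>

    // Given an existing D3D9 device, probe for the newer IDirect3DDevice9Ex interface.
    // If the runtime/driver doesn't implement it, QueryInterface simply fails.
    void UseNewerFeaturesIfAvailable(IDirect3DDevice9* device) {
        IDirect3DDevice9Ex* deviceEx = nullptr;
        HRESULT hr = device->QueryInterface(IID_IDirect3DDevice9Ex,
                                            reinterpret_cast<void**>(&deviceEx));
        if (SUCCEEDED(hr)) {
            // The extra methods are now reachable through deviceEx.
            deviceEx->Release();  // COM refcounting: release what you queried
        }
    }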

10. moth-f+CB1[view] [source] [discussion] 2023-03-05 20:29:56
>>moring+k8
At the time Microsoft worked directly with vendors so that much of what would have been vendor extensions on OpenGL became cross-vendor core DirectX features.
11. jamesf+AI2[view] [source] [discussion] 2023-03-06 05:53:12
>>moring+2v
I've never had to migrate between DirectX versions, but I don't imagine it's the easiest thing in the world due to this approach. Somewhat related, I saw a library to translate DirectX 9 function calls to DirectX 12, because apparently so much of the world is still using DirectX 9.
replies(1): >>kalleb+YU2
12. kalleb+YU2[view] [source] [discussion] 2023-03-06 08:19:42
>>jamesf+AI2
That's what the drivers for the new Intel GPUs have to do, since the GPUs were designed only for DX12 and need earlier versions to be emulated.
replies(1): >>jamesf+2x5
13. tanepi+w83[view] [source] 2023-03-06 10:48:40
>>rabf+(OP)
Back in 1999 when the Quake source code came out, I started working on "Quake 2000", which was an improvement on the code's rendering pipeline. I ended up getting free cards shipped to me - one was a GeForce256, one was the Matrox G400 DualHead, and I think the other was the ATI Rage 128 Pro.

The GeForce blew the other cards' performance out of the water. The Matrox was particularly bad, the dual screen didn't add much, and I remember maybe 2 games that supported it.

14. jamesf+2x5[view] [source] [discussion] 2023-03-06 22:33:40
>>kalleb+YU2
Ah yeah, and it looks like this is what they're using: https://github.com/microsoft/D3D9On12
And it looks like DirectX 11 to DirectX 12 translation exists as well: https://github.com/microsoft/D3D11On12
15. mrguyo+Sd8[view] [source] [discussion] 2023-03-07 18:27:26
>>qwerto+r8
DirectInput (and later XInput) had its faults, but it's probably the only reason you can just plug random first- and third-party controllers into a USB port and expect everything to just work.