Then I bought a 3dfx Voodoo card and started using Glide, and it was night and day. I had something up the first day, and every day thereafter it seemed to get more and more capable. That was a lot of fun.
In my opinion, DirectX was what did the most to kill it. OpenGL was well supported on the Voodoo cards, and Microsoft was determined to kill anyone using OpenGL (which they didn't control) to program games if they could. After about five years (DirectX 7 or 8) it had reached feature parity, but long before that, the "co-marketing" dollars Microsoft used to enforce their monopoly had done most of the work.
Sigh.
OpenGL was the nicer API to use, on all platforms, because it hid all the nasty business of graphics buffer and context management, but in those days it was also targeted much more at CAD/CAM and other professional use. The games industry wasn't really a factor in roadmaps and features. Since OpenGL did so much in the driver for you, you were dependent on driver support for all kinds of use cases. Different hardware had different capabilities, and GL's extension system was used to discover what was available, but it wasn't uncommon to have to write radically different code paths for some rendering features based on the capabilities present. These capabilities could change across driver versions, so your game could break when the user updated their drivers. The main issue here was quite sloppy support from driver vendors.
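For anyone who never did it, a minimal sketch of that extension-discovery dance, written against a GL 1.x-era context (the helper name and the choice of GL_ARB_multitexture as the probed feature are illustrative, not from any particular engine):

    #include <GL/gl.h>
    #include <string.h>

    /* Hypothetical helper: nonzero if `name` appears as a whole token in
       the driver's extension string. Assumes a current GL context. */
    static int has_extension(const char *name)
    {
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);
        const char *p = ext;
        size_t len = strlen(name);
        while (p && (p = strstr(p, name)) != NULL) {
            int at_start = (p == ext) || (p[-1] == ' ');
            int at_end   = (p[len] == ' ' || p[len] == '\0');
            if (at_start && at_end)
                return 1;           /* exact token match */
            p += len;               /* skip partial match, keep scanning */
        }
        return 0;
    }

    /* Typical usage: pick a render path based on what this driver exposes. */
    void choose_render_path(void)
    {
        if (has_extension("GL_ARB_multitexture")) {
            /* single-pass multitexture path */
        } else {
            /* fall back to multi-pass blending */
        }
    }

Multiply that by every optional feature you cared about, across every vendor's driver, and you get the combinatorial mess of code paths I'm describing.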
DirectX was disgusting to work with. All the buffer management that OpenGL hid was now your responsibility, as was resource management for textures and vertex arrays. In DirectX, if some other 3D app grabbed the device, your textures and vertex buffers could be lost out from under you, and you'd have to reconstruct everything; OpenGL did that automatically behind the scenes. That's just one example. What DirectX did have, though, was some form of certification, e.g. "DirectX 9", which guaranteed some level of features, so if you wrote your code to a DX spec, it was likely to work on lots of computers, because Microsoft did thorough verification of drivers and pushed manufacturers to do better. Windows was the most popular home OS, and MacOS was insignificant. OpenGL ruled on IRIX, SunOS/Solaris, HP-UX, etc., basically along the home/industry split, and that's where engineering effort went.
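For flavor, here's roughly what the DX9-era "lost device" dance looked like in C++. TestCooperativeLevel and Reset are the real D3D9 API; the device pointer, present parameters, and the two resource helpers are placeholders for whatever your engine used:

    #include <d3d9.h>

    extern IDirect3DDevice9 *g_device;
    extern D3DPRESENT_PARAMETERS g_pp;

    void ReleaseDefaultPoolResources();  // free every D3DPOOL_DEFAULT texture/VB
    void RecreateDefaultPoolResources(); // rebuild/reupload them from scratch

    // Call at the top of every frame; skip rendering if it returns false.
    bool BeginFrame()
    {
        HRESULT hr = g_device->TestCooperativeLevel();
        if (hr == D3DERR_DEVICELOST)
            return false;                    // device gone; try again next frame
        if (hr == D3DERR_DEVICENOTRESET) {
            ReleaseDefaultPoolResources();   // must free before Reset() succeeds
            if (FAILED(g_device->Reset(&g_pp)))
                return false;
            RecreateDefaultPoolResources();  // reupload textures, VBs, ...
        }
        return true;
    }

Every D3DPOOL_DEFAULT resource in your game had to be tracked so it could be torn down and rebuilt here, which is exactly the bookkeeping OpenGL's driver model spared you.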
So, we game developers targeted the best-supported API on Windows, and that was DX, despite having to hold our noses to use it. It didn't hurt that Microsoft provided great compilers and debuggers, and when the Xbox came out, using the same toolchain, that finally clinched DX's complete victory: you could debug console apps the same way you did desktop apps, making the dev cycle so much easier. The PS1/PS2 and GameCube were really annoying to work with from an API standpoint.
Microsoft did kill OpenGL, but mainly because they provided a better alternative. They also sabotaged OpenGL directly, by freezing the DLLs shipped with Windows at OpenGL 1.1, so you had to work around this by poking into your driver vendor's OpenGL implementation and looking up symbols by name before you could use them. Anticompetitive as they were, though, they did provide better tools.
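The workaround looked something like this in C on Windows. wglGetProcAddress is the real entry point that routes the lookup through to the vendor's ICD; glActiveTextureARB is just one example of a post-1.1 function you had to fetch by name:

    #include <windows.h>
    #include <GL/gl.h>

    /* opengl32.dll only exports the old core entry points, so anything
       newer has to be fished out of the driver by name at runtime.
       The typedef matches the ARB multitexture spec. */
    typedef void (APIENTRY *PFNGLACTIVETEXTUREARBPROC)(GLenum texture);
    static PFNGLACTIVETEXTUREARBPROC glActiveTextureARB;

    int load_gl_entry_points(void)
    {
        /* Requires a current GL context; returns NULL if this
           driver doesn't know the name. */
        glActiveTextureARB = (PFNGLACTIVETEXTUREARBPROC)
            wglGetProcAddress("glActiveTextureARB");
        return glActiveTextureARB != NULL;
    }

Repeat for every function newer than 1.1 that you wanted, which is why loader libraries eventually sprang up just to automate this chore.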