Granted, I've always had these kinds of issues with new laptops, especially when it came to proprietary Nvidia or AMD graphics (before AMDGPU), and I agree it's improved a lot, but I still need to tell people that there are caveats with some (especially newer) laptops.
At the end of 2021 I got an EliteBook 845 G8 (Zen 3) that worked completely fine out of the box on Linux (Arch with an up-to-date kernel). Every last bit of kit worked perfectly: Bluetooth, IR webcam, fingerprint sensor, light sensor, mute LEDs, etc. On Windows, to this day, the webcam isn't recognized because of some USB chip along the line, and there's also a lot of lag when adjusting the display backlight, for some reason.
I also have its cousin, an EB 840 G8 (Intel 11th gen). A few days ago I installed Win11 22H2 on it. I was lucky to have had an external mouse, since neither the touchpad nor the track point could be used during setup. And it absolutely needs the latest Intel GPU drivers to correctly output 4K@60 through its HP dock (DP pass-through, not DisplayLink). On Linux, the same display setup has worked well since day one. But the mute LEDs are still broken.
Neither laptop comes with integrated wired networking, so I have an HP USB dongle (Realtek chip). It works quite well on Linux. On Windows it initially works well too, but then, for some reason, Windows decides it needs to update the driver. After that it gains some interesting failure modes: from the terminal I can do whatever I want, but Edge keeps insisting the connection is lost.
It was a muxed setup. The screen was switched back and forth between GPUs, and one would power off as needed (assuming everything went well). The HDMI port was only connected to the discrete GPU, so there was no way to get video out on the Intel card. By default, Linux would power on both but use the Intel one.
This was well before any open-source cooperation from AMD, and I had the laptop much longer than the FGLRX setup was supported. Using the open-source Intel driver and simply turning off the AMD card was eventually the only way I could get it to run.
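For anyone curious, on those older muxed machines the switching and power-off was driven through the kernel's vga_switcheroo debugfs interface. Here's a minimal sketch of what "turning off the AMD card" looked like, assuming a kernel built with vga_switcheroo, debugfs mounted, and root privileges; exact behavior varied quite a bit between kernel versions, so treat this as illustrative rather than a recipe:

```python
#!/usr/bin/env python3
"""Sketch: driving the kernel's vga_switcheroo interface on a muxed laptop."""
from pathlib import Path

SWITCH = Path("/sys/kernel/debug/vgaswitcheroo/switch")

def show_status() -> None:
    # Reading the file lists each GPU, which one currently owns the display,
    # and its power state (e.g. "0:IGD:+:Pwr:..." / "1:DIS: :Off:...").
    print(SWITCH.read_text(), end="")

def power_off_inactive_gpu() -> None:
    # "OFF" powers down whichever GPU is *not* driving the display
    # (the discrete AMD card, in my case).
    SWITCH.write_text("OFF")

def queue_switch_to_integrated() -> None:
    # "MIGD" queues a switch to the integrated GPU for the next X restart;
    # "IGD" attempts an immediate switch, which is where the few seconds
    # of black screen came from.
    SWITCH.write_text("MIGD")

if __name__ == "__main__":
    show_status()
    power_off_inactive_gpu()
```

In practice most people just echoed those same strings into the switch file from a root shell; the effect was identical.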
Even in Windows it was a strange setup. You had to manually switch, and when you did the screen would turn black, you'd wait a few seconds, and now you were on the other GPU.
I'm sure the situation is better these days, but after that experience I just stick to integrated.
The "black screen for a couple seconds" thing is still there, you just don't notice it, and once a game has "started" the discrete GPU, you can seamlessly switch back and forth.
Some people are mentioning that "I can't believe it took 10 years for this to get fixed". However, back in the late 90s this exact scenario was the most common power-gaming setup: with 3dfx you'd have three cards, two 3D cards in SLI plus a 2D card, usually an Intel. There was the same black screen for a couple of seconds, and switching between the desktop and a game had the potential to break things.
The "automatic" switching between igpu and discrete was managed on windows before 2011, because i had a laptop with that setup in 2011 and it would detect 3d applications and use the discrete for that, or you could force one gpu or the other, if you wanted.