At the end of 2021 I got an EliteBook 845 g8 (Zen 3) that worked completely fine out of the box on Linux (Arch with up-to-date kernel). Every last bit of kit worked perfectly. Bluetooth, IR webcam, fingerprint sensor, light sensor, mute LEDs, etc. On Windows, to this day, the webcam isn't recognized because of some USB chip along the line. There's also a lot of lag when adjusting the display backlight, for some reason.
I also have its cousin, an EB 840 g8 (Intel 11th gen). A few days ago I installed Win11 22H2 on it. I was lucky to have an external mouse on hand, since neither the touchpad nor the TrackPoint could be used during setup. It also absolutely needs the latest Intel GPU drivers to correctly output 4k@60 through its HP dock (DP pass-through, not DisplayLink). On Linux, the same display setup has worked well since day one. But the mute LEDs are still broken.
Neither laptop comes with integrated wired networking, so I have an HP USB dongle (Realtek chip). It works quite well on Linux. On Windows, it works well at first, but at some point Windows decides it needs to update the driver. After that it gains some interesting failure modes: from the terminal I can do whatever I want, but Edge keeps thinking the connection is lost.
I personally like the rolling approach, but that doesn't reflect everyone's experience.
Had an experience like this several years ago, but with hackintoshing.
On a Dell workstation laptop with a Quadro FX770M GPU (basically a relabeled GeForce 8800M GT), the Nvidia drivers under XP, Vista, and 7 had an issue where the card downclocking when idle would cause Windows to bluescreen. For many years the only fix was to disable the card's power saving features, turning the laptop into a furnace even when it was doing nothing.
The proprietary Linux drivers for the card worked better (at least it could idle properly), but occasionally they'd cause your WM to lock up for no apparent reason.
The only thing that ran the card for extended periods without issues, of all things, was hackintoshed OS X. The built-in Nvidia drivers recognized it as an 8800M GT (which had been used in real Macs at some point) and it ran beautifully, power saving and all. I even used that setup to play WoW for several years.
The bug in the Windows driver was finally fixed at some point during the Windows 8/10 era, so now I can run Windows on that laptop without problems, but holy cow it shouldn't have taken a decade (it was manufactured in 2008) for that to happen.
I have used Linux on many laptops and never had problems with the video outputs, but most of them had NVIDIA GPUs and a few used integrated Intel GPUs. I have no recent experience with AMD GPUs on laptops.
I do not normally use Ubuntu, so that might matter, but when I bought a Dell Precision it came with Ubuntu preinstalled, and it worked fine until I wiped Ubuntu and installed another Linux distribution.
I once used a Lenovo on which I wasted a couple of days getting the GPU to work properly in Linux, because it was an NVIDIA Optimus switchable GPU. But even on that laptop there were no problems with the video outputs, only with OpenGL acceleration, until it was configured the right way.
My desktop with an AMD Vega 64 has crashed weekly (with occasional stable months) running Fedora (usually about one minor version behind mainline) since I got it, maybe 3-4 years ago now.
It was a muxed setup. The screen was switched back and forth between GPUs and one would power off as needed (assuming everything went well). The HDMI port was only connected to the discrete GPU. There was no way to get video out on the Intel card. By default, Linux would power on both, but use the Intel.
This was well before any AMD cooperation, and I had the laptop much longer than the FGLRX setup was supported. The open source Intel driver and simply turning off the AMD card was eventually the only way I could get it to run.
Even in Windows it was a strange setup. You had to switch manually, and when you did, the screen would turn black, you'd wait a few seconds, and then you were on the other GPU.
I'm sure the situation is better these days, but after that experience I just stick to integrated.
btw HWE isn't even the best "Ubuntu flavored kernel" in terms of hardware support. There are also the OEM kernels designed for Ubuntu certified laptops (such as the XPS 13 Developer Edition), which get newer kernel versions and drivers faster than HWE. You can install them on any Ubuntu with regular apt ("apt install linux-oem-22.04", for example).
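For reference, installing one of those OEM kernels looks roughly like this (a sketch assuming Ubuntu 22.04; the metapackage name tracks the release):

    sudo apt update
    # metapackage that pulls in the current OEM kernel for 22.04
    sudo apt install linux-oem-22.04
    # after a reboot, confirm which kernel is actually running
    uname -r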
The "black screen for a couple seconds" thing is still there, you just don't notice it, and once a game has "started" the discrete GPU, you can seamlessly switch back and forth.
Some people are saying "I can't believe it took 10 years for this to get fixed." But back in the late 90s this exact scenario was the most common power-gaming setup: with 3dfx you'd have three cards, two 3D cards in SLI plus a separate 2D card, usually an Intel. The same black screen for a couple of seconds, and switching between the desktop and a game had the potential to break things.
The "automatic" switching between igpu and discrete was managed on windows before 2011, because i had a laptop with that setup in 2011 and it would detect 3d applications and use the discrete for that, or you could force one gpu or the other, if you wanted.