I have used Linux on many laptops and I never had problems with the video outputs, but most of them had NVIDIA GPUs and a few used the integrated Intel GPU. I have no recent experience with AMD GPUs on laptops.
I do not normally use Ubuntu, so that might matter, but when I bought a Dell Precision it came with Ubuntu preinstalled, and it worked fine until I wiped Ubuntu and installed another Linux distribution.
I once used a Lenovo on which I had to waste a couple of days to get the GPU working properly in Linux, because it had an NVIDIA Optimus switchable GPU. Even on that laptop, though, there were no problems with the video outputs, only with OpenGL acceleration, until it was configured the right way.
My desktop with an AMD Vega 64 has crashed weekly (with occasional stable months) running Fedora (usually about one minor version behind mainline) since I got it, maybe 3-4 years ago now.
It was a muxed setup. The screen was switched back and forth between GPUs and one would power off as needed (assuming everything went well). The HDMI port was connected only to the discrete GPU; there was no way to get video out on the Intel card. By default, Linux would power on both, but use the Intel.
This was well before any AMD cooperation, and I had the laptop long after the FGLRX setup stopped being supported. Using the open-source Intel driver and simply turning off the AMD card was eventually the only way I could get it to run.
Even in Windows it was a strange setup. You had to switch manually, and when you did, the screen would turn black, you'd wait a few seconds, and then you were on the other GPU.
I'm sure the situation is better these days, but after that experience I just stick to integrated.
The "black screen for a couple seconds" thing is still there, you just don't notice it, and once a game has "started" the discrete GPU, you can seamlessly switch back and forth.
Some people are saying "I can't believe it took 10 years for this to get fixed." However, back in the late 90s this exact scenario was the most common power-gaming setup: with 3dfx cards you'd have three cards, two 3D cards in SLI and a 2D card, usually an Intel. There was the same black screen for a couple of seconds, and switching between the desktop and a game had the potential to break things.
The "automatic" switching between igpu and discrete was managed on windows before 2011, because i had a laptop with that setup in 2011 and it would detect 3d applications and use the discrete for that, or you could force one gpu or the other, if you wanted.