Granted, I've always had these kinds of issues with new laptops, especially when it came to proprietary Nvidia or AMD graphics (before AMDGPU), and I agree it's improved a lot, but I still need to tell people that there are caveats with some (especially newer) laptops.
Linux worked perfectly on my old laptop from 2015 though.
My last laptop (an AMD version of the HP Envy 13) was also rough at the beginning. A BIOS update changed the AMD GPU firmware or microcode or something and broke compatibility with the stable kernel at the time. I had to switch to an -rc kernel to get video to work.
Admittedly, my day job is basically Linux kernel development so I'm intimately familiar with most of this stuff. Not exactly your typical user.
But I always take some time to check whether somebody has succeeded in installing Linux on the laptop I want to buy. If it means I need to wait an extra 6 months, then I wait a bit.
BT is a trainwreck.
I forgot to mention in the parent post that the SD card reader can't detect insertion/removal at times, yeah, so I have a script to reload the rtsx_pci_sdmmc kernel module to force it to recheck.
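For reference, a minimal sketch of such a reload script, assuming the reader really is driven by rtsx_pci_sdmmc (check lsmod first):

    #!/bin/sh
    # Force the SD card reader to re-probe by reloading its driver.
    sudo modprobe -r rtsx_pci_sdmmc
    sudo modprobe rtsx_pci_sdmmc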
Linux still needs an 'it just works' version. I really thought Pop!_OS would be it, but the last year of development has been very disappointing, with system-breaking updates being pushed (I'm on System76 hardware).
Putting the OS or even just the display to sleep causes the whole thing to completely freeze, forcing me to hold the power button until it shuts off.
Other than that, usable, but some really bad quirks that would make me switch back to Windows if I didn't have workarounds (use an ethernet cable, never let the display sleep, never close the lid while the laptop is running).
I love it because these days I have less time to fiddle with it every six months.
At the end of 2021 I got an EliteBook 845 g8 (Zen 3) that worked completely fine out of the box on Linux (Arch with up-to-date kernel). Every last bit of kit worked perfectly. Bluetooth, IR webcam, fingerprint sensor, light sensor, mute LEDs, etc. On Windows, to this day, the webcam isn't recognized because of some USB chip along the line. There's also a lot of lag when adjusting the display backlight, for some reason.
I also have its cousin, an EB 840 g8 (Intel 11th gen). A few days ago I installed Win11 22H2 on it. I was lucky to have had an external mouse, since neither the touchpad nor the TrackPoint could be used for setup. And it absolutely needs the latest Intel GPU drivers to correctly output 4k@60 through its HP dock (DP pass-through, not DisplayLink). On Linux, the same display setup has worked well since day one. But the mute LEDs are still broken.
Both laptops don't come with integrated wired network, so I have an HP USB dongle (Realtek chip). This works quite well on Linux. On Windows, it initially works well, but then, for some reason, Windows figures it needs to update the driver. Then it gains some interesting failure modes, where from the terminal I can do whatever I want, but Edge keeps thinking the connection is lost.
Instead, there should be an actual list of well supported devices and people should buy only them.
I personally like the rolling approach, but that doesn't reflect everyone's experience.
It used to be quite hard to find new laptops with hardware combos that worked well with Linux but it's become a lot easier in recent years.
Also, my experience with Windows has actually gotten quite a bit worse. Unless you use the stuffed-full-of-garbage OEM installs, I've found it way more likely that I get stuck in a catch-22 where there are no network drivers for either the Ethernet or the WiFi, so you wind up downloading drivers off a sketchy site onto a USB stick just to get started.
The post is really only an anecdote about a ThinkPad, and a relatively old one at that, which is probably as good as it gets in terms of Linux compatibility.
I personally more or less agree with the title, though, assuming a suitable hardware choice. I have a new-ish Ryzen ThinkPad for work and the only issue I've had is Gnome occasionally semi-hanging, and I don't know if that's just because of Ubuntu being a bit flimsy or because of something more general such as an issue with the AMD graphics driver.
Also, the Teams client the post mentions is about to be dropped by MS, and it was never really that good to begin with; but having seen about two decades of desktop Linux, I'd rather be surprised that it's been available and has worked somewhat reliably at all, instead of being hit-and-miss through Wine.
Had an experience like this several years ago, but with hackintoshing.
On a Dell workstation laptop with a Quadro FX770M GPU (basically a relabeled Geforce 8800M GT), the Nvidia drivers had an issue under XP, Vista, and 7 where if the card downclocked when idle it'd cause Windows to bluescreen. The only fix for this for many years was to disable power saving features on the card, turning the laptop into a furnace even when it was doing nothing.
The proprietary Linux drivers for the card worked better (at least it could idle properly) but occasionally they'd cause your WM to lock up for no apparent reason.
The only thing that ran the card for extended periods without issues, of all things, was hackintoshed OS X. The built-in Nvidia drivers recognized it as an 8800M GT (which had been used in real Macs at some point) and it ran beautifully with power saving and everything. I even used that setup to play WoW on for several years.
The bug in the Windows driver was finally fixed at some point during the Windows 8/10 era, and so now I can run Windows on that laptop without problems, but holy cow it shouldn't have taken a decade (it was manufactured in 2008) for that to happen.
Even "Linux works damn well on your ancient laptop" is a great selling point. Want to run Windows or macOS on an ancient machine? You can run an insecure ancient version, or, if the up-to-date version can even be installed, it'll run at a crawl. Linux makes those machines still usable.
For me it's quite a usable machine now. But I'm currently giving an M1 MacBook a shot and it certainly is convenient not to have hiccups like this (yet).
1. Crashing regularly for most of the early Windows 10 era, leaving users with a frozen mute LED,
2. Was found to contain an actual keylogger. Yes, the driver as shipped by HP and signed by MS had malware.
Google "mictray64.exe" .
I run debian stable on my headless desktops/television and testing on my laptops. It's so easy it's boring.
It's like asking for a book review of a book that hasn't been published. Yes, other people have published reviews, but they got advance copies and a supplementary synopsis from the publisher six months ago.
When I decided to switch to Linux as my main OS, I researched well supported models and settled on the X1 Carbon. I bought it at a large discount right after a new generation was released, and the Linux support has been near perfect. Really only one or two minor issues in the past ~3 years, which is similar to what I have experienced with most Windows and macOS devices.
I just mention this to say: this can be an issue with any recent hardware. With Linux (for the most part) drivers are built into the kernel and vendors don't often ship drivers themselves, so we sometimes have to wait for compatibility.
- Thinkpad Carbon X1 14" (i7-5600u). Everything worked out of the box with Arch Linux at the time. Best experience I've ever had.
- HP Envy 13z (R5 2500u). Everything works today, but the out-of-the-box experience was very poor. Windows Update installed an APU microcode update that broke the Linux AMDGPU driver, and I had to run an -rc kernel for a while. It took a year to get a touchscreen driver and years to get the drivers for the tablet sensors (rotation, etc.), for a total wait of 3 years for all features, but I never had the desire to use it as a tablet so I was okay with it. Sleep works, but this laptop had awful battery drain issues in sleep (30% per day).
- Dell XPS 15 7590 (i9-9980hk). Sleep is broken in both Linux and Windows. Everything else works well, including, notably, NVIDIA Optimus / DRI PRIME.
- Asus ZenBook 14 (R7 5800U). Second-best out-of-box experience. The touchpad is connected via i2c, and my Gentoo install didn't have that enabled (relevant kernel options sketched below); I'd never bumped into i2c HID devices other than touchscreens.
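For anyone hitting the same thing on Gentoo, these are roughly the kernel options involved; exact symbol names vary between kernel versions, so treat this as a sketch:

    # i2c-attached HID devices (touchpads, touchscreens)
    CONFIG_I2C=y
    CONFIG_HID=y
    CONFIG_I2C_HID_ACPI=y   # older kernels had a single CONFIG_I2C_HID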
The title of this post is "Linux on the laptop works so damn well that it’s boring".
I have used Linux on many laptops and I never had problems with the video outputs, but most of them had NVIDIA GPUs and a few used the integrated Intel GPU. I have no recent experience with AMD GPUs on laptops.
I do not normally use Ubuntu, so that might matter, but when I bought a Dell Precision, it came with Ubuntu preinstalled and it worked fine until I wiped Ubuntu and installed another Linux distribution.
I once used a Lenovo on which I had to waste a couple of days until I got the GPU working properly in Linux, because it was an NVIDIA Optimus switchable GPU. But even on that laptop there were no problems with the video outputs, only with the OpenGL acceleration, until it was configured the right way.
All the builtin radios, cameras, microphones, and sensors in modern laptops make them ideal for stealing your private data. I already have an untrusted cell phone, I want my personal laptop to be something I can feel comfortable keeping my data on. Because I can't personally audit every chip, that means I need some level of trust, and Lenovo has demonstrated over and over and over again that they cannot be trusted.
How do people on Windows figure out which drivers have updates? Do you check the installed version and go to each manufacturer's site to see if there is a new one?
My desktop with an AMD Vega 64 has crashed weekly (with occasional stable months) running Fedora (usually about 1 minor version behind mainline) since I got it, maybe 3-4 years ago now.
As a regular customer you can't order it with Linux though, it is only sold to enterprise customers.
It was a muxed setup. The screen was switched back and forth between GPUs and one would power off as needed (assuming everything went well). The HDMI port was only connected to the discrete GPU. There was no way to get video out on the Intel card. By default, Linux would power on both, but use the Intel.
This was well before any AMD cooperation, and I had the laptop much longer than the FGLRX setup was supported. The open source Intel driver and simply turning off the AMD card was eventually the only way I could get it to run.
Even in Windows it was a strange setup. You had to manually switch, and when you did the screen would turn black, you'd wait a few seconds, and now you were on the other GPU.
I'm sure the situation is better these days, but after that experience I just stick to integrated.
Dell XPS is the latest addition to this group.
Consumer laptops come with a lot of trickery analogous to the WinModems of that era, which require Windows specifically. These cost-saving measures create a lot of problems.
If you've got an Android phone and a USB cable, you should be able to USB tether to your phone's WiFi connection. This should work out of the box on Linux and Windows.
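Roughly: enable "USB tethering" in the phone's settings while plugged in, and a new network interface should appear. A sketch, where the interface name usb0 is an example (yours may show up as enx...):

    ip link                # look for the new interface, e.g. usb0
    sudo dhclient usb0     # only needed if NetworkManager doesn't grab it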
System76 is looking into making their own hardware now too so I'm really looking forward to seeing what they come up with in-house.
In 2022.
That is the kind of basic thing that does not work.
In addition to that, if you have a high-DPI laptop display and you want to plug it into a low-DPI desktop monitor (or vice-versa), good luck getting the scaling to work in a usable way.
Edit: The author uses an 11-year-old machine; not a surprise it works well. With all the new stuff the vendors introduce, difficulties are much more common. I hear a lot of complaints from colleagues with Thunderbolt docks, and the newest Intel camera generation has no Linux support, so not that much has changed. Whether it's 2 steps forward and 1 step back or the other way round is debatable.
My dad's Lenovo IdeaPad comes with a soft-RAID of two SSDs, for example, since a single SSD that's faster and twice as big would be much more pricey.
Also, I've seen non-standard GPUs, tons of broken BIOS tables, vendor specific devices with weird quirks and whatnot over the years.
Maybe these things still happen but newer kernels know how to deal with this better, I don't know.
So much for things working on older laptops: my 6-ish-year-old Asus has some weird Intel BT chip with completely broken drivers on Ubuntu. Not as in they can't be built or installed, but the damn thing keeps fucking disconnecting and reconnecting every few seconds. It literally would've been better if they hadn't bothered.
But also, in general, anyone making a new protocol or standard can rest easy knowing that they cannot possibly fuck up worse than IEEE did with the Bluetooth spec.
Like just give me a big text file with hundreds of tweakables and tunables like X had...
They hide behind 'you just need to get your client to make the right API calls'... but that just means most wayland compositors don't support most of the available options...
The same config pane where I adjust my pointer speed should let me adjust my scroll speed.
Even swapped out the Framework mainboard after a long back and forth with support. Just some poor battery unloading or similar causing shorts. I was very close to committing my company to using them until this started happening to my tester unit and my lead engineer's tester unit.
I hope the best for Framework -- I really love their repairability promise -- but before I can commit my company to them I need them to not be lemons.
i don't care what they put on the default windows partition (i replace it on arrival) and the uefi issue was a production mistake where they imaged with a nonproduction image.
they're still used widely by serious people in academia, open source and security sensitive industry.
i suspect a lot of the bad press they get comes from the fact that there's a lot of very sharp eyes making use of their gear and that similar issues happen in other lines but just go unnoticed.
if you're truly paranoid, a pine arm machine or fully open source risc-v may be your jam. everything else is going to be loaded up with proprietary blobs everywhere along with overcomplicated supply chains and overzealous marketing departments cross selling adware onto that default image you should be tossing anyway.
I only used Void Linux on it; maybe it's different with other distros.
That was 15 years ago in 2007. I never went back. Now macOS has its struggles, but I can work and focus on a clean UI.
Yes! How can they sell these like that? My XPS 13 will never go to sleep correctly; either the screen stays on or it doesn't shut off correctly, in Windows or Linux. You'd think that this is the most basic feature a laptop has to have. And it's not just me: their forums are full of people having the same problems, and their support has no idea. They were sending me guides for Latitudes from 2012.
Definitely not going for Dell hardware again.
Most users won't even know the difference between Wayland and X.org and X11 unless they are already the kind of tinkerers who used Linux on the desktop despite its drawbacks. Normal people have no idea what any of it means, and they should not need to know.
They do make their own desktops and minis now. I think they use Clevo for laptops, and those do get more complaints here on HN than the desktops (but I think the consensus is they are getting better). They have more laptop models, so making their own would be a huge task.
Graphics always worked fine, except for random full-system lock-ups/kernel panics in amdgpu, which were fixed at some point (I don't remember when). I have no idea what caused them, but a kernel option (something with iommu) made them go away until they were properly fixed, and I think that wasn't exclusive to this laptop. Graphics are still scrambled when waking from sleep, though they take only a split second to restore. The rest of the problems (Bluetooth, fingerprint) still persist.
Sure? This is exactly the thing that Wayland was supposed to solve. Only X has one DPI for all screens.
I still use X because I'm on FreeBSD and I even got multi-screen multi-dpi scaling to work there, with xrandr settings but indeed it was not fun. In Wayland it should be click & play though.
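For the curious, the xrandr approach boils down to rendering the low-DPI output at 2x so logical sizes match the high-DPI panel. A sketch, where the output names and resolutions are examples:

    # eDP-1: 4K laptop panel; HDMI-1: 1080p external, scaled up 2x
    xrandr --output eDP-1 --mode 3840x2160 --pos 0x0 \
           --output HDMI-1 --mode 1920x1080 --scale 2x2 --pos 3840x0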
It's not real multi-DPI no but effectively it does work. Does require a pretty decent GPU to render all screens at 200% before it scales them down though.
It's not a fault of Wayland but it is reflective of the whole Linux laptop experience.
Supposedly upstream Electron has fixed this, but I've yet to see a single Electron app that works. Maybe they just haven't updated Electron.
For example, when I move my mouse from my 192 DPI screen to my 96 DPI screen, the cursor position translates in physical pixels, not physical location. So at the bottom the two screens match, but the cursor already reaches the top of the 96 DPI screen when leaving from the middle of the 192 DPI one, and above that midpoint it just stops crossing over, like hitting an 'invisible wall'. Even in Windows 11 they didn't bother to fix this :(
The only OS that had a good transition to multi-DPI capabilities was macOS and that's really because Apple doesn't care about legacy and forces app devs to update their stuff. But it's not just Linux that's having a hard time with this.
But I didn't know this was a specific problem. I'm not using Wayland yet and won't for the foreseeable future. I'm on FreeBSD and KDE on Wayland has been broken a long time. When I hear this it sounds like a good decision anyway :)
I think these days people largely just plug less stuff in; printers, scanners, and other odd gadgets are less common, so it's really just USB mass-storage devices and video outputs getting physically plugged in. Everything else is controlled over WiFi with a phone app.
Even screen sharing works, with the help of various 'portals', e.g. xdg-desktop-portal-gnome or xdg-desktop-portal-wlr.
It 'simply' takes some arguments at runtime. Below are what I use -- taken from my Sway 'start on login' script [some is superfluous]:
ElectronThingHere --silent --enable-gpu --use-gl=egl --enable-features='VaapiVideoDecoder,VaapiVideoEncoder,WebRTCPipeWireCapturer,UseOzonePlatform' --ozone-platform=wayland
You'll find they're basically identical to what you'd use to enable/force Wayland on Chrome, plus VAAPI {en,de}coding and PipeWire-based sharing. You can also replace --ozone-platform=wayland with --ozone-platform-hint=auto for less strong-handed encouragement.
I use quite a few different Electron-driven things on Wayland. Discord is the only one seemingly refusing to update their Electron base... and getting free Wayland support
If not for them I'd remove XWayland support entirely from my Sway configuration
KDE's upcoming release in October should hopefully be addressing this by allowing you to disable the bitmap-based scaling.
If you want the "works so well it's boring", go with X11. The one exception, as you note, is multi-DPI, which has native support in Wayland.
For Wayland, there are (depending on DE/compositor) some specific issues or inconsistencies, like the scroll speed you are mentioning. Personally, I also have qt5 apps being all over the place with window placement under wlroots. There are times when you'll need to look up some environment variable to make an application or toolkit behave properly.
So if you're in the high-DPI+low-DPI scenario, yeah, it still takes some effort. For anyone else, I think OP holds.
My pick for a "boring stable desktop" stack:
* Dist: Your preference of Fedora/Debian/Arch. (Mint, Pop, and Endeavour acceptable derivatives)
* DE: Budgie/XFCE/MATE/Cinnamon
For me, every OS has rough spots and it's about which ones I can tolerate the most. On Linux I get better window tiling than on Windows, and shortcuts for navigating directly to a virtual desktop, and no shenanigans with WSL2 having a separate memory pool from the rest of the OS. And I don't feel like the entire OS is antithetical to how I use a computer like with macOS.
But a bunch of more mundane things become a lot more fiddly or flaky. E.g., this week openSUSE Tumbleweed pushed out Gnome 43 before any of my extensions got marked as compatible and now they just won't work for a little while. That's easier for me to live with when the OS is well suited for me most of the time.
btw, HWE isn't even the best "Ubuntu-flavored kernel" in terms of hardware support; there are the OEM kernels designed for Ubuntu-certified laptops (such as the XPS 13 Developer Edition), which get newer kernel versions and drivers faster than HWE. You can install them on any Ubuntu with regular apt ("apt install linux-oem-22.04", for example), as sketched below.
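E.g., assuming that package name:

    sudo apt update
    sudo apt install linux-oem-22.04
    # reboot, then confirm you're running the OEM kernel:
    uname -r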
I've had multiple ThinkPad T-generations, from the T410 to the latest. Sometimes it does work flawlessly out of the factory at purchase.
This time, it did not. The 12th gen Intel CPUs have a heterogeneous design with traditional "P-cores" and low-power "E-cores". I'm suspecting the reason I see terrible performance is that the CPU scheduler does not handle this efficiently and assigns the wrong task to the E-cores.
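A crude way to test that hypothesis is to pin the workload onto the P-cores and see whether performance recovers. The core range and binary name below are just examples; check your own topology first:

    lscpu --extended          # list cores; E-cores typically show a lower max MHz
    taskset -c 0-11 ./my_app  # hypothetical 6 P-cores with HT; adjust the range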
Also the Intel WiFi does not even get detected. Have not dug deeper into that yet.
Anything more than ~6 (Intel) or ~12 (AMD/Realtek) months old tends to work smoothly out of the box, IME.
No, it's really just them. They worked hard to earn that bad press. It's not even just that they keep pre-installing malware; how they've handled it when they're caught speaks volumes.
When the truth about Superfish came out, first they fiercely denied there was any security risk to anyone ("we have thoroughly investigated this technology and do not find any evidence to substantiate security concerns"), then eventually they admitted it was a problem and said they'd stop shipping devices infected by it, but continued to anyway more than a month later (https://arstechnica.com/information-technology/20...), and the instructions they gave users for removing the offending software still left systems vulnerable while giving people a false sense of security. When they were caught doing that they issued new instructions, and those still left users vulnerable!! (https://www.theguardian.com/technology/2015/feb/20/lenovo-ap...)
Decades ago one of the most important benchmarks of Linux distributions was they were all higher performance using less resources than the original Windows that came with the PC.
If you have a PC with only 1GB of memory which still works fine with XP or W7, most distros are now unusable.
How do people typically learn to debug kernel issues on their hardware? It seems like actively promoting widespread knowledge of the practical methods would benefit the community.
It is a community-developed project, so it only really needs to appeal to developers. What motivation is there to attract non-technical users? Particularly ones who require lots of effort doing uninteresting polishing related tasks to keep them happy. Other platforms do this sort of thing because their entire reason for existing is to satisfy customers.
Gotta say this issue sounds minor compared to not being able to set the scroll speed.
Lol, that's rich. They did it like 2 or 3 times, mostly on the Windows laptops they sold that weren't part of the ThinkPad line. So yeah, long history it is. You also have a long history of making bad comments, then?
+1 if you're looking for some anecdata. The thing that finally pushed me from Windows to Linux was a privacy setting not actually being persisted (after a long battle to find the relevant settings). The fact that some wireless network cards don't work yet is definitely a rough spot, but I can also just buy a new one or write a driver, whereas getting Windows to care about my privacy or MacOS to care about basic usability with respect to keyboard remapping or window positioning seems unnecessarily daunting.
First of all, adjusting scrolling speed is not an "uninteresting polishing related task," it is a basic standard of usability.
Secondly, if you don't think Linux on laptops should be broadly usable by the general population, you are in the wrong thread. The central point of the HN post we are all commenting on is the usability of the Linux desktop ecosystem on commodity laptops.
Though it has nothing to do with Wayland, before the flamewar starts; it's just the libinput and GTK maintainers not agreeing on whose responsibility it is to handle scroll events (it is GTK's, though: libinput doesn't have enough context to implement kinetic scrolling, so it really should be the framework that adds semantic meaning to an event stream).
Swaywm has the ability to set this (you have to edit the config file). It seems weird that gnome or whatever you use lacks this option. Although, gnome has a lot of t's to cross and i's to dot, maybe they just haven't gotten around to it.
You can check it into git so you have a history of changes?
So you can copy the config to another machine?
There are lots of reasons why text files are the preferred format to store configuration in.
Other than perhaps a slight performance boost, why do we want settings in a non-human readable database?
Hell, even Microsoft are starting to use json config files for stuff like Windows terminal because they know people like to be able to quickly copy and edit settings.
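One popular sketch of the git approach is a bare repo in $HOME (the 'dot' alias name is arbitrary):

    git init --bare "$HOME/.dotfiles"
    alias dot='git --git-dir=$HOME/.dotfiles --work-tree=$HOME'
    dot config status.showUntrackedFiles no  # don't list everything in $HOME
    dot add ~/.config/sway/config
    dot commit -m "track sway config"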
It "just works" until I decide to view videos online, and despite all my hacking efforts, keeps using software decoding for videos.
It "just works" until I try some 3D stuff that requires GL 4.1, but the AMD open source driver for the GPU only does hardware acceleration up to GL 3.3.
Sleeping "just works" until I wake it up and is in such a frozen state that only taking the battery out makes the booting process work again.
Yeah it just works.
You are talking about the configs being stored in text files. The comment you are responding to was talking about being forced to edit text files to configure.
Yours is about the format of data representation and theirs is about UX.
The first step of not forcing users to edit text files is having sensible, well thought out defaults. If I have to think about configs, the designers of the app have failed me.
The second way to not force the users to edit text files is by having a well thought out gui for the kind of changes you might want.
How the config settings are stored is almost orthogonal to this question. And yes, you are right: a text-based format is preferable to a proprietary binary one.
The Windows mouse thing has been somewhat fixed in Win11 22H2, where you can now even move your mouse to the side "above" the other screen and it will still move there.
As for apps working seamlessly, I'm really not convinced. Not even the taskbar works well. If you change the DPI while it's running, the taskbar icons become blurry. The initial start menu (on first click) adapts fine, but then if you start typing to search something, the results are a blurry mess. Edge has weird artefacts in the tab animation after a DPI change, where half of the icon moves at a different speed. IntelliJ has funny fonts, with some of them huge, others tiny.
To me, the killer feature of MacOS when it comes to multi-DPI setups is that it remembers the per-screen-per-setup DPI. In my case, my PC has a 14" 1920x1080 screen. When I use it alone, it's much closer than with an external screen. I like it in 100% mode. When I plug in the screen, a 32" 4k, they're both much further away. They have roughly the same DPI (by design - I mostly use Linux) so there's no "matching" to do, but I'd like both of them to be at say 125%. Tough luck. If I change the laptop's screen to 125% while the external screen is plugged in, it will stay at 125% when on its own, too. MacOS would remember that with this screen it's 125%, alone it's 100.
At one point I was using two 24" screens, one 1920x1080, one 3840x2160. I've tried messing around with settings, until I ended up on xrandr scaling as being, basically, the only solution. Five minutes later, the low-dpi display was in the closet, because I couldn't stand the blurry fonts.
Linux users don't have to either. (Linux devs do.) Another thing they don't have to do is searching vendor site for drivers which may not even be installed correctly.
Some higher end devices of course need it (esp. in the biometrics department), but rest is automagic now, as far as I experienced.
But indeed YMMV here.. There is no way to use non-anti aliased fonts at small sizes this way. For me it is fine, I would set it up the same way anyway but I forgot it's not for everyone.
Sure, but for me as an end user, it's irrelevant whose fault it is in this bazaar engineering endeavor that very basic quality-of-life features from Windows/macOS do not work on Linux.
As a dev I understand why this and much other stuff doesn't work right on Linux, but as a consumer/end user I don't care about their internal feud, and I expect the product I use to have basic stuff like this working out of the box.
Also, unfortunately, the bazaar style of development sort of begets this kind of end-user experience. Some people like it, others don't. I switch between OSX and Linux quite often nowadays; what I prefer in the latter is that I actually have a chance of fixing problems, not just waiting around and praying to the Apple/Microsoft gods that they may have fixed the issue in the next multi-GB update. Also, piece by piece, free software often beats the proprietary alternatives; it is usually the experience of the whole stack together that is lacking. E.g. PipeWire may well be a better sound stack than that of the other two OSes.
What did I attack and which false claims did I make?
>what I prefer in the latter is that I actually have a chance of fixing problems
What I and most consumers want is a product that does not require fixing, or learning how to fix things. Most people don't want to play sysadmin at home, and neither do I, despite having cut my teeth on it and made it a career. I work in cybersecurity, so our whole workforce is fluent in Linux, which we daily-drive at work, and yet at home every one of us uses only Windows and/or macOS on our personal machines, with just one guy using Linux religiously at home.
When even experienced Linux users don't want it in their personal lives, that says something. We know how to fix things, but our free time is much more valuable. Nobody likes a desktop that stutters and ruins your immersion and productivity, especially if you're running a system that costs several grand.[1]
Maybe when the hardware manufacturers can work with the bazaar engineers, and everyone finally agrees to work together with the desktop environment devs to make Wayland a fully feature-complete drop-in replacement for X11, with no rough edges, quirks, or issues, and with the smoothness and polish of Windows/macOS, we can finally have the "year of the (polished) Linux desktop". Until then, I and most consumers will continue to use whichever OS provides the best experience with the least friction.
I gave a potential explanation to why some people may still prefer Linux, understanding well why others don’t.
How long in months was “really quickly”?
The “looks like resolutions” work by setting your screen to the resolution it claims to be multiplied by two, and then downsampling the image to your native size. Depending on the resolutions involved, the screen might feel a bit blurry. On Windows, setting an intermediary scale changes the way the UI is drawn, while keeping your native resolution.
Sway is an example of a Wayland compositor, that is an actual piece of software, and has a config file.
> input <identifier> scroll_factor <floating point value>
> Changes the scroll factor for the specified input device. Scroll speed will be scaled by the given value, which must be non-negative.
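So in ~/.config/sway/config that might look like the line below; the device identifier is an example, list yours with "swaymsg -t get_inputs":

    input "1267:12699:ELAN_Touchpad" scroll_factor 2.5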
My older Mac Mini can't do the HDMI handshake after booting. If the AV receiver isn't on when the computer boots, it will never use full display resolution or play audio over HDMI.
The idea that Apple products do anything "absolutely" is silly fanboyism.
The two-finger gesture scroll speed seems to be at a fixed speed, and way too slow for my liking.
I would like it to scroll faster than the mouse movement speed.
Try out Memtest86 - I think it's also usually an option in the boot menu on Ubuntu live-DVDs.
Let it run overnight; I've had crashes like that before where the RAM only started failing after a few hours of memtesting.
The "black screen for a couple seconds" thing is still there, you just don't notice it, and once a game has "started" the discrete GPU, you can seamlessly switch back and forth.
Some people are mentioning "I can't believe it took 10 years for this to get fixed". However, back in the late 90s this exact scenario was the most common power-gaming setup: with 3dfx you'd have three cards, two 3D cards in SLI and a 2D card, usually an Intel. There was the same black screen for a couple seconds, and switching between the desktop and a game had the potential to break things.
The "automatic" switching between iGPU and discrete was already managed on Windows before 2011; I had a laptop with that setup in 2011 and it would detect 3D applications and use the discrete GPU for them, or you could force one GPU or the other if you wanted.
I guess next step would be Memtest! Thanks for the reminder.
Distro maintainers certainly, unless you're Gentoo, Arch, or one of the other mostly-bleeding-edge rolling release distros. The "stable" kernel is whatever the current release is and "longterm" kernels are typically the last major kernel version released in a given year.
https://www.kernel.org/category/releases.html
Most distributions pick whatever the latest longterm kernel is when they cut releases. Sometimes they don't and things get strange, such as when Canonical chose kernel 4.15 for Ubuntu 18.04, requiring them to maintain an unsupported kernel themselves. IIRC that was because a bunch of AMD CPU and GPU support was added in 4.15.