I find the autoplay so annoying because it hides the thumbnail, which was carefully designed to communicate why I should click on the video, and replaces it with, usually, a talking head or stock footage. Often the video gets inexplicably added to my watch history, and if I do choose to click on it I have to go back to the beginning because I missed the start of the audio.
Additionally, there's a bug in the Android app where it sometimes doesn't show video titles (or is it the world's worst A/B test?), so scrolling through I just see talking heads (since it autoplays instead of showing the thumbnail) and have to force-restart the app to actually understand what's going on.
I use YouTube 6+ hours a day, and have for probably 10 years, and I don’t even work there. (I have a few annoying personality limitations which mean I usually work better with YouTube on in the background, and NOT on autoplay; autoplay always chooses something I don’t want to see/hear. I know that because I use the tool a lot.)
I can tell you that it has steadily and continually gotten worse over those 10 years. As for “I have to come up with stories or I won’t have a job”: no, you don’t, but even if you did, there are so many things YouTube needs more than enlarged thumbnails with visible compression artifacts.
I did. Not that anyone listened tho.
Using the most commonly used version of the product, on the most commonly used hardware, at least 2 days a week should be a prerequisite for every product owner.
I am a firm believer that software should also be developed on commonly used hardware.
Your average user isn't going to have a top-of-the-line MacBook Pro, and your program isn't going to be the only thing running on it.
It may run fine on your beefed-up monstrosity, so you won't feel the need to care about performance (worse: you may justify laggy performance with "it runs fine on my machine"). And your users will pay the price for the bloat, which becomes an externality.
Same for websites. Yes, you're going to have a hundred tabs open while working on your web app, but guess what: so will your users.
Performance isn't really the product team's domain, in the sense that they'd always be happier with things being snappier; they have to rely on the developer's word as to what's reasonable to expect.
And so the expectation becomes that the software only needs to run fine on whatever hardware the developer has, taking all the resources available, and that any optimization beyond that is costly and unnecessary.
Giving the devs more modest hardware to develop on (limited traffic/cloud compute/CPU time/...) solves this problem preemptively: the developers themselves feel the discomfort of a slow product, and so they're motivated to improve performance without the product team demanding it.
The product team, of course, should get the same modest hardware; otherwise, they'll deprioritize performance improvements.
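For web software specifically, some of this can be approximated in tooling rather than in the hardware budget: Chrome's DevTools Protocol can throttle CPU and network while you exercise the app. Below is a minimal sketch, assuming Node with the puppeteer npm package; the URL and the throttling numbers are placeholders for illustration, not measured recommendations.

```typescript
// Sketch: load a page under simulated "modest hardware" conditions.
// Assumes `npm install puppeteer`; https://example.com is a placeholder.
import puppeteer from 'puppeteer';

async function measureUnderThrottling(url: string): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  const cdp = await page.createCDPSession();

  // Slow the CPU down 4x, roughly "several-year-old laptop" territory.
  await cdp.send('Emulation.setCPUThrottlingRate', { rate: 4 });

  // Cap bandwidth and add latency to approximate a mediocre connection.
  await cdp.send('Network.emulateNetworkConditions', {
    offline: false,
    latency: 150, // ms of added round-trip delay
    downloadThroughput: (1.6 * 1024 * 1024) / 8, // ~1.6 Mbit/s, in bytes/s
    uploadThroughput: (750 * 1024) / 8, // ~750 kbit/s, in bytes/s
  });

  const start = Date.now();
  await page.goto(url, { waitUntil: 'load' });
  console.log(`Loaded ${url} in ${Date.now() - start} ms under throttling`);

  await browser.close();
}

measureUnderThrottling('https://example.com').catch(console.error);
```

Run something like this in CI with a load-time budget, and the "hundred tabs on an old laptop" user becomes visible to the developer on every change, without anyone having to swap machines.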
----
TL;DR: overpowered dev machines turn bloat into an externality.
Make devs use 5+ year old commodity hardware again.