They were opposed to C++ (they thought C was all you needed), opposed to Git (they used IBM ClearCase or Subversion), opposed to putting internal tools in a web browser (why not use Qt and install the tool?), opposed to using Python or JavaScript for web services (those are just script kiddie languages), and opposed to Sublime Text/PyCharm/VS Code (IDEs are for people who don't know how to use a CLI).
I have encountered this attitude over and over, and each time these people end up stuck in late-career jobs making less than a third of what most 23-year-old SWEs I know are making.
But then hindsight is 20/20.
Back when I was in college in the 00s, if I had developed a preference for not using compilers in my work, I might have been able to build a career that way, but my options would have been significantly limited. And that's not because people were jerks who were biased against compiler skeptics, or evil executives squeezing the bottom line, or whatever. It's because the kind of software most people were making at the time would have been untenable to create without higher-level languages.
In my view, we clearly aren't at this point yet with LLM-based tooling, and maybe we never will be. But it seems a lot more plausible to me that we will than it did a year or even six months ago.
My most successful "this is doomed to fail" grouchiness was social media games (like Farmville).
But I just can't think of any examples in the dev tooling space.
You can rightly avoid new things 99% of the time, but if you miss the 1% of things that matter, you get left behind.
On the other hand, if you adopt the latest thing 100% of the time and 99% of those things turn out to be a waste, you will probably be fine.
But if you expect to get paid, you need to keep up and stay productive.
And it doesn't burn everyone out. All of the best 50+ year old engineers I know use LLMs constantly.