I fail to see what knowing about interrupts says about the skill level of someone working on the web, any more than knowing about devops, integrations, or React says about a firmware engineer.
> Although I should add that things like interrupts/events/synchronization and I/O operations are fairly universal computing concepts even in the cloud world
Not really. I/O has nothing to do with the cloud, and likewise interrupts. Those remain buried way, way down in the hardware that runs the cloud, at a level that not even datacenter engineers reach.
> If you're a cloud programmer and you don't know how long an operation takes / its big-O complexity
That still has nothing to do with interrupts or hardware I/O.
As a firmware/app guy I'm not qualified to talk about the relative skill sets between different areas of cloud development. I agree that interrupts/threads aren't important at all to the person writing a web interface; I should have found a better example. I'm not here to argue, and for sure there are talented people up and down the stack.
What I can tell you is that I'm amazed at the mistakes I see this new generation of junior programmers making, the kind of stuff indicating they have little understanding of how computers actually work.
As an example, I continue to run into young devs that don't have any idea of what numeric over/underflow is. We do a lot of IoT and edge computing, so ranges/limits/size of the data being passed around matters a lot. Attempting to explain the concept reveals that a great many of them have no mental concept of how a computer even holds a number (let alone different variable sizes, types, signed/unsigned etc). When you explain that variables are a fixed size and don't have unlimited range, it's a revelation to many of them.
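To make it concrete, here's a minimal C sketch (hypothetical sensor-style values, not from any real codebase) of what fixed-size types do when you exceed their range:

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* An 8-bit unsigned counter wraps around at 255. */
    uint8_t count = 250;
    for (int i = 0; i < 10; i++) {
        count++;                      /* wraps past 255 back to 0 */
    }
    printf("count = %d\n", count);    /* prints 4, not 260 */

    /* A signed 16-bit reading (say, temperature * 100) can't hold 33000.
       The conversion back to int16_t is implementation-defined in C, but on
       typical two's-complement hardware it wraps to a large negative value. */
    int16_t temp_x100 = 32000;
    temp_x100 += 1000;
    printf("temp_x100 = %d\n", temp_x100);   /* typically prints -32536 */

    return 0;
}
```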
Sometimes they'll argue that this stuff doesn't matter, even as you're showing them the error in their code. They feel the problem is that the other devs built it wrong, chose the wrong language or tool for the problem at hand, etc. We had a dev (who wrote test scripts) that would argue with his boss that everyone (including the app and firmware teams) should ditch their languages and write everything in Python, where mistakes can't be made. He was dead serious, and ended up quitting out of frustration. I'm sure that was a personality problem, but still, the lack of basic understanding astounded us, and the phrase "knows enough to be dangerous" comes to mind.
I find it strange that there is a new type of programmer who knows very little about how computers actually work. I find it stranger that they are even a bit productive in their careers, although I suspect it's because the problem domains they work in are much more tolerant of these kinds of errors. The CI/CD system is set up to catch/fix their problems, and hence the positions can tolerate what used to be considered a below-average programmer. Efficient? No. Good enough? Sure.
I suspect some of these positions can be automated before the others can.
> As an example, I continue to run into young devs that don't have any idea of what numeric over/underflow is
That doesn't happen in web application development either. You don't write code at a level low enough to cause an overflow or underflow. There are a zillion layers between your code and anything that could cause one.
> they have little understanding of how computers actually work.
'The computer' has been abstracted away at the level of the Internet. Not even the experts who tend datacenters would ever go near anything related to a numeric overflow. That stuff is hidden deep inside the hardware, or deep in the software stack near the OS level of any given system. If anything in such a machine causes an overflow, they would replace the machine rather than debug it. It's the hardware manufacturers' and OS developers' responsibility to deal with that. No company that does cloud or develops apps on the Internet would need to know about interrupts, numeric overflows and whatnot.
> I find it stranger that they are even a bit productive in their careers, although I suspect it's because the problem domains they work in are much more tolerant to these kinds of errors
Interrupt errors don't happen in web development. You have no idea of the level of abstraction that has been built between the layers where they could happen and modern Internet apps. We are even abstracting away servers and databases at this point.
You are applying a hardware perspective to the Internet. That's not applicable.
I still stand behind my main point, which is that some of these jobs will be automated before others. Apparently the skill set differences between different kinds of programmers are even wider than I thought. So instead of talking about whether AI will/won't automate programming in general, it's more productive to discuss which kinds of programming AI will automate first.
Isn't that the case in every field of technology? Way back, engineers used to know how circuits worked; now network engineers never deal with actual circuits themselves. Way back, programmers had to do a lot of things manually; now the underlying stack automates much of that. On top of TCP/IP we laid the WWW, then web apps, then CMSes, until we reached the point where CMSes like WordPress have their own plugins, and the individual plugins themselves became fields of expertise. When looking for someone to work on a WooCommerce store, people don't look for WordPress developers, or plugin developers. They look for 'WooCommerce developers'. WP became so big that every facet of it became a specialization in itself.
Same for everything else in tech: we create a technology, which enables people to build stuff on it; then people build so much stuff that each of those things becomes an individual world in itself. Then people standardize that layer and move on to building the next level up. It goes infinitely upwards.