zlacker

[parent] [thread] 2 comments
1. unity1+(OP)[view] [source] 2022-12-16 03:23:34
> What I can tell you is that I'm amazed at the mistakes I see this new generation of junior programmers making, the kind of stuff indicating they have little understanding of how computers actually work.

> As an example, I continue to run into young devs that don't have any idea of what numeric over/underflow is

That doesn't happen in web application development either. You don't write code at a level low enough to cause an overflow or underflow. There are a zillion layers between your code and anything that could cause one.
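As a concrete illustration of this point, here is a minimal sketch (using Python, a common web-development language) of how the abstraction hides overflow: a fixed-width 32-bit integer, the kind the original commenter has in mind, wraps around at its maximum value, while Python's built-in `int` simply grows, so the web developer never sees the wraparound.

```python
# Fixed-width integers wrap around; Python's arbitrary-precision
# int does not, which is one reason overflow is invisible at the
# web-application layer.
import ctypes

big = 2**31 - 1  # maximum value of a signed 32-bit int

# Plain Python: no overflow, the int just keeps growing.
assert big + 1 == 2147483648

# A C-style signed 32-bit int wraps to the minimum value instead.
wrapped = ctypes.c_int32(big + 1).value
assert wrapped == -2147483648
```

Whether that invisibility is a good thing is exactly what the two commenters disagree about.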

> they have little understanding of how computers actually work.

'The computer' has been abstracted away at the level of the Internet. Not even the experts who tend datacenters come anywhere near anything related to a numeric overflow. That stuff is hidden deep inside the hardware, or deep inside the software stack near the OS level. If anything in such a machine causes an overflow, they replace the machine instead of debugging it. It's the hardware manufacturers' and OS developers' responsibility to deal with that. No company that does cloud or builds apps on the Internet needs to know about interrupts, numeric overflows and whatnot.

> I find it stranger that they are even a bit productive in their careers, although I suspect it's because the problem domains they work in are much more tolerant to these kinds of errors

Interrupt errors don't happen in web development. You have no idea of the level of abstraction that has been built between the layers where they could happen and modern Internet apps. We are even abstracting away servers and databases at this point.

You are applying a hardware perspective to the Internet. That's not applicable.

replies(1): >>lordfr+XF1
2. lordfr+XF1[view] [source] 2022-12-16 15:58:07
>>unity1+(OP)
I agree with everything you're saying. It surprises me that people can call themselves programmers and not know the basics of how computers compute, but it seems that just means I have an older/narrower definition of what "programming" is compared to what it has become.

I still stand behind my main point, which is that some of these jobs will be automated before others. Apparently the skill-set differences between different kinds of programmers are even wider than I thought. So instead of talking about whether AI will/won't automate programming in general, it's more productive to discuss which kinds of programming AI will automate first.

replies(1): >>unity1+Px3
3. unity1+Px3[view] [source] [discussion] 2022-12-17 01:25:34
>>lordfr+XF1
> narrow definition of what "programming" is compared to what it has become.

Isn't that the case in every field in technology? Way back, engineers used to know how circuits worked. Now network engineers never deal with actual circuits themselves. Way back, programmers had to do a lot of things manually. Now the underlying stack automates much of that. On top of TCP/IP we laid the WWW, then we laid web apps, then we laid CMSes, then we reached a point where CMSes like WordPress have their own plugins, and the very INDIVIDUAL plugins themselves became fields of expertise. When looking for someone to work on a WooCommerce store, people don't look for WordPress developers, or plugin developers. They look for 'WooCommerce developers'. WP became so big that every facet of it became a specialization in itself.

Same for everything else in tech: we create a technology, which enables people to build stuff on it, then people build so much stuff that each layer becomes an individual world in itself. Then people standardize that layer and move on to building the next level up. It goes infinitely upwards.

[go to top]