zlacker

[return to "My AI skeptic friends are all nuts"]
1. capnre+15 2025-06-02 21:39:49
>>tablet+(OP)
The argument seems to be that for an expert programmer, who is capable of reading and understanding AI agent code output and merging it into a codebase, AI agents are great.

Question: If everyone uses AI to code, how does someone become an expert capable of carefully reading and understanding code and acting as an editor to an AI?

The expert skills needed to be an editor -- reading code, understanding its implications, knowing what approaches are likely to cause problems, recognizing patterns that can be refactored, knowing where likely problems lie and how to test them, holding a complex codebase in memory and knowing where to find things -- currently come from long experience writing code.

But a novice who outsources their thinking to an LLM or an agent (or both) will never develop those skills on their own. So where will the experts come from?

I think of this because of my job as a professor; many of the homework assignments we use to develop thinking skills are now obsolete because LLMs can do them, permitting students to pass without thinking. Perhaps there is another way to develop those skills, but I don't know what it is, and in the meantime I'm not sure how novices will learn to become experts.

2. gwbas1+c9 2025-06-02 22:05:05
>>capnre+15
> Question: If everyone uses AI to code, how does someone become an expert capable of carefully reading and understanding code and acting as an editor to an AI?

Well, if everyone uses a calculator, how do we learn math?

Basically, force students to do it by hand long enough that they understand the essentials. Introduce LLMs at a point similar to when you allow students to use a calculator.

3. palata+4k 2025-06-02 23:11:43
>>gwbas1+c9
> Well, if everyone uses a calculator, how do we learn math?

Calculators have made most people a lot worse at arithmetic. Many people, for instance, don't even grasp what a "30%" discount is, other than "it's a discount" and "it's a bigger discount than 20% and a smaller one than 40%". I have seen examples where people don't grasp that 30% is roughly one third. It's just a discount; they trust it.

GPS navigation has made most people a lot worse at reading maps or generally knowing where they are. I have multiple examples where I would say something like "well, we need to go west, and it's late in the day, so the sun will show us which way is west" and people would just not believe me. Or where someone would follow the GPS on their smartphone around a building, only to end up 10m behind where they started, without even realising that the GPS had made them walk the long way around the building.

Not sure the calculator is a good example for claiming that "tools don't make people worse at the core knowledge".

4. ethbr1+bp 2025-06-02 23:48:42
>>palata+4k
At the end of the day, it's the average productivity across a population that matters.

So GPS makes people worse at orienteering -- but on average, does it get everyone where they need to go better / faster / easier?

Sometimes, the answer is admittedly no. Google + Facebook + TikTok certainly made us less informed when they cannibalized reporting (news media origination) without creating a replacement.

But on average, I'd say calculators did make the population more mathematically productive.

After all, lots of people sucked at math before them too.

5. palata+Hq 2025-06-03 00:00:04
>>ethbr1+bp
> After all, lots of people sucked at math before them too.

A calculator doesn't do maths; it does arithmetic. People sucked at maths, but I'm pretty sure they were better at arithmetic.

> At the end of the day, it's the average productivity across a population that matters.

You're pushing my example further than it was meant to go. My point is that AI may actually make the average developer worse. Sure, also more productive. So it will reinforce a trend that has been going on in the software industry for more than a decade: produce more, but worse, software.

Productivity explains why we do it. It doesn't mean it is desirable.

6. ethbr1+rJ 2025-06-03 03:07:01
>>palata+Hq
You're looking at this from a human-centric perspective.

I'm suggesting you consider it from an objective perspective.

It's easily possible for an organization to be more productive with worse developers because of the tools they have access to.

And no, that's not some verbal sleight of hand in measuring "productive" -- they are able to ship more value, faster.

7. fzeror+hk1 2025-06-03 09:48:03
>>ethbr1+rJ
> And no, that's not some verbal sleight of hand in measuring "productive" -- they are able to ship more value, faster.

"Ship more value faster" is exactly a verbal sleight of hand. That's the statement used by every bad product manager and finance asshole to advocate for shipping broken code faster. It counts as more value because more code is more content, but without some form of quality guardrails you run into situations where everything breaks. I've been on teams just like that, where suddenly everything collapses and people get mad.

8. ethbr1+9r1 2025-06-03 10:58:33
>>fzeror+hk1
Do you think compilers helped teams ship more value faster from worse developers? IDEs with autocomplete? Linters?

At the end of the day, coders are being paid money to produce something.

It's not art -- it's a machine that works and does a thing.

We can do that in ways that create a greater or lesser maintenance burden, but it's still functional.

Detractors of LLM coding tools are manufacturing reasons to avoid using another tool that helps them write code.

They need to get over the misconception of what the job is. As another comment previously quipped, 'If you want to write artisanal, hand-tuned assembly that's beautiful, do that on your own time for a hobby project.'

9. fzeror+Xs1 2025-06-03 11:15:16
>>ethbr1+9r1
> Do you think compilers helped teams ship more value faster from worse developers? IDEs with autocomplete? Linters?

I'm tired of engaging with this false equivalence, so I won't. Deterministic systems are not the same thing.

> It's not art -- it's a machine that works and does a thing.

That's right. But what you need to understand is that the machines we create can and do actively harm people: leaking sensitive information, breaking systems, taking down critical infrastructure. We are engineers first and foremost and artists second, and that means designing systems to be robust and safe. If you can't understand that, then you shouldn't be an engineer and should kindly fuck off.
