zlacker

[return to "My AI skeptic friends are all nuts"]
1. capnre+15 2025-06-02 21:39:49
>>tablet+(OP)
The argument seems to be that AI agents are great for the expert programmer: someone capable of reading and understanding an AI agent's code output and merging it into a codebase.

Question: If everyone uses AI to code, how does someone become an expert capable of carefully reading and understanding code and acting as an editor to an AI?

The expert skills needed to be an editor -- reading code, understanding its implications, knowing what approaches are likely to cause problems, recognizing patterns that can be refactored, knowing where likely problems lie and how to test them, holding a complex codebase in memory and knowing where to find things -- currently come from long experience writing code.

But a novice who outsources their thinking to an LLM or an agent (or both) will never develop those skills on their own. So where will the experts come from?

I think about this because of my job as a professor; many of the homework assignments we use to develop thinking skills are now obsolete because LLMs can do them, permitting students to pass without thinking. Perhaps there is another way to develop those skills, but I don't know what it is, and in the meantime I'm not sure how novices will learn to become experts.

2. hiAndr+Ja 2025-06-02 22:14:24
>>capnre+15
I'll take the opposite view from most people. Expertise is a bad thing. We should embrace, with open arms, technological changes that render expertise economically irrelevant.

Take a domain like US taxation. You can certainly become an expert in that, and many people do. Is it a good thing that US taxes are so complicated that we have a market demand for thousands of such experts? Most people would say no.

Don't get me wrong: I've been coding for more of my life than not at this point, and I love the craft. I still think younger me would have far preferred a world where he could have just had GPT do it all for him, so he didn't need to spend his lunch hours poring over the finer points of, e.g., Python iterators.

3. jacobg+eb 2025-06-02 22:17:49
>>hiAndr+Ja
> We should embrace, with open arms, technological changes that render expertise economically irrelevant.

To use your example, is using AI to file your taxes actually "rendering [tax] expertise economically irrelevant"? Or is it just papering over the over-complicated tax system?

From the perspective of someone with access to the AI tool, you've somewhat eased the burden. But you haven't actually solved the underlying problem (with the actual solution obviously being a simpler tax code). You have, on the other hand, added an extra dependency on top of an already over-complicated system.

4. hiAndr+Gv 2025-06-03 00:45:04
>>jacobg+eb
I never said anything about using AI to do your taxes.

I was drawing an analogy. We would probably be better off with a tax system that wasn't so complicated that it creates its own specialized workforce. Similarly, we would be better off with programming tools that make the task so simple that professional computer programmers feel like 20th-century anachronisms. It might not be what we personally want as people who work in the field, but it's for the best.

5. jacobg+Nv3 2025-06-04 01:59:55
>>hiAndr+Gv
> I never said anything about using AI to do your taxes. I was drawing an analogy.

Yeah, I was using your analogy.

> It might not be what we personally want as people who work in the field, but it's for the best.

You're inventing a narrative and bordering on a strawman argument. I said nothing about what people who work in the field "personally want." I'm talking about complexity.

> Similarly, we would be better off with programming tools that make the task so simple that professional computer programmers feel like 20th-century anachronisms.

My point is that if the "tools that make the task simple" don't actually simplify what's happening in the background, but rather paper over it with additional complexity, then no, we would not "be better off" with that situation. An individual with access to an AI tool might feel that he's better off; anyone without access to those tools (now or in the future) would be screwed, and the underlying complexity may still create other (possibly unforeseen) problems as that ecosystem grows.
